[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-nothinglo--Deep-Photo-Enhancer":3,"tool-nothinglo--Deep-Photo-Enhancer":62},[4,18,26,35,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",108322,2,"2026-04-10T11:39:34",[14,15,13],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":32,"last_commit_at":41,"category_tags":42,"status":17},6121,"gemini-cli","google-gemini\u002Fgemini-cli","gemini-cli 是一款由谷歌推出的开源 AI 命令行工具，它将强大的 Gemini 大模型能力直接集成到用户的终端环境中。对于习惯在命令行工作的开发者而言，它提供了一条从输入提示词到获取模型响应的最短路径，无需切换窗口即可享受智能辅助。\n\n这款工具主要解决了开发过程中频繁上下文切换的痛点，让用户能在熟悉的终端界面内直接完成代码理解、生成、调试以及自动化运维任务。无论是查询大型代码库、根据草图生成应用，还是执行复杂的 Git 操作，gemini-cli 都能通过自然语言指令高效处理。\n\n它特别适合广大软件工程师、DevOps 人员及技术研究人员使用。其核心亮点包括支持高达 100 万 token 的超长上下文窗口，具备出色的逻辑推理能力；内置 Google 搜索、文件操作及 Shell 命令执行等实用工具；更独特的是，它支持 MCP（模型上下文协议），允许用户灵活扩展自定义集成，连接如图像生成等外部能力。此外，个人谷歌账号即可享受免费的额度支持，且项目基于 Apache 2.0 协议完全开源，是提升终端工作效率的理想助手。",100752,"2026-04-10T01:20:03",[43,13,15,14],"插件",{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":10,"last_commit_at":50,"category_tags":51,"status":17},4487,"LLMs-from-scratch","rasbt\u002FLLMs-from-scratch","LLMs-from-scratch 是一个基于 PyTorch 的开源教育项目，旨在引导用户从零开始一步步构建一个类似 ChatGPT 的大型语言模型（LLM）。它不仅是同名技术著作的官方代码库，更提供了一套完整的实践方案，涵盖模型开发、预训练及微调的全过程。\n\n该项目主要解决了大模型领域“黑盒化”的学习痛点。许多开发者虽能调用现成模型，却难以深入理解其内部架构与训练机制。通过亲手编写每一行核心代码，用户能够透彻掌握 Transformer 架构、注意力机制等关键原理，从而真正理解大模型是如何“思考”的。此外，项目还包含了加载大型预训练权重进行微调的代码，帮助用户将理论知识延伸至实际应用。\n\nLLMs-from-scratch 特别适合希望深入底层原理的 AI 开发者、研究人员以及计算机专业的学生。对于不满足于仅使用 
API，而是渴望探究模型构建细节的技术人员而言，这是极佳的学习资源。其独特的技术亮点在于“循序渐进”的教学设计：将复杂的系统工程拆解为清晰的步骤，配合详细的图表与示例，让构建一个虽小但功能完备的大模型变得触手可及。无论你是想夯实理论基础，还是为未来研发更大规模的模型做准备",90106,"2026-04-06T11:19:32",[52,15,13,14],"语言模型",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":10,"last_commit_at":59,"category_tags":60,"status":17},4292,"Deep-Live-Cam","hacksider\u002FDeep-Live-Cam","Deep-Live-Cam 是一款专注于实时换脸与视频生成的开源工具，用户仅需一张静态照片，即可通过“一键操作”实现摄像头画面的即时变脸或制作深度伪造视频。它有效解决了传统换脸技术流程繁琐、对硬件配置要求极高以及难以实时预览的痛点，让高质量的数字内容创作变得触手可及。\n\n这款工具不仅适合开发者和技术研究人员探索算法边界，更因其极简的操作逻辑（仅需三步：选脸、选摄像头、启动），广泛适用于普通用户、内容创作者、设计师及直播主播。无论是为了动画角色定制、服装展示模特替换，还是制作趣味短视频和直播互动，Deep-Live-Cam 都能提供流畅的支持。\n\n其核心技术亮点在于强大的实时处理能力，支持口型遮罩（Mouth Mask）以保留使用者原始的嘴部动作，确保表情自然精准；同时具备“人脸映射”功能，可同时对画面中的多个主体应用不同面孔。此外，项目内置了严格的内容安全过滤机制，自动拦截涉及裸露、暴力等不当素材，并倡导用户在获得授权及明确标注的前提下合规使用，体现了技术发展与伦理责任的平衡。",88924,"2026-04-06T03:28:53",[14,15,13,61],"视频",{"id":63,"github_repo":64,"name":65,"description_en":66,"description_zh":67,"ai_summary_zh":67,"readme_en":68,"readme_zh":69,"quickstart_zh":70,"use_case_zh":71,"hero_image_url":72,"owner_login":73,"owner_name":74,"owner_avatar_url":75,"owner_bio":76,"owner_company":77,"owner_location":77,"owner_email":77,"owner_twitter":77,"owner_website":78,"owner_url":79,"languages":77,"stars":80,"forks":81,"last_commit_at":82,"license":83,"difficulty_score":84,"env_os":85,"env_gpu":86,"env_ram":85,"env_deps":87,"category_tags":91,"github_topics":77,"view_count":32,"oss_zip_url":77,"oss_zip_packed_at":77,"status":17,"created_at":92,"updated_at":93,"faqs":94,"releases":129},6814,"nothinglo\u002FDeep-Photo-Enhancer","Deep-Photo-Enhancer","TensorFlow implementation of the CVPR 2018 spotlight paper, Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs ","Deep-Photo-Enhancer 是一款基于 TensorFlow 实现的开源图像增强工具，源自 CVPR 2018 的焦点论文。它利用生成对抗网络（GAN）技术，能够自动将曝光不足、色彩暗淡或对比度低的原始照片，处理成具有专业摄影师修图效果的精美图片。\n\n该工具核心解决了传统图像增强方法依赖大量“原图 - 修图后”配对数据进行训练的难题。通过引入“无配对学习”机制，Deep-Photo-Enhancer 
无需严格的图像对应关系，仅利用未配对的普通照片与高质量修图样本即可进行训练，大大降低了数据准备门槛，同时能有效保留图像细节并避免过度平滑。\n\n这款工具非常适合计算机视觉研究人员、AI 开发者以及需要批量处理图像的设计师使用。对于研究者，它提供了监督与非监督两种训练模式的完整代码及实验数据，便于深入探索 GAN 在图像处理中的应用；对于开发者，其提供的推理模型可快速集成到工作流中，实现自动化照片美化。虽然普通用户也可通过其演示网站体验效果，但要充分利用其开源特性，建议具备一定的深度学习基础。","# Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs\n\n### [[Demo website]](http:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F) [[Youtube]](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=d7OXb2sqoec) [[Paper]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE.pdf) [[Supplementary]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-sm-compress.pdf) [[Download Demo Video]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002Fenhance_video_crop.mp4)\n### [[Spotlight Presentation-video]](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=7y-zyzJXxxI) [[Spotlight Presentation-pdf]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-spotlight-compress.pdf) [[Poster]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-poster-compress.pdf)\n\n\u003Cp align=\"center\">\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b19a877b78b4.png\" width=\"80%\"\u002F>\n\u003Ca href=\"https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=d7OXb2sqoec\" span>\n   \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d6b2adc934b1.png\" width=\"90%\"\u002F>\n\u003C\u002Fa>\n\u003Ca href=\"https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-poster-compress.pdf\" span>\n   \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_004d2620a48b.png\" width=\"100%\"\u002F>\n\u003C\u002Fa>\n\u003C\u002Fp>\nTensorFlow implementation of the CVPR 2018 spotlight paper, Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs. If you use any code or data from our work, please cite our paper.\n\n### [Update Jun. 05, 2019] Rename Model script\nI added `rename_model.py` to the download link below.\n\n### [Update Mar. 31, 2019] Inference Models (Supervised and Unsupervised).\nDownload link: [here](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Online_Demo_Models]_Deep-Photo-Enhancer.zip). The code is exactly the same as what I use on my demo website. (Sorry, I do not have time to polish it...)\nSimplified tutorial: use the functions `getInputPhoto` and `processImg` in `TF.py`.\n\n### [Update Dec. 18, 2018] Data and Code (Supervised and Unsupervised).\nMany people have asked me to release the code even though it is not polished and is as ugly as me. Therefore, I put my ugly code and the data [here](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Experimental_Code_Data]_Deep-Photo-Enhancer.zip). I also provide the [supervised code](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Experimental_Supervised_Code]_Deep-Photo-Enhancer.zip). There are a lot of unnecessary parts in the code; I will refactor it as soon as possible. Regarding the data, I list the names of the images we used from the [MIT-Adobe FiveK dataset](https:\u002F\u002Fdata.csail.mit.edu\u002Fgraphics\u002Ffivek\u002F). I used Lightroom to decode the images to TIF format and to resize the long side of each image to 512 pixels (the label images are from retoucher C). 
I am not sure whether I have the right to release the HDR dataset we collected from [Flickr](https:\u002F\u002Fwww.flickr.com\u002Fsearch\u002F?text=HDR), so I list the image IDs instead; you can download the images according to those IDs. (The code was run on TensorFlow 0.12. The A-WGAN part in the code did not implement decreasing the lambda, since the initial lambda was relatively small in that case.)\n\nSome useful issues: [#6](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F6), [#16](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F16), [#18](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F18), [#24](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F24), [#27](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F27), [#38](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F38), [#39](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F39)\n\n### Results\n\n\u003Cp align=\"center\">\u003C\u002Fp>\n\t\n| Method | Description |\n| :---: | --- |\n| Label | Retouched by a photographer from the MIT-Adobe 5K dataset [1] |\n| Our (HDR) | Our model trained on our HDR dataset with unpaired data |\n| Our (SL) | Our model trained on the MIT-Adobe 5K dataset with paired data (supervised learning) |\n| Our (UL) | Our model trained on the MIT-Adobe 5K dataset with unpaired data |\n| CycleGAN (HDR) | CycleGAN's model [2] trained on our HDR dataset with unpaired data |\n| DPED_device | DPED's model [3] trained on a specified device with paired data (supervised learning) |\n| CLHE | Heuristic method from [4] |\n| NPEA | Heuristic method from [5] |\n| FLLF | Heuristic method from [6] |\n\n\u003Cp>\u003C\u002Fp>\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>Input\u003C\u002Fth>\n    \u003Cth>Label\u003C\u002Fth>\n    \u003Cth>Our (HDR)\u003C\u002Fth>\n  
\u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f82d3c458739.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_e4bb7540364a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_130ba72af2fc.png\"\u002F>\u003C\u002Ftd> \n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>Our (SL)\u003C\u002Fth>\n    \u003Cth>Our (UL)\u003C\u002Fth>\n    \u003Cth>CycleGAN (HDR)\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_af5e4f560f8b.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3cd263fc0698.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ee833c6e7407.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>DPED_iPhone6\u003C\u002Fth> \n    \u003Cth>DPED_iPhone7\u003C\u002Fth>\n    \u003Cth>DPED_Nexus5x\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b04296a54bc7.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4ec7b939cff7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b020bd474bb2.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>CLHE\u003C\u002Fth> \n    \u003Cth>NPEA\u003C\u002Fth>\n    
\u003Cth>FLLF\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0a4c30c09a18.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7874cf3ff32f.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_304609d7022c.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>Input (MIT-Adobe)\u003C\u002Fth>\n    \u003Cth>Our (HDR)\u003C\u002Fth>\n    \u003Cth>DPED_iPhone7\u003C\u002Fth>\n    \u003Cth>CLHE\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a54444af9d0e.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a81a2d16de36.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_842bbbf4b891.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_909f7dc1bd96.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_36992d19e241.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4c38a98b9bff.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a712f0d3c794.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ba2a3673633f.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_adbf477cd550.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_aab6f51d436f.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_05e6d5a25fdb.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b540cc7cadac.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d59ddf020156.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ed916bdac0e4.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c9cd1820bd0f.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7ac1cc4fa395.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7fa96ae578fb.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b704c68564af.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ffba345df66a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0660cd42b47c.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_31f3625fadbe.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a9018357a020.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_5efa6de48cad.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_02ff8e52b58d.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_15fcf12e833f.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f50ef71830b7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_dc6c47e236f0.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_19fa67be7c84.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3a8aaf40786d.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f8bebee3afe1.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_6ecd00177642.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f4b3baec08db.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n    \u003Ctr>\n    \u003Cth>Input (Internet)\u003C\u002Fth>\n    \u003Cth>Our (HDR)\u003C\u002Fth>\n    \u003Cth>DPED_iPhone7\u003C\u002Fth>\n    \u003Cth>CLHE\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b1822e7ca46c.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3b89ef3e0e47.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_89fe16a033d3.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f217985be266.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1f58944d9fca.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_958834bbdb9b.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_642bc6a4c6e9.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d10cf7be6812.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f2d51e37eb98.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_aa26ef82cf7a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_84cab8972d00.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ba183ac7e584.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7c447a894cab.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_8fa69d5f233c.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_636cf29d3a5a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_32886b140d48.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d803c23a502d.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_e70be9b40a69.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_8cae109a88a5.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_efeafab351fe.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_aa443570d355.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_03f13b1bd1cc.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d259097ff8f3.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4b5b7d42da33.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_426026fc08fd.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_96fbd358c14a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f673d90335a7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9ac200e0829a.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b78937a949ea.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c5641242c353.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c07d1303c1f4.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0283c2c21d22.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d4374f3c839e.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_55486d6c0fa3.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d0ca9d146b8d.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_afe3873f28b9.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_88cb6ac79387.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_2be3387608a7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4e23ad126e39.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1c10b4741b1a.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b37d31c02b3d.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_eab11bd171e4.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4141237f1956.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b0bc48f3e638.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_bf02db7235ab.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f0c3f474169e.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_cc53c7218882.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_00b533834463.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_5602dc92cf4f.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_2036f197c834.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9fd9c0e46943.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_6cd792ed69b7.png\"\u002F>\u003C\u002Ftd>\n  
\u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### User study\n\n\u003Ctable>\n  \u003Ctr>\n     \u003Ctd colspan=7 align=\"center\"> Preference Matrix\u003Cbr>\n(20 participants and 20 images using pairwise comparisons) \u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>\u003C\u002Fth>\n     \u003Cth>CycleGAN\u003C\u002Fth>\n     \u003Cth>DPED\u003C\u002Fth>\n     \u003Cth>NPEA\u003C\u002Fth>\n     \u003Cth>CLHE\u003C\u002Fth>\n     \u003Cth>Ours\u003C\u002Fth>\n     \u003Cth>Total\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>CycleGAN\u003C\u002Fth>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">32\u003C\u002Ftd>\n     \u003Ctd align=\"center\">27\u003C\u002Ftd>\n     \u003Ctd align=\"center\">23\u003C\u002Ftd>\n     \u003Ctd align=\"center\">11\u003C\u002Ftd>\n     \u003Cth>93\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>DPED\u003C\u002Fth>\n     \u003Ctd align=\"center\">368\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">141\u003C\u002Ftd>\n     \u003Ctd align=\"center\">119\u003C\u002Ftd>\n     \u003Ctd align=\"center\">29\u003C\u002Ftd>\n     \u003Cth>657\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>NPEA\u003C\u002Fth>\n     \u003Ctd align=\"center\">373\u003C\u002Ftd>\n     \u003Ctd align=\"center\">259\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">142\u003C\u002Ftd>\n     \u003Ctd align=\"center\">50\u003C\u002Ftd>\n     \u003Cth>824\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>CLHE\u003C\u002Fth>\n     \u003Ctd align=\"center\">377\u003C\u002Ftd>\n     \u003Ctd align=\"center\">281\u003C\u002Ftd>\n     \u003Ctd align=\"center\">258\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">77\u003C\u002Ftd>\n     \u003Cth>993\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>Ours\u003C\u002Fth>\n     
\u003Ctd align=\"center\">389\u003C\u002Ftd>\n     \u003Ctd align=\"center\">371\u003C\u002Ftd>\n     \u003Ctd align=\"center\">350\u003C\u002Ftd>\n     \u003Ctd align=\"center\">323\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Cth>1433\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=7> Our model trained on HDR images ranked first and CLHE was the runner-up. When comparing our model with CLHE, 81% of users (323 of 400) preferred our results. \u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### Other applications of global U-Net, A-WGAN and iBN\nThis paper proposes three improvements: global U-Net, adaptive WGAN (A-WGAN) and individual batch normalization (iBN). They generally improve results, and for some applications the improvement is sufficient to cross the bar and lead to success. We have applied them to some other applications.\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>Input\u003C\u002Fth>\n    \u003Cth>Ground truth\u003C\u002Fth>\n    \u003Cth>global U-Net\u003C\u002Fth>\n    \u003Cth>U-Net\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a805c180ca03.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f3f7cebb7557.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_13dac171df06.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7d19eeff9a70.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0533e04e22f8.png\"\u002F>\u003C\u002Ftd> \n    
\u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ef2df7b4dd39.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7624e55aaaf8.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7805121875aa.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9f0d835d35ef.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_47c66a0aa8ca.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_69d96320ee6e.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_14cfa657cdf2.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=4>For global U-Net, we applied it to trimap segmentation for pets using the Oxford-IIIT Pet dataset. 
The accuracies of U-Net and global U-Net are 0.8759 and 0.8905 respectively.\n\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>\u003C\u002Fth>\n    \u003Cth>λ = 0.1\u003C\u002Fth>\n    \u003Cth>λ = 10\u003C\u002Fth>\n    \u003Cth>λ = 1000\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>WGAN-GP\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_e198e6657785.png\"\u002F>\u003C\u002Fth> \n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0935f9b7ba48.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9fdf7e7ccbc6.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>A-WGAN\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ad2c13585e7a.png\"\u002F>\u003C\u002Fth> \n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4957e1d05d06.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_92def81e46fb.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=4>With different λ values, WGAN-GP could succeed or fail. 
The proposed A-WGAN is less dependent on λ and succeeded with all three λ values.\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth colspan=3>Male -> Female\u003C\u002Fth>\n    \u003Cth colspan=3>Female -> Male\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>Input\u003C\u002Fth>\n    \u003Cth>with iBN\u003C\u002Fth>\n    \u003Cth>w\u002Fo iBN\u003C\u002Fth>\n    \u003Cth>Input\u003C\u002Fth>\n    \u003Cth>with iBN\u003C\u002Fth>\n    \u003Cth>w\u002Fo iBN\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_12672830dd17.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f9753a1a5f40.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1e6c5ec0fe1e.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ae6edc0409e2.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_fb6d991429bb.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_374cc916ffee.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1012a9877773.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0c88d71ff0e1.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f21c0ad53f30.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d695fc99fe43.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_eaa0761a3b0a.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_6ba176d7d399.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_39d18ae100b3.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_49e0895e3acc.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_38a5d48aa836.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b23b0c00ed65.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_114efb66c6b9.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c23ef8c8d513.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9c7024bd0069.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_64e0a6c874b3.png\"\u002F>\u003C\u002Fth>\n    
\u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ed730ea226f3.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a77859909ad0.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_8500fd992edc.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9d7bfe86f877.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=6>We applied the 2-way GAN to gender change of face images. As shown in the figure, the 2-way GAN failed on the task but succeeded after employing the proposed iBN.\n\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### Architecture\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth colspan=2>Generator\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth colspan=2>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3efffa24c2e3.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth colspan=2>Discriminator\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth colspan=2>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a9a57724b34d.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>1-way GAN\u003C\u002Fth>\n    \u003Cth>2-way GAN\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ee1cdb7a352f.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_26d170979e63.png\"\u002F>\u003C\u002Fth>\n  
\u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### Publication\n[Yu-Sheng Chen](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002F~nothinglo\u002F), [Yu-Ching Wang](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002F~urchinwang\u002F), [Man-Hsin Kao](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002F~cindy0711\u002F) and [Yung-Yu Chuang](https:\u002F\u002Fwww.csie.ntu.edu.tw\u002F~cyy\u002F).\n\n[National Taiwan University](https:\u002F\u002Fwww.ntu.edu.tw)\n\nDeep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs. Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition 2018 (CVPR 2018), to appear, June 2018, Salt Lake City, USA.\n\n### Citation\n```\n@INPROCEEDINGS{Chen:2018:DPE,\n\tAUTHOR    = {Yu-Sheng Chen and Yu-Ching Wang and Man-Hsin Kao and Yung-Yu Chuang},\n\tTITLE     = {Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs},\n\tYEAR      = {2018},\n\tMONTH     = {June},\n\tBOOKTITLE = {Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition (CVPR 2018)},\n\tPAGES     = {6306--6314},\n\tLOCATION  = {Salt Lake City},\n}\n```\n### Reference\n\n> 1. *Bychkovsky, V., Paris, S., Chan, E., Durand, F.: Learning photographic global tonal adjustment with a database of input\u002Foutput image pairs. In: Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition. pp. 97-104. CVPR'11 (2011)*\n> 2. *Zhu, J. Y., Park, T., Isola, P., Efros, A. A.: Unpaired image-to-image translation using cycle-consistent adversarial networks. In: Proceedings of the 2017 IEEE International Conference on Computer Vision. pp. 2242-2251. ICCV'17 (2017)*\n> 3. *Ignatov, A., Kobyshev, N., Vanhoey, K., Timofte, R., Van Gool, L.: DSLR-quality photos on mobile devices with deep convolutional networks. In: Proceedings of the 2017 IEEE International Conference on Computer Vision. pp. 3277-3285. ICCV'17 (2017)*\n> 4. *Wang, S., Cho, W., Jang, J., Abidi, M. 
A., Paik, J.: Contrast-dependent saturation adjustment for outdoor image enhancement. JOSA A. pp. 2532-2542. (2017)*\n> 5. *Wang, S., Zheng, J., Hu, H. M., Li, B.: Naturalness preserved enhancement algorithm for non-uniform illumination images. IEEE Transactions on Image Processing. pp. 3538-3548. TIP'13 (2013)*\n> 6. *Aubry, M., Paris, S., Hasinoff, S. W., Kautz, J., Durand, F.: Fast local Laplacian filters: Theory and applications. ACM Transactions on Graphics. Article 167. TOG'14 (2014)*\n\n### Contact\nFeel free to contact me if you have any questions (Yu-Sheng Chen, nothinglo@cmlab.csie.ntu.edu.tw).\n","# Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs\n\n### [[Demo Website]](http:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F) [[YouTube]](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=d7OXb2sqoec) [[Paper]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE.pdf) [[Supplementary Material]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-sm-compress.pdf) [[Download Demo Video]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002Fenhance_video_crop.mp4)\n### [[Spotlight Video]](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=7y-zyzJXxxI) [[Spotlight PDF]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-spotlight-compress.pdf) [[Poster]](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-poster-compress.pdf)\n\n\u003Cp align=\"center\">\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b19a877b78b4.png\" width=\"80%\"\u002F>\n\u003Ca href=\"https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=d7OXb2sqoec\" span>\n   \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d6b2adc934b1.png\" width=\"90%\"\u002F>\n\u003C\u002Fa>\n\u003Ca 
href=\"https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002FCVPR-2018-DPE-poster-compress.pdf\" span>\n   \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_004d2620a48b.png\" width=\"100%\"\u002F>\n\u003C\u002Fa>\n\u003C\u002Fp>\n这是CVPR 2018亮点论文《深度照片增强器：基于GAN的无配对学习用于从照片中进行图像增强》的TensorFlow实现。如果您使用了我们工作中的任何代码或数据，请引用我们的论文。\n\n### [更新 2019年6月5日] 重命名模型脚本\n我在下面的下载链接中添加了`rename_model.py`。\n\n### [更新 2019年3月31日] 推理模型（有监督和无监督）。\n下载链接：[这里](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Online_Demo_Models]_Deep-Photo-Enhancer.zip)。代码与我在演示网站中使用的完全相同。（抱歉，我没有时间对其进行优化……）\n简化教程：在`TF.py`中使用`getInputPhoto`和`processImg`函数。\n\n### [更新 2018年12月18日] 数据和代码（有监督和无监督）。\n太多人要求我发布代码，尽管代码并不完善且像我一样“丑陋”。因此，我将我的“丑陋”代码和数据放在了[这里](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Experimental_Code_Data]_Deep-Photo-Enhancer.zip)。我还提供了[有监督的代码](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Experimental_Supervised_Code]_Deep-Photo-Enhancer.zip)。代码中有很多不必要的部分。我会尽快重构代码。关于数据，我们将所用图片的名称标注在了[MIT-Adobe FiveK数据集](https:\u002F\u002Fdata.csail.mit.edu\u002Fgraphics\u002Ffivek\u002F)上。我直接使用Lightroom将图片解码为TIF格式，并调整图片长边至512分辨率（标签图来自修图师C）。我不确定是否有权公开我们从[Flickr](https:\u002F\u002Fwww.flickr.com\u002Fsearch\u002F?text=HDR)收集的HDR数据集，因此只提供了这些图片的ID。您可以根据ID下载这些图片。（代码是在TensorFlow 0.12版本上运行的。代码中的A-WGAN部分并未实施降低lambda值的操作，因为当时的初始lambda值已经相对较小。）\n\n一些有用的问题：[#6](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F6), [#16](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F16), [#18](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F18), [#24](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F24), 
[#27](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F27), [#38](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F38), [#39](https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F39)\n\n### Results\n\n\u003Cp align=\"center\">\u003C\u002Fp>\n\t\n| Method | Description |\n| :---: | --- |\n| Label | Retouched by a photographer from the MIT-Adobe 5K dataset [1] |\n| Ours (HDR) | Our model trained on unpaired HDR data |\n| Ours (SL) | Our model trained on paired data from the MIT-Adobe 5K dataset (supervised learning) |\n| Ours (UL) | Our model trained on unpaired data from the MIT-Adobe 5K dataset |\n| CycleGAN (HDR) | The unpaired CycleGAN model trained on our HDR dataset [2] |\n| DPED_device | The DPED model trained on paired data for a specific device (supervised learning) [3] |\n| CLHE | The heuristic method from [4] |\n| NPEA | The heuristic method from [5] |\n| FLLF | The heuristic method from [6] |\n\n\u003Cp>\u003C\u002Fp>\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>Input\u003C\u002Fth>\n    \u003Cth>Label\u003C\u002Fth>\n    \u003Cth>Ours (HDR)\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f82d3c458739.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_e4bb7540364a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_130ba72af2fc.png\"\u002F>\u003C\u002Ftd> \n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>Ours (SL)\u003C\u002Fth>\n    \u003Cth>Ours (UL)\u003C\u002Fth>\n    \u003Cth>CycleGAN (HDR)\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_af5e4f560f8b.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3cd263fc0698.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ee833c6e7407.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>DPED_iPhone6\u003C\u002Fth> \n    \u003Cth>DPED_iPhone7\u003C\u002Fth>\n    \u003Cth>DPED_Nexus5x\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b04296a54bc7.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4ec7b939cff7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b020bd474bb2.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>CLHE\u003C\u002Fth> \n    \u003Cth>NPEA\u003C\u002Fth>\n    \u003Cth>FLLF\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0a4c30c09a18.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7874cf3ff32f.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_304609d7022c.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>输入（MIT-Adobe）\u003C\u002Fth>\n    \u003Cth>我们的（HDR）\u003C\u002Fth>\n    \u003Cth>DPED_iPhone7\u003C\u002Fth>\n    \u003Cth>CLHE\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a54444af9d0e.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a81a2d16de36.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_842bbbf4b891.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_909f7dc1bd96.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_36992d19e241.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4c38a98b9bff.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a712f0d3c794.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ba2a3673633f.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_adbf477cd550.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_aab6f51d436f.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_05e6d5a25fdb.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b540cc7cadac.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d59ddf020156.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ed916bdac0e4.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c9cd1820bd0f.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7ac1cc4fa395.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7fa96ae578fb.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b704c68564af.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ffba345df66a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0660cd42b47c.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_31f3625fadbe.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a9018357a020.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_5efa6de48cad.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_02ff8e52b58d.png\"\u002F>\u003C\u002Ftd>\n  
\u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_15fcf12e833f.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f50ef71830b7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_dc6c47e236f0.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_19fa67be7c84.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3a8aaf40786d.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f8bebee3afe1.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_6ecd00177642.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f4b3baec08db.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>Input (Internet)\u003C\u002Fth>\n    \u003Cth>Ours (HDR)\u003C\u002Fth>\n    \u003Cth>DPED_iPhone7\u003C\u002Fth>\n    \u003Cth>CLHE\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b1822e7ca46c.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3b89ef3e0e47.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_89fe16a033d3.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f217985be266.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1f58944d9fca.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_958834bbdb9b.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_642bc6a4c6e9.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d10cf7be6812.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f2d51e37eb98.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_aa26ef82cf7a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_84cab8972d00.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ba183ac7e584.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7c447a894cab.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_8fa69d5f233c.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_636cf29d3a5a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_32886b140d48.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d803c23a502d.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_e70be9b40a69.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_8cae109a88a5.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_efeafab351fe.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_aa443570d355.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_03f13b1bd1cc.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d259097ff8f3.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4b5b7d42da33.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_426026fc08fd.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_96fbd358c14a.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f673d90335a7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9ac200e0829a.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b78937a949ea.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c5641242c353.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c07d1303c1f4.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0283c2c21d22.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d4374f3c839e.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_55486d6c0fa3.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d0ca9d146b8d.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_afe3873f28b9.png\"\u002F>\u003C\u002Ftd>\n  
\u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_88cb6ac79387.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_2be3387608a7.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4e23ad126e39.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1c10b4741b1a.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b37d31c02b3d.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_eab11bd171e4.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4141237f1956.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b0bc48f3e638.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_bf02db7235ab.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f0c3f474169e.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_cc53c7218882.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_00b533834463.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_5602dc92cf4f.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_2036f197c834.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9fd9c0e46943.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_6cd792ed69b7.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### 用户研究\n\n\u003Ctable>\n  \u003Ctr>\n     \u003Ctd colspan=7 align=\"center\"> 偏好矩阵\u003Cbr>\n(20名参与者和20张图像，采用成对比较法) \u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>\u003C\u002Fth>\n     \u003Cth>CycleGAN\u003C\u002Fth>\n     \u003Cth>DPED\u003C\u002Fth>\n     \u003Cth>NPEA\u003C\u002Fth>\n     \u003Cth>CLHE\u003C\u002Fth>\n     \u003Cth>我们的方法\u003C\u002Fth>\n     \u003Cth>总计\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>CycleGAN\u003C\u002Fth>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">32\u003C\u002Ftd>\n     \u003Ctd align=\"center\">27\u003C\u002Ftd>\n     \u003Ctd align=\"center\">23\u003C\u002Ftd>\n     \u003Ctd align=\"center\">11\u003C\u002Ftd>\n     \u003Cth>93\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>DPED\u003C\u002Fth>\n     \u003Ctd align=\"center\">368\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">141\u003C\u002Ftd>\n     \u003Ctd align=\"center\">119\u003C\u002Ftd>\n     \u003Ctd align=\"center\">29\u003C\u002Ftd>\n     \u003Cth>657\u003C\u002Fth>\n  \u003C\u002Ftr>\n  
\u003Ctr>\n     \u003Cth>NPEA\u003C\u002Fth>\n     \u003Ctd align=\"center\">373\u003C\u002Ftd>\n     \u003Ctd align=\"center\">259\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">142\u003C\u002Ftd>\n     \u003Ctd align=\"center\">50\u003C\u002Ftd>\n     \u003Cth>824\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>CLHE\u003C\u002Fth>\n     \u003Ctd align=\"center\">377\u003C\u002Ftd>\n     \u003Ctd align=\"center\">281\u003C\u002Ftd>\n     \u003Ctd align=\"center\">258\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Ctd align=\"center\">77\u003C\u002Ftd>\n     \u003Cth>993\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n     \u003Cth>我们的方法\u003C\u002Fth>\n     \u003Ctd align=\"center\">389\u003C\u002Ftd>\n     \u003Ctd align=\"center\">371\u003C\u002Ftd>\n     \u003Ctd align=\"center\">350\u003C\u002Ftd>\n     \u003Ctd align=\"center\">323\u003C\u002Ftd>\n     \u003Ctd align=\"center\">-\u003C\u002Ftd>\n     \u003Cth>1433\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=7> 我们在HDR图像上训练的模型排名第一，CLHE位居第二。在将我们的模型与CLHE进行比较时，81%的用户（400人中有323人）更倾向于我们的结果。 \u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### 全局U-Net、A-WGAN和iBN的其他应用\n本文提出了三项改进：全局U-Net、自适应WGAN（A-WGAN）和个体批归一化（iBN）。这些改进通常能够提升效果；对于某些应用而言，改进幅度足以跨越门槛并取得成功。我们已将这些技术应用于其他一些场景。\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>输入\u003C\u002Fth>\n    \u003Cth>真实标签\u003C\u002Fth>\n    \u003Cth>全局U-Net\u003C\u002Fth>\n    \u003Cth>U-Net\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a805c180ca03.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f3f7cebb7557.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_13dac171df06.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7d19eeff9a70.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0533e04e22f8.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ef2df7b4dd39.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7624e55aaaf8.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_7805121875aa.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9f0d835d35ef.png\"\u002F>\u003C\u002Ftd> \n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_47c66a0aa8ca.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_69d96320ee6e.png\"\u002F>\u003C\u002Ftd>\n    \u003Ctd>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_14cfa657cdf2.png\"\u002F>\u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=4> 对于全局U-Net，我们将其应用于牛津-IIIT宠物数据集上的宠物三元图分割任务。U-Net和全局U-Net的准确率分别为0.8759和0.8905。\n\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth>\u003C\u002Fth>\n    \u003Cth>λ = 0.1\u003C\u002Fth>\n    \u003Cth>λ = 10\u003C\u002Fth>\n    \u003Cth>λ 
= 1000\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>WGAN-GP\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_e198e6657785.png\"\u002F>\u003C\u002Fth> \n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0935f9b7ba48.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9fdf7e7ccbc6.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>A-WGAN\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ad2c13585e7a.png\"\u002F>\u003C\u002Fth> \n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_4957e1d05d06.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_92def81e46fb.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=4> 在不同的λ值下，WGAN-GP的表现可能成功也可能失败。而提出的A-WGAN对λ的依赖性较低，在三个λ值下均取得了成功。\n\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth colspan=3>男性 -> 女性\u003C\u002Fth>\n    \u003Cth colspan=3>女性 -> 男性\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>输入\u003C\u002Fth>\n    \u003Cth>使用iBN\u003C\u002Fth>\n    \u003Cth>不使用iBN\u003C\u002Fth>\n    \u003Cth>输入\u003C\u002Fth>\n    \u003Cth>使用iBN\u003C\u002Fth>\n    \u003Cth>不使用iBN\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_12672830dd17.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f9753a1a5f40.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1e6c5ec0fe1e.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ae6edc0409e2.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_fb6d991429bb.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_374cc916ffee.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_1012a9877773.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_0c88d71ff0e1.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_f21c0ad53f30.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_d695fc99fe43.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_eaa0761a3b0a.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_6ba176d7d399.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_39d18ae100b3.png\"\u002F>\u003C\u002Fth>\n    
\u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_49e0895e3acc.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_38a5d48aa836.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_b23b0c00ed65.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_114efb66c6b9.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_c23ef8c8d513.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9c7024bd0069.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_64e0a6c874b3.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ed730ea226f3.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a77859909ad0.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_8500fd992edc.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_9d7bfe86f877.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd colspan=6> 我们将双向GAN应用于人脸图像的性别转换任务。如图所示，双向GAN在此任务中未能成功，但在采用我们提出的iBN后成功实现了目标。\n\u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### 
架构\n\n\u003Ctable>\n  \u003Ctr>\n    \u003Cth colspan=2>生成器\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth colspan=2>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_3efffa24c2e3.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth colspan=2>判别器\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth colspan=2>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_a9a57724b34d.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>单向GAN\u003C\u002Fth>\n    \u003Cth>双向GAN\u003C\u002Fth>\n  \u003C\u002Ftr>\n  \u003Ctr>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_ee1cdb7a352f.png\"\u002F>\u003C\u002Fth>\n    \u003Cth>\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_readme_26d170979e63.png\"\u002F>\u003C\u002Fth>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n### 出版物\n[陈宇升](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002F~nothinglo\u002F)、[王昱菁](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002F~urchinwang\u002F)、[高曼欣](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002F~cindy0711\u002F) 和 [庄永裕](https:\u002F\u002Fwww.csie.ntu.edu.tw\u002F~cyy\u002F)。\n\n[国立台湾大学](https:\u002F\u002Fwww.ntu.edu.tw)\n\n深度照片增强器：基于生成对抗网络的无配对摄影图像增强学习。IEEE 国际计算机视觉与模式识别会议 2018 年论文集（CVPR 2018），即将发表，2018 年 6 月，美国盐湖城。\n\n### 引用格式\n```\n@INPROCEEDINGS{Chen:2018:DPE,\n\tAUTHOR    = {Yu-Sheng Chen and Yu-Ching Wang and Man-Hsin Kao and Yung-Yu Chuang},\n\tTITLE     = {Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs},\n\tYEAR      = {2018},\n\tMONTH     = {June},\n\tBOOKTITLE = {Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition (CVPR 2018)},\n\tPAGES     = {6306--6314},\n\tLOCATION  = {Salt Lake City},\n}\n```\n\n### 参考文献\n\n> 1. 
*Bychkovsky, V., Paris, S., Chan, E., Durand, F.: 利用输入\u002F输出图像对数据库学习摄影全局色调调整。载于 2011 年 IEEE 计算机视觉与模式识别会议论文集，第 97–104 页。CVPR'11（2011）*\n> 2. *Zhu, J. Y., Park, T., Isola, P., Efros, A. A.: 基于循环一致对抗网络的无配对图像到图像转换。载于 2017 年 IEEE 国际计算机视觉会议论文集，第 2242–2251 页。ICCV'17（2017）*\n> 3. *Ignatov, A., Kobyshev, N., Vanhoey, K., Timofte, R., Van Gool, L.: 使用深度卷积网络在移动设备上实现单反相机质量的照片。载于 2017 年 IEEE 国际计算机视觉会议论文集，第 3277–3285 页。ICCV'17（2017）*\n> 4. *Wang, S., Cho, W., Jang, J., Abidi, M. A., Paik, J.: 基于对比度的饱和度调整用于户外图像增强。JOSA A，第 2532–2542 页。（2017）*\n> 5. *Wang, S., Zheng, J., Hu, H. M., Li, B.: 保留自然感的非均匀光照图像增强算法。IEEE 图像处理汇刊，第 3538–3548 页。TIP'13（2013）*\n> 6. *Aubry, M., Paris, S., Hasinoff, S. W., Kautz, J., Durand, F.: 快速局部拉普拉斯滤波器：理论与应用。ACM 图形学汇刊，第 167 号文章。TOG'14（2014）*\n\n### 联系方式\n如有任何问题，欢迎随时联系我（陈宇升 nothinglo@cmlab.csie.ntu.edu.tw）。","# Deep-Photo-Enhancer 快速上手指南\n\nDeep-Photo-Enhancer 是一个基于 GAN（生成对抗网络）的图像增强工具，源自 CVPR 2018 Spotlight 论文。它支持无配对学习（Unpaired Learning），能够将普通照片自动优化为具有专业修图师风格的高质量图像。\n\n## 环境准备\n\n在开始之前，请确保您的开发环境满足以下要求：\n\n*   **操作系统**: Linux (推荐) 或 macOS (Windows 用户建议使用 WSL 或 Docker)。\n*   **Python**: 建议版本 3.6+ (原代码基于较旧环境，但现代 Python 通常兼容)。\n*   **深度学习框架**: **TensorFlow 1.12** (注意：原项目明确指出代码在 TensorFlow 0.12 版本运行，但后续更新和社区实践多基于 TF 1.x，推荐使用 **TensorFlow 1.12 - 1.15** 以获得最佳兼容性。TF 2.x 需使用兼容模式)。\n*   **GPU**: 推荐使用 NVIDIA GPU 以加速推理过程（可选，CPU 也可运行但速度较慢）。\n*   **依赖库**: `numpy`, `scipy`, `Pillow`, `matplotlib` 等常见科学计算库。\n\n> **提示**: 由于原项目依赖较旧的 TensorFlow 版本，建议创建一个独立的虚拟环境以避免冲突。\n\n## 安装步骤\n\n### 1. 克隆项目代码\n首先从 GitHub 获取源代码：\n\n```bash\ngit clone https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer.git\ncd Deep-Photo-Enhancer\n```\n\n### 2. 创建虚拟环境并安装依赖\n推荐使用 `conda` 或 `venv` 隔离环境。以下以 `conda` 为例安装 TensorFlow 1.12：\n\n```bash\n# 创建环境\nconda create -n dpe python=3.6\nconda activate dpe\n\n# 安装 TensorFlow 1.12 (GPU 版本请替换为 tensorflow-gpu==1.12)\npip install tensorflow==1.12.0\n\n# 安装其他必要依赖\npip install numpy scipy Pillow matplotlib\n```\n\n### 3. 
下载预训练模型\n根据 README 中的更新信息（2019-03-31），作者提供了用于推理的简化模型（包含监督和无监督版本）。\n\n*   **下载地址**: [Online_Demo_Models_Deep-Photo-Enhancer.zip](https:\u002F\u002Fwww.cmlab.csie.ntu.edu.tw\u002Fproject\u002FDeep-Photo-Enhancer\u002F[Online_Demo_Models]_Deep-Photo-Enhancer.zip)\n    *   *注：如果官网下载速度慢，可尝试使用国内下载工具或代理加速。*\n\n下载完成后，解压文件并将模型文件夹放置在项目根目录下（具体路径需参考解压后的说明，通常放在 `model\u002F` 或与 `TF.py` 同级目录）。\n\n## 基本使用\n\n该项目提供了一个简化的推理接口，主要位于 `TF.py` 文件中。\n\n### 1. 准备输入图片\n将您需要增强的照片放入项目目录，例如命名为 `input.jpg`。\n\n### 2. 运行推理脚本\n您可以编写一个简单的 Python 脚本来调用核心函数，或者直接在交互式环境中运行。以下是基于 `TF.py` 中 `getInputPhoto` 和 `processImg` 函数的最小使用示例（调用方式仅为示意，实际函数签名请以 `TF.py` 源码为准）：\n\n```python\nimport TF\nimport numpy as np\nfrom PIL import Image  # scipy.misc.imsave 已在新版 scipy 中移除，这里改用 PIL 保存\n\n# 配置路径\ninput_path = 'input.jpg'      # 输入图片路径\nmodel_path = 'path_to_model'  # 下载的预训练模型文件夹路径\noutput_path = 'output.jpg'    # 输出图片路径\n\n# 加载图片\n# getInputPhoto 负责读取并预处理图片\nimg = TF.getInputPhoto(input_path)\n\n# 执行增强\n# processImg 是核心推理函数\n# 参数说明：输入图像张量，模型路径，是否使用 HDR 模型等\nenhanced_img = TF.processImg(img, model_path, is_hdr=True)\n\n# 保存结果\n# 注意：若输出为浮点数组，需先裁剪并转换为 8 位整数再写盘\nImage.fromarray(np.uint8(np.clip(enhanced_img, 0, 255))).save(output_path)\n\nprint(f\"图像增强完成，已保存至 {output_path}\")\n```\n\n**关键参数说明：**\n*   `is_hdr=True`: 使用在 HDR 数据集上训练的无配对模型（推荐用于通用照片增强）。\n*   `is_hdr=False`: 使用在 MIT-Adobe 5K 数据集上训练的模型。\n\n### 3. 
查看结果\n打开生成的 `output.jpg`，您将看到经过 AI 增强后的照片，其色彩、对比度和光影效果应更接近专业修图风格。\n\n---\n*注：由于原代码被作者自述为“未打磨（not polished）”，在实际运行中若遇到路径错误或维度不匹配问题，建议检查 `TF.py` 中的路径配置并根据报错微调输入输出的形状处理逻辑。*","一位独立摄影师正在整理一批十年前用旧相机拍摄的旅行照片，准备举办一场线上回顾展，但原始文件普遍存在曝光不足、色彩灰暗和噪点明显的问题。\n\n### 没有 Deep-Photo-Enhancer 时\n- 必须手动在 Lightroom 中逐张调整曝光、对比度和色彩平衡，处理几百张照片需要耗费数天时间。\n- 由于缺乏成对的“原图 - 精修图”训练数据，传统监督学习模型无法直接应用，难以批量自动化修复。\n- 强行使用普通滤镜会导致画面失真，暗部细节丢失严重，且无法还原真实的光照氛围。\n- 聘请专业修图师成本高昂，对于个人创作者而言预算难以承受。\n\n### 使用 Deep-Photo-Enhancer 后\n- 利用其无配对学习（Unpaired Learning）特性，直接输入低质量原图即可批量生成具有专业级光影效果的照片，效率提升数十倍。\n- 基于 GAN 的生成能力智能补充暗部细节并去除噪点，同时保持图像自然纹理，避免了过度锐化或伪影。\n- 模型在 HDR 数据集上训练过，能自动识别并恢复高动态范围场景，让逆光或大光比照片重现层次感。\n- 无需昂贵的标注数据或人工干预，个人开发者即可在本地部署 TensorFlow 版本，零成本实现影院级画质增强。\n\nDeep-Photo-Enhancer 通过先进的无配对生成对抗网络技术，将原本耗时费力的专业修图工作转化为高效的自动化流程，让老旧照片瞬间焕发新生。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fnothinglo_Deep-Photo-Enhancer_b19a877b.png","nothinglo","Yu-Sheng Chen","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fnothinglo_61c1ad32.jpg","Hi, I am UglyMan.",null,"https:\u002F\u002Fnothinglo.github.io","https:\u002F\u002Fgithub.com\u002Fnothinglo",808,110,"2026-04-11T02:18:30","MIT",4,"未说明","需要 NVIDIA GPU（基于 TensorFlow 和 GAN 架构推断），具体型号和显存大小未说明，CUDA 版本未说明",{"notes":88,"python":85,"dependencies":89},"该项目基于 2018 年的论文，代码依赖非常陈旧的 TensorFlow 0.12 版本。README 明确提到代码未经过美化且包含许多不必要的部分。训练数据涉及 MIT-Adobe FiveK 数据集（需自行处理）和 Flickr HDR 图像（仅提供 ID）。由于框架版本过低，在现代环境中运行可能需要复杂的兼容性调整或代码重构。",[90],"tensorflow==0.12",[15],"2026-03-27T02:49:30.150509","2026-04-12T13:58:37.221988",[95,100,105,110,115,120,125],{"id":96,"question_zh":97,"answer_zh":98,"source_url":99},30745,"非配对双向 AWGAN-GP 训练中的自适应梯度方案细节是什么？","关于非配对双向 AWGAN-GP 训练中使用的自适应梯度方案（包括调整梯度惩罚 lambda 的区域 `[lambda_min, lambda_max]` 以及梯度惩罚平均移动量的计算方法），维护者表示难以用文字详细描述，建议直接查看其上传的临时代码以了解具体实现逻辑。","https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F1",{"id":101,"question_zh":102,"answer_zh":103,"source_url":104},30739,"如何对大于 512x512 的图像进行推理预测？","在推理过程中，只需将代码中的 
`FLAGS['data_image_size']` 设置为 `None` 即可支持大于训练尺寸（如 512）的图像输入。维护者已上传了支持该功能的推理模型和演示网站代码供参考。","https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F51",{"id":106,"question_zh":107,"answer_zh":108,"source_url":109},30740,"训练有监督模型时无法复现论文中的 PSNR 成绩（如 23.8），原因是什么？","性能差异主要源于数据集预处理方式的不同。维护者采用的方法是：将图像的长边缩放至 512 像素，使用填充（padding）使图像变为 512x512，并且在计算 PSNR 之前会先将图像裁剪到有效区域。此外，使用的修图师数据集（如 Retoucher C vs A）也会对结果产生显著影响。","https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F33",{"id":111,"question_zh":112,"answer_zh":113,"source_url":114},30741,"使用自己的数据集测试时，结果出现偏色（如变蓝或人脸变黄）怎么办？","这通常是由于训练数据集与测试数据集之间的域不匹配（domain mismatch）造成的（例如训练集中包含大量蓝天导致模型偏向蓝色）。解决方案包括：1. 添加更多属于自己的数据集对模型进行微调（finetune）；2. 使用维护者上传的用于演示网站的推理模型和代码，这些模型可能具有更好的泛化能力。","https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F41",{"id":116,"question_zh":117,"answer_zh":118,"source_url":119},30742,"训练时的超参数（学习率、Lambda、Alpha 等）是如何设置的？","对于有监督或相关训练设置：NetG 和 NetD 的学习率相同，前 75 个 epoch 为 1e-5，随后线性衰减至第 150 个 epoch 为零；初始 `lambda` 设为 10；`alpha` 参数设为 1000。数据集方面，使用 Lightroom 处理的图像即可。","https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F4",{"id":121,"question_zh":122,"answer_zh":123,"source_url":124},30743,"如何处理颜色空间问题？应该使用 ProPhoto RGB 还是 sRGB，以及如何转换？","虽然 Adobe 5K 数据集声称使用 ProPhoto RGB，但在实际训练中直接使用可能导致输出图像颜色异常。建议尝试直接从 Lightroom 导出为 sRGB 格式进行训练。如果必须从 ProPhoto RGB 转换为 sRGB，可以通过两步矩阵变换实现：先将 ProPhoto RGB 转换为 CIE XYZ，再将 CIE XYZ 转换为 sRGB（转换矩阵可参考 brucelindbloom.com 相关文档）。","https:\u002F\u002Fgithub.com\u002Fnothinglo\u002FDeep-Photo-Enhancer\u002Fissues\u002F39",{"id":126,"question_zh":127,"answer_zh":128,"source_url":99},30744,"有监督训练中使用的是什么损失函数？","在有监督训练模式下，使用的是均方误差（MSE）损失函数。",[]]
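针对上文 FAQ 中提到的 ProPhoto RGB → CIE XYZ → sRGB 两步矩阵变换，下面给出一个 NumPy 示意实现（非项目源码：函数名 `prophoto_to_srgb` 为本文说明而假设，矩阵数值假定取自 brucelindbloom.com 的参考表，ProPhoto 传递函数简化为 1.8 次幂并忽略其微小线性段）：

```python
import numpy as np

# 第一步矩阵：ProPhoto (ROMM) RGB -> CIE XYZ，参考白点 D50
# （数值为 brucelindbloom.com 参考表中的常见取值，属示意性假设）
M_PROPHOTO_TO_XYZ = np.array([
    [0.7976749, 0.1351917, 0.0313534],
    [0.2880402, 0.7118741, 0.0000857],
    [0.0000000, 0.0000000, 0.8252100],
])

# 第二步矩阵：CIE XYZ (D50) -> 线性 sRGB（已含 Bradford 色适应到 D65）
M_XYZ_D50_TO_SRGB = np.array([
    [ 3.1338561, -1.6168667, -0.4906146],
    [-0.9787684,  1.9161415,  0.0334540],
    [ 0.0719453, -0.2289914,  1.4052427],
])

def prophoto_to_srgb(img):
    """img: 形状为 (..., 3)、取值范围 [0, 1] 的浮点数组，ProPhoto 编码。"""
    # 解码 ProPhoto 传递函数（简化为 1.8 次幂）
    linear = np.power(np.clip(img, 0.0, 1.0), 1.8)
    # 第一步：ProPhoto RGB -> CIE XYZ
    xyz = linear @ M_PROPHOTO_TO_XYZ.T
    # 第二步：CIE XYZ -> 线性 sRGB，并裁剪掉出界色域
    rgb = np.clip(xyz @ M_XYZ_D50_TO_SRGB.T, 0.0, 1.0)
    # 编码 sRGB 分段传递函数
    return np.where(rgb <= 0.0031308,
                    12.92 * rgb,
                    1.055 * np.power(rgb, 1.0 / 2.4) - 0.055)
```

自检方式：白点 `prophoto_to_srgb(np.ones((1, 1, 3)))` 的结果应近似为全 1；批量转换训练图像时，直接对 (H, W, 3) 数组整体调用即可。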