[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-foamliu--Deep-Image-Matting-PyTorch":3,"tool-foamliu--Deep-Image-Matting-PyTorch":64},[4,18,26,35,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107888,2,"2026-04-06T11:32:50",[14,15,13],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":10,"last_commit_at":41,"category_tags":42,"status":17},4487,"LLMs-from-scratch","rasbt\u002FLLMs-from-scratch","LLMs-from-scratch 是一个基于 PyTorch 的开源教育项目，旨在引导用户从零开始一步步构建一个类似 ChatGPT 的大型语言模型（LLM）。它不仅是同名技术著作的官方代码库，更提供了一套完整的实践方案，涵盖模型开发、预训练及微调的全过程。\n\n该项目主要解决了大模型领域“黑盒化”的学习痛点。许多开发者虽能调用现成模型，却难以深入理解其内部架构与训练机制。通过亲手编写每一行核心代码，用户能够透彻掌握 Transformer 架构、注意力机制等关键原理，从而真正理解大模型是如何“思考”的。此外，项目还包含了加载大型预训练权重进行微调的代码，帮助用户将理论知识延伸至实际应用。\n\nLLMs-from-scratch 特别适合希望深入底层原理的 AI 开发者、研究人员以及计算机专业的学生。对于不满足于仅使用 API，而是渴望探究模型构建细节的技术人员而言，这是极佳的学习资源。其独特的技术亮点在于“循序渐进”的教学设计：将复杂的系统工程拆解为清晰的步骤，配合详细的图表与示例，让构建一个虽小但功能完备的大模型变得触手可及。无论你是想夯实理论基础，还是为未来研发更大规模的模型做准备",90106,"2026-04-06T11:19:32",[43,15,13,14],"语言模型",{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":10,"last_commit_at":50,"category_tags":51,"status":17},4292,"Deep-Live-Cam","hacksider\u002FDeep-Live-Cam","Deep-Live-Cam 是一款专注于实时换脸与视频生成的开源工具，用户仅需一张静态照片，即可通过“一键操作”实现摄像头画面的即时变脸或制作深度伪造视频。它有效解决了传统换脸技术流程繁琐、对硬件配置要求极高以及难以实时预览的痛点，让高质量的数字内容创作变得触手可及。\n\n这款工具不仅适合开发者和技术研究人员探索算法边界，更因其极简的操作逻辑（仅需三步：选脸、选摄像头、启动），广泛适用于普通用户、内容创作者、设计师及直播主播。无论是为了动画角色定制、服装展示模特替换，还是制作趣味短视频和直播互动，Deep-Live-Cam 都能提供流畅的支持。\n\n其核心技术亮点在于强大的实时处理能力，支持口型遮罩（Mouth 
Mask）以保留使用者原始的嘴部动作，确保表情自然精准；同时具备“人脸映射”功能，可同时对画面中的多个主体应用不同面孔。此外，项目内置了严格的内容安全过滤机制，自动拦截涉及裸露、暴力等不当素材，并倡导用户在获得授权及明确标注的前提下合规使用，体现了技术发展与伦理责任的平衡。",88924,"2026-04-06T03:28:53",[14,15,13,52],"视频",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":32,"last_commit_at":59,"category_tags":60,"status":17},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",85013,"2026-04-06T11:09:19",[15,16,52,61,13,62,43,14,63],"插件","其他","音频",{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":69,"readme_en":70,"readme_zh":71,"quickstart_zh":72,"use_case_zh":73,"hero_image_url":74,"owner_login":75,"owner_name":76,"owner_avatar_url":77,"owner_bio":78,"owner_company":79,"owner_location":80,"owner_email":81,"owner_twitter":79,"owner_website":82,"owner_url":83,"languages":84,"stars":89,"forks":90,"last_commit_at":91,"license":92,"difficulty_score":10,"env_os":93,"env_gpu":94,"env_ram":94,"env_deps":95,"category_tags":101,"github_topics":79,"view_count":32,"oss_zip_url":79,"oss_zip_packed_at":79,"status":17,"created_at":102,"updated_at":103,"faqs":104,"releases":140},5065,"foamliu\u002FDeep-Image-Matting-PyTorch","Deep-Image-Matting-PyTorch","Deep Image Matting implementation in PyTorch","Deep-Image-Matting-PyTorch 是一个基于 PyTorch 框架实现的深度学习图像抠图工具，旨在精准分离图像前景与背景。它主要解决了传统抠图方法在处理毛发、半透明物体等复杂边缘时效果不佳的难题，能够根据用户提供的粗略标记（Trimap）自动生成高质量的 Alpha 通道遮罩。\n\n这款工具特别适合计算机视觉研究人员、AI 开发者以及需要高质量素材处理流程的设计师使用。对于希望复现经典算法或进行二次开发的科研人员，它提供了完整的训练、测试及评估代码；对于开发者，其模块化的设计便于集成到更大的图像处理系统中。\n\n在技术实现上，Deep-Image-Matting-PyTorch 
对原始论文模型进行了重要优化：移除了参数量巨大且导致模型难以收敛的\"fc6\"全连接层，并引入了索引池化技术。这些改进不仅降低了训练难度，还提升了模型的稳定性。项目支持在 Composition-1k 等标准数据集上进行性能评估，并提供了详细的预处理、训练及可视化教程，帮助用户快速上手并验证实验结果。无论是学术研究还是工程实践，它都是一个高效可靠的开源选择。","# Deep Image Matting\n\nDeep Image Matting [paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F1703.03872) implementation in PyTorch.\n\n## Differences\n\n1. \"fc6\" is dropped.\n2. Indices pooling.\n\n\u003Cp>\"fc6\" is bulky: over 100 million parameters, which makes the model hard to converge. I guess that is why the model in the paper has to be trained stage-wise.\n\n## Performance\n- The Composition-1k testing dataset.\n- Evaluate with whole image.\n- SAD normalized by 1000.\n- Input image is normalized with mean=[0.485, 0.456, 0.406] and std=[0.229, 0.224, 0.225].\n- Both erosion and dilation are used to generate the trimap.\n\n|Models|SAD|MSE|Download|\n|---|---|---|---|\n|paper-stage0|59.6|0.019||\n|paper-stage1|54.6|0.017||\n|paper-stage3|50.4|0.014||\n|my-stage0|66.8|0.024|[Link](https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Freleases\u002Fdownload\u002Fv1.0\u002FBEST_checkpoint.tar)|\n\n## Dependencies\n\n- Python 3.5.2\n- PyTorch 1.1.0\n\n## Dataset\n### Adobe Deep Image Matting Dataset\nFollow the [instruction](https:\u002F\u002Fsites.google.com\u002Fview\u002Fdeepimagematting) to contact the author for the dataset.\n\n### MSCOCO\nGo to [MSCOCO](http:\u002F\u002Fcocodataset.org\u002F#download) to download:\n* [2014 Train images](http:\u002F\u002Fimages.cocodataset.org\u002Fzips\u002Ftrain2014.zip)\n\n\n### PASCAL VOC\nGo to [PASCAL VOC](http:\u002F\u002Fhost.robots.ox.ac.uk\u002Fpascal\u002FVOC\u002F) to download:\n* VOC challenge 2008 [training\u002Fvalidation data](http:\u002F\u002Fhost.robots.ox.ac.uk\u002Fpascal\u002FVOC\u002Fvoc2008\u002FVOCtrainval_14-Jul-2008.tar)\n* The test data for the VOC2008 challenge\n\n## Usage\n### Data Pre-processing\nExtract training images:\n```bash\n$ python pre_process.py\n```\n\n### Train\n```bash\n$ python 
train.py\n```\n\nIf you want to visualize during training, run in your terminal:\n```bash\n$ tensorboard --logdir runs\n```\n\n## Experimental results\n\n### The Composition-1k testing dataset\n\n1. Test:\n```bash\n$ python test.py\n```\n\nIt prints out average SAD and MSE errors when finished.\n\n### The alphamatting.com dataset\n\n1. Download the evaluation datasets: Go to the [Datasets page](http:\u002F\u002Fwww.alphamatting.com\u002Fdatasets.php) and download the evaluation datasets. Make sure you pick the low-resolution dataset.\n\n2. Extract evaluation images:\n```bash\n$ python extract.py\n```\n\n3. Evaluate:\n```bash\n$ python eval.py\n```\n\nClick to view whole images:\n\nImage | Trimap1 | Trimap2 | Trimap3|\n|---|---|---|---|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1269a7082dcf.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_6990e21f2b92.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_063a77616554.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0fb974669a95.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1269a7082dcf.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8509b426cf95.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7e6e3ccb2a7.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b583496fa679.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bc325f3a1b1c.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_039fb9a522dc.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_97dc96632157.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_6bae74463ef6.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bc325f3a1b1c.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0021c25a7d18.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d87ee846b2a8.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_40e19d25cb06.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a4b7d62ce05b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a7b64b0f8f65.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_91d9f8f2688e.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e8a3d7033a23.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a4b7d62ce05b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d77fc903669b.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c12cf13e84e1.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7a58a85173b4.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_86db06c42bbc.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0efd6e389d35.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1ec82996b3e8.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a42d57667113.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_86db06c42bbc.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c15f74580952.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d0219eba2581.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bd8cb30687dc.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7c7054a3b61.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_742d12b3dea5.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_9b94e7f6a5f8.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_711a7ad3b080.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7c7054a3b61.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d2a535f1e57b.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8335728acdc9.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8a3f4ddde38a.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_45cdb068c732.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d1d2fc0bca96.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_cfb280647071.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2bd07ce3c0c1.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_45cdb068c732.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0010f8ab356b.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b22a9cdae2e6.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a966236f9ac3.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1f90810e4d3b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2d5f04a33e9d.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8586810de598.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_5a85dc8a6110.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1f90810e4d3b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8819068d5864.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d10f5dc24bbc.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e68a17f56aac.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1c001652eefa.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_871b6fb3c3b5.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f64e1514c837.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a16c2a2fba3c.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1c001652eefa.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0545030b3334.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c8215830f1e5.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1aa6a8bb859d.png)|\n\n### Demo\nDownload pre-trained Deep Image Matting [Link](https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Freleases\u002Fdownload\u002Fv1.0\u002FBEST_checkpoint.tar) then run:\n```bash\n$ python demo.py\n```\n\nImage\u002FTrimap | Output\u002FGT | New BG\u002FCompose | \n|---|---|---|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_3593983a4e71.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a33111325a0a.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_25c5e21fe24d.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7edd6957de6.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8fda44f35a4e.png) | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a8b690ff8d50.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1321f96e6e41.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0792d1d99f75.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_013cceae9a82.png) | \n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_09ec201d48fe.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_72a4ca58fead.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_fd65bd503dc5.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_77c2871c4a3c.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d5c6af87beb6.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2de80e3f51dc.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0c10b9d4a143.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7713bd71f7b9.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d0d75d6a3e9a.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_9595571b3e27.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_20425363b5fd.png)   | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f491d81e76f0.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_4565a1921d64.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_186778b492fe.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_37cb5c19801c.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b96cbbb8e02f.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_34eea1753cc4.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_06c0b51bfcc5.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_387b965028b8.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_5d4662d478a9.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_cd2549499f95.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_af8cdb6b703c.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_11d699e0bd11.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_3ad9e2dc3bcd.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7358fb43dc07.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_071592373931.png) | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0396c502f252.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d71885f04871.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1084cc2b4e64.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_879032e2eb20.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7107e72d01e2.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_dfbe8273dd1a.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_db1a81af79f9.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_26abe7091057.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f04860d28b90.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b585163b3cf3.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e33e5e12220b.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bf9d3cb4c832.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_55bafe208500.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c463d5336b66.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f39c8a25c5bb.png)   | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a29e2111e1c0.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_95f9019e4497.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8b50a573fb96.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7132ff3373a7.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_9291c9219b70.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_5cfd7c9692ca.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_83c0839a998b.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c20b9f2ec633.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2740759da17c.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c29673be9115.png)|\n\n\n## 小小的赞助~\n\u003Cp align=\"center\">\n\t\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e3a9fcac2dcb.jpg\" alt=\"Sample\"  width=\"324\" height=\"504\">\n\t\u003Cp align=\"center\">\n\t\t\u003Cem>若对您有帮助可给予小小的赞助~\u003C\u002Fem>\n\t\u003C\u002Fp>\n\u003C\u002Fp>\n\u003Cbr\u002F>\u003Cbr\u002F>\u003Cbr\u002F>","# 深度图像抠图\n\n深度图像抠图 [论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F1703.03872) 的 PyTorch 实现。\n\n## 差异\n\n1. 去掉了 \"fc6\" 层。\n2. 
使用索引池化。\n“fc6” 层参数量巨大，超过一亿，导致模型难以收敛。我推测这也是为什么原论文中的模型需要分阶段训练的原因。\n\n## 性能\n- 使用 Composition-1k 测试数据集。\n- 对整张图像进行评估。\n- SAD 值已除以 1000 进行归一化。\n- 输入图像使用均值=[0.485, 0.456, 0.406] 和标准差=[0.229, 0.224, 0.225] 进行归一化。\n- 同时使用腐蚀和膨胀操作生成 trimap。\n\n|模型|SAD|MSE|下载|\n|---|---|---|---|\n|论文-stage0|59.6|0.019||\n|论文-stage1|54.6|0.017||\n|论文-stage3|50.4|0.014||\n|我的-stage0|66.8|0.024|[链接](https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Freleases\u002Fdownload\u002Fv1.0\u002FBEST_checkpoint.tar)|\n\n## 依赖\n- Python 3.5.2\n- PyTorch 1.1.0\n\n## 数据集\n### Adobe 深度图像抠图数据集\n请按照 [说明](https:\u002F\u002Fsites.google.com\u002Fview\u002Fdeepimagematting) 联系作者获取数据集。\n\n### MSCOCO\n前往 [MSCOCO](http:\u002F\u002Fcocodataset.org\u002F#download) 下载：\n* [2014 年训练图像](http:\u002F\u002Fimages.cocodataset.org\u002Fzips\u002Ftrain2014.zip)\n\n### PASCAL VOC\n前往 [PASCAL VOC](http:\u002F\u002Fhost.robots.ox.ac.uk\u002Fpascal\u002FVOC\u002F) 下载：\n* VOC 2008 挑战赛 [训练\u002F验证数据](http:\u002F\u002Fhost.robots.ox.ac.uk\u002Fpascal\u002FVOC\u002Fvoc2008\u002FVOCtrainval_14-Jul-2008.tar)\n* VOC 2008 挑战赛的测试数据\n\n## 使用方法\n### 数据预处理\n提取训练图像：\n```bash\n$ python pre_process.py\n```\n\n### 训练\n```bash\n$ python train.py\n```\n\n如果希望在训练过程中可视化，可以在终端运行：\n```bash\n$ tensorboard --logdir runs\n```\n\n## 实验结果\n\n### Composition-1k 测试数据集\n\n1. 测试：\n```bash\n$ python test.py\n```\n\n运行结束后会打印出平均 SAD 和 MSE 误差。\n\n### alphamatting.com 数据集\n\n1. 下载评估数据集：前往 [数据集页面](http:\u002F\u002Fwww.alphamatting.com\u002Fdatasets.php) 下载评估数据集，确保选择低分辨率版本。\n\n2. 提取评估图像：\n```bash\n$ python extract.py\n```\n\n3. 
评估：\n```bash\n$ python eval.py\n```\n\n点击查看完整图片：\n\n图片 | Trimap1 | Trimap2 | Trimap3|\n|---|---|---|---|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1269a7082dcf.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_6990e21f2b92.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_063a77616554.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0fb974669a95.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1269a7082dcf.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8509b426cf95.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7e6e3ccb2a7.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b583496fa679.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bc325f3a1b1c.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_039fb9a522dc.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_97dc96632157.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_6bae74463ef6.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bc325f3a1b1c.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0021c25a7d18.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d87ee846b2a8.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_40e19d25cb06.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a4b7d62ce05b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a7b64b0f8f65.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_91d9f8f2688e.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e8a3d7033a23.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a4b7d62ce05b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d77fc903669b.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c12cf13e84e1.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7a58a85173b4.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_86db06c42bbc.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0efd6e389d35.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1ec82996b3e8.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a42d57667113.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_86db06c42bbc.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c15f74580952.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d0219eba2581.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bd8cb30687dc.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7c7054a3b61.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_742d12b3dea5.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_9b94e7f6a5f8.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_711a7ad3b080.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7c7054a3b61.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d2a535f1e57b.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8335728acdc9.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8a3f4ddde38a.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_45cdb068c732.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d1d2fc0bca96.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_cfb280647071.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2bd07ce3c0c1.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_45cdb068c732.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0010f8ab356b.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b22a9cdae2e6.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a966236f9ac3.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1f90810e4d3b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2d5f04a33e9d.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8586810de598.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_5a85dc8a6110.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1f90810e4d3b.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8819068d5864.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d10f5dc24bbc.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e68a17f56aac.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1c001652eefa.png) |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_871b6fb3c3b5.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f64e1514c837.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a16c2a2fba3c.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1c001652eefa.png) 
|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0545030b3334.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c8215830f1e5.png)|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1aa6a8bb859d.png)|\n\n### 演示\n下载预训练的深度图像抠图模型 [链接](https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Freleases\u002Fdownload\u002Fv1.0\u002FBEST_checkpoint.tar)，然后运行：\n```bash\n$ python demo.py\n```\n\n图像\u002FTrimap | 输出\u002FGT | 新背景\u002F合成 | \n|---|---|---|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_3593983a4e71.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a33111325a0a.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_25c5e21fe24d.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b7edd6957de6.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8fda44f35a4e.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a8b690ff8d50.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1321f96e6e41.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0792d1d99f75.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_013cceae9a82.png) | \n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_09ec201d48fe.png) | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_72a4ca58fead.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_fd65bd503dc5.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_77c2871c4a3c.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d5c6af87beb6.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2de80e3f51dc.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0c10b9d4a143.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7713bd71f7b9.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d0d75d6a3e9a.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_9595571b3e27.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_20425363b5fd.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f491d81e76f0.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_4565a1921d64.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_186778b492fe.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_37cb5c19801c.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b96cbbb8e02f.png)  | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_34eea1753cc4.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_06c0b51bfcc5.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_387b965028b8.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_5d4662d478a9.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_cd2549499f95.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_af8cdb6b703c.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_11d699e0bd11.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_3ad9e2dc3bcd.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7358fb43dc07.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_071592373931.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_0396c502f252.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_d71885f04871.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_1084cc2b4e64.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_879032e2eb20.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7107e72d01e2.png) | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_dfbe8273dd1a.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_db1a81af79f9.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_26abe7091057.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f04860d28b90.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_b585163b3cf3.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e33e5e12220b.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_bf9d3cb4c832.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_55bafe208500.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c463d5336b66.png)  | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_f39c8a25c5bb.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_a29e2111e1c0.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_95f9019e4497.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_8b50a573fb96.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_7132ff3373a7.png)|\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_9291c9219b70.png)  | 
![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_5cfd7c9692ca.png)   | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_83c0839a998b.png) |\n|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c20b9f2ec633.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_2740759da17c.png) | ![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_c29673be9115.png)|\n\n\n## 小小的赞助~\n\u003Cp align=\"center\">\n\t\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_readme_e3a9fcac2dcb.jpg\" alt=\"Sample\"  width=\"324\" height=\"504\">\n\t\u003Cp align=\"center\">\n\t\t\u003Cem>若对您有帮助可给予小小的赞助~\u003C\u002Fem>\n\t\u003C\u002Fp>\n\u003C\u002Fp>\n\u003Cbr\u002F>\u003Cbr\u002F>\u003Cbr\u002F>","# Deep-Image-Matting-PyTorch 快速上手指南\n\nDeep Image Matting 是一个基于 PyTorch 实现的深度图像抠图工具，用于从背景中精确分离前景对象并生成 Alpha 通道。本指南将帮助你快速搭建环境并运行演示。\n\n## 1. 环境准备\n\n在开始之前，请确保你的系统满足以下要求：\n\n*   **操作系统**: Linux \u002F macOS \u002F Windows\n*   **Python**: 3.5.2 或更高版本（推荐 3.6+）\n*   **深度学习框架**: PyTorch 1.1.0 或兼容版本\n*   **可视化工具 (可选)**: TensorBoard (用于训练过程监控)\n\n**依赖安装建议：**\n国内开发者建议使用清华源或阿里源加速 PyTorch 及相关依赖的安装。\n\n```bash\n# 示例：使用 pip 安装基础依赖（请根据实际 CUDA 版本选择对应的 PyTorch 安装命令）\npip install torch==1.1.0 torchvision==0.3.0 -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\npip install tensorboard -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n## 2. 
安装步骤\n\n克隆项目代码并下载预训练模型即可开始使用，无需复杂的编译过程。\n\n### 2.1 克隆仓库\n```bash\ngit clone https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch.git\ncd Deep-Image-Matting-PyTorch\n```\n\n### 2.2 下载预训练模型\n为了直接体验效果，请下载作者提供的预训练权重文件 (`BEST_checkpoint.tar`)。注意：该文件虽以 .tar 为后缀，但并非标准 tar 归档，实际是 PyTorch 检查点文件，请勿尝试解压，应在代码中直接用 `torch.load('BEST_checkpoint.tar')` 加载（可参考 demo.py 中的加载逻辑）。\n\n**手动下载：**\n访问 [发布页面](https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Freleases\u002Fdownload\u002Fv1.0\u002FBEST_checkpoint.tar) 下载文件，并将其放置在项目根目录或代码指定的检查点路径下。\n\n**或使用命令行下载 (Linux\u002FmacOS):**\n```bash\nwget https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Freleases\u002Fdownload\u002Fv1.0\u002FBEST_checkpoint.tar\n```\n\n## 3. 基本使用\n\n### 3.1 运行演示 (Demo)\n下载好预训练模型后，运行 `demo.py` 即可对内置示例图片进行抠图处理，并生成合成结果。\n\n```bash\npython demo.py\n```\n\n若运行环境没有可用 GPU，需在加载检查点时指定 `map_location='cpu'`（例如 `torch.load(checkpoint_path, map_location='cpu')`），否则反序列化会因缺少 CUDA 设备而报错。此外，demo 会从背景图片目录随机采样 10 张图片，请确保目录中的背景图数量不少于 10 张。\n\n运行结束后，程序将在输出目录生成以下结果：\n*   **Alpha Matte**: 生成的透明度通道。\n*   **Composed Image**: 将抠出的前景与新背景合成的图像。\n\n### 3.2 训练自己的模型 (可选)\n如果你需要使用自定义数据集进行训练，请先准备数据（如 Adobe Deep Image Matting Dataset, MSCOCO, PASCAL VOC），然后执行以下步骤：\n\n1.  **数据预处理**:\n    ```bash\n    python pre_process.py\n    ```\n\n2.  **开始训练**:\n    ```bash\n    python train.py\n    ```\n\n3.  **监控训练过程 (可选)**:\n    在新终端窗口运行以下命令查看 Loss 曲线：\n    ```bash\n    tensorboard --logdir runs\n    ```\n\n### 3.3 模型评估 (可选)\n使用 Composition-1k 测试集评估模型性能：\n```bash\npython test.py\n```\n\n使用 alphamatting.com 数据集评估：\n```bash\n# 1. 提取评估图片\npython extract.py\n# 2. 
执行评估\npython eval.py\n```","某电商设计团队需要为数千张商品图快速更换背景，以适配不同节日的营销海报，但商品边缘包含复杂的毛发或透明材质。\n\n### 没有 Deep-Image-Matting-PyTorch 时\n- 设计师必须使用钢笔工具手动逐帧勾勒轮廓，处理一张带有毛绒玩具的商品图平均耗时 45 分钟以上。\n- 对于半透明婚纱或玻璃器皿，传统魔棒工具无法识别细微透明度，导致抠图边缘生硬、出现明显白边。\n- 面对海量图片需求，团队不得不外包部分工作，不仅成本高昂，且外包返回的素材质量参差不齐，返工率高。\n- 缺乏统一的自动化标准，不同设计师产出的抠图细节不一致，严重影响最终海报的视觉专业度。\n\n### 使用 Deep-Image-Matting-PyTorch 后\n- 只需提供简单的三色遮罩（Trimap），Deep-Image-Matting-PyTorch 即可在数秒内自动计算出高精度的 Alpha 通道，单图处理时间缩短至分钟级。\n- 基于深度学习的模型能精准捕捉发丝级细节和半透明区域，生成的边缘自然柔和，完美保留光影过渡，无需后期手动修补。\n- 团队可编写脚本批量调用该模型，一夜之间完成数千张商品图的自动化抠图，大幅降低人力与外包成本。\n- 算法输出结果稳定统一，确保了所有营销素材的边缘处理风格一致，显著提升了整体设计效率与产出质量。\n\nDeep-Image-Matting-PyTorch 将原本依赖人工经验的繁琐抠图工作转化为高效的自动化流程，彻底解决了复杂边缘图像处理的规模化难题。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Ffoamliu_Deep-Image-Matting-PyTorch_6e64bf7e.png","foamliu","Yang Liu","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Ffoamliu_11ac94fc.jpg","AGI researcher",null,"Shanghai","foamliu@yeah.net","https:\u002F\u002Ffoamliu.github.io\u002F","https:\u002F\u002Fgithub.com\u002Ffoamliu",[85],{"name":86,"color":87,"percentage":88},"Python","#3572A5",100,830,183,"2026-04-03T02:07:13","MIT","","未说明",{"notes":96,"python":97,"dependencies":98},"该项目基于较旧的 PyTorch 1.1.0 版本，现代环境可能需要调整依赖。数据集（Adobe Deep Image Matting）需联系作者获取，其他数据集（MSCOCO, PASCAL VOC）需手动下载。训练可视化需使用 TensorBoard。","3.5.2",[99,100],"PyTorch==1.1.0","TensorBoard",[15],"2026-03-27T02:49:30.150509","2026-04-07T22:50:56.722773",[105,110,115,120,125,130,135],{"id":106,"question_zh":107,"answer_zh":108,"source_url":109},23034,"下载的 BEST_checkpoint.tar 文件无法解压，提示不是 tar 归档格式或文件损坏，该怎么办？","该文件实际上并不是标准的 TAR 压缩包，而是一个 PyTorch 模型文件（.pth），只是后缀名被标记为了 .tar。请不要尝试使用 tar 命令或解压软件去解压它。在代码中加载时，应直接使用 `torch.load('BEST_checkpoint.tar')` 进行读取，具体用法请参考 demo.py 文件中的加载逻辑。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F37",{"id":111,"question_zh":112,"answer_zh":113,"source_url":114},23035,"运行 demo.py 时出现 \"ValueError: Sample larger than population\" 
错误，如何解决？","这个错误是因为背景图片数量不足。代码试图从背景图片列表中随机采样 10 张图片，但实际提供的背景图片数量少于 10 张。解决方法是准备更多的背景图片放入指定目录，确保背景图片总数大于或等于代码中设定的采样数量（默认为 10）。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F32",{"id":116,"question_zh":117,"answer_zh":118,"source_url":119},23036,"如何为自己的图片生成 Trimap（三分图）？有自动生成的脚本吗？","该项目本身没有提供直接从原图自动生成 Trimap 的独立脚本，因为模型高度依赖输入的 Trimap。生成 Trimap 通常需要对应的 Alpha 通道图。你可以参考项目中的 `data_gen.py` 文件（第 87 行附近）查看基于 Alpha 生成 Trimap 的逻辑。如果没有 Alpha 图，通常需要通过 Photoshop 手动绘制，或者先使用其他语义分割模型估算前景掩码来辅助生成。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F31",{"id":121,"question_zh":122,"answer_zh":123,"source_url":124},23037,"运行 pre_process.py 时报错 \"File not Found\"，即使我已经下载了 Adobe_Deep_Image_Matting_Dataset 并放在 Data 文件夹下，原因是什么？","预处理脚本需要多个数据集文件才能正常运行，仅有一个数据集是不够的。你需要下载并准备好以下四个文件：\n1. train2014.zip\n2. VOC2008test.tar\n3. VOCtrainval_14-Jul-2008.tar\n4. Adobe_Deep_Matting_Dataset.zip\n请确保这四个文件都下载完成并放置在正确的位置后再运行脚本。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F25",{"id":126,"question_zh":127,"answer_zh":128,"source_url":129},23038,"运行 demo.py 时遇到 \"RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False\" 错误，怎么处理？","这个错误表示保存的模型是在 CUDA (GPU) 环境下训练的，但当前运行环境没有检测到可用的 GPU（即 `torch.cuda.is_available()` 返回 False）。解决方法是在加载模型时指定 `map_location` 参数强制映射到 CPU。例如，将加载代码修改为：`checkpoint = torch.load(checkpoint_path, map_location='cpu')`。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F22",{"id":131,"question_zh":132,"answer_zh":133,"source_url":134},23039,"我想用自己的全身人像数据集（高分辨率，包含脚、腿等非头发区域）进行训练或测试，直接裁剪包含这些区域的 Trimap 可以吗？","可以输入这些图像，但需要注意该模型对 Trimap 的依赖性非常高。CNN 模型通常不是在原始全尺寸图像上训练的，建议将高分辨率图像（如 4000*4000）裁剪或调整到合适的尺寸（如 2k）再进行训练或推理。只要 Trimap 准确标注了前景、背景和未知区域，即使包含脚、腿等身体部位，模型也能处理，但务必保证 Trimap 
的质量。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F38",{"id":136,"question_zh":137,"answer_zh":138,"source_url":139},23040,"在哪里可以找到预训练模型文件 BEST_checkpoint.tar？","预训练模型通常不在代码仓库的直接文件列表中，而是通过外部链接提供。请查看项目的 README 文档或 Issue 讨论区（如 Issue #19 中的回复），作者通常会提供百度网盘、Google Drive 或其他云存储的下载链接。如果链接失效，可以尝试在 Issues 中留言询问维护者或其他用户是否有备份。","https:\u002F\u002Fgithub.com\u002Ffoamliu\u002FDeep-Image-Matting-PyTorch\u002Fissues\u002F19",[141],{"id":142,"version":143,"summary_zh":79,"released_at":144},136771,"v1.0","2019-07-17T03:28:44"]