[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-pytorchbearer--torchbearer":3,"tool-pytorchbearer--torchbearer":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 
是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,"2026-04-04T04:44:48",[15,14,13,26,54],{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":69,"readme_en":70,"readme_zh":71,"quickstart_zh":72,"use_case_zh":73,"hero_image_url":74,"owner_login":75,"owner_name":75,"owner_avatar_url":76,"owner_bio":77,"owner_company":78,"owner_location":78,"owner_email":78,"owner_twitter":78,"owner_website":78,"owner_url":79,"languages":80,"stars":85,"forks":86,"last_commit_at":87,"license":88,"difficulty_score":89,"env_os":90,"env_gpu":90,"env_ram":90,"env_deps":91,"category_tags":96,"github_topics":97,"view_count":10,"oss_zip_url":78,"oss_zip_packed_at":78,"status":16,"created_at":104,"updated_at":105,"faqs":106,"releases":135},694,"pytorchbearer\u002Ftorchbearer","torchbearer","torchbearer: A model fitting library for PyTorch","torchbearer 是一款专为 PyTorch 打造的模型训练库，旨在简化深度学习项目的开发流程。它通过封装底层逻辑，大幅减少了训练过程中所需的样板代码，同时保留了 PyTorch 框架的灵活性与开放性。对于从事深度学习和可微编程的研究人员及开发者而言，torchbearer 能显著提升编码效率，让你专注于模型设计而非重复性代码。其独特的回调机制允许用户轻松定制训练步骤，并提供丰富的可视化支持，让实验过程更加透明可控。值得注意的是，torchbearer 目前已宣布将重心转移至 PyTorch Lightning，未来不再规划新功能，但仍会持续修复漏洞并适配新版本的 PyTorch。如果你正在寻找成熟的训练框架，也可以关注其继任者方案。","**Note:**\nWe're moving to PyTorch Lightning! Read about the move [here](https:\u002F\u002Fmedium.com\u002Fpytorch\u002Fpytorch-frameworks-unite-torchbearer-joins-pytorch-lightning-c588e1e68c98). From the end of February, torchbearer will no longer be actively maintained. We'll continue to fix bugs when they are found and ensure that torchbearer runs on new versions of pytorch. 
However, we won't plan or implement any new functionality (if there's something you'd like to see in a training library, consider creating an issue on [PyTorch Lightning](https:\u002F\u002Fgithub.com\u002FPyTorchLightning\u002Fpytorch-lightning)).\n\n\u003Cimg alt=\"logo\" src=\"https:\u002F\u002Fraw.githubusercontent.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fmaster\u002Fdocs\u002F_static\u002Fimg\u002Flogo_dark_text.svg?sanitize=true\" width=\"100%\"\u002F>\n\n[![PyPI version](https:\u002F\u002Fbadge.fury.io\u002Fpy\u002Ftorchbearer.svg)](https:\u002F\u002Fbadge.fury.io\u002Fpy\u002Ftorchbearer) [![Python 2.7 | 3.5 | 3.6 | 3.7](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-2.7%20%7C%203.5%20%7C%203.6%20%7C%203.7-brightgreen.svg)](https:\u002F\u002Fwww.python.org\u002F) [![PyTorch 1.0.0 | 1.1.0 | 1.2.0 | 1.3.0 | 1.4.0](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpytorch-1.0.0%20%7C%201.1.0%20%7C%201.2.0%20%7C%201.3.0%20%7C%201.4.0-brightgreen.svg)](https:\u002F\u002Fpytorch.org\u002F) [![Build Status](https:\u002F\u002Ftravis-ci.com\u002Fpytorchbearer\u002Ftorchbearer.svg?branch=master)](https:\u002F\u002Ftravis-ci.com\u002Fpytorchbearer\u002Ftorchbearer) [![codecov](https:\u002F\u002Fcodecov.io\u002Fgh\u002Fpytorchbearer\u002Ftorchbearer\u002Fbranch\u002Fmaster\u002Fgraph\u002Fbadge.svg)](https:\u002F\u002Fcodecov.io\u002Fgh\u002Fpytorchbearer\u002Ftorchbearer) [![Documentation Status](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_13d664e1afd7.png)](https:\u002F\u002Ftorchbearer.readthedocs.io\u002Fen\u002Flatest\u002F?badge=latest) [![Downloads](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_2c20201a83a0.png)](https:\u002F\u002Fpepy.tech\u002Fproject\u002Ftorchbearer)\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"http:\u002F\u002Fpytorchbearer.org\">Website\u003C\u002Fa> •\n  \u003Ca 
href=\"https:\u002F\u002Ftorchbearer.readthedocs.io\u002Fen\u002Flatest\u002F\">Docs\u003C\u002Fa> •\n  \u003Ca href=\"#examples\">Examples\u003C\u002Fa> •\n  \u003Ca href=\"#install\">Install\u003C\u002Fa> •\n  \u003Ca href=\"#citing\">Citing\u003C\u002Fa> •\n  \u003Ca href=\"#related\">Related\u003C\u002Fa>\n\u003C\u002Fp>\n\n\u003Ca id=\"about\">\u003C\u002Fa>\n\nA PyTorch model fitting library designed for use by researchers (or anyone really) working in deep learning or differentiable programming. Specifically, we aim to dramatically reduce the amount of boilerplate code you need to write without limiting the functionality and openness of PyTorch.\n\n\u003Ca id=\"examples\">\u003C\u002Fa>\n\n## Examples\n\n\u003Ca id=\"general\">\u003C\u002Fa>\n\n### General\n\n\u003Ctable>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" width=\"160\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_8120f2f1b18a.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Quickstart:\u003C\u002Fb> Get up and running with torchbearer, training a simple CNN on CIFAR-10.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\" width=\"80\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb\">\n                \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_d1dfbcc2a432.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Callbacks:\u003C\u002Fb> A detailed exploration of callbacks in torchbearer, with some useful visualisations.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcallbacks.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcallbacks.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    
\u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcallbacks.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_dcabba14b100.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Imaging:\u003C\u002Fb> A detailed exploration of the imaging sub-package in torchbearer, useful for showing visualisations during training.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fimaging.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fimaging.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fimaging.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>Serialization:\u003C\u002Fb> This guide gives an introduction to serializing and restarting training in torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fserialization.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fserialization.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fserialization.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        
\u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>History and Replay:\u003C\u002Fb> This guide gives an introduction to the history returned by a trial and the ability to replay training.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fhistory.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fhistory.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fhistory.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>Custom Data Loaders:\u003C\u002Fb> This guide gives an introduction on how to run custom data loaders in torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcustom_loaders.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcustom_loaders.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcustom_loaders.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>Data Parallel:\u003C\u002Fb> This guide gives an introduction to using torchbearer with DataParallel.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fdata_parallel.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n   
 \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fdata_parallel.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fdata_parallel.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_eefcbc254e28.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>LiveLossPlot:\u003C\u002Fb> A demonstration of the LiveLossPlot callback included in torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Flivelossplot.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Flivelossplot.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Flivelossplot.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_f82c0bf37f19.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>PyCM:\u003C\u002Fb> A demonstration of the PyCM callback included in torchbearer for generating confusion matrices.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fpycm.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fpycm.ipynb\">\n                \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fpycm.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>NVIDIA Apex:\u003C\u002Fb> A guide showing how to perform half and mixed precision training in torchbearer with NVIDIA Apex.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fapex_torchbearer.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fapex_torchbearer.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fapex_torchbearer.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ca id=\"deep\">\u003C\u002Fa>\n\n### Deep Learning\n\n\u003Ctable>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" width=\"160\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_b7ab15147544.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Training a VAE:\u003C\u002Fb> A demonstration of how to train (and do a simple visualisation of) a Variational Auto-Encoder (VAE) on MNIST with torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\" width=\"80\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fvae.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fvae.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fvae.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_9c38c73d8fe9.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Training a GAN:\u003C\u002Fb> A demonstration of how to train (and do a simple visualisation of) a Generative Adversarial Network (GAN) on MNIST with torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fgan.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fgan.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fgan.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_7fd16958a1d2.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Generating Adversarial Examples:\u003C\u002Fb> A demonstration of how to perform a simple adversarial attack with torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fadversarial.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fadversarial.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fadversarial.ipynb\">\n  
              \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_ad670798c241.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Transfer Learning with Torchbearer:\u003C\u002Fb> A demonstration of how to perform transfer learning on STL10 with torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Ftransfer_learning.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Ftransfer_learning.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Ftransfer_learning.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n        
    \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_f7ad9cabc68e.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Regularisers in Torchbearer:\u003C\u002Fb> A demonstration of how to use all of the built-in regularisers in torchbearer (Mixup, CutOut, CutMix, Random Erase, Label Smoothing and Sample Pairing).\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fregularisers.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n        \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fregularisers.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fregularisers.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd 
rowspan=\"3\" colspan=\"2\">\n            \u003Cb>Manifold Mixup:\u003C\u002Fb> A demonstration of how to use the Manifold Mixup callback in Torchbearer.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fmanifold_mixup.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fmanifold_mixup.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fmanifold_mixup.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_efa088eeb98b.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Class Appearance Model:\u003C\u002Fb> A demonstration of the Class Appearance Model (CAM) callback in torchbearer.\n        \u003C\u002Ftd>\n   
     \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcam.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcam.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcam.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ca id=\"diff\">\u003C\u002Fa>\n\n### Differentiable Programming\n\n\u003Ctable>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" width=\"160\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_56b42ea143d9.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Optimising Functions:\u003C\u002Fb> An example (and some fun visualisations) showing how torchbearer can be used for the purpose of optimising functions with respect to their parameters using gradient descent.\n        \u003C\u002Ftd>\n  
      \u003Ctd align=\"center\" width=\"80\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fbasic_opt.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fbasic_opt.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fbasic_opt.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_45984acec35b.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Linear SVM:\u003C\u002Fb> Train a linear support vector machine (SVM) using torchbearer, with an interactive visualisation!\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fsvm_linear.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fsvm_linear.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fsvm_linear.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_7fccdb90a935.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Breaking Adam:\u003C\u002Fb> The Adam optimiser doesn't always converge, in this example we reimplement some of the function optimisations from the AMSGrad paper showing this empirically.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Famsgrad.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Famsgrad.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Famsgrad.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ca id=\"installation\">\u003C\u002Fa>\n\n## Install\n\nThe easiest way to install torchbearer is with pip:\n\n`pip install torchbearer`\n\nAlternatively, build from source with:\n\n`pip install git+https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer`\n\n\u003Ca id=\"citing\">\u003C\u002Fa>\n\n## Citing Torchbearer\n\nIf you find that torchbearer is useful to your research then please consider citing our preprint: [Torchbearer: A Model Fitting Library for PyTorch](https:\u002F\u002Farxiv.org\u002Fabs\u002F1809.03363), with the following BibTeX entry:\n\n```\n@article{torchbearer2018,\n  author = {Ethan Harris and Matthew Painter and 
Jonathon Hare},\n  title = {Torchbearer: A Model Fitting Library for PyTorch},\n  journal = {arXiv preprint arXiv:1809.03363},\n  year = {2018}\n}\n```\n\n\u003Ca id=\"related\">\u003C\u002Fa>\n\n## Related\n\nTorchbearer isn't the only library for training PyTorch models. Here are a few others that might better suit your needs (this is by no means a complete list; see the [awesome pytorch list](https:\u002F\u002Fgithub.com\u002Fbharathgs\u002FAwesome-pytorch-list) or [the incredible pytorch](https:\u002F\u002Fgithub.com\u002Fritchieng\u002Fthe-incredible-pytorch) for more):\n- [skorch](https:\u002F\u002Fgithub.com\u002Fdnouri\u002Fskorch), a model wrapper that enables use with scikit-learn - cross-validation etc. can be very useful\n- [PyToune](https:\u002F\u002Fgithub.com\u002FGRAAL-Research\u002Fpytoune), a simple Keras-style API\n- [ignite](https:\u002F\u002Fgithub.com\u002Fpytorch\u002Fignite), advanced model training from the makers of PyTorch, but can require a lot of code for advanced functionality (e.g. 
Tensorboard)\n- [TorchNetTwo (TNT)](https:\u002F\u002Fgithub.com\u002Fpytorch\u002Ftnt), can be complex to use but well established, somewhat replaced by ignite\n- [Inferno](https:\u002F\u002Fgithub.com\u002Finferno-pytorch\u002Finferno), training utilities and convenience classes for PyTorch\n- [Pytorch Lightning](https:\u002F\u002Fgithub.com\u002FwilliamFalcon\u002Fpytorch-lightning), a lightweight wrapper on top of PyTorch with advanced multi-GPU and cluster support\n- [Pywick](https:\u002F\u002Fgithub.com\u002Fachaiah\u002Fpywick), a high-level training framework based on torchsample, with support for various segmentation models\n","**NOTE:**\nWe are moving to PyTorch Lightning! Read about the move [here](https:\u002F\u002Fmedium.com\u002Fpytorch\u002Fpytorch-frameworks-unite-torchbearer-joins-pytorch-lightning-c588e1e68c98). From the end of February, torchbearer will no longer be actively maintained. We will continue to fix any bugs that are found and to make sure torchbearer runs on new versions of PyTorch. However, we will not plan or implement any new functionality (if there is something you would like to see in a training library, consider opening an issue on [PyTorch Lightning](https:\u002F\u002Fgithub.com\u002FPyTorchLightning\u002Fpytorch-lightning)).\n\n\u003Cimg alt=\"logo\" src=\"https:\u002F\u002Fraw.githubusercontent.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fmaster\u002Fdocs\u002F_static\u002Fimg\u002Flogo_dark_text.svg?sanitize=true\" width=\"100%\"\u002F>\n\n[![PyPI version](https:\u002F\u002Fbadge.fury.io\u002Fpy\u002Ftorchbearer.svg)](https:\u002F\u002Fbadge.fury.io\u002Fpy\u002Ftorchbearer) [![Python 2.7 | 3.5 | 3.6 | 3.7](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-2.7%20%7C%203.5%20%7C%203.6%20%7C%203.7-brightgreen.svg)](https:\u002F\u002Fwww.python.org\u002F) [![PyTorch 1.0.0 | 1.1.0 | 1.2.0 | 1.3.0 | 1.4.0](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpytorch-1.0.0%20%7C%201.1.0%20%7C%201.2.0%20%7C%201.3.0%20%7C%201.4.0-brightgreen.svg)](https:\u002F\u002Fpytorch.org\u002F) [![Build Status](https:\u002F\u002Ftravis-ci.com\u002Fpytorchbearer\u002Ftorchbearer.svg?branch=master)](https:\u002F\u002Ftravis-ci.com\u002Fpytorchbearer\u002Ftorchbearer) 
[![codecov](https:\u002F\u002Fcodecov.io\u002Fgh\u002Fpytorchbearer\u002Ftorchbearer\u002Fbranch\u002Fmaster\u002Fgraph\u002Fbadge.svg)](https:\u002F\u002Fcodecov.io\u002Fgh\u002Fpytorchbearer\u002Ftorchbearer) [![Documentation Status](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_13d664e1afd7.png)](https:\u002F\u002Ftorchbearer.readthedocs.io\u002Fen\u002Flatest\u002F?badge=latest) [![Downloads](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_2c20201a83a0.png)](https:\u002F\u002Fpepy.tech\u002Fproject\u002Ftorchbearer)\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"http:\u002F\u002Fpytorchbearer.org\">Website\u003C\u002Fa> •\n  \u003Ca href=\"https:\u002F\u002Ftorchbearer.readthedocs.io\u002Fen\u002Flatest\u002F\">Docs\u003C\u002Fa> •\n  \u003Ca href=\"#examples\">Examples\u003C\u002Fa> •\n  \u003Ca href=\"#install\">Install\u003C\u002Fa> •\n  \u003Ca href=\"#citing\">Citing\u003C\u002Fa> •\n  \u003Ca href=\"#related\">Related\u003C\u002Fa>\n\u003C\u002Fp>\n\n\u003Ca id=\"about\">\u003C\u002Fa>\n\nA model fitting library for PyTorch, designed for researchers (or anyone really) working in deep learning or differentiable programming. Specifically, we aim to dramatically reduce the amount of boilerplate code you need to write, without limiting the functionality and openness of PyTorch.\n\n\u003Ca id=\"examples\">\u003C\u002Fa>\n\n## Examples\n\n\u003Ca id=\"general\">\u003C\u002Fa>\n\n### General\n\n\u003Ctable>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" width=\"160\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_8120f2f1b18a.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>Quickstart:\u003C\u002Fb> Get up and running with torchbearer, training a simple CNN on CIFAR-10.\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\" width=\"80\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb\">\n          
      \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_d1dfbcc2a432.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>回调函数：\u003C\u002Fb> 深入探索 torchbearer 中的回调函数，包含一些有用的可视化效果。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcallbacks.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n     
       \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcallbacks.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcallbacks.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_dcabba14b100.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>图像可视化：\u003C\u002Fb> 深入探索 torchbearer 中的 imaging 子包，用于在训练期间显示可视化内容。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fimaging.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fimaging.ipynb\">\n                \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fimaging.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>序列化：\u003C\u002Fb> 本指南介绍了如何在 torchbearer 中序列化和重启训练。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fserialization.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fserialization.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fserialization.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>历史记录与重放：\u003C\u002Fb> 本指南介绍了由 trial 返回的历史记录以及重放训练的功能。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fhistory.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fhistory.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fhistory.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" 
colspan=\"2\">\n            \u003Cb>自定义数据加载器：\u003C\u002Fb> 本指南介绍了如何在 torchbearer 中运行自定义数据加载器。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcustom_loaders.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcustom_loaders.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcustom_loaders.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>数据并行：\u003C\u002Fb> 本指南介绍了如何在 torchbearer 中使用 DataParallel。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fdata_parallel.ipynb\">\n                \u003Cimg 
src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fdata_parallel.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fdata_parallel.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_eefcbc254e28.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>LiveLossPlot：\u003C\u002Fb> 展示 torchbearer 中包含的 LiveLossPlot 回调功能。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Flivelossplot.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n   
         \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Flivelossplot.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Flivelossplot.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_f82c0bf37f19.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>PyCM：\u003C\u002Fb> 展示 torchbearer 中包含的 PyCM 回调功能，用于生成混淆矩阵。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fpycm.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fpycm.ipynb\">\n                \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fpycm.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>NVIDIA Apex：\u003C\u002Fb> 本指南展示如何使用 NVIDIA Apex 在 torchbearer 中进行半精度和混合精度训练。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fapex_torchbearer.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fapex_torchbearer.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fapex_torchbearer.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ca id=\"deep\">\u003C\u002Fa>\n\n\n\n### 深度学习\n\n\u003Ctable>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" width=\"160\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_b7ab15147544.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>训练 VAE：\u003C\u002Fb> 演示如何使用 torchbearer 在 MNIST 上训练（并进行简单可视化）变分自编码器 (VAE)。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\" width=\"80\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fvae.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fvae.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fvae.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_9c38c73d8fe9.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>训练 GAN：\u003C\u002Fb> 演示如何使用 torchbearer 在 MNIST 上训练（并进行简单可视化）生成对抗网络 (GAN)。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fgan.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fgan.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fgan.ipynb\">\n                \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_7fd16958a1d2.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>生成对抗样本：\u003C\u002Fb> 演示如何使用 torchbearer 执行简单的对抗攻击。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fadversarial.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fadversarial.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fadversarial.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n     
       \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_ad670798c241.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>使用 Torchbearer 进行迁移学习：\u003C\u002Fb> 演示如何使用 torchbearer 在 STL10 上进行迁移学习。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Ftransfer_learning.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Ftransfer_learning.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Ftransfer_learning.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_f7ad9cabc68e.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n         
   \u003Cb>Torchbearer 中的正则化器：\u003C\u002Fb> 演示如何使用 torchbearer 中所有的内置正则化器（Mixup、CutOut、CutMix、Random Erase、Label Smoothing 和 Sample Pairing）。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fregularisers.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n        \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fregularisers.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fregularisers.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" colspan=\"2\">\n            \u003Cb>流形 Mixup：\u003C\u002Fb> 演示如何在 Torchbearer 中使用 Manifold Mixup 回调函数。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fmanifold_mixup.ipynb\">\n                
\u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fmanifold_mixup.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fmanifold_mixup.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_efa088eeb98b.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>类别外观模型：\u003C\u002Fb> 演示 torchbearer 中的类别外观模型 (CAM) 回调函数。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcam.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n         
   \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcam.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fcam.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ca id=\"diff\">\u003C\u002Fa>\n\n\n\n### 可微编程\n\n\u003Ctable>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\" width=\"160\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_56b42ea143d9.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>函数优化：\u003C\u002Fb> 一个示例（以及一些有趣的可视化），展示如何使用 torchbearer 通过梯度下降法针对参数优化函数。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\" width=\"80\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fbasic_opt.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca 
href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fbasic_opt.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fbasic_opt.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_45984acec35b.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>线性 SVM：\u003C\u002Fb> 使用 torchbearer 训练线性支持向量机（SVM），并带有交互式可视化！\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fsvm_linear.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fsvm_linear.ipynb\">\n                \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fsvm_linear.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd rowspan=\"3\">\n            \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_7fccdb90a935.jpg\" width=\"256\">\n        \u003C\u002Ftd>    \n        \u003Ctd rowspan=\"3\">\n            \u003Cb>打破 Adam：\u003C\u002Fb> Adam 优化器并不总是收敛，在本例中，我们重新实现了 AMSGrad 论文中的一些函数优化方法，从实证角度展示了这一点。\n        \u003C\u002Ftd>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fnbviewer.jupyter.org\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Famsgrad.ipynb\">\n                \u003Cimg src=\"http:\u002F\u002Fwww.pytorchbearer.org\u002Fassets\u002Fimg\u002Fnbviewer_logo.svg\" height=\"34\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        \u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Famsgrad.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_438c17272c5f.png\" height=\"32\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n    \u003Ctr>\n        
\u003Ctd align=\"center\">\n            \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Famsgrad.ipynb\">\n                \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_readme_5a89a23c2924.png\" height=\"28\">\n            \u003C\u002Fa>\n        \u003C\u002Ftd>\n    \u003C\u002Ftr>\n\u003C\u002Ftable>\n\n\u003Ca id=\"installation\">\u003C\u002Fa>\n\n## 安装\n\n使用 pip 安装 torchbearer 是最简单的方法：\n\n`pip install torchbearer`\n\n或者，通过以下方式从源码构建：\n\n`pip install git+https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer`\n\n\u003Ca id=\"citing\">\u003C\u002Fa>\n\n## 引用 Torchbearer\n\n如果您发现 torchbearer 对您的研究有用，请考虑引用我们的预印本：[Torchbearer: A Model Fitting Library for PyTorch](https:\u002F\u002Farxiv.org\u002Fabs\u002F1809.03363)，并使用以下 BibTeX 条目：\n\n```\n@article{torchbearer2018,\n  author = {Ethan Harris and Matthew Painter and Jonathon Hare},\n  title = {Torchbearer: A Model Fitting Library for PyTorch},\n  journal  = {arXiv preprint arXiv:1809.03363},\n  year = {2018}\n}\n```\n\n\u003Ca id=\"related\">\u003C\u002Fa>\n\n## 相关资源\n\nTorchbearer 并非训练 PyTorch 模型的唯一库。以下是其他一些可能更适合您需求的库（这绝非完整列表，更多请参考 [awesome pytorch list](https:\u002F\u002Fgithub.com\u002Fbharathgs\u002FAwesome-pytorch-list) 或 [the incredible pytorch](https:\u002F\u002Fgithub.com\u002Fritchieng\u002Fthe-incredible-pytorch)）：\n- [skorch](https:\u002F\u002Fgithub.com\u002Fdnouri\u002Fskorch)，模型包装器，支持与 scikit-learn 配合使用 - 交叉验证（crossval）等非常有用\n- [PyToune](https:\u002F\u002Fgithub.com\u002FGRAAL-Research\u002Fpytoune)，简单的 Keras 风格 API\n- [ignite](https:\u002F\u002Fgithub.com\u002Fpytorch\u002Fignite)，PyTorch 开发者提供的先进模型训练工具，高级功能（如 Tensorboard）可能需要较多代码\n- [TorchNetTwo (TNT)](https:\u002F\u002Fgithub.com\u002Fpytorch\u002Ftnt)，使用可能较复杂但已成熟，某种程度上已被 ignite 取代\n- [Inferno](https:\u002F\u002Fgithub.com\u002Finferno-pytorch\u002Finferno)，PyTorch 
的训练工具和便利类   \n- [Pytorch Lightning](https:\u002F\u002Fgithub.com\u002FwilliamFalcon\u002Fpytorch-lightning)，基于 PyTorch 的轻量级包装器，支持高级多 GPU 和集群功能\n- [Pywick](https:\u002F\u002Fgithub.com\u002Fachaiah\u002Fpywick)，高层训练框架，基于 torchsample，支持各种分割模型","# torchbearer 快速上手指南\n\n> ⚠️ **重要提示**：该项目已迁移至 **PyTorch Lightning**。从 2 月底开始，torchbearer 将不再积极维护新功能（仅修复 Bug）。如果您需要新的训练功能，建议在 [PyTorch Lightning](https:\u002F\u002Fgithub.com\u002FPyTorchLightning\u002Fpytorch-lightning) 上创建 Issue 或直接使用其替代方案。\n\n## 1. 环境准备\n\n本工具专为深度学习及可微编程研究人员设计，旨在减少 PyTorch 的样板代码。请确保满足以下版本要求：\n\n*   **Python**: 2.7 或 3.5 - 3.7\n*   **PyTorch**: 1.0.0 - 1.4.0\n*   **依赖**: 需预先安装 PyTorch 及其对应版本的 CUDA 支持（如适用）。\n\n## 2. 安装步骤\n\n推荐使用国内镜像源加速下载：\n\n```bash\npip install torchbearer -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n若需指定特定版本的 PyTorch，请先通过 `pip install torch==1.x.x` 安装对应版本。\n\n## 3. 基本使用\n\ntorchbearer 的核心概念是 `Trial`，它封装了训练循环、回调和指标计算。以下是一个最简化的使用流程示例（指标按内置名称 'acc' 和 'loss' 传入，数据通过 `with_generators` 关联）：\n\n```python\nimport torch.nn as nn\nimport torch.optim as optim\nfrom torchbearer import Trial\n\n# 1. 定义模型和数据加载器\nmodel = nn.Sequential(...)\noptimizer = optim.Adam(model.parameters())\nloss_fn = nn.CrossEntropyLoss()\ntrain_loader = ...\n\n# 2. 初始化 Trial 并关联训练数据\ntrial = Trial(model, optimizer, loss_fn, metrics=['acc', 'loss'])\ntrial.with_generators(train_generator=train_loader)\n\n# 3. 
运行训练\ntrial.run(epochs=10)\n```\n\n### 更多示例\n完整的快速入门代码（包含 CIFAR-10 数据集上的 CNN 训练）请参考官方 Notebook：\n*   [Quickstart Notebook (GitHub)](https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb)\n*   [Quickstart Notebook (Colab)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fpytorchbearer\u002Ftorchbearer\u002Fblob\u002Fmaster\u002Fdocs\u002F_static\u002Fnotebooks\u002Fquickstart.ipynb)\n\n其他高级功能（如 Callbacks、Imaging、History 等）详见文档链接：[ReadTheDocs](https:\u002F\u002Ftorchbearer.readthedocs.io\u002Fen\u002Flatest\u002F)","某计算机视觉团队正在构建一个医疗影像诊断模型，需要快速验证不同网络结构的收敛效果。\n\n### 没有 torchbearer 时\n- 每次修改网络结构都要重写数百行训练循环，包含繁琐的数据预处理、梯度清零及设备调度逻辑。\n- 手动记录训练集和验证集的 Loss 及 Accuracy，不仅容易出错，也无法实时对比历史实验结果。\n- 实现模型断点续训或根据验证集表现自动停止训练时，代码逻辑复杂且难以维护，易引入 Bug。\n- 缺乏统一的接口来管理训练状态，团队协作时不同成员的代码风格差异巨大，增加沟通成本。\n\n### 使用 torchbearer 后\n- torchbearer 将训练流程抽象为简洁的迭代器，只需传入模型和数据集即可开始训练，极大简化代码量。\n- 内建 Metrics 系统自动追踪各项指标，支持直接输出图表，省去手动日志记录和文件管理的麻烦。\n- 利用 Callbacks 机制轻松添加早停、权重保存等功能，代码模块化程度高，便于团队成员共享复用。\n- 自动处理设备迁移与状态同步，确保多卡训练时的稳定性，显著降低环境配置和维护成本。\n\n通过消除重复性工程代码，torchbearer 让研究人员能将精力完全集中在算法优化上。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fpytorchbearer_torchbearer_d24b8662.png","pytorchbearer","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fpytorchbearer_d2398710.png","The torchbearer project, by @ecs-vlc",null,"https:\u002F\u002Fgithub.com\u002Fpytorchbearer",[81],{"name":82,"color":83,"percentage":84},"Python","#3572A5",100,641,63,"2026-01-08T08:38:16","MIT",1,"未说明",{"notes":92,"python":93,"dependencies":94},"注意：该项目已停止积极维护（自 2 月底起），建议迁移至 PyTorch Lightning；仅兼容 PyTorch 1.0.0 至 1.4.0 版本","2.7, 3.5, 3.6, 
3.7",[95],"torch>=1.0.0,\u003C1.5.0",[13],[98,99,100,101,102,103],"pytorch","deep-learning","differentiable-programming","machine-learning","python3","model-fitting","2026-03-27T02:49:30.150509","2026-04-06T05:37:49.028606",[107,112,116,120,125,130],{"id":108,"question_zh":109,"answer_zh":110,"source_url":111},2905,"为什么准确率指标的标准差（acc_std）始终为 0？","当计算的方差为负数时（通常发生在各批次准确率非常相似时），标准差可能为 0。这是由于将 PyTorch float 张量转换为 Python 浮点数时的精度误差导致的。","https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fissues\u002F306",{"id":113,"question_zh":114,"answer_zh":115,"source_url":111},2906,"为什么 running_acc 和最终 acc 值不一致或超出 0-1 范围？","acc 值理论上应在 0 到 1 之间。如果不一致，可能是网络输出形状不正确导致均值指标计数未正确更新。建议检查网络输出形状并确认数据格式是否符合预期。",{"id":117,"question_zh":118,"answer_zh":119,"source_url":111},2907,"Seq2Seq 模型中准确率计算是否支持忽略填充索引（ignore_index）？","目前尚未默认支持，但维护者已注意到此需求并计划添加该选项（类似 nn.CrossEntropyLoss）。用户可提交 PR 协助实现，具体需编辑相关代码以排除占位符索引（如 0 索引）。",{"id":121,"question_zh":122,"answer_zh":123,"source_url":124},2908,"在 Jupyter Notebook 中使用 Tqdm 进度条时出现换行问题如何解决？","该问题已在主分支（master）修复。之前是因为 fit 调用间回调列表持久化导致多个 Tqdm 回调同时运行。请更新至最新版本验证修复效果。","https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fissues\u002F305",{"id":126,"question_zh":127,"answer_zh":128,"source_url":129},2909,"调用 trial.evaluate() 时 ReduceLROnPlateau 回调报错怎么办？","这通常是由于版本过旧导致的。维护者表示会发布新版本。建议尝试卸载并重新安装 torchbearer，或拉取最新代码以确保包含修复。","https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fissues\u002F650",{"id":131,"question_zh":132,"answer_zh":133,"source_url":134},2910,"训练过程中遇到 CUDA 内存分配错误（CUBLAS_STATUS_ALLOC_FAILED）时回调会如何处理？","当前版本会抛出警告但不会退出使用它的回调。维护者计划添加逻辑，确保在失败时正确退出相关回调以防止后续错误。建议关注后续版本更新。","https:\u002F\u002Fgithub.com\u002Fpytorchbearer\u002Ftorchbearer\u002Fissues\u002F658",[136,141,146,151,156,161,166,171,176,181,186,191,196,201,206,211,216,221,226,231],{"id":137,"version":138,"summary_zh":139,"released_at":140},102377,"0.5.4","## [0.5.4] - 2023-11-13\r\n### Added\r\n- Allow imaging callback's `to_file` to use state 
information in the file name\r\n### Changed\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed warnings about the `epoch` argument in schedulers in newer versions of pytorch\r\n- Fixed a bug in access metrics function and callbacks that use it\r\n- Fixed bug where schedulers were called before optimisers with newer versions of pytorch\r\n- Fixed a bug where the csv logger closed the file too early\r\n- Fixed compat with pytorch > 1.1.0 versioning\r\n- Fixed typos in doc strings\r\n- Fixes for tests where pytorch >2 Tensors were causing issues with mocks\r\n- Fix bug in gradient clipping where the parameter generator was consumed on the first pass\r\n","2023-11-13T17:26:36",{"id":142,"version":143,"summary_zh":144,"released_at":145},102378,"0.5.3","## [0.5.3] - 2020-01-31\r\n### Added\r\n- Method in bases to access metrics\r\n### Changed\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Metric access bugs in various callbacks","2020-01-31T14:07:22",{"id":147,"version":148,"summary_zh":149,"released_at":150},102379,"0.5.2","## [0.5.2] - 2020-01-28\r\n### Added\r\n- Added option to use mixup loss with cutmix\r\n- Support for PyTorch 1.4.0\r\n### Changed\r\n- Changed PyCM save methods to use `*args` and `**kwargs`\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug where the PyCM callback would fail when saving","2020-01-28T16:14:47",{"id":152,"version":153,"summary_zh":154,"released_at":155},102380,"0.5.1","## [0.5.1] - 2019-11-06\r\n### Added\r\n- Added BCPlus callback for between-class learning\r\n- Added support for PyTorch 1.3\r\n- Added a show flag to the `ImagingCallback.to_pyplot` method, set to false to stop it from calling `plt.show`\r\n- Added manifold mixup  \r\n### Changed\r\n- Changed the default behaviour of `ImagingCallback.to_pyplot` to turn off the axis\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug when resuming an old state dict with tqdm enabled\r\n- Fixed a bug in imaging where passing a title to `to_pyplot` was 
not possible","2019-11-06T17:40:44",{"id":157,"version":158,"summary_zh":159,"released_at":160},102381,"0.5.0","## [0.5.0] - 2019-09-17\r\n### Added\r\n- Added PyTorch CyclicLR scheduler\r\n### Changed\r\n- Torchbearer now supports Modules with multiple inputs and multiple outputs\r\n### Deprecated\r\n### Removed\r\n- Cyclic LR callback in favour of torch cyclic lr scheduler\r\n- Removed support for PyTorch 0.4.x\r\n### Fixed\r\n- Fixed bug where aggregate predictions couldn't handle empty list\r\n- Fixed a bug where Runtime Errors on forward weren't handled properly\r\n- Fixed a bug where exceptions on forward wouldn't print the traceback properly\r\n- Fixed a documentation mistake whereby ReduceLROnPlateau was said to increase learning rate","2019-09-17T13:29:22",{"id":162,"version":163,"summary_zh":164,"released_at":165},102382,"0.4.0","## [0.4.0] - 2019-07-05\r\n### Added\r\n- Added ``with_loader`` trial method that allows running of custom batch loaders\r\n- Added a Mock Model which is set when None is passed as the model to a Trial. Mock Model always returns None. \r\n- Added `__call__(state)` to `StateKey` so that they can now be used as losses\r\n- Added a callback to do cutout regularisation\r\n- Added a `with_data` trial method that allows passing of train, val and test data in one call\r\n- Added the missing on_init callback decorator\r\n- Added a `step_on_batch` flag to the early stopping callback\r\n- Added multi image support to `imaging`\r\n- Added a callback to unpack state into torchbearer.X at sample time for specified keys and update state after the forward pass based on model outputs. This is useful for using DataParallel which passes the main state dict directly. 
\r\n- Added callback for generating confusion matrices with PyCM\r\n- Added a mixup callback with associated loss\r\n- Added Label Smoothing Regularisation (LSR) callback\r\n- Added CutMix regularisation\r\n- Added default metric from paper for when Mixup loss is used\r\n### Changed\r\n- Changed history to now just be a list of records\r\n- Categorical Accuracy metric now also accepts tensors of size (B, C) and gets the max over C for the target class\r\n### Deprecated\r\n### Removed\r\n- Removed the variational sub-package, this will now be packaged separately\r\n- Removed `verbose` argument from the early stopping callback\r\n### Fixed\r\n- Fixed a bug where list or dictionary metrics would cause the tensorboard callback to error\r\n- Fixed a bug where running a trial without training steps would error\r\n- Fixed a bug where the caching imaging callback didn't reset data so couldn't be run in multiple trials\r\n- Fixed a bug in the `ClassAppearanceModel` callback\r\n- Fixed a bug where the state given to predict was not a State object\r\n- Fixed a bug with Cutout on gpu\r\n- Fixed a bug where MakeGrid callback wasn't passing all arguments correctly\r\n- Fixed a bug in `ImagingCallback` that would sometimes cause `make_grid` to throw an error\r\n- Fixed a bug where the verbose argument would not work unless given as a keyword argument\r\n- Fixed a bug where the data_key argument would sometimes not work as expected\r\n- Fixed a bug where cutout required a seed\r\n- Fixed a bug where cutmix wasn't sending the beta distribution sample to the device","2019-09-17T11:30:36",{"id":167,"version":168,"summary_zh":169,"released_at":170},102383,"0.3.2","## [0.3.2] - 2019-05-28\r\n### Added\r\n### Changed\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug where for_steps would sometimes not work as expected if called in the wrong order\r\n- Fixed a bug where torchbearer installed via pip would crash on 
import","2019-05-28T13:07:41",{"id":172,"version":173,"summary_zh":174,"released_at":175},102392,"0.2.1","## [0.2.1] - 2018-09-11\r\n### Added\r\n- Evaluation and prediction can now be done on any data using data_key keyword arg\r\n- Text tensorboard\u002Fvisdom logger that writes epoch\u002Fbatch metrics to text\r\n### Changed\r\n- TensorboardX, Numpy, Scikit-learn and Scipy are no longer dependencies and only required if using the tensorboard callbacks or roc metric\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Model class setting generator incorrectly leading to stop iterations. \r\n- Argument ordering is consistent in `Trial.with_generators` and `Trial.__init__`\r\n- Added a state dict for the early stopping callback\r\n- Fixed visdom parameters not getting set in some cases","2018-09-11T13:56:03",{"id":177,"version":178,"summary_zh":179,"released_at":180},102376,"0.5.5","[0.5.5] - 2023-12-01\r\n\r\n# Changed\r\n\r\n# Deprecated\r\n\r\n# Removed\r\n\r\n# Fixed\r\n\r\n- Fixed versions in setup.py","2023-12-01T18:41:20",{"id":182,"version":183,"summary_zh":184,"released_at":185},102384,"0.3.1","## [0.3.1] - 2019-05-24\r\n### Added\r\n- Added cyclic learning rate finder\r\n- Added on_init callback hook to run at the end of trial init\r\n- Added callbacks for weight initialisation in ``torchbearer.callbacks.init``\r\n- Added ``with_closure`` trial method that allows running of custom closures \r\n- Added ``base_closure`` function to bases that allows creation of standard training loop closures\r\n- Added ``ImagingCallback`` class for callbacks which produce images that can be sent to tensorboard, visdom or a file\r\n- Added ``CachingImagingCallback`` and ``MakeGrid`` callback to make a grid of images\r\n- Added the option to give the ``only_if`` callback decorator a function of self and state rather than just state\r\n- Added Layer-sequential unit-variance (LSUV) initialization\r\n- Added ClassAppearanceModel callback and example page for visualising CNNs\r\n- 
Added on_checkpoint callback decorator\r\n- Added support for PyTorch 1.1.0\r\n### Changed\r\n- `no_grad` and `enable_grad` decorators are now also context managers\r\n### Deprecated\r\n### Removed\r\n- Removed the fluent decorator; just use `return self`\r\n- Removed install dependency on `torchvision`, though it is still required for some functionality\r\n### Fixed\r\n- Fixed a bug where replay errored when train or val steps were None\r\n- Fixed a bug where the mock optimiser wouldn't call its closure\r\n- Fixed a bug where the notebook check raised ModuleNotFoundError when IPython was not installed\r\n- Fixed a memory leak with metrics that caused issues with very long epochs\r\n- Fixed a bug with the once and once_per_epoch decorators\r\n- Fixed a bug where the test criterion wouldn't accept a function of state\r\n- Fixed a bug where type inference would not work correctly when chaining ``Trial`` methods\r\n- Fixed a bug where checkpointers would error when they couldn't find the old checkpoint to overwrite\r\n- Fixed a bug where the 'test' label would sometimes not populate correctly in the default accuracy metric","2019-05-24T07:40:35",{"id":187,"version":188,"summary_zh":189,"released_at":190},102385,"0.3.0","## [0.3.0] - 2019-02-28\r\n### Added\r\n- Added torchbearer.variational, a sub-package for implementations of state-of-the-art variational auto-encoders\r\n- Added SimpleUniform and SimpleExponential distributions\r\n- Added a decorator which can be used to cite a research article as part of a doc string\r\n- Added an optional dimension argument to the mean, std and running_mean metric aggregators\r\n- Added a var metric and decorator which can be used to calculate the variance of a metric\r\n- Added an unbiased flag to the std and var metrics to optionally not apply Bessel's correction (consistent with torch.std \u002F torch.var)\r\n- Added support for rounding 1D lists to the Tqdm callback\r\n- Added SimpleWeibull distribution\r\n- Added support for Python 2.7\r\n- Added 
SimpleWeibullSimpleWeibullKL\r\n- Added SimpleExponentialSimpleExponentialKL\r\n- Added the option to save only the model parameters in Checkpointers\r\n- Added documentation about serialization\r\n- Added support for indefinite data loading. Iterators can now be run until complete, independent of epochs, or refreshed during an epoch once complete\r\n- Added support for batch intervals in the interval checkpointer\r\n- Added line magic ``%torchbearer notebook``\r\n- Added 'accuracy' variants of 'acc' default metrics\r\n### Changed\r\n- Changed the default behaviour of the std metric to compute the sample std, in line with torch.std\r\n- The Tqdm precision argument now rounds to decimal places rather than significant figures\r\n- Trial will now simply infer if the model has an argument called 'state'\r\n- Torchbearer now infers if it is running inside a notebook and will use the appropriate tqdm module if not set\r\n### Deprecated\r\n### Removed\r\n- Removed the old Model API (deprecated since version 0.2.0)\r\n- Removed the 'pass_state' argument from Trial; this will now be inferred\r\n- Removed the 'std' decorator from the default metrics\r\n### Fixed\r\n- Fixed a bug in the weight decay callback which would result in potentially negative decay (now just uses torch.norm)\r\n- Fixed a bug in the cite decorator causing the citation to not show up correctly\r\n- Fixed a memory leak in the mse primitive metric","2019-02-28T14:42:52",{"id":192,"version":193,"summary_zh":194,"released_at":195},102386,"0.2.6.1","## [0.2.6.1] - 2019-02-25\r\n### Fixed\r\n- Fixed a bug where predictions would multiply when predict was called more than once","2019-02-25T14:20:37",{"id":197,"version":198,"summary_zh":199,"released_at":200},102387,"0.2.6","## [0.2.6] - 2018-12-19\r\n### Added\r\n### Changed\r\n- Y_PRED, Y_TRUE and X can now equivalently be accessed as PREDICTION, TARGET and INPUT respectively\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug where the LiveLossPlot 
callback would trigger an error if run and evaluate were called separately\r\n- Fixed a bug where state key errors would report to the wrong stack level\r\n- Fixed a bug where the user would wrongly get a state key error in some cases","2018-12-19T14:09:56",{"id":202,"version":203,"summary_zh":204,"released_at":205},102388,"0.2.5","## [0.2.5] - 2018-12-19\r\n### Added\r\n- Added a flag to ``replay`` to replay only a single batch per epoch\r\n- Added support for PyTorch 1.0.0 and Python 3.7\r\n- MetricTree can now unpack dictionaries from root; this is useful if you want to get a mean of a metric. However, this should be used with caution as it extracts only the first value in the dict and ignores the rest\r\n- Added a callback for the livelossplot visualisation tool for notebooks\r\n### Changed\r\n- All error \u002F accuracy metrics can now optionally take state keys for predictions and targets as arguments\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug with the EpochLambda metric which required y_true \u002F y_pred to have specific forms","2018-12-19T09:02:59",{"id":207,"version":208,"summary_zh":209,"released_at":210},102389,"0.2.4","## [0.2.4] - 2018-11-16\r\n### Added\r\n- Added metric functionality to state keys so that they can be used as metrics if desired\r\n- Added customizable precision to the printer callbacks\r\n- Added a threshold to binary accuracy. 
Now it will appropriately handle any values in \\[0, 1\\]\r\n### Changed\r\n- Changed the default printer precision to 4 s.f.\r\n- Tqdm on_epoch now shows metrics immediately when resuming\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug which would incorrectly trigger version warnings when loading in models\r\n- Fixed bugs where the Trial would not fail gracefully if required objects were not in state\r\n- Fixed a bug where the none criterion didn't work with the add_to_loss callback\r\n- Fixed a bug where tqdm on_epoch always started at 0","2018-11-16T14:01:24",{"id":212,"version":213,"summary_zh":214,"released_at":215},102390,"0.2.3","## [0.2.3] - 2018-10-12\r\n### Added\r\n- Added a string representation of Trial to give a summary\r\n- Added an option to log the Trial summary to TensorboardText\r\n- Added a callback point ('on_checkpoint') which can be used for model checkpointing after the history is updated\r\n### Changed\r\n- When resuming training, checkpointers no longer delete the state file the trial was loaded from\r\n- Changed the metric eval to include a data_key which tells us what data we are evaluating on\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Fixed a bug where callbacks weren't handled correctly in the predict and evaluate methods of Trial\r\n- Fixed a bug where the history wasn't updated when new metrics were calculated with the evaluate method of Trial\r\n- Fixed a bug where tensorboard writers couldn't be reused\r\n- Fixed a bug where the none criterion didn't require gradient\r\n- Fixed a bug where tqdm wouldn't get the correct iterator length when evaluating on the test generator\r\n- Fixed a bug where evaluating before training tried to update history before it existed\r\n- Fixed a bug where the metrics would output 'val_acc' even if evaluating on test or train data\r\n- Fixed a bug where the roc metric didn't detach y_pred before sending it to numpy\r\n- Fixed a bug where resuming from a checkpoint saved with one of the callbacks didn't populate 
the epoch number correctly","2018-10-12T15:36:25",{"id":217,"version":218,"summary_zh":219,"released_at":220},102391,"0.2.2","## [0.2.2] - 2018-09-18\r\n### Added\r\n- The default_for_key metric decorator can now be used to pass arguments to the init of the inner metric\r\n- The default metric for the key 'top_10_acc' is now the TopKCategoricalAccuracy metric with k set to 10\r\n- Added a global verbose flag for trial that can be overridden by run, evaluate, predict\r\n- Added an LR metric which retrieves the current learning rate from the optimizer, default for key 'lr'\r\n### Fixed\r\n- Fixed a bug where the DefaultAccuracy metric would not put the inner metric in eval mode if the first call to reset was after the call to eval\r\n- Fixed a bug where trying to load a state dict in a different session from the one where it was saved didn't work properly\r\n- Fixed a bug where the empty criterion would trigger an error if no Y_TRUE was put in state","2018-09-18T07:24:09",{"id":222,"version":223,"summary_zh":224,"released_at":225},102393,"0.2.0","See [**NEW!**] in README.md for new key features\r\n\r\n## [0.2.0] - 2018-08-21\r\n### Added\r\n- Added the ability to pass custom arguments to the tqdm callback\r\n- Added an ignore_index flag to the categorical accuracy metric, similar to nn.CrossEntropyLoss. Usage: ``metrics=[CategoricalAccuracyFactory(ignore_index=0)]``\r\n- Added TopKCategoricalAccuracy metric (default for key: top\\_5\\_acc)\r\n- Added BinaryAccuracy metric (default for key: binary\\_acc)\r\n- Added MeanSquaredError metric (default for key: mse)\r\n- Added DefaultAccuracy metric (use with 'acc' or 'accuracy') - infers accuracy from the criterion\r\n- New Trial API, ``torchbearer.Trial``, to replace the Model API. The Trial API is more atomic and uses the fluent pattern to allow chaining of methods\r\n- ``torchbearer.Trial`` has with_x_generator and with_x_data methods to add training\u002Fvalidation\u002Ftesting generators to the trial. 
There is a with_generators method to allow passing of all generators in one call\r\n- ``torchbearer.Trial`` has for_x_steps and for_steps to allow running of trials without explicit generators or data tensors\r\n- ``torchbearer.Trial`` keeps a history of run calls which tracks the number of epochs run and the final metrics at each epoch. This allows seamless resuming of trial running\r\n- ``torchbearer.Trial.state_dict`` now returns the trial history and callback list state, allowing for full resuming of trials\r\n- ``torchbearer.Trial`` has a replay method that can replay training (with callbacks and display) from the history. This is useful when loading trials from state\r\n- The backward call can now be passed args by setting ``state[torchbearer.BACKWARD_ARGS]``\r\n- ``torchbearer.Trial`` implements the forward pass, loss calculation and backward call as an optimizer closure\r\n- Metrics are now explicitly calculated with no gradient\r\n### Changed\r\n- Callback decorators can now be chained to allow construction with multiple methods filled\r\n- Callbacks can now implement ``state_dict`` and ``load_state_dict`` to allow callbacks to resume with state\r\n- The state dictionary now accepts StateKey objects which are unique and generated through ``torchbearer.state.get_state``\r\n- The state dictionary now warns when accessed with strings, as this allows for collisions\r\n- Checkpointer callbacks will now resume from a state dict when resume=True in Trial\r\n### Deprecated\r\n- ``torchbearer.Model`` has been deprecated in favour of the new ``torchbearer.Trial`` API\r\n### Removed\r\n- Removed the MetricFactory class. Decorators still work in the same way, but the Factory is no longer needed\r\n### Fixed","2018-08-21T10:34:49",{"id":227,"version":228,"summary_zh":229,"released_at":230},102394,"0.1.7","## [0.1.7] - 2018-08-14\r\n### Added\r\n- Added visdom logging support to tensorboard callbacks\r\n- Added option to choose the tqdm module (tqdm, tqdm_notebook, ...) 
to the Tqdm callback\r\n- Added some new decorators to simplify custom callbacks that must only run under certain conditions (or even just once)\r\n### Changed\r\n- Instantiation of Model will now trigger a warning pending the new Trial API in the next version\r\n- TensorboardX dependency now version 1.4\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Mean and standard deviation calculations now work correctly for network outputs with many dimensions\r\n- The callback list is no longer shared between fit calls; a new copy is now made for each fit","2018-08-14T12:50:39",{"id":232,"version":233,"summary_zh":234,"released_at":235},102395,"0.1.6","## [0.1.6] - 2018-08-10\r\n### Added\r\n- Added a verbose level (options are now 0, 1, 2) which will print progress for the entire fit call, updating every epoch. Useful when doing dynamic programming with little data\r\n- Added support for dictionary outputs of the dataloader\r\n- Added an abstract superclass for building TensorBoardX based callbacks\r\n### Changed\r\n- The Timer callback can now also be used as a metric, which allows display of specified timings to printers; it has been moved to metrics\r\n- The loss_criterion is renamed to criterion in `torchbearer.Model` arguments\r\n- The criterion in `torchbearer.Model` is now optional and will provide a zero loss tensor if it is not given\r\n- TensorBoard callbacks refactored to be based on a common super class\r\n- TensorBoard callbacks refactored to use a common `SummaryWriter` for each log directory\r\n### Deprecated\r\n### Removed\r\n### Fixed\r\n- Standard deviation calculation now returns 0 instead of a complex value when given very close samples","2018-08-10T12:58:53"]