[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-google--jaxopt":3,"tool-google--jaxopt":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 
是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,"2026-04-04T04:44:48",[15,14,13,26,54],{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":69,"readme_en":70,"readme_zh":71,"quickstart_zh":72,"use_case_zh":73,"hero_image_url":74,"owner_login":75,"owner_name":76,"owner_avatar_url":77,"owner_bio":78,"owner_company":79,"owner_location":79,"owner_email":80,"owner_twitter":81,"owner_website":82,"owner_url":83,"languages":84,"stars":89,"forks":90,"last_commit_at":91,"license":92,"difficulty_score":93,"env_os":94,"env_gpu":95,"env_ram":94,"env_deps":96,"category_tags":100,"github_topics":101,"view_count":106,"oss_zip_url":79,"oss_zip_packed_at":79,"status":16,"created_at":107,"updated_at":108,"faqs":109,"releases":139},111,"google\u002Fjaxopt","jaxopt","Hardware accelerated, batchable and differentiable optimizers in JAX.","jaxopt 是一个基于 JAX 生态构建的优化器库，专为提供硬件加速、可批量化且可微分的优化算法而设计，主要解决了在机器学习与科学计算中高效求解复杂优化问题并利用现代硬件加速的挑战。jaxopt 不仅支持在 GPU 和 TPU 上运行，还能通过 JAX 的 vmap 功能自动向量化处理多个优化实例，显著提升计算效率。\n\njaxopt 特别适合从事深度学习、元学习或需要自定义优化流程的研究人员与开发者。其独特技术亮点在于强大的微分能力，支持对优化问题的解进行隐式微分，或通过展开算法迭代进行自动微分，这使得 jaxopt 在超参数优化和隐式层模型等前沿研究中极具价值。\n\n不过需要特别提醒的是，jaxopt 目前已停止维护，部分核心功能已迁移至 optax 库。虽然不再推荐用于新的生产环境，但 jaxopt 的代码实现和相关的隐式微分论文仍具有重要的学术参考意义。建议新用户在使用前查阅 JAX 官方文档，优先考虑更活跃的替代方案。","# JAXopt\n\n[**Status**](#status)\n| [**Installation**](#installation)\n| [**Documentation**](https:\u002F\u002Fjaxopt.github.io)\n| [**Examples**](https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Ftree\u002Fmain\u002Fexamples)\n| [**Cite us**](#citeus)\n\nHardware 
accelerated, batchable and differentiable optimizers in\n[JAX](https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjax).\n\n- **Hardware accelerated:** our implementations run on GPU and TPU, in addition\n  to CPU.\n- **Batchable:** multiple instances of the same optimization problem can be\n  automatically vectorized using JAX's vmap.\n- **Differentiable:** optimization problem solutions can be differentiated with\n  respect to their inputs either implicitly or via autodiff of unrolled\n  algorithm iterations.\n\n## Status\u003Ca id=\"status\">\u003C\u002Fa>\n\nJAXopt is no longer maintained nor developed. Alternatives may be found on the\nJAX [website](https:\u002F\u002Fdocs.jax.dev\u002Fen\u002Flatest\u002F). Some of its features (like\nlosses, projections, lbfgs optimizer) have been ported into\n[optax](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Foptax). We are sincerely grateful for\nall the community contributions the project has garnered over the years.\n\n## Installation\u003Ca id=\"installation\">\u003C\u002Fa>\n\nTo install the latest release of JAXopt, use the following command:\n\n```bash\n$ pip install jaxopt\n```\n\nTo install the **development** version, use the following command instead:\n\n```bash\n$ pip install git+https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\n```\n\nAlternatively, it can be installed from sources with the following command:\n\n```bash\n$ python setup.py install\n```\n\n## Cite us\u003Ca id=\"citeus\">\u003C\u002Fa>\n\nOur implicit differentiation framework is described in this\n[paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F2105.15183). 
To cite it:\n\n```\n@article{jaxopt_implicit_diff,\n  title={Efficient and Modular Implicit Differentiation},\n  author={Blondel, Mathieu and Berthet, Quentin and Cuturi, Marco and Frostig, Roy \n    and Hoyer, Stephan and Llinares-L{\\'o}pez, Felipe and Pedregosa, Fabian \n    and Vert, Jean-Philippe},\n  journal={arXiv preprint arXiv:2105.15183},\n  year={2021}\n}\n```\n\n## Disclaimer\n\nJAXopt was an open source project maintained by a dedicated team in Google\nResearch. It is not an official Google product.\n\n","# JAXopt\n\n[**状态**](#status)\n| [**安装**](#installation)\n| [**文档**](https:\u002F\u002Fjaxopt.github.io)\n| [**示例**](https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Ftree\u002Fmain\u002Fexamples)\n| [**引用我们**](#citeus)\n\n基于 [JAX](https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjax) 的硬件加速、可批量化且可微分的 optimizers（优化器）。\n\n- **硬件加速：** 我们的实现不仅可以在 CPU（中央处理器）上运行，还可以在 GPU（图形处理器）和 TPU（张量处理单元）上运行。\n- **可批量化：** 使用 JAX 的 vmap（向量化映射）可以自动对同一优化问题的多个实例进行向量化处理。\n- **可微分：** 优化问题的解可以针对其输入进行微分，既可以通过隐式微分（implicit differentiation）方式，也可以通过展开算法迭代后的自动微分（autodiff）实现。\n\n## 状态\u003Ca id=\"status\">\u003C\u002Fa>\n\nJAXopt 已不再维护或开发。替代方案可在 JAX [网站](https:\u002F\u002Fdocs.jax.dev\u002Fen\u002Flatest\u002F) 上找到。其部分功能（如 losses（损失函数）、projections（投影）、lbfgs optimizer（L-BFGS 优化器））已移植到 [optax](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Foptax)。我们衷心感谢该项目多年来获得的所有社区贡献。\n\n## 安装\u003Ca id=\"installation\">\u003C\u002Fa>\n\n要安装最新版本的 JAXopt，请使用以下命令：\n\n```bash\n$ pip install jaxopt\n```\n\n要安装 **开发** 版本，请改用以下命令：\n\n```bash\n$ pip install git+https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\n```\n\n或者，可以使用以下命令从源代码安装：\n\n```bash\n$ python setup.py install\n```\n\n## 引用我们\u003Ca id=\"citeus\">\u003C\u002Fa>\n\n我们的隐式微分（implicit differentiation）框架在此 [论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F2105.15183) 中进行了描述。引用格式如下：\n\n```\n@article{jaxopt_implicit_diff,\n  title={Efficient and Modular Implicit Differentiation},\n  author={Blondel, Mathieu and Berthet, Quentin and Cuturi, 
Marco and Frostig, Roy \n    and Hoyer, Stephan and Llinares-L{\\'o}pez, Felipe and Pedregosa, Fabian \n    and Vert, Jean-Philippe},\n  journal={arXiv preprint arXiv:2105.15183},\n  year={2021}\n}\n```\n\n## 免责声明\n\nJAXopt 曾是一个由 Google Research 专门团队维护的开源项目。它不是 Google 的官方产品。","# JAXopt 快速上手指南\n\n> **⚠️ 重要提示：项目已停止维护**\n>\n> 根据官方说明，**JAXopt 已不再维护或开发**。部分功能（如 losses, projections, lbfgs optimizer）已迁移至 [Optax](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Foptax)。建议新项目优先考虑使用 JAX 官方推荐的替代方案。\n\n## 环境准备\n\n- **系统要求**：支持 CPU、GPU 或 TPU 环境。\n- **前置依赖**：\n  - Python\n  - [JAX](https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjax)（硬件加速基础库）\n\n## 安装步骤\n\n建议使用国内 PyPI 镜像源以加速下载。\n\n**安装最新稳定版：**\n```bash\n$ pip install jaxopt -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n**安装开发版：**\n```bash\n$ pip install git+https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\n```\n\n**从源码安装：**\n```bash\n$ python setup.py install\n```\n\n## 基本使用\n\nJAXopt 提供了可硬件加速、可批量化且可微分的优化器。以下是一个最小可运行示例（目标函数仅为示意用的二次函数）：\n\n```python\nimport jax.numpy as jnp\nfrom jaxopt import GradientDescent\n\ndef f(w):  # 示意目标函数：||w - 1||^2\n    return jnp.sum((w - 1.0) ** 2)\n\nsolver = GradientDescent(fun=f, maxiter=100)\nparams, state = solver.run(jnp.zeros(3))\n# 更多用法请参考官方 examples 目录：https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Ftree\u002Fmain\u002Fexamples\n```\n\n**注意**：由于项目已停止维护，如需使用优化器功能，请查阅 [Optax](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Foptax) 或 JAX 官方文档获取最新支持。","某 AI 实验室正在研发基于元学习的少样本图像分类模型，核心难点在于需要在内层优化步骤中高效计算关于超参数的梯度，这对优化器的可微性与硬件适应性提出了极高要求。\n\n### 没有 jaxopt 时\n- 原生优化器仅支持 CPU 运行，无法利用集群现有的 GPU 或 TPU 资源，导致单次实验耗时数天，迭代周期过长。\n- 处理批量元任务时需手动编写复杂的向量化代码，难以利用 JAX 的自动 vmap 功能，代码冗余且开发效率极低。\n- 对优化结果求导需手动展开所有迭代步骤，导致计算图过大，显存占用过高且容易引发梯度爆炸或消失问题。\n\n### 使用 jaxopt 后\n- 直接调用硬件加速接口，优化过程无缝迁移至 GPU\u002FTPU 运行，充分利用并行计算能力，训练速度提升数十倍。\n- 利用 batchable 特性，自动向量化多个优化实例，代码量减少一半且易于扩展到新任务，维护成本显著降低。\n- 支持隐式微分，无需展开迭代即可对优化解求导，大幅降低显存消耗并保持梯度数值稳定，模型收敛更可靠。\n\n尽管 jaxopt 
已停止维护，但其确立的硬件加速与隐式微分范式，仍为解决嵌套优化场景下的效率与梯度传播难题提供了核心思路。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle_jaxopt_88c8a645.png","google","Google","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fgoogle_c4bedcda.png","Google ❤️ Open Source",null,"opensource@google.com","GoogleOSS","https:\u002F\u002Fopensource.google\u002F","https:\u002F\u002Fgithub.com\u002Fgoogle",[85],{"name":86,"color":87,"percentage":88},"Python","#3572A5",100,1030,73,"2026-03-31T12:27:44","Apache-2.0",1,"未说明","非必需，支持 GPU 和 TPU 加速，具体型号、显存及 CUDA 版本未说明",{"notes":97,"python":94,"dependencies":98},"该项目已不再维护或开发，建议寻找替代方案（如 optax）。支持 CPU、GPU 和 TPU 硬件加速。非 Google 官方产品。",[99],"jax",[13],[99,102,103,104,105],"optimization","deep-learning","differentiable-programming","bi-level",6,"2026-03-27T02:49:30.150509","2026-04-06T05:16:37.211978",[110,115,119,124,129,134],{"id":111,"question_zh":112,"answer_zh":113,"source_url":114},65,"为什么在构造函数中设置了 `jit=True`，LevenbergMarquardt 优化器运行仍然很慢？","仅在优化器构造函数中设置 `jit=True` 是不够的。你需要对 `optimizer.run` 函数再次使用 `jit` 装饰器，或者直接使用 `__call__` 方法。例如：`jit(optimizer.run)`。此外，确保代码中没有其他导致编译开销的操作。用户反馈正确 JIT 后速度可从 30 秒提升至 1 秒。","https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Fissues\u002F277",{"id":116,"question_zh":117,"answer_zh":118,"source_url":114},66,"如何正确测量 JAXopt 算法的运行时间以避免编译时间干扰？","Benchmark 时需要确保不包含编译时间。建议使用 `time.process_time` 提取 CPU 时间，并在正式计时前先运行一次代码以完成编译。不要将编译时间与运行时间平均计算，否则会导致性能评估不准确。",{"id":120,"question_zh":121,"answer_zh":122,"source_url":123},67,"为什么 LBFGSB 算法返回的参数有时会超出指定的 bounds 范围？","这是一个已知问题，已在后续版本中修复。请更新到最新版本。修复涉及重新实现 zoom linesearch 以接受 `max_stepsize`。如果问题仍然存在，尝试设置 `linesearch=\"zoom\"` 并检查 `max_stepsize` 配置，避免使用 Hager Zhang line search 以防越界。","https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Fissues\u002F439",{"id":125,"question_zh":126,"answer_zh":127,"source_url":128},68,"如何在二次规划投影函数（如 projection_polyhedron）中使用 vmap？","早期版本存在 bug 导致 `TracerArrayConversionError`。PR #79 修复了该问题。修复后，可以在 `jaxopt.ProjectedGradient` 调用中尝试设置 `jit=False` 
来避免某些崩溃错误。建议先安装 `cvxpy` 依赖，并更新到修复后的版本以支持 vmap。","https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Fissues\u002F70",{"id":130,"question_zh":131,"answer_zh":132,"source_url":133},69,"当原始解（primal solution）是 PyTree 结构时，如何处理 KKT 条件微分？","`make_kkt_optimality_fun` 目前不是公共 API（位于 `_src` 中），且内部测试主要针对单 jnp 数组。如果遇到 PyTree 结构不匹配的 TypeError，建议暂时将原始解存储为单个 jnp 数组并在需要时 reshape，或等待官方暴露该公共 API 以支持通用 PyTree。","https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Fissues\u002F20",{"id":135,"question_zh":136,"answer_zh":137,"source_url":138},70,"为什么 flax_resnet.py 示例在 CIFAR-10 上的准确率低于标准基线？","这是一个已知问题，已在 1.5 年后通过 commit 75724b5 修复。如果遇到类似准确率低下问题（如验证集准确率不超过 70%），请确保使用最新版本的示例代码，并检查数据集加载及增强配置是否与官方修复后的版本一致。","https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Fissues\u002F156",[140,145,150,155,160,165,170,175,180,185,190,195,200,205,210,215,220,225,230],{"id":141,"version":142,"summary_zh":143,"released_at":144},99738,"jaxopt-v0.8.5","# Bug fixes and enhancements\r\n\r\n- Remove support for python 3.9, test on python 3.12, 3.13, by Vincent Roulet\r\n- Migrate uses of jax pure callback, by Dan Foreman-Mackey.\r\n\r\n# Contributors\r\n\r\nDan Foreman-Mackey, Vincent Roulet.","2025-04-14T17:58:42",{"id":146,"version":147,"summary_zh":148,"released_at":149},99739,"jaxopt-v0.8.4","# Bug fixes and enhancements\r\n\r\n- Remove invalid escape patterns using raw docstring by Tom McTiernan.\r\n- Fix lasso with scalar l1reg by Edoardo Balzani.\r\n- Remove unused imports by Neil Girdhar.\r\n- Fix tests, by Vincent Roulet.\r\n- Fix print for verbose=False in zoom linesearch, by Vincent Roulet.\r\n- Fix BoxOSQP when using pytrees, by Diego Ferigo.\r\n- Drop Python 3.8 by Neil Girdhar.\r\n\r\n# Contributors\r\n\r\nDiego Ferigo, Neil Girdhar, Vincent Roulet, Edoardo Balzani, Tom McTiernan.\r\n","2025-04-10T17:48:21",{"id":151,"version":152,"summary_zh":153,"released_at":154},99740,"jaxopt-v0.8.3","# Bug fixes and enhancements\r\n\r\n- Fix fragile tests in perturbation module, by Roy 
Frostig.\r\n- Better verbose handling, by Vincent Roulet.\r\n- Remove boston dataset, by Vincent Roulet.\r\n- Control variate support in perturbation module, by Quentin Berthet.\r\n\r\n# Contributors\r\n\r\nRoy Frostig, Vincent Roulet, Quentin Berthet.\r\n","2024-01-10T12:03:47",{"id":156,"version":157,"summary_zh":158,"released_at":159},99741,"jaxopt-v0.8.2","# Bug fixes and enhancements\r\n\r\n- Added SPS+ variant to PolyakSGD, by Fabian Pedregosa.\r\n- Fixed jax.config import, by Sergei Lebedev.\r\n- Fixed typos in the doc + upload missing image, by Fabian Pedregosa.\r\n\r\n# Contributors\r\n\r\nFabian Pedregosa, Sergei Lebedev.","2023-11-06T11:00:41",{"id":161,"version":162,"summary_zh":163,"released_at":164},99742,"jaxopt-v0.8.1","# Bug fixes and enhancements\r\n\r\n- Improved Resnet examples in Flax and Haiku, by Fabian Pedregosa.\r\n- Documentation is built as part of the continuous integration workflow, by Fabian Pedregosa.\r\n- Jit update method, by Mathieu Blondel.\r\n- Linesearch improvements, by Vincent Roulet.\r\n- GaussNewton and LevenbergMarquardt improvements, by Amir Saadat.\r\n- Verbose support even when jit=True, by Amir Saadat.\r\n- Various doc improvements, by Fabian Pedregosa.\r\n\r\n# Contributors\r\n\r\nAmir Saadat, Fabian Pedregosa, Mathieu Blondel, Peter Hawkins, Vincent Roulet.","2023-10-06T23:31:47",{"id":166,"version":167,"summary_zh":168,"released_at":169},99743,"jaxopt-v0.8","# New features\r\n\r\n- Added Broyden algorithm to solve non linear root equations, by Zaccharie Ramzi.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Fixed “LBFGSB produces NaN for certain conditions”, by Vincent Roulet.\r\n- Better layout for notebook gallery, by Fabian Pedregosa.\r\n- Added support for complex numbers in nonlinear conjugate gradient, by Gregor Thalhammer.\r\n- Added support for complex numbers in LBFGS and zoom line search, by Gregor Thalhammer.\r\n- Various LBFGS improvements, by Vincent Roulet.\r\n- Completely revamped zoom linesearch, by 
Vincent Roulet.\r\n- Added continuous integration for python 3.11, fixed typing extension issue, by Vincent Roulet.\r\n- Added number of function\u002Fgrad\u002Fprox\u002Fetc… evaluation for various solvers, by Zaccharie Ramzi.\r\n- Fixed implicit differentiation in LBFGSB, by Nathan Simpson.\r\n- Ensure weak-type consistency, by Mathieu Blondel.\r\n- Extend convenience API for QP + doc, by Louis Bethune.\r\n- Add interpolation explanations for Polyak\u002FArmijo API doc, by Louis Bethune.\r\n- Support has_aux in AndersonAcceleration, by Louis Bethune.\r\n- Drop Python 3.7 support, by Mathieu Blondel.\r\n- Fixed typos in perturbation docs, by Guillaume Dalle.\r\n- Fixed pytree support in tree_inf_norm, by Emily Fertig.\r\n- Fixed NaN handling in LBFGSB, by Srinivas Vasudevan.\r\n- Use jnp.ndarray instead of jnp.array, by Peter Hawkins.\r\n\r\n# Contributors\r\n\r\nLouis Bethune, Emily Fertig, Fabian Pedregosa, Gregor Thalhammer-Thurner, Guillaume Dalle, Mathieu Blondel, Nathan Simpson, Peter Hawkins, Srinivas Vasudevan, Vincent Roulet, Zaccharie Ramzi.","2023-08-15T09:07:13",{"id":171,"version":172,"summary_zh":173,"released_at":174},99744,"jaxopt-v0.7","# New features\r\n\r\n- Added jaxopt.LBFGSB, by Emily Fertig.\r\n- Added jaxopt.perturbations.make_perturbed_fun, by Quentin Berthet.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Allow to pass a function as value_and_grad option, by Chansoo Lee.\r\n- Fixed imaml tutorial (speed and correctness), by Zaccharie Ramzi.\r\n- Misc improvements in resnet_flax example, by Fabian Pedregosa.\r\n- Fixed prox to handle pytrees, by Vincent Roulet.\r\n- Added control variate to make_perturbed_argmax, by Lawrence Stewart.\r\n- Added inverse hessian approximation to the returned state, Aymeric Galan.\r\n- Avoid closing over dynamic jax tracers in the bisection solver, by Roy Frostig.\r\n- Follow pjit API changes, by Yash Katariya and Peter Hawkins.\r\n- Added isotonic module to documentation, by Mathieu Blondel.\r\n\r\n# 
Contributors\r\n\r\nAymeric Galan, Chansoo Lee, Emily Fertig, Fabian Pedregosa, Lawrence Stewart, Mathieu Blondel, Peter Hawkins, Quentin Berthet, Roy Frostig, Vincent Roulet, Yash Katariya, Zaccharie Ramzi.\r\n","2023-05-26T21:58:44",{"id":176,"version":177,"summary_zh":178,"released_at":179},99745,"jaxopt-v0.6","# New features\r\n\r\n- Added new Hager-Zhang linesearch in LBFGS, by Srinivas Vasudevan (code review by Emily Fertig).\r\n- Added perceptron and hinge losses, by Quentin Berthet.\r\n- Added binary sparsemax loss, sparse_plus and sparse_sigmoid, by Vincent Roulet.\r\n- Added isotonic regression, by Michael Sander.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Added TPU support to notebooks, by Ayush Shridhar.\r\n- Allowed users to restart from a previous optimizer state in LBFGS, by Zaccharie Ramzi.\r\n- Added faster error computation in gradient descent algorithm, by Zaccharie Ramzi.\r\n- Got rid of extra function call in BFGS and LBFGS, by Zaccharie Ramzi.\r\n- Improved dtype consistency between input and output of update method, by Mathieu Blondel.\r\n- Added perturbed optimizers notebook and narrative documentation, by Quentin Berthet and Fabian Pedregosa.\r\n- Enabled auxiliary value returned by linesearch methods, by Zaccharie Ramzi.\r\n- Added distributed examples to the website, by Fabian Pedregosa.\r\n- Added Custom loop pjit example, by Felipe Llinares.\r\n- Fixed wrong latex in maml.ipynb, by Fabian Pedregosa.\r\n- Fixed bug in backtracking line search, by Srinivas Vasudevan (code review by Emily Fertig).\r\n- Added pylintrc to top level directory, by Fabian Pedregosa.\r\n- Corrected the condition function in LBFGS, by Zaccharie Ramzi.\r\n- Added custom loop pmap example, by Felipe Llinares.\r\n- Fixed pytree support in IterativeRefinement, by Louis Béthune.\r\n- Fixed has_aux support in ArmijoSGD, by Louis Béthune.\r\n- Documentation improvements, by Fabian Pedregosa and Mathieu Blondel.\r\n\r\n# Contributors\r\n\r\nAyush Shridhar, Fabian 
Pedregosa, Felipe Llinares, Louis Bethune, Mathieu Blondel, Michael Sander, Quentin Berthet, Srinivas Vasudevan, Vincent Roulet, Zaccharie Ramzi.","2023-02-09T15:54:35",{"id":181,"version":182,"summary_zh":183,"released_at":184},99746,"jaxopt-v0.5.5","# New features\r\n\r\n- Added MAML example by Fabian Pedregosa based on initial code by Paul Vicol and Eric Jiang.\r\n- Added the possibility to stop LBFGS after a line search failure, by Zaccharie Ramzi.\r\n- Added gamma to LBFGS state, by Zaccharie Ramzi.\r\n- Added jaxopt.BFGS, by Mathieu Blondel.\r\n- Added value_and_grad option to all gradient-based solvers, by Mathieu Blondel.\r\n- Added Fenchel-Young loss, by Quentin Berthet.\r\n- Added projection_sparse_simplex, by Tianlin Liu.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Fixed missing args,kwargs in resnet example, by Louis Béthune.\r\n- Corrected the implicit diff examples, by Zaccharie Ramzi.\r\n- Small optimization in l2-regularized semi-dual OT, by Mathieu Blondel.\r\n- Numerical stability improvements in jaxopt.LevenbergMarquardt, by Amir Saadat.\r\n- Dtype consistency in LBFGS, by Alex Botev.\r\n\r\n# Deprecations\r\n\r\n- jaxopt.QuadraticProgramming is now fully removed. 
Use jaxopt.CvxpyQP, jaxopt.OSQP, jaxopt.BoxOSQP and jaxopt.EqualityConstrainedQP instead.\r\n\r\n# Contributors\r\n\r\nAlex Botev, Amir Saadat, Fabian Pedregosa, Louis Béthune, Mathieu Blondel, Quentin Berthet, Tianlin Liu, Zaccharie Ramzi.\r\n","2022-10-20T09:08:54",{"id":186,"version":187,"summary_zh":188,"released_at":189},99747,"jaxopt-v0.5","# New features\r\n\r\n- Added optimal transport related projections: projection_transport, projection_birkhoff, kl_projection_transport, and kl_projection_birkhoff, by Mathieu Blondel (semi-dual formulation) and Tianlin Liu (dual formulation).\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Fix LaTeX rendering issue in notebooks, by Amélie Héliou.\r\n- Avoid gradient recompilations in zoom line search, by Mathieu Blondel.\r\n- Fix unused Jacobian issue in jaxopt.ScipyRootFinding, by Louis Béthune.\r\n- Use zoom line search by default in jaxopt.LBFGS and jaxopt.NonlinearCG, by Mathieu Blondel.\r\n- Pass tolerance argument to jaxopt.ScipyMinimize, by pipme.\r\n- Handle has_aux in jaxopt.LevenbergMarquardt, by Keunhong Park.\r\n- Add maxiter keyword argument in jaxopt.ScipyMinimize, by Fabian Pedregosa.\r\n\r\n# Contributors\r\n\r\nLouis Béthune, Mathieu Blondel, Amélie Héliou, Keunhong Park, Fabian Pedregosa, pipme.","2022-08-30T10:01:37",{"id":191,"version":192,"summary_zh":193,"released_at":194},99748,"jaxopt-v0.4.3","# New features\r\n\r\n- Added zoom line search in jaxopt.LBFGS, by Mathieu Blondel. 
It can be enabled with the linesearch=\"zoom\" option.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Added support for quadratic polynomial fun in jaxopt.BoxOSQP and jaxopt.OSQP, by Louis Béthune.\r\n- Added a notebook for the dataset distillation example, by Amélie Héliou.\r\n- Fixed wrong links and deprecation warnings in notebooks, by Fabian Pedregosa.\r\n- Changed losses to avoid roundoff, by Jack Valmadre.\r\n- Fixed init_params bug in multiclass_svm example, by Louis Béthune.\r\n\r\n# Contributors\r\n\r\nLouis Béthune, Mathieu Blondel, Amélie Héliou, Fabian Pedregosa, Jack Valmadre.","2022-06-28T17:50:09",{"id":196,"version":197,"summary_zh":198,"released_at":199},99749,"jaxopt-v0.4.2","# Bug fixes and enhancements\r\n\r\n- Fix issue with positional arguments in jaxopt.LBFGS and jaxopt.NonlinearCG, by Mathieu Blondel.\r\n","2022-06-10T20:41:59",{"id":201,"version":202,"summary_zh":203,"released_at":204},99750,"jaxopt-v0.4.1","# Bug fixes and enhancements\r\n\r\n- Improvements in [jaxopt.LBFGS](https:\u002F\u002Fgithub.com\u002Fgoogle\u002Fjaxopt\u002Freleases\u002F_autosummary\u002Fjaxopt.LBFGS.html#jaxopt.LBFGS): fixed bug when using use_gamma=True, added stepsize option, strengthened tests, by Mathieu Blondel.\r\n- Fixed link in resnet notebook, by Fabian Pedregosa.\r\n\r\n# Contributors\r\n\r\nFabian Pedregosa, Mathieu Blondel.","2022-06-10T13:48:20",{"id":206,"version":207,"summary_zh":208,"released_at":209},99751,"jaxopt-v0.4","# New features\r\n\r\n- Added solver jaxopt.LevenbergMarquardt, by Amir Saadat.\r\n- Added solver jaxopt.BoxCDQP, by Mathieu Blondel.\r\n- Added projection_hypercube, by Mathieu Blondel.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Fixed solve_normal_cg when the linear operator is “nonsquare” (does not map to a space of same dimension), by Mathieu Blondel.\r\n- Fixed edge case in jaxopt.Bisection, by Mathieu Blondel.\r\n- Replaced deprecated tree_multimap with tree_map, by Fan Yang.\r\n- Added support for leaf cond pytrees in 
tree_where, by Felipe Llinares.\r\n- Added Python 3.10 support officially, by Jeppe Klitgaard.\r\n- In scipy wrappers, converted pytree leaves to jax arrays to determine their shape\u002Fdtype, by Roy Frostig.\r\n- Converted the “Resnet” and “Adversarial Training” examples to notebooks, by Fabian Pedregosa.\r\n\r\n# Contributors\r\n\r\nAmir Saadat, Fabian Pedregosa, Fan Yang, Felipe Llinares, Jeppe Klitgaard, Mathieu Blondel, Roy Frostig.","2022-05-24T09:07:16",{"id":211,"version":212,"summary_zh":213,"released_at":214},99752,"jaxopt-v0.3.1","# New features\r\n\r\n- Pjit-based example of data parallel training using Flax, by Felipe Llinares.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Support for GPU and state of the art adversarial training algorithm (PGD) on the robust_training.py example, by Fabian Pedregosa.\r\n- Update line search in LBFGS to use jit and unroll from LBFGS, by Ian Williamson.\r\n- Support dynamic maximum iteration count in iterative solvers, by Roy Frostig.\r\n- Fix tree_where for singleton pytrees, by Louis Béthune.\r\n- Remove QuadraticProg in projections and set ``init_params=None`` by default in QP solvers, by Louis Béthune.\r\n- Add missing 'value' attribute in LbfgsState, by Mathieu Blondel.\r\n\r\n# Contributors\r\n\r\nFelipe Llinares, Fabian Pedregosa, Ian Williamson, Louis Bétune, Mathieu Blondel, Roy Frostig.","2022-02-28T21:55:02",{"id":216,"version":217,"summary_zh":218,"released_at":219},99753,"jaxopt-v0.3","# New features\r\n\r\n- jaxopt.LBFGS\r\n- jaxopt.BacktrackingLineSearch\r\n- jaxopt.GaussNewton\r\n- jaxopt.NonlinearCG\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Support implicit AD in higher-order differentiation.\r\n\r\n# Contributors\r\n\r\nAmir Saadat, Fabian Pedregosa, Geoffrey Négiar, Hyunsung Lee, Mathieu Blondel, Roy Frostig.","2022-01-31T14:38:32",{"id":221,"version":222,"summary_zh":223,"released_at":224},99754,"jaxopt-v0.2","# New features\r\n\r\n- Quadratic programming solvers jaxopt.CvxpyQP, jaxopt.OSQP, 
jaxopt.BoxOSQP and jaxopt.EqualityConstrainedQP\r\n- Iterative refinement\r\n\r\n# New examples\r\n\r\n- Resnet example with Flax and JAXopt.\r\n\r\n# Bug fixes and enhancements\r\n\r\n- Prevent recompilation of loops in solver.run if executing without jit.\r\n- Prevents recomputation of gradient in OptaxSolver.\r\n- Make solver.update jittable and ensure output states are consistent.\r\n- Allow Callable for the stepsize argument in jaxopt.ProximalGradient, jaxopt.ProjectedGradient and jaxopt.GradientDescent.\r\n\r\n# Deprecated features\r\n\r\n- jaxopt.QuadraticProgramming is deprecated and will be removed in v0.3. Use jaxopt.CvxpyQP, jaxopt.OSQP, jaxopt.BoxOSQP and jaxopt.EqualityConstrainedQP instead.\r\n\r\n# Contributors\r\n\r\nFabian Pedregosa, Felipe Llinares, Geoffrey Negiar, Louis Bethune, Mathieu Blondel, Vikas Sindhwani.","2021-12-18T22:01:10",{"id":226,"version":227,"summary_zh":228,"released_at":229},99755,"jaxopt-v0.1.1","# New features\r\n\r\n- Added solver jaxopt.ArmijoSGD\r\n- Added example Deep Equilibrium (DEQ) model in Flax with Anderson acceleration.\r\n- Added example Comparison of different SGD algorithms.\r\n\r\n# Bug fixes\r\n\r\n- Allow non-jittable proximity operators in jaxopt.ProximalGradient\r\n- Raise an exception if a quadratic program is infeasible or unbounded\r\n\r\n# Contributors\r\n\r\nFabian Pedregosa, Louis Bethune, Mathieu Blondel.","2021-10-19T14:02:52",{"id":231,"version":232,"summary_zh":233,"released_at":234},99756,"jaxopt-v0.1","# Classes\r\n\r\n- jaxopt.AndersonAcceleration\r\n- jaxopt.AndersonWrapper\r\n- jaxopt.Bisection\r\n- jaxopt.BlockCoordinateDescent\r\n- jaxopt.FixedPointIteration\r\n- jaxopt.GradientDescent\r\n- jaxopt.MirrorDescent\r\n- jaxopt.OptaxSolver\r\n- jaxopt.PolyakSGD\r\n- jaxopt.ProjectedGradient\r\n- jaxopt.ProximalGradient\r\n- jaxopt.QuadraticProgramming\r\n- jaxopt.ScipyBoundedLeastSquares\r\n- jaxopt.ScipyBoundedMinimize\r\n- jaxopt.ScipyLeastSquares\r\n- jaxopt.ScipyMinimize\r\n- 
jaxopt.ScipyRootFinding\r\n- Implicit differentiation\r\n\r\n# Examples\r\n\r\n- Binary kernel SVM with intercept.\r\n- Image classification example with Flax and JAXopt.\r\n- Image classification example with Haiku and JAXopt.\r\n- VAE example with Haiku and JAXopt.\r\n- Implicit differentiation of lasso.\r\n- Multiclass linear SVM (without intercept).\r\n- Non-negative matrix factorizaton (NMF) using alternating minimization.\r\n- Dataset distillation.\r\n- Implicit differentiation of ridge regression.\r\n- Robust training.\r\n- Anderson acceleration of gradient descent.\r\n- Anderson acceleration of block coordinate descent.\r\n- Anderson acceleration in application to Picard–Lindelöf theorem.\r\n\r\n# Contributors\r\n\r\nFabian Pedregosa, Felipe Llinares, Robert Gower, Louis Bethune, Marco Cuturi, Mathieu Blondel, Peter Hawkins, Quentin Berthet, Roy Frostig, Ta-Chu Kao","2021-10-14T11:28:17"]