[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-patrick-kidger--optimistix":3,"tool-patrick-kidger--optimistix":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",140436,2,"2026-04-05T23:32:43",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 
**RAGFlow** (infiniflow/ragflow) · ★77,062 · agent, image, dev framework, language model, other

A leading open-source retrieval-augmented generation (RAG) engine that builds a more accurate, reliable context layer for large language models. It pairs state-of-the-art RAG techniques with agent capabilities: it extracts knowledge efficiently from all kinds of documents and lets models reason and execute tasks on that knowledge. Hallucination and stale knowledge are the usual pain points of LLM applications; by deeply parsing complex document structure (tables, charts, mixed layouts), RAGFlow markedly improves retrieval accuracy, cutting down fabricated answers and keeping responses both grounded and current. Its built-in agent machinery goes further, letting the system plan its own steps to solve complex problems rather than merely answer questions. It suits developers, enterprise engineering teams and AI researchers, whether standing up a private knowledge-base Q&A system or taking large models into vertical domains. A visual workflow editor and flexible APIs lower the barrier for users without an algorithms background while leaving room for deep customisation. Released under Apache 2.0, it is becoming a key bridge between general-purpose LLMs and proprietary domain knowledge.

---

# Optimistix

**patrick-kidger/optimistix** · Python 100% · ★565 · 49 forks · Apache-2.0 · last commit 2026-04-05

> Nonlinear optimisation (root-finding, least squares, ...) in JAX+Equinox. https://docs.kidger.site/optimistix/

## Overview

Optimistix is an open-source library built on JAX and Equinox for nonlinear optimisation problems: root-finding, least squares, fixed-point iteration and function minimisation. It tackles three recurring pain points in scientific computing and deep learning: hard-to-solve nonlinear equations, inflexible algorithm composition, and fiddly hardware-acceleration setup.

It is aimed at researchers, algorithm engineers and developers who need high-performance numerical computing, especially those already solving differential equations or training models in the JAX ecosystem. Its core strengths are modularity and interoperability: a root-finding problem can be converted automatically into a least-squares problem, and optimisation strategies can be combined freely (for instance, pairing the BFGS algorithm with a trust-region update). It natively supports PyTree data structures, exploits JAX's automatic differentiation and automatic parallelism for fast compilation and execution on GPU or TPU, and works seamlessly with the mainstream optimisation library Optax, offering a concise yet powerful route to solving hard mathematical problems.

## README

Optimistix is a [JAX](https://github.com/google/jax) library for nonlinear solvers: root finding, minimisation, fixed points, and least squares.

Features include:

- interoperable solvers: e.g. autoconvert root find problems to least squares problems, then solve using a minimisation algorithm.
- modular optimisers: e.g. use a BFGS quadratic bowl with a dogleg descent path with a trust region update.
- using a PyTree as the state.
- fast compilation and runtimes.
- interoperability with [Optax](https://github.com/deepmind/optax), as in the sketch after this list.
- all the benefits of working with JAX: autodiff, autoparallelism, GPU/TPU support etc.
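To make the Optax bullet concrete, here is a minimal sketch of wrapping a first-order Optax optimiser as an Optimistix minimiser. It assumes `optx.OptaxMinimiser` takes the Optax optimiser as its first argument followed by `rtol`/`atol` tolerances, and that `optx.minimise` accepts a `max_steps` keyword; the exact keyword layout may differ between versions.

```python
import jax.numpy as jnp
import optax
import optimistix as optx

# A simple quadratic objective; Optimistix minimisers expect fn(y, args).
def loss(y, args):
    return jnp.sum((y - 3.0) ** 2)

# Wrap an Optax first-order optimiser so it runs under optx.minimise,
# with Optimistix supplying the termination criteria (rtol/atol).
solver = optx.OptaxMinimiser(optax.adam(learning_rate=1e-2), rtol=1e-4, atol=1e-4)
sol = optx.minimise(loss, solver, jnp.zeros(3), max_steps=10_000)
print(sol.value)  # approximately [3., 3., 3.]
```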
### Installation

```bash
pip install optimistix
```

Requires Python 3.11+.

### Documentation

Available at [https://docs.kidger.site/optimistix](https://docs.kidger.site/optimistix).

### Quick example

```python
import jax.numpy as jnp
import optimistix as optx

# Let's solve the ODE dy/dt=tanh(y(t)) with the implicit Euler method.
# We need to find y1 s.t. y1 = y0 + tanh(y1)*dt.

y0 = jnp.array(1.)
dt = jnp.array(0.1)

def fn(y, args):
    return y0 + jnp.tanh(y) * dt

solver = optx.Newton(rtol=1e-5, atol=1e-5)
sol = optx.fixed_point(fn, solver, y0)
y1 = sol.value  # satisfies y1 == fn(y1)
```
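The other entry points follow the same shape as `fixed_point`. A least-squares sketch, assuming `optx.least_squares` and `optx.LevenbergMarquardt` (both appear throughout the release notes below) mirror the API above; note the parameters travel as a PyTree, here a tuple of scalars:

```python
import jax.numpy as jnp
import optimistix as optx

# Recover (a, b) from noiseless samples of y = a * exp(-b * t).
t = jnp.linspace(0.0, 1.0, 20)
data = 2.0 * jnp.exp(-3.0 * t)

def residuals(params, args):
    a, b = params  # params is a PyTree (here, a tuple of scalars)
    return a * jnp.exp(-b * t) - data  # zero at the true parameters

solver = optx.LevenbergMarquardt(rtol=1e-8, atol=1e-8)
sol = optx.least_squares(residuals, solver, (jnp.array(1.0), jnp.array(1.0)))
a_fit, b_fit = sol.value  # approximately (2.0, 3.0)
```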
### Citation

If you found this library to be useful in academic work, then please cite: ([arXiv link](https://arxiv.org/abs/2402.09983))

```bibtex
@article{optimistix2024,
    title={Optimistix: modular optimisation in JAX and Equinox},
    author={Jason Rader and Terry Lyons and Patrick Kidger},
    journal={arXiv:2402.09983},
    year={2024},
}
```

### See also: other libraries in the JAX ecosystem

**Always useful**
[Equinox](https://github.com/patrick-kidger/equinox): neural networks and everything not already in core JAX!
[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.

**Deep learning**
[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
[paramax](https://github.com/danielward27/paramax): parameterizations and constraints for PyTrees.

**Scientific computing**
[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
[PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)

**Awesome JAX**
[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.

### Credit

Optimistix was primarily built by Jason Rader (@packquickly): [Twitter](https://twitter.com/packquickly); [GitHub](https://github.com/packquickly); [Website](https://www.packquickly.com/). It is co-maintained by Johanna Haffner (@johannahaffner): [GitHub](https://github.com/johannahaffner); [Website](https://haffner.dev).
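The README's quick example covers fixed points; minimisation follows the same define/solve pattern. A minimal sketch, assuming `optx.minimise` and `optx.BFGS` (both named in the release notes below) accept the same style of arguments as the fixed-point API above:

```python
import jax.numpy as jnp
import optimistix as optx

# Minimise the 2D Rosenbrock function, whose minimum is at (1, 1).
def rosenbrock(y, args):
    x1, x2 = y
    return (1.0 - x1) ** 2 + 100.0 * (x2 - x1**2) ** 2

solver = optx.BFGS(rtol=1e-6, atol=1e-6)
sol = optx.minimise(rosenbrock, solver, jnp.array([2.0, 2.0]))
print(sol.value)  # approximately [1., 1.]
```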
## Quick-start guide

Optimistix is a JAX-based nonlinear solver library supporting root-finding, minimisation, fixed-point iteration and least squares. It features a modular design, automatic differentiation, GPU/TPU acceleration and interoperability with the Optax ecosystem.

### Environment

- **Python**: 3.11 or later
- **Prerequisites**:
  - [JAX](https://github.com/google/jax) (installed automatically)
  - Optional: [Optax](https://github.com/deepmind/optax) (for gradient-optimiser integration)

> 💡 Tip: users in mainland China can speed up dependency installation with the Tsinghua or USTC mirror.

### Installation

Install directly with pip:

```bash
pip install optimistix
```

To install through a mirror in mainland China:

```bash
pip install optimistix -i https://pypi.tuna.tsinghua.edu.cn/simple
```

### Basic usage

The following example uses `optx.fixed_point` to solve a simple fixed-point problem (an implicit Euler step of an ODE):

```python
import jax.numpy as jnp
import optimistix as optx

# Define the equation: y1 = y0 + tanh(y1) * dt
y0 = jnp.array(1.)
dt = jnp.array(0.1)

def fn(y, args):
    return y0 + jnp.tanh(y) * dt

# Choose a Newton solver
solver = optx.Newton(rtol=1e-5, atol=1e-5)

# Solve for the fixed point
sol = optx.fixed_point(fn, solver, y0)
y1 = sol.value  # satisfies y1 == fn(y1)
```

This shows the core usage pattern: define the objective function, pick a solver, call the solving entry point, and read off the result. For more advanced functionality (minimisation, custom optimisation paths and so on), see the official documentation: https://docs.kidger.site/optimistix

## Use case

A quantitative research team is building high-frequency-trading prediction models based on implicit numerical methods, and needs to solve complex systems of nonlinear equations efficiently to calibrate market parameters.

### Without optimistix

- **Fragmented algorithms**: different mathematical problems such as root finding and least squares mean juggling SciPy or hand-rolled solvers, leaving the codebase fragmented and hard to maintain.
- **Hardware acceleration is hard**: traditional solvers cannot natively use JAX's automatic differentiation or GPU/TPU parallelism, so large-scale parameter calibration takes far too long.
- **Poor compositionality**: trying an advanced tactic such as "solve a root-finding problem with a least-squares algorithm" means rewriting low-level logic, stretching experiment cycles to days.
- **Ecosystem friction**: interfacing directly with Optax optimisers or Equinox neural-network modules is awkward, with tedious and error-prone data-format conversions.

### With optimistix

- **One solving interface**: interoperable solvers convert root-finding problems into least-squares form automatically, so one codebase handles the whole range of nonlinear problems.
- **Native high performance**: inheriting JAX's autodiff and compilation, parameter fits that used to take hours shrink to minutes, fully exploiting the GPU.
- **Free modular composition**: algorithm components such as BFGS, trust regions and dogleg paths snap together like building blocks, so researchers can validate hybrid optimisation strategies quickly.
- **Seamless ecosystem fit**: native PyTree support integrates deeply with Optax and Equinox models, giving fluent end-to-end development from network construction to equation solving.

By pairing modular design with deep JAX integration, optimistix improves the development efficiency of complex nonlinear solving by orders of magnitude, letting researchers focus on algorithmic innovation rather than low-level plumbing.

## Project details

- **Owner**: Patrick Kidger (@patrick-kidger), Cradle.bio, Zürich. "ML+proteins, sciML, numerics, neural ODEs / building 'scipy w/ autodiff+GPU' in JAX: Equinox, Diffrax, Lineax, etc." Twitter: @PatrickKidger · https://kidger.site
- **Language**: Python (100%)
- **Stars / forks**: 565 / 49
- **License**: Apache-2.0
- **Python**: 3.11+
- **Dependencies**: jax, equinox, optax (optional)
- **GPU**: not required, but GPU/TPU acceleration is available through JAX (exact hardware and memory needs depend on the underlying JAX configuration and problem size)
- **Environment notes**: the library is built on JAX, so JAX and its hardware backend (CPU/GPU/TPU) must be configured first. For GPU acceleration install a CUDA-enabled `jax[cuda]`; for TPUs, run in a suitable cloud environment. The library itself is lightweight; its main dependencies are JAX-ecosystem components.
- **GitHub topics**: deep-learning, equinox, jax, neural-networks, optimisation, optimization

## FAQ

**Q: When batching a function that contains an optimistix solve with `jax.vmap` and then computing gradients, I get a "pytree does not match out_structure" error. What should I do?**

A: This is a known issue, usually tied to a design decision in `jax.checkpoint`. The maintainer has shipped a fix in the Equinox library: upgrade `equinox` to the latest version (the fix has been merged), or try the fix branch while waiting for a release. The error does not stem from a logic bug in user code; it is a compatibility issue in the underlying autodiff machinery when back-propagating through implicit solves. A batching sketch follows below. (Source: https://github.com/patrick-kidger/optimistix/issues/48)
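For reference, the batching pattern the question describes looks roughly like the sketch below. The toy problem is hypothetical; the `args` argument to `optx.root_find` is assumed to match the solver API shown in the quick example.

```python
import jax
import jax.numpy as jnp
import optimistix as optx

# Batch a root-find over a family of parameters c with jax.vmap.
def fn(y, args):
    c = args
    return y**3 + y - c  # monotone in y, so there is a unique real root

def solve(c):
    solver = optx.Newton(rtol=1e-6, atol=1e-6)
    return optx.root_find(fn, solver, jnp.array(1.0), args=c).value

roots = jax.vmap(solve)(jnp.linspace(0.0, 2.0, 8))
```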
**Q: Does `OptaxMinimiser` support Optax optimisers that use a backtracking linesearch (such as L-BFGS, or Adam with a linesearch)?**

A: Early versions did not, because Optax's linesearch update functions require extra keyword arguments ('value', 'grad', 'value_fn') that `OptaxMinimiser` did not pass through. This was fixed in PR #122; `optimistix` now forwards those extra arguments to Optax's update function. If you still hit problems, upgrade to the latest version. Note that Optax's own linesearch is built on `lax.while_loop` and currently does not support reverse-mode differentiation; if you need that, use the linesearches that ship with `optimistix`. (Source: https://github.com/patrick-kidger/optimistix/issues/121)

**Q: Is Optimistix a good base library for implementing gradient-free (global, constrained, evolutionary) optimisation algorithms?**

A: Optimistix focuses on local methods that exploit derivative information. Simple gradient-free solvers (such as Nelder-Mead) are implemented, but the core architecture leans towards differentiable optimisation. For sophisticated gradient-free, global or evolutionary algorithms, the maintainers point to standalone community packages (such as `mutax`), treating them as outside the library's current core scope; you may need to extend it yourself or reach for a specialised library. (Source: https://github.com/patrick-kidger/optimistix/issues/166)

**Q: I hit runtime errors (such as XlaRuntimeError) and poor results after swapping an iterative solver for Optimistix. How do I debug?**

A: These problems usually come down to numerical instability or misconfiguration in the specific problem. The maintainers ask for a minimal working example (MWE) so the issue can be pinned down. If you cannot produce one immediately, try temporarily switching back to the Newton solver or another, more stable solver. Checking that the input data ranges and the initial guess are sensible is another common debugging step. (Source: https://github.com/patrick-kidger/optimistix/issues/74)

**Q: How do I implement an L-BFGS-style quasi-Newton method in Optimistix?**

A: Subclass `AbstractQuasiNewtonUpdate` to implement alternative Hessian approximations, including L-BFGS. Official constrained-optimisation features are still in development, but you can consult the existing prototype code or customise via inheritance; the library is extensible enough to support advanced quasi-Newton algorithms. (Source: https://github.com/patrick-kidger/optimistix/issues/121)

**Q: Does Optimistix still support Python 3.9?**

A: At the time of that discussion, yes: the maintainer saw no harm in continuing to support Python 3.9 and no strong urge to drop it immediately. Note, however, that the v0.1.0 release notes below raise the minimum Python version to 3.11. (Source: https://github.com/patrick-kidger/optimistix/issues/74)

## Releases

### v0.1.0 (2026-02-16)

**Breaking changes:**

- Verbose printing has changed (across all supported solvers: `optimistix.{DogLeg, LevenbergMarquardt, IndirectLevenbergMarquardt, LBFGS, ...etc}`).

    Instead of consuming a frozenset of elements to display, e.g. `LevenbergMarquardt(..., verbose=frozenset({"loss"}))`, this should now be set to either `False` (the default, print nothing), `True` (print everything), or a callable for full control over what is printed. See the sketch after these notes.

**Features:**

- `optimistix.FixedPointIteration` now optionally supports damping. (Thanks @aidancrilly! #210)

**Bugfixes:**

- No longer triggering JAX initialisation on import. (#186)
- `optimistix.BestSoFarMinimiser` and `optimistix.BestSoFarLeastSquares` now correctly check the final step when determining the best result. (#33, #211)
- `optimistix.BFGS` now accepts `float32` input even when JAX is in `float64` mode. Previously this would crash. (#207, #212)
- In the `solver.step` API, solvers were incorrectly reporting the `aux` output from a different step. This should not affect results from the eventual e.g. `root_find(...).aux`, but it will have affected those interacting with the per-step API directly. (#211)
- If `solver.terminate` returns a non-successful result then this now terminates the solve. Previously it was ignored and only the boolean flag was respected.

**Misc:**

- If a nonfinite value occurs during the solve then the solve will terminate.
- Minimum Python version is now 3.11.
- For contributors: added lots of benchmarks.

**New contributors:** @jaeminoh made their first contribution in https://github.com/patrick-kidger/optimistix/pull/178

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.11...v0.1.0
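As a before/after sketch of the verbose change described above (hypothetical tolerances; only the `verbose` argument is the point):

```python
import optimistix as optx

# Pre-v0.1.0 style, no longer supported:
#   optx.LevenbergMarquardt(rtol=1e-8, atol=1e-8, verbose=frozenset({"loss"}))
# v0.1.0 style: a bool, or a callable for full control over what is printed.
quiet = optx.LevenbergMarquardt(rtol=1e-8, atol=1e-8)                 # default: print nothing
chatty = optx.LevenbergMarquardt(rtol=1e-8, atol=1e-8, verbose=True)  # print everything
```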
### v0.0.11 (2025-10-17)

**New minimisers:**

- `optimistix.LBFGS`. BFGS, but low-memory! (Thanks @BalzaniEdoardo! #135)
- `optimistix.DFP`. A variant of BFGS. (Thanks @pfackeldey! #125)
- Plus their abstract counterparts: `optimistix.{AbstractQuasiNewton, AbstractDFP, AbstractLBFGS}`
- `optimistix.GoldenSearch`. A simple 1D algorithm that does not require gradient evaluations. (Thanks @johannahaffner! #169)

**Features:**

- All minimisers now support a choice of either forward- or reverse-mode autodiff in their internal Jacobian calculations, settable via `optimistix.minimise(..., options={"autodiff_mode": "fwd"|"bwd"})`. The default is `"bwd"`, corresponding to reverse-mode autodiff. (Thanks @johannahaffner! #114) A usage sketch follows these notes.
- `optimistix.OptaxMinimiser` now supports Optax solvers that include a line search. (Thanks @bagibence! #121, #122)

**Bugfixes:**

- Fixed `optimistix.{Newton,Chord}` crashing on 32-bit problems whilst in 64-bit mode. (#155, #156)

**New contributors:** @danielward27 made their first contribution in https://github.com/patrick-kidger/optimistix/pull/111

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.10...v0.0.11
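A sketch of the `autodiff_mode` option described above (the toy objective is hypothetical; the `options` dictionary is quoted from the release note):

```python
import jax.numpy as jnp
import optimistix as optx

def loss(y, args):
    return jnp.sum(jnp.sin(y) ** 2)

# Ask the solver to build its internal Jacobians with forward-mode autodiff;
# the default "bwd" uses reverse mode.
solver = optx.BFGS(rtol=1e-6, atol=1e-6)
sol = optx.minimise(loss, solver, jnp.arange(1.0, 4.0),
                    options={"autodiff_mode": "fwd"})
```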
### v0.0.10 (2024-12-24)

- Compatibility with JAX 0.4.38 (no more warnings). Make sure to also upgrade to Equinox v0.11.11.
- Feature: added `optx.BFGS(verbose=...)` to enable verbose step-by-step printing. (Thanks @johannahaffner! #95)
    - This was also added to `optx.AbstractBFGS`; if you create a custom subclass of `AbstractBFGS` you will need to add this attribute.
- Bugfix: `optx.root_find(..., SomeLeastSquaresSolver(), ...).aux` now returns just the auxiliary value, instead of a `(root, aux)` pair. (The root is available on `.value` anyway.)
- Fixed some docs. (Thanks @johannahaffner! #98, #99)

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.9...v0.0.10

### v0.0.9 (2024-10-21)

A compatibility release: JAX 0.4.34 changed how custom autodifferentiation rules work. As of this update we should now be compatible with that change. (#87)

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.8...v0.0.9

### v0.0.8 (2024-09-30)

- Compatibility with the latest JAX (fixed some warnings, and some crashes from the `jax.ShapeDtypeStruct.weak_dtype` breaking change).
- Many solvers that need to calculate Jacobians will now preferentially use forward mode over reverse mode where possible. This generally improves speed. (#61)
- `OptaxMinimiser` now supports Optax optimizers that need the current parameter state. (Thanks @NeilGirdhar! https://github.com/patrick-kidger/optimistix/pull/77)
- `Bisection` now optionally supports expanding the interval to find the root. (Thanks @NeilGirdhar! https://github.com/patrick-kidger/optimistix/pull/78)
- Added a warning that complex numbers are still a work in progress; it turns out that some facets of autodifferentiably optimising functions C -> R may still be an open research question.
- Doc fixes. (Thanks @johannahaffner! #69)
- Added a `py.typed` file to signal static type-checking compatibility. (Thanks @NeilGirdhar! https://github.com/patrick-kidger/optimistix/pull/80)

**New contributors:** @johannahaffner made their first contribution in https://github.com/patrick-kidger/optimistix/pull/69; @NeilGirdhar made their first contribution in https://github.com/patrick-kidger/optimistix/pull/77

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.7...v0.0.8

### v0.0.7 (2024-05-12)

**Features:**

- Support for complex numbers! In theory these should now be supported everywhere. In practice we are still considering this a little experimental, just in case something was missed. (Huge thanks to @Randl! #53)
- `optimistix.{AbstractGaussNewton, GaussNewton, LevenbergMarquardt, IndirectLevenbergMarquardt, Dogleg}` should now all support using reverse-mode autodiff to calculate Jacobians. (#51)

**Bugfixes:**

- Fixed a crash when using `jax.disable_jit`. (See https://github.com/patrick-kidger/diffrax/issues/368, #43)
- Fixed terminating on the first step for functions with initial value zero. (Thanks @NightWinkle! #49)

**Other:**

- Documentation fixes. (Thanks @ColCarroll! #32)
- Now compatible with `jax_numpy_rank_promotion=raise` and `jax_numpy_dtype_promotion=strict`.

**New contributors:** @ColCarroll (#32), @NightWinkle (#49) and @Randl (#53) made their first contributions.

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.6...v0.0.7

### v0.0.6 (2023-12-27)

Bugfix release!

- Fixed recompilation happening on every step when [iterating step-by-step](https://docs.kidger.site/optimistix/examples/interactive/) with a Gauss-Newton solver. (#30)
- `sol.state` previously included only the array-valued parts of the state (for `sol = optx.{minimise, least_squares, root_find, fixed_point}(...)`). It now includes everything.
- Fixed `optx.internal.implicit_jvp` misbehaving for non-Optimistix use cases.
- Fixed `optx.{Newton,Chord}(cauchy_termination=False)` failing when started close to the solution.

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.5...v0.0.6

### v0.0.5 (2023-10-13)

A very simple release!

- Added `optimistix.compat.minimize` as a replacement for `jax.scipy.optimize.minimize`. (#14)

**Full changelog:** https://github.com/patrick-kidger/optimistix/compare/v0.0.4...v0.0.5

### v0.0.4 (2023-10-05)

Hurrah! How exciting.