[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-Nano-Collective--nanocoder":3,"tool-Nano-Collective--nanocoder":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",140436,2,"2026-04-05T23:32:43",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 
是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,"2026-04-04T04:44:48",[15,14,13,26,54],{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":70,"readme_en":71,"readme_zh":72,"quickstart_zh":73,"use_case_zh":74,"hero_image_url":75,"owner_login":76,"owner_name":77,"owner_avatar_url":78,"owner_bio":79,"owner_company":80,"owner_location":80,"owner_email":80,"owner_twitter":80,"owner_website":81,"owner_url":82,"languages":83,"stars":106,"forks":107,"last_commit_at":108,"license":109,"difficulty_score":23,"env_os":110,"env_gpu":111,"env_ram":112,"env_deps":113,"category_tags":119,"github_topics":120,"view_count":23,"oss_zip_url":80,"oss_zip_packed_at":80,"status":16,"created_at":130,"updated_at":131,"faqs":132,"releases":161},2173,"Nano-Collective\u002Fnanocoder","nanocoder","A beautiful local-first coding agent running in your terminal - built by the community for the community ⚒","Nanocoder 是一款运行在终端本地的智能编程助手，由非营利的 Nano Collective 社区共同构建。它将类似 Claude Code 或 Gemini CLI 的强大代理编码能力带入本地环境，支持连接本地大模型（如通过 Ollama）或受控的 API 服务（如 OpenRouter），让开发者能在熟悉的命令行界面中直接进行代码分析、重构、文件操作及命令执行。\n\n针对开发者对数据隐私和工具控制权的担忧，Nanocoder 坚持“本地优先”原则，确保代码不出本地即可享受 AI 辅助，同时提供透明的配置选项。它解决了传统云端 AI 编程工具可能存在的数据泄露风险及依赖特定厂商的问题，让用户能灵活选择模型提供商，真正实现自主可控的智能化开发流程。\n\n这款工具特别适合注重隐私安全、习惯使用命令行的高效开发者，以及希望在不依赖昂贵云服务的前提下探索本地大模型应用的技术人员。其独特的技术亮点在于完全开源透明、支持多模型后端切换、内置丰富的斜杠命令与快捷键系统，并可通过 MCP 服务器扩展功能。安装简便，只需一条 npm 命令即可全局部署，是追求自由、开放与协作精神的现代开发者的理想选择","Nanocoder 是一款运行在终端本地的智能编程助手，由非营利的 Nano Collective 社区共同构建。它将类似 Claude Code 或 Gemini CLI 的强大代理编码能力带入本地环境，支持连接本地大模型（如通过 Ollama）或受控的 API 
服务（如 OpenRouter），让开发者能在熟悉的命令行界面中直接进行代码分析、重构、文件操作及命令执行。\n\n针对开发者对数据隐私和工具控制权的担忧，Nanocoder 坚持“本地优先”原则，确保代码不出本地即可享受 AI 辅助，同时提供透明的配置选项。它解决了传统云端 AI 编程工具可能存在的数据泄露风险及依赖特定厂商的问题，让用户能灵活选择模型提供商，真正实现自主可控的智能化开发流程。\n\n这款工具特别适合注重隐私安全、习惯使用命令行的高效开发者，以及希望在不依赖昂贵云服务的前提下探索本地大模型应用的技术人员。其独特的技术亮点在于完全开源透明、支持多模型后端切换、内置丰富的斜杠命令与快捷键系统，并可通过 MCP 服务器扩展功能。安装简便，只需一条 npm 命令即可全局部署，是追求自由、开放与协作精神的现代开发者的理想选择。","# Nanocoder\n\nA local-first CLI coding agent built by the [Nano Collective](https:\u002F\u002Fgithub.com\u002FNano-Collective) — a community collective building AI tooling not for profit, but for the community. Everything we build is open, transparent, and driven by the people who use it. AI done right.\n\nNanocoder brings the power of agentic coding tools like Claude Code and Gemini CLI to local models or controlled APIs like OpenRouter. Built with privacy and control in mind, it supports multiple AI providers with tool support for file operations and command execution.\n\n![Example](.\u002F.github\u002Fassets\u002Fexample.gif)\n\n---\n![Build Status](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fbuild.svg)\n![Coverage](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fcoverage.svg)\n![Version](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fnpm-version.svg)\n![NPM Downloads](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fnpm-downloads-monthly.svg)\n![NPM License](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fnpm-license.svg)\n![Repo 
Size](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Frepo-size.svg)\n![Stars](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fstars.svg)\n![Forks](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fforks.svg)\n\n## Quick Start\n\n```bash\nnpm install -g @nanocollective\u002Fnanocoder\nnanocoder\n```\n\nAlso available via [Homebrew](docs\u002Fgetting-started\u002Finstallation.md#homebrew-macoslinux) and [Nix Flakes](docs\u002Fgetting-started\u002Finstallation.md#nix-flakes).\n\n### CLI Flags\n\nSpecify provider and model directly:\n\n```bash\n# Non-interactive mode with specific provider\u002Fmodel\nnanocoder --provider openrouter --model google\u002Fgemini-3.1-flash run \"analyze src\u002Fapp.ts\"\n\n# Interactive mode starting with specific provider\nnanocoder --provider ollama --model llama3.1\n\n# Flags can appear before or after 'run' command\nnanocoder run --provider openrouter \"refactor database module\"\n```\n\n## Documentation\n\nFull documentation is available online at **[docs.nanocollective.org](https:\u002F\u002Fdocs.nanocollective.org\u002Fnanocoder\u002Fdocs)** or in the [docs\u002F](docs\u002F) folder:\n\n- **[Getting Started](docs\u002Fgetting-started\u002Findex.md)** - Installation, setup, and first steps\n- **[Configuration](docs\u002Fconfiguration\u002Findex.md)** - AI providers, MCP servers, preferences, logging, timeouts\n- **[Features](docs\u002Ffeatures\u002Findex.md)** - Custom commands, checkpointing, development modes, task management, and more\n- **[Commands Reference](docs\u002Fcommands.md)** - Complete list of built-in slash commands\n- **[Keyboard Shortcuts](docs\u002Fkeyboard-shortcuts.md)** - Full shortcut reference\n- **[Community](docs\u002Fcommunity.md)** - Contributing, Discord, and how to help\n\n## Community\n\nThe Nano Collective is a community collective building AI 
tooling for the community, not for profit. We'd love your help!\n\n- **Contributing**: See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup and guidelines\n- **Discord**: [Join our server](https:\u002F\u002Fdiscord.gg\u002FktPDV6rekE) to connect with other users and contributors\n- **GitHub**: Open issues or join discussions on our repository\n","# Nanocoder\n\n由 [Nano Collective](https:\u002F\u002Fgithub.com\u002FNano-Collective) 构建的本地优先 CLI 编码代理——这是一个非营利性社区组织，致力于为社区打造 AI 工具。我们构建的每一项产品都开放、透明，并由使用者共同驱动。这才是正确的 AI 打造方式。\n\nNanocoder 将 Claude Code 和 Gemini CLI 等代理式编码工具的强大功能引入本地模型或受控 API（如 OpenRouter）。它以隐私和控制为核心设计，支持多家 AI 提供商，并提供文件操作和命令执行等工具支持。\n\n![示例](.\u002F.github\u002Fassets\u002Fexample.gif)\n\n---\n![构建状态](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fbuild.svg)\n![覆盖率](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fcoverage.svg)\n![版本](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fnpm-version.svg)\n![NPM 下载量](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fnpm-downloads-monthly.svg)\n![NPM 许可证](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fnpm-license.svg)\n![仓库大小](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Frepo-size.svg)\n![星标](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fstars.svg)\n![Fork 数](https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fraw\u002Fmain\u002Fbadges\u002Fforks.svg)\n\n## 快速入门\n\n```bash\nnpm install -g @nanocollective\u002Fnanocoder\nnanocoder\n```\n\n也可通过 [Homebrew](docs\u002Fgetting-started\u002Finstallation.md#homebrew-macoslinux) 和 [Nix Flakes](docs\u002Fgetting-started\u002Finstallation.md#nix-flakes) 安装。\n\n### CLI 
标志\n\n直接指定提供商和模型：\n\n```bash\n# 非交互模式，使用特定提供商\u002F模型\nnanocoder --provider openrouter --model google\u002Fgemini-3.1-flash run \"analyze src\u002Fapp.ts\"\n\n# 交互模式，从特定提供商开始\nnanocoder --provider ollama --model llama3.1\n\n# 标志可以出现在 'run' 命令之前或之后\nnanocoder run --provider openrouter \"refactor database module\"\n```\n\n## 文档\n\n完整文档可在线上 **[docs.nanocollective.org](https:\u002F\u002Fdocs.nanocollective.org\u002Fnanocoder\u002Fdocs)** 或在 [docs\u002F](docs\u002F) 文件夹中找到：\n\n- **[入门指南](docs\u002Fgetting-started\u002Findex.md)** —— 安装、设置及第一步操作\n- **[配置](docs\u002Fconfiguration\u002Findex.md)** —— AI 提供商、MCP 服务器、偏好设置、日志记录、超时等\n- **[特性](docs\u002Ffeatures\u002Findex.md)** —— 自定义命令、检查点、开发模式、任务管理等\n- **[命令参考](docs\u002Fcommands.md)** —— 内置斜杠命令的完整列表\n- **[键盘快捷键](docs\u002Fkeyboard-shortcuts.md)** —— 完整快捷键参考\n- **[社区](docs\u002Fcommunity.md)** —— 贡献、Discord 社区及如何参与帮助\n\n## 社区\n\nNano Collective 是一个非营利性的社区组织，专注于为社区构建 AI 工具。我们非常欢迎您的参与！\n\n- **贡献**：请参阅 [CONTRIBUTING.md](CONTRIBUTING.md)，了解开发环境搭建及贡献指南\n- **Discord**：加入我们的服务器 [discord.gg\u002FktPDV6rekE](https:\u002F\u002Fdiscord.gg\u002FktPDV6rekE)，与其他用户和贡献者交流\n- **GitHub**：在我们的仓库中提交问题或参与讨论","# Nanocoder 快速上手指南\n\nNanocoder 是一款由 Nano Collective 社区打造的“本地优先”命令行 AI 编程助手。它支持连接本地模型（如 Ollama）或受控 API（如 OpenRouter），在提供强大智能编码能力的同时，确保数据隐私与控制权。\n\n## 环境准备\n\n*   **系统要求**：支持 macOS、Linux 及 Windows（需安装 Node.js 环境）。\n*   **前置依赖**：\n    *   已安装 **Node.js** (建议 LTS 版本) 和 **npm**。\n    *   （可选）若使用本地模型，请预先安装并运行 **Ollama** 或其他兼容的本地推理服务。\n    *   （可选）若使用云端模型，请准备好相应的 API Key（如 OpenRouter Key）。\n\n> **提示**：国内开发者若遇到 npm 安装缓慢问题，可临时切换至国内镜像源：\n> ```bash\n> npm config set registry https:\u002F\u002Fregistry.npmmirror.com\n> ```\n\n## 安装步骤\n\n推荐使用 npm 进行全局安装：\n\n```bash\nnpm install -g @nanocollective\u002Fnanocoder\n```\n\n*其他安装方式*：macOS\u002FLinux 用户也可通过 **Homebrew** 或 **Nix Flakes** 安装（详见官方文档）。\n\n## 基本使用\n\n安装完成后，即可在终端直接使用 `nanocoder` 命令。\n\n### 1. 启动交互式模式\n直接输入命令进入交互界面，根据提示选择提供商和模型：\n\n```bash\nnanocoder\n```\n\n### 2. 
指定提供商与模型启动\n你可以直接在命令行指定 AI 提供商和模型名称启动：\n\n**使用本地模型 (Ollama):**\n```bash\nnanocoder --provider ollama --model llama3.1\n```\n\n**使用云端 API (OpenRouter):**\n```bash\nnanocoder --provider openrouter --model google\u002Fgemini-3.1-flash\n```\n\n### 3. 执行单次任务（非交互模式）\n直接让 Nanocoder 执行特定指令并输出结果，无需进入交互界面：\n\n```bash\nnanocoder --provider openrouter --model google\u002Fgemini-3.1-flash run \"analyze src\u002Fapp.ts\"\n```\n\n或者将 `run` 命令放在前面：\n\n```bash\nnanocoder run --provider openrouter \"refactor database module\"\n```\n\n启动后，你可以利用其文件操作和命令执行能力，通过自然语言让 AI 协助你编写、重构或分析代码。","一位后端开发者需要在断网或高隐私要求的内网环境中，快速重构遗留的数据库模块并修复潜在的类型错误。\n\n### 没有 nanocoder 时\n- **隐私与网络焦虑**：担心将核心业务代码上传至云端大模型会泄露数据，且在无外网环境下无法使用任何 AI 辅助工具。\n- **上下文切换频繁**：需要在编辑器、浏览器文档和终端之间反复跳转，手动复制粘贴代码片段来询问逻辑问题，打断心流。\n- **操作繁琐低效**：AI 给出的修改建议无法直接执行，开发者必须手动定位文件、逐行修改代码并重新运行测试命令验证结果。\n- **成本不可控**：依赖商业 API 进行大规模代码分析时，高昂的 Token 费用让团队在尝试新技术时顾虑重重。\n\n### 使用 nanocoder 后\n- **本地优先的安全体验**：直接通过 Ollama 调用本地部署的 Llama 3.1 模型，所有代码分析与生成均在本地完成，彻底杜绝数据外泄风险。\n- **终端内的无缝交互**：直接在终端输入 `nanocoder run \"refactor database module\"`，AI 自动读取文件上下文并提供方案，无需离开命令行环境。\n- **自主执行与验证**：nanocoder 具备文件操作和命令执行能力，能自动应用代码修改并运行测试脚本，开发者只需确认关键步骤即可。\n- **灵活的低成本接入**：支持通过 OpenRouter 等接口按需切换模型，或在本地免费运行，让高频次的代码重构任务不再受预算限制。\n\nnanocoder 让开发者在完全掌控数据和环境的前提下，享受到了媲美云端智能体的自动化编码效率。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FNano-Collective_nanocoder_a674e052.png","Nano-Collective","Nano Collective","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002FNano-Collective_26c29cc6.png","We're a collective building open-source AI. 
Our goal is to create powerful, privacy-first tools, developed by the community for the community.",null,"https:\u002F\u002Fnanocollective.org","https:\u002F\u002Fgithub.com\u002FNano-Collective",[84,88,92,95,99,102],{"name":85,"color":86,"percentage":87},"TypeScript","#3178c6",99.1,{"name":89,"color":90,"percentage":91},"JavaScript","#f1e05a",0.4,{"name":93,"color":94,"percentage":91},"Shell","#89e051",{"name":96,"color":97,"percentage":98},"Nix","#7e7eff",0.1,{"name":100,"color":101,"percentage":98},"Dockerfile","#384d54",{"name":103,"color":104,"percentage":105},"Ruby","#701516",0,1632,146,"2026-04-05T22:35:08","NOASSERTION","Linux, macOS, Windows","未说明（支持本地模型如 Ollama，具体 GPU 需求取决于所选模型；也支持云端 API 如 OpenRouter，无需本地 GPU）","未说明（取决于运行的本地模型大小）",{"notes":114,"python":115,"dependencies":116},"该工具是一个基于 Node.js 的命令行代理，可通过 npm、Homebrew 或 Nix Flakes 安装。它既支持连接本地模型（需自行配置如 Ollama 等运行环境），也支持远程 API（如 OpenRouter）。若使用本地模型，硬件需求由所选模型决定；若使用远程 API，则无特殊本地硬件要求。","未说明（基于 Node.js，通过 npm 安装）",[117,118],"Node.js","npm",[13,15,26,14],[121,122,123,124,125,126,127,128,129],"ai","ai-agents","ai-coding","coding-agents","llm","llm-inference","ollama","openai","openrouter","2026-03-27T02:49:30.150509","2026-04-06T08:17:50.629394",[133,138,143,148,153,157],{"id":134,"question_zh":135,"answer_zh":136,"source_url":137},10018,"如何跳过 MCP 服务器配置步骤？","在配置向导的 MCP 选择界面底部，提供了\"Skip adding MCPs\"（跳过添加 MCP 服务器）或\"Done adding MCP servers\"选项。无论您当前处于\"Local\"（本地）还是\"Remote\"（远程）标签页，都可以点击该选项直接跳转到下一步或最终审查界面，无需强制配置服务器。","https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fissues\u002F100",{"id":139,"question_zh":140,"answer_zh":141,"source_url":142},10016,"Nanocoder 是否支持不使用工具调用（non-tool-calling）的模型？","是的，从 v1.8.0 版本开始初步支持，并在 v1.11.0 版本中大幅改进。现在非工具调用模型可以正常工作，循环和重新提示的情况已减少。但仍存在长程任务准确性不足的问题。维护者正在通过集成 LiteLLM、优化系统提示词和 ReAct 设置来进一步改进。对于本地设置，可以参考 Issue #23 
中的配置作为临时方案。","https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fissues\u002F21",{"id":144,"question_zh":145,"answer_zh":146,"source_url":147},10017,"如何在 Nanocoder 中输入多行提示词（换行而不发送）？","多行输入功能已实现，但不同终端的快捷键支持不同：\n1. **Shift+Enter**：适用于能正确发送 Shift 修饰符的终端。\n2. **Option\u002FAlt+Enter**：适用于 VS Code 集成终端（Windows 上为 Alt+Enter）。\n3. **Ctrl+J**：最可靠的通用方法，适用于 KDE Konsole、Ghostty 等大多数终端，因为它直接发送换行符 (\\n)。\n如果 Shift+Enter 无效（例如显示乱码），请尝试使用 Ctrl+J。","https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fissues\u002F299",{"id":149,"question_zh":150,"answer_zh":151,"source_url":152},10019,"在 SSH 远程连接或某些终端中运行时出现界面闪烁或窗口位置错误怎么办？","该问题已在 v1.13.4 版本中通过合并优化的 Pull Request 得到解决。如果您仍遇到此问题，请确保升级到最新版本。该修复针对渲染问题和闪烁进行了重大优化。如果是 Windows 本地用户遇到类似问题且未通过 SSH，也建议升级测试，因为底层渲染逻辑已更新。","https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fissues\u002F44",{"id":154,"question_zh":155,"answer_zh":156,"source_url":142},10020,"为什么我的非工具调用模型（如 Qwen3 Coder）在工具调用后需要重新提示或陷入循环？","这是已知限制。虽然 v1.11.0 改善了情况，但非工具调用模型在处理长程任务时，工具调用的准确性和任务坚持度仍可能产生混合结果。模型有时需要在工具调用后被重新提示才能继续，或偶尔陷入工具调用循环。开发团队正在专门的分支（fix\u002Flocal-non-tool-calling-model-improvements）中进行修复和改进。",{"id":158,"question_zh":159,"answer_zh":160,"source_url":137},10021,"MCP 服务器配置界面太杂乱，如何区分本地和远程服务器？","新的 UI 改进已将 MCP 服务器配置界面组织为标签页形式，包括：\n- **Local (💻)**：显示本地 stdio 服务器（如文件系统、GitHub、PostgreSQL）。\n- **Remote (🌐)**：显示远程 http\u002Fwebsocket 服务器（如 DeepWiki、Context7）。\n- **All (⚙️)** 和 **Popular (⭐)**：分别显示所有服务器和推荐服务器。\n这种分类设计帮助用户快速区分服务器类型并找到相关模板。",[162,167,172,177,182,187,192,197,202,207,212,217,222,227,232,237,242,247,252,257],{"id":163,"version":164,"summary_zh":165,"released_at":166},116941,"v1.24.1","## What's Changed\n\n- Added `--context-max` CLI flag for setting the context limit from the command line, complementing the existing `\u002Fcontext-max` command and `NANOCODER_CONTEXT_LIMIT` env variable.\n\n- Removed time from the system prompt to keep the KV cache more stable across requests. Thanks to @initialxy. 
Closes #415.\n\n- Task tool results are no longer displayed as compacted results, ensuring task progress remains visible in the conversation.\n\n- User input now uses the same text wrapping as assistant messages for a more consistent chat appearance.\n\n- Improved `search_file_contents` tool robustness.\n\nIf there are any problems, feedback or thoughts please drop an issue or message us through Discord! Thank you for using Nanocoder.\n\n### Installation\n```bash\nnpm install -g @nanocollective\u002Fnanocoder\n```\n\n### Usage\n```bash\nnanocoder\n```\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fcompare\u002Fv1.24.0...v1.24.1","2026-03-19T12:49:45",{"id":168,"version":169,"summary_zh":170,"released_at":171},116942,"v1.24.0","## What's Changed\n\n- **BREAKING**: Removed legacy `~\u002F.agents.config.json` config file support. Nanocoder no longer checks the home directory for a dot-prefixed config file. If you are still using this path, move your config to the platform-specific directory: `~\u002FLibrary\u002FPreferences\u002Fnanocoder\u002Fagents.config.json` (macOS), `~\u002F.config\u002Fnanocoder\u002Fagents.config.json` (Linux), or `%APPDATA%\\nanocoder\\agents.config.json` (Windows).\n\n- **BREAKING**: Removed legacy `~\u002F.nanocoder-preferences.json` preferences file support. Preferences are now only loaded from the platform-specific directory (e.g. `~\u002FLibrary\u002FPreferences\u002Fnanocoder\u002Fnanocoder-preferences.json` on macOS) or a project-level `nanocoder-preferences.json` in your working directory. To migrate, move your existing file: `mv ~\u002F.nanocoder-preferences.json ~\u002FLibrary\u002FPreferences\u002Fnanocoder\u002Fnanocoder-preferences.json`\n\n- **BREAKING**: Removed deprecated array format for MCP server configuration. Only the object format is now supported: `{ \"mcpServers\": { \"serverName\": { ... } } }`. 
If you are using the array format in `.mcp.json`, convert each array entry to an object key using the server name.\n\n- **BREAKING**: Removed `agents.config.json` fallback for MCP server loading. Global MCP servers must now be configured in `~\u002F.config\u002Fnanocoder\u002F.mcp.json` (Linux), `~\u002FLibrary\u002FPreferences\u002Fnanocoder\u002F.mcp.json` (macOS), or `%APPDATA%\\nanocoder\\.mcp.json` (Windows). Provider configuration still uses `agents.config.json`.\n\n- **BREAKING**: Removed `auth` and `reconnect` fields from MCP server configuration. The `auth` field was never functional (both HTTP and WebSocket transports logged warnings that it was unsupported). The `reconnect` field was never implemented. Use `headers` for HTTP authentication instead (e.g. `\"headers\": { \"Authorization\": \"Bearer $TOKEN\" }`).\n\n- Added `\u002Fresume` command for restoring previous chat sessions. Sessions are automatically saved and can be resumed from an interactive selector. Sessions are filtered by the current project directory by default, with an `--all` flag to show all sessions. Thanks to @yashksaini-coder.\n\n- Added `--provider` and `--model` CLI flags for non-interactive provider and model specification, allowing CI\u002FCD scripts and automation to skip the setup wizard. Closes #394. Thanks to @james2doyle.\n\n- Added `NANOCODER_PROVIDERS` environment variable support for configuring providers without config files, useful for Docker containers and CI environments. Closes #307. Thanks to @kaustubha07.\n\n- Added GitHub Copilot as a provider template with OAuth device flow authentication and `\u002Fcopilot-login` command. Thanks to @yashksaini-coder.\n\n- Added MLX Server provider template for local Apple Silicon inference. 
Closes #318.\n\n- Added parallel tool execution allowing the model to run multiple independent tool calls concurrently for faster task completion.\n\n- Added compact mode toggle via `Ctrl+L` in chat input to collapse the conversation view.\n\n- Added VS Code fork support for IDE integration (Cursor, Windsurf, VSCodium, etc.). Thanks to @kapsner.\n\n- Added Aurora Borealis theme.\n\n- Added notice when the model falls back to XML tool calls, informing users they can switch to a model with native tool calling support.\n\n- Adopted AI SDK human-in-the-loop pattern for tool approval. Tool confirmation now uses the SDK's built-in `tool-approval-request`\u002F`tool-approval-response` flow instead of manual tool-call splitting, improving reliability and reducing code complexity.\n\n- Simplified tool processing by removing double XML parsing and the JSON tool call parser. Tool call parsing now happens in a single place and only on the XML fallback path for non-tool-calling models.\n\n- Restructured documentation into a Nextra-compatible `docs\u002F` folder structure with nested sections for getting-started, configuration, and features. The README is now a concise landing page linking to the full docs.\n\n- Refactored app-utils into focused handler files, extracted shared utilities, unified mode state, and stubbed commands for cleaner architecture.\n\n- Fix: `alwaysAllow` field in MCP server configuration was silently dropped during config loading due to a missing field mapping. MCP tools configured with `alwaysAllow` now correctly skip confirmation prompts as documented.\n\n- Fix: Provider timeouts are now respected in non-interactive mode. Thanks to @kaustubha07. Closes #402.\n\n- Fix: Non-interactive mode no longer exits prematurely when the prompt or response contains the word \"error\".\n\n- Fix: Invalid CLI arguments no longer trigger the setup wizard. 
Thanks to @james2doyle.\n\n- Fix: Installation detector no longer falsely reports Homebrew on macOS when `HOMEBREW_PREFIX` is set but Nanocoder was installed via npm. Closes #392.\n\n- Fix: Preserve draft message when navigating through history with arrow keys.\n\n- Fix: `fetch_url` display now truncates to fit terminal width.\n\n- Fix: Validation failures no longer incorrectly prompt for tool confirmation.\n\n- Fix: Various error message an","2026-03-17T18:58:54",{"id":173,"version":174,"summary_zh":175,"released_at":176},116943,"v1.23.0","## What's Changed\n\n- Added `ask_user` tool for interactive question prompts. The LLM can now present the user with a question and selectable options during a conversation, returning their answer to guide the next step. Uses a global question-queue to bridge the tool's suspended Promise with the Ink UI component.\n\n- Added per-project cron scheduler for running AI tasks on a schedule. Schedule files live in `.nanocoder\u002Fschedules\u002F` as markdown prompts with YAML frontmatter, managed via the `\u002Fschedule` command (`create`, `add`, `remove`, `list`, `logs`, `start`). Includes cron expression parsing, sequential job queue with deduplication, dedicated scheduler mode with auto-accept, and run history logging.\n\n- Added centralized graceful shutdown system. A `ShutdownManager` now coordinates cleanup of all services (VS Code server, MCP client, LSP manager, health monitor, logger) on exit, preventing orphaned child processes and dangling connections. Configurable via `NANOCODER_DEFAULT_SHUTDOWN_TIMEOUT` env variable. Closes #239.\n\n- Added file operation tools: `delete_file`, `move_file`, `create_directory`, and `copy_file`. Reorganized existing file tools into a `file-ops\u002F` directory group.\n\n- Added readline keybind support to text input. 
Replaces `ink-text-input` with a custom `TextInput` component supporting Ctrl+W (delete word), Ctrl+U (kill to start), Ctrl+K (kill to end), Ctrl+A\u002FE (jump to start\u002Fend), and Ctrl+B\u002FF (move char). Closes #354.\n\n- Added `\u002Fcontext-max` command and `NANOCODER_CONTEXT_LIMIT` env variable for manual context length override on models not listed on models.dev. Resolution order: session override > env variable > models.dev > null. Closes #379.\n\n- Added `\u002Fide` command matching the `--vscode` flag for toggling VS Code integration from within a session.\n\n- Added persistent context percentage display in the mode indicator, replacing the previous context checker component.\n\n- Added `include` and `path` parameters to `search_file_contents` tool for scoping searches to specific file patterns and directories.\n\n- Added Kanagawa theme.\n\n- Refactored the skills system into custom commands, eliminating redundant parsers, loaders, and test suites. Commands gain optional skill-like fields (`tags`, `triggers`, `estimated-tokens`, `resources`) for auto-injection and relevance scoring. The `\u002Fskills` command is removed and its functionality absorbed into `\u002Fcommands` with new subcommands (`show`, `refresh`). Thanks to @yashksaini-coder for the initial skills implementation in PR #370.\n\n- V2 type-safe tool system overhaul with defensive parsing. Implements a three-tiered defense system for handling chaotic LLM outputs, preventing crashes from non-string responses and enabling robust self-correction. Includes universal type safety with `ensureString()`, response normalization, confidence system inversion, ghost echo deduplication, and AI SDK contract fixes. Local LLM experience is now significantly more stable. Thanks to @cleyesode. Closes #362.\n\n- Fix: XML parser now uses optimistic matching for consistency with the JSON parser. 
Thanks to @cleyesode.\n\n- Fix: Bash tool now emits progress immediately on stdout\u002Fstderr data instead of waiting for the 500ms timer, so fast-completing commands show streaming output.\n\n- Fix: Recognize `127.0.0.1` as a local server URL and tighten error classification. Ollama users configuring `127.0.0.1` instead of `localhost` no longer experience misleading connection errors. Replaced broad `connect` substring match with specific error codes to prevent misclassifying \"disconnect\"\u002F\"reconnect\". Closes #366.\n\n- Fix: Skip loading git tools when not inside a git repository.\n\n- Fix: Strip ANSI escape codes before running regex matching in tool formatters.\n\n- Fix: Gap in layout during auto-compact.\n\n- Fix: Hardened `write_file` validation and MCP client type safety.\n\n- Fix: Use local `TextInput` component instead of the missing `ink-text-input` package.\n\n- Fix(mcp): Use Python-based `mcp-server-fetch` instead of non-existent npm package.\n\n- Security: Semgrep and audit fixes.\n\n- Dependency updates: `ai` 6.0.95, `@ai-sdk\u002Fanthropic` 3.0.46, `@ai-sdk\u002Fgoogle` 3.0.30, `undici` 7.22.0, `sonic-boom` 4.2.1.\n\nIf there are any problems, feedback or thoughts please drop an issue or message us through Discord! Thank you for using Nanocoder. 🙌\n\n### Installation\n```bash\nnpm install -g @nanocollective\u002Fnanocoder\n```\n\n### Usage\n```bash\nnanocoder\n```\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fcompare\u002Fv1.22.5...v1.23.0","2026-02-26T13:35:25",{"id":178,"version":179,"summary_zh":180,"released_at":181},116944,"v1.22.5","## What's Changed\n\n- Added MiniMax Coding Plan and GLM-5 to provider templates in the configuration wizard.\n\n- Fix: Model context limit lookups now use models.dev as the primary source instead of the hardcoded fallback table. This prevents stale hardcoded values from overriding accurate upstream data. The hardcoded table remains as an offline-only fallback. 
Also fixes greedy key matching where shorter keys like `mixtral` would match before `mixtral:8x22b`, and replaces first-match name lookups with scored matching for more accurate results.\n\n- Fix: Binary and excessively large files tagged with `@` no longer pollute the LLM context window with unreadable content.\n\n- Fix: Diff preview panel no longer steals terminal focus from the active input.\n\n- Fix: Reduced verbosity of the `string_replace` error formatter output.\n\n- Fix: Reject null and non-object arguments in JSON tool calls, preventing formatter crashes from malformed tool call arguments. Thanks to @cleyesode.\n\n- Fix: Restored `formatError` usage for validation and execution errors.\n\n- Dependency updates: `ink-gradient` 4.0.0, `react` 19.2.4, `@nanocollective\u002Fget-md` 1.1.1, `@ai-sdk\u002Fanthropic` 3.0.43, `pino` 10.3.1, `@types\u002Freact` 19.2.14.\n\n### Installation\n```bash\nnpm install -g @nanocollective\u002Fnanocoder\n```\n\n### Usage\n```bash\nnanocoder\n```\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FNano-Collective\u002Fnanocoder\u002Fcompare\u002Fv1.22.4...v1.22.5","2026-02-15T22:26:36",{"id":183,"version":184,"summary_zh":185,"released_at":186},116945,"v1.22.4","## What's Changed\n\n- Security: Tool validators now run inside the AI SDK's auto-execution loop. Previously, tools with `needsApproval: false` (like `read_file`) were auto-executed by the AI SDK's `generateText` without any path validation, allowing the model to read or write files outside the project directory using absolute or `~` paths. Validators are now wrapped into each tool's `execute` function at registration time, ensuring validation runs in all code paths.\n\n- Security: Reject home directory shorthand (`~`) in file path validation. 
Paths starting with `~` are not expanded by Node.js and could bypass project boundary checks.

- Fix: Tab characters in code blocks within assistant messages now render at 2-space width instead of the terminal default of 8 spaces. This prevents long lines from wrapping prematurely and eliminates the blocky visual effect on messages containing indented code.

- Fix: `normalizeIndentation` no longer short-circuits when the minimum indent is 0. Previously, if any line in the context window had zero indentation, raw tab characters passed through to the terminal unchanged, rendering at 8-space width in `string_replace` diff previews.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.22.3...v1.22.4

_Released 2026-02-09T23:28:16._

## v1.22.3

- Fix: Removed tool call deduplication from the JSON parser that silently dropped duplicate tool calls, breaking the 1:1 pairing between tool calls and results expected by the AI SDK. This caused "Tool result is missing for tool call" errors that would end the agent's turn prematurely. Consolidated three overlapping regex patterns into a single comprehensive pattern to prevent duplicate matches. Thanks to @cleyesode.

- Fix: Added a missing capture group for arguments in the consolidated JSON tool call regex pattern, which caused inline tool calls to have empty arguments instead of actual parsed values.

- Fix: When the model batched read-only and write tools in a single response (e.g. `read_file` + `string_replace`), the auto-executed read tools would recurse into the next conversation turn, abandoning the confirmation-needed write tools.
This left orphaned `tool_use` blocks without matching `tool_result` entries, triggering intermittent "Tool result is missing for tool call" errors with the Anthropic provider.

- Dependency updates: `@ai-sdk/openai-compatible` 2.0.27, `undici` 7.21.0, `@biomejs/biome` 2.3.14, `@types/vscode` 1.109.0, `@types/node` 25.2.1.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.22.2...v1.22.3

_Released 2026-02-08T22:40:40._

## v1.22.2

- Fix: Markdown tables in assistant messages were rendered at full terminal width instead of accounting for the message box border and padding, causing broken box-drawing characters when lines wrapped.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.22.1...v1.22.2

_Released 2026-02-06T18:24:00._

## v1.22.1

- Added native Anthropic SDK support via the `@ai-sdk/anthropic` package. The Anthropic Claude provider template now uses `sdkProvider: 'anthropic'` for direct API integration instead of the OpenAI-compatible wrapper.

- Fixed the Kimi Code provider template to use the native `@ai-sdk/anthropic` SDK with the correct base URL and configuration passthrough.

- Fix: User message token count now reflects the full assembled content, including pasted content and tagged file contents, instead of only counting the placeholder text.

- Fix: Removed aggressive tool call deduplication that silently dropped duplicate tool call IDs and identical function signatures.
This could cause "Tool result is missing for tool call" errors with providers like Anthropic that strictly validate tool call/result pairing.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.22.0...v1.22.1

_Released 2026-02-05T20:30:40._

## v1.22.0

- Added `/explorer` command for interactive file browsing with tree view navigation, file preview with syntax highlighting, multi-file selection, search mode, and VS Code integration. Closes #298.

- Added task management tools (`create_task`, `list_tasks`, `update_task`, `delete_task`) with a `/tasks` slash command for models to track and manage progress on complex work. Tasks persist in `.nanocoder/tasks.json` and are automatically cleared on CLI boot and by the `/clear` command.

- Added `/settings` command, an interactive menu for configuring the UI theme and shapes without editing config files directly.

- Added `sdkProvider` configuration option for native Google Gemini support. This fixes the "missing thought_signature" error with Gemini 3 models by using the `@ai-sdk/google` package. Closes #302.

- Added custom headers support in provider configuration. This enables authentication through tunnels like Cloudflare.
Thanks to @nicolalamacchia.

- Added Kimi Code provider template to the configuration wizard.

- Added new themes with updated user input and user message styling for better visual clarity and consistency.

- Added token count display after messages and in the completion message to provide visibility into context usage throughout conversations.

- Refactored git tools for better consistency, improved error handling, standardized parameter handling across all git operations, and enhanced user feedback messages.

- Added line truncation in the `write_file` and `string_replace` formatters to prevent excessive output from files with very long lines and tidy the experience on narrow terminals.

- Fix: `/usage` command crash when context data is unavailable.

- Fix: String replace error handling for edge cases.

- Fix: Multiple security audit issues resolved.

- Fix: Various styling improvements across components.

- Fix: Dependency lockfile issues resolved.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.21.0...v1.22.0

_Released 2026-02-05T17:10:25._

## v1.21.0

- Added `/compact` command for context compression, with `--restore` flag support to restore messages from backup. The command now includes auto-compact functionality, consistent token counting, and improved compression for very long messages. Thanks to @Pahari47.

- Added hierarchical configuration loading for both provider configs and MCP servers. Local project configurations now properly override global settings, and Claude Code's object-style MCP configuration format is now supported.
Thanks to @Avtrkrb.

- Added `alwaysAllow` configuration option for MCP servers to auto-approve trusted tools without confirmation prompts. Thanks to @namar0x0309.

- Added automatic tool support error detection and a retry mechanism. Models that don't support function calling are now detected, and requests automatically retry without tools. Thanks to @ThomasBrugman.

- Added `--version` and `--help` CLI options for quick reference. Thanks to @Avtrkrb.

- Added `/quit` command as an alternative way to exit Nanocoder. Thanks to @Avtrkrb.

- Added `/nanocoder-shape` command for selecting branding font styles.

- Added keyboard shortcuts documentation to the README.

- Renamed `/setup-config` to `/setup-providers` for clearer naming.

- Improved the `/mcp` command modal with better colors and title formatting. Thanks to @Avtrkrb.

- Improved the `/help` command title heading styling. Thanks to @Avtrkrb.

- Added a CLI test harness for non-interactive mode testing. Thanks to @akramcodez.

- Added a comprehensive test suite for tool error detection. Thanks to @ThomasBrugman.

- Added `DisableToolModels` documentation to the README. Thanks to @ThomasBrugman.

- Fix: Resolved the bash tool keeping processes alive after command completion.

- Fix: Corrected log directory paths and enabled file logging in production.

- Fix: Improved the deprecation message for MCP config to display the correct config directory instead of a hardcoded Linux path. Thanks to @Avtrkrb.

- Fix: Resolved security-scanning alerts for shell commands built from environment values. Thanks to @Avtrkrb.

- Fix: Security audit dependencies updated.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.20.4...v1.21.0

_Released 2026-01-20T16:51:43._

## v1.20.4

- Fixed the configuration wizard blocking users from entering HTTP URLs for remote Ollama servers. The wizard now allows any valid HTTP/HTTPS URL without requiring local network addresses.

- Fixed the `@modelcontextprotocol/sdk` dependency version to resolve an npm audit security issue.

- Fixed TLS certificate errors when using `uvx` MCP servers behind corporate proxies. Nanocoder now automatically adds `--native-tls` to uvx commands to use system certificates instead of rustls.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.20.3...v1.20.4

_Released 2026-01-07T22:47:57._

## v1.20.3

- Fixed `search_file_contents` returning excessive tokens by truncating long matching lines to 300 characters. Previously, searching in files with long lines (minified JS, base64 data, etc.) could return ~100k tokens for just 30 matches.

- Added validation to `read_file` to reject minified/binary files (lines >10,000 characters). These files consume excessive tokens without providing useful information to the model. Use `metadata_only=true` to still check file properties.

- Fixed `web_search` result count display showing mismatched values (e.g., "10 / 5 results").
The formatter now correctly uses the same default as the search execution.

- Improved `web_search` and `fetch_url` formatter layouts to match the `execute_bash` style, with consistent column alignment and spacing.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.20.2...v1.20.3

_Released 2026-01-06T23:04:38._

## v1.20.2

- Added preview generation to the git workflow tools (`git-status-enhanced`, `git-smart-commit`, `git-create-pr`), showing results before execution.

- Fixed `string-replace` line number display in result mode: it now correctly shows the line numbers of the new content after replacement.

- Added a hammer icon (⚒) to git tool formatters for visual consistency.

- Improved formatting in the `bash-progress`, `execute-bash`, and `read-file` tools with better spacing and layout.

- Simplified `string-replace` validation logic and removed redundant success messages.

- Fix: Running `/init --force` duplicated content in `AGENTS.md`.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.20.1...v1.20.2

_Released 2026-01-06T00:17:36._

## v1.20.1

Fix: React context error: `useTitleShape` must be used within a `TitleShapeProvider`.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.20.0...v1.20.1

_Released 2026-01-05T12:06:02._

## v1.20.0

Happy New Year! We all hope you had a great holiday season and are feeling refreshed and ready for 2026 🌟

- Added Catppuccin themes (Latte, Frappe, Macchiato, Mocha) with gradient color support. Thanks to @Avtrkrb.

- Added VS Code context menu integration: you can now right-click selected code and ask Nanocoder about it directly.

- Added comprehensive testing achieving 90%+ code coverage across components, hooks, tools, and utilities. Tests now include unit and integration coverage for critical paths.

- Added an automated PR checks workflow with format, type, lint, and test validation. Pull requests now get automatic quality checks. Thanks to @Avtrkrb.

- Added LSP support for Deno, GraphQL, Docker/Docker Compose, and Markdown language servers with automatic project detection. Thanks to @yashksaini-coder.

- Added an auto-fetch models feature to the setup wizard: providers can now automatically fetch available models during configuration.
Thanks to @JimStenstrom.

- Added git workflow integration tools, including smart commit message generation, PR template creation, branch naming suggestions, and enhanced status reporting. Thanks to @JimStenstrom.

- Added file content caching to reduce tool confirmation delays and improve performance. Thanks to @JimStenstrom.

- Added path boundary validation to file manipulation tools to prevent directory traversal attacks.

- Added granular debug logging with a structured pino logger throughout catch blocks for better error tracking. Thanks to @JimStenstrom and @abhisek1221.

- Added devcontainer support for streamlined development environments. Thanks to @Avtrkrb.

- Added stylized title boxes with powerline-style shapes and real-time preview in custom commands. Thanks to @Avtrkrb.

- Added real-time bash output progress with live updates during command execution.

- Added inline word-level highlighting to the `string_replace` diff display for clearer change visualization.

- Improved code exploration tools with better tool-calling prompts and descriptions, plus a new `list_directories` tool. Thanks to @DenizOkcu.

- Centralized token calculation in tools, with consistent usage display in formatters. Thanks to @DenizOkcu.

- Added AI SDK error types for better tool call error handling. Thanks to @DenizOkcu.

- Centralized ignored-file-pattern usage throughout Nanocoder for consistency. Thanks to @DenizOkcu.

- Refactored the App component into focused modules (`useAppState`, `useAppInitialization`, `useChatHandler`, `useToolHandler`, `useModeHandlers`) for better maintainability.

- Refactored message components to unify structure and fix memoization inconsistency. Thanks to @abhisek1221.

- Refactored `handleMessageSubmission` into focused handler functions for better code organization.
Thanks to @JimStenstrom.

- Refactored health-monitor, log-query, and AISDKClient into smaller focused modules.

- Renamed multiple files to kebab-case for consistency (AISDKClient.ts → ai-sdk-client.ts, appUtils.ts → app-util.ts, conversationState.ts → conversation-state.ts). Thanks to @JimStenstrom.

- Replaced sync fs operations with async `readFile` for better performance. Thanks to @namar0x0309.

- Improved tool formatter indentation for better readability.

- Extracted magic numbers to named constants for better code clarity. Thanks to @JimStenstrom.

- Enhanced `validateRestorePath` to check directory writability. Thanks to @yashksaini-coder.

- Fix: Resolved "Interrupted by user" error appearing on empty model responses.

- Fix: Command completion now prioritizes prefix matches over suffix matches for more intuitive autocomplete.

- Fix: Resolved a duplicate React keys issue by using `useRef` for the component key counter. Thanks to @JimStenstrom.

- Fix: Development mode context synchronization prevents auto-accept race conditions. Thanks to @JimStenstrom.

- Fix: Bounded the `completedActions` array to prevent memory growth during long sessions. Thanks to @JimStenstrom.

- Fix: User input cycling now works correctly.

- Fix: Slash + Tab now shows all available commands instead of a subset.

- Fix: Command injection vulnerabilities in shell commands resolved.

- Fix: Large paste truncation in slow terminals resolved. Thanks to @Alvaro842DEV.

- Fix: `find_files` tool now correctly recognizes all pattern types.

- Fix: Tool over-fetching in find and search tools reduced for better performance. Thanks to @pulkitgarg04.

- Fix: Prompt history handling improved with better state management.

- Fix: Paragraphs now render correctly in user messages.

- Fix: Added helpful error messages for missing MCP server commands.
Thanks to @JimStenstrom.

- Fix: Size limits added to unbounded caches to prevent memory issues.

- Fix: Resolved several security-scanning alerts for string escaping and encoding. Thanks to @Avtrkrb.

- Fix: Switched to `crypto.randomUUID` and `crypto.randomBytes` for secure ID generation. Thanks to @JimStenstrom and @abhisek1221.

- Fix: Broken pino logging documentation link in the README.

- Fix: Husky pre-commit hook c

_Released 2026-01-04T22:15:44._

## v1.19.2

- Refactored file editing tools by replacing line-based tools with modern content-based editing for better reliability and context efficiency.

- Replaced `create_file` with `write_file`, a tool for whole-file rewrites, ideal for generated code, config files, complete file replacements, and the creation of new files.

- Optimized the system prompt to be more concise and reduce token usage.

- Fix: Tool call results were incorrectly being passed as user messages, causing hallucinations in model responses. This brought significant gains for models like GLM 4.6, which commonly struggle with context poisoning.

- Fix: `/usage` command now correctly displays context usage information.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.19.1...v1.19.2

_Released 2025-12-18T12:25:36._

## v1.19.1

- Fix Nix releases.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.19.0...v1.19.1

_Released 2025-12-15T20:39:00._

## v1.19.0

- Added non-interactive mode for running Nanocoder in CI/CD pipelines and scripts. Pass commands via CLI arguments and Nanocoder will execute and exit automatically. Thanks to @namar0x0309.

- Added a conversation checkpointing system with interactive loading for saving and restoring conversation state across sessions. Thanks to @akramcodez.

- Added an enterprise-grade Pino logging system with structured logging, request tracking, performance monitoring, and configurable log levels. Thanks to @Avtrkrb.

- Switched to Biome for formatting and linting, replacing Prettier and ESLint for faster, more consistent code quality tooling. Thanks to @akramcodez.

- Added Poe.com as a provider template in the configuration wizard. Closes issue #74.

- Added Mistral AI as a provider template in the configuration wizard.

- Updated Ollama model contexts.

- Added `--force` flag to the `/init` command for regenerating AGENTS.md without prompting.

- Removed the `ink-titled-box` dependency and replaced it with a custom implementation. Closes issue #136.

- Fixed security vulnerabilities by addressing pnpm audit reports. Thanks to @spinualexandru.

- Fixed README table of contents anchors for proper navigation on GitHub forks. Thanks to @Azd325.

- Refactored GitHub Actions workflows to reduce duplication and improve maintainability.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.18.0...v1.19.0

_Released 2025-12-15T20:25:29._

## v1.18.0

- Upgraded to the AI SDK v6 beta to improve model and tool-calling performance and introduce multi-step tool call support. Thanks to @DenizOkcu.

- Added `/debugging` command to toggle detailed tool call information for debugging purposes. Thanks to @DenizOkcu.

- Replaced the `/recommendations` command with a `/model-database` command that provides searchable model information from an up-to-date database, making model recommendations easier to maintain.

- Added GitHub issue templates for bug reports and feature requests to improve community contributions.

- LSP and MCP server connection status is now displayed in the Status component, providing cleaner visibility and removing verbose connection messages from the main UI. Thanks to @Avtrkrb.

- Various improvements to context management, error handling, and code refactoring for better maintainability.

- Fixed locale-related test failures by setting the test environment to en-US.UTF-8. Thanks to @DenizOkcu.

- Removed streaming for now, as it continued to cause layout issues, flickering, and other glitches, especially after the upgrade to AI SDK v6.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.17.3...v1.18.0

_Released 2025-12-07T23:45:02._

## v1.17.3

- Added GitHub Models as a provider, addressing issue #67 with minimal code changes. Thanks to @JimStenstrom.

- Added `/lsp` command to list connected LSP servers. Thanks to @anithanarayanswamy.

- Fix: Improved error handling for Ollama JSON parsing. Addresses issue #87. Thanks to @JimStenstrom.

**Full Changelog**: https://github.com/Nano-Collective/nanocoder/compare/v1.17.2...v1.17.3

_Released 2025-12-02T13:16:51._
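
Several of the fixes above lend themselves to short sketches. The v1.23.0 error-classification fix, for instance, replaced a broad `connect` substring match with specific Node.js error codes and recognized `127.0.0.1` as local. A minimal sketch of that idea; `isLocalServerUrl`, `isConnectionError`, and the exact code list are hypothetical, not Nanocoder's actual identifiers:

```typescript
// Match specific Node.js network error codes instead of the substring
// "connect", so messages like "client disconnected" are not misclassified.
const CONNECTION_ERROR_CODES = new Set([
  "ECONNREFUSED", "ECONNRESET", "ETIMEDOUT", "ENOTFOUND", "EHOSTUNREACH",
]);

// Treat both `localhost` and `127.0.0.1` as local server URLs.
const LOCAL_HOSTS = new Set(["localhost", "127.0.0.1"]);

function isLocalServerUrl(url: string): boolean {
  return LOCAL_HOSTS.has(new URL(url).hostname);
}

function isConnectionError(err: { code?: string; message: string }): boolean {
  return err.code !== undefined && CONNECTION_ERROR_CODES.has(err.code);
}
```

The point of the code-based check is that it keys off the structured `code` property Node.js attaches to network errors, never the human-readable message.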
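
The greedy-key problem fixed in v1.22.5 (a shorter key like `mixtral` matching before `mixtral:8x22b`) comes down to preferring the most specific key. A minimal sketch with hypothetical names; the release notes describe scored matching, and the length-based ordering here is a simplification of that:

```typescript
// Prefer the longest matching key, so "mixtral:8x22b" beats "mixtral".
function lookupContextLimit(
  model: string,
  table: Record<string, number>,
): number | undefined {
  const matches = Object.keys(table)
    .filter((key) => model === key || model.startsWith(key))
    .sort((a, b) => b.length - a.length); // most specific first
  return matches.length > 0 ? table[matches[0]] : undefined;
}
```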
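
The v1.23.0 formatter fix strips ANSI escape codes before regex matching, since color sequences embedded in tool output would otherwise break patterns that anchor on line content. A simplified sketch covering only common CSI sequences (real ANSI stripping handles more sequence types); `stripAnsi` is a hypothetical name:

```typescript
// CSI sequences look like ESC [ <params> <final letter>, e.g. "\x1b[32m".
const ANSI_PATTERN = /\x1b\[[0-9;]*[A-Za-z]/g;

function stripAnsi(text: string): string {
  return text.replace(ANSI_PATTERN, "");
}
```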
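
The path-boundary validation described in v1.22.4 and v1.20.0 rejects `~` shorthand (which Node.js does not expand) and resolves candidate paths against the project root to catch `../` traversal and absolute-path escapes. A POSIX-path sketch; `isPathInsideProject` is a hypothetical name, not Nanocoder's actual validator:

```typescript
import path from "node:path";

function isPathInsideProject(projectRoot: string, candidate: string): boolean {
  // "~" is passed through literally by Node.js, so it could slip past
  // boundary checks; reject it outright.
  if (candidate.startsWith("~")) return false;
  // Resolving against the root normalizes "../" segments and absolute paths.
  const resolved = path.resolve(projectRoot, candidate);
  const root = path.resolve(projectRoot);
  return resolved === root || resolved.startsWith(root + path.sep);
}
```

Checking the resolved prefix (with a trailing separator) rather than the raw string avoids false positives like `/home/user/proj-other` matching `/home/user/proj`.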