[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-getcellm--cellm":3,"tool-getcellm--cellm":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 
是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,"2026-04-04T04:44:48",[15,14,13,26,54],{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":69,"readme_en":70,"readme_zh":71,"quickstart_zh":72,"use_case_zh":73,"hero_image_url":74,"owner_login":75,"owner_name":75,"owner_avatar_url":76,"owner_bio":77,"owner_company":77,"owner_location":77,"owner_email":77,"owner_twitter":77,"owner_website":77,"owner_url":78,"languages":79,"stars":88,"forks":89,"last_commit_at":90,"license":91,"difficulty_score":23,"env_os":92,"env_gpu":93,"env_ram":93,"env_deps":94,"category_tags":97,"github_topics":98,"view_count":10,"oss_zip_url":77,"oss_zip_packed_at":77,"status":16,"created_at":101,"updated_at":102,"faqs":103,"releases":133},234,"getcellm\u002Fcellm","cellm","Use LLMs in Excel formulas","Cellm 是一款 Excel 插件，让你直接在单元格公式中调用大语言模型（如 ChatGPT、Gemma 等），就像使用 SUM 或 VLOOKUP 一样自然。只需输入类似 =PROMPT(\"提取文本中的人名\", A1) 的公式，就能对成百上千行数据批量执行 AI 任务，比如信息抽取、分类、翻译或内容生成，省去反复复制粘贴到聊天窗口的繁琐操作。\n\n它主要解决的是非技术用户在日常工作中处理重复性文本任务效率低的问题。市场、运营、财务、销售等团队无需依赖开发人员，就能在熟悉的 Excel 环境中快速自动化数据清洗、竞品监控、多语言内容处理等工作。例如，输入一列网址，Cellm 可自动抓取网页标题、翻译并分类，几分钟变几秒钟。\n\nCellm 支持主流云 API（如 OpenAI）和本地模型（如通过 Ollama 运行的 Gemma），兼顾灵活性与数据隐私。适合熟悉 Excel 但不写代码的普通办公用户，也适合希望快速验证 AI 应用场景的研究者或业务人员。需要注意的是，AI 输出可能存在误差，关键场景仍需人工复核。","[![CI](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Factions\u002Fworkflows\u002Fci.yml\u002Fbadge.svg)](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Factions\u002Fworkflows\u002Fci.yml)\n\n# Cellm\nUse AI in Excel formulas to run your prompt on 
thousands of rows of data in minutes.\n\n[Website](https:\u002F\u002Fwww.getcellm.com) | [Documentation](https:\u002F\u002Fdocs.getcellm.com) | [Releases](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Freleases) | ⭐ **Star this repo** to help others discover Cellm!\n\n## What is Cellm?\nCellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. Cellm's `=PROMPT()` function outputs AI responses to a range of text, similar to how Excel's `=SUM()` function outputs the sum of a range of numbers.  \n\nFor example, you can write `=PROMPT(\"Extract all person names mentioned in the text.\", A1)` in a cell's formula and drag the cell to apply the prompt to many rows. Cellm is useful when you want to use AI for repetitive tasks that would normally require copy-pasting data in and out of a chat window many times.\n\n## Why use Cellm?\n- Make quick work of data cleaning, classification, and extraction tasks.\n- Enable marketing, finance, sales, operations and other teams to automate everyday tasks without depending on developers.\n- Immediately free yourself and your team from repetitive manual work with the spreadsheet they already master.\n- Bypass lengthy rollouts of specialized AI apps. Your team already has Excel on their computers.\n- Create your own web scraper via MCP servers. Monitor your competitors' blogs, prices, and social media every day before your daily 09:00 meeting. \n\n> “I love feeding data to ChatGPT, one copy-paste at a time”\n> — no one who’s run the same prompt 5 times\n\n## Example\nSay you need to track your international competitors, but their websites are in different languages. Visiting each one, finding the latest update, and plugging it into a translation tool totally sucks. Instead, let Cellm do the manual work for you:\n\nhttps:\u002F\u002Fgithub.com\u002Fuser-attachments\u002Fassets\u002F8967f557-50b8-4e39-80e8-86a1246c5a42\n\nThis example uses news websites. 
We give Cellm a list of URLs and write a simple prompt that asks Cellm to grab the top headline from each one. Then, in the next columns, we ask the model to translate the headline, identify its original language, and even sort it into a category like \"Politics\" or \"Business\". \n\nWith a drag to autofill, Cellm visits every site, pulls your data and organizes it for you. What would have taken perhaps an hour of manual work is now done in seconds. Imagine what you could prepare every day before your daily 09:00 meeting.\n\nJust remember that the models do make mistakes at times. They might misunderstand a headline or assign the wrong category. It is your responsibility to validate that the results are accurate enough for your use case.\n\n## Quick start\n\n### Requirements\n\n- Windows 10 or higher\n- [.NET 9.0 Runtime](https:\u002F\u002Fdotnet.microsoft.com\u002Fen-us\u002Fdownload\u002Fdotnet\u002F9.0)\n- [Excel 2010 or higher (desktop app)](https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Fmicrosoft-365\u002Fexcel)\n\n### Install\n\n1. Go to the [Release page](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Freleases) and download `Cellm-AddIn-Release-x64.msi`.\n\n2. Run the installer.\n\n3. Open Excel, choose a provider from the drop-down menu in the Cellm tab, and plug in your API key.\n\nYou can also use local models, e.g., via [Ollama](https:\u002F\u002Follama.com\u002F). Download and install [Ollama](https:\u002F\u002Follama.com\u002F), open Windows Terminal (open start menu, type `Windows Terminal`, and click `OK`), type `ollama pull gemma3:4b`, and wait for the download to finish. 
Open Excel, choose the Ollama provider from the drop-down menu in the Cellm tab, and you are good to go.\n\n## Pricing\n- **Free tier:** Use local models or your own API keys \n- **Paid tiers:** Available for teams needing managed infrastructure and EU data processing\n- [View pricing →](https:\u002F\u002Fgetcellm.com\u002Fpricing)\n\n## Basic usage\n\nSelect a cell and type `=PROMPT(\"What model are you and who made you?\")`. For Gemma 3 4B, it will tell you that it's called \"Gemma\" and made by Google DeepMind.\n\nYou can also use cell references. For example, copy a news article into cell A1 and type in cell B1: `=PROMPT(\"Extract all person names mentioned in the text\", A1)`. You can reference many cells using standard Excel notation, e.g. `=PROMPT(\"Extract all person names in the cells\", A1:F10)` or reference multiple separate ranges, e.g. `=PROMPT(\"Compare these datasets\", A1:B10, D1:E10)`\n\nFor more advanced usage, including function calling and configuration, see our [documentation](https:\u002F\u002Fdocs.getcellm.com).\n\n## Models\n\nCellm supports:\n- Hosted models from Azure, AWS, Google, Anthropic, OpenAI, Mistral, and others\n- Local models via Ollama, Llamafiles, or vLLM\n\nFor detailed information about configuring different models, see our documentation on [local models](https:\u002F\u002Fdocs.getcellm.com\u002Fmodels\u002Flocal-models) and [hosted models](https:\u002F\u002Fdocs.getcellm.com\u002Fmodels\u002Fhosted-models).\n\n## Use cases\n\nCellm is useful for repetitive tasks on both structured and unstructured data:\n\n1. **Competitive monitoring:** Track competitor pricing across 50 websites daily\n2. **Multi-language support:** Analyze customer feedback in 10+ languages\n3. **Text classification:** Categorize survey responses, support tickets, etc.\n4. **Model comparison:** Compare results from different LLMs side by side\n5. **Data cleaning:** Standardize names, fix formatting issues\n6. 
**Content summarization:** Condense articles, papers, or reports\n7. **Entity recognition:** Pull out names, locations, dates from text\n\nFor more use cases and examples, see our [Prompting Guide](https:\u002F\u002Fdocs.getcellm.com\u002Fusage\u002Fprompting).\n\n## Development\n\nFor build instructions with Visual Studio or command line, see our [development guide](https:\u002F\u002Fdocs.getcellm.com\u002Fget-started\u002Fdevelopment).\n\n## Why did we make Cellm?\nA friend was writing a systematic review paper and had to compare 7,500 papers against inclusion\u002Fexclusion criteria to identify papers relevant to her research. We thought this was a great use case for LLMs but quickly realized that individually copying papers in and out of chat windows was a total pain. This sparked the idea to make an AI tool to automate repetitive tasks for people who would rather avoid programming.\n\nA quick prototype enabled her to quickly import a CSV file into Excel and classify all 7,500 papers with a prompt like \"If the paper studies diabetic neuropathy and stroke, return INCLUDE otherwise return EXCLUDE\". So we decided to develop it further.\n\nWe think Cellm is really cool because it enables everyone to automate tasks with AI to a level that was previously available only to programmers.\n\n## Telemetry\nTo help us improve Cellm, we collect limited, anonymous telemetry data:\n\n- **Crash reports:** To help us fix bugs.\n- **Prompts:** To help us understand usage patterns. For example, if you use `=PROMPT(\"Extract person names\", A1:B2)`, we capture the text \"Extract person names\" and prompt options. The prompt options are things like the model you use and the temperature setting. We do not capture the data in cells A1:B2. \n\nWe do not collect any data from your spreadsheet and we have no way of associating your prompts with you. 
You can see for yourself at [src\u002FCellm\u002FModels\u002FBehaviors\u002FSentryBehavior.cs](src\u002FCellm\u002FModels\u002FBehaviors\u002FSentryBehavior.cs).\n\nYou can disable telemetry at any time by adding the following contents to the `appsettings.Local.json` file in installation directory `C:\\Users\\{username}\\AppData\\Roaming\\Cellm`:\n\n```json\n{\n    \"SentryConfiguration\": {\n        \"IsEnabled\": false\n    }\n}\n```\n\n## License\n\nFair Core License, Version 1.0, Apache 2.0 Future License\n","[![CI](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Factions\u002Fworkflows\u002Fci.yml\u002Fbadge.svg)](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Factions\u002Fworkflows\u002Fci.yml)\n\n# Cellm\n在 Excel 公式中使用 AI，在几分钟内对数千行数据运行你的提示（prompt）。\n\n[网站](https:\u002F\u002Fwww.getcellm.com) | [文档](https:\u002F\u002Fdocs.getcellm.com) | [发布版本](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Freleases) | ⭐ **给本仓库点个 Star**，帮助更多人发现 Cellm！\n\n## 什么是 Cellm？\nCellm 是一个 Excel 扩展插件，让你可以在单元格公式中使用大型语言模型（LLM，Large Language Models），例如 ChatGPT。Cellm 的 `=PROMPT()` 函数会对一段文本范围输出 AI 的响应结果，其用法类似于 Excel 中的 `=SUM()` 函数对数字范围求和。\n\n例如，你可以在单元格公式中写入 `=PROMPT(\"提取文本中提到的所有人名。\", A1)`，然后拖动该单元格，将提示应用到多行数据上。当你需要使用 AI 处理重复性任务、而这些任务通常需要反复将数据复制粘贴到聊天窗口时，Cellm 就非常有用。\n\n## 为什么要使用 Cellm？\n- 快速完成数据清洗、分类和信息提取任务。\n- 让市场、财务、销售、运营等团队无需依赖开发人员即可自动化日常任务。\n- 立即解放你和你的团队，摆脱重复的手动操作，直接在他们已经熟练掌握的电子表格中工作。\n- 无需漫长部署专门的 AI 应用——你的团队电脑上早已安装了 Excel。\n- 通过 MCP 服务器创建自己的网页爬虫。每天早上 9 点会议前，自动监控竞争对手的博客、价格和社交媒体动态。\n\n> “我喜欢把数据喂给 ChatGPT，一次复制粘贴就够了。”\n> —— 没有人会在连续执行同一提示 5 次后还这么说\n\n## 示例\n假设你需要追踪国际竞争对手，但他们的网站使用不同语言。逐一访问每个网站、查找最新动态、再复制到翻译工具中，这过程非常痛苦。现在，让 Cellm 为你完成这些手动工作：\n\nhttps:\u002F\u002Fgithub.com\u002Fuser-attachments\u002Fassets\u002F8967f557-50b8-4e39-80e8-86a1246c5a42\n\n此示例使用新闻网站。我们提供一组 URL 列表，并编写一个简单提示，要求 Cellm 从每个网站抓取头条新闻。接着，在后续列中，我们让模型翻译该标题、识别原始语言，甚至将其归类为“政治”或“商业”等类别。\n\n只需拖动填充，Cellm 就会访问每个网站，提取数据并为你整理好。原本可能需要一小时的手动工作，现在几秒钟就能完成。想象一下，每天早上 9 
点会议前你能准备多少内容。\n\n但请注意，模型有时也会出错。它可能会误解标题或分配错误的类别。你有责任验证结果是否足够准确，以满足你的使用场景。\n\n## 快速开始\n\n### 系统要求\n\n- Windows 10 或更高版本\n- [.NET 9.0 运行时](https:\u002F\u002Fdotnet.microsoft.com\u002Fen-us\u002Fdownload\u002Fdotnet\u002F9.0)\n- [Excel 2010 或更高版本（桌面应用）](https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Fmicrosoft-365\u002Fexcel)\n\n### 安装步骤\n\n1. 前往 [发布页面](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Freleases)，下载 `Cellm-AddIn-Release-x64.msi`。\n\n2. 运行安装程序。\n\n3. 打开 Excel，在 Cellm 选项卡的下拉菜单中选择一个提供商（provider），并输入你的 API 密钥。\n\n你也可以使用本地模型，例如通过 [Ollama](https:\u002F\u002Follama.com\u002F)。下载并安装 [Ollama](https:\u002F\u002Follama.com\u002F)，打开 Windows 终端（在开始菜单中搜索 `Windows Terminal` 并点击 `OK`），输入命令 `ollama pull gemma3:4b`，等待下载完成。然后打开 Excel，在 Cellm 选项卡的下拉菜单中选择 Ollama 提供商，即可开始使用。\n\n## 定价\n- **免费版**：使用本地模型或你自己的 API 密钥\n- **付费版**：适用于需要托管基础设施和欧盟数据处理的团队\n- [查看定价 →](https:\u002F\u002Fgetcellm.com\u002Fpricing)\n\n## 基础用法\n\n选中一个单元格，输入 `=PROMPT(\"你是什么模型？谁开发了你？\")`。对于 Gemma 3 4B 模型，它会告诉你它的名字是 “Gemma”，由 Google DeepMind 开发。\n\n你也可以使用单元格引用。例如，将一篇新闻文章复制到 A1 单元格，在 B1 单元格中输入：`=PROMPT(\"提取文本中提到的所有人名\", A1)`。你可以使用标准 Excel 表示法引用多个单元格，例如 `=PROMPT(\"提取以下单元格中提到的所有人名\", A1:F10)`，或引用多个不连续的区域，例如 `=PROMPT(\"比较这些数据集\", A1:B10, D1:E10)`。\n\n如需了解更高级的用法（包括函数调用和配置），请参阅我们的 [文档](https:\u002F\u002Fdocs.getcellm.com)。\n\n## 支持的模型\n\nCellm 支持：\n- 来自 Azure、AWS、Google、Anthropic、OpenAI、Mistral 等平台的托管模型（hosted models）\n- 通过 Ollama、Llamafiles 或 vLLM 运行的本地模型（local models）\n\n有关配置不同模型的详细信息，请参阅我们的文档：[本地模型](https:\u002F\u002Fdocs.getcellm.com\u002Fmodels\u002Flocal-models) 和 [托管模型](https:\u002F\u002Fdocs.getcellm.com\u002Fmodels\u002Fhosted-models)。\n\n## 使用场景\n\nCellm 适用于对结构化和非结构化数据执行重复性任务：\n\n1. **竞品监控**：每天追踪 50 个网站的竞品价格  \n2. **多语言支持**：分析 10 多种语言的客户反馈  \n3. **文本分类**：对调查回复、客服工单等进行分类  \n4. **模型对比**：并排比较不同 LLM 的输出结果  \n5. **数据清洗**：标准化名称、修复格式问题  \n6. **内容摘要**：压缩文章、论文或报告  \n7. 
**实体识别**：从文本中提取人名、地点、日期等信息  \n\n更多使用案例和示例，请参阅我们的 [提示指南（Prompting Guide）](https:\u002F\u002Fdocs.getcellm.com\u002Fusage\u002Fprompting)。\n\n## 开发\n\n如需使用 Visual Studio 或命令行进行构建的说明，请参阅我们的 [开发指南](https:\u002F\u002Fdocs.getcellm.com\u002Fget-started\u002Fdevelopment)。\n\n## 我们为什么开发 Cellm？\n一位朋友正在撰写系统性综述论文，需要根据纳入\u002F排除标准比对 7,500 篇论文，以筛选出与她研究相关的文献。我们认为这是 LLM 的绝佳应用场景，但很快意识到，如果一篇篇地将论文复制粘贴到聊天窗口中，过程极其繁琐。这促使我们萌生了开发一款 AI 工具的想法，帮助那些不想编程的人自动化重复任务。\n\n我们快速制作了一个原型，让她能将 CSV 文件导入 Excel，并通过类似 `\"如果该论文研究糖尿病神经病变和中风，则返回 INCLUDE，否则返回 EXCLUDE\"` 的提示，一次性对全部 7,500 篇论文进行分类。于是我们决定进一步开发这个工具。\n\n我们认为 Cellm 非常酷，因为它让每个人都能利用 AI 自动化任务，而这种能力过去只有程序员才能实现。\n\n## 遥测（Telemetry）\n\n为了帮助我们改进 Cellm，我们会收集有限且匿名的遥测数据：\n\n- **崩溃报告（Crash reports）**：帮助我们修复错误。\n- **提示词（Prompts）**：帮助我们了解使用模式。例如，如果你使用了 `=PROMPT(\"Extract person names\", A1:B2)`，我们会捕获文本 \"Extract person names\" 以及提示选项（prompt options）。提示选项包括你使用的模型（model）和温度（temperature）设置等。我们**不会**捕获单元格 A1:B2 中的数据。\n\n我们不会收集你电子表格中的任何数据，也无法将你的提示与你本人关联。你可以自行查看代码：[src\u002FCellm\u002FModels\u002FBehaviors\u002FSentryBehavior.cs](src\u002FCellm\u002FModels\u002FBehaviors\u002FSentryBehavior.cs)。\n\n你可以随时通过在安装目录 `C:\\Users\\{username}\\AppData\\Roaming\\Cellm` 下的 `appsettings.Local.json` 文件中添加以下内容来禁用遥测：\n\n```json\n{\n    \"SentryConfiguration\": {\n        \"IsEnabled\": false\n    }\n}\n```\n\n## 许可证\n\nFair Core License, Version 1.0, Apache 2.0 Future License","# Cellm 快速上手指南\n\n## 环境准备\n\n- **操作系统**：Windows 10 或更高版本  \n- **运行时依赖**：[.NET 9.0 Runtime](https:\u002F\u002Fdotnet.microsoft.com\u002Fen-us\u002Fdownload\u002Fdotnet\u002F9.0)（建议从官方下载安装）  \n- **Excel 版本**：Excel 2010 或更高版本（桌面版，不支持网页版）\n\n> 💡 提示：国内用户可尝试使用 [.NET 官方镜像](https:\u002F\u002Fdotnet.microsoft.com\u002Fzh-cn\u002Fdownload\u002Fdotnet\u002F9.0) 加速下载。\n\n## 安装步骤\n\n1. 访问 [Release 页面](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Freleases)，下载安装包 `Cellm-AddIn-Release-x64.msi`。\n2. 双击运行 `.msi` 文件完成安装。\n3. 
打开 Excel，在顶部菜单栏找到 **Cellm** 选项卡：\n   - 从下拉菜单中选择 AI 模型提供商（如 OpenAI、Ollama 等）\n   - 输入对应的 API Key（若使用本地模型则无需）\n\n### 使用本地模型（可选）\n\n如需使用本地 LLM（例如通过 Ollama）：\n\n```bash\n# 安装 Ollama（访问 https:\u002F\u002Follama.com\u002Fdownload\u002FOllamaSetup.exe 下载安装）\n# 在 Windows Terminal 中执行：\nollama pull gemma3:4b\n```\n\n安装完成后，在 Excel 的 Cellm 选项卡中选择 **Ollama** 作为提供商即可使用。\n\n## 基本使用\n\n在任意单元格中输入以下公式即可调用 AI：\n\n```excel\n=PROMPT(\"What model are you and who made you?\")\n```\n\n### 引用单元格内容\n\n将文本放入 A1 单元格，然后在 B1 输入：\n\n```excel\n=PROMPT(\"Extract all person names mentioned in the text\", A1)\n```\n\n支持多单元格引用：\n\n```excel\n=PROMPT(\"Extract all person names in the cells\", A1:F10)\n```\n\n或多个区域：\n\n```excel\n=PROMPT(\"Compare these datasets\", A1:B10, D1:E10)\n```\n\n拖动填充柄即可批量处理成千上万行数据。","某跨境电商运营专员每天需监控50家海外竞品官网的最新促销信息，并将活动内容翻译成中文、归类为“折扣”“新品”或“清仓”，以便团队晨会讨论。\n\n### 没有 cellm 时\n- 需手动逐个打开竞品网站，复制首页Banner或公告栏文本，再粘贴到翻译工具中处理。\n- 翻译后还需人工判断活动类型，耗时且易因疲劳导致分类错误。\n- 整个流程每天至少花费1小时，若遇网站改版或语言切换，效率进一步下降。\n- 团队无法及时获取完整情报，常在晨会中依赖零散截图，决策依据不足。\n- 若想批量处理历史数据（如上周所有促销），几乎不可行，只能放弃分析。\n\n### 使用 cellm 后\n- 在Excel中列出所有竞品URL，用`=PROMPT(\"提取页面主促销文案\", A2)`自动抓取内容，拖拽即可批量执行。\n- 新增列使用`=PROMPT(\"将以下文本翻译成中文\", B2)`和`=PROMPT(\"判断该促销属于：折扣\u002F新品\u002F清仓\", C2)`，一键完成翻译与分类。\n- 全流程在Excel内完成，无需切换多个工具，10分钟内生成结构化日报。\n- 数据可直接用于图表或共享给同事，晨会前自动刷新最新结果。\n- 历史数据只需保留URL列表，随时重新运行公式回溯分析，提升复盘效率。\n\ncellm 将原本碎片化、高重复的手工操作转化为可复用的智能公式，让普通业务人员也能高效驾驭AI处理真实业务数据。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgetcellm_cellm_c063bb43.png","getcellm","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fgetcellm_588a8f97.png",null,"https:\u002F\u002Fgithub.com\u002Fgetcellm",[80,84],{"name":81,"color":82,"percentage":83},"C#","#178600",98.8,{"name":85,"color":86,"percentage":87},"PowerShell","#012456",1.2,922,42,"2026-04-05T10:48:26","NOASSERTION","Windows","未说明",{"notes":95,"python":93,"dependencies":96},"需要安装 .NET 9.0 Runtime 和 Excel 2010 或更高版本（桌面应用）。支持通过 Ollama 使用本地模型（如 gemma3:4b），此时需额外安装 Ollama 并拉取相应模型。使用托管模型需提供对应服务商的 API 
密钥。",[],[14,15,13],[99,100],"ai","excel","2026-03-27T02:49:30.150509","2026-04-06T06:51:52.805782",[104,109,114,119,124,128],{"id":105,"question_zh":106,"answer_zh":107,"source_url":108},695,"导入 Excel 插件时提示“未在配置中找到 'OpenAiCompatibleConfiguration' 节”怎么办？","该问题已在 PR #92 中修复。请确保拉取最新代码并重新构建项目。此修复为 OpenAiCompatible 提供商设置了默认值，若缺少 API 密钥会显示更明确的错误信息。如仍遇到问题，请确认是否使用了最新的 main 分支代码。","https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fissues\u002F89",{"id":110,"question_zh":111,"answer_zh":112,"source_url":113},696,"双击 Cellm-AddIn-packed.xll 安装时提示“文件可能已损坏或不安全”如何解决？","请下载并使用 64 位版本的插件文件 Cellm-AddIn64-packed.xll，而非 Cellm-AddIn-packed.xll（32 位版本）。此前 README 中曾错误地推荐了 32 位版本，现已修正，仅推荐使用 64 位版本以避免此问题。","https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fissues\u002F105",{"id":115,"question_zh":116,"answer_zh":117,"source_url":118},697,"使用 ChatGPT 或 Anthropic 等托管 LLM 时提示 API Key 错误，但密钥本身是有效的，如何解决？","请确保在修改 appsettings.Local.json 文件中的 API 密钥后重启 Excel，以便加载新的配置。此外，建议使用 main 分支的最新代码进行编译，该问题已在新版本中修复。如果通过 UI 设置密钥仍失败，可尝试手动编辑配置文件并重启 Excel。","https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fissues\u002F104",{"id":120,"question_zh":121,"answer_zh":122,"source_url":123},698,"在 Excel 表格中使用结构化引用（如 [@sold]）调用 PROMPT 函数不生效怎么办？","该问题已在 PR #228 中修复。请使用 main 分支的最新代码自行编译，或等待下一个正式版本发布。修复后即可正常在表格中使用类似 =PROMPT([@sold],\"clean up and output a single number\") 的公式。","https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fissues\u002F218",{"id":125,"question_zh":126,"answer_zh":127,"source_url":113},699,"使用 OpenAI API 兼容模式时更改模型名称无效，仍报错“模型 'gpt-4o' 不存在”，如何解决？","“gpt-4o”仅为示例占位符。您可以在模型名称输入框中直接填写实际模型名，例如 \"openaicompatible\u002FPhi-4\"。此前版本存在模型名未正确传递的问题，已在 Issue #116 中修复。请确保使用最新版本，并直接在 UI 文本框中输入正确的模型名称。",{"id":129,"question_zh":130,"answer_zh":131,"source_url":132},700,"对大量单元格应用公式时出现“目标机器主动拒绝连接”错误，如何避免？","该问题源于短时间内发起过多 API 请求。虽然 Issue #113 提出了增加限流、队列或重试机制的建议，但目前官方尚未在 FAQ 中提供具体配置方案。建议临时解决方案是减少并发单元格数量，或使用支持本地模型（如 
Ollama）以提高稳定性。后续版本可能会加入请求控制机制。","https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fissues\u002F113",[134,139,144,149,154,159,164],{"id":135,"version":136,"summary_zh":137,"released_at":138},100268,"v0.5.0","This release adds support for multiple cell ranges in PROMPT functions, adds OpenRouter model provider, and includes bug fixes and improvements.\r\n\r\n## ⚠️ Breaking Changes\r\n\r\n**Excel UDF Function Signature Change**\r\n\r\nThe `PROMPT` and `PROMPTMODEL` functions now accept multiple cell ranges instead of a temperature parameter:\r\n\r\n- **Old signature**: `=PROMPT(instructions, A1:D4, 0.7)`\r\n- **New signature**: `=PROMPT(instructions, A1:D4, F10, K11:O30)`\r\n\r\nTemperature is now configured solely through the ribbon UI. This change enables passing multiple cells and cell ranges as context to your prompts.\r\n\r\n**Migration**: Remove the temperature parameter from your `=PROMPT`s.\r\n\r\n## What's New\r\n\r\n- **OpenRouter Provider**: Connect to 200+ AI models through OpenRouter, giving you access to a wide variety of language models from different providers in one place\r\n- **Mistral Large Support**: Added support for Mistral's flagship large language model\r\n- **Test Suite**: Introduced comprehensive test coverage to improve reliability and maintainability\r\n- **Excel-DNA Update**: Updated to Excel-DNA v1.9.0 for better performance and compatibility\r\n\r\n## Bug Fixes\r\n\r\n- Fixed incorrect ChatRole usage in AddAssistantMessage method\r\n- Improved provider name logging for better debugging\r\n\r\n## What's Changed\r\n* build(deps): bump actions\u002Fcheckout from 5 to 6 by @dependabot[bot] in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F307\r\n* feat: Add OpenRouter provider by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F308\r\n* feat: Add Mistral Large by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F310\r\n* feat!: Support 
multiple cell ranges in PROMPT functions by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F312\r\n* docs: Add temperature preset details to PROMPT and PROMPTMODEL functions docs by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F313\r\n* build: Update Excel-DNA to v1.9.0 by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F314\r\n* feat: Add tests by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F316\r\n* fix: Log provider name by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F318\r\n* fix: Use ChatRole.Assistant in AddAssistantMessage method by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F319\r\n* chore: bump version to v0.5.0 by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F320\r\n\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fcompare\u002Fv0.4.1...v0.5.0","2026-01-12T19:52:11",{"id":140,"version":141,"summary_zh":142,"released_at":143},100269,"v0.4.1","This is a maintenance release with documentation improvements and a critical bug fix.\r\n\r\n## Bug Fixes\r\n\r\n- Fixed Cellm proxy base address: Resolved an issue with the Cellm-managed models proxy configuration that could prevent proper connection to the service.\r\n\r\n## What's Changed\r\n* docs: Improve README.md by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F300\r\n* build: Bump version by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F301\r\n* docs: v0.4.0 by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F303\r\n* docs(v0.4.0): Fix overview, quickstart by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F304\r\n* fix: Cellm proxy base 
address by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F305\r\n* chore: Bump version to v0.4.1 by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F306\r\n\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fcompare\u002Fv0.4.0...v0.4.1","2025-11-15T20:25:02",{"id":145,"version":146,"summary_zh":147,"released_at":148},100270,"v0.4.0","This release is a big one! We're introducing Cellm-managed models, structured outputs that make organizing your spreadsheets easier, and many performance improvements and bug fixes.\r\n\r\nWe're beginning to invite users from our waitlist to try Cellm-managed models. Your first 30 days are on us, and we've built the system with privacy as our top priority: all data is processed exclusively within the EU using Mistral models. If you haven't signed up yet, head over to [getcellm.com](https:\u002F\u002Fgetcellm.com\u002F) to join the waitlist—there's still time to get in!\r\n\r\n## What's New\r\n\r\n* **Cellm-Managed Models:** Once invited, you can now sign in to your Cellm account and access AI models without managing API keys yourself. We handle the provider connections, so you can focus on your work. Until you get an invite, you can still download Cellm and use your own API keys with popular providers like Anthropic, OpenAI, DeepSeek, and Mistral.\r\n\r\n* **MCP Support**: You can connect your spreadsheets to Model Context Protocol (MCP) servers, giving your Excel formulas access to external data sources and tools. This transforms Excel into a low-code automation platform where you can query databases, send emails, or pull data from APIs.\r\n\r\n* **Structured Outputs:** New functions (`PROMPT.TOROW`, `PROMPT.TOCOLUMN`, `PROMPT.TORANGE`) let you control exactly how AI responses flow into your spreadsheet. Need a single value? A row of data? A column? A table? Just pick the right function and Cellm handles the formatting. 
This makes it much easier to build dynamic spreadsheets that transform AI responses into usable data structures.\r\n\r\n* **More Provider Options:** Added support for Google Gemini, Azure OpenAI, and AWS Bedrock. Combined with our existing integrations, you now have access to nearly every major AI provider from within Excel. We've also updated model lists across providers to include the latest releases.\r\n\r\n* **Installer:** The new MSI installer automatically sets up Cellm on your machine and registers the plugin with Excel so that it automatically loads on startup.\r\n\r\n## Performance Improvements\r\n\r\n* **Better Excel Responsiveness:** Moved more processing off Excel's main thread, so the application stays responsive even when you're running multiple prompts simultaneously. You should notice this especially when working with many cells at once.\r\n\r\n* **Smarter Retry Logic:** Tuned the resilience pipeline to handle API failures more gracefully, with clearer error messages when something goes wrong (like missing API keys).\r\n\r\n**Statistics**: Added prompt counting, token counting, average tokens per second, and average request per second statistics.\r\n\r\n## Bug Fixes\r\n\r\n* Fixed an issue where concurrent calls with identical arguments would incorrectly reuse responses\r\n* Resolved temperature parsing issues in non-US cultures\r\n* Fixed several MCP-related concurrency and caching issues\r\n* Improved handling of structured outputs when tools are enabled\r\n* Corrected token counting and speed statistics\r\n\r\n## What's Changed\r\n* fix: Sentry traces sample rate by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F144\r\n* feat: Add entitlements and Cellm provider by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F145\r\n* refactor: Remove Serde class by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F146\r\n* refactor: Remove 
dedicated Llamafile provider, use OpenAiCompatible provider instead by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F147\r\n* feat: Add max output tokens to PromptBuilder by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F148\r\n* refactor: Services by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F149\r\n* refactor: Remove appsettings.Local.{provider].json, use UI instead by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F150\r\n* feat: Do not auto-download Ollama models by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F151\r\n* refactor: Use ChatClientFactory in a single RequestHandler by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F152\r\n* refactor: Drop separate shared source project for models by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F154\r\n* style: Adopt \"-Async\" postfix naming convention by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F155\r\n* fix: Add basic auth to Cellm provider HttpClient by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F156\r\n* feat: Add MCP entitlement by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F157\r\n* feat: Add account UI by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F158\r\n* fix: Concurrent access to MCP tool cache by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F159\r\n* style: Use System.Net.Http auth header by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F160\r\n* refactor: Model Group UI by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F161\r\n* feat: 
Add MCP menu by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F163\r\n* feat: U","2025-10-27T18:43:38",{"id":150,"version":151,"summary_zh":152,"released_at":153},100271,"v0.3.0","This release introduces new capabilities for connecting Cellm to external systems and improves underlying integrations with model providers.\r\n\r\n## What's New\r\n* **Add support for Model Context Protocol (MCP):** You can now connect Cellm to MCP-compatible servers, allowing your Excel sheets to interact with external data sources or trigger actions via Cellm functions. Effectively, it turns Excel into a low-code automation tool, enabling workflows orchestrated directly from your spreadsheet. We're still trying to wrap our heads around the possibilities this unlocks. They are freakin' endless.\r\n* **Adopt `Anthropic.SDK` for Anthropic Calls:** Previously, we rolled our own custom Anthropic client because there was no official SDK. We've now migrated our Anthropic integration to the community-driven `Anthropic.SDK` ([github.com\u002Ftghamm\u002FAnthropic.SDK](https:\u002F\u002Fgithub.com\u002Ftghamm\u002FAnthropic.SDK)). This decision was driven by Anthropic's adoption of this SDK for their official .NET MCP library. This move ensures better standardization, aligns Cellm with `Microsoft.Extensions.AI` patterns, and leverages ongoing community improvements.\r\n* **Telemetry:** To help us identify bugs and understand usage patterns, we now send anonymized crash reports and model interaction details to Sentry. We never capture any data from your spreadsheet.
Still, you can opt out of this at any time by adding the following to your appsettings.Local.json:\r\n\r\n```json\r\n{\r\n    \"SentryConfiguration\": {\r\n        \"IsEnabled\": false\r\n    }\r\n}\r\n```\r\n\r\n## Bug fixes\r\n* Fixed a regression that inadvertently broke the use of tools with AI models.\r\n* Adjusted the prompt caching mechanism to correctly invalidate the cache when tools are added or removed.\r\n* Tuned the default settings for the rate limiter (Retry and Circuit Breaker policies) to work together more effectively during periods of high activity. These defaults prioritize stability and avoiding upstream provider rate limits. You can always crank up the limits or remove them altogether, but aggressive settings may lead to more errors from the AI provider.\r\n\r\n## What's Changed\r\n* Merge develop into main by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F129\r\n* fix: Prompts involving tool calls would return the tool call message, not the assistant's reply to the tool call result by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F130\r\n* build: Update Microsoft.Extensions.AI version to 9.4.0-preview.1.25207.5 by @MackinnonBuck in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F133\r\n* Merge dev by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F136\r\n* build: Update dependencies by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F138\r\n* fix: Retry on broken circuit and Anthropic's rate limits exceeded, guarantee first retry waits until opened circuit is closed, and reduce aggresiveness of circuit breaker by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F140\r\n* fix: Adding\u002Fremoving tools did not invalidate cache by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F139\r\n* 
refactor: Sentry PipelineBehavior by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F137\r\n* fix: Sentry transaction contexts by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F142\r\n* build: Bump version by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F143\r\n\r\n## New Contributors\r\n* @MackinnonBuck made their first contribution in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F133\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fcompare\u002Fv0.2.0...v0.3.0","2025-04-12T14:40:36",{"id":155,"version":156,"summary_zh":157,"released_at":158},100272,"v0.2.0","This release addresses several issues when working with multiple prompts simultaneously and introduces comprehensive documentation!\r\n\r\n## What's New:\r\n- **Official Documentation**: Our new [documentation](https:\u002F\u002Fdocs.getcellm.com\u002F) is now available!\r\n\r\n## Bug Fixes:\r\n- **UI Responsiveness**: Fixed an issue where the interface would become unresponsive when sending multiple prompts\r\n- **UI Configuration**: Configuration changes now apply immediately without requiring application restart\r\n- **Smarter Rate Limiting**: Implemented centralized rate limiting that applies individually to each provider, creating a smoother experience when sending multiple prompts to different models simultaneously\r\n\r\nThese improvements should make your workflow more efficient and reliable when working with multiple prompts across different models. 
Enjoy!\r\n\r\n## What's Changed\r\n* refactor: Move Cellm.Models to shared project by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F111\r\n* docs: Improve README by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F112\r\n* feat: Add retry mechanism to Ollama and OpenAiCompatible providers by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F114\r\n* fix: Prompt not cached because cache key length sometimes too long by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F115\r\n* refactor: Add IChatClient with transient lifetime for all providers by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F116\r\n* docs: Add documentation by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F117\r\n* docs: Tighten up README, link to docs by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F118\r\n* docs: Update R2 domain by @zachasme in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F119\r\n* fix: Unblock UI thread when sending many prompts by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F120\r\n* refactor: Apply rate limit across providers by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F121\r\n* fix: Increase default rate limit by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F125\r\n\r\n## New Contributors\r\n* @zachasme made their first contribution in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F119\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fcompare\u002Fv0.1.1...v0.2.0","2025-03-19T20:17:26",{"id":160,"version":161,"summary_zh":162,"released_at":163},100273,"v0.1.1","This is a minor release that mainly 
fixes a bug where changing API keys either via the UI or appsettings.Local.json would not get picked up until Excel was restarted.\r\n\r\n## What's Changed\r\n* docs: Add CLA by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F102\r\n* docs: Fix README typos by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F103\r\n* build: Add global.json to specify .NET SDK version 9.X.X by @johnnyoshika in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F101\r\n* fix: Changing API keys while app is running by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F109\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fcompare\u002Fv0.1.0...v0.1.1","2025-02-06T08:32:08",{"id":165,"version":166,"summary_zh":167,"released_at":168},100274,"v0.1.0","# **Release v0.1.0**  \r\nCellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. It is designed for automating repetitive text-based tasks and comes with\r\n\r\n- **Local and hosted models**: Defaults to free local inference (Gemma 2 2B via Ollama) while supporting commercial APIs  \r\n- **Formula-driven workflow**: `=PROMPT()` and `=PROMPTWITH()` functions for drag-and-fill operations across cell ranges.\r\n\r\n### Install\r\n\r\n1. Download `Cellm-AddIn64-packed.xll` and `appsettings.json`. Put them in the _same_ folder.\r\n\r\n2. Double-click on `Cellm-AddIn64-packed.xll`. Excel will open and install Cellm.\r\n\r\n3. Download and install [Ollama](https:\u002F\u002Follama.com\u002F). Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call `=PROMPT()`. To call other models, see the Models section in the README.\r\n\r\n### Uninstall\r\n\r\n1. In Excel, go to File > Options > Add-Ins.\r\n2.
In the `Manage` drop-down menu, select `Excel Add-ins` and click `Go...`.\r\n3. Uncheck `Cellm-AddIn64-packed.xll` and click `OK`.\r\n\r\n### **Known Limitations**  \r\n1. **Windows-only**: No macOS\u002FLinux support planned for initial versions  \r\n2. **Input constraints**:  \r\n   - Formula arguments limited to 8,192 characters (Excel string limit)  \r\n   - No native support for multi-turn conversations  \r\n3. **Model variability**: Output quality depends on selected LLM (validate critically)  \r\n\r\n---\r\n\r\n### **Contribution & Feedback**  \r\nReport issues or suggest improvements via [GitHub Issues](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fissues).  \r\n\r\n---\r\n\r\n**License**: [Fair Core License](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm#license)  \r\n**Full Documentation**: [README](https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm#readme)\r\n\r\n## What's Changed\r\n* feat: Add LlamafileClient by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F1\r\n* bug: Fix AddSystemMessage by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F4\r\n* bug: Fix llamafile health uri by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F3\r\n* models: Add qwen-0.5b by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F2\r\n* docs: Tighten up README by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F6\r\n* feat: Manually dispose of ServiceLocator by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F7\r\n* bug: By default assign telemetry to default model of provider, not default model of Cellm; refactor: Rename
GoogleClient to GoogleAiClient by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F8\r\n* docs: Add support for Mistral by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F9\r\n* bug: Disable sentry by default until fix for missing immutable arrays is identified by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F11\r\n* feat: Add concurrency rate limiting by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F10\r\n* feat: Add support for running multiple Llamafiles simultaneously  by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F12\r\n* build: Enforce code style in build by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F13\r\n* git: Add Excel files to .gitignore by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F14\r\n* docs: Improve README by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F15\r\n* Prompt: Further optimize system prompt for small models with limited instruction-following capability. 
Larger models will understand anyway by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F16\r\n* docs: Proof-read README.md by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F17\r\n* feat: Add support for OpenAI tools by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F18\r\n* feat: Upgrade default Anthropic model to claude-3-5-sonnet-20241022 by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F20\r\n* refactor: Add provider enum by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F19\r\n* refactor: Rename CellmFunctions to Functions and CellPrompts to SystemMessages by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F21\r\n* ci: Add conventional commits lint by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F22\r\n* build: Make internals visible to Cellm.Tests by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F23\r\n* refactor: Pull out XllPath into settable property by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F24\r\n* feat: Add SentryBehavior and CachingBehavior to model request pipeline by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F25\r\n* refactor: Splot Tools into ToolRunner and ToolFactory by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F26\r\n* feat: Upgrade Claude 3.5 Sonnet by @kaspermarstal in https:\u002F\u002Fgithub.com\u002Fgetcellm\u002Fcellm\u002Fpull\u002F27\r\n* refactor: Remove s","2025-01-30T22:18:21"]