[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-microsoft--PhiCookBook":3,"tool-microsoft--PhiCookBook":65},[4,17,25,39,48,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,14,15],"开发框架","Agent","语言模型","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":10,"last_commit_at":23,"category_tags":24,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,15],{"id":26,"name":27,"github_repo":28,"description_zh":29,"stars":30,"difficulty_score":10,"last_commit_at":31,"category_tags":32,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 
道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[33,34,35,36,14,37,15,13,38],"图像","数据工具","视频","插件","其他","音频",{"id":40,"name":41,"github_repo":42,"description_zh":43,"stars":44,"difficulty_score":45,"last_commit_at":46,"category_tags":47,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,3,"2026-04-04T04:44:48",[14,33,13,15,37],{"id":49,"name":50,"github_repo":51,"description_zh":52,"stars":53,"difficulty_score":45,"last_commit_at":54,"category_tags":55,"status":16},519,"PaddleOCR","PaddlePaddle\u002FPaddleOCR","PaddleOCR 是一款基于百度飞桨框架开发的高性能开源光学字符识别工具包。它的核心能力是将图片、PDF 等文档中的文字提取出来，转换成计算机可读取的结构化数据，让机器真正“看懂”图文内容。\n\n面对海量纸质或电子文档，PaddleOCR 解决了人工录入效率低、数字化成本高的问题。尤其在人工智能领域，它扮演着连接图像与大型语言模型（LLM）的桥梁角色，能将视觉信息直接转化为文本输入，助力智能问答、文档分析等应用场景落地。\n\nPaddleOCR 适合开发者、算法研究人员以及有文档自动化需求的普通用户。其技术优势十分明显：不仅支持全球 100 多种语言的识别，还能在 Windows、Linux、macOS 等多个系统上运行，并灵活适配 CPU、GPU、NPU 等各类硬件。作为一个轻量级且社区活跃的开源项目，PaddleOCR 
既能满足快速集成的需求，也能支撑前沿的视觉语言研究，是处理文字识别任务的理想选择。",74913,"2026-04-05T10:44:17",[15,33,13,37],{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":62,"last_commit_at":63,"category_tags":64,"status":16},3215,"awesome-machine-learning","josephmisiti\u002Fawesome-machine-learning","awesome-machine-learning 是一份精心整理的机器学习资源清单，汇集了全球优秀的机器学习框架、库和软件工具。面对机器学习领域技术迭代快、资源分散且难以甄选的痛点，这份清单按编程语言（如 Python、C++、Go 等）和应用场景（如计算机视觉、自然语言处理、深度学习等）进行了系统化分类，帮助使用者快速定位高质量项目。\n\n它特别适合开发者、数据科学家及研究人员使用。无论是初学者寻找入门库，还是资深工程师对比不同语言的技术选型，都能从中获得极具价值的参考。此外，清单还延伸提供了免费书籍、在线课程、行业会议、技术博客及线下聚会等丰富资源，构建了从学习到实践的全链路支持体系。\n\n其独特亮点在于严格的维护标准：明确标记已停止维护或长期未更新的项目，确保推荐内容的时效性与可靠性。作为机器学习领域的“导航图”，awesome-machine-learning 以开源协作的方式持续更新，旨在降低技术探索门槛，让每一位从业者都能高效地站在巨人的肩膀上创新。",72149,1,"2026-04-03T21:50:24",[13,37],{"id":66,"github_repo":67,"name":68,"description_en":69,"description_zh":70,"ai_summary_zh":71,"readme_en":72,"readme_zh":73,"quickstart_zh":74,"use_case_zh":75,"hero_image_url":76,"owner_login":77,"owner_name":78,"owner_avatar_url":79,"owner_bio":80,"owner_company":81,"owner_location":81,"owner_email":82,"owner_twitter":83,"owner_website":84,"owner_url":85,"languages":86,"stars":120,"forks":121,"last_commit_at":122,"license":123,"difficulty_score":10,"env_os":124,"env_gpu":125,"env_ram":126,"env_deps":127,"category_tags":139,"github_topics":140,"view_count":10,"oss_zip_url":81,"oss_zip_packed_at":81,"status":16,"created_at":154,"updated_at":155,"faqs":156,"releases":186},2705,"microsoft\u002FPhiCookBook","PhiCookBook","This is a book for getting started with the Phi family of SLMs. Phi is a family of open-source AI models developed by Microsoft. 
Phi models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and next size up across a variety of language, reasoning, coding, and math benchmarks","PhiCookBook 是微软官方推出的 Phi 系列小语言模型（SLM）实战指南，旨在帮助开发者快速上手并应用这些高性能开源模型。作为目前性价比极高的小语言模型家族，Phi 在语言理解、逻辑推理、代码生成及数学计算等多个基准测试中，表现不仅超越同体量模型，甚至媲美更大规模的模型。\n\n许多开发者在资源受限的环境下，往往难以部署大型 AI 模型。PhiCookBook 通过提供丰富的动手示例和代码模板，解决了这一痛点，让用户能够轻松将 Phi 模型部署到云端或边缘设备，在有限的算力条件下构建高效的生成式 AI 应用。无论是进行多语言处理、文本对话，还是处理图像与音频任务，这里都能找到对应的实践方案。\n\n这份资源特别适合软件开发者、AI 研究人员以及希望探索轻量级大模型应用的技术爱好者。其独特亮点在于提供了开箱即用的开发环境支持，用户可直接通过 GitHub Codespaces 或 VS Code 容器一键启动项目，无需繁琐配置。此外，社区还通过自动化工作流支持全球数十种语言的文档翻译，确保了内容的广泛可访问性与实时更新。无论你是想尝试最新的 SLM 技术，还是寻求低成本的高效 AI 解","PhiCookBook 是微软官方推出的 Phi 系列小语言模型（SLM）实战指南，旨在帮助开发者快速上手并应用这些高性能开源模型。作为目前性价比极高的小语言模型家族，Phi 在语言理解、逻辑推理、代码生成及数学计算等多个基准测试中，表现不仅超越同体量模型，甚至媲美更大规模的模型。\n\n许多开发者在资源受限的环境下，往往难以部署大型 AI 模型。PhiCookBook 通过提供丰富的动手示例和代码模板，解决了这一痛点，让用户能够轻松将 Phi 模型部署到云端或边缘设备，在有限的算力条件下构建高效的生成式 AI 应用。无论是进行多语言处理、文本对话，还是处理图像与音频任务，这里都能找到对应的实践方案。\n\n这份资源特别适合软件开发者、AI 研究人员以及希望探索轻量级大模型应用的技术爱好者。其独特亮点在于提供了开箱即用的开发环境支持，用户可直接通过 GitHub Codespaces 或 VS Code 容器一键启动项目，无需繁琐配置。此外，社区还通过自动化工作流支持全球数十种语言的文档翻译，确保了内容的广泛可访问性与实时更新。无论你是想尝试最新的 SLM 技术，还是寻求低成本的高效 AI 解决方案，PhiCookBook 都是理想的入门起点。","# Phi Cookbook: Hands-On Examples with Microsoft's Phi Models\n\n[![Open and use the samples in GitHub Codespaces](https:\u002F\u002Fgithub.com\u002Fcodespaces\u002Fbadge.svg)](https:\u002F\u002Fcodespaces.new\u002Fmicrosoft\u002Fphicookbook)\n[![Open in Dev Containers](https:\u002F\u002Fimg.shields.io\u002Fstatic\u002Fv1?style=for-the-badge&label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https:\u002F\u002Fvscode.dev\u002Fredirect?url=vscode:\u002F\u002Fms-vscode-remote.remote-containers\u002FcloneInVolume?url=https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fphicookbook)\n\n[![GitHub 
contributors](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002Fmicrosoft\u002Fphicookbook.svg)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fgraphs\u002Fcontributors\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub issues](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002Fmicrosoft\u002Fphicookbook.svg)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fissues\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub pull-requests](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues-pr\u002Fmicrosoft\u002Fphicookbook.svg)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fpulls\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![PRs Welcome](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPRs-welcome-brightgreen.svg?style=flat-square)](http:\u002F\u002Fmakeapullrequest.com?WT.mc_id=aiml-137032-kinfeylo)\n\n[![GitHub watchers](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fwatchers\u002Fmicrosoft\u002Fphicookbook.svg?style=social&label=Watch)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fwatchers\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub forks](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fmicrosoft\u002Fphicookbook.svg?style=social&label=Fork)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fnetwork\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub stars](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002Fmicrosoft\u002Fphicookbook?style=social&label=Star)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fstargazers\u002F?WT.mc_id=aiml-137032-kinfeylo)\n\n[![Microsoft Foundry Discord](https:\u002F\u002Fdcbadge.limes.pink\u002Fapi\u002Fserver\u002FByRwuEEgH4)](https:\u002F\u002Fdiscord.com\u002Finvite\u002FByRwuEEgH4)\n\nPhi is a series of open source AI models developed by Microsoft. 
\n\nPhi is currently the most powerful and cost-effective small language model (SLM), with very good benchmarks in multi-language, reasoning, text\u002Fchat generation, coding, images, audio and other scenarios. \n\nYou can deploy Phi to the cloud or to edge devices, and you can easily build generative AI applications with limited computing power.\n\nFollow these steps to get started using these resources:\n1. **Fork the Repository**: Click [![GitHub forks](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fmicrosoft\u002Fphicookbook.svg?style=social&label=Fork)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fnetwork\u002F?WT.mc_id=aiml-137032-kinfeylo)\n2. **Clone the Repository**: `git clone https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git`\n3. [**Join the Microsoft AI Discord Community and meet experts and fellow developers**](https:\u002F\u002Fdiscord.com\u002Finvite\u002FByRwuEEgH4?WT.mc_id=aiml-137032-kinfeylo)\n\n![cover](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fmicrosoft_PhiCookBook_readme_b7278d1b8b50.png)\n\n### 🌐 Multi-Language Support\n\n#### Supported via GitHub Action (Automated & Always Up-to-Date)\n\n\u003C!-- CO-OP TRANSLATOR LANGUAGES TABLE START -->\n[Arabic](.\u002Ftranslations\u002Far\u002FREADME.md) | [Bengali](.\u002Ftranslations\u002Fbn\u002FREADME.md) | [Bulgarian](.\u002Ftranslations\u002Fbg\u002FREADME.md) | [Burmese (Myanmar)](.\u002Ftranslations\u002Fmy\u002FREADME.md) | [Chinese (Simplified)](.\u002Ftranslations\u002Fzh-CN\u002FREADME.md) | [Chinese (Traditional, Hong Kong)](.\u002Ftranslations\u002Fzh-HK\u002FREADME.md) | [Chinese (Traditional, Macau)](.\u002Ftranslations\u002Fzh-MO\u002FREADME.md) | [Chinese (Traditional, Taiwan)](.\u002Ftranslations\u002Fzh-TW\u002FREADME.md) | [Croatian](.\u002Ftranslations\u002Fhr\u002FREADME.md) | [Czech](.\u002Ftranslations\u002Fcs\u002FREADME.md) | [Danish](.\u002Ftranslations\u002Fda\u002FREADME.md) | 
[Dutch](.\u002Ftranslations\u002Fnl\u002FREADME.md) | [Estonian](.\u002Ftranslations\u002Fet\u002FREADME.md) | [Finnish](.\u002Ftranslations\u002Ffi\u002FREADME.md) | [French](.\u002Ftranslations\u002Ffr\u002FREADME.md) | [German](.\u002Ftranslations\u002Fde\u002FREADME.md) | [Greek](.\u002Ftranslations\u002Fel\u002FREADME.md) | [Hebrew](.\u002Ftranslations\u002Fhe\u002FREADME.md) | [Hindi](.\u002Ftranslations\u002Fhi\u002FREADME.md) | [Hungarian](.\u002Ftranslations\u002Fhu\u002FREADME.md) | [Indonesian](.\u002Ftranslations\u002Fid\u002FREADME.md) | [Italian](.\u002Ftranslations\u002Fit\u002FREADME.md) | [Japanese](.\u002Ftranslations\u002Fja\u002FREADME.md) | [Kannada](.\u002Ftranslations\u002Fkn\u002FREADME.md) | [Korean](.\u002Ftranslations\u002Fko\u002FREADME.md) | [Lithuanian](.\u002Ftranslations\u002Flt\u002FREADME.md) | [Malay](.\u002Ftranslations\u002Fms\u002FREADME.md) | [Malayalam](.\u002Ftranslations\u002Fml\u002FREADME.md) | [Marathi](.\u002Ftranslations\u002Fmr\u002FREADME.md) | [Nepali](.\u002Ftranslations\u002Fne\u002FREADME.md) | [Nigerian Pidgin](.\u002Ftranslations\u002Fpcm\u002FREADME.md) | [Norwegian](.\u002Ftranslations\u002Fno\u002FREADME.md) | [Persian (Farsi)](.\u002Ftranslations\u002Ffa\u002FREADME.md) | [Polish](.\u002Ftranslations\u002Fpl\u002FREADME.md) | [Portuguese (Brazil)](.\u002Ftranslations\u002Fpt-BR\u002FREADME.md) | [Portuguese (Portugal)](.\u002Ftranslations\u002Fpt-PT\u002FREADME.md) | [Punjabi (Gurmukhi)](.\u002Ftranslations\u002Fpa\u002FREADME.md) | [Romanian](.\u002Ftranslations\u002Fro\u002FREADME.md) | [Russian](.\u002Ftranslations\u002Fru\u002FREADME.md) | [Serbian (Cyrillic)](.\u002Ftranslations\u002Fsr\u002FREADME.md) | [Slovak](.\u002Ftranslations\u002Fsk\u002FREADME.md) | [Slovenian](.\u002Ftranslations\u002Fsl\u002FREADME.md) | [Spanish](.\u002Ftranslations\u002Fes\u002FREADME.md) | [Swahili](.\u002Ftranslations\u002Fsw\u002FREADME.md) | [Swedish](.\u002Ftranslations\u002Fsv\u002FREADME.md) | [Tagalog 
(Filipino)](.\u002Ftranslations\u002Ftl\u002FREADME.md) | [Tamil](.\u002Ftranslations\u002Fta\u002FREADME.md) | [Telugu](.\u002Ftranslations\u002Fte\u002FREADME.md) | [Thai](.\u002Ftranslations\u002Fth\u002FREADME.md) | [Turkish](.\u002Ftranslations\u002Ftr\u002FREADME.md) | [Ukrainian](.\u002Ftranslations\u002Fuk\u002FREADME.md) | [Urdu](.\u002Ftranslations\u002Fur\u002FREADME.md) | [Vietnamese](.\u002Ftranslations\u002Fvi\u002FREADME.md)\n\n> **Prefer to Clone Locally?**\n>\n> This repository includes 50+ language translations which significantly increases the download size. To clone without translations, use sparse checkout:\n>\n> **Bash \u002F macOS \u002F Linux:**\n> ```bash\n> git clone --filter=blob:none --sparse https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git\n> cd PhiCookBook\n> git sparse-checkout set --no-cone '\u002F*' '!translations' '!translated_images'\n> ```\n>\n> **CMD (Windows):**\n> ```cmd\n> git clone --filter=blob:none --sparse https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git\n> cd PhiCookBook\n> git sparse-checkout set --no-cone \"\u002F*\" \"!translations\" \"!translated_images\"\n> ```\n>\n> This gives you everything you need to complete the course with a much faster download.\n\u003C!-- CO-OP TRANSLATOR LANGUAGES TABLE END -->\n\n## Table of Contents\n\n- Introduction\n  - [Welcome to the Phi Family](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.PhiFamily.md)\n  - [Setting up your environment](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.EnvironmentSetup.md)\n  - [Understanding Key Technologies](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Understandingtech.md)\n  - [AI Safety for Phi Models](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.AISafety.md)\n  - [Phi Hardware Support](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Hardwaresupport.md)\n  - [Phi Models & Availability across platforms](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Edgeandcloud.md)\n  - [Using Guidance-ai and 
Phi](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Guidance.md)\n  - [GitHub Marketplace Models](https:\u002F\u002Fgithub.com\u002Fmarketplace\u002Fmodels)\n  - [Azure AI Model Catalog](https:\u002F\u002Fai.azure.com)\n\n- Inference Phi in different environments\n    -  [Hugging Face](.\u002Fmd\u002F01.Introduction\u002F02\u002F01.HF.md)\n    -  [GitHub Models](.\u002Fmd\u002F01.Introduction\u002F02\u002F02.GitHubModel.md)\n    -  [Microsoft Foundry Model Catalog](.\u002Fmd\u002F01.Introduction\u002F02\u002F03.AzureAIFoundry.md)\n    -  [Ollama](.\u002Fmd\u002F01.Introduction\u002F02\u002F04.Ollama.md)\n    -  [AI Toolkit VSCode (AITK)](.\u002Fmd\u002F01.Introduction\u002F02\u002F05.AITK.md)\n    -  [NVIDIA NIM](.\u002Fmd\u002F01.Introduction\u002F02\u002F06.NVIDIA.md)\n    -  [Foundry Local](.\u002Fmd\u002F01.Introduction\u002F02\u002F07.FoundryLocal.md)\n\n- Inference Phi Family\n    - [Inference Phi in iOS](.\u002Fmd\u002F01.Introduction\u002F03\u002FiOS_Inference.md)\n    - [Inference Phi in Android](.\u002Fmd\u002F01.Introduction\u002F03\u002FAndroid_Inference.md)\n    - [Inference Phi in Jetson](.\u002Fmd\u002F01.Introduction\u002F03\u002FJetson_Inference.md)\n    - [Inference Phi in AI PC](.\u002Fmd\u002F01.Introduction\u002F03\u002FAIPC_Inference.md)\n    - [Inference Phi with Apple MLX Framework](.\u002Fmd\u002F01.Introduction\u002F03\u002FMLX_Inference.md)\n    - [Inference Phi in Local Server](.\u002Fmd\u002F01.Introduction\u002F03\u002FLocal_Server_Inference.md)\n    - [Inference Phi in Remote Server using AI Toolkit](.\u002Fmd\u002F01.Introduction\u002F03\u002FRemote_Interence.md)\n    - [Inference Phi with Rust](.\u002Fmd\u002F01.Introduction\u002F03\u002FRust_Inference.md)\n    - [Inference Phi-Vision Locally](.\u002Fmd\u002F01.Introduction\u002F03\u002FVision_Inference.md)\n    - [Inference Phi with Kaito AKS, Azure Containers (official support)](.\u002Fmd\u002F01.Introduction\u002F03\u002FKaito_Inference.md)\n-  [Quantifying Phi 
Family](.\u002Fmd\u002F01.Introduction\u002F04\u002FQuantifyingPhi.md)\n    - [Quantizing Phi-3.5 \u002F 4 using llama.cpp](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingLlamacppQuantifyingPhi.md)\n    - [Quantizing Phi-3.5 \u002F 4 using Generative AI extensions for onnxruntime](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingORTGenAIQuantifyingPhi.md)\n    - [Quantizing Phi-3.5 \u002F 4 using Intel OpenVINO](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingIntelOpenVINOQuantifyingPhi.md)\n    - [Quantizing Phi-3.5 \u002F 4 using Apple MLX Framework](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingAppleMLXQuantifyingPhi.md)\n\n-  Evaluating Phi\n    - [Responsible AI](.\u002Fmd\u002F01.Introduction\u002F05\u002FResponsibleAI.md)\n    - [Microsoft Foundry for Evaluation](.\u002Fmd\u002F01.Introduction\u002F05\u002FAIFoundry.md)\n    - [Using Promptflow for Evaluation](.\u002Fmd\u002F01.Introduction\u002F05\u002FPromptflow.md)\n\n- RAG with Azure AI Search\n    - [How to use Phi-4-mini and Phi-4-multimodal (RAG) with Azure AI Search](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fblob\u002Fmain\u002Fcode\u002F06.E2E\u002FE2E_Phi-4-RAG-Azure-AI-Search.ipynb)\n\n- Phi application development samples\n  - Text & Chat Applications\n    - Phi-4 Samples\n      - [📓] [Chat With Phi-4-mini ONNX Model](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi4\u002FChatWithPhi4ONNX\u002FREADME.md)\n      - [Chat with Phi-4 local ONNX Model .NET](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-Chat-01OnnxRuntime\u002F)\n      - [Chat .NET Console App with Phi-4 ONNX using Semantic Kernel](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-Chat-02SK\u002F)\n    - Phi-3 \u002F 3.5 Samples\n      - [Local Chatbot in the browser using Phi3, ONNX Runtime Web and WebGPU](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fonnxruntime-inference-examples\u002Ftree\u002Fmain\u002Fjs\u002Fchat)\n      - [OpenVino 
Chat](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_OpenVino_Chat.md)\n      - [Multi Model - Interactive Phi-3-mini and OpenAI Whisper](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-mini_with_whisper.md)\n      - [MLFlow - Building a wrapper and using Phi-3 with MLFlow](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-MLflow.md)\n      - [Model Optimization - How to optimize the Phi-3-mini model for ONNX Runtime Web with Olive](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FOlive\u002Ftree\u002Fmain\u002Fexamples\u002Fphi3)\n      - [WinUI3 App with Phi-3 mini-4k-instruct-onnx](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhi3-Chat-WinUI3-Sample\u002F)\n      - [WinUI3 Multi Model AI Powered Notes App Sample](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fai-powered-notes-winui3-sample)\n      - [Fine-tune and integrate custom Phi-3 models with Prompt flow](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-FineTuning_PromptFlow_Integration.md)\n      - [Fine-tune and integrate custom Phi-3 models with Prompt flow in Microsoft Foundry](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-FineTuning_PromptFlow_Integration_AIFoundry.md)\n      - [Evaluate the fine-tuned Phi-3 \u002F Phi-3.5 model in Microsoft Foundry, focusing on Microsoft's Responsible AI principles](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-Evaluation_AIFoundry.md)\n      - [📓] [Phi-3.5-mini-instruct language prediction sample (Chinese\u002FEnglish)](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002Fphi3-instruct-demo.ipynb)\n      - [Phi-3.5-Instruct WebGPU RAG Chatbot](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FWebGPUWithPhi35Readme.md)\n      - [Using Windows GPU to create a Prompt flow solution with Phi-3.5-Instruct 
ONNX](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FUsingPromptFlowWithONNX.md)\n      - [Using Microsoft Phi-3.5 tflite to create Android app](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FUsingPhi35TFLiteCreateAndroidApp.md)\n      - [Q&A .NET Example using local ONNX Phi-3 model using the Microsoft.ML.OnnxRuntime](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi301\u002F)\n      - [Console chat .NET app with Semantic Kernel and Phi-3](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi302\u002F)\n\n  - Azure AI Inference SDK Code Based Samples \n    - Phi-4 Samples \n      - [📓] [Generate project code using Phi-4-multimodal](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi4\u002FGenProjectCode\u002FREADME.md)\n    - Phi-3 \u002F 3.5 Samples\n      - [Build your own Visual Studio Code GitHub Copilot Chat with Microsoft Phi-3 Family](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi3\u002FVSCodeExt\u002FREADME.md)\n      - [Create your own Visual Studio Code Chat Copilot Agent with Phi-3.5 by GitHub Models](\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi3\u002FCreateVSCodeChatAgentWithGitHubModels.md)\n\n  - Advanced Reasoning Samples\n    - Phi-4 Samples \n      - [📓] [Phi-4-mini-reasoning or Phi-4-reasoning Samples](.\u002Fmd\u002F02.Application\u002F03.AdvancedReasoning\u002FPhi4\u002FAdvancedResoningPhi4mini\u002FREADME.md)\n      - [📓] [Fine-tuning Phi-4-mini-reasoning with Microsoft Olive](.\u002Fmd\u002F02.Application\u002F03.AdvancedReasoning\u002FPhi4\u002FAdvancedResoningPhi4mini\u002Folive_ft_phi_4_reasoning_with_medicaldata.ipynb)\n      - [📓] [Fine-tuning Phi-4-mini-reasoning with Apple MLX](.\u002Fmd\u002F02.Application\u002F03.AdvancedReasoning\u002FPhi4\u002FAdvancedResoningPhi4mini\u002Fmlx_ft_phi_4_reasoning_with_medicaldata.ipynb)\n      - [📓] [Phi-4-mini-reasoning with GitHub Models](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi4r\u002Fgithub_models_inference.ipynb)\n      - [📓] 
[Phi-4-mini-reasoning with Microsoft Foundry Models](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi4r\u002Fazure_models_inference.ipynb)\n  - Demos\n      - [Phi-4-mini demos hosted on Hugging Face Spaces](https:\u002F\u002Fhuggingface.co\u002Fspaces\u002Fmicrosoft\u002Fphi-4-mini?WT.mc_id=aiml-137032-kinfeylo)\n      - [Phi-4-multimodal demos hosted on Hugging Face Spaces](https:\u002F\u002Fhuggingface.co\u002Fspaces\u002Fmicrosoft\u002Fphi-4-multimodal?WT.mc_id=aiml-137032-kinfeylo)\n  - Vision Samples\n    - Phi-4 Samples\n      - [📓] [Use Phi-4-multimodal to read images and generate code](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi4\u002FCreateFrontend\u002FREADME.md)\n    - Phi-3 \u002F 3.5 Samples\n      - [📓] [Phi-3-vision Image text to text](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_Phi-3-vision-image-text-to-text-online-endpoint.ipynb)\n      - [Phi-3-vision-ONNX](https:\u002F\u002Fonnxruntime.ai\u002Fdocs\u002Fgenai\u002Ftutorials\u002Fphi3-v.html)\n      - [📓] [Phi-3-vision CLIP Embedding](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_Phi-3-vision-image-text-to-text-online-endpoint.ipynb)\n      - [DEMO: Phi-3 Recycling](https:\u002F\u002Fgithub.com\u002Fjennifermarsman\u002FPhiRecycling\u002F)\n      - [Phi-3-vision - Visual language assistant - with Phi3-Vision and OpenVINO](https:\u002F\u002Fdocs.openvino.ai\u002Fnightly\u002Fnotebooks\u002Fphi-3-vision-with-output.html)\n      - [Phi-3 Vision Nvidia NIM](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_Nvidia_NIM_Vision.md)\n      - [Phi-3 Vision OpenVino](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_OpenVino_Phi3Vision.md)\n      - [📓] [Phi-3.5 Vision multi-frame or multi-image sample](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002Fphi3-vision-demo.ipynb)\n      - [Phi-3 Vision Local ONNX Model using the Microsoft.ML.OnnxRuntime 
.NET](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi303\u002F)\n      - [Menu based Phi-3 Vision Local ONNX Model using the Microsoft.ML.OnnxRuntime .NET](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi304\u002F)\n\n  - Reasoning-Vision Samples\n    - Phi-4-Reasoning-Vision-15B\n      - [📓] [Using Phi-4-Reasoning-Vision-15B to detect jaywalking](.\u002Fmd\u002F02.Application\u002F10.ReasoningVision\u002FPhi_4_reasoning_vision_15b_Jaywalking.ipynb)\n      - [📓] [Using Phi-4-Reasoning-Vision-15B for math](.\u002Fmd\u002F02.Application\u002F10.ReasoningVision\u002FPhi_4_reasoning_vision_15b_Math.ipynb)\n      - [📓] [Using Phi-4-Reasoning-Vision-15B to detect UI](.\u002Fmd\u002F02.Application\u002F10.ReasoningVision\u002FPhi_4_reasoning_vision_15b_ui.ipynb)\n\n  - Math Samples\n    - Phi-4-Mini-Flash-Reasoning-Instruct Samples\n      - [Math Demo with Phi-4-Mini-Flash-Reasoning-Instruct](.\u002Fmd\u002F02.Application\u002F09.Math\u002FMathDemo.ipynb)\n\n  - Audio Samples\n    - Phi-4 Samples\n      - [📓] [Extracting audio transcripts using Phi-4-multimodal](.\u002Fmd\u002F02.Application\u002F05.Audio\u002FPhi4\u002FTransciption\u002FREADME.md)\n      - [📓] [Phi-4-multimodal Audio Sample](.\u002Fmd\u002F02.Application\u002F05.Audio\u002FPhi4\u002FSiri\u002Fdemo.ipynb)\n      - [📓] [Phi-4-multimodal Speech Translation Sample](.\u002Fmd\u002F02.Application\u002F05.Audio\u002FPhi4\u002FTranslate\u002Fdemo.ipynb)\n      - [.NET console application using Phi-4-multimodal Audio to analyze an audio file and generate a transcript](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-MultiModal-02Audio\u002F)\n\n  - MOE Samples\n    - Phi-3 \u002F 3.5 Samples\n      - [📓] [Phi-3.5 Mixture of Experts Models (MoEs) Social Media Sample](.\u002Fmd\u002F02.Application\u002F06.MoE\u002FPhi3\u002Fphi3_moe_demo.ipynb)\n      - [📓] [Building a Retrieval-Augmented Generation (RAG) Pipeline with NVIDIA NIM Phi-3 MOE, Azure AI Search, and 
LlamaIndex](.\u002Fmd\u002F02.Application\u002F06.MoE\u002FPhi3\u002Fazure-ai-search-nvidia-rag.ipynb)\n  - Function Calling Samples\n    - Phi-4 Samples 🆕\n      - [📓] [Using Function Calling with Phi-4-mini](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FFunctionCallingBasic\u002FREADME.md)\n      - [📓] [Using Function Calling to create multi-agents with Phi-4-mini](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FMultiagents\u002FPhi_4_mini_multiagent.ipynb)\n      - [📓] [Using Function Calling with Ollama](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FOllama\u002Follama_functioncalling.ipynb)\n      - [📓] [Using Function Calling with ONNX](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FONNX\u002Fonnx_parallel_functioncalling.ipynb)\n  - Multimodal Mixing Samples\n    - Phi-4 Samples 🆕\n      - [📓] [Using Phi-4-multimodal as a technology journalist](.\u002Fmd\u002F02.Application\u002F08.Multimodel\u002FPhi4\u002FTechJournalist\u002Fphi_4_mm_audio_text_publish_news.ipynb)\n      - [.NET console application using Phi-4-multimodal to analyze images](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-MultiModal-01Images\u002F)\n\n- Fine-tuning Phi Samples\n  - [Fine-tuning Scenarios](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Scenarios.md)\n  - [Fine-tuning vs RAG](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_vs_RAG.md)\n  - [Fine-tuning: Let Phi-3 become an industry expert](.\u002Fmd\u002F03.FineTuning\u002FLetPhi3gotoIndustriy.md)\n  - [Fine-tuning Phi-3 with AI Toolkit for VS Code](.\u002Fmd\u002F03.FineTuning\u002FFinetuning_VSCodeaitoolkit.md)\n  - [Fine-tuning Phi-3 with Azure Machine Learning Service](.\u002Fmd\u002F03.FineTuning\u002FIntroduce_AzureML.md)\n  - [Fine-tuning Phi-3 with LoRA](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Lora.md)\n  - [Fine-tuning Phi-3 with QLoRA](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Qlora.md)\n  - [Fine-tuning 
Phi-3 with Microsoft Foundry](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_AIFoundry.md)\n  - [Fine-tuning Phi-3 with Azure ML CLI\u002FSDK](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_MLSDK.md)\n  - [Fine-tuning with Microsoft Olive](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_MicrosoftOlive.md)\n  - [Fine-tuning with Microsoft Olive Hands-On Lab](.\u002Fmd\u002F03.FineTuning\u002Folive-lab\u002Freadme.md)\n  - [Fine-tuning Phi-3-vision with Weights and Biases](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Phi-3-visionWandB.md)\n  - [Fine-tuning Phi-3 with Apple MLX Framework](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_MLX.md)\n  - [Fine-tuning Phi-3-vision (official support)](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Vision.md)\n  - [Fine-Tuning Phi-3 with Kaito AKS, Azure Containers (official support)](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Kaito.md)\n  - [Fine-Tuning Phi-3 and 3.5 Vision](https:\u002F\u002Fgithub.com\u002F2U1\u002FPhi3-Vision-Finetune)\n\n- Hands-on Lab\n  - [Exploring cutting-edge models: LLMs, SLMs, local development and more](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Faitour-exploring-cutting-edge-models)\n  - [Unlocking NLP Potential: Fine-Tuning with Microsoft Olive](https:\u002F\u002Fgithub.com\u002Fazure\u002FIgnite_FineTuning_workshop)\n\n- Academic Research Papers and Publications\n  - [Textbooks Are All You Need II: phi-1.5 technical report](https:\u002F\u002Farxiv.org\u002Fabs\u002F2309.05463)\n  - [Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone](https:\u002F\u002Farxiv.org\u002Fabs\u002F2404.14219)\n  - [Phi-4 Technical Report](https:\u002F\u002Farxiv.org\u002Fabs\u002F2412.08905)\n  - [Phi-4-Mini Technical Report: Compact yet Powerful Multimodal Language Models via Mixture-of-LoRAs](https:\u002F\u002Farxiv.org\u002Fabs\u002F2503.01743)\n  - [Optimizing Small Language Models for In-Vehicle Function-Calling](https:\u002F\u002Farxiv.org\u002Fabs\u002F2501.02342)\n  - [(WhyPHI) Fine-Tuning 
PHI-3 for Multiple-Choice Question Answering: Methodology, Results, and Challenges](https:\u002F\u002Farxiv.org\u002Fabs\u002F2501.01588)\n  - [Phi-4-reasoning Technical Report](https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Fresearch\u002Fwp-content\u002Fuploads\u002F2025\u002F04\u002Fphi_4_reasoning.pdf)\n  - [Phi-4-mini-reasoning Technical Report](https:\u002F\u002Fhuggingface.co\u002Fmicrosoft\u002FPhi-4-mini-reasoning\u002Fblob\u002Fmain\u002FPhi-4-Mini-Reasoning.pdf)\n\n## Using Phi Models\n\n### Phi on Microsoft Foundry\n\nYou can learn how to use Microsoft Phi and how to build E2E solutions on different hardware devices. To experience Phi for yourself, start by playing with the models and customizing Phi for your scenarios using the [Microsoft Foundry Azure AI Model Catalog](https:\u002F\u002Faka.ms\u002Fphi3-azure-ai). You can learn more at Getting Started with [Microsoft Foundry](\u002Fmd\u002F02.QuickStart\u002FAzureAIFoundry_QuickStart.md)\n\n**Playground**\nEach model has a dedicated playground to test the model: [Azure AI Playground](https:\u002F\u002Faka.ms\u002Ftry-phi3).\n\n### Phi on GitHub Models\n\nYou can learn how to use Microsoft Phi and how to build E2E solutions on different hardware devices. 
To experience Phi for yourself, start by playing with the model and customizing Phi for your scenarios using the [GitHub Model Catalog](https:\u002F\u002Fgithub.com\u002Fmarketplace\u002Fmodels?WT.mc_id=aiml-137032-kinfeylo) you can learn more at Getting Started with [GitHub Model Catalog](\u002Fmd\u002F02.QuickStart\u002FGitHubModel_QuickStart.md)\n\n**Playground**\nEach model has a dedicated [playground to test the model](\u002Fmd\u002F02.QuickStart\u002FGitHubModel_QuickStart.md).\n\n### Phi on Hugging Face\n\nYou can also find the model on the [Hugging Face](https:\u002F\u002Fhuggingface.co\u002Fmicrosoft)\n\n**Playground**\n [Hugging Chat playground](https:\u002F\u002Fhuggingface.co\u002Fchat\u002Fmodels\u002Fmicrosoft\u002FPhi-3-mini-4k-instruct)\n\n ## 🎒 Other Courses\n\nOur team produces other courses! Check out:\n\n\u003C!-- CO-OP TRANSLATOR OTHER COURSES START -->\n### LangChain\n[![LangChain4j for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLangChain4j%20for%20Beginners-22C55E?style=for-the-badge&&labelColor=E5E7EB&color=0553D6)](https:\u002F\u002Faka.ms\u002Flangchain4j-for-beginners)\n[![LangChain.js for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLangChain.js%20for%20Beginners-22C55E?style=for-the-badge&labelColor=E5E7EB&color=0553D6)](https:\u002F\u002Faka.ms\u002Flangchainjs-for-beginners?WT.mc_id=m365-94501-dwahlin)\n[![LangChain for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLangChain%20for%20Beginners-22C55E?style=for-the-badge&labelColor=E5E7EB&color=0553D6)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Flangchain-for-beginners?WT.mc_id=m365-94501-dwahlin)\n---\n\n### Azure \u002F Edge \u002F MCP \u002F Agents\n[![AZD for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAZD%20for%20Beginners-0078D4?style=for-the-badge&labelColor=E5E7EB&color=0078D4)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FAZD-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![Edge AI for 
Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FEdge%20AI%20for%20Beginners-00B8E4?style=for-the-badge&labelColor=E5E7EB&color=00B8E4)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fedgeai-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![MCP for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FMCP%20for%20Beginners-009688?style=for-the-badge&labelColor=E5E7EB&color=009688)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fmcp-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![AI Agents for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAI%20Agents%20for%20Beginners-00C49A?style=for-the-badge&labelColor=E5E7EB&color=00C49A)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fai-agents-for-beginners?WT.mc_id=academic-105485-koreyst)\n\n---\n \n### Generative AI Series\n[![Generative AI for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20for%20Beginners-8B5CF6?style=for-the-badge&labelColor=E5E7EB&color=8B5CF6)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![Generative AI (.NET)](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20(.NET)-9333EA?style=for-the-badge&labelColor=E5E7EB&color=9333EA)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FGenerative-AI-for-beginners-dotnet?WT.mc_id=academic-105485-koreyst)\n[![Generative AI (Java)](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20(Java)-C084FC?style=for-the-badge&labelColor=E5E7EB&color=C084FC)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-for-beginners-java?WT.mc_id=academic-105485-koreyst)\n[![Generative AI (JavaScript)](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20(JavaScript)-E879F9?style=for-the-badge&labelColor=E5E7EB&color=E879F9)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-with-javascript?WT.mc_id=academic-105485-koreyst)\n\n---\n \n### Core Learning\n[![ML for 
Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FML%20for%20Beginners-22C55E?style=for-the-badge&labelColor=E5E7EB&color=22C55E)](https:\u002F\u002Faka.ms\u002Fml-beginners?WT.mc_id=academic-105485-koreyst)\n[![Data Science for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FData%20Science%20for%20Beginners-84CC16?style=for-the-badge&labelColor=E5E7EB&color=84CC16)](https:\u002F\u002Faka.ms\u002Fdatascience-beginners?WT.mc_id=academic-105485-koreyst)\n[![AI for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAI%20for%20Beginners-A3E635?style=for-the-badge&labelColor=E5E7EB&color=A3E635)](https:\u002F\u002Faka.ms\u002Fai-beginners?WT.mc_id=academic-105485-koreyst)\n[![Cybersecurity for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCybersecurity%20for%20Beginners-F97316?style=for-the-badge&labelColor=E5E7EB&color=F97316)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FSecurity-101?WT.mc_id=academic-96948-sayoung)\n[![Web Dev for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWeb%20Dev%20for%20Beginners-EC4899?style=for-the-badge&labelColor=E5E7EB&color=EC4899)](https:\u002F\u002Faka.ms\u002Fwebdev-beginners?WT.mc_id=academic-105485-koreyst)\n[![IoT for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FIoT%20for%20Beginners-14B8A6?style=for-the-badge&labelColor=E5E7EB&color=14B8A6)](https:\u002F\u002Faka.ms\u002Fiot-beginners?WT.mc_id=academic-105485-koreyst)\n[![XR Development for Beginners](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FXR%20Development%20for%20Beginners-38BDF8?style=for-the-badge&labelColor=E5E7EB&color=38BDF8)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fxr-development-for-beginners?WT.mc_id=academic-105485-koreyst)\n\n---\n \n### Copilot Series\n[![Copilot for AI Paired 
Programming](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCopilot%20for%20AI%20Paired%20Programming-FACC15?style=for-the-badge&labelColor=E5E7EB&color=FACC15)](https:\u002F\u002Faka.ms\u002FGitHubCopilotAI?WT.mc_id=academic-105485-koreyst)\n[![Copilot for C#\u002F.NET](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCopilot%20for%20C%23\u002F.NET-FBBF24?style=for-the-badge&labelColor=E5E7EB&color=FBBF24)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fmastering-github-copilot-for-dotnet-csharp-developers?WT.mc_id=academic-105485-koreyst)\n[![Copilot Adventure](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCopilot%20Adventure-FDE68A?style=for-the-badge&labelColor=E5E7EB&color=FDE68A)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FCopilotAdventures?WT.mc_id=academic-105485-koreyst)\n\u003C!-- CO-OP TRANSLATOR OTHER COURSES END -->\n\n## Responsible AI \n\nMicrosoft is committed to helping our customers use our AI products responsibly, sharing our learnings, and building trust-based partnerships through tools like Transparency Notes and Impact Assessments. Many of these resources can be found at [https:\u002F\u002Faka.ms\u002FRAI](https:\u002F\u002Faka.ms\u002FRAI).\nMicrosoft’s approach to responsible AI is grounded in our AI principles of fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability.\n\nLarge-scale natural language, image, and speech models - like the ones used in this sample - can potentially behave in ways that are unfair, unreliable, or offensive, in turn causing harms. Please consult the [Azure OpenAI service Transparency note](https:\u002F\u002Flearn.microsoft.com\u002Flegal\u002Fcognitive-services\u002Fopenai\u002Ftransparency-note?tabs=text) to be informed about risks and limitations.\n\nThe recommended approach to mitigating these risks is to include a safety system in your architecture that can detect and prevent harmful behavior. 
[Azure AI Content Safety](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-services\u002Fcontent-safety\u002Foverview) provides an independent layer of protection, able to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety includes text and image APIs that allow you to detect harmful material. Within Microsoft Foundry, the Content Safety service allows you to view, explore, and try out sample code for detecting harmful content across different modalities. The following [quickstart documentation](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-services\u002Fcontent-safety\u002Fquickstart-text?tabs=visual-studio%2Clinux&pivots=programming-language-rest) guides you through making requests to the service.\n\nAnother aspect to take into account is overall application performance. With multi-modal and multi-model applications, we consider performance to mean that the system performs as you and your users expect, including not generating harmful outputs. It's important to assess the performance of your overall application using [Performance and Quality and Risk and Safety evaluators](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fconcepts\u002Fevaluation-metrics-built-in). You can also create and evaluate with [custom evaluators](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fhow-to\u002Fdevelop\u002Fevaluate-sdk#custom-evaluators).\n\nYou can evaluate your AI application in your development environment using the [Azure AI Evaluation SDK](https:\u002F\u002Fmicrosoft.github.io\u002Fpromptflow\u002Findex.html). Given either a test dataset or a target, your generative AI application's generations are quantitatively measured with built-in evaluators or custom evaluators of your choice. 
To get started with the Azure AI Evaluation SDK to evaluate your system, you can follow the [quickstart guide](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fhow-to\u002Fdevelop\u002Fflow-evaluate-sdk). Once you execute an evaluation run, you can [visualize the results in Microsoft Foundry](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fhow-to\u002Fevaluate-flow-results).\n\n## Trademarks\n\nThis project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft's Trademark & Brand Guidelines](https:\u002F\u002Fwww.microsoft.com\u002Flegal\u002Fintellectualproperty\u002Ftrademarks\u002Fusage\u002Fgeneral).\nUse of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.\n\n## Getting Help\n\nIf you get stuck or have any questions about building AI apps, join:\n\n[![Microsoft Foundry Discord](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Microsoft_Foundry_Community_Discord-blue?style=for-the-badge&logo=discord&color=5865f2&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fdiscord)\n\nIf you have product feedback or encounter errors while building, visit:\n\n[![Microsoft Foundry Developer Forum](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGitHub-Microsoft_Foundry_Developer_Forum-blue?style=for-the-badge&logo=github&color=000000&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fforum)\n","# Phi 烹饪书：使用微软 Phi 模型的实战示例\n\n[![在 GitHub Codespaces 中打开并使用示例](https:\u002F\u002Fgithub.com\u002Fcodespaces\u002Fbadge.svg)](https:\u002F\u002Fcodespaces.new\u002Fmicrosoft\u002Fphicookbook)\n[![在 Dev Containers 
中打开](https:\u002F\u002Fimg.shields.io\u002Fstatic\u002Fv1?style=for-the-badge&label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https:\u002F\u002Fvscode.dev\u002Fredirect?url=vscode:\u002F\u002Fms-vscode-remote.remote-containers\u002FcloneInVolume?url=https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fphicookbook)\n\n[![GitHub 贡献者](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002Fmicrosoft\u002Fphicookbook.svg)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fgraphs\u002Fcontributors\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub 问题](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002Fmicrosoft\u002Fphicookbook.svg)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fissues\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub 拉取请求](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues-pr\u002Fmicrosoft\u002Fphicookbook.svg)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fpulls\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![欢迎提交 PR](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPRs-welcome-brightgreen.svg?style=flat-square)](http:\u002F\u002Fmakeapullrequest.com?WT.mc_id=aiml-137032-kinfeylo)\n\n[![GitHub 监视者](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fwatchers\u002Fmicrosoft\u002Fphicookbook.svg?style=social&label=Watch)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fwatchers\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub 分叉](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fmicrosoft\u002Fphicookbook.svg?style=social&label=Fork)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fnetwork\u002F?WT.mc_id=aiml-137032-kinfeylo)\n[![GitHub 星标](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002Fmicrosoft\u002Fphicookbook?style=social&label=Star)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fstargazers\u002F?WT.mc_id=aiml-137032-kinfeylo)\n\n[![Microsoft Foundry 
Discord](https:\u002F\u002Fdcbadge.limes.pink\u002Fapi\u002Fserver\u002FByRwuEEgH4)](https:\u002F\u002Fdiscord.com\u002Finvite\u002FByRwuEEgH4)\n\nPhi 是由微软开发的一系列开源 AI 模型。\n\nPhi 目前是最强大且最具成本效益的小型语言模型 (SLM)，在多语言、推理、文本\u002F聊天生成、编码、图像、音频等场景中均表现出色。\n\n您可以将 Phi 部署到云端或边缘设备上，并且只需有限的计算资源即可轻松构建生成式 AI 应用程序。\n\n请按照以下步骤开始使用这些资源：\n1. **分叉仓库**：点击 [![GitHub 分叉](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fmicrosoft\u002Fphicookbook.svg?style=social&label=Fork)](https:\u002F\u002FGitHub.com\u002Fmicrosoft\u002Fphicookbook\u002Fnetwork\u002F?WT.mc_id=aiml-137032-kinfeylo)\n2. **克隆仓库**：`git clone https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git`\n3. [**加入 Microsoft AI Discord 社区，与专家和开发者交流**](https:\u002F\u002Fdiscord.com\u002Finvite\u002FByRwuEEgH4?WT.mc_id=aiml-137032-kinfeylo)\n\n![封面](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fmicrosoft_PhiCookBook_readme_b7278d1b8b50.png)\n\n### 🌐 多语言支持\n\n#### 通过 GitHub Action 支持（自动化且始终保持最新）\n\n\u003C!-- CO-OP 翻译语言表格开始 -->\n阿拉伯语 (Arabic) | 孟加拉语 (Bengali) | 保加利亚语 (Bulgarian) | 缅甸语 (Burmese) | 简体中文 (Chinese, Simplified) | 繁体中文（香港）(Chinese, Traditional, Hong Kong) | 繁体中文（澳门）(Chinese, Traditional, Macau) | 繁体中文（台湾）(Chinese, Traditional, Taiwan) | 克罗地亚语 (Croatian) | 捷克语 (Czech) | 丹麦语 (Danish) | 荷兰语 (Dutch) | 爱沙尼亚语 (Estonian) | 芬兰语 (Finnish) | 法语 (French) | 德语 (German) | 希腊语 (Greek) | 希伯来语 (Hebrew) | 印地语 (Hindi) | 匈牙利语 (Hungarian) | 印度尼西亚语 (Indonesian) | 意大利语 (Italian) | 日语 (Japanese) | 卡纳达语 (Kannada) | 韩语 (Korean) | 立陶宛语 (Lithuanian) | 马来语 (Malay) | 马拉雅拉姆语 (Malayalam) | 马拉地语 (Marathi) | 尼泊尔语 (Nepali) | 尼日利亚皮钦语 (Nigerian Pidgin) | 挪威语 (Norwegian) | 波斯语 (Persian, Farsi) | 波兰语 (Polish) | 巴西葡萄牙语 (Portuguese, Brazil) | 葡萄牙语 (Portugal) | 旁遮普语 (Gurmukhi) | 罗马尼亚语 (Romanian) | 俄语 (Russian) | 塞尔维亚语（西里尔字母）(Serbian, Cyrillic) | 斯洛伐克语 (Slovak) | 斯洛文尼亚语 (Slovenian) | 西班牙语 (Spanish) | 斯瓦希里语 (Swahili) | 瑞典语 (Swedish) | 他加禄语 (Tagalog, Filipino) | 泰米尔语 (Tamil) | 泰卢固语 (Telugu) | 泰语 (Thai) | 土耳其语 (Turkish) | 乌克兰语 (Ukrainian) | 
乌尔都语 (Urdu) | 越南语 (Vietnamese)\n\n> **更倾向于本地克隆吗？**\n>\n> 此仓库包含 50 多种语言的翻译，这会显著增加下载大小。若想不包含翻译进行克隆，请使用稀疏检出功能：\n>\n> **Bash \u002F macOS \u002F Linux：**\n> ```bash\n> git clone --filter=blob:none --sparse https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git\n> cd PhiCookBook\n> git sparse-checkout set --no-cone '\u002F*' '!translations' '!translated_images'\n> ```\n>\n> **CMD（Windows）：**\n> ```cmd\n> git clone --filter=blob:none --sparse https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git\n> cd PhiCookBook\n> git sparse-checkout set --no-cone \"\u002F*\" \"!translations\" \"!translated_images\"\n> ```\n>\n> 这样可以快速下载所需内容，完成课程学习。\n\u003C!-- CO-OP 翻译语言表格结束 -->\n\n## 目录\n\n- 引言\n  - [欢迎来到 Phi 家族](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.PhiFamily.md)\n  - [设置您的环境](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.EnvironmentSetup.md)\n  - [理解关键技术](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Understandingtech.md)\n  - [Phi 模型的 AI 安全性](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.AISafety.md)\n  - [Phi 的硬件支持](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Hardwaresupport.md)\n  - [Phi 模型及跨平台可用性](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Edgeandcloud.md)\n  - [使用 Guidance-ai 和 Phi](.\u002Fmd\u002F01.Introduction\u002F01\u002F01.Guidance.md)\n  - [GitHub Marketplace 模型](https:\u002F\u002Fgithub.com\u002Fmarketplace\u002Fmodels)\n  - [Azure AI 模型目录](https:\u002F\u002Fai.azure.com)\n\n- 不同环境下的 Phi 推理\n    -  [Hugging Face](.\u002Fmd\u002F01.Introduction\u002F02\u002F01.HF.md)\n    -  [GitHub Models](.\u002Fmd\u002F01.Introduction\u002F02\u002F02.GitHubModel.md)\n    -  [Microsoft Foundry 模型目录](.\u002Fmd\u002F01.Introduction\u002F02\u002F03.AzureAIFoundry.md)\n    -  [Ollama](.\u002Fmd\u002F01.Introduction\u002F02\u002F04.Ollama.md)\n    -  [AI 工具包 VSCode (AITK)](.\u002Fmd\u002F01.Introduction\u002F02\u002F05.AITK.md)\n    -  [NVIDIA NIM](.\u002Fmd\u002F01.Introduction\u002F02\u002F06.NVIDIA.md)\n    -  [Foundry 
本地](.\u002Fmd\u002F01.Introduction\u002F02\u002F07.FoundryLocal.md)\n\n- Phi 系列的推理\n    -  [iOS 上的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FiOS_Inference.md)\n    -  [Android 上的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FAndroid_Inference.md)\n    -  [Jetson 上的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FJetson_Inference.md)\n    -  [AI PC 上的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FAIPC_Inference.md)\n    -  [使用 Apple MLX 框架的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FMLX_Inference.md)\n    -  [本地服务器上的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FLocal_Server_Inference.md)\n    -  [使用 AI 工具包在远程服务器上进行 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FRemote_Interence.md)\n    -  [使用 Rust 进行 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FRust_Inference.md)\n    -  [本地视觉任务中的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FVision_Inference.md)\n    -  [使用 Kaito AKS 和 Azure 容器（官方支持）的 Phi 推理](.\u002Fmd\u002F01.Introduction\u002F03\u002FKaito_Inference.md)\n\n-  [Phi 系列的量化](.\u002Fmd\u002F01.Introduction\u002F04\u002FQuantifyingPhi.md)\n    -  [使用 llama.cpp 对 Phi-3.5 \u002F 4 进行量化](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingLlamacppQuantifyingPhi.md)\n    -  [使用 ONNX Runtime 的生成式 AI 扩展对 Phi-3.5 \u002F 4 进行量化](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingORTGenAIQuantifyingPhi.md)\n    -  [使用 Intel OpenVINO 对 Phi-3.5 \u002F 4 进行量化](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingIntelOpenVINOQuantifyingPhi.md)\n    -  [使用 Apple MLX 框架对 Phi-3.5 \u002F 4 进行量化](.\u002Fmd\u002F01.Introduction\u002F04\u002FUsingAppleMLXQuantifyingPhi.md)\n\n-  Phi 的评估\n    -  [负责任的人工智能](.\u002Fmd\u002F01.Introduction\u002F05\u002FResponsibleAI.md)\n    -  [用于评估的 Microsoft Foundry](.\u002Fmd\u002F01.Introduction\u002F05\u002FAIFoundry.md)\n    -  [使用 Promptflow 进行评估](.\u002Fmd\u002F01.Introduction\u002F05\u002FPromptflow.md)\n\n- 使用 Azure AI Search 的 RAG\n    -  [如何将 Phi-4-mini 和 Phi-4 多模态（RAG）与 Azure AI Search 
结合使用](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fblob\u002Fmain\u002Fcode\u002F06.E2E\u002FE2E_Phi-4-RAG-Azure-AI-Search.ipynb)\n\n- Phi 应用开发示例\n  - 文本与聊天应用\n    -  Phi-4 示例\n      -  [📓] [使用 Phi-4-mini ONNX 模型进行聊天](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi4\u002FChatWithPhi4ONNX\u002FREADME.md)\n      -  [.NET 中使用本地 Phi-4 ONNX 模型进行聊天](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-Chat-01OnnxRuntime\u002F)\n      -  [使用语义核构建 .NET 控制台应用程序，与 Phi-4 ONNX 进行聊天](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-Chat-02SK\u002F)\n    -  Phi-3 \u002F 3.5 示例\n      -  [使用 Phi3、ONNX Runtime Web 和 WebGPU 在浏览器中实现本地聊天机器人](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fonnxruntime-inference-examples\u002Ftree\u002Fmain\u002Fjs\u002Fchat)\n      -  [OpenVino 聊天](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_OpenVino_Chat.md)\n      -  [多模型交互：Phi-3-mini 与 OpenAI Whisper](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-mini_with_whisper.md)\n      -  [MLFlow — 构建封装并使用 Phi-3 与 MLFlow](.\u002Fmd\u002F\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-MLflow.md)\n      -  [模型优化 — 如何使用 Olive 为 ONNX Runtime Web 优化 Phi-3-min 模型](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FOlive\u002Ftree\u002Fmain\u002Fexamples\u002Fphi3)\n      -  [WinUI3 应用程序，搭载 Phi-3 mini-4k-instruct-onnx](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhi3-Chat-WinUI3-Sample\u002F)\n      -  [WinUI3 多模型 AI 驱动笔记应用示例](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fai-powered-notes-winui3-sample)\n      -  [使用 Prompt flow 微调并集成自定义 Phi-3 模型](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-FineTuning_PromptFlow_Integration.md)\n      -  [在 Microsoft Foundry 中使用 Prompt flow 微调并集成自定义 Phi-3 模型](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-FineTuning_PromptFlow_Integration_AIFoundry.md)\n      -  [在 Microsoft Foundry 中评估微调后的 Phi-3 \u002F 
Phi-3.5 模型，重点关注微软的负责任 AI 原则](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FE2E_Phi-3-Evaluation_AIFoundry.md)\n      -  [📓] [Phi-3.5-mini-instruct 语言预测示例（中文\u002F英文）](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002Fphi3-instruct-demo.ipynb)\n      -  [Phi-3.5-Instruct WebGPU RAG 聊天机器人](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FWebGPUWithPhi35Readme.md)\n      -  [利用 Windows GPU 创建基于 Prompt flow 的解决方案，结合 Phi-3.5-Instruct ONNX](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FUsingPromptFlowWithONNX.md)\n      -  [使用 Microsoft Phi-3.5 tflite 开发 Android 应用程序](.\u002Fmd\u002F02.Application\u002F01.TextAndChat\u002FPhi3\u002FUsingPhi35TFLiteCreateAndroidApp.md)\n      -  [使用本地 ONNX Phi-3 模型的 Q&A .NET 示例，借助 Microsoft.ML.OnnxRuntime](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi301\u002F)\n      -  [使用语义核和 Phi-3 构建 .NET 控制台聊天应用程序](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi302\u002F)\n\n  - 基于 Azure AI 推理 SDK 的代码示例\n    -  Phi-4 示例\n      -  [📓] [使用 Phi-4 多模态生成项目代码](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi4\u002FGenProjectCode\u002FREADME.md)\n    -  Phi-3 \u002F 3.5 示例\n      -  [使用 Microsoft Phi-3 系列打造属于自己的 Visual Studio Code GitHub Copilot 聊天功能](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi3\u002FVSCodeExt\u002FREADME.md)\n      -  [通过 GitHub Models 创建基于 Phi-3.5 的 Visual Studio Code Chat Copilot Agent](\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi3\u002FCreateVSCodeChatAgentWithGitHubModels.md)\n\n- 高级推理样本\n    - Phi-4 样本\n      - [📓] [Phi-4-mini-reasoning 或 Phi-4-reasoning 样本](.\u002Fmd\u002F02.Application\u002F03.AdvancedReasoning\u002FPhi4\u002FAdvancedResoningPhi4mini\u002FREADME.md)\n      - [📓] [使用 Microsoft Olive 微调 Phi-4-mini-reasoning](.\u002Fmd\u002F02.Application\u002F03.AdvancedReasoning\u002FPhi4\u002FAdvancedResoningPhi4mini\u002Folive_ft_phi_4_reasoning_with_medicaldata.ipynb)\n      - [] [使用 Apple MLX 微调 
Phi-4-mini-reasoning](.\u002Fmd\u002F02.Application\u002F03.AdvancedReasoning\u002FPhi4\u002FAdvancedResoningPhi4mini\u002Fmlx_ft_phi_4_reasoning_with_medicaldata.ipynb)\n      - [] [使用 GitHub Models 的 Phi-4-mini-reasoning](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi4r\u002Fgithub_models_inference.ipynb)\n      - [] [使用 Microsoft Foundry Models 的 Phi-4-mini-reasoning](.\u002Fmd\u002F02.Application\u002F02.Code\u002FPhi4r\u002Fazure_models_inference.ipynb)\n  - 演示\n      - [托管在 Hugging Face Spaces 上的 Phi-4-mini 演示](https:\u002F\u002Fhuggingface.co\u002Fspaces\u002Fmicrosoft\u002Fphi-4-mini?WT.mc_id=aiml-137032-kinfeylo)\n      - [托管在 Hugging Face Spaces 上的 Phi-4 多模态演示](https:\u002F\u002Fhuggingface.co\u002Fspaces\u002Fmicrosoft\u002Fphi-4-multimodal?WT.mc_id=aiml-137032-kinfeylo)\n  - 视觉样本\n    - Phi-4 样本\n      - [] [使用 Phi-4 多模态模型读取图像并生成代码](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi4\u002FCreateFrontend\u002FREADME.md)\n    - Phi-3 \u002F 3.5 样本\n      - [][Phi-3-vision 图像文本到文本](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_Phi-3-vision-image-text-to-text-online-endpoint.ipynb)\n      - [Phi-3-vision-ONNX](https:\u002F\u002Fonnxruntime.ai\u002Fdocs\u002Fgenai\u002Ftutorials\u002Fphi3-v.html)\n      - [][Phi-3-vision CLIP 嵌入](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_Phi-3-vision-image-text-to-text-online-endpoint.ipynb)\n      - [演示：Phi-3 回收](https:\u002F\u002Fgithub.com\u002Fjennifermarsman\u002FPhiRecycling\u002F)\n      - [Phi-3-vision - 视觉语言助手 - 使用 Phi3-Vision 和 OpenVINO](https:\u002F\u002Fdocs.openvino.ai\u002Fnightly\u002Fnotebooks\u002Fphi-3-vision-with-output.html)\n      - [Phi-3 Vision Nvidia NIM](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_Nvidia_NIM_Vision.md)\n      - [Phi-3 Vision OpenVino](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002FE2E_OpenVino_Phi3Vision.md)\n      - [][Phi-3.5 Vision 
多帧或多图像样本](.\u002Fmd\u002F02.Application\u002F04.Vision\u002FPhi3\u002Fphi3-vision-demo.ipynb)\n      - [使用 Microsoft.ML.OnnxRuntime .NET 的本地 ONNX 模型](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi303\u002F)\n      - [基于菜单的本地 ONNX 模型，使用 Microsoft.ML.OnnxRuntime .NET](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi304\u002F)\n\n  - 推理-视觉样本\n    - Phi-4-Reasoning-Vision-15B\n      - [] [使用 Phi-4-Reasoning-Vision-15B 检测乱穿马路行为](.\u002Fmd\u002F02.Application\u002F10.ReasoningVision\u002FPhi_4_reasoning_vision_15b_Jaywalking.ipynb)\n      - [] [使用 Phi-4-Reasoning-Vision-15B 进行数学计算](.\u002Fmd\u002F02.Application\u002F10.ReasoningVision\u002FPhi_4_reasoning_vision_15b_Math.ipynb)\n      - [] [使用 Phi-4-Reasoning-Vision-15B 检测用户界面问题](.\u002Fmd\u002F02.Application\u002F10.ReasoningVision\u002FPhi_4_reasoning_vision_15b_ui.ipynb)\n\n  - 数学样本\n    - Phi-4-Mini-Flash-Reasoning-Instruct 样本 [使用 Phi-4-Mini-Flash-Reasoning-Instruct 的数学演示](.\u002Fmd\u002F02.Application\u002F09.Math\u002FMathDemo.ipynb)\n\n  - 音频样本\n    - Phi-4 样本\n      - [] [使用 Phi-4 多模态模型提取音频转录本](.\u002Fmd\u002F02.Application\u002F05.Audio\u002FPhi4\u002FTransciption\u002FREADME.md)\n      - [] [Phi-4 多模态音频样本](.\u002Fmd\u002F02.Application\u002F05.Audio\u002FPhi4\u002FSiri\u002Fdemo.ipynb)\n      - [] [Phi-4 多模态语音翻译样本](.\u002Fmd\u002F02.Application\u002F05.Audio\u002FPhi4\u002FTranslate\u002Fdemo.ipynb)\n      - [.NET 控制台应用程序，使用 Phi-4 多模态音频分析音频文件并生成转录本](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-MultiModal-02Audio\u002F)\n\n  - MOE 样本\n    - Phi-3 \u002F 3.5 样本\n      - [] [Phi-3.5 混合专家模型 (MoEs) 社交媒体样本](.\u002Fmd\u002F02.Application\u002F06.MoE\u002FPhi3\u002Fphi3_moe_demo.ipynb)\n      - [] [使用 NVIDIA NIM Phi-3 MOE、Azure AI Search 和 LlamaIndex 构建检索增强生成 (RAG) 流程](.\u002Fmd\u002F02.Application\u002F06.MoE\u002FPhi3\u002Fazure-ai-search-nvidia-rag.ipynb)\n      - \n  - 函数调用样本\n    - Phi-4 样本 🆕\n      - [] [使用 Phi-4-mini 
进行函数调用](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FFunctionCallingBasic\u002FREADME.md)\n      - [] [使用 Phi-4-mini 创建多智能体系统](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FMultiagents\u002FPhi_4_mini_multiagent.ipynb)\n      - [] [使用 Ollama 进行函数调用](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FOllama\u002Follama_functioncalling.ipynb)\n      - [] [使用 ONNX 进行函数调用](.\u002Fmd\u002F02.Application\u002F07.FunctionCalling\u002FPhi4\u002FONNX\u002Fonnx_parallel_functioncalling.ipynb)\n  - 多模态混合样本\n    - Phi-4 样本 🆕\n      - [] [将 Phi-4 多模态模型用作科技记者](.\u002Fmd\u002F02.Application\u002F08.Multimodel\u002FPhi4\u002FTechJournalist\u002Fphi_4_mm_audio_text_publish_news.ipynb)\n      - [.NET 控制台应用程序，使用 Phi-4 多模态模型分析图像](.\u002Fmd\u002F04.HOL\u002Fdotnet\u002Fsrc\u002FLabsPhi4-MultiModal-01Images\u002F)\n\n- 微调 Phi 样本\n  - [微调场景](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Scenarios.md)\n  - [微调与 RAG](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_vs_RAG.md)\n  - [让 Phi-3 成为行业专家的微调](.\u002Fmd\u002F03.FineTuning\u002FLetPhi3gotoIndustriy.md)\n  - [使用 VS Code 的 AI 工具包微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFinetuning_VSCodeaitoolkit.md)\n  - [使用 Azure Machine Learning Service 微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FIntroduce_AzureML.md)\n  - [使用 Lora 微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Lora.md)\n  - [使用 QLora 微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Qlora.md)\n  - [使用 Microsoft Foundry 微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_AIFoundry.md)\n  - [使用 Azure ML CLI\u002FSDK 微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_MLSDK.md)\n  - [使用 Microsoft Olive 微调](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_MicrosoftOlive.md)\n  - [使用 Microsoft Olive 的实践实验室](.\u002Fmd\u002F03.FineTuning\u002Folive-lab\u002Freadme.md)\n  - [使用 Weights and Bias 微调 Phi-3-vision](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Phi-3-visionWandB.md)\n  - [使用 Apple MLX 框架微调 
Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_MLX.md)\n  - [官方支持下的 Phi-3-vision 微调](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Vision.md)\n  - [使用 Kaito AKS 和 Azure Containers（官方支持）微调 Phi-3](.\u002Fmd\u002F03.FineTuning\u002FFineTuning_Kaito.md)\n  - [Phi-3 和 3.5 Vision 的微调](https:\u002F\u002Fgithub.com\u002F2U1\u002FPhi3-Vision-Finetune)\n\n- 实践实验室\n  - [探索前沿模型：LLM、SLM、本地开发等](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Faitour-exploring-cutting-edge-models)\n  - [释放 NLP 潜力：使用 Microsoft Olive 进行微调](https:\u002F\u002Fgithub.com\u002Fazure\u002FIgnite_FineTuning_workshop)\n\n- 学术研究论文和出版物\n  - [教科书就是全部所需 II：phi-1.5 技术报告](https:\u002F\u002Farxiv.org\u002Fabs\u002F2309.05463)\n  - [Phi-3 技术报告：一款可在您手机上本地运行的强大语言模型](https:\u002F\u002Farxiv.org\u002Fabs\u002F2404.14219)\n  - [Phi-4 技术报告](https:\u002F\u002Farxiv.org\u002Fabs\u002F2412.08905)\n  - [Phi-4-Mini 技术报告：通过 LoRA 混合实现紧凑而强大的多模态语言模型](https:\u002F\u002Farxiv.org\u002Fabs\u002F2503.01743)\n  - [优化小型语言模型以用于车载函数调用](https:\u002F\u002Farxiv.org\u002Fabs\u002F2501.02342)\n  - [(WhyPHI) 针对多项选择题回答微调 PHI-3：方法、结果与挑战](https:\u002F\u002Farxiv.org\u002Fabs\u002F2501.01588)\n  - [Phi-4-reasoning 技术报告](https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Fresearch\u002Fwp-content\u002Fuploads\u002F2025\u002F04\u002Fphi_4_reasoning.pdf)\n  - [Phi-4-mini-reasoning 技术报告](https:\u002F\u002Fhuggingface.co\u002Fmicrosoft\u002FPhi-4-mini-reasoning\u002Fblob\u002Fmain\u002FPhi-4-Mini-Reasoning.pdf)\n\n\n\n## 使用 Phi 模型\n\n### Phi 在 Microsoft Foundry 上\n\n您可以学习如何使用 Microsoft Phi，并在不同的硬件设备上构建端到端解决方案。要亲自体验 Phi，可以从试用这些模型开始，并使用 [Microsoft Foundry Azure AI 模型目录](https:\u002F\u002Faka.ms\u002Fphi3-azure-ai)，根据您的场景自定义 Phi。更多信息请参阅《Microsoft Foundry 入门》(\u002Fmd\u002F02.QuickStart\u002FAzureAIFoundry_QuickStart.md)。\n\n**游乐场**\n每个模型都有一个专门的游乐场来测试该模型 [Azure AI Playground](https:\u002F\u002Faka.ms\u002Ftry-phi3)。\n\n### Phi 在 GitHub Models 上\n\n您可以学习如何使用 Microsoft Phi，并在不同的硬件设备上构建端到端解决方案。要亲自体验 Phi，可以从试用该模型开始，并使用 [GitHub 
模型目录](https:\u002F\u002Fgithub.com\u002Fmarketplace\u002Fmodels?WT.mc_id=aiml-137032-kinfeylo)，根据您的场景自定义 Phi。更多信息请参阅《GitHub 模型目录入门》(\u002Fmd\u002F02.QuickStart\u002FGitHubModel_QuickStart.md)。\n\n**游乐场**\n每个模型都有一个专门的 [游乐场来测试模型](\u002Fmd\u002F02.QuickStart\u002FGitHubModel_QuickStart.md)。\n\n### Phi 在 Hugging Face 上\n\n您也可以在 [Hugging Face](https:\u002F\u002Fhuggingface.co\u002Fmicrosoft) 上找到该模型。\n\n**游乐场**\n[Hugging Chat 游乐场](https:\u002F\u002Fhuggingface.co\u002Fchat\u002Fmodels\u002Fmicrosoft\u002FPhi-3-mini-4k-instruct)\n\n ## 🎒 其他课程\n\n我们的团队还制作了其他课程！请查看：\n\n\u003C!-- CO-OP TRANSLATOR OTHER COURSES START -->\n### LangChain\n[![LangChain4j 初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLangChain4j%20for%20Beginners-22C55E?style=for-the-badge&&labelColor=E5E7EB&color=0553D6)](https:\u002F\u002Faka.ms\u002Flangchain4j-for-beginners)\n[![LangChain.js 初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLangChain.js%20for%20Beginners-22C55E?style=for-the-badge&labelColor=E5E7EB&color=0553D6)](https:\u002F\u002Faka.ms\u002Flangchainjs-for-beginners?WT.mc_id=m365-94501-dwahlin)\n[![LangChain 初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLangChain%20for%20Beginners-22C55E?style=for-the-badge&labelColor=E5E7EB&color=0553D6)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Flangchain-for-beginners?WT.mc_id=m365-94501-dwahlin)\n---\n\n### Azure \u002F Edge \u002F MCP \u002F Agents\n[![AZD 初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAZD%20for%20Beginners-0078D4?style=for-the-badge&labelColor=E5E7EB&color=0078D4)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FAZD-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![Edge AI 初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FEdge%20AI%20for%20Beginners-00B8E4?style=for-the-badge&labelColor=E5E7EB&color=00B8E4)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fedgeai-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![MCP 
初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FMCP%20for%20Beginners-009688?style=for-the-badge&labelColor=E5E7EB&color=009688)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fmcp-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![AI 代理初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAI%20Agents%20for%20Beginners-00C49A?style=for-the-badge&labelColor=E5E7EB&color=00C49A)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fai-agents-for-beginners?WT.mc_id=academic-105485-koreyst)\n\n---\n \n### 生成式 AI 系列\n[![生成式 AI 初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20for%20Beginners-8B5CF6?style=for-the-badge&labelColor=E5E7EB&color=8B5CF6)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-for-beginners?WT.mc_id=academic-105485-koreyst)\n[![生成式 AI (.NET)](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20(.NET)-9333EA?style=for-the-badge&labelColor=E5E7EB&color=9333EA)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FGenerative-AI-for-beginners-dotnet?WT.mc_id=academic-105485-koreyst)\n[![生成式 AI (Java)](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20(Java)-C084FC?style=for-the-badge&labelColor=E5E7EB&color=C084FC)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-for-beginners-java?WT.mc_id=academic-105485-koreyst)\n[![生成式 AI (JavaScript)](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGenerative%20AI%20(JavaScript)-E879F9?style=for-the-badge&labelColor=E5E7EB&color=E879F9)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-with-javascript?WT.mc_id=academic-105485-koreyst)\n\n---\n \n### 
核心学习\n[![机器学习初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FML%20for%20Beginners-22C55E?style=for-the-badge&labelColor=E5E7EB&color=22C55E)](https:\u002F\u002Faka.ms\u002Fml-beginners?WT.mc_id=academic-105485-koreyst)\n[![数据科学初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FData%20Science%20for%20Beginners-84CC16?style=for-the-badge&labelColor=E5E7EB&color=84CC16)](https:\u002F\u002Faka.ms\u002Fdatascience-beginners?WT.mc_id=academic-105485-koreyst)\n[![人工智能初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAI%20for%20Beginners-A3E635?style=for-the-badge&labelColor=E5E7EB&color=A3E635)](https:\u002F\u002Faka.ms\u002Fai-beginners?WT.mc_id=academic-105485-koreyst)\n[![网络安全初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCybersecurity%20for%20Beginners-F97316?style=for-the-badge&labelColor=E5E7EB&color=F97316)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FSecurity-101?WT.mc_id=academic-96948-sayoung)\n[![Web 开发初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWeb%20Dev%20for%20Beginners-EC4899?style=for-the-badge&labelColor=E5E7EB&color=EC4899)](https:\u002F\u002Faka.ms\u002Fwebdev-beginners?WT.mc_id=academic-105485-koreyst)\n[![物联网初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FIoT%20for%20Beginners-14B8A6?style=for-the-badge&labelColor=E5E7EB&color=14B8A6)](https:\u002F\u002Faka.ms\u002Fiot-beginners?WT.mc_id=academic-105485-koreyst)\n[![XR 开发初学者指南](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FXR%20Development%20for%20Beginners-38BDF8?style=for-the-badge&labelColor=E5E7EB&color=38BDF8)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fxr-development-for-beginners?WT.mc_id=academic-105485-koreyst)\n\n---\n\n### Copilot 系列\n[![Copilot 用于 AI 结对编程](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCopilot%20for%20AI%20Paired%20Programming-FACC15?style=for-the-badge&labelColor=E5E7EB&color=FACC15)](https:\u002F\u002Faka.ms\u002FGitHubCopilotAI?WT.mc_id=academic-105485-koreyst)\n[![Copilot 用于 
C#\u002F.NET](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCopilot%20for%20C%23\u002F.NET-FBBF24?style=for-the-badge&labelColor=E5E7EB&color=FBBF24)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fmastering-github-copilot-for-dotnet-csharp-developers?WT.mc_id=academic-105485-koreyst)\n[![Copilot 冒险](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCopilot%20Adventure-FDE68A?style=for-the-badge&labelColor=E5E7EB&color=FDE68A)](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FCopilotAdventures?WT.mc_id=academic-105485-koreyst)\n\u003C!-- CO-OP TRANSLATOR OTHER COURSES END -->\n\n## 负责任的人工智能 \n\n微软致力于帮助客户负责任地使用我们的 AI 产品，分享我们的经验教训，并通过透明度说明和影响评估等工具建立基于信任的合作关系。这些资源中的许多都可以在 [https:\u002F\u002Faka.ms\u002FRAI](https:\u002F\u002Faka.ms\u002FRAI) 上找到。\n微软在负责任的人工智能方面的做法以我们的 AI 原则为基础，即公平性、可靠性和安全性、隐私与安全性、包容性、透明度以及问责制。\n\n像本示例中使用的大型自然语言、图像和语音模型一样，它们可能会以不公平、不可靠或冒犯性的方式运行，从而造成危害。请查阅 [Azure OpenAI 服务透明度说明](https:\u002F\u002Flearn.microsoft.com\u002Flegal\u002Fcognitive-services\u002Fopenai\u002Ftransparency-note?tabs=text)，以了解相关风险和限制。\n\n缓解这些风险的推荐方法是在您的架构中加入一个安全系统，该系统能够检测并阻止有害行为。[Azure AI 内容安全](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-services\u002Fcontent-safety\u002Foverview) 提供了一个独立的保护层，能够在应用程序和服务中检测用户生成和 AI 生成的有害内容。Azure AI 内容安全包括文本和图像 API，使您能够检测有害内容。在 Microsoft Foundry 中，内容安全服务允许您查看、探索并试用用于检测不同模态下有害内容的示例代码。以下 [快速入门文档](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-services\u002Fcontent-safety\u002Fquickstart-text?tabs=visual-studio%2Clinux&pivots=programming-language-rest) 将指导您如何向该服务发出请求。\n\n另一个需要考虑的方面是整体应用程序性能。对于多模态和多模型的应用程序，我们认为性能是指系统能够按照您和用户的期望运行，包括不产生有害输出。使用 [性能、质量、风险和安全评估器](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fconcepts\u002Fevaluation-metrics-built-in) 来评估整个应用程序的性能非常重要。您还可以使用 [自定义评估器](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fhow-to\u002Fdevelop\u002Fevaluate-sdk#custom-evaluators) 创建并进行评估。\n\n您可以在开发环境中使用 [Azure AI 评估 
SDK](https:\u002F\u002Fmicrosoft.github.io\u002Fpromptflow\u002Findex.html) 对您的 AI 应用程序进行评估。无论您使用测试数据集还是目标，您的生成式 AI 应用程序生成的内容都将通过内置评估器或您选择的自定义评估器进行定量测量。要开始使用 Azure AI 评估 SDK 来评估您的系统，您可以按照 [快速入门指南](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fhow-to\u002Fdevelop\u002Fflow-evaluate-sdk) 操作。执行评估后，您可以在 [Microsoft Foundry 中可视化结果](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-studio\u002Fhow-to\u002Fevaluate-flow-results)。\n\n## 商标\n\n该项目可能包含项目、产品或服务的商标或徽标。微软商标或徽标的授权使用须遵守并遵循 [微软商标与品牌指南](https:\u002F\u002Fwww.microsoft.com\u002Flegal\u002Fintellectualproperty\u002Ftrademarks\u002Fusage\u002Fgeneral)。\n在该项目的修改版本中使用微软商标或徽标时，不得引起混淆或暗示微软的赞助。任何第三方商标或徽标的使用均应遵守该第三方的相关政策。\n\n## 获取帮助\n\n如果您在构建 AI 应用程序时遇到困难或有任何疑问，请加入：\n\n[![Microsoft Foundry Discord](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Microsoft_Foundry_Community_Discord-blue?style=for-the-badge&logo=discord&color=5865f2&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fdiscord)\n\n如果您在构建过程中遇到产品反馈或错误，请访问：\n\n[![Microsoft Foundry 开发者论坛](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGitHub-Microsoft_Foundry_Developer_Forum-blue?style=for-the-badge&logo=github&color=000000&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fforum)","# PhiCookBook 快速上手指南\n\nPhiCookBook 是微软官方提供的开源项目，旨在通过丰富的实战示例，帮助开发者在云端、边缘设备及本地环境中部署和使用 Microsoft Phi 系列小型语言模型（SLM）。Phi 模型以高性价比著称，在多语言处理、逻辑推理、代码生成及多模态任务中表现优异。\n\n## 1. 
环境准备\n\n在开始之前，请确保您的开发环境满足以下基本要求：\n\n*   **操作系统**：Windows 10\u002F11, macOS (Intel\u002FApple Silicon), 或主流 Linux 发行版。\n*   **基础工具**：\n    *   Git (用于克隆仓库)\n    *   Python 3.8+ (大部分示例基于 Python)\n    *   VS Code (推荐安装 Dev Containers 扩展以获得一致环境)\n*   **硬件要求**：\n    *   **本地运行**：建议至少 8GB 内存（运行量化模型），16GB+ 内存体验更佳。若需运行未量化大模型或进行微调，建议使用具备 CUDA 支持的 NVIDIA GPU 或 Apple M 系列芯片。\n    *   **边缘设备**：支持 iOS, Android, NVIDIA Jetson, 或 AI PC (NPU)。\n*   **账号准备**：\n    *   GitHub 账号（用于 Fork 和访问 Codespaces）\n    *   （可选）Azure 账号（用于访问 Azure AI Foundry 或云端资源）\n\n> **提示**：本项目支持多种推理后端，如 Ollama, ONNX Runtime, Hugging Face Transformers, MLX 等，具体依赖将在各示例章节中单独说明。\n\n## 2. 安装步骤\n\n### 方法一：使用 GitHub Codespaces（推荐，零配置）\n这是最快速的启动方式，无需在本地安装任何依赖，直接在浏览器中运行。\n\n1.  访问 [PhiCookBook GitHub 仓库](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fphicookbook)。\n2.  点击页面顶部的 **\"Code\"** 按钮，选择 **\"Codespaces\"** 标签页。\n3.  点击 **\"Create codespace on main\"**。\n4.  等待环境构建完成后，即可在浏览器内的 VS Code 界面直接运行示例。\n\n### 方法二：本地克隆仓库\n\n如果您希望在本地开发，请使用以下命令。由于仓库包含 50 多种语言的翻译文件，体积较大，推荐使用 **稀疏检出 (Sparse Checkout)** 仅下载核心代码和中文文档，以加快下载速度。\n\n**Linux \u002F macOS \u002F Git Bash (Windows):**\n\n```bash\n# 1. 浅层克隆并过滤大文件\ngit clone --filter=blob:none --sparse https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git\n\n# 2. 进入目录\ncd PhiCookBook\n\n# 3. 设置稀疏检出：只获取根目录文件，排除 translations 和 translated_images 文件夹\ngit sparse-checkout set --no-cone '\u002F*' '!translations' '!translated_images'\n```\n\n**Windows CMD:**\n\n```cmd\ngit clone --filter=blob:none --sparse https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook.git\ncd PhiCookBook\ngit sparse-checkout set --no-cone \"\u002F*\" \"!translations\" \"!translated_images\"\n```\n\n### 安装 Python 依赖\n\n进入具体的示例目录后（例如 `md\u002F01.Introduction\u002F02\u002F01.HF.md` 对应的代码目录），通常需要先安装依赖。通用安装命令如下：\n\n```bash\npip install -r requirements.txt\n```\n\n*注：部分特定示例（如 ONNX 或 MLX）可能需要额外的特定库，请参考对应示例文件夹下的 README。*\n\n## 3. 
基本使用\n\nPhiCookBook 提供了多种推理方式的示例。以下是最简单的两种入门方式：使用 **Ollama** 进行本地快速对话，或使用 **Hugging Face** 进行代码调用。\n\n### 场景 A：使用 Ollama 快速体验（最简单）\n\n如果您已安装 [Ollama](https:\u002F\u002Follama.com\u002F)，这是运行 Phi 模型最便捷的方式。\n\n1.  **拉取模型**：\n    ```bash\n    ollama pull phi3.5\n    ```\n    *(注：也可尝试 `phi4` 或其他版本，视 Ollama 库更新情况而定)*\n\n2.  **运行对话**：\n    ```bash\n    ollama run phi3.5 \"你好，请介绍一下你自己。\"\n    ```\n\n3.  **在代码中调用** (Python 示例)：\n    ```python\n    import ollama\n\n    response = ollama.chat(model='phi3.5', messages=[\n      {\n        'role': 'user',\n        'content': '为什么天空是蓝色的？',\n      },\n    ])\n    print(response['message']['content'])\n    ```\n\n### 场景 B：使用 Hugging Face Transformers 进行推理\n\n适合需要深度定制或使用最新模型权重的开发者。\n\n1.  **安装依赖**：\n    ```bash\n    pip install transformers torch accelerate\n    ```\n\n2.  **Python 推理示例**：\n    创建一个 `test_phi.py` 文件：\n\n    ```python\n    from transformers import AutoModelForCausalLM, AutoTokenizer\n\n    # 加载模型和分词器 (以 Phi-3.5-mini 为例)\n    model_id = \"microsoft\u002FPhi-3.5-mini-instruct\"\n    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)\n    model = AutoModelForCausalLM.from_pretrained(\n        model_id, \n        device_map=\"auto\", \n        trust_remote_code=True,\n        torch_dtype=\"auto\" \n    )\n\n    # 准备输入\n    messages = [\n        {\"role\": \"user\", \"content\": \"用一句话解释量子纠缠。\"}\n    ]\n\n    # 生成回复\n    input_ids = tokenizer.apply_chat_template(messages, return_tensors=\"pt\").to(model.device)\n    output_ids = model.generate(input_ids, max_new_tokens=100)\n    \n    # 解码输出\n    response = tokenizer.decode(output_ids[0], skip_special_tokens=True)\n    print(response)\n    ```\n\n### 下一步探索\n\n完成上述基础步骤后，您可以查阅仓库中的 `md` 目录，探索更多高级主题：\n*   **多平台部署**：参考 `Inference Phi in iOS\u002FAndroid\u002FJetson` 章节。\n*   **模型量化**：学习如何使用 `llama.cpp` 或 `ONNX` 压缩模型以适应低显存设备。\n*   **应用开发**：查看 `Phi application development samples` 中的 RAG 检索增强生成、.NET 集成及 WebGPU 
浏览器端运行示例。","一家初创教育科技公司希望在低成本边缘设备（如树莓派）上部署一个支持多语言辅导和基础代码讲解的 AI 助教，但团队算力预算极其有限。\n\n### 没有 PhiCookBook 时\n- **模型选型困难**：开发者在海量开源模型中难以快速找到既能在低算力设备运行，又具备优秀推理和编码能力的小型语言模型（SLM）。\n- **环境配置繁琐**：手动搭建推理环境耗时耗力，缺乏针对边缘设备优化的现成部署脚本，导致项目启动阶段就遭遇技术瓶颈。\n- **多语言支持缺失**：自行训练或微调模型以支持全球多种语言成本高昂，且难以保证小模型在非英语场景下的表现。\n- **开发门槛高**：缺乏具体的代码示例和最佳实践指南，团队成员需要从零摸索如何将模型集成到实际应用中，试错成本极高。\n\n### 使用 PhiCookBook 后\n- **精准模型匹配**：直接获取微软官方推荐的 Phi 系列模型指南，确认其在同尺寸下推理与编码能力的领先地位，完美契合边缘计算需求。\n- **一键快速启动**：利用提供的 GitHub Codespaces 和 Dev Containers 配置，几分钟内即可在本地或云端复现完整的开发与运行环境。\n- **原生多语言能力**：借助仓库中覆盖全球数十种语言的自动化翻译文档和示例，轻松实现多语言辅导功能，无需额外训练。\n- **实战代码参考**：直接复用书中关于文本生成、逻辑推理及代码解释的动手示例，大幅缩短从概念验证到产品原型的开发周期。\n\nPhiCookBook 让开发者能够以最低的成本和最快的速度，将微软顶尖的小型语言模型能力转化为落地的边缘 AI 应用。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fmicrosoft_PhiCookBook_b7278d1b.png","microsoft","Microsoft","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fmicrosoft_4900709c.png","Open source projects and samples from Microsoft",null,"opensource@microsoft.com","OpenAtMicrosoft","https:\u002F\u002Fopensource.microsoft.com","https:\u002F\u002Fgithub.com\u002Fmicrosoft",[87,91,95,99,102,105,108,111,114,117],{"name":88,"color":89,"percentage":90},"Jupyter Notebook","#DA5B0B",99.9,{"name":92,"color":93,"percentage":94},"Python","#3572A5",0.1,{"name":96,"color":97,"percentage":98},"C#","#178600",0,{"name":100,"color":101,"percentage":98},"JavaScript","#f1e05a",{"name":103,"color":104,"percentage":98},"Kotlin","#A97BFF",{"name":106,"color":107,"percentage":98},"TypeScript","#3178c6",{"name":109,"color":110,"percentage":98},"HTML","#e34c26",{"name":112,"color":113,"percentage":98},"Shell","#89e051",{"name":115,"color":116,"percentage":98},"CSS","#663399",{"name":118,"color":119,"percentage":98},"Dockerfile","#384d54",3725,483,"2026-04-02T13:20:14","MIT","Linux, macOS, Windows","非必需（支持 CPU 推理）。若使用 GPU 加速，支持 NVIDIA (CUDA)、Apple Silicon (MLX)、Intel (OpenVINO) 及 
WebGPU。具体显存需求取决于模型版本（Phi-3\u002F3.5\u002F4）及量化程度，小模型可在低显存或集成显卡上运行。","未说明（取决于具体模型大小和是否量化，边缘设备示例表明可在有限内存下运行）",{"notes":128,"python":129,"dependencies":130},"该项目是微软 Phi 系列模型的实战指南，支持极广泛的部署环境，包括云端、边缘设备（iOS, Android, Jetson）、AI PC 及浏览器（WebGPU）。提供多种量化工具（llama.cpp, ONNX Runtime, OpenVINO, MLX）以适应不同硬件。可通过 GitHub Codespaces 或 Dev Containers 一键启动开发环境，无需本地配置复杂依赖。","未说明",[131,132,133,134,135,136,137,138],"onnxruntime","transformers","llama.cpp","openvino","mlx","torch","semantic-kernel","promptflow",[15,37],[141,142,143,144,145,146,147,148,149,150,151,152,153],"phi3","phi3-testing","phi3-vision","phi4","cookbook","language-model","phi-4","slm","small-language-model","phi-4-mini","phi-4-multimodal","phi4-mini","phi4-multimodal","2026-03-27T02:49:30.150509","2026-04-06T05:16:25.864522",[157,162,167,172,177,182],{"id":158,"question_zh":159,"answer_zh":160,"source_url":161},12531,"Phi-3 模型为什么会生成随机乱码或无限长的重复文本？","建议尝试使用 Phi-3.5-ONNX 版本（https:\u002F\u002Fhuggingface.co\u002Fmicrosoft\u002FPhi-3.5-mini-instruct-onnx）并更新到最新的 ORT-genai 库。有用户反馈切换到该版本后问题得到解决。如果问题仍然存在，请检查是否使用了最新的 ONNX 发布版。","https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fissues\u002F45",{"id":163,"question_zh":164,"answer_zh":165,"source_url":166},12532,"Phi-3-vision 模型的 LoRA 微调脚本在最新模型版本上失效怎么办？","这是由于 Huggingface 上的模型本身发生了变更。解决方法是在加载模型时指定旧的 revision 哈希值，例如：revision='f998a184b56bf0399b3af85c50b20ec0d5688f5f'。此外，维护者已修复了相关代码以支持最新版本的微调和视觉模型解冻，建议拉取仓库最新代码或参考 Issue #103 的修复方案。","https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fissues\u002F101",{"id":168,"question_zh":169,"answer_zh":170,"source_url":171},12533,"在 Copilot+ 笔记本上运行 C# Lab 示例时报错找不到 'onnxruntime-genai' DLL 如何解决？","该错误通常是因为缺少特定的运行时依赖。请尝试安装 Microsoft.ML.OnnxRuntime.QNN 包（https:\u002F\u002Fwww.nuget.org\u002Fpackages\u002FMicrosoft.ML.OnnxRuntime.QNN），这是针对特定硬件（如 NPU）优化的版本。注意 Phi Silica 和 OCR API 可能尚未包含在当前的 Windows App SDK 稳定版中，需关注未来的 1.6 版本或使用上述 QNN 
包替代。","https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fissues\u002F84",{"id":173,"question_zh":174,"answer_zh":175,"source_url":176},12534,"如何在 Phi-3 中实现函数调用（Function Calling\u002FTools）？","需要在系统消息（System Message）中明确定义工具及其 JSON Schema。以下是一个通用的提示词模板：\n\n\u003C|system|>\n\"你是一个可以协助用户完成各种任务的 AI 助手。你可以使用以下函数：\n[\n    {\n        \"name\": \"function_name\",\n        \"description\": \"function_description\",\n        \"parameters\": [\n            {\n                \"name\": \"parameter_name\",\n                \"type\": \"parameter_type\",\n                \"description\": \"parameter_description\"\n            }\n        ],\n        \"required\": [ \"required_parameter_name\" ]\n    }\n]\n\n当用户提问时，如果需要调用函数，请按以下格式输出：\n[\n    { \"name\": \"function_name\", \"params\": {参数字典}, \"output\": \"输出变量名\" }\n]\n\"\n\n此外，社区也开发了专门针对函数调用优化的模型（如 Trelis\u002FPhi-3-mini-128k-instruct-function-calling），可以直接使用以获得更好的效果。","https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fissues\u002F13",{"id":178,"question_zh":179,"answer_zh":180,"source_url":181},12535,"Phi-4 多模态模型的示例代码运行失败如何处理？","虽然提供的数据中该 Issue 内容被截断，但根据同类问题的经验，通常是因为模型文件路径配置错误或依赖库版本不匹配。请确保已拉取最新的多模态模型权重，并检查推理代码中的图像预处理步骤是否与模型要求的输入格式一致。如果是使用 ONNX 或 DirectML，请确认运行时库已更新至支持 Phi-4 架构的版本。","https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002FPhiCookBook\u002Fissues\u002F273",{"id":183,"question_zh":184,"answer_zh":185,"source_url":166},12536,"微调视觉模型时遇到梯度为 None 的警告怎么办？","出现 'None of the inputs have requires_grad=True' 警告通常是因为模型参数未被正确标记为需要梯度。在最新的修复中，维护者已经调整了脚本以允许对视觉模型部分进行微调。请确保使用的是仓库中最新的训练脚本，并且在使用 LoRA 时正确配置了目标模块。如果仍然报错，尝试显式冻结或解冻视觉编码器部分，参考 Issue #101 中的讨论，最新代码已兼容这两种模式。",[]]