[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-Azure-Samples--serverless-chat-langchainjs":3,"tool-Azure-Samples--serverless-chat-langchainjs":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",144730,2,"2026-04-07T23:26:32",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107888,"2026-04-06T11:32:50",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 
助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":10,"last_commit_at":59,"category_tags":60,"status":17},4487,"LLMs-from-scratch","rasbt\u002FLLMs-from-scratch","LLMs-from-scratch 是一个基于 PyTorch 的开源教育项目，旨在引导用户从零开始一步步构建一个类似 ChatGPT 的大型语言模型（LLM）。它不仅是同名技术著作的官方代码库，更提供了一套完整的实践方案，涵盖模型开发、预训练及微调的全过程。\n\n该项目主要解决了大模型领域“黑盒化”的学习痛点。许多开发者虽能调用现成模型，却难以深入理解其内部架构与训练机制。通过亲手编写每一行核心代码，用户能够透彻掌握 Transformer 架构、注意力机制等关键原理，从而真正理解大模型是如何“思考”的。此外，项目还包含了加载大型预训练权重进行微调的代码，帮助用户将理论知识延伸至实际应用。\n\nLLMs-from-scratch 特别适合希望深入底层原理的 AI 开发者、研究人员以及计算机专业的学生。对于不满足于仅使用 API，而是渴望探究模型构建细节的技术人员而言，这是极佳的学习资源。其独特的技术亮点在于“循序渐进”的教学设计：将复杂的系统工程拆解为清晰的步骤，配合详细的图表与示例，让构建一个虽小但功能完备的大模型变得触手可及。无论你是想夯实理论基础，还是为未来研发更大规模的模型做准备",90106,"2026-04-06T11:19:32",[35,15,13,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":67,"readme_en":68,"readme_zh":69,"quickstart_zh":70,"use_case_zh":71,"hero_image_url":72,"owner_login":73,"owner_name":74,"owner_avatar_url":75,"owner_bio":76,"owner_company":77,"owner_location":77,"owner_email":77,"owner_twitter":77,"owner_website":78,"owner_url":79,"languages":80,"stars":96,"forks":97,"last_commit_at":98,"license":99,"difficulty_score":100,"env_os":101,"env_gpu":102,"env_ram":103,"env_deps":104,"category_tags":112,"github_topics":114,"view_count":32,"oss_zip_url":77,"oss_zip_packed_at":77,"status":17,"created_at":128,"updated_at":129,"faqs":130,"releases":160},5288,"Azure-Samples\u002Fserverless-chat-langchainjs","serverless-chat-langchainjs","Build your own serverless AI Chat with Retrieval-Augmented-Generation using LangChain.js, TypeScript and Azure","serverless-chat-langchainjs 是一个帮助开发者快速构建无服务器架构 AI 聊天机器人的开源项目。它基于 LangChain.js、TypeScript 和 Azure 云服务，核心功能是结合检索增强生成（RAG）技术，让机器人能够依据企业私有文档（如服务条款、隐私政策或支持指南）来回答用户问题，从而有效解决大模型普遍存在的“幻觉”及缺乏特定领域知识的问题。\n\n该项目特别适合希望探索 AI 应用落地的前端工程师、全栈开发者以及技术研究人员。对于想要在不管理复杂基础设施的前提下，验证 RAG 
架构可行性的团队来说，它是一个理想的起点。其独特的技术亮点在于完全采用无服务器设计：前端托管于 Azure Static Web Apps，后端逻辑运行在 Azure Functions 上，并使用 Azure Cosmos DB 作为向量数据库存储知识。此外，项目对本地开发非常友好，支持通过 Ollama 搭配 Llama 3.1 模型进行零成本测试，无需立即依赖云端资源。整体架构清晰，代码示例丰富，能帮助使用者轻松理解从文档处理到智能问答的完整流程，是学习现代 JavaS","serverless-chat-langchainjs 是一个帮助开发者快速构建无服务器架构 AI 聊天机器人的开源项目。它基于 LangChain.js、TypeScript 和 Azure 云服务，核心功能是结合检索增强生成（RAG）技术，让机器人能够依据企业私有文档（如服务条款、隐私政策或支持指南）来回答用户问题，从而有效解决大模型普遍存在的“幻觉”及缺乏特定领域知识的问题。\n\n该项目特别适合希望探索 AI 应用落地的前端工程师、全栈开发者以及技术研究人员。对于想要在不管理复杂基础设施的前提下，验证 RAG 架构可行性的团队来说，它是一个理想的起点。其独特的技术亮点在于完全采用无服务器设计：前端托管于 Azure Static Web Apps，后端逻辑运行在 Azure Functions 上，并使用 Azure Cosmos DB 作为向量数据库存储知识。此外，项目对本地开发非常友好，支持通过 Ollama 搭配 Llama 3.1 模型进行零成本测试，无需立即依赖云端资源。整体架构清晰，代码示例丰富，能帮助使用者轻松理解从文档处理到智能问答的完整流程，是学习现代 JavaScript AI 开发模式的优质范本。","\u003C!-- prettier-ignore -->\n\u003Cdiv align=\"center\">\n\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_0ef1ff1f66a8.png\" alt=\"\" align=\"center\" height=\"64\" \u002F>\n\n# Serverless AI Chat with RAG using LangChain.js\n\n[![Open project in GitHub Codespaces](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCodespaces-Open-blue?style=flat-square&logo=github)](https:\u002F\u002Fcodespaces.new\u002FAzure-Samples\u002Fserverless-chat-langchainjs?hide_repo_select=true&ref=main&quickstart=true)\n[![Join Azure AI Foundry Discord](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Azure_AI_Community-blue?style=flat-square&logo=discord&color=5865f2&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fdiscord)\n[![Official Learn documentation](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDocumentation-00a3ee?style=flat-square)](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fdeveloper\u002Fjavascript\u002Fai\u002Fget-started-app-chat-template-langchainjs)\n[![Watch to learn about RAG and this sample on 
YouTube](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FYouTube-d95652.svg?style=flat-square&logo=youtube)](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=xkFOmx5yxIA&list=PLlrxD0HtieHi5ZpsHULPLxm839IrhmeDk&index=4)\n[![dev.to blog post walkthrough](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FBlog%20post-black?style=flat-square&logo=dev.to)](https:\u002F\u002Fdev.to\u002Fazure\u002Fbuild-a-serverless-chatgpt-with-rag-using-langchainjs-3487)\n\u003Cbr>\n[![Build Status](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002FAzure-Samples\u002Fserverless-chat-langchainjs\u002Fbuild-test.yaml?style=flat-square&label=Build)](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fserverless-chat-langchainjs\u002Factions)\n![Node version](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FNode.js->=20-3c873a?style=flat-square)\n[![Ollama + Llama3.1](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FOllama-Llama3.1-ff7000?style=flat-square)](https:\u002F\u002Follama.com\u002Flibrary\u002Fllama3.1)\n[![TypeScript](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FTypeScript-blue?style=flat-square&logo=typescript&logoColor=white)](https:\u002F\u002Fwww.typescriptlang.org)\n[![License](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLicense-MIT-yellow?style=flat-square)](LICENSE)\n\n:star: If you like this sample, star it on GitHub — it helps a lot!\n\n[Overview](#overview) • [Get started](#getting-started) • [Run the sample](#run-the-sample) • [Resources](#resources) • [FAQ](#faq) • [Troubleshooting](#troubleshooting)\n\n![Animation showing the chat app in action](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_b9aa23f63114.gif)\n\n\u003C\u002Fdiv>\n\nThis sample shows how to build a serverless AI chat experience with Retrieval-Augmented Generation using [LangChain.js](https:\u002F\u002Fjs.langchain.com\u002F) and Azure. 
The application is hosted on [Azure Static Web Apps](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fstatic-web-apps\u002Foverview) and [Azure Functions](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fazure-functions\u002Ffunctions-overview?pivots=programming-language-javascript), with [Azure Cosmos DB for NoSQL](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fcosmos-db\u002Fnosql\u002Fvector-) as the vector database. You can use it as a starting point for building more complex AI applications.\n\n> [!TIP]\n> You can test this application locally without any cost using [Ollama](https:\u002F\u002Follama.com\u002F). Follow the instructions in the [Local Development](#local-development) section to get started.\n\n## Overview\n\nBuilding AI applications can be complex and time-consuming, but using LangChain.js and Azure serverless technologies allows you to greatly simplify the process. This application is a chatbot that uses a set of enterprise documents to generate responses to user queries.\n\nWe provide sample data to make this sample ready to try, but feel free to replace it with your own. We use a fictitious company called _Contoso Real Estate_, and the experience allows its customers to ask support questions about the usage of its products. The sample data includes a set of documents that describes its terms of service, privacy policy and a support guide.\n\n\u003Cdiv align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_b1b8c2ca8266.png\" alt=\"Application architecture\" width=\"640px\" \u002F>\n\u003C\u002Fdiv>\n\nThis application is made from multiple components:\n\n- A web app made with a single chat web component built with [Lit](https:\u002F\u002Flit.dev) and hosted on [Azure Static Web Apps](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fstatic-web-apps\u002Foverview). 
The code is located in the `packages\u002Fwebapp` folder.\n\n- A serverless API built with [Azure Functions](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fazure-functions\u002Ffunctions-overview?pivots=programming-language-javascript) and using [LangChain.js](https:\u002F\u002Fjs.langchain.com\u002F) to ingest the documents and generate responses to the user chat queries. The code is located in the `packages\u002Fapi` folder.\n\n- A database to store chat sessions and the text extracted from the documents and the vectors generated by LangChain.js, using [Azure Cosmos DB for NoSQL](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fcosmos-db\u002Fnosql\u002F).\n\n- A file storage to store the source documents, using [Azure Blob Storage](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fstorage\u002Fblobs\u002Fstorage-blobs-introduction).\n\nWe use the [HTTP protocol for AI chat apps](https:\u002F\u002Faka.ms\u002Fchatprotocol) to communicate between the web app and the API.\n\n## Features\n\n- **Serverless Architecture**: Utilizes Azure Functions and Azure Static Web Apps for a fully serverless deployment.\n- **Retrieval-Augmented Generation (RAG)**: Combines the power of Azure Cosmos DB and LangChain.js to provide relevant and accurate responses.\n- **Chat Sessions History**: Maintains a personal chat history for each user, allowing them to revisit previous conversations.\n- **Scalable and Cost-Effective**: Leverages Azure's serverless offerings to provide a scalable and cost-effective solution.\n- **Local Development**: Supports local development using Ollama for testing without any cloud costs.\n\n## Getting started\n\nThere are multiple ways to get started with this project.\n\nThe quickest way is to use [GitHub Codespaces](#use-github-codespaces) that provides a preconfigured environment for you. 
Alternatively, you can [set up your local environment](#use-your-local-environment) following the instructions below.\n\n> [!IMPORTANT]\n> If you want to run this sample entirely locally using Ollama, you have to follow the instructions in the [local environment](#use-your-local-environment) section.\n\n### Use your local environment\n\nYou need to install the following tools to work on your local machine:\n\n- [Node.js LTS](https:\u002F\u002Fnodejs.org\u002Fdownload\u002F)\n- [Azure Developer CLI](https:\u002F\u002Faka.ms\u002Fazure-dev\u002Finstall)\n- [Git](https:\u002F\u002Fgit-scm.com\u002Fdownloads)\n- [PowerShell 7+](https:\u002F\u002Fgithub.com\u002Fpowershell\u002Fpowershell) _(for Windows users only)_\n  - **Important**: Ensure you can run `pwsh.exe` from a PowerShell command. If this fails, you likely need to upgrade PowerShell.\n  - Instead of PowerShell, you can also use Git Bash or WSL to run the Azure Developer CLI commands.\n- [Azure Functions Core Tools](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fazure-functions\u002Ffunctions-run-local?tabs=macos%2Cisolated-process%2Cnode-v4%2Cpython-v2%2Chttp-trigger%2Ccontainer-apps&pivots=programming-language-javascript) _(should be installed automatically with NPM, only install manually if the API fails to start)_\n\nThen you can get the project code:\n\n1. [**Fork**](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fserverless-chat-langchainjs\u002Ffork) the project to create your own copy of this repository.\n2. On your forked repository, select the **Code** button, then the **Local** tab, and copy the URL of your forked repository.\n\n\u003Cdiv align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_66cda97c4ed5.png\" alt=\"Screenshot showing how to copy the repository URL\" width=\"400px\" \u002F>\n\u003C\u002Fdiv>\n3. 
Open a terminal and run this command to clone the repo: \u003Ccode> git clone &lt;your-repo-url&gt; \u003C\u002Fcode>\n\n### Use GitHub Codespaces\n\nYou can run this project directly in your browser by using GitHub Codespaces, which will open a web-based VS Code:\n\n[![Open in GitHub Codespaces](https:\u002F\u002Fimg.shields.io\u002Fstatic\u002Fv1?style=for-the-badge&label=GitHub+Codespaces&message=Open&color=blue&logo=github)](https:\u002F\u002Fcodespaces.new\u002FAzure-Samples\u002Fserverless-chat-langchainjs?hide_repo_select=true&ref&quickstart=true)\n\n### Use a VSCode dev container\n\nA similar option to Codespaces is VS Code Dev Containers, that will open the project in your local VS Code instance using the [Dev Containers extension](https:\u002F\u002Fmarketplace.visualstudio.com\u002Fitems?itemName=ms-vscode-remote.remote-containers).\n\nYou will also need to have [Docker](https:\u002F\u002Fwww.docker.com\u002Fproducts\u002Fdocker-desktop) installed on your machine to run the container.\n\n[![Open in Dev Containers](https:\u002F\u002Fimg.shields.io\u002Fstatic\u002Fv1?style=for-the-badge&label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https:\u002F\u002Fvscode.dev\u002Fredirect?url=vscode:\u002F\u002Fms-vscode-remote.remote-containers\u002FcloneInVolume?url=https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fserverless-chat-langchainjs)\n\n## Run the sample\n\nThere are multiple ways to run this sample: locally using Ollama or Azure OpenAI models, or by deploying it to Azure.\n\n### Deploy the sample to Azure\n\n#### Azure prerequisites\n\n- **Azure account**. If you're new to Azure, [get an Azure account for free](https:\u002F\u002Fazure.microsoft.com\u002Ffree) to get free Azure credits to get started. If you're a student, you can also get free credits with [Azure for Students](https:\u002F\u002Faka.ms\u002Fazureforstudents).\n- **Azure subscription with access enabled for the Azure OpenAI service**. 
You can request access with [this form](https:\u002F\u002Faka.ms\u002Foaiapply).\n- **Azure account permissions**:\n  - Your Azure account must have `Microsoft.Authorization\u002FroleAssignments\u002Fwrite` permissions, such as [Role Based Access Control Administrator](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Frole-based-access-control\u002Fbuilt-in-roles#role-based-access-control-administrator-preview), [User Access Administrator](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Frole-based-access-control\u002Fbuilt-in-roles#user-access-administrator), or [Owner](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Frole-based-access-control\u002Fbuilt-in-roles#owner). If you don't have subscription-level permissions, you must be granted [RBAC](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Frole-based-access-control\u002Fbuilt-in-roles#role-based-access-control-administrator-preview) for an existing resource group and [deploy to that existing group](docs\u002Fdeploy_existing.md#resource-group).\n  - Your Azure account also needs `Microsoft.Resources\u002Fdeployments\u002Fwrite` permissions on the subscription level.\n\n#### Cost estimation\n\nSee the [cost estimation](.\u002Fdocs\u002Fcost.md) details for running this sample on Azure.\n\n#### Deploy the sample\n\n1. Open a terminal and navigate to the root of the project.\n2. Authenticate with Azure by running `azd auth login`.\n3. Run `azd up` to deploy the application to Azure. This will provision Azure resources, deploy this sample, and build the search index based on the files found in the `.\u002Fdata` folder.\n   - You will be prompted to select a base location for the resources. If you're unsure of which location to choose, select `eastus2`.\n   - By default, the OpenAI resource will be deployed to `eastus2`. You can set a different location with `azd env set AZURE_OPENAI_RESOURCE_GROUP_LOCATION \u003Clocation>`. Currently only a short list of locations is accepted. 
That location list is based on the [OpenAI model availability table](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-services\u002Fopenai\u002Fconcepts\u002Fmodels#standard-deployment-model-availability) and may become outdated as availability changes.\n\nThe deployment process will take a few minutes. Once it's done, you'll see the URL of the web app in the terminal.\n\n\u003Cdiv align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_ebe0447fb361.png\" alt=\"Screenshot of the azd up command result\" width=\"600px\" \u002F>\n\u003C\u002Fdiv>\n\nYou can now open the web app in your browser and start chatting with the bot.\n\n##### Enhance security\n\nWhen deploying the sample in an enterprise context, you may want to enforce tighter security restrictions to protect your data and resources. See the [enhance security](.\u002Fdocs\u002Fenhance-security.md) guide for more information.\n\n#### Enable CI\u002FCD\n\nIf you want to enable Continuous Deployment for your forked repository, you need to configure the Azure pipeline first:\n\n1. Open a terminal at the root of your forked project.\n2. Authenticate with Azure by running `azd auth login`.\n3. Run `azd pipeline config` to configure the required secrets and variables for connecting to Azure from GitHub Actions.\n   - This command will set up the necessary Azure service principal and configure GitHub repository secrets.\n   - Follow the prompts to complete the configuration.\n\nOnce configured, the GitHub Actions workflow will automatically deploy your application to Azure whenever you push changes to the main branch.\n\n#### Clean up\n\nTo clean up all the Azure resources created by this sample:\n\n1. Run `azd down --purge`\n2. 
When asked if you are sure you want to continue, enter `y`\n\nThe resource group and all the resources will be deleted.\n\n### Run the sample locally with Ollama\n\nIf you have a machine with enough resources, you can run this sample entirely locally without using any cloud resources. To do that, you first have to install [Ollama](https:\u002F\u002Follama.com) and then run the following commands to download the models on your machine:\n\n```bash\nollama pull llama3.1:latest\nollama pull nomic-embed-text:latest\n```\n\n> [!NOTE]\n> The `llama3.1` model will download a few gigabytes of data, so it can take some time depending on your internet connection.\n\nAfter that you have to install the NPM dependencies:\n\n```bash\nnpm install\n```\n\nThen you can start the application by running the following command which will start the web app and the API locally:\n\n```bash\nnpm start\n```\n\nThen, with the app still running, open a new terminal and run the following command to upload the PDF documents from the `\u002Fdata` folder to the API:\n\n```bash\nnpm run upload:docs\n```\n\nThis only has to be done once, unless you want to add more documents.\n\nYou can now open the URL `http:\u002F\u002Flocalhost:8000` in your browser to start chatting with the bot.\n\n> [!NOTE]\n> While local models usually work well enough to answer the questions, they may sometimes be unable to perfectly follow the advanced formatting instructions for the citations and follow-up questions. This is expected, and is a limitation of using smaller local models.\n\n### Run the sample locally with Azure OpenAI models\n\nFirst you need to provision the Azure resources needed to run the sample. Follow the instructions in the [Deploy the sample to Azure](#deploy-the-sample-to-azure) section to deploy the sample to Azure, then you'll be able to run the sample locally using the deployed Azure resources.\n\nOnce your deployment is complete, you should see a `.env` file in the `packages\u002Fapi` folder. 
This file contains the environment variables needed to run the application using Azure resources.\n\nTo run the sample, you can then use the same commands as for the Ollama setup. This will start the web app and the API locally:\n\n```bash\nnpm start\n```\n\nOpen the URL `http:\u002F\u002Flocalhost:8000` in your browser to start chatting with the bot.\n\nNote that the documents are uploaded automatically when deploying the sample to Azure with `azd up`.\n\n> [!TIP]\n> You can switch back to using Ollama models by simply deleting the `packages\u002Fapi\u002F.env` file and starting the application again. To regenerate the `.env` file, you can run `azd env get-values > packages\u002Fapi\u002F.env`.\n\n## Resources\n\nHere are some resources to learn more about the technologies used in this sample:\n\n- [LangChain.js documentation](https:\u002F\u002Fjs.langchain.com)\n- [Generative AI with JavaScript](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-with-javascript)\n- [Generative AI For Beginners](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fgenerative-ai-for-beginners)\n- [Azure OpenAI Service](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fai-services\u002Fopenai\u002Foverview)\n- [Azure Cosmos DB for NoSQL](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fcosmos-db\u002Fnosql\u002F)\n- [Ask YouTube: LangChain.js + Azure Quickstart sample](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Flangchainjs-quickstart-demo)\n- [Chat + Enterprise data with Azure OpenAI and Azure AI Search](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fazure-search-openai-javascript)\n- [Revolutionize your Enterprise Data with Chat: Next-gen Apps w\u002F Azure OpenAI and AI Search](https:\u002F\u002Faka.ms\u002Fentgptsearchblog)\n\nYou can also find [more Azure AI samples here](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fazureai-samples).\n\n## FAQ\n\nYou can find answers to frequently asked questions in the 
[FAQ](.\u002Fdocs\u002Ffaq.md).\n\n## Troubleshooting\n\nIf you have any issues when running or deploying this sample, please check the [troubleshooting guide](.\u002Fdocs\u002Ftroubleshooting.md). If you can't find a solution to your problem, please [open an issue](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fserverless-chat-langchainjs\u002Fissues) in this repository.\n\n## Guidance\n\nFor more detailed guidance on how to use this sample, please refer to the [tutorial](.\u002Fdocs\u002Ftutorial\u002F01-introduction.md).\n\n## Getting Help\n\nIf you get stuck or have any questions about building AI apps, join:\n\n[![Azure AI Foundry Discord](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Azure_AI_Foundry_Community_Discord-blue?style=for-the-badge&logo=discord&color=5865f2&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fdiscord)\n\nIf you have product feedback or errors while building, visit:\n\n[![Azure AI Foundry Developer Forum](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGitHub-Azure_AI_Foundry_Developer_Forum-blue?style=for-the-badge&logo=github&color=000000&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fforum)\n\n## Contributing\n\nThis project welcomes contributions and suggestions. Most contributions require you to agree to a\nContributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us\nthe rights to use your contribution. For details, visit https:\u002F\u002Fcla.opensource.microsoft.com.\n\nWhen you submit a pull request, a CLA bot will automatically determine whether you need to provide\na CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions\nprovided by the bot. 
You will only need to do this once across all repos using our CLA.\n\nThis project has adopted the [Microsoft Open Source Code of Conduct](https:\u002F\u002Fopensource.microsoft.com\u002Fcodeofconduct\u002F).\nFor more information see the [Code of Conduct FAQ](https:\u002F\u002Fopensource.microsoft.com\u002Fcodeofconduct\u002Ffaq\u002F) or\ncontact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.\n\n## Trademarks\n\nThis project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft\ntrademarks or logos is subject to and must follow\n[Microsoft's Trademark & Brand Guidelines](https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Flegal\u002Fintellectualproperty\u002Ftrademarks\u002Fusage\u002Fgeneral).\nUse of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.\nAny use of third-party trademarks or logos are subject to those third-party's policies.\n","\u003C!-- prettier-ignore -->\n\u003Cdiv align=\"center\">\n\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_0ef1ff1f66a8.png\" alt=\"\" align=\"center\" height=\"64\" \u002F>\n\n# 使用 LangChain.js 的无服务器 RAG 聊天应用\n\n[![在 GitHub Codespaces 中打开项目](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCodespaces-Open-blue?style=flat-square&logo=github)](https:\u002F\u002Fcodespaces.new\u002FAzure-Samples\u002Fserverless-chat-langchainjs?hide_repo_select=true&ref=main&quickstart=true)\n[![加入 Azure AI Foundry Discord](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Azure_AI_Community-blue?style=flat-square&logo=discord&color=5865f2&logoColor=fff)](https:\u002F\u002Faka.ms\u002Ffoundry\u002Fdiscord)\n[![官方 Learn 
文档](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDocumentation-00a3ee?style=flat-square)](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fdeveloper\u002Fjavascript\u002Fai\u002Fget-started-app-chat-template-langchainjs)\n[![观看 YouTube 视频，了解 RAG 和本示例](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FYouTube-d95652.svg?style=flat-square&logo=youtube)](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=xkFOmx5yxIA&list=PLlrxD0HtieHi5ZpsHULPLxm839IrhmeDk&index=4)\n[![dev.to 博客文章教程](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FBlog%20post-black?style=flat-square&logo=dev.to)](https:\u002F\u002Fdev.to\u002Fazure\u002Fbuild-a-serverless-chatgpt-with-rag-using-langchainjs-3487)\n\u003Cbr>\n[![构建状态](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002FAzure-Samples\u002Fserverless-chat-langchainjs\u002Fbuild-test.yaml?style=flat-square&label=Build)](https:\u002F\u002Fgithub.com\u002FAzure-Samples\u002Fserverless-chat-langchainjs\u002Factions)\n![Node.js 版本](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FNode.js->=20-3c873a?style=flat-square)\n[![Ollama + Llama3.1](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FOllama-Llama3.1-ff7000?style=flat-square)](https:\u002F\u002Follama.com\u002Flibrary\u002Fllama3.1)\n[![TypeScript](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FTypeScript-blue?style=flat-square&logo=typescript&logoColor=white)](https:\u002F\u002Fwww.typescriptlang.org)\n[![许可证](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLicense-MIT-yellow?style=flat-square)](LICENSE)\n\n:star: 如果你喜欢这个示例，请在 GitHub 上给它点个赞——这对我们帮助很大！\n\n[概述](#overview) • [开始使用](#getting-started) • [运行示例](#run-the-sample) • [资源](#resources) • [常见问题](#faq) • [故障排除](#troubleshooting)\n\n![展示聊天应用运行情况的动画](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_b9aa23f63114.gif)\n\n\u003C\u002Fdiv>\n\n本示例展示了如何使用 [LangChain.js](https:\u002F\u002Fjs.langchain.com\u002F) 和 Azure 构建基于检索增强生成（RAG）的无服务器 AI 
聊天体验。该应用托管在 [Azure 静态 Web 应用](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fstatic-web-apps\u002Foverview)和 [Azure Functions](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fazure-functions\u002Ffunctions-overview?pivots=programming-language-javascript)上，向量数据库则使用 [Azure Cosmos DB for NoSQL](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fcosmos-db\u002Fnosql\u002Fvector-)。你可以将其作为构建更复杂 AI 应用程序的起点。\n\n> [!TIP]\n> 你可以使用 [Ollama](https:\u002F\u002Follama.com\u002F) 在本地免费测试此应用程序。请按照“本地开发”部分中的说明开始操作。\n\n## 概述\n\n构建 AI 应用程序可能既复杂又耗时，但借助 LangChain.js 和 Azure 的无服务器技术，可以大大简化这一过程。此应用程序是一个聊天机器人，它利用一组企业文档为用户查询生成响应。\n\n我们提供了示例数据，使此示例可以直接试用，但你也可以自由替换为自己的数据。我们以一家名为 _Contoso Real Estate_ 的虚构公司为例，其客户可以通过该体验就产品使用问题向客服提问。示例数据包括服务条款、隐私政策以及支持指南等文档。\n\n\u003Cdiv align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FAzure-Samples_serverless-chat-langchainjs_readme_b1b8c2ca8266.png\" alt=\"应用架构\" width=\"640px\" \u002F>\n\u003C\u002Fdiv>\n\n该应用程序由多个组件构成：\n\n- 一个使用 [Lit](https:\u002F\u002Flit.dev) 构建的单聊 Web 组件，并托管在 [Azure 静态 Web 应用](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fstatic-web-apps\u002Foverview)上的 Web 应用。代码位于 `packages\u002Fwebapp` 文件夹中。\n\n- 一个使用 [Azure Functions](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fazure-functions\u002Ffunctions-overview?pivots=programming-language-javascript) 构建的无服务器 API，利用 [LangChain.js](https:\u002F\u002Fjs.langchain.com\u002F) 处理文档并为用户的聊天请求生成响应。代码位于 `packages\u002Fapi` 文件夹中。\n\n- 一个用于存储聊天会话、从文档中提取的文本以及 LangChain.js 生成的向量的数据库，使用的是 [Azure Cosmos DB for NoSQL](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fcosmos-db\u002Fnosql\u002F)。\n\n- 一个用于存储源文档的文件存储，使用的是 [Azure Blob 存储](https:\u002F\u002Flearn.microsoft.com\u002Fazure\u002Fstorage\u002Fblobs\u002Fstorage-blobs-introduction)。\n\n我们使用 [AI 聊天应用的 HTTP 协议](https:\u002F\u002Faka.ms\u002Fchatprotocol)来实现 Web 应用与 API 之间的通信。\n\n## 特性\n\n- **无服务器架构**：采用 Azure Functions 和 Azure 静态 Web 应用，实现完全无服务器部署。\n- **检索增强生成 (RAG)**：结合 
## Getting started

There are multiple ways to get started with this project.

The quickest way is to use [GitHub Codespaces](#use-github-codespaces), which provides a preconfigured environment. Alternatively, you can [set up your local environment](#use-your-local-environment) by following the instructions below.

> [!IMPORTANT]
> If you want to run this sample entirely locally using Ollama, you have to follow the instructions in the local environment section.

### Use your local environment

You need to install the following tools on your local machine:

- [Node.js LTS](https://nodejs.org/download/)
- [Azure Developer CLI](https://aka.ms/azure-dev/install)
- [Git](https://git-scm.com/downloads)
- [PowerShell 7+](https://github.com/powershell/powershell) _(for Windows users only)_
  - **Important**: make sure you can run `pwsh.exe` from a PowerShell command line. If this fails, you likely need to upgrade PowerShell.
  - Instead of PowerShell, you can also use Git Bash or WSL to run the Azure Developer CLI commands.
- [Azure Functions Core Tools](https://learn.microsoft.com/azure/azure-functions/functions-run-local?tabs=macos%2Cisolated-process%2Cnode-v4%2Cpython-v2%2Chttp-trigger%2Ccontainer-apps&pivots=programming-language-javascript) _(usually installed automatically with NPM; only install it manually if the API fails to start)_

Then you can get the project code:

1. [**Fork**](https://github.com/Azure-Samples/serverless-chat-langchainjs/fork) the project to create your own copy of this repository.
2. On your forked repository, select the **Code** button, then the **Local** tab, and copy the URL of your forked repository.

<div align="center">
  <img src="https://oss.gittoolsai.com/images/Azure-Samples_serverless-chat-langchainjs_readme_66cda97c4ed5.png" alt="Screenshot showing how to copy the repository URL" width="400px" />
</div>
3. Open a terminal and run the following command to clone the repo: <code>git clone &lt;your-repo-url&gt;</code>

### Use GitHub Codespaces

You can run this project directly in your browser by using GitHub Codespaces, which will open a web-based VS Code:

[![Open in GitHub Codespaces](https://img.shields.io/static/v1?style=for-the-badge&label=GitHub+Codespaces&message=Open&color=blue&logo=github)](https://codespaces.new/Azure-Samples/serverless-chat-langchainjs?hide_repo_select=true&ref&quickstart=true)

### Use a VSCode dev container

A similar option to Codespaces is VS Code Dev Containers, which will open the project in your local VS Code instance using the [Dev Containers extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers).

You will also need to have [Docker](https://www.docker.com/products/docker-desktop) installed on your machine to run the container.

[![Open in Dev Containers](https://img.shields.io/static/v1?style=for-the-badge&label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/Azure-Samples/serverless-chat-langchainjs)

## Run the sample

There are multiple ways to run this sample: locally using Ollama or Azure OpenAI models, or by deploying it to Azure.

### Deploy the sample to Azure

#### Azure prerequisites

- **Azure account**. If you're new to Azure, [get an Azure account for free](https://azure.microsoft.com/free) to get free Azure credits to get started. If you're a student, you can also get free credits with [Azure for Students](https://aka.ms/azureforstudents).
- **Azure subscription with access enabled for the Azure OpenAI service**. You can request access with [this form](https://aka.ms/oaiapply).
- **Azure account permissions**:
  - Your Azure account must have `Microsoft.Authorization/roleAssignments/write` permissions, such as [Role Based Access Control Administrator](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#role-based-access-control-administrator-preview), [User Access Administrator](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#user-access-administrator), or
[Owner](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#owner). If you don't have subscription-level permissions, you must be granted [RBAC](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#role-based-access-control-administrator-preview) for an existing resource group and [deploy to that existing resource group](docs/deploy_existing.md#resource-group).
  - Your Azure account also needs `Microsoft.Resources/deployments/write` permissions at the subscription level.

#### Cost estimation

See the [cost estimation](./docs/cost.md) details for running this sample on Azure.

#### Deploy the sample

1. Open a terminal and navigate to the root of the project.
2. Authenticate with Azure by running `azd auth login`.
3. Run `azd up` to deploy the application to Azure. This will provision the Azure resources, deploy this sample, and build the search index from the files in the `./data` folder.
   - You will be prompted to select a base location for the resources. If you're unsure which location to choose, select `eastus2`.
   - By default, the OpenAI resource will be deployed to `eastus2`. You can set a different location with `azd env set AZURE_OPENAI_RESOURCE_GROUP_LOCATION <location>`. Currently only a short list of locations is supported. That list is based on the [OpenAI model availability table](https://learn.microsoft.com/azure/ai-services/openai/concepts/models#standard-deployment-model-availability) and may be updated as availability changes.

The deployment process will take a few minutes. Once it's done, you'll see the URL of the web app in the terminal.

<div align="center">
  <img src="https://oss.gittoolsai.com/images/Azure-Samples_serverless-chat-langchainjs_readme_ebe0447fb361.png" alt="Screenshot of the azd up command result" width="600px" />
</div>

You can now open the web app in your browser and start chatting with the bot.

##### Enhanced security

When deploying the sample in an enterprise context, you may want to enforce tighter security restrictions to protect your data and resources. See the [enhance security](./docs/enhance-security.md) guide for more information.

#### Enable CI/CD

If you want to enable continuous deployment for your forked repository, you first need to set up the Azure pipeline:

1. Open a terminal at the root of your forked project.
2. Authenticate with Azure by running `azd auth login`.
3. Run `azd pipeline config` to set up the secrets and variables needed to connect to Azure from GitHub Actions.
   - This command sets up the necessary Azure service principal and configures the GitHub repository secrets.
   - Follow the prompts to complete the configuration.

Once this is done, the GitHub Actions workflow will automatically deploy your application to Azure every time you push changes to the main branch.

#### Clean up

To clean up all the Azure resources created by this sample:

1. Run `azd down --purge`
2. When asked if you are sure you want to continue, enter `y`
The resource group and all the resources will be deleted.

### Run the sample locally with Ollama

If you have a machine with enough resources, you can run this sample entirely locally without using any cloud resources. To do that, you first have to install [Ollama](https://ollama.com) and then run the following commands to download the models on your machine:

```bash
ollama pull llama3.1:latest
ollama pull nomic-embed-text:latest
```

> [!NOTE]
> The `llama3.1` model will download a few gigabytes of data, so it can take some time depending on your internet connection.

After that, you have to install the NPM dependencies:

```bash
npm install
```

Then you can start the application by running the following command, which will start the web app and the API locally:

```bash
npm start
```

Then, open a new terminal running in parallel and run the following command to upload the PDF documents from the `/data` folder to the API:

```bash
npm run upload:docs
```

This only needs to be done once, unless you want to add more documents.

You can now open the URL `http://localhost:8000` in your browser to start chatting with the bot.

> [!NOTE]
> While local models usually work well enough to answer the questions, sometimes they may not be able to follow perfectly the advanced formatting instructions for citations and follow-up questions. This is expected, and a limitation of using smaller local models.

### Run the sample locally with Azure OpenAI models

First you need to provision the Azure resources needed to run the sample. Follow the instructions in the [Deploy the sample to Azure](#deploy-the-sample-to-azure) section to deploy the sample to Azure; you will then be able to run the sample locally using the deployed Azure resources.

Once your deployment is complete, you should see a `.env` file in the `packages/api` folder. This file contains the environment variables needed to run the application using the Azure resources.

To run the sample, you can then use the same commands as for the Ollama setup. This will start the web app and the API locally:

```bash
npm start
```

Open the URL `http://localhost:8000` in your browser to start chatting with the bot.

Note that the documents are uploaded automatically when you deploy the sample to Azure with `azd up`.

> [!TIP]
> You can switch back to using Ollama models by simply deleting the `packages/api/.env` file and starting the application again. To regenerate the `.env` file, you can run `azd env get-values > packages/api/.env`.

## Resources

Here are some resources to learn more about the technologies used in this sample:

- [LangChain.js documentation](https://js.langchain.com)
- [Generative AI with JavaScript](https://github.com/microsoft/generative-ai-with-javascript)
- [Generative AI For Beginners](https://github.com/microsoft/generative-ai-for-beginners)
- [Azure OpenAI Service](https://learn.microsoft.com/azure/ai-services/openai/overview)
- [Azure Cosmos DB for NoSQL](https://learn.microsoft.com/azure/cosmos-db/nosql/)
- [Ask YouTube: LangChain.js + Azure Quickstart sample](https://github.com/Azure-Samples/langchainjs-quickstart-demo)
- [Chat + Enterprise data with Azure OpenAI and Azure AI Search](https://github.com/Azure-Samples/azure-search-openai-javascript)
- [Revolutionize your Enterprise Data with Chat: Next-gen Apps with Azure OpenAI and AI Search](https://aka.ms/entgptsearchblog)

You can also find [more Azure AI samples here](https://github.com/Azure-Samples/azureai-samples).
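The `npm run upload:docs` step above sends the PDFs to the API, where LangChain.js extracts the text, splits it into chunks, and embeds each chunk into the vector store. The splitter below is a simplified stand-in for what LangChain.js text splitters do (a fixed window with overlap); it is illustrative only, not the sample's actual implementation.

```typescript
// Simplified fixed-size text splitter with overlap, illustrating the
// kind of chunking LangChain.js text splitters perform at ingestion.
// This is a conceptual stand-in, not the project's actual code.
function splitText(text: string, chunkSize: number, overlap: number): string[] {
  if (chunkSize <= overlap) throw new Error("chunkSize must exceed overlap");
  const chunks: string[] = [];
  // Each chunk starts (chunkSize - overlap) characters after the previous
  // one, so consecutive chunks share `overlap` characters of context.
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
  }
  return chunks;
}

const chunks = splitText("Contoso terms of service apply to all products.", 20, 5);
```

Overlapping chunks preserve context across boundaries, which helps retrieval match questions whose answer spans two adjacent chunks.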
## FAQ

You can find answers to frequently asked questions in the [FAQ](./docs/faq.md).

## Troubleshooting

If you have any issues when running or deploying this sample, please check the [troubleshooting guide](./docs/troubleshooting.md). If you can't find a solution to your problem, please [open an issue](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues) in this repository.

## Guidance

For more detailed guidance on how to use this sample, please refer to the [tutorial](./docs/tutorial/01-introduction.md).

## Getting help

If you're stuck building AI apps or have questions, join:

[![Azure AI Foundry Discord](https://img.shields.io/badge/Discord-Azure_AI_Foundry_Community_Discord-blue?style=for-the-badge&logo=discord&color=5865f2&logoColor=fff)](https://aka.ms/foundry/discord)

If you have product feedback or run into errors while building, visit:

[![Azure AI Foundry Developer Forum](https://img.shields.io/badge/GitHub-Azure_AI_Foundry_Developer_Forum-blue?style=for-the-badge&logo=github&color=000000&logoColor=fff)](https://aka.ms/foundry/forum)

## Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

## Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/usage/general). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.
# Serverless-chat-langchainjs quick start guide

This project shows how to build a serverless AI chat application based on Retrieval-Augmented Generation (RAG) using **LangChain.js** and **Azure serverless** technologies. It supports using enterprise documents as a knowledge base, stores vectors in Azure Cosmos DB, and offers a local development mode (using Ollama) for zero-cost testing.

## Environment setup

Before you start, make sure your development environment meets the following requirements:

### System requirements
- **Node.js**: LTS version (>= 20)
- **Operating system**: Windows, macOS, or Linux
  - *Note for Windows users*: install **PowerShell 7+** (make sure `pwsh.exe` is available), or use Git Bash / WSL instead.

### Prerequisite tools
Install the following tools:
1. **Git**: [download](https://git-scm.com/downloads)
2. **Azure Developer CLI (azd)**: [install guide](https://aka.ms/azure-dev/install)
3. **Azure Functions Core Tools**:
   - Usually installed automatically with NPM; install it manually only if the API fails to start.
   - [Official documentation](https://learn.microsoft.com/azure/azure-functions/functions-run-local)

> **Tip**: for local-only testing (no Azure cloud resources), you also need to install **[Ollama](https://ollama.com/)** and pull a model (such as `llama3.1`).

## Installation

### 1. Get the code
First fork the project to your GitHub account, then clone it locally:

```bash
git clone <your-repo-url>
cd serverless-chat-langchainjs
```

*Or develop online with GitHub Codespaces (no local setup required):*
[![Open in GitHub Codespaces](https://img.shields.io/static/v1?style=for-the-badge&label=GitHub+Codespaces&message=Open&color=blue&logo=github)](https://codespaces.new/Azure-Samples/serverless-chat-langchainjs?hide_repo_select=true&ref=main&quickstart=true)

### 2. Deploy to Azure (run in the cloud)

If you want to run the full application with Azure OpenAI and cloud services:

**Prerequisites:**
- An Azure account and subscription.
- The subscription has access enabled for the **Azure OpenAI** service.
- The account has `Microsoft.Authorization/roleAssignments/write` and `Microsoft.Resources/deployments/write` permissions.

**Deployment commands:**

```bash
# 1. Sign in to Azure
azd auth login

# 2. Deploy the app (provision resources, deploy code, build the index)
azd up
```

*You will be prompted to select a region during deployment; `eastus2` is recommended. Once the deployment completes, the terminal shows the web app's URL.*

### 3. Run locally (zero-cost testing)

If you prefer to test with a local model (Ollama), no Azure account is needed:

1. Make sure Ollama is installed and the model is available:
   ```bash
   ollama pull llama3.1
   ollama serve
   ```
2. Follow the [Local Development](#local-development) instructions in the project root to configure the environment variables (you typically need to set `LLM_ENDPOINT` to point at the local Ollama address).
3. Start the local development server (see the scripts in the project's `package.json`; typically):
   ```bash
   npm install
   npm run dev
   ```

## Basic usage

### Start the app
After completing the deployment or local-run steps above, the application starts.

- **Cloud version**: open the URL printed by `azd up` in your browser.
- **Local version**: usually `http://localhost:8000` (the exact port depends on your configuration).

### Try the features
1. **Chat**: type a question in the chat interface; the system answers based on the sample documents in the `./data` folder (Contoso Real Estate's terms of service, privacy policy, and so on).
2. **Upload documents**: replace the contents of the `./data` folder with your own enterprise documents and re-run the indexing step (automatic for cloud deployments; see the project scripts for local runs) to build a chatbot over your own knowledge base.
3. **Chat history**: the app automatically saves each user's chat sessions for later review.

### Architecture at a glance
- **Frontend**: a web component built with Lit, hosted on Azure Static Web Apps.
- **Backend**: Azure Functions running the LangChain.js logic for document ingestion and answer generation.
- **Storage**: Azure Cosmos DB (NoSQL) for vector data, Azure Blob Storage for the source files.

## Example scenario

The customer support team of a mid-sized real-estate tech company, "Contoso", is struggling with slow manual replies caused by a large volume of product documents (terms of service, privacy policies, how-to guides). They urgently need a support bot that answers user questions directly from those documents.

### Without serverless-chat-langchainjs
- **High barrier and long lead time**: the team has to hand-build a vector database, write complex RAG (Retrieval-Augmented Generation) chains, and wire up the frontend and backend themselves; weeks later, a prototype may still not work end to end.
- **Expensive operations**: a traditional architecture keeps server resources running even when no one is asking questions late at night, and someone has to maintain the scaling policies.
- **Hard to test locally**: without a convenient local debugging setup, developers must iterate against costly cloud LLM APIs, making trial and error expensive.
- **Stale knowledge**: every new privacy policy or product guide requires a cumbersome re-indexing and sync process, so the bot risks giving outdated answers.

### With serverless-chat-langchainjs
- **Fast prototyping**: using the LangChain.js and Azure template, the team can deploy a complete application (frontend, serverless functions, and a Cosmos DB vector store) in hours, drastically shortening time to launch.
- **True pay-per-use**: with the serverless architecture of Azure Functions and Static Web Apps, the system only consumes resources while users are chatting; idle time costs nothing and operational overhead drops to zero.
- **Low-cost local development**: thanks to the Ollama and local Llama3.1 integration, developers can write and validate code in a zero-cost environment, iterating much faster.
- **Up-to-date knowledge**: the built-in document processing pipeline quickly vectorizes newly uploaded terms of service and makes them searchable, so customers always get current, accurate policy answers.

By combining a mature serverless architecture with the RAG pattern, serverless-chat-langchainjs lowers the difficulty of building enterprise-grade support chatbots from "expert" to "beginner" while balancing cost and efficiency.
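Retrieval in the architecture above works by comparing embedding vectors: the user's question is embedded and matched against the stored chunk vectors. Cosine similarity is the usual comparison; the snippet below illustrates only the math, not the database's actual query path.

```typescript
// Cosine similarity between two embedding vectors: 1 means identical
// direction, 0 means orthogonal (unrelated). Vector stores use this
// (or an equivalent distance) to rank stored chunks against a query.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("vectors must have the same length");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Two nearly parallel vectors score close to 1.
const similarity = cosineSimilarity([0.2, 0.8, 0.1], [0.25, 0.75, 0.05]);
```

The top-ranked chunks are then passed to the language model as context, which is what grounds the bot's answers in the uploaded documents.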
## Requirements at a glance

- **Platforms**: Linux, macOS, Windows
- **GPU**: not required. Cloud deployments need no local GPU; for local development with Ollama, hardware needs depend on the chosen model (such as Llama3.1), and a discrete GPU or a fast CPU is recommended (16 GB+ RAM suggested when running models locally).
- **Runtime**: the project is JavaScript/TypeScript, not Python; it requires Node.js LTS (version 20+). Deploying to Azure requires an Azure subscription with Azure OpenAI access. Windows users should use PowerShell 7+, Git Bash, or WSL.

## FAQ from the issue tracker

**What do I need to change to switch the embedding model from text-embedding-ada-002 to text-embedding-3-small?**
To change the embedding model of a deployed version, you must first delete the existing index from the AI Search instance in the Azure portal, because the vector length may differ. After deleting it, redeploy the application with `azd up`. ([source](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues/69))

**How do I fix failures or timeout errors when deploying with azd?**
This is usually caused by transient timeouts on the Azure side. Wait a few hours for the timeout period to pass, then retry the deployment. If the problem persists, try a different deployment region (for example, East US 2). ([source](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues/64))

**Does this project support document parsing with Azure Document Intelligence?**
The sample is intentionally kept as simple as possible so it is easy to understand, so it does not integrate Azure Document Intelligence directly. If you need more complete features, including document ingestion with Azure Document Intelligence, look at more fully featured samples such as azure-search-openai-demo. ([source](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues/69))

**How do I run tests locally?**
An end-to-end (E2E) test framework (such as Playwright) runnable via `npm test` has been planned, but the related work is currently stalled. Check the `/test` folder at the repository root or watch for updates for the latest testing guidance. ([source](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues/5))

**Is the tutorial folder in sync with the latest code?**
The project is working on updating the `tutorial` folder to match the code samples, with the goal of turning it into a step-by-step tutorial or an official documentation module. Some chapters (introduction, environment setup, understanding the RAG architecture) are already updated, while some API development chapters are still in progress. ([source](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues/37))

**Are there plans to add evaluation based on LangChain.js tooling?**
The community proposed creating an evaluation directory under `/test` and using LangChain.js standard evaluators or Promptfoo to demonstrate evaluating RAG results. The issue is currently stalled and not yet implemented; follow the project for updates or extend it yourself using the LangChain.js documentation. ([source](https://github.com/Azure-Samples/serverless-chat-langchainjs/issues/25))
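When the API streams its answer, responses following the AI Chat Protocol used by this sample arrive as newline-delimited JSON chunks. The reassembly sketch below assumes each line carries a `delta.content` fragment; verify the exact field names against the protocol spec (https://aka.ms/chatprotocol) before relying on them.

```typescript
// Reassemble a streamed chat response from newline-delimited JSON.
// The delta.content shape is an assumption based on the AI Chat
// Protocol's streaming format, not a verified contract.
interface ChatChunk {
  delta?: { content?: string };
}

function joinStreamedAnswer(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0) // skip blank trailing lines
    .map((line) => (JSON.parse(line) as ChatChunk).delta?.content ?? "")
    .join("");
}

const answer = joinStreamedAnswer(
  '{"delta":{"content":"Hello"}}\n{"delta":{"content":" world"}}\n',
);
```

Streaming lets the web app render the answer token by token instead of waiting for the full response, which matters for perceived latency with the smaller local models.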