[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-567-labs--instructor-js":3,"tool-567-labs--instructor-js":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 
绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 
是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,"2026-04-04T04:44:48",[15,14,13,26,54],{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":69,"readme_en":70,"readme_zh":71,"quickstart_zh":72,"use_case_zh":73,"hero_image_url":74,"owner_login":75,"owner_name":75,"owner_avatar_url":76,"owner_bio":77,"owner_company":77,"owner_location":77,"owner_email":77,"owner_twitter":77,"owner_website":77,"owner_url":78,"languages":79,"stars":96,"forks":97,"last_commit_at":98,"license":99,"difficulty_score":23,"env_os":100,"env_gpu":101,"env_ram":101,"env_deps":102,"category_tags":108,"github_topics":109,"view_count":111,"oss_zip_url":77,"oss_zip_packed_at":77,"status":16,"created_at":112,"updated_at":113,"faqs":114,"releases":144},1763,"567-labs\u002Finstructor-js","instructor-js","structured extraction for llms","instructor-js 是专为 TypeScript 开发者设计的结构化数据提取工具，基于大语言模型（LLM）实现。它通过 Zod 定义数据 schema，将非结构化文本（如用户描述）自动转换为类型安全的 JavaScript 对象，无需手动解析或后处理。例如，输入 \"Jason Liu is 30 years old\"，可直接输出 { age: 30, name: \"Jason Liu\" }。传统方法常依赖正则表达式或复杂逻辑，容易出错且维护困难，而 instructor-js 利用 LLM 的理解能力直接生成结构化数据，显著提升开发效率。特别适合需要高精度数据提取的 TypeScript 项目，如聊天机器人、数据爬取或自动化流程。核心亮点包括静态类型推断、OpenAI tools 模式支持，以及简洁透明的 API 设计，让复杂的数据处理变得简单直观。","# instructor-js\n\n_Structured extraction in Typescript, powered by llms, designed for simplicity, transparency, and control._\n\n---\n\n[![Twitter Follow](https:\u002F\u002Fimg.shields.io\u002Ftwitter\u002Ffollow\u002Fjxnlco?style=social)](https:\u002F\u002Ftwitter.com\u002Fjxnlco)\n[![Twitter 
Follow](https:\u002F\u002Fimg.shields.io\u002Ftwitter\u002Ffollow\u002Fdimitrikennedy?style=social)](https:\u002F\u002Ftwitter.com\u002Fdimitrikennedy)\n[![NPM Version](https:\u002F\u002Fimg.shields.io\u002Fnpm\u002Fv\u002F@instructor-ai\u002Finstructor.svg)](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@instructor-ai\u002Finstructor)\n[![Documentation](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fdocs-available-brightgreen)](https:\u002F\u002Fjxnl.github.io\u002Finstructor-js)\n[![GitHub issues](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002Finstructor-ai\u002Finstructor-js.svg)](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fissues)\n[![Discord](https:\u002F\u002Fimg.shields.io\u002Fdiscord\u002F1192334452110659664?label=discord)](https:\u002F\u002Fdiscord.gg\u002FCV8sPM5k5Y)\n\nDive into the world of TypeScript-based structured extraction, powered by OpenAI's function calling API and Zod, a TypeScript-first schema validation library with static type inference. Instructor stands out for its simplicity, transparency, and user-centric design. Whether you're a seasoned developer or just starting out, you'll find Instructor's approach intuitive and steerable.\n\n\n## Installation\n\n```bash\nbun add @instructor-ai\u002Finstructor zod openai\n```\n\n```bash\nnpm i @instructor-ai\u002Finstructor zod openai\n```\n\n```bash\npnpm add @instructor-ai\u002Finstructor zod openai\n```\n\n## Basic Usage\nTo check out all the tips and tricks to prompt and extract data, check out the [documentation](https:\u002F\u002Finstructor-ai.github.io\u002Finstructor-js\u002Ftips\u002Fprompting\u002F).\n\n\n```typescript\n\nimport Instructor from \"@instructor-ai\u002Finstructor\";\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst oai = new OpenAI({\n  apiKey: process.env.OPENAI_API_KEY ?? undefined,\n  organization: process.env.OPENAI_ORG_ID ?? 
undefined\n})\n\nconst client = Instructor({\n  client: oai,\n  mode: \"TOOLS\"\n})\n\nconst UserSchema = z.object({\n  \u002F\u002F Description will be used in the prompt\n  age: z.number().describe(\"The age of the user\"), \n  name: z.string()\n})\n\n\n\u002F\u002F User will be of type z.infer\u003Ctypeof UserSchema>\nconst user = await client.chat.completions.create({\n  messages: [{ role: \"user\", content: \"Jason Liu is 30 years old\" }],\n  model: \"gpt-3.5-turbo\",\n  response_model: { \n    schema: UserSchema, \n    name: \"User\"\n  }\n})\n\nconsole.log(user)\n\u002F\u002F { age: 30, name: \"Jason Liu\" }\n```\n\n\n## API Reference\n\n### Instructor Class\nThe main class for creating an Instructor client.\n\n**createInstructor**\n```typescript\nfunction createInstructor\u003CC extends GenericClient | OpenAI>(args: {\n  client: OpenAILikeClient\u003CC>;\n  mode: Mode;\n  debug?: boolean;\n}): InstructorClient\u003CC>\n```\n\nCreates an instance of the Instructor class.\n\n- client: An OpenAI-like client.\n- mode: The mode of operation.\n- debug: Whether to log debug messages.\n\nReturns the extended OpenAI-Like client.\n\n\n**chat.completions.create**\n```typescript\nchat.completions.create\u003C\n        T extends z.AnyZodObject,\n        P extends T extends z.AnyZodObject ? 
ChatCompletionCreateParamsWithModel\u003CT>\n        : ClientTypeChatCompletionParams\u003COpenAILikeClient\u003CC>> & { response_model: never }\n      >(\n        params: P\n      ): Promise\u003CReturnTypeBasedOnParams\u003Ctypeof this.client, P>>\n```\nWhen response_model is present in the params, creates a chat completion with structured extraction based on the provided schema - otherwise will proxy back to the provided client.\n\n- params: Chat completion parameters including the response model schema.\n- Returns a promise resolving to the extracted data based on the schema.\n\n\n### Modes\n\nInstructor supports different modes for defining the structure and format of the response from the language model. These modes are defined in the `zod-stream` package and are as follows:\n\n- `FUNCTIONS` (*DEPRECATED*): Generates a response using OpenAI's function calling API. It maps to the necessary parameters for the function calling API, including the `function_call` and `functions` properties. \n\n- `TOOLS`: Generates a response using OpenAI's tool specification. It constructs the required parameters for the tool specification, including the `tool_choice` and `tools` properties.\n\n- `JSON`: It sets the `response_format` to `json_object` and includes the JSON schema in the system message to guide the response generation. (Together & Anyscale)\n\n- `MD_JSON`: Generates a response in JSON format embedded within a Markdown code block. It includes the JSON schema in the system message and expects the response to be a valid JSON object wrapped in a Markdown code block.\n\n- `JSON_SCHEMA`: Generates a response using \"JSON mode\" that conforms to a provided JSON schema. 
It sets the `response_format` to `json_object` with the provided schema and includes the schema description in the system message.\n\n\n\n## Examples\n\n\n### Streaming Completions\nInstructor supports partial streaming completions, allowing you to receive extracted data in real-time as the model generates its response. This can be useful for providing a more interactive user experience or processing large amounts of data incrementally.\n\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst textBlock = `\n  In our recent online meeting, participants from various backgrounds joined to discuss the upcoming tech conference. \n  The names and contact details of the participants were as follows:\n\n  - Name: John Doe, Email: johndoe@email.com, Twitter: @TechGuru44\n  - Name: Jane Smith, Email: janesmith@email.com, Twitter: @DigitalDiva88\n  - Name: Alex Johnson, Email: alexj@email.com, Twitter: @CodeMaster2023\n\n  During the meeting, we agreed on several key points. The conference will be held on March 15th, 2024, at the Grand Tech Arena located at 4521 Innovation Drive. Dr. Emily Johnson, a renowned AI researcher, will be our keynote speaker. The budget for the event is set at $50,000, covering venue costs, speaker fees, and promotional activities. \n\n  Each participant is expected to contribute an article to the conference blog by February 20th. A follow-up meeting is scheduled for January 25th at 3 PM GMT to finalize the agenda and confirm the list of speakers.\n`\n\nasync function extractData() {\n  const ExtractionSchema = z.object({\n    users: z.array(\n      z.object({\n        name: z.string(),\n        handle: z.string(),\n        twitter: z.string()\n      })\n    ).min(3),\n    location: z.string(),\n    budget: z.number()\n  })\n\n  const oai = new OpenAI({\n    apiKey: process.env.OPENAI_API_KEY ?? undefined,\n    organization: process.env.OPENAI_ORG_ID ?? 
undefined\n  })\n\n  const client = Instructor({\n    client: oai,\n    mode: \"TOOLS\"\n  })\n\n  const extractionStream = await client.chat.completions.create({\n    messages: [{ role: \"user\", content: textBlock }],\n    model: \"gpt-3.5-turbo\",\n    response_model: {\n      schema: ExtractionSchema,\n      name: \"Extraction\"\n    },\n    max_retries: 3,\n    stream: true\n  })\n\n  let extractedData = {}\n  for await (const result of extractionStream) {\n    extractedData = result\n    console.log(\"Partial extraction:\", result)\n  }\n  \n  console.log(\"Final extraction:\", extractedData)\n}\n\nextractData()\n```\n\nIn this example, we define an ExtractionSchema using Zod to specify the structure of the data we want to extract. We then create an Instructor client with streaming enabled and pass the schema to the response_model parameter.\n\nThe extractionStream variable holds an async generator that yields partial extraction results as they become available. We iterate over the stream using a for await...of loop, updating the extractedData object with each partial result and logging it to the console.\n\nFinally, we log the complete extracted data once the stream is exhausted.\n\n\n### Using Different Providers via proxy\nInstructor supports various providers that adhere to the OpenAI API specification. 
You can easily switch between providers by configuring the appropriate client and specifying the desired model and mode.\n\n**Anyscale**\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst UserSchema = z.object({\n  age: z.number(),\n  name: z.string().refine(name => name.includes(\" \"), {\n    message: \"Name must contain a space\"\n  })\n})\n\nasync function extractUser() {\n  const client = new OpenAI({\n    baseURL: \"https:\u002F\u002Fapi.endpoints.anyscale.com\u002Fv1\",\n    apiKey: process.env.ANYSCALE_API_KEY\n  })\n\n  const instructor = Instructor({\n    client: client,\n    mode: \"TOOLS\"\n  })\n\n  const user = await instructor.chat.completions.create({\n    messages: [{ role: \"user\", content: \"Jason Liu is 30 years old\" }],\n    model: \"mistralai\u002FMixtral-8x7B-Instruct-v0.1\",\n    response_model: {\n      schema: UserSchema,\n      name: \"User\"\n    },\n    max_retries: 4\n  })\n\n  return user\n}\n\nconst anyscaleUser = await extractUser()\nconsole.log(\"Anyscale user:\", anyscaleUser)\n```\n\n**Together**\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst UserSchema = z.object({\n  age: z.number(),\n  name: z.string().refine(name => name.includes(\" \"), {\n    message: \"Name must contain a space\"\n  })\n})\n\nasync function extractUser() {\n  const client = new OpenAI({\n    baseURL: \"https:\u002F\u002Fapi.together.xyz\u002Fv1\",\n    apiKey: process.env.TOGETHER_API_KEY\n  })\n\n  const instructor = Instructor({\n    client: client,\n    mode: \"TOOLS\"\n  })\n\n  const user = await instructor.chat.completions.create({\n    messages: [{ role: \"user\", content: \"Jason Liu is 30 years old\" }],\n    model: \"mistralai\u002FMixtral-8x7B-Instruct-v0.1\",\n    response_model: {\n      schema: UserSchema,\n      name: \"User\"\n    },\n    max_retries: 4\n  
})\n\n  return user\n}\n\nconst togetherUser = await extractUser()\nconsole.log(\"Together user:\", togetherUser)\n```\n\nIn these examples, we point the OpenAI client at each provider's base URL and supply the matching API key for Anyscale and Together.\n\nIn each case, the extractUser function creates an OpenAI client configured for the provider and initializes an Instructor instance with the specified mode.\n\nWe then call instructor.chat.completions.create with the desired model, response schema, and other parameters to extract the user information.\n\nBy varying the base URL, model, and mode, you can easily switch between different providers and configurations.\n\n\n### Using Non-OpenAI Providers with llm-polyglot\n\nInstructor supports integration with providers that don't adhere to the OpenAI SDK, such as Anthropic, Azure, and Cohere, through the [`llm-polyglot`](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fllm-client) library maintained by @dimitrikennedy. 
This library provides a unified interface for interacting with various language models across different providers.\n\n```typescript\nimport { createLLMClient } from \"llm-polyglot\"\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport { z } from \"zod\"\n\nconst anthropicClient = createLLMClient({\n  provider: \"anthropic\",\n  apiKey: process.env.ANTHROPIC_API_KEY\n})\n\nconst UserSchema = z.object({\n  age: z.number(),\n  name: z.string()\n})\n\nconst instructor = Instructor\u003Ctypeof anthropicClient>({\n  client: anthropicClient,\n  mode: \"TOOLS\"\n})\n\nasync function extractUser() {\n  const user = await instructor.chat.completions.create({\n    model: \"claude-3-opus-20240229\",\n    max_tokens: 1000,\n    messages: [\n      {\n        role: \"user\",\n        content: \"My name is Dimitri Kennedy.\"\n      }\n    ],\n    response_model: {\n      name: \"extract_name\",\n      schema: UserSchema\n    }\n  })\n\n  return user\n}\n\n\u002F\u002F Example usage\nconst extractedUser = await extractUser()\nconsole.log(\"Extracted user:\", extractedUser)\n```\n\nIn this example, we use the createLLMClient function from the llm-polyglot library to create a client for the Anthropic provider. We pass the provider name (\"anthropic\") and the corresponding API key to the function.\n\nNext, we define a UserSchema using Zod to specify the structure of the user data we want to extract.\n\nWe create an Instructor instance by passing the Anthropic client and the desired mode to the Instructor function. Note that we use Instructor\u003Ctypeof anthropicClient> to specify the client type explicitly.\n\nThe extractUser function demonstrates how to use the Instructor instance to extract user information from a given input. 
We call instructor.chat.completions.create with the appropriate model (\"claude-3-opus-20240229\" in this case), parameters, and the response_model that includes our UserSchema.\n\nFinally, we log the extracted user information.\n\nBy leveraging the llm-polyglot library, Instructor enables seamless integration with a wide range of providers beyond those that follow the OpenAI SDK. This allows you to take advantage of the unique capabilities and models offered by different providers while still benefiting from the structured extraction and validation features of Instructor.\n\nFor additional support and information on using other providers with [llm-polyglot](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fllm-client), please refer to the library's documentation and examples.\n\n\n## More Examples\n\nIf you'd like to see more check out our [cookbook](docs\u002Fexamples\u002Findex.md).\n\n[Installing Instructor](docs\u002Finstallation.md) is a breeze.\n\n\n## Built on Island AI\n\nInstructor is built on top of several powerful packages from the [Island AI](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai) toolkit, developed and maintained by [Dimitri Kennedy](https:\u002F\u002Ftwitter.com\u002Fdimitrikennedy). These packages provide essential functionality for structured data handling and streaming with Large Language Models.\n\n### zod-stream\n\n[zod-stream](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fzod-stream) is a client module that interfaces directly with LLM streams. It utilizes Schema-Stream for efficient parsing and is equipped with tools for processing raw responses from OpenAI, categorizing them by mode (function, tools, JSON, etc.), and ensuring proper error handling and stream conversion. 
It's ideal for API integrations delivering structured LLM response streams.\n\n### schema-stream\n\n[schema-stream](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fschema-stream) is a JSON streaming parser that incrementally constructs and updates response models based on Zod schemas. It's designed for real-time data processing and incremental model hydration.\n\n\n### llm-polyglot\n\n[llm-polyglot](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fllm-client) is a library that provides a unified interface for interacting with various language models across different providers, such as OpenAI, Anthropic, Azure, and Cohere. It simplifies the process of working with multiple LLM providers and enables seamless integration with Instructor.\n\nInstructor leverages these Island AI packages to deliver a seamless and efficient experience for structured data extraction and streaming with LLMs. The collaboration between Dimitri Kennedy, the creator of Island AI, and Jason Liu, the author of the original Instructor Python package, led to the development of the TypeScript version of Instructor, which introduces the concept of partial JSON streaming from LLMs.\n\nFor more information about Island AI and its packages, please refer to the [Island AI repository](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai).\n\n\n## Why use Instructor?\n\nThe question of whether to use Instructor is fundamentally a question of why to use Zod.\n\n1. **Works with the OpenAI SDK** — Instructor follows OpenAI's API. This means you can use the same API for both prompting and extraction across multiple providers that support the OpenAI API.\n\n2. **Customizable** — Zod is highly customizable. You can define your own validators, custom error messages, and more.\n\n3. **Ecosystem** — Zod is the most widely used data validation library for TypeScript.\n\n4. 
**Battle Tested** — Zod is downloaded over 24M times per month and supported by a large community of contributors.\n\n\n\n\n## Contributing\n\nIf you want to help out, check out some of the issues marked as `good-first-issue` or `help-wanted`, found [here](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Flabels\u002Fgood%20first%20issue). They could be anything from code improvements, a guest blog post, or a new cookbook.\n\nCheck out the [contribution guide]() for details on setup, testing, changesets, and guidelines.\n\n> ℹ️ **Tip:** Support in other languages\n>\n> Check out ports to other languages below:\n>\n> - [Python](https:\u002F\u002Fwww.github.com\u002Fjxnl\u002Finstructor)\n> - [Elixir](https:\u002F\u002Fgithub.com\u002Fthmsmlr\u002Finstructor_ex\u002F)\n>\n> If you want to port Instructor to another language, please reach out to us on [Twitter](https:\u002F\u002Ftwitter.com\u002Fjxnlco); we'd love to help you get started!\n\n## License\n\nThis project is licensed under the terms of the MIT License.\n","# instructor-js\n\n基于 TypeScript 的结构化提取，由大语言模型驱动，旨在实现简单、透明和可控。\n\n---\n\n[![Twitter 关注](https:\u002F\u002Fimg.shields.io\u002Ftwitter\u002Ffollow\u002Fjxnlco?style=social)](https:\u002F\u002Ftwitter.com\u002Fjxnlco)\n[![Twitter 关注](https:\u002F\u002Fimg.shields.io\u002Ftwitter\u002Ffollow\u002Fdimitrikennedy?style=social)](https:\u002F\u002Ftwitter.com\u002Fdimitrikennedy)\n[![NPM 版本](https:\u002F\u002Fimg.shields.io\u002Fnpm\u002Fv\u002F@instructor-ai\u002Finstructor.svg)](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@instructor-ai\u002Finstructor)\n[![文档](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fdocs-available-brightgreen)](https:\u002F\u002Fjxnl.github.io\u002Finstructor-js)\n[![GitHub 
问题](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002Finstructor-ai\u002Finstructor-js.svg)](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fissues)\n[![Discord](https:\u002F\u002Fimg.shields.io\u002Fdiscord\u002F1192334452110659664?label=discord)](https:\u002F\u002Fdiscord.gg\u002FCV8sPM5k5Y)\n\n深入了解基于 TypeScript 的结构化提取，借助 OpenAI 的函数调用 API 和 Zod，以 TypeScript 优先的模式进行类型验证并支持静态类型推断。Instructor 因其简单性、透明性和以用户为中心的设计而脱颖而出。无论您是经验丰富的开发者还是刚刚入门，都会发现 Instructor 的方法直观且易于掌控。\n\n\n## 安装\n\n```bash\nbun add @instructor-ai\u002Finstructor zod openai\n```\n\n```bash\nnpm i @instructor-ai\u002Finstructor zod openai\n```\n\n```bash\npnpm add @instructor-ai\u002Finstructor zod openai\n```\n\n## 基本用法\n要了解提示和提取数据的所有技巧与窍门，请查看[文档](https:\u002F\u002Finstructor-ai.github.io\u002Finstructor-js\u002Ftips\u002Fprompting\u002F)。\n\n\n```typescript\n\nimport Instructor from \"@instructor-ai\u002Finstructor\";\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst oai = new OpenAI({\n  apiKey: process.env.OPENAI_API_KEY ?? undefined,\n  organization: process.env.OPENAI_ORG_ID ?? 
undefined\n})\n\nconst client = Instructor({\n  client: oai,\n  mode: \"TOOLS\"\n})\n\nconst UserSchema = z.object({\n  \u002F\u002F 描述将用于提示中\n  age: z.number().describe(\"用户的年龄\"), \n  name: z.string()\n})\n\n\n\u002F\u002F 用户将属于 z.infer\u003Ctypeof UserSchema> 类型\nconst user = await client.chat.completions.create({\n  messages: [{ role: \"user\", content: \"Jason Liu 是 30 岁\" }],\n  model: \"gpt-3.5-turbo\",\n  response_model: { \n    schema: UserSchema, \n    name: \"User\"\n  }\n})\n\nconsole.log(user)\n\u002F\u002F { age: 30, name: \"Jason Liu\" }\n```\n\n\n## API 参考\n\n### Instructor 类\n创建 Instructor 客户端的主要类。\n\n**createInstructor**\n```typescript\nfunction createInstructor\u003CC extends GenericClient | OpenAI>(args: {\n  client: OpenAILikeClient\u003CC>;\n  mode: Mode;\n  debug?: boolean;\n}): InstructorClient\u003CC>\n```\n\n创建 Instructor 类的实例。\n\n- client: 类似 OpenAI 的客户端。\n- mode: 运行模式。\n- debug: 是否记录调试信息。\n\n返回扩展后的 OpenAI 类似客户端。\n\n\n**chat.completions.create**\n```typescript\nchat.completions.create\u003C\n        T extends z.AnyZodObject,\n        P extends T extends z.AnyZodObject ? 
ChatCompletionCreateParamsWithModel\u003CT>\n        : ClientTypeChatCompletionParams\u003COpenAILikeClient\u003CC>> & { response_model: never }\n      >(\n        params: P\n      ): Promise\u003CReturnTypeBasedOnParams\u003Ctypeof this.client, P>>\n```\n当 params 中存在 response_model 时，根据提供的模式生成结构化提取的聊天完成；否则将代理回提供的客户端。\n\n- params: 包括响应模式模式的聊天完成参数。\n- 返回一个解析为根据模式提取的数据的 Promise。\n\n\n### 模式\n\nInstructor 支持不同的模式来定义语言模型响应的结构和格式。这些模式在 `zod-stream` 包中定义，具体如下：\n\n- `FUNCTIONS`（*已弃用*）：使用 OpenAI 的函数调用 API 生成响应。它映射到函数调用 API 所需的参数，包括 `function_call` 和 `functions` 属性。\n\n- `TOOLS`：使用 OpenAI 的工具规范生成响应。它构建了工具规范所需的参数，包括 `tool_choice` 和 `tools` 属性。\n\n- `JSON`：将 `response_format` 设置为 `json_object`，并在系统消息中包含 JSON 模式以指导响应生成。（Together & Anyscale）\n\n- `MD_JSON`：生成嵌入在 Markdown 代码块中的 JSON 格式响应。它在系统消息中包含 JSON 模式，并期望响应是一个有效的 JSON 对象，包裹在 Markdown 代码块中。\n\n- `JSON_SCHEMA`：使用“JSON 模式”生成符合所提供 JSON 模式的响应。它将 `response_format` 设置为 `json_object` 并包含所提供的模式，同时在系统消息中包含模式描述。\n\n\n## 示例\n\n### 流式完成\nInstructor 支持部分流式完成，允许您在模型生成响应时实时接收提取的数据。这有助于提供更富交互性的用户体验，或以增量方式处理大量数据。\n\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst textBlock = `\n  在我们最近的线上会议中，来自不同背景的参会者齐聚一堂，共同探讨即将召开的技术大会。 \n  参会者的姓名和联系方式如下：\n\n  - 姓名：John Doe，邮箱：johndoe@email.com，Twitter：@TechGuru44\n  - 姓名：Jane Smith，邮箱：janesmith@email.com，Twitter：@DigitalDiva88\n  - 姓名：Alex Johnson，邮箱：alexj@email.com，Twitter：@CodeMaster2023\n\n  会议期间，我们达成了若干关键共识。大会将于2024年3月15日在位于创新大道4521号的盛大科技竞技场举行。著名人工智能研究员艾米丽·约翰逊博士将担任我们的主旨演讲嘉宾。本次活动的预算定为5万美元，涵盖场地费用、演讲嘉宾酬金及推广活动等开支。\n\n  每位参会者需在2月20日前向大会博客提交一篇文章。后续会议定于1月25日下午3点（GMT）召开，以敲定会议议程并确认演讲嘉宾名单。\n`\n\nasync function extractData() {\n  const ExtractionSchema = z.object({\n    users: z.array(\n      z.object({\n        name: z.string(),\n        handle: z.string(),\n        twitter: z.string()\n      })\n    ).min(3),\n    location: z.string(),\n    budget: z.number()\n  })\n\n  const oai = new OpenAI({\n    apiKey: 
process.env.OPENAI_API_KEY ?? undefined,\n    organization: process.env.OPENAI_ORG_ID ?? undefined\n  })\n\n  const client = Instructor({\n    client: oai,\n    mode: \"TOOLS\"\n  })\n\n  const extractionStream = await client.chat.completions.create({\n    messages: [{ role: \"user\", content: textBlock }],\n    model: \"gpt-3.5-turbo\",\n    response_model: {\n      schema: ExtractionSchema,\n      name: \"Extraction\"\n    },\n    max_retries: 3,\n    stream: true\n  })\n\n  let extractedData = {}\n  for await (const result of extractionStream) {\n    extractedData = result\n    console.log(\"部分提取结果:\", result)\n  }\n  \n  console.log(\"最终提取结果:\", extractedData)\n}\n\nextractData()\n```\n\n在本示例中，我们使用 Zod 定义了一个 ExtractionSchema，用于指定我们要提取的数据结构。然后，我们创建了一个启用了流式处理的 Instructor 客户端，并将该模式传递给 response_model 参数。\n\nextractionStream 变量保存了一个异步生成器，当部分提取结果可用时，它会逐个返回这些结果。我们通过 for await...of 循环遍历这个流，每次获取一个部分结果后更新 extractedData 对象，并将其打印到控制台。\n\n最后，当流处理完毕后，我们打印出完整的提取数据。\n\n\n### 通过代理使用不同提供商\nInstructor 支持多种符合 OpenAI API 规范的提供商。您只需配置相应的客户端，并指定所需的模型和模式，即可轻松切换不同的提供商。\n\n**Anyscale**\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst UserSchema = z.object({\n  age: z.number(),\n  name: z.string().refine(name => name.includes(\" \"), {\n    message: \"姓名必须包含空格\"\n  })\n})\n\nasync function extractUser() {\n  const client = new OpenAI({\n    baseURL: \"https:\u002F\u002Fapi.endpoints.anyscale.com\u002Fv1\",\n    apiKey: process.env.ANYSCALE_API_KEY\n  })\n\n  const instructor = Instructor({\n    client: client,\n    mode: \"TOOLS\"\n  })\n\n  const user = await instructor.chat.completions.create({\n    messages: [{ role: \"user\", content: \"Jason Liu 是30岁\" }],\n    model: \"mistralai\u002FMixtral-8x7B-Instruct-v0.1\",\n    response_model: {\n      schema: UserSchema,\n      name: \"User\"\n    },\n    max_retries: 4\n  })\n\n  return user\n}\n\nconst anyscaleUser = await 
extractUser()\nconsole.log(\"Anyscale 用户:\", anyscaleUser)\n```\n\n**Together**\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport OpenAI from \"openai\"\nimport { z } from \"zod\"\n\nconst UserSchema = z.object({\n  age: z.number(),\n  name: z.string().refine(name => name.includes(\" \"), {\n    message: \"姓名必须包含空格\"\n  })\n})\n\nasync function extractUser() {\n  const client = new OpenAI({\n    baseURL: \"https:\u002F\u002Fapi.together.xyz\u002Fv1\",\n    apiKey: process.env.TOGETHER_API_KEY\n  })\n\n  const instructor = Instructor({\n    client: client,\n    mode: \"TOOLS\"\n  })\n\n  const user = await instructor.chat.completions.create({\n    messages: [{ role: \"user\", content: \"Jason Liu 是30岁\" }],\n    model: \"mistralai\u002FMixtral-8x7B-Instruct-v0.1\",\n    response_model: {\n      schema: UserSchema,\n      name: \"User\"\n    },\n    max_retries: 4\n  })\n\n  return user\n}\n\nconst togetherUser = await extractUser()\nconsole.log(\"Together 用户:\", togetherUser)\n```\n\n在这些示例中，我们指定了 Anyscale 和 Together 的特定基础 URL 和 API 密钥。\n\nextractUser 函数接受模型、模式和提供商作为参数。它会获取对应的提供商配置，创建 OpenAI 客户端，并用指定模式初始化 Instructor 实例。\n\n随后，我们调用 instructor.chat.completions.create 方法，传入所需的模型、响应模式及其他参数，以提取用户信息。\n\n通过在调用 extractUser 时改变提供商、模型和模式参数，您可以轻松地在不同提供商和配置之间切换。\n\n### 使用 llm-polyglot 与非 OpenAI 提供商\n\nInstructor 支持通过 @dimitrikennedy 维护的 [`llm-polyglot`](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fllm-client) 库，与不遵循 OpenAI SDK 的提供商集成，例如 Anthropic、Azure 和 Cohere。该库提供了一个统一的接口，用于与不同提供商的各种语言模型交互。\n\n```typescript\nimport { createLLMClient } from \"llm-polyglot\"\nimport Instructor from \"@instructor-ai\u002Finstructor\"\nimport { z } from \"zod\"\n\nconst anthropicClient = createLLMClient({\n  provider: \"anthropic\",\n  apiKey: process.env.ANTHROPIC_API_KEY\n})\n\nconst UserSchema = z.object({\n  age: z.number(),\n  name: z.string()\n})\n\nconst instructor = Instructor\u003Ctypeof 
anthropicClient>({\n  client: anthropicClient,\n  mode: \"TOOLS\"\n})\n\nasync function extractUser() {\n  const user = await instructor.chat.completions.create({\n    model: \"claude-3-opus-20240229\",\n    max_tokens: 1000,\n    messages: [\n      {\n        role: \"user\",\n        content: \"我的名字是迪米特里·肯尼迪。\"\n      }\n    ],\n    response_model: {\n      name: \"extract_name\",\n      schema: UserSchema\n    }\n  })\n\n  return user\n}\n\n\u002F\u002F 示例用法\nconst extractedUser = await extractUser()\nconsole.log(\"提取的用户:\", extractedUser)\n```\n\n在本示例中，我们使用 llm-polyglot 库中的 createLLMClient 函数为 Anthropic 提供商创建了一个客户端。我们将提供商名称（“anthropic”）和相应的 API 密钥传递给该函数。\n\n接下来，我们使用 Zod 定义了 UserSchema，以指定我们要提取的用户数据的结构。\n\n我们通过将 Anthropic 客户端和所需模式传递给 Instructor 函数来创建一个 Instructor 实例。请注意，我们使用 Instructor\u003Ctypeof anthropicClient> 显式指定客户端类型。\n\nextractUser 函数展示了如何使用 Instructor 实例从给定输入中提取用户信息。我们调用 instructor.chat.completions.create，传入适当的模型（本例中为 “claude-3-opus-20240229”）、参数以及包含我们 UserSchema 的 response_model。\n\n最后，我们记录提取的用户信息。\n\n借助 llm-polyglot 库，Instructor 能够无缝集成多种提供商，而不仅限于那些遵循 OpenAI SDK 的提供商。这使您能够充分利用不同提供商提供的独特功能和模型，同时仍受益于 Instructor 的结构化提取和验证功能。\n\n如需更多支持以及有关使用其他提供商与 [llm-polyglot](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fllm-client) 的信息，请参阅该库的文档和示例。\n\n\n## 更多示例\n\n如果您想了解更多，请查看我们的[食谱](docs\u002Fexamples\u002Findex.md)。\n\n[安装 Instructor](docs\u002Finstallation.md) 非常简单。\n\n\n## 基于 Island AI 构建\n\nInstructor 基于 [Island AI](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai) 工具包中的多个强大软件包构建，这些软件包由 [Dimitri Kennedy](https:\u002F\u002Ftwitter.com\u002Fdimitrikennedy) 开发并维护。这些软件包提供了处理结构化数据和使用大型语言模型进行流式处理的关键功能。\n\n### zod-stream\n\n[zod-stream](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fzod-stream) 是一个直接与 LLM 流对接的客户端模块。它利用 Schema-Stream 进行高效解析，并配备了处理 OpenAI 原始响应的工具，按模式（函数、工具、JSON 等）对响应进行分类，同时确保正确的错误处理和流转换。它非常适合用于交付结构化 LLM 响应流的 API 集成。\n\n### 
schema-stream\n\n[schema-stream](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fschema-stream) 是一个 JSON 流式解析器，它根据 Zod 模式逐步构建和更新响应模型。它专为实时数据处理和增量模型填充而设计。\n\n\n### llm-polyglot\n\n[llm-polyglot](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai\u002Ftree\u002Fmain\u002Fpublic-packages\u002Fllm-client) 是一个库，提供了一个统一的接口，用于与不同提供商的各种语言模型交互，例如 OpenAI、Anthropic、Azure 和 Cohere。它简化了与多个 LLM 提供商合作的过程，并实现了与 Instructor 的无缝集成。\n\nInstructor 利用了这些 Island AI 软件包的强大功能，为使用 LLM 进行结构化数据提取和流式处理提供了无缝且高效的体验。Dimitri Kennedy（Island AI 的创建者）与 Jason Liu（原始 Instructor Python 包的作者）的合作促成了 Instructor 的 TypeScript 版本的开发，该版本引入了来自 LLM 的部分 JSON 流的概念。\n\n如需了解有关 Island AI 及其软件包的更多信息，请参阅 [Island AI 仓库](https:\u002F\u002Fgithub.com\u002Fhack-dance\u002Fisland-ai)。\n\n\n## 为什么使用 Instructor？\n\n选择使用 Instructor 的理由，本质上就是选择使用 Zod 的理由。\n\n1. **兼容 OpenAI SDK** — Instructor 遵循 OpenAI 的 API。这意味着您可以在支持 OpenAI API 的多个提供商之间，对提示和提取使用相同的 API。\n\n2. **可定制性** — Zod 具有高度可定制性。您可以定义自己的验证器、自定义错误消息等。\n\n3. **生态系统** — Zod 是 TypeScript 中使用最广泛的数据验证库。\n\n4. 
**久经考验** — Zod 每月下载量超过 2400 万次，并拥有庞大的贡献者社区支持。\n\n## 贡献\n\n如果您想帮忙，请查看标记为 `good-first-issue` 或 `help-wanted` 的一些问题。这些问题位于 [这里](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Flabels\u002Fgood%20first%20issue)。它们可以是代码改进、客座博客文章或新的食谱。\n\n请查看[贡献指南](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fblob\u002Fmain\u002FCONTRIBUTING.md)，了解如何设置环境、测试、变更集和相关规范。\n\n> ℹ️ **提示：** 支持其他语言\n>\n> 请查看以下其他语言的移植版本：\n>\n> - [Python](https:\u002F\u002Fwww.github.com\u002Fjxnl\u002Finstructor)\n> - [Elixir](https:\u002F\u002Fgithub.com\u002Fthmsmlr\u002Finstructor_ex\u002F)\n>\n> 如果您想将 Instructor 移植到另一种语言，请在 [Twitter](https:\u002F\u002Ftwitter.com\u002Fjxnlco) 上联系我们，我们很乐意帮助您入门！\n\n## 许可证\n\n本项目采用 MIT 许可证条款授权。","# instructor-js 中文快速上手指南\n\n## 环境准备\n\n- **系统要求**：Node.js 16+（推荐 18+）\n- **前置依赖**：\n  - OpenAI API 密钥（可从 [platform.openai.com](https:\u002F\u002Fplatform.openai.com\u002F) 获取）\n  - 可选：支持 OpenAI API 的国内代理（如 [https:\u002F\u002Fapi.openai.com.cn](https:\u002F\u002Fapi.openai.com.cn)）用于加速访问\n\n## 安装步骤\n\n使用任意包管理器安装依赖：\n\n```bash\nbun add @instructor-ai\u002Finstructor zod openai\n```\n\n```bash\nnpm i @instructor-ai\u002Finstructor zod openai\n```\n\n```bash\npnpm add @instructor-ai\u002Finstructor zod openai\n```\n\n> ✅ 推荐使用 `pnpm` 或 `bun` 以获得更快的安装速度和更小的 node_modules。\n\n## 基本使用\n\n```typescript\nimport Instructor from \"@instructor-ai\u002Finstructor\";\nimport OpenAI from \"openai\";\nimport { z } from \"zod\";\n\nconst oai = new OpenAI({\n  apiKey: process.env.OPENAI_API_KEY ?? undefined,\n  organization: process.env.OPENAI_ORG_ID ?? 
undefined\n});\n\nconst client = Instructor({\n  client: oai,\n  mode: \"TOOLS\"\n});\n\nconst UserSchema = z.object({\n  age: z.number().describe(\"用户年龄\"),\n  name: z.string()\n});\n\nconst user = await client.chat.completions.create({\n  messages: [{ role: \"user\", content: \"张三今年 28 岁\" }],\n  model: \"gpt-3.5-turbo\",\n  response_model: { \n    schema: UserSchema, \n    name: \"User\"\n  }\n});\n\nconsole.log(user);\n\u002F\u002F 输出：{ age: 28, name: \"张三\" }\n```","一家跨境电商公司运营团队每天需从上千条客户邮件中提取用户姓名、订单号、退货原因等结构化数据，用于自动处理售后工单。团队此前依赖人工抄录或正则匹配，效率低且易出错。\n\n### 没有 instructor-js 时\n- 每条邮件需人工阅读并复制粘贴到Excel，每人每天最多处理150条，严重拖慢响应速度。\n- 正则表达式无法识别“我想要退货，因为尺码不合适”这类自然语言表达，漏提率高达30%。\n- 数据格式混乱，有时“年龄”写成“30岁”，有时是“thirty”，手动清洗耗时且容易引入错误。\n- 后端接口要求严格JSON结构，但提取结果常缺字段或类型错误，导致API调用频繁失败。\n- 团队没有专职数据工程师，开发自研提取系统成本高、维护难，项目长期停滞。\n\n### 使用 instructor-js 后\n- 只需一行代码调用，即可从任意邮件内容中精准提取{name, order_id, reason}结构化数据，处理速度提升至每分钟50条。\n- 基于Zod Schema定义字段规则，自动识别“尺码不对”“颜色不喜欢”等语义，准确率提升至98%以上。\n- 类型安全保障：返回数据自动校验为{string, string, string}，杜绝空值或数字误写为文本，后端无需额外校验。\n- 开发者只需定义Schema，无需编写复杂提示词工程，新人30分钟即可接入系统。\n- 与现有Node.js服务无缝集成，无需引入新语言或复杂框架，部署成本几乎为零。\n\ninstructor-js 让非AI专家的开发团队，用几行TypeScript代码实现了原本需要专业NLP团队半年才能建成的结构化数据提取能力。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002F567-labs_instructor-js_ea7f6ab6.png","567-labs","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002F567-labs_08acc0e9.png",null,"https:\u002F\u002Fgithub.com\u002F567-labs",[80,84,88,92],{"name":81,"color":82,"percentage":83},"TypeScript","#3178c6",87.8,{"name":85,"color":86,"percentage":87},"HTML","#e34c26",10.6,{"name":89,"color":90,"percentage":91},"JavaScript","#f1e05a",1.3,{"name":93,"color":94,"percentage":95},"Shell","#89e051",0.2,789,73,"2026-04-04T03:15:40","MIT","Linux, macOS, Windows","未说明",{"notes":103,"python":101,"dependencies":104},"该工具为 TypeScript\u002FJavaScript 库，依赖 OpenAI 兼容的 API 服务（如 OpenAI、Anyscale、Together、Anthropic 等），无需本地部署模型，运行环境基于 Node.js，建议使用 Bun、npm 或 pnpm 管理依赖，需配置有效的 API 
密钥。",[105,106,107],"@instructor-ai\u002Finstructor","zod","openai",[13,26],[110,107,106],"llm",4,"2026-03-27T02:49:30.150509","2026-04-06T05:36:44.408973",[115,120,125,130,135,140],{"id":116,"question_zh":117,"answer_zh":118,"source_url":119},8861,"如何在 instructor-js 中使用 Groq API？","直接使用 groq-sdk 初始化客户端，并传入 Instructor，无需修改 baseURL。示例代码：import Instructor from '@instructor-ai\u002Finstructor'; import Groq from 'groq-sdk'; const groq = new Groq({ apiKey: GROQ_API_KEY }); const client = Instructor({ client: groq, mode: 'TOOLS' });","https:\u002F\u002Fgithub.com\u002F567-labs\u002Finstructor-js\u002Fissues\u002F170",{"id":121,"question_zh":122,"answer_zh":123,"source_url":124},8862,"使用 Zod 的 .default() 时出现 'Unsupported type: ZodDefault' 错误怎么办？","该问题已修复，升级到最新版本即可。此前是底层 schema-stream 库不支持 ZodDefault，现已添加支持。确保安装最新版 @instructor-ai\u002Finstructor 和相关依赖。","https:\u002F\u002Fgithub.com\u002F567-labs\u002Finstructor-js\u002Fissues\u002F84",{"id":126,"question_zh":127,"answer_zh":128,"source_url":129},8863,"运行示例时出现 'TypeError: g is not a constructor' 错误如何解决？","该错误通常由 ES 模块解析问题引起，尤其是在使用 ts-node 或 TSX 时。解决方案：升级依赖包以明确导出配置，或改用 bun 运行。推荐使用 bun 安装并运行项目，或确保项目中所有上游依赖都正确声明了 exports 字段。","https:\u002F\u002Fgithub.com\u002F567-labs\u002Finstructor-js\u002Fissues\u002F87",{"id":131,"question_zh":132,"answer_zh":133,"source_url":134},8864,"如何为 instructor-js 设置测试和 CI？","可参考 Python 版本的测试结构（https:\u002F\u002Fgithub.com\u002Fjxnl\u002Finstructor\u002Ftree\u002Fmain\u002Ftests\u002Fopenai）编写测试用例，使用 Jest 或 Vitest 覆盖不同 model 和 mode 的组合。建议添加 CircleCI 或 GitHub Actions 实现自动化测试和覆盖率统计，提升代码质量。","https:\u002F\u002Fgithub.com\u002F567-labs\u002Finstructor-js\u002Fissues\u002F5",{"id":136,"question_zh":137,"answer_zh":138,"source_url":139},8865,"instructor-js 是否已发布到 npm？","是的，已发布。包名为 @instructor-ai\u002Finstructor。可通过 npm install @instructor-ai\u002Finstructor 安装。早期曾因 npm 包名 'instructor' 
被占用而延迟，现已解决。","https:\u002F\u002Fgithub.com\u002F567-labs\u002Finstructor-js\u002Fissues\u002F7",{"id":141,"question_zh":142,"answer_zh":143,"source_url":119},8866,"如何调试 instructor-js 的请求内容？","可通过重写 global.fetch 来捕获原始请求体。示例代码：const originalFetch = global.fetch; global.fetch = async (url, options) => { console.log('Fetch Request:', options.body); return await originalFetch(url, options); }; 可用于排查 JSON 解析或 API 通信问题。",[145,150,155,160,165,170,175,180,185,190,195,200,205,210,215,220,225,230],{"id":146,"version":147,"summary_zh":148,"released_at":149},106228,"v1.7.0","### Minor Changes\n\n-   [#195](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F195) [`3aac90e`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F3aac90e7e965c0769d4ff98aadad78b2aada948b) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - adding a new mode to support parsing thinking blocks out of markdown json responses (R1)\n","2025-01-27T23:44:43",{"id":151,"version":152,"summary_zh":153,"released_at":154},106229,"v1.6.0","### Minor Changes\n\n-   [#193](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F193) [`da449de`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fda449de837389ccc30584b8954a64e3cafa22832) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - update peer deps + remove baseUrl check on generic client type guard\n\n-   [#191](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F191) [`bf240b2`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fbf240b27c893ee3e6da8f09e1d9c14004eb0d604) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- Updating core openai\u002Fzod stream\u002Fzod\u002Fanthropic dependencies to latest - updating tests and support validation to better handle changes without warning unnecessarily.\n\n    Peer dependencies will likely need to be updated to match the latest.\n","2025-01-14T04:38:34",{"id":156,"version":157,"summary_zh":158,"released_at":159},106230,"v1.5.0","### Minor Changes\n\n-   [#179](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F179) [`1b56bd1`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F1b56bd18e0c7c02da94ee70e7837a155c3502b5c) Thanks [@morgante](https:\u002F\u002Fgithub.com\u002Fmorgante)! - Restore CommonJS compatibility for OpenAI streaming\n\n### Patch Changes\n\n-   [`9486edb`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F9486edb470295067ee7a537fc409132dceba5d10) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - update mjs import from openai\n","2024-06-16T23:20:45",{"id":161,"version":162,"summary_zh":163,"released_at":164},106231,"v1.4.0","### Minor Changes\r\n\r\n-   [#182](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F182) [`0a5bbd8`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F0a5bbd8082915bcc8c4686d34fec5d5f034ebd9c) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- update client types to better support non oai clients + updates to allow for passing usage properties into meta from non-oai clients\r\n- [#177](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F177) - add new option for providing custom logger + add new option for retrying on any error\r\n","2024-06-13T03:50:47",{"id":166,"version":167,"summary_zh":168,"released_at":169},106232,"v1.3.0","### Minor Changes\n\n-   [#176](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F176) [`6dd4255`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F6dd42554e89d36c93132eace2dd67951297831bd) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - add ability to include usage from streams by teeing stream when option is present\n\n-   [#177](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F177) [`09f04d1`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F09f04d1ff7a943679a7c49e4b20a23827cbdaae4) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - add new option for providing custom logger\n    add new option for retrying on any error\n","2024-05-17T17:09:29",{"id":171,"version":172,"summary_zh":173,"released_at":174},106233,"v1.2.1","### Patch Changes\n\n-   [#166](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F166) [`ddfe257`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fddfe2572c672708fb9ad20ad6726cb3af07c5148) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- make sure we pass through \\_meta on non stream completions\n","2024-04-21T03:14:00",{"id":176,"version":177,"summary_zh":178,"released_at":179},106234,"v1.2.0","### Minor Changes\n\n-   [#164](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F164) [`6942d65`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F6942d652b7750fac4306c4d713399cdc03e86a9b) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - adding request option pass through + handling non validation errors a little bit better and not retrying if not validation error specifically\n","2024-04-21T02:36:07",{"id":181,"version":182,"summary_zh":183,"released_at":184},106235,"v1.1.2","### Patch Changes\n\n-   [#162](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F162) [`287aa27`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F287aa27d92450d73dd300de7e84927d94cae9220) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - add groq to supported providers - remove error on validation and warn instead so we dont fail if we are out of date on the mappings\n","2024-04-20T02:39:40",{"id":186,"version":187,"summary_zh":188,"released_at":189},106236,"v1.1.1","### Patch Changes\n\n-   [#157](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F157) [`c272342`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fc272342c9baa8631990afa66bcb7dafb3c81f78b) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- updates zod-stream dep to get control character filtering on the raw stream\n","2024-04-10T17:22:37",{"id":191,"version":192,"summary_zh":193,"released_at":194},106237,"v1.1.0","### Minor Changes\n\n-   [#153](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F153) [`76ef059`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F76ef0591a1e34b73923d0c21afcf9e09e99b6b7c) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - updated client types to be more flexible - added tests for latest anthropic updates and llm-polyglot major\n","2024-04-07T04:05:31",{"id":196,"version":197,"summary_zh":198,"released_at":199},106238,"v1.0.0","### Major Changes\n\n-   [#144](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F144) [`d0275ff`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fd0275ff3b91d87d05a72c98001a49222e3cba348) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - updating all types to better support non openai clients - this changes some of the previously exported types and adds a few new ones\n\n-   [#125](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F125) [`c205286`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fc205286dccdbc6feacfd2aeeca0e0ba449631a57) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- Updating zod-stream major and stream output types - this change moves the internal properties tacked onto the stream output from many \_properties to one \_meta object with the properties nested - this also adds explicit types so when used in ts projects it doesn't yell.\n\n### Minor Changes\n\n-   [#132](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F132) [`f65672c`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Ff65672cfe443e37cb32ee721aa406ca093125ffb) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - adding meta to standard completions as well and including usage - also added more verbose debug logs and new provider specific transformers to handle discrepancies in various apis\n","2024-03-26T16:04:03",{"id":201,"version":202,"summary_zh":203,"released_at":204},106239,"v0.0.7","### Patch Changes\n\n-   [#123](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F123) [`70d3874`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F70d38747339a33ecca2d60c75140db3c200260fc) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - updating zod-stream\u002Fschema-stream to pick up on updates to enums and handling better defaults\n","2024-02-23T17:15:41",{"id":206,"version":207,"summary_zh":208,"released_at":209},106240,"v0.0.6","### Patch Changes\n\n-   [#104](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F104) [`95aa27f`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F95aa27f75a6ac719b1640eee1c48c5861573defc) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- explicit check for oai url vs falling through to other\n","2024-02-14T03:35:12",{"id":211,"version":212,"summary_zh":213,"released_at":214},106241,"v0.0.5","### Patch Changes\n\n-   [#99](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F99) [`c9ab910`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fc9ab9104e554e4f24b55f69cf24b784091c7bfb1) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - Adding explicit support for non-oai providers - currently anyscale and together ai - will do explicit checks on mode selected vs provider and model\n\n-   [#97](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F97) [`c7aec7c`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fc7aec7c072aaa6921a30995332a9fb61938dce9d) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - Fixing inference on stream types when using npm or pnpm\n","2024-02-01T04:14:23",{"id":216,"version":217,"summary_zh":218,"released_at":219},106242,"v0.0.4","### Patch Changes\n\n-   [#90](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F90) [`771d175`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F771d1750361b409ed8a59adfdf79a29174b67c87) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - Updating build and exports for wider range of support\n","2024-01-29T02:05:26",{"id":221,"version":222,"summary_zh":223,"released_at":224},106243,"v0.0.3","### Patch Changes\n\n-   [#86](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F86) [`205c6cb`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F205c6cbc4e276b792953352e546ada356467aab5) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! 
- update zodstream and schema stream to support zod defaults\n\n-   [#74](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F74) [`f93d93b`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Ff93d93b7553af81a727bd8783d18c2901bb0d11a) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - Type updates\n","2024-01-23T18:13:20",{"id":226,"version":227,"summary_zh":228,"released_at":229},106244,"v0.0.2","### Patch Changes\n\n-   [#66](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F66) [`dc22633`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fdc226330a57ee5b06ff1ee44a2ad7c4526f5796d) Thanks [@ethanleifer](https:\u002F\u002Fgithub.com\u002Fethanleifer)! - Cleanup Types, make response_model.name required and rely on inference\n\n-   [#72](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F72) [`265a9e5`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002F265a9e5fd2d8b0fdeaa98ee8b3ee3c27fa1c6a2b) Thanks [@ethanleifer](https:\u002F\u002Fgithub.com\u002Fethanleifer)! - Implements testing for typescript inference\n","2024-01-08T20:38:56",{"id":231,"version":232,"summary_zh":233,"released_at":234},106245,"v0.0.1","### Patch Changes\n\n-   [#62](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fpull\u002F62) [`a939ff5`](https:\u002F\u002Fgithub.com\u002Finstructor-ai\u002Finstructor-js\u002Fcommit\u002Fa939ff5713c4b90437a73e62e83f8c713ac0a782) Thanks [@roodboi](https:\u002F\u002Fgithub.com\u002Froodboi)! - V0 release\n","2024-01-07T02:45:05"]