[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-mistralai--client-python":3,"tool-mistralai--client-python":62},[4,18,26,36,46,54],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",158594,2,"2026-04-16T23:34:05",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":42,"last_commit_at":43,"category_tags":44,"status":17},8272,"opencode","anomalyco\u002Fopencode","OpenCode 是一款开源的 AI 编程助手（Coding Agent），旨在像一位智能搭档一样融入您的开发流程。它不仅仅是一个代码补全插件，而是一个能够理解项目上下文、自主规划任务并执行复杂编码操作的智能体。无论是生成全新功能、重构现有代码，还是排查难以定位的 Bug，OpenCode 都能通过自然语言交互高效完成，显著减少开发者在重复性劳动和上下文切换上的时间消耗。\n\n这款工具专为软件开发者、工程师及技术研究人员设计，特别适合希望利用大模型能力来提升编码效率、加速原型开发或处理遗留代码维护的专业人群。其核心亮点在于完全开源的架构，这意味着用户可以审查代码逻辑、自定义行为策略，甚至私有化部署以保障数据安全，彻底打破了传统闭源 AI 助手的“黑盒”限制。\n\n在技术体验上，OpenCode 提供了灵活的终端界面（Terminal UI）和正在测试中的桌面应用程序，支持 macOS、Windows 及 Linux 全平台。它兼容多种包管理工具，安装便捷，并能无缝集成到现有的开发环境中。无论您是追求极致控制权的资深极客，还是渴望提升产出的独立开发者，OpenCode 都提供了一个透明、可信",144296,1,"2026-04-16T14:50:03",[13,45],"插件",{"id":47,"name":48,"github_repo":49,"description_zh":50,"stars":51,"difficulty_score":32,"last_commit_at":52,"category_tags":53,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 
都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",108322,"2026-04-10T11:39:34",[14,15,13],{"id":55,"name":56,"github_repo":57,"description_zh":58,"stars":59,"difficulty_score":32,"last_commit_at":60,"category_tags":61,"status":17},6121,"gemini-cli","google-gemini\u002Fgemini-cli","gemini-cli 是一款由谷歌推出的开源 AI 命令行工具，它将强大的 Gemini 大模型能力直接集成到用户的终端环境中。对于习惯在命令行工作的开发者而言，它提供了一条从输入提示词到获取模型响应的最短路径，无需切换窗口即可享受智能辅助。\n\n这款工具主要解决了开发过程中频繁上下文切换的痛点，让用户能在熟悉的终端界面内直接完成代码理解、生成、调试以及自动化运维任务。无论是查询大型代码库、根据草图生成应用，还是执行复杂的 Git 操作，gemini-cli 都能通过自然语言指令高效处理。\n\n它特别适合广大软件工程师、DevOps 人员及技术研究人员使用。其核心亮点包括支持高达 100 万 token 的超长上下文窗口，具备出色的逻辑推理能力；内置 Google 搜索、文件操作及 Shell 命令执行等实用工具；更独特的是，它支持 MCP（模型上下文协议），允许用户灵活扩展自定义集成，连接如图像生成等外部能力。此外，个人谷歌账号即可享受免费的额度支持，且项目基于 Apache 2.0 协议完全开源，是提升终端工作效率的理想助手。",100752,"2026-04-10T01:20:03",[45,13,15,14],{"id":63,"github_repo":64,"name":65,"description_en":66,"description_zh":67,"ai_summary_zh":67,"readme_en":68,"readme_zh":69,"quickstart_zh":70,"use_case_zh":71,"hero_image_url":72,"owner_login":73,"owner_name":74,"owner_avatar_url":75,"owner_bio":74,"owner_company":76,"owner_location":76,"owner_email":77,"owner_twitter":76,"owner_website":78,"owner_url":79,"languages":80,"stars":93,"forks":94,"last_commit_at":95,"license":96,"difficulty_score":32,"env_os":97,"env_gpu":98,"env_ram":98,"env_deps":99,"category_tags":106,"github_topics":76,"view_count":32,"oss_zip_url":76,"oss_zip_packed_at":76,"status":17,"created_at":107,"updated_at":108,"faqs":109,"releases":139},8122,"mistralai\u002Fclient-python","client-python","Python client library for Mistral AI platform","client-python 是 Mistral AI 官方推出的 Python 客户端库，旨在帮助开发者轻松接入 Mistral 大模型平台。它封装了复杂的 API 调用细节，让使用者只需几行代码即可调用强大的聊天补全（Chat Completion）和文本嵌入（Embeddings）功能，从而快速构建智能应用或进行算法验证。\n\n这款工具主要解决了开发者在集成大模型时面临的配置繁琐、请求处理复杂以及错误重试机制缺失等痛点。通过提供简洁的接口，它支持流式响应（Server-sent events）、自动分页、文件上传及灵活的重试策略，显著提升了开发效率与系统稳定性。此外，它还兼容 uv、pip 和 Poetry 等多种主流包管理工具，并针对 Agent 开发提供了额外的依赖支持。\n\nclient-python 非常适合 Python 开发者、AI 
研究人员以及需要在大模型基础上进行二次创作的技术团队使用。无论是希望快速原型验证的研究者，还是致力于生产环境部署的工程师，都能借助其现代化的设计和完善的文档，顺畅地将 Mistral 的先进能力融入自己的项目中。","# Mistral Python Client\n\n## Migrating from v1\n\nIf you are upgrading from v1 to v2, check the [migration guide](https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fblob\u002Fmain\u002FMIGRATION.md) for details on breaking changes and how to update your code.\n\n## API Key Setup\n\nBefore you begin, you will need a Mistral AI API key.\n\n1. Get your own Mistral API Key: \u003Chttps:\u002F\u002Fdocs.mistral.ai\u002F#api-access>\n2. Set your Mistral API Key as an environment variable. You only need to do this once.\n\n```bash\n# set Mistral API Key (using zsh for example)\n$ echo 'export MISTRAL_API_KEY=[your_key_here]' >> ~\u002F.zshenv\n\n# reload the environment (or just quit and open a new terminal)\n$ source ~\u002F.zshenv\n```\n\n\u003C!-- Start Summary [summary] -->\n## Summary\n\nMistral AI API: Our Chat Completion and Embeddings APIs specification. Create your account on [La Plateforme](https:\u002F\u002Fconsole.mistral.ai) to get access and read the [docs](https:\u002F\u002Fdocs.mistral.ai) to learn how to use it.\n\u003C!-- End Summary [summary] -->\n\n\u003C!-- Start Table of Contents [toc] -->\n## Table of Contents\n\u003C!-- $toc-max-depth=2 -->\n* [Mistral Python Client](#mistral-python-client)\n  * [Migrating from v1](#migrating-from-v1)\n  * [API Key Setup](#api-key-setup)\n  * [SDK Installation](#sdk-installation)\n  * [SDK Example Usage](#sdk-example-usage)\n  * [Providers' SDKs Example Usage](#providers-sdks-example-usage)\n  * [Available Resources and Operations](#available-resources-and-operations)\n  * [Server-sent event streaming](#server-sent-event-streaming)\n  * [Pagination](#pagination)\n  * [File uploads](#file-uploads)\n  * [Retries](#retries)\n  * [Error Handling](#error-handling)\n  * [Server Selection](#server-selection)\n  * [Custom HTTP Client](#custom-http-client)\n  * [Authentication](#authentication)\n  * 
[Resource Management](#resource-management)\n  * [Debugging](#debugging)\n  * [IDE Support](#ide-support)\n* [Development](#development)\n  * [Contributions](#contributions)\n\n\u003C!-- End Table of Contents [toc] -->\n\n\u003C!-- Start SDK Installation [installation] -->\n## SDK Installation\n\n> [!NOTE]\n> **Python version upgrade policy**\n>\n> Once a Python version reaches its [official end of life date](https:\u002F\u002Fdevguide.python.org\u002Fversions\u002F), a 3-month grace period is provided for users to upgrade. Following this grace period, the minimum python version supported in the SDK will be updated.\n\nThe SDK can be installed with *uv*, *pip*, or *poetry* package managers.\n\n### uv\n\n*uv* is a fast Python package installer and resolver, designed as a drop-in replacement for pip and pip-tools. It's recommended for its speed and modern Python tooling capabilities.\n\n```bash\nuv add mistralai\n```\n\n### PIP\n\n*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.\n\n```bash\npip install mistralai\n```\n\n### Poetry\n\n*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.\n\n```bash\npoetry add mistralai\n```\n\n### Shell and script usage with `uv`\n\nYou can use this SDK in a Python shell with [uv](https:\u002F\u002Fdocs.astral.sh\u002Fuv\u002F) and the `uvx` command that comes with it like so:\n\n```shell\nuvx --from mistralai python\n```\n\nIt's also possible to write a standalone Python script without needing to set up a whole project like so:\n\n```python\n#!\u002Fusr\u002Fbin\u002Fenv -S uv run --script\n# \u002F\u002F\u002F script\n# requires-python = \">=3.10\"\n# dependencies = [\n#     \"mistralai\",\n# ]\n# \u002F\u002F\u002F\n\nfrom mistralai.client import Mistral\n\nsdk = Mistral(\n  # SDK arguments\n)\n\n# Rest of script 
here...\n```\n\nOnce that is saved to a file, you can run it with `uv run script.py` where\n`script.py` can be replaced with the actual file name.\n\u003C!-- End SDK Installation [installation] -->\n\n### Agents extra dependencies\n\nWhen using the agent-related features, you need to add the `agents` extra dependencies. This can be done when\ninstalling the package:\n\n```bash\npip install "mistralai[agents]"\n```\n\n> Note: These features require Python 3.10+ (the SDK minimum).\n\n### Additional packages\n\nAdditional `mistralai-*` packages (e.g. `mistralai-workflows`) can be installed separately and are available under the `mistralai` namespace:\n\n```bash\npip install mistralai-workflows\n```\n\n\u003C!-- Start SDK Example Usage [usage] -->\n## SDK Example Usage\n\n### Create Chat Completions\n\nThis example shows how to create chat completions.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv("MISTRAL_API_KEY", ""),\n) as mistral:\n\n    res = mistral.chat.complete(model="mistral-large-latest", messages=[\n        {\n            "role": "user",\n            "content": "Who is the best French painter? Answer in one short sentence.",\n        },\n    ], stream=False, response_format={\n        "type": "text",\n    })\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv("MISTRAL_API_KEY", ""),\n    ) as mistral:\n\n        res = await mistral.chat.complete_async(model="mistral-large-latest", messages=[\n            {\n                "role": "user",\n                "content": "Who is the best French painter? 
Answer in one short sentence.\",\n            },\n        ], stream=False, response_format={\n            \"type\": \"text\",\n        })\n\n        # Handle response\n        print(res)\n\nasyncio.run(main())\n```\n\n### Upload a file\n\nThis example shows how to upload a file.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.files.upload(file={\n        \"file_name\": \"example.file\",\n        \"content\": open(\"example.file\", \"rb\"),\n    }, visibility=\"workspace\")\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.files.upload_async(file={\n            \"file_name\": \"example.file\",\n            \"content\": open(\"example.file\", \"rb\"),\n        }, visibility=\"workspace\")\n\n        # Handle response\n        print(res)\n\nasyncio.run(main())\n```\n\n### Create Agents Completions\n\nThis example shows how to create agents completions.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.agents.complete(messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"Who is the best French painter? 
Answer in one short sentence.\",\n        },\n    ], agent_id=\"\u003Cid>\", stream=False, response_format={\n        \"type\": \"text\",\n    })\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.agents.complete_async(messages=[\n            {\n                \"role\": \"user\",\n                \"content\": \"Who is the best French painter? Answer in one short sentence.\",\n            },\n        ], agent_id=\"\u003Cid>\", stream=False, response_format={\n            \"type\": \"text\",\n        })\n\n        # Handle response\n        print(res)\n\nasyncio.run(main())\n```\n\n### Create Embedding Request\n\nThis example shows how to create an embedding request.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.embeddings.create(model=\"mistral-embed\", inputs=[\n        \"Embed this sentence.\",\n        \"As well as this one.\",\n    ])\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.embeddings.create_async(model=\"mistral-embed\", inputs=[\n            \"Embed this sentence.\",\n            \"As well as this one.\",\n        ])\n\n        # Handle response\n        
print(res)\n\nasyncio.run(main())\n```\n\u003C!-- End SDK Example Usage [usage] -->\n\n\n### More examples\n\nYou can run the examples in the `examples\u002F` directory using `uv run`.\n\n\n## Providers' SDKs Example Usage\n\n### Azure AI\n\n**Prerequisites**\n\nBefore you begin, ensure you have `AZURE_ENDPOINT` and an `AZURE_API_KEY`. To obtain these, you will need to deploy Mistral on Azure AI.\nSee [instructions for deploying Mistral on Azure AI here](https:\u002F\u002Fdocs.mistral.ai\u002Fdeployment\u002Fcloud\u002Fazure\u002F).\n\n**Step 1: Install**\n\n```bash\npip install mistralai\n```\n\n**Step 2: Example Usage**\n\nHere's a basic example to get you started. You can also run [the example in the `examples` directory](\u002Fexamples\u002Fazure).\n\n```python\nimport os\nfrom mistralai.azure.client import MistralAzure\n\n# The SDK automatically injects api-version as a query parameter\nclient = MistralAzure(\n    api_key=os.environ[\"AZURE_API_KEY\"],\n    server_url=os.environ[\"AZURE_ENDPOINT\"],\n    api_version=\"2024-05-01-preview\",  # Optional, this is the default\n)\n\nres = client.chat.complete(\n    model=os.environ[\"AZURE_MODEL\"],\n    messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"Hello there!\",\n        }\n    ],\n)\nprint(res.choices[0].message.content)\n```\n\n### Google Cloud\n\n\n**Prerequisites**\n\nBefore you begin, you will need to create a Google Cloud project and enable the Mistral API. To do this, follow the instructions [here](https:\u002F\u002Fdocs.mistral.ai\u002Fdeployment\u002Fcloud\u002Fvertex\u002F).\n\nTo run this locally you will also need to ensure you are authenticated with Google Cloud. You can do this by running\n\n```bash\ngcloud auth application-default login\n```\n\n**Step 1: Install**\n\n```bash\npip install mistralai\n# For GCP authentication support (required):\npip install \"mistralai[gcp]\"\n```\n\n**Step 2: Example Usage**\n\nHere's a basic example to get you started. 
You can also run [the example in the `examples` directory](\u002Fexamples\u002Fgcp).\n\nThe SDK automatically:\n- Detects credentials via `google.auth.default()`\n- Auto-refreshes tokens when they expire\n- Builds the Vertex AI URL from `project_id` and `region`\n\n```python\nimport os\nfrom mistralai.gcp.client import MistralGCP\n\n# The SDK auto-detects credentials and builds the Vertex AI URL\nclient = MistralGCP(\n    project_id=os.environ.get(\"GCP_PROJECT_ID\"),  # Optional: auto-detected from credentials\n    region=\"us-central1\",  # Default: europe-west4\n)\n\nres = client.chat.complete(\n    model=\"mistral-small-2503\",\n    messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"Hello there!\",\n        }\n    ],\n)\nprint(res.choices[0].message.content)\n```\n\n\n\u003C!-- Start Available Resources and Operations [operations] -->\n## Available Resources and Operations\n\n\u003Cdetails open>\n\u003Csummary>Available methods\u003C\u002Fsummary>\n\n### [Agents](docs\u002Fsdks\u002Fagents\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Fagents\u002FREADME.md#complete) - Agents Completion\n* [stream](docs\u002Fsdks\u002Fagents\u002FREADME.md#stream) - Stream Agents completion\n\n### [Audio.Speech](docs\u002Fsdks\u002Fspeech\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Fspeech\u002FREADME.md#complete) - Speech\n\n### [Audio.Transcriptions](docs\u002Fsdks\u002Ftranscriptions\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Ftranscriptions\u002FREADME.md#complete) - Create Transcription\n* [stream](docs\u002Fsdks\u002Ftranscriptions\u002FREADME.md#stream) - Create Streaming Transcription (SSE)\n\n### [Audio.Voices](docs\u002Fsdks\u002Fvoices\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fvoices\u002FREADME.md#list) - List all voices\n* [create](docs\u002Fsdks\u002Fvoices\u002FREADME.md#create) - Create a new voice\n* [delete](docs\u002Fsdks\u002Fvoices\u002FREADME.md#delete) - Delete a custom voice\n* 
[update](docs\u002Fsdks\u002Fvoices\u002FREADME.md#update) - Update voice metadata\n* [get](docs\u002Fsdks\u002Fvoices\u002FREADME.md#get) - Get voice details\n* [get_sample_audio](docs\u002Fsdks\u002Fvoices\u002FREADME.md#get_sample_audio) - Get voice sample audio\n\n### [Batch.Jobs](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#list) - Get Batch Jobs\n* [create](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#create) - Create Batch Job\n* [get](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#get) - Get Batch Job\n* [delete](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#delete) - Delete Batch Job\n* [cancel](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#cancel) - Cancel Batch Job\n\n### [Beta.Agents](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#create) - Create an agent that can be used within a conversation.\n* [list](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#list) - List agent entities.\n* [get](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#get) - Retrieve an agent entity.\n* [update](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#update) - Update an agent entity.\n* [delete](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#delete) - Delete an agent entity.\n* [update_version](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#update_version) - Update an agent version.\n* [list_versions](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#list_versions) - List all versions of an agent.\n* [get_version](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#get_version) - Retrieve a specific version of an agent.\n* [create_version_alias](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#create_version_alias) - Create or update an agent version alias.\n* [list_version_aliases](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#list_version_aliases) - List all aliases for an agent.\n* [delete_version_alias](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#delete_version_alias) - Delete an 
agent version alias.\n\n### [Beta.Connectors](docs\u002Fsdks\u002Fconnectors\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#create) - Create a new connector.\n* [list](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#list) - List all connectors.\n* [get_auth_url](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#get_auth_url) - Get the auth URL for a connector.\n* [call_tool](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#call_tool) - Call Connector Tool\n* [list_tools](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#list_tools) - List tools for a connector.\n* [get](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#get) - Get a connector.\n* [update](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#update) - Update a connector.\n* [delete](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#delete) - Delete a connector.\n\n### [Beta.Conversations](docs\u002Fsdks\u002Fconversations\u002FREADME.md)\n\n* [start](docs\u002Fsdks\u002Fconversations\u002FREADME.md#start) - Create a conversation and append entries to it.\n* [list](docs\u002Fsdks\u002Fconversations\u002FREADME.md#list) - List all created conversations.\n* [get](docs\u002Fsdks\u002Fconversations\u002FREADME.md#get) - Retrieve conversation information.\n* [delete](docs\u002Fsdks\u002Fconversations\u002FREADME.md#delete) - Delete a conversation.\n* [append](docs\u002Fsdks\u002Fconversations\u002FREADME.md#append) - Append new entries to an existing conversation.\n* [get_history](docs\u002Fsdks\u002Fconversations\u002FREADME.md#get_history) - Retrieve all entries in a conversation.\n* [get_messages](docs\u002Fsdks\u002Fconversations\u002FREADME.md#get_messages) - Retrieve all messages in a conversation.\n* [restart](docs\u002Fsdks\u002Fconversations\u002FREADME.md#restart) - Restart a conversation starting from a given entry.\n* [start_stream](docs\u002Fsdks\u002Fconversations\u002FREADME.md#start_stream) - Create a conversation and append entries to it.\n* 
[append_stream](docs\u002Fsdks\u002Fconversations\u002FREADME.md#append_stream) - Append new entries to an existing conversation.\n* [restart_stream](docs\u002Fsdks\u002Fconversations\u002FREADME.md#restart_stream) - Restart a conversation starting from a given entry.\n\n### [Beta.Libraries](docs\u002Fsdks\u002Flibraries\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Flibraries\u002FREADME.md#list) - List all libraries you have access to.\n* [create](docs\u002Fsdks\u002Flibraries\u002FREADME.md#create) - Create a new Library.\n* [get](docs\u002Fsdks\u002Flibraries\u002FREADME.md#get) - Detailed information about a specific Library.\n* [delete](docs\u002Fsdks\u002Flibraries\u002FREADME.md#delete) - Delete a library and all of its documents.\n* [update](docs\u002Fsdks\u002Flibraries\u002FREADME.md#update) - Update a library.\n\n#### [Beta.Libraries.Accesses](docs\u002Fsdks\u002Faccesses\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Faccesses\u002FREADME.md#list) - List all of the access levels for this library.\n* [update_or_create](docs\u002Fsdks\u002Faccesses\u002FREADME.md#update_or_create) - Create or update an access level.\n* [delete](docs\u002Fsdks\u002Faccesses\u002FREADME.md#delete) - Delete an access level.\n\n#### [Beta.Libraries.Documents](docs\u002Fsdks\u002Fdocuments\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#list) - List documents in a given library.\n* [upload](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#upload) - Upload a new document.\n* [get](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#get) - Retrieve the metadata of a specific document.\n* [update](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#update) - Update the metadata of a specific document.\n* [delete](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#delete) - Delete a document.\n* [text_content](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#text_content) - Retrieve the text content of a specific document.\n* [status](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#status) - 
Retrieve the processing status of a specific document.\n* [get_signed_url](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#get_signed_url) - Retrieve the signed URL of a specific document.\n* [extracted_text_signed_url](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#extracted_text_signed_url) - Retrieve the signed URL of text extracted from a given document.\n* [reprocess](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#reprocess) - Reprocess a document.\n\n### [Beta.Observability.Campaigns](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#create) - Create and start a new campaign\n* [list](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#list) - Get all campaigns\n* [fetch](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#fetch) - Get campaign by id\n* [delete](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#delete) - Delete a campaign\n* [fetch_status](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#fetch_status) - Get campaign status by campaign id\n* [list_events](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#list_events) - Get event ids that were selected by the given campaign\n\n### [Beta.Observability.ChatCompletionEvents](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md)\n\n* [search](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#search) - Get Chat Completion Events\n* [search_ids](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#search_ids) - Alternative to \u002Fsearch that returns only the IDs and that can return many IDs at once\n* [fetch](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#fetch) - Get Chat Completion Event\n* [fetch_similar_events](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#fetch_similar_events) - Get Similar Chat Completion Events\n* [judge](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#judge) - Run Judge on an event based on the given options\n\n#### [Beta.Observability.ChatCompletionEvents.Fields](docs\u002Fsdks\u002Ffields\u002FREADME.md)\n\n* 
[list](docs\u002Fsdks\u002Ffields\u002FREADME.md#list) - Get Chat Completion Fields\n* [fetch_options](docs\u002Fsdks\u002Ffields\u002FREADME.md#fetch_options) - Get Chat Completion Field Options\n* [fetch_option_counts](docs\u002Fsdks\u002Ffields\u002FREADME.md#fetch_option_counts) - Get Chat Completion Field Options Counts\n\n### [Beta.Observability.Datasets](docs\u002Fsdks\u002Fdatasets\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#create) - Create a new empty dataset\n* [list](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#list) - List existing datasets\n* [fetch](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#fetch) - Get dataset by id\n* [delete](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#delete) - Delete a dataset\n* [update](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#update) - Patch dataset\n* [list_records](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#list_records) - List existing records in the dataset\n* [create_record](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#create_record) - Add a conversation to the dataset\n* [import_from_campaign](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_campaign) - Populate the dataset with a campaign\n* [import_from_explorer](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_explorer) - Populate the dataset with samples from the explorer\n* [import_from_file](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_file) - Populate the dataset with samples from an uploaded file\n* [import_from_playground](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_playground) - Populate the dataset with samples from the playground\n* [import_from_dataset_records](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_dataset_records) - Populate the dataset with samples from another dataset\n* [export_to_jsonl](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#export_to_jsonl) - Export to the Files API and retrieve presigned URL to download the resulting JSONL file\n* 
[fetch_task](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#fetch_task) - Get status of a dataset import task\n* [list_tasks](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#list_tasks) - List import tasks for the given dataset\n\n#### [Beta.Observability.Datasets.Records](docs\u002Fsdks\u002Frecords\u002FREADME.md)\n\n* [fetch](docs\u002Fsdks\u002Frecords\u002FREADME.md#fetch) - Get the content of a given conversation from a dataset\n* [delete](docs\u002Fsdks\u002Frecords\u002FREADME.md#delete) - Delete a record from a dataset\n* [bulk_delete](docs\u002Fsdks\u002Frecords\u002FREADME.md#bulk_delete) - Delete multiple records from datasets\n* [judge](docs\u002Fsdks\u002Frecords\u002FREADME.md#judge) - Run Judge on a dataset record based on the given options\n* [update_payload](docs\u002Fsdks\u002Frecords\u002FREADME.md#update_payload) - Update a dataset record conversation payload\n* [update_properties](docs\u002Fsdks\u002Frecords\u002FREADME.md#update_properties) - Update conversation properties\n\n### [Beta.Observability.Judges](docs\u002Fsdks\u002Fjudges\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fjudges\u002FREADME.md#create) - Create a new judge\n* [list](docs\u002Fsdks\u002Fjudges\u002FREADME.md#list) - Get judges with optional filtering and search\n* [fetch](docs\u002Fsdks\u002Fjudges\u002FREADME.md#fetch) - Get judge by id\n* [delete](docs\u002Fsdks\u002Fjudges\u002FREADME.md#delete) - Delete a judge\n* [update](docs\u002Fsdks\u002Fjudges\u002FREADME.md#update) - Update a judge\n* [judge_conversation](docs\u002Fsdks\u002Fjudges\u002FREADME.md#judge_conversation) - Run a saved judge on a conversation\n\n### [Chat](docs\u002Fsdks\u002Fchat\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Fchat\u002FREADME.md#complete) - Chat Completion\n* [stream](docs\u002Fsdks\u002Fchat\u002FREADME.md#stream) - Stream chat completion\n\n### [Classifiers](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md)\n\n* [moderate](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#moderate) 
- Moderations\n* [moderate_chat](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#moderate_chat) - Chat Moderations\n* [classify](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#classify) - Classifications\n* [classify_chat](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#classify_chat) - Chat Classifications\n\n### [Embeddings](docs\u002Fsdks\u002Fembeddings\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fembeddings\u002FREADME.md#create) - Embeddings\n\n### [Events](docs\u002Fsdks\u002Fevents\u002FREADME.md)\n\n* [get_stream_events](docs\u002Fsdks\u002Fevents\u002FREADME.md#get_stream_events) - Get Stream Events\n* [get_workflow_events](docs\u002Fsdks\u002Fevents\u002FREADME.md#get_workflow_events) - Get Workflow Events\n\n### [Files](docs\u002Fsdks\u002Ffiles\u002FREADME.md)\n\n* [upload](docs\u002Fsdks\u002Ffiles\u002FREADME.md#upload) - Upload File\n* [list](docs\u002Fsdks\u002Ffiles\u002FREADME.md#list) - List Files\n* [retrieve](docs\u002Fsdks\u002Ffiles\u002FREADME.md#retrieve) - Retrieve File\n* [delete](docs\u002Fsdks\u002Ffiles\u002FREADME.md#delete) - Delete File\n* [download](docs\u002Fsdks\u002Ffiles\u002FREADME.md#download) - Download File\n* [get_signed_url](docs\u002Fsdks\u002Ffiles\u002FREADME.md#get_signed_url) - Get Signed Url\n\n### [Fim](docs\u002Fsdks\u002Ffim\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Ffim\u002FREADME.md#complete) - Fim Completion\n* [stream](docs\u002Fsdks\u002Ffim\u002FREADME.md#stream) - Stream fim completion\n\n### [FineTuning.Jobs](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#list) - Get Fine Tuning Jobs\n* [create](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#create) - Create Fine Tuning Job\n* [get](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#get) - Get Fine Tuning Job\n* [cancel](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#cancel) - Cancel Fine Tuning Job\n* [start](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#start) - 
Start Fine Tuning Job\n\n### [Models](docs\u002Fsdks\u002Fmodels\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fmodels\u002FREADME.md#list) - List Models\n* [retrieve](docs\u002Fsdks\u002Fmodels\u002FREADME.md#retrieve) - Retrieve Model\n* [delete](docs\u002Fsdks\u002Fmodels\u002FREADME.md#delete) - Delete Model\n* [update](docs\u002Fsdks\u002Fmodels\u002FREADME.md#update) - Update Fine Tuned Model\n* [archive](docs\u002Fsdks\u002Fmodels\u002FREADME.md#archive) - Archive Fine Tuned Model\n* [unarchive](docs\u002Fsdks\u002Fmodels\u002FREADME.md#unarchive) - Unarchive Fine Tuned Model\n\n### [Ocr](docs\u002Fsdks\u002Focr\u002FREADME.md)\n\n* [process](docs\u002Fsdks\u002Focr\u002FREADME.md#process) - OCR\n\n### [Workflows](docs\u002Fsdks\u002Fworkflows\u002FREADME.md)\n\n* [get_workflows](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflows) - Get Workflows\n* [get_workflow_registrations](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflow_registrations) - Get Workflow Registrations\n* [execute_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#execute_workflow) - Execute Workflow\n* [~~execute_workflow_registration~~](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#execute_workflow_registration) - Execute Workflow Registration :warning: **Deprecated**\n* [get_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflow) - Get Workflow\n* [update_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#update_workflow) - Update Workflow\n* [get_workflow_registration](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflow_registration) - Get Workflow Registration\n* [archive_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#archive_workflow) - Archive Workflow\n* [unarchive_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#unarchive_workflow) - Unarchive Workflow\n\n#### [Workflows.Deployments](docs\u002Fsdks\u002Fdeployments\u002FREADME.md)\n\n* [list_deployments](docs\u002Fsdks\u002Fdeployments\u002FREADME.md#list_deployments) - List 
Deployments\n* [get_deployment](docs\u002Fsdks\u002Fdeployments\u002FREADME.md#get_deployment) - Get Deployment\n\n#### [Workflows.Events](docs\u002Fsdks\u002Fworkflowsevents\u002FREADME.md)\n\n* [get_stream_events](docs\u002Fsdks\u002Fworkflowsevents\u002FREADME.md#get_stream_events) - Get Stream Events\n* [get_workflow_events](docs\u002Fsdks\u002Fworkflowsevents\u002FREADME.md#get_workflow_events) - Get Workflow Events\n\n#### [Workflows.Executions](docs\u002Fsdks\u002Fexecutions\u002FREADME.md)\n\n* [get_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution) - Get Workflow Execution\n* [get_workflow_execution_history](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_history) - Get Workflow Execution History\n* [signal_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#signal_workflow_execution) - Signal Workflow Execution\n* [query_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#query_workflow_execution) - Query Workflow Execution\n* [terminate_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#terminate_workflow_execution) - Terminate Workflow Execution\n* [batch_terminate_workflow_executions](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#batch_terminate_workflow_executions) - Batch Terminate Workflow Executions\n* [cancel_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#cancel_workflow_execution) - Cancel Workflow Execution\n* [batch_cancel_workflow_executions](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#batch_cancel_workflow_executions) - Batch Cancel Workflow Executions\n* [reset_workflow](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#reset_workflow) - Reset Workflow\n* [update_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#update_workflow_execution) - Update Workflow Execution\n* [get_workflow_execution_trace_otel](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_trace_otel) - Get Workflow Execution 
Trace Otel\n* [get_workflow_execution_trace_summary](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_trace_summary) - Get Workflow Execution Trace Summary\n* [get_workflow_execution_trace_events](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_trace_events) - Get Workflow Execution Trace Events\n* [stream](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#stream) - Stream\n\n#### [Workflows.Metrics](docs\u002Fsdks\u002Fmetrics\u002FREADME.md)\n\n* [get_workflow_metrics](docs\u002Fsdks\u002Fmetrics\u002FREADME.md#get_workflow_metrics) - Get Workflow Metrics\n\n#### [Workflows.Runs](docs\u002Fsdks\u002Fruns\u002FREADME.md)\n\n* [list_runs](docs\u002Fsdks\u002Fruns\u002FREADME.md#list_runs) - List Runs\n* [get_run](docs\u002Fsdks\u002Fruns\u002FREADME.md#get_run) - Get Run\n* [get_run_history](docs\u002Fsdks\u002Fruns\u002FREADME.md#get_run_history) - Get Run History\n\n#### [Workflows.Schedules](docs\u002Fsdks\u002Fschedules\u002FREADME.md)\n\n* [get_schedules](docs\u002Fsdks\u002Fschedules\u002FREADME.md#get_schedules) - Get Schedules\n* [schedule_workflow](docs\u002Fsdks\u002Fschedules\u002FREADME.md#schedule_workflow) - Schedule Workflow\n* [unschedule_workflow](docs\u002Fsdks\u002Fschedules\u002FREADME.md#unschedule_workflow) - Unschedule Workflow\n\n\u003C\u002Fdetails>\n\u003C!-- End Available Resources and Operations [operations] -->\n\n\u003C!-- Start Server-sent event streaming [eventstream] -->\n## Server-sent event streaming\n\n[Server-sent events][mdn-sse] are used to stream content from certain\noperations. These operations will expose the stream as a [Generator][generator] that\ncan be consumed using a simple `for` loop. The loop will\nterminate when the server no longer has any events to send and closes the\nunderlying connection.  
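Before looking at the SDK example below, the consumption pattern can be illustrated with an ordinary Python generator — a stand-in with no network I/O, where the event payloads are made up:

```python
def fake_event_stream():
    """Stand-in for an SSE stream: yields events until the 'server' closes."""
    for chunk in ("chunk-1", "chunk-2", "[DONE]"):
        yield {"data": chunk}

received = []
for event in fake_event_stream():  # the loop ends when the generator is exhausted
    received.append(event["data"])

print(received)  # ['chunk-1', 'chunk-2', '[DONE]']
```

The real SDK stream behaves the same way, except that each iteration waits for the next server-sent event.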
\n\nThe stream is also a [Context Manager][context-manager]: it can be used with the `with` statement, which closes the\nunderlying connection when the context is exited.\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # handle event\n            print(event, flush=True)\n\n```\n\n[mdn-sse]: https:\u002F\u002Fdeveloper.mozilla.org\u002Fen-US\u002Fdocs\u002FWeb\u002FAPI\u002FServer-sent_events\u002FUsing_server-sent_events\n[generator]: https:\u002F\u002Fbook.pythontips.com\u002Fen\u002Flatest\u002Fgenerators.html\n[context-manager]: https:\u002F\u002Fbook.pythontips.com\u002Fen\u002Flatest\u002Fcontext_managers.html\n\u003C!-- End Server-sent event streaming [eventstream] -->\n\n\u003C!-- Start Pagination [pagination] -->\n## Pagination\n\nSome of the endpoints in this SDK support pagination. To use pagination, you make your SDK calls as usual, but the\nreturned response object will have a `next()` method that can be called to pull down the next group of results. If the\nreturn value of `next()` is `None`, then there are no more pages to be fetched.\n\nHere's an example of one such pagination call:\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.workflows.get_workflows(active_only=False, include_shared=True, limit=50)\n\n    while res is not None:\n        # Handle items\n\n        res = res.next()\n\n```\n\u003C!-- End Pagination [pagination] -->\n\n\u003C!-- Start File uploads [file-upload] -->\n## File uploads\n\nCertain SDK methods accept file objects as part of a request body or multi-part request. 
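The memory trade-off behind the recommendation that follows can be sketched in plain Python — no SDK call involved, and the file name and size are purely illustrative:

```python
import os
import tempfile

# Create a throwaway 1 MB file to demonstrate with.
path = os.path.join(tempfile.mkdtemp(), "example.file")
with open(path, "wb") as w:
    w.write(b"x" * 1_000_000)

# Eager: f.read() buffers the entire payload in memory as bytes.
with open(path, "rb") as f:
    eager = {"file_name": "example.file", "content": f.read()}

# Streamed: passing the open handle instead lets the HTTP client
# read the body chunk by chunk rather than buffering all of it.
handle = open(path, "rb")
streamed = {"file_name": "example.file", "content": handle}
print(len(eager["content"]), handle.tell())  # 1000000 0
handle.close()
```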
It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and the risk of out-of-memory crashes when working with very large files. The following example demonstrates how to attach a file stream to a request.\n\n> [!TIP]\n>\n> For endpoints that handle file uploads, byte arrays can also be used. However, using streams is recommended for large files.\n>\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    with open(\"example.file\", \"rb\") as f:\n        res = mistral.files.upload(file={\n            \"file_name\": \"example.file\",\n            \"content\": f,\n        })\n\n    # Handle response\n    print(res)\n\n```\n\u003C!-- End File uploads [file-upload] -->\n\n\u003C!-- Start Retries [retries] -->\n## Retries\n\nSome of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. 
However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.\n\nTo change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:\n```python\nfrom mistralai.client import Mistral\nfrom mistralai.client.utils import BackoffStrategy, RetryConfig\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    },\n        retries=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False))\n\n    with res as event_stream:\n        for event in event_stream:\n            # handle event\n            print(event, flush=True)\n\n```\n\nIf you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:\n```python\nfrom mistralai.client import Mistral\nfrom mistralai.client.utils import BackoffStrategy, RetryConfig\nimport os\n\n\nwith Mistral(\n    retry_config=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False),\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # handle event\n            print(event, flush=True)\n\n```\n\u003C!-- End Retries [retries] -->\n\n\u003C!-- Start Error Handling [errors] -->\n## Error Handling\n\n[`MistralError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fmistralerror.py) is the base class for all HTTP error responses. 
It has the following properties:\n\n| Property           | Type             | Description                                                                             |\n| ------------------ | ---------------- | --------------------------------------------------------------------------------------- |\n| `err.message`      | `str`            | Error message                                                                           |\n| `err.status_code`  | `int`            | HTTP response status code eg `404`                                                      |\n| `err.headers`      | `httpx.Headers`  | HTTP response headers                                                                   |\n| `err.body`         | `str`            | HTTP body. Can be empty string if no body is returned.                                  |\n| `err.raw_response` | `httpx.Response` | Raw HTTP response                                                                       |\n| `err.data`         |                  | Optional. Some errors may contain structured data. [See Error Classes](#error-classes). 
|\n\n### Example\n```python\nfrom mistralai.client import Mistral, errors\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n    res = None\n    try:\n\n        res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n        })\n\n        with res as event_stream:\n            for event in event_stream:\n                # handle event\n                print(event, flush=True)\n\n\n    except errors.MistralError as e:\n        # The base class for HTTP error responses\n        print(e.message)\n        print(e.status_code)\n        print(e.body)\n        print(e.headers)\n        print(e.raw_response)\n\n        # Depending on the method different errors may be thrown\n        if isinstance(e, errors.HTTPValidationError):\n            print(e.data.detail)  # Optional[List[models.ValidationError]]\n```\n\n### Error Classes\n**Primary error:**\n* [`MistralError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fmistralerror.py): The base class for HTTP error responses.\n\n\u003Cdetails>\u003Csummary>Less common errors (7)\u003C\u002Fsummary>\n\n\u003Cbr \u002F>\n\n**Network errors:**\n* [`httpx.RequestError`](https:\u002F\u002Fwww.python-httpx.org\u002Fexceptions\u002F#httpx.RequestError): Base class for request errors.\n    * [`httpx.ConnectError`](https:\u002F\u002Fwww.python-httpx.org\u002Fexceptions\u002F#httpx.ConnectError): HTTP client was unable to make a request to a server.\n    * [`httpx.TimeoutException`](https:\u002F\u002Fwww.python-httpx.org\u002Fexceptions\u002F#httpx.TimeoutException): HTTP request timed out.\n\n\n**Inherit from [`MistralError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fmistralerror.py)**:\n* [`HTTPValidationError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fhttpvalidationerror.py): Validation Error. Status code `422`. 
Applicable to 103 of 168 methods.*\n* [`ObservabilityError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fobservabilityerror.py): Bad Request - Invalid request parameters or data. Applicable to 41 of 168 methods.*\n* [`ResponseValidationError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fresponsevalidationerror.py): Type mismatch between the response data and the expected Pydantic model. Provides access to the Pydantic validation error via the `cause` attribute.\n\n\u003C\u002Fdetails>\n\n\\* Check [the method documentation](#available-resources-and-operations) to see if the error is applicable.\n\u003C!-- End Error Handling [errors] -->\n\n\u003C!-- Start Server Selection [server] -->\n## Server Selection\n\n### Select Server by Name\n\nYou can override the default server globally by passing a server name to the `server: str` optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the names associated with the available servers:\n\n| Name | Server                   | Description          |\n| ---- | ------------------------ | -------------------- |\n| `eu` | `https:\u002F\u002Fapi.mistral.ai` | EU Production server |\n\n#### Example\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    server=\"eu\",\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # handle event\n            print(event, flush=True)\n\n```\n\n### Override Server URL Per-Client\n\nThe default server can also be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. 
For example:\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    server_url=\"https:\u002F\u002Fapi.mistral.ai\",\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # handle event\n            print(event, flush=True)\n\n```\n\u003C!-- End Server Selection [server] -->\n\n\u003C!-- Start Custom HTTP Client [http-client] -->\n## Custom HTTP Client\n\nThe Python SDK makes API calls using the [httpx](https:\u002F\u002Fwww.python-httpx.org\u002F) HTTP library. In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with your own HTTP client instance.\nDepending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively, which are Protocols ensuring that the client has the necessary methods to make API calls.\nThis allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can just pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.\n\nFor example, you could specify a header for every request that this SDK makes as follows:\n```python\nfrom mistralai.client import Mistral\nimport httpx\n\nhttp_client = httpx.Client(headers={\"x-custom-header\": \"someValue\"})\ns = Mistral(client=http_client)\n```\n\nor you could wrap the client with your own custom logic:\n```python\nfrom mistralai.client import Mistral\nfrom mistralai.client.httpclient import AsyncHttpClient\nfrom typing import Any, Optional, Union\nimport httpx\n\nclass CustomClient(AsyncHttpClient):\n    client: AsyncHttpClient\n\n    def __init__(self, client: AsyncHttpClient):\n        self.client = client\n\n    async def send(\n        self,\n       
 request: httpx.Request,\n        *,\n        stream: bool = False,\n        auth: Union[\n            httpx._types.AuthTypes, httpx._client.UseClientDefault, None\n        ] = httpx.USE_CLIENT_DEFAULT,\n        follow_redirects: Union[\n            bool, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n    ) -> httpx.Response:\n        request.headers[\"Client-Level-Header\"] = \"added by client\"\n\n        return await self.client.send(\n            request, stream=stream, auth=auth, follow_redirects=follow_redirects\n        )\n\n    def build_request(\n        self,\n        method: str,\n        url: httpx._types.URLTypes,\n        *,\n        content: Optional[httpx._types.RequestContent] = None,\n        data: Optional[httpx._types.RequestData] = None,\n        files: Optional[httpx._types.RequestFiles] = None,\n        json: Optional[Any] = None,\n        params: Optional[httpx._types.QueryParamTypes] = None,\n        headers: Optional[httpx._types.HeaderTypes] = None,\n        cookies: Optional[httpx._types.CookieTypes] = None,\n        timeout: Union[\n            httpx._types.TimeoutTypes, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n        extensions: Optional[httpx._types.RequestExtensions] = None,\n    ) -> httpx.Request:\n        return self.client.build_request(\n            method,\n            url,\n            content=content,\n            data=data,\n            files=files,\n            json=json,\n            params=params,\n            headers=headers,\n            cookies=cookies,\n            timeout=timeout,\n            extensions=extensions,\n        )\n\ns = Mistral(async_client=CustomClient(httpx.AsyncClient()))\n```\n\u003C!-- End Custom HTTP Client [http-client] -->\n\n\u003C!-- Start Authentication [security] -->\n## Authentication\n\n### Per-Client Security Schemes\n\nThis SDK supports the following security scheme globally:\n\n| Name      | Type | Scheme      | Environment Variable 
|\n| --------- | ---- | ----------- | -------------------- |\n| `api_key` | http | HTTP Bearer | `MISTRAL_API_KEY`    |\n\nTo authenticate with the API the `api_key` parameter must be set when initializing the SDK client instance. For example:\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # handle event\n            print(event, flush=True)\n\n```\n\u003C!-- End Authentication [security] -->\n\n\u003C!-- Start Resource Management [resource-management] -->\n## Resource Management\n\nThe `Mistral` class implements the context manager protocol and registers a finalizer function to close the underlying sync and async HTTPX clients it uses under the hood. This will close HTTP connections, release memory and free up other resources held by the SDK. In short-lived Python programs and notebooks that make a few SDK method calls, resource management may not be a concern. 
However, in longer-lived programs, it is beneficial to create a single SDK instance via a [context manager][context-manager] and reuse it across the application.\n\n[context-manager]: https:\u002F\u002Fdocs.python.org\u002F3\u002Freference\u002Fdatamodel.html#context-managers\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\ndef main():\n\n    with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n        # Rest of application here...\n        ...\n\n\n# Or when using async:\nasync def amain():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n        # Rest of application here...\n        ...\n```\n\u003C!-- End Resource Management [resource-management] -->\n\n\u003C!-- Start Debugging [debug] -->\n## Debugging\n\nYou can set up your SDK to emit debug logs for SDK requests and responses.\n\nYou can pass your own logger directly into your SDK.\n```python\nfrom mistralai.client import Mistral\nimport logging\n\nlogging.basicConfig(level=logging.DEBUG)\ns = Mistral(debug_logger=logging.getLogger(\"mistralai.client\"))\n```\n\nYou can also enable a default debug logger by setting an environment variable `MISTRAL_DEBUG` to true.\n\u003C!-- End Debugging [debug] -->\n\n\u003C!-- Start IDE Support [idesupport] -->\n## IDE Support\n\n### PyCharm\n\nGenerally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.\n\n- [PyCharm Pydantic Plugin](https:\u002F\u002Fdocs.pydantic.dev\u002Flatest\u002Fintegrations\u002Fpycharm\u002F)\n\u003C!-- End IDE Support [idesupport] -->\n\n\u003C!-- Placeholder for Future Speakeasy SDK Sections -->\n\n# Development\n\n## Contributions\n\nWhile we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. 
\nWe look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release. \n","# Mistral Python Client\n\n## Migrating from v1 to v2\n\nIf you are upgrading from v1 to v2, check the [migration guide](https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fblob\u002Fmain\u002FMIGRATION.md) for the breaking changes and how to update your code.\n\n## API Key Setup\n\nBefore you begin, you will need a Mistral AI API key.\n\n1. Get your own Mistral API key: \u003Chttps:\u002F\u002Fdocs.mistral.ai\u002F#api-access>\n2. Set your Mistral API key as an environment variable. You only need to do this once.\n\n```bash\n# set Mistral API key (using zsh as an example)\n$ echo 'export MISTRAL_API_KEY=[your_key_here]' >> ~\u002F.zshenv\n\n# reload the environment variables (or just quit and open a new terminal)\n$ source ~\u002F.zshenv\n```\n\n\u003C!-- Start Summary [summary] -->\n## Summary\n\nMistral AI API: our Chat Completion and Embeddings APIs specification. Create your account on [La Plateforme](https:\u002F\u002Fconsole.mistral.ai) to get access and read the [docs](https:\u002F\u002Fdocs.mistral.ai) to learn how to use it.\n\u003C!-- End Summary [summary] -->\n\n\u003C!-- Start Table of Contents [toc] -->\n## Table of Contents\n\u003C!-- $toc-max-depth=2 -->\n* [Mistral Python Client](#mistral-python-client)\n  * [Migrating from v1](#migrating-from-v1)\n  * [API Key Setup](#api-key-setup)\n  * [SDK Installation](#sdk-installation)\n  * [SDK Example Usage](#sdk-example-usage)\n  * [Providers' SDKs Example Usage](#providers-sdks-example-usage)\n  * [Available Resources and Operations](#available-resources-and-operations)\n  * [Server-sent Event Streaming](#server-sent-event-streaming)\n  * [Pagination](#pagination)\n  * [File Uploads](#file-uploads)\n  * [Retries](#retries)\n  * [Error Handling](#error-handling)\n  * [Server Selection](#server-selection)\n  * [Custom HTTP Client](#custom-http-client)\n  * [Authentication](#authentication)\n  * [Resource Management](#resource-management)\n  * [Debugging](#debugging)\n  * [IDE Support](#ide-support)\n* [Development](#development)\n  * [Contributions](#contributions)\n\n\u003C!-- End Table of Contents [toc] -->\n\n\u003C!-- Start SDK Installation [installation] -->\n## SDK Installation\n\n> [!NOTE]\n> **Python version upgrade policy**\n>\n> Once a Python version reaches its [official end-of-life date](https:\u002F\u002Fdevguide.python.org\u002Fversions\u002F), a 3-month grace period is provided for users to upgrade. Following this grace period, the minimum supported Python version of the SDK will be updated.\n\nThe SDK can be installed with the *uv*, *pip*, or *poetry* package managers.\n\n### uv\n\n*uv* is a fast Python package installer and resolver, designed as a drop-in replacement for pip and pip-tools. It is recommended for its speed and modern Python tooling capabilities.\n\n```bash\nuv add mistralai\n```\n\n### PIP\n\n*PIP* is the default package manager for Python, enabling easy installation and management of packages from PyPI via the command line.\n\n```bash\npip install mistralai\n```\n\n### Poetry\n\n*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.\n\n```bash\npoetry add mistralai\n```\n\n### Shell and script usage with `uv`\n\nYou can use this SDK in a Python shell with [uv](https:\u002F\u002Fdocs.astral.sh\u002Fuv\u002F) and the `uvx` command that comes with it, like so:\n\n```shell\nuvx --from mistralai python\n```\n\nIt's also possible to write a standalone Python script without needing to set up a whole project, like so:\n\n```python\n#!\u002Fusr\u002Fbin\u002Fenv -S uv run --script\n# \u002F\u002F\u002F script\n# requires-python = \">=3.10\"\n# dependencies = [\n#     \"mistralai\",\n# ]\n# \u002F\u002F\u002F\n\nfrom mistralai.client import Mistral\n\nsdk = Mistral(\n  # SDK arguments\n)\n\n# Rest of script here...\n```\n\nOnce saved to a file, you can run the script with `uv run script.py`, where `script.py` can be replaced with the actual file name.\n\u003C!-- End SDK Installation [installation] -->\n\n### Agents extra dependencies\n\nWhen using agent-related features, the `agents` extra dependency is required. It can be added when installing the package:\n\n```bash\npip install \"mistralai[agents]\"\n```\n\n> Note: these features require Python 3.10 or higher (the SDK's minimum requirement).\n\n### Other packages\n\nOther `mistralai-*` packages (for example `mistralai-workflows`) can be installed separately and live under the `mistralai` namespace:\n\n```bash\npip install mistralai-workflows\n```\n\n\u003C!-- Start SDK Example Usage [usage] -->\n## SDK Example Usage\n\n### Create Chat Completions\n\nThis example shows how to create chat completions.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.chat.complete(model=\"mistral-large-latest\", messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"Who is the greatest French painter? Answer in one sentence.\",\n        },\n    ], stream=False, response_format={\n        \"type\": \"text\",\n    })\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        
api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.chat.complete_async(model=\"mistral-large-latest\", messages=[\n            {\n                \"role\": \"user\",\n                \"content\": \"Who is the greatest French painter? Answer in one sentence.\",\n            },\n        ], stream=False, response_format={\n            \"type\": \"text\",\n        })\n\n        # Handle response\n        print(res)\n\nasyncio.run(main())\n```\n\n### Upload a File\n\nThis example shows how to upload a file.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.files.upload(file={\n        \"file_name\": \"example.file\",\n        \"content\": open(\"example.file\", \"rb\"),\n    }, visibility=\"workspace\")\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.files.upload_async(file={\n            \"file_name\": \"example.file\",\n            \"content\": open(\"example.file\", \"rb\"),\n        }, visibility=\"workspace\")\n\n        # Handle response\n        print(res)\n\nasyncio.run(main())\n```\n\n### Create Agents Completions\n\nThis example shows how to create agents completions.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.agents.complete(messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"Who is the greatest French painter? Answer in one short sentence.\",\n        },\n    ], agent_id=\"\u003Cid>\", stream=False, response_format={\n        \"type\": \"text\",\n    })\n\n    # Handle response\n    print(res)\n```\n\n\u003C\u002Fbr>\n\nThe same SDK client can also be used to make asynchronous requests by importing asyncio.\n\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.agents.complete_async(messages=[\n            {\n                \"role\": \"user\",\n                \"content\": \"Who is the greatest French painter? Answer in one short sentence.\",\n            },\n        ], agent_id=\"\u003Cid>\", stream=False, response_format={\n            \"type\": \"text\",\n        })\n\n        # Handle response\n        print(res)\n\nasyncio.run(main())\n```\n\n### Create Embedding Request\n\nThis example shows how to create an embedding request.\n\n```python\n# Synchronous Example\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.embeddings.create(model=\"mistral-embed\", 
inputs=[\n        \"对这句话进行嵌入。\",\n        \"也对这句话进行嵌入。\",\n    ])\n\n    # 处理响应\n    print(res)\n```\n\n\u003C\u002Fbr>\n\n同样的 SDK 客户端也可以通过导入 asyncio 来进行异步请求。\n\n```python\n# 异步示例\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.embeddings.create_async(model=\"mistral-embed\", inputs=[\n            \"对这句话进行嵌入。\",\n            \"也对这句话进行嵌入。\",\n        ])\n\n        # 处理响应\n        print(res)\n\nasyncio.run(main())\n```\n\u003C!-- 结束 SDK 示例用法 [usage] -->\n\n\n### 更多示例\n\n您可以使用 `uv run` 在 `examples\u002F` 目录中运行这些示例。\n\n\n## 各云服务商 SDK 示例用法\n\n### Azure AI\n\n**先决条件**\n\n在开始之前，请确保您已准备好 `AZURE_ENDPOINT` 和 `AZURE_API_KEY`。要获取这些信息，您需要在 Azure AI 上部署 Mistral 模型。\n有关在 Azure AI 上部署 Mistral 的说明，请参阅[此处](https:\u002F\u002Fdocs.mistral.ai\u002Fdeployment\u002Fcloud\u002Fazure\u002F)。\n\n**步骤 1：安装**\n\n```bash\npip install mistralai\n```\n\n**步骤 2：示例用法**\n\n以下是一个基本示例，帮助您入门。您也可以运行 [`examples` 目录中的示例](\u002Fexamples\u002Fazure)。\n\n```python\nimport os\nfrom mistralai.azure.client import MistralAzure\n\n# SDK 会自动将 api-version 作为查询参数注入\nclient = MistralAzure(\n    api_key=os.environ[\"AZURE_API_KEY\"],\n    server_url=os.environ[\"AZURE_ENDPOINT\"],\n    api_version=\"2024-05-01-preview\",  # 可选，默认值为该版本\n)\n\nres = client.chat.complete(\n    model=os.environ[\"AZURE_MODEL\"],\n    messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"你好！\",\n        }\n    ],\n)\nprint(res.choices[0].message.content)\n```\n\n### Google Cloud\n\n\n**先决条件**\n\n在开始之前，您需要创建一个 Google Cloud 项目并启用 Mistral API。为此，请按照[此处](https:\u002F\u002Fdocs.mistral.ai\u002Fdeployment\u002Fcloud\u002Fvertex\u002F)的说明操作。\n要在本地运行此示例，您还需要确保已通过 Google Cloud 进行身份验证。您可以通过运行以下命令来完成：\n\n```bash\ngcloud auth application-default login\n```\n\n**步骤 1：安装**\n\n```bash\npip install mistralai\n# 如果需要 GCP 身份验证支持（必需）：\npip install 
\"mistralai[gcp]\"\n```\n\n**步骤 2：示例用法**\n\n以下是一个基本示例，帮助您入门。您也可以运行 [`examples` 目录中的示例](\u002Fexamples\u002Fgcp)。\n\nSDK 自动执行以下操作：\n- 通过 `google.auth.default()` 检测凭据\n- 在令牌过期时自动刷新\n- 根据 `project_id` 和 `region` 构建 Vertex AI 的 URL\n\n```python\nimport os\nfrom mistralai.gcp.client import MistralGCP\n\n# SDK 会自动检测凭据并构建 Vertex AI 的 URL\nclient = MistralGCP(\n    project_id=os.environ.get(\"GCP_PROJECT_ID\"),  # 可选：从凭据中自动检测\n    region=\"us-central1\",  # 默认值：europe-west4\n)\n\nres = client.chat.complete(\n    model=\"mistral-small-2503\",\n    messages=[\n        {\n            \"role\": \"user\",\n            \"content\": \"你好！\",\n        }\n    ],\n)\nprint(res.choices[0].message.content)\n```\n\n\n\u003C!-- 开始可用资源与操作 [operations] -->\n## 可用资源与操作\n\n\u003Cdetails open>\n\u003Csummary>可用方法\u003C\u002Fsummary>\n\n### [Agents](docs\u002Fsdks\u002Fagents\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Fagents\u002FREADME.md#complete) - 代理完成\n* [stream](docs\u002Fsdks\u002Fagents\u002FREADME.md#stream) - 流式代理完成\n\n### [Audio.Speech](docs\u002Fsdks\u002Fspeech\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Fspeech\u002FREADME.md#complete) - 语音\n\n### [Audio.Transcriptions](docs\u002Fsdks\u002Ftranscriptions\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Ftranscriptions\u002FREADME.md#complete) - 创建转录\n* [stream](docs\u002Fsdks\u002Ftranscriptions\u002FREADME.md#stream) - 创建流式转录（SSE）\n\n### [Audio.Voices](docs\u002Fsdks\u002Fvoices\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fvoices\u002FREADME.md#list) - 列出所有语音\n* [create](docs\u002Fsdks\u002Fvoices\u002FREADME.md#create) - 创建新语音\n* [delete](docs\u002Fsdks\u002Fvoices\u002FREADME.md#delete) - 删除自定义语音\n* [update](docs\u002Fsdks\u002Fvoices\u002FREADME.md#update) - 更新语音元数据\n* [get](docs\u002Fsdks\u002Fvoices\u002FREADME.md#get) - 获取语音详情\n* [get_sample_audio](docs\u002Fsdks\u002Fvoices\u002FREADME.md#get_sample_audio) - 获取语音样本音频\n\n### [Batch.Jobs](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md)\n\n* 
[list](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#list) - 获取批量作业\n* [create](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#create) - 创建批量作业\n* [get](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#get) - 获取批量作业\n* [delete](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#delete) - 删除批量作业\n* [cancel](docs\u002Fsdks\u002Fbatchjobs\u002FREADME.md#cancel) - 取消批量作业\n\n### [Beta.Agents](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#create) - 创建一个可在会话中使用的代理。\n* [list](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#list) - 列出所有代理实体。\n* [get](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#get) - 获取一个代理实体。\n* [update](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#update) - 更新一个代理实体。\n* [delete](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#delete) - 删除一个代理实体。\n* [update_version](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#update_version) - 更新代理的一个版本。\n* [list_versions](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#list_versions) - 列出代理的所有版本。\n* [get_version](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#get_version) - 获取代理的特定版本。\n* [create_version_alias](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#create_version_alias) - 创建或更新代理版本别名。\n* [list_version_aliases](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#list_version_aliases) - 列出代理的所有别名。\n* [delete_version_alias](docs\u002Fsdks\u002Fbetaagents\u002FREADME.md#delete_version_alias) - 删除代理版本别名。\n\n### [Beta.Connectors](docs\u002Fsdks\u002Fconnectors\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#create) - 创建一个新的连接器。\n* [list](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#list) - 列出所有连接器。\n* [get_auth_url](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#get_auth_url) - 获取连接器的授权 URL。\n* [call_tool](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#call_tool) - 调用连接器工具。\n* [list_tools](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#list_tools) - 列出连接器的工具。\n* [get](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#get) - 获取一个连接器。\n* 
[update](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#update) - 更新一个连接器。\n* [delete](docs\u002Fsdks\u002Fconnectors\u002FREADME.md#delete) - 删除一个连接器。\n\n### [Beta.Conversations](docs\u002Fsdks\u002Fconversations\u002FREADME.md)\n\n* [start](docs\u002Fsdks\u002Fconversations\u002FREADME.md#start) - 创建一个会话并为其添加条目。\n* [list](docs\u002Fsdks\u002Fconversations\u002FREADME.md#list) - 列出所有已创建的会话。\n* [get](docs\u002Fsdks\u002Fconversations\u002FREADME.md#get) - 获取会话信息。\n* [delete](docs\u002Fsdks\u002Fconversations\u002FREADME.md#delete) - 删除一个会话。\n* [append](docs\u002Fsdks\u002Fconversations\u002FREADME.md#append) - 向现有会话添加新条目。\n* [get_history](docs\u002Fsdks\u002Fconversations\u002FREADME.md#get_history) - 获取会话中的所有条目。\n* [get_messages](docs\u002Fsdks\u002Fconversations\u002FREADME.md#get_messages) - 获取会话中的所有消息。\n* [restart](docs\u002Fsdks\u002Fconversations\u002FREADME.md#restart) - 从指定条目重新开始一个会话。\n* [start_stream](docs\u002Fsdks\u002Fconversations\u002FREADME.md#start_stream) - 创建一个会话并为其添加条目。\n* [append_stream](docs\u002Fsdks\u002Fconversations\u002FREADME.md#append_stream) - 向现有会话添加新条目。\n* [restart_stream](docs\u002Fsdks\u002Fconversations\u002FREADME.md#restart_stream) - 从指定条目重新开始一个会话。\n\n### [Beta.Libraries](docs\u002Fsdks\u002Flibraries\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Flibraries\u002FREADME.md#list) - 列出您有权访问的所有库。\n* [create](docs\u002Fsdks\u002Flibraries\u002FREADME.md#create) - 创建一个新的库。\n* [get](docs\u002Fsdks\u002Flibraries\u002FREADME.md#get) - 获取特定库的详细信息。\n* [delete](docs\u002Fsdks\u002Flibraries\u002FREADME.md#delete) - 删除一个库及其所有文档。\n* [update](docs\u002Fsdks\u002Flibraries\u002FREADME.md#update) - 更新一个库。\n\n#### [Beta.Libraries.Accesses](docs\u002Fsdks\u002Faccesses\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Faccesses\u002FREADME.md#list) - 列出对该库的所有访问权限。\n* [update_or_create](docs\u002Fsdks\u002Faccesses\u002FREADME.md#update_or_create) - 创建或更新访问级别。\n* [delete](docs\u002Fsdks\u002Faccesses\u002FREADME.md#delete) - 删除一个访问级别。\n\n#### 
[Beta.Libraries.Documents](docs\u002Fsdks\u002Fdocuments\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#list) - 列出给定库中的文档。\n* [upload](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#upload) - 上传新文档。\n* [get](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#get) - 获取特定文档的元数据。\n* [update](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#update) - 更新特定文档的元数据。\n* [delete](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#delete) - 删除文档。\n* [text_content](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#text_content) - 获取特定文档的文本内容。\n* [status](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#status) - 获取特定文档的处理状态。\n* [get_signed_url](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#get_signed_url) - 获取特定文档的签名 URL。\n* [extracted_text_signed_url](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#extracted_text_signed_url) - 获取从给定文档中提取的文本的签名 URL。\n* [reprocess](docs\u002Fsdks\u002Fdocuments\u002FREADME.md#reprocess) - 对文档进行重新处理。\n\n### [Beta.Observability.Campaigns](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#create) - 创建并启动一个新的活动。\n* [list](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#list) - 获取所有活动。\n* [fetch](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#fetch) - 根据 ID 获取活动。\n* [delete](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#delete) - 删除一个活动。\n* [fetch_status](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#fetch_status) - 根据活动 ID 获取活动状态。\n* [list_events](docs\u002Fsdks\u002Fcampaigns\u002FREADME.md#list_events) - 获取由给定活动选择的事件 ID。\n\n### [Beta.Observability.ChatCompletionEvents](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md)\n\n* [search](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#search) - 获取聊天完成事件。\n* [search_ids](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#search_ids) - 作为 \u002Fsearch 的替代方案，仅返回 ID，并且可以一次性返回大量 ID。\n* [fetch](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#fetch) - 获取聊天完成事件。\n* 
[fetch_similar_events](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#fetch_similar_events) - 获取相似的聊天完成事件。\n* [judge](docs\u002Fsdks\u002Fchatcompletionevents\u002FREADME.md#judge) - 根据给定选项对事件进行评判。\n\n#### [Beta.Observability.ChatCompletionEvents.Fields](docs\u002Fsdks\u002Ffields\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Ffields\u002FREADME.md#list) - 获取聊天完成字段。\n* [fetch_options](docs\u002Fsdks\u002Ffields\u002FREADME.md#fetch_options) - 获取聊天完成字段选项。\n* [fetch_option_counts](docs\u002Fsdks\u002Ffields\u002FREADME.md#fetch_option_counts) - 获取聊天完成字段选项的数量。\n\n### [Beta.可观测性.数据集](docs\u002Fsdks\u002Fdatasets\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#create) - 创建一个新的空数据集\n* [list](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#list) - 列出现有数据集\n* [fetch](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#fetch) - 根据ID获取数据集\n* [delete](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#delete) - 删除数据集\n* [update](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#update) - 部分更新数据集\n* [list_records](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#list_records) - 列出数据集中现有的记录\n* [create_record](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#create_record) - 向数据集中添加一条对话\n* [import_from_campaign](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_campaign) - 使用市场活动填充数据集\n* [import_from_explorer](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_explorer) - 使用探索器中的样本填充数据集\n* [import_from_file](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_file) - 使用上传的文件中的样本填充数据集\n* [import_from_playground](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_playground) - 使用游乐场中的样本填充数据集\n* [import_from_dataset_records](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#import_from_dataset_records) - 使用另一个数据集中的样本填充数据集\n* [export_to_jsonl](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#export_to_jsonl) - 导出到文件API，并获取用于下载生成的JSONL文件的预签名URL\n* [fetch_task](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#fetch_task) - 获取数据集导入任务的状态\n* 
[list_tasks](docs\u002Fsdks\u002Fdatasets\u002FREADME.md#list_tasks) - 列出给定数据集的导入任务\n\n#### [Beta.可观测性.数据集.记录](docs\u002Fsdks\u002Frecords\u002FREADME.md)\n\n* [fetch](docs\u002Fsdks\u002Frecords\u002FREADME.md#fetch) - 从数据集中获取指定对话的内容\n* [delete](docs\u002Fsdks\u002Frecords\u002FREADME.md#delete) - 从数据集中删除一条记录\n* [bulk_delete](docs\u002Fsdks\u002Frecords\u002FREADME.md#bulk_delete) - 从数据集中批量删除多条记录\n* [judge](docs\u002Fsdks\u002Frecords\u002FREADME.md#judge) - 根据给定选项对数据集中的记录运行Judge\n* [update_payload](docs\u002Fsdks\u002Frecords\u002FREADME.md#update_payload) - 更新数据集记录的对话负载\n* [update_properties](docs\u002Fsdks\u002Frecords\u002FREADME.md#update_properties) - 更新对话属性\n\n### [Beta.可观测性.Judge](docs\u002Fsdks\u002Fjudges\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fjudges\u002FREADME.md#create) - 创建一个新的Judge\n* [list](docs\u002Fsdks\u002Fjudges\u002FREADME.md#list) - 获取Judge，支持可选的过滤和搜索\n* [fetch](docs\u002Fsdks\u002Fjudges\u002FREADME.md#fetch) - 根据ID获取Judge\n* [delete](docs\u002Fsdks\u002Fjudges\u002FREADME.md#delete) - 删除Judge\n* [update](docs\u002Fsdks\u002Fjudges\u002FREADME.md#update) - 更新Judge\n* [judge_conversation](docs\u002Fsdks\u002Fjudges\u002FREADME.md#judge_conversation) - 对一条对话运行保存的Judge\n\n### [聊天](docs\u002Fsdks\u002Fchat\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Fchat\u002FREADME.md#complete) - 聊天完成\n* [stream](docs\u002Fsdks\u002Fchat\u002FREADME.md#stream) - 流式聊天完成\n\n### [分类器](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md)\n\n* [moderate](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#moderate) - 内容审核\n* [moderate_chat](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#moderate_chat) - 聊天内容审核\n* [classify](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#classify) - 分类\n* [classify_chat](docs\u002Fsdks\u002Fclassifiers\u002FREADME.md#classify_chat) - 聊天分类\n\n### [嵌入](docs\u002Fsdks\u002Fembeddings\u002FREADME.md)\n\n* [create](docs\u002Fsdks\u002Fembeddings\u002FREADME.md#create) - 嵌入\n\n### 
[事件](docs\u002Fsdks\u002Fevents\u002FREADME.md)\n\n* [get_stream_events](docs\u002Fsdks\u002Fevents\u002FREADME.md#get_stream_events) - 获取流事件\n* [get_workflow_events](docs\u002Fsdks\u002Fevents\u002FREADME.md#get_workflow_events) - 获取工作流事件\n\n### [文件](docs\u002Fsdks\u002Ffiles\u002FREADME.md)\n\n* [upload](docs\u002Fsdks\u002Ffiles\u002FREADME.md#upload) - 上传文件\n* [list](docs\u002Fsdks\u002Ffiles\u002FREADME.md#list) - 列出文件\n* [retrieve](docs\u002Fsdks\u002Ffiles\u002FREADME.md#retrieve) - 检索文件\n* [delete](docs\u002Fsdks\u002Ffiles\u002FREADME.md#delete) - 删除文件\n* [download](docs\u002Fsdks\u002Ffiles\u002FREADME.md#download) - 下载文件\n* [get_signed_url](docs\u002Fsdks\u002Ffiles\u002FREADME.md#get_signed_url) - 获取签名URL\n\n### [Fim](docs\u002Fsdks\u002Ffim\u002FREADME.md)\n\n* [complete](docs\u002Fsdks\u002Ffim\u002FREADME.md#complete) - Fim完成\n* [stream](docs\u002Fsdks\u002Ffim\u002FREADME.md#stream) - 流式Fim完成\n\n### [微调作业](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#list) - 获取微调作业\n* [create](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#create) - 创建微调作业\n* [get](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#get) - 获取微调作业\n* [cancel](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#cancel) - 取消微调作业\n* [start](docs\u002Fsdks\u002Ffinetuningjobs\u002FREADME.md#start) - 开始微调作业\n\n### [模型](docs\u002Fsdks\u002Fmodels\u002FREADME.md)\n\n* [list](docs\u002Fsdks\u002Fmodels\u002FREADME.md#list) - 列出模型\n* [retrieve](docs\u002Fsdks\u002Fmodels\u002FREADME.md#retrieve) - 检索模型\n* [delete](docs\u002Fsdks\u002Fmodels\u002FREADME.md#delete) - 删除模型\n* [update](docs\u002Fsdks\u002Fmodels\u002FREADME.md#update) - 更新微调后的模型\n* [archive](docs\u002Fsdks\u002Fmodels\u002FREADME.md#archive) - 归档微调后的模型\n* [unarchive](docs\u002Fsdks\u002Fmodels\u002FREADME.md#unarchive) - 解档微调后的模型\n\n### [OCR](docs\u002Fsdks\u002Focr\u002FREADME.md)\n\n* [process](docs\u002Fsdks\u002Focr\u002FREADME.md#process) - OCR\n\n### 
[工作流](docs\u002Fsdks\u002Fworkflows\u002FREADME.md)\n\n* [get_workflows](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflows) - 获取工作流\n* [get_workflow_registrations](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflow_registrations) - 获取工作流注册信息\n* [execute_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#execute_workflow) - 执行工作流\n* [~~execute_workflow_registration~~](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#execute_workflow_registration) - 执行工作流注册 :warning: **已弃用**\n* [get_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflow) - 获取工作流\n* [update_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#update_workflow) - 更新工作流\n* [get_workflow_registration](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#get_workflow_registration) - 获取工作流注册信息\n* [archive_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#archive_workflow) - 归档工作流\n* [unarchive_workflow](docs\u002Fsdks\u002Fworkflows\u002FREADME.md#unarchive_workflow) - 取消归档工作流\n\n#### [工作流.部署](docs\u002Fsdks\u002Fdeployments\u002FREADME.md)\n\n* [list_deployments](docs\u002Fsdks\u002Fdeployments\u002FREADME.md#list_deployments) - 列出部署\n* [get_deployment](docs\u002Fsdks\u002Fdeployments\u002FREADME.md#get_deployment) - 获取部署\n\n#### [工作流.事件](docs\u002Fsdks\u002Fworkflowsevents\u002FREADME.md)\n\n* [get_stream_events](docs\u002Fsdks\u002Fworkflowsevents\u002FREADME.md#get_stream_events) - 获取流事件\n* [get_workflow_events](docs\u002Fsdks\u002Fworkflowsevents\u002FREADME.md#get_workflow_events) - 获取工作流事件\n\n#### [工作流.执行](docs\u002Fsdks\u002Fexecutions\u002FREADME.md)\n\n* [get_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution) - 获取工作流执行\n* [get_workflow_execution_history](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_history) - 获取工作流执行历史\n* [signal_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#signal_workflow_execution) - 发送工作流执行信号\n* 
[query_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#query_workflow_execution) - 查询工作流执行\n* [terminate_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#terminate_workflow_execution) - 终止工作流执行\n* [batch_terminate_workflow_executions](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#batch_terminate_workflow_executions) - 批量终止工作流执行\n* [cancel_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#cancel_workflow_execution) - 取消工作流执行\n* [batch_cancel_workflow_executions](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#batch_cancel_workflow_executions) - 批量取消工作流执行\n* [reset_workflow](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#reset_workflow) - 重置工作流\n* [update_workflow_execution](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#update_workflow_execution) - 更新工作流执行\n* [get_workflow_execution_trace_otel](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_trace_otel) - 获取工作流执行追踪（Otel）\n* [get_workflow_execution_trace_summary](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_trace_summary) - 获取工作流执行追踪摘要\n* [get_workflow_execution_trace_events](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#get_workflow_execution_trace_events) - 获取工作流执行追踪事件\n* [stream](docs\u002Fsdks\u002Fexecutions\u002FREADME.md#stream) - 流式传输\n\n#### [工作流.指标](docs\u002Fsdks\u002Fmetrics\u002FREADME.md)\n\n* [get_workflow_metrics](docs\u002Fsdks\u002Fmetrics\u002FREADME.md#get_workflow_metrics) - 获取工作流指标\n\n#### [工作流.运行](docs\u002Fsdks\u002Fruns\u002FREADME.md)\n\n* [list_runs](docs\u002Fsdks\u002Fruns\u002FREADME.md#list_runs) - 列出运行\n* [get_run](docs\u002Fsdks\u002Fruns\u002FREADME.md#get_run) - 获取运行\n* [get_run_history](docs\u002Fsdks\u002Fruns\u002FREADME.md#get_run_history) - 获取运行历史\n\n#### [工作流.计划](docs\u002Fsdks\u002Fschedules\u002FREADME.md)\n\n* [get_schedules](docs\u002Fsdks\u002Fschedules\u002FREADME.md#get_schedules) - 获取计划\n* [schedule_workflow](docs\u002Fsdks\u002Fschedules\u002FREADME.md#schedule_workflow) - 
计划工作流\n* [unschedule_workflow](docs\u002Fsdks\u002Fschedules\u002FREADME.md#unschedule_workflow) - 取消计划工作流\n\n\u003C\u002Fdetails>\n\u003C!-- 结束可用资源与操作 [operations] -->\n\n\u003C!-- 开始服务器发送事件流 [eventstream] -->\n## 服务器发送事件流\n\n[服务器发送事件][mdn-sse] 用于从某些操作中流式传输内容。这些操作会将流暴露为 [生成器][generator]，可以使用简单的 `for` 循环来消费。当服务器不再有事件可发送并关闭底层连接时，循环将终止。\n\n该流也是一个 [上下文管理器][context-manager]，可以与 `with` 语句一起使用，并在退出上下文时自动关闭底层连接。\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # 处理事件\n            print(event, flush=True)\n\n```\n\n[mdn-sse]: https:\u002F\u002Fdeveloper.mozilla.org\u002Fen-US\u002Fdocs\u002FWeb\u002FAPI\u002FServer-sent_events\u002FUsing_server-sent_events\n[generator]: https:\u002F\u002Fbook.pythontips.com\u002Fen\u002Flatest\u002Fgenerators.html\n[context-manager]: https:\u002F\u002Fbook.pythontips.com\u002Fen\u002Flatest\u002Fcontext_managers.html\n\u003C!-- 结束服务器发送事件流 [eventstream] -->\n\n\u003C!-- 开始分页 [pagination] -->\n## 分页\n\n此 SDK 中的一些端点支持分页。要使用分页，您可以像往常一样调用 SDK 方法，但返回的响应对象将具有一个 `next()` 方法，可以调用该方法来获取下一批结果。如果 `next()` 的返回值为 `None`，则表示没有更多页面可供获取。\n\n以下是一个此类分页调用的示例：\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.workflows.get_workflows(active_only=False, include_shared=True, limit=50)\n\n    while res is not None:\n        # 处理项目\n\n        res = res.next()\n\n```\n\u003C!-- 结束分页 [pagination] -->\n\n\u003C!-- 开始文件上传 [file-upload] -->\n## 文件上传\n\n某些 SDK 方法接受文件对象作为请求体或多部分请求的一部分。通常建议以流的形式上传文件，而不是将整个文件内容读入内存。这样做可以避免过度消耗内存，并防止在处理非常大的文件时出现内存不足错误。下面的示例展示了如何将文件流附加到请求中。\n\n> [!TIP]\n>\n> 对于处理文件上传的端点，也可以使用字节数组。然而，对于大文件，建议使用流。\n>\n\n```python\nfrom mistralai.client import 
Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.transcriptions.complete(model=\"Model X\", diarize=False)\n\n    # 处理响应\n    print(res)\n\n```\n\u003C!-- 结束文件上传 [file-upload] -->\n\n\u003C!-- 开始重试 [retries] -->\n\n## 重试\n\n本 SDK 中的部分端点支持重试。如果您在未进行任何配置的情况下使用该 SDK，它将回退到 API 提供的默认重试策略。不过，您可以在单个操作级别或在整个 SDK 范围内覆盖默认重试策略。\n\n要为单个 API 调用更改默认重试策略，只需向调用提供一个 `RetryConfig` 对象：\n```python\nfrom mistralai.client import Mistral\nfrom mistralai.client.utils import BackoffStrategy, RetryConfig\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    },\n        retries=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False))\n\n    with res as event_stream:\n        for event in event_stream:\n            # 处理事件\n            print(event, flush=True)\n\n```\n\n如果您希望覆盖所有支持重试的操作的默认重试策略，可以在初始化 SDK 时使用 `retry_config` 可选参数：\n```python\nfrom mistralai.client import Mistral\nfrom mistralai.client.utils import BackoffStrategy, RetryConfig\nimport os\n\n\nwith Mistral(\n    retry_config=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False),\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # 处理事件\n            print(event, flush=True)\n\n```\n\u003C!-- 结束重试 [retries] -->\n\n\u003C!-- 开始错误处理 [errors] -->\n## 错误处理\n\n[`MistralError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fmistralerror.py) 是所有 HTTP 错误响应的基类。它具有以下属性：\n\n| 属性           | 类型             | 描述                                                                             |\n| ------------------ | ---------------- | 
--------------------------------------------------------------------------------------- |\n| `err.message`      | `str`            | 错误消息                                                                           |\n| `err.status_code`  | `int`            | HTTP 响应的状态码，例如 `404`                                                      |\n| `err.headers`      | `httpx.Headers`  | HTTP 响应头                                                                   |\n| `err.body`         | `str`            | HTTP 正文。如果没有返回正文，则可能为空字符串。                                  |\n| `err.raw_response` | `httpx.Response` | 原始 HTTP 响应                                                                       |\n| `err.data`         |                  | 可选。某些错误可能包含结构化数据。[请参阅错误类](#error-classes)。 |\n\n### 示例\n```python\nfrom mistralai.client import Mistral, errors\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n    res = None\n    try:\n\n        res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n        })\n\n        with res as event_stream:\n            for event in event_stream:\n                # 处理事件\n                print(event, flush=True)\n\n\n    except errors.MistralError as e:\n        # HTTP 错误响应的基类\n        print(e.message)\n        print(e.status_code)\n        print(e.body)\n        print(e.headers)\n        print(e.raw_response)\n\n        # 根据方法的不同，可能会抛出不同的错误\n        if isinstance(e, errors.HTTPValidationError):\n            print(e.data.detail)  # Optional[List[models.ValidationError]]\n```\n\n### 错误类\n**主要错误：**\n* [`MistralError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fmistralerror.py)：HTTP 错误响应的基类。\n\n\u003Cdetails>\u003Csummary>较常见的错误（7 种）\u003C\u002Fsummary>\n\n\u003Cbr \u002F>\n\n**网络错误：**\n* [`httpx.RequestError`](https:\u002F\u002Fwww.python-httpx.org\u002Fexceptions\u002F#httpx.RequestError)：请求错误的基类。\n    * 
[`httpx.ConnectError`](https:\u002F\u002Fwww.python-httpx.org\u002Fexceptions\u002F#httpx.ConnectError)：HTTP 客户端无法向服务器发出请求。\n    * [`httpx.TimeoutException`](https:\u002F\u002Fwww.python-httpx.org\u002Fexceptions\u002F#httpx.TimeoutException)：HTTP 请求超时。\n\n\n**继承自 [`MistralError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fmistralerror.py)：**\n* [`HTTPValidationError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fhttpvalidationerror.py)：验证错误。状态码为 `422`。适用于 168 种方法中的 103 种。*\n* [`ObservabilityError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fobservabilityerror.py)：错误请求——请求参数或数据无效。适用于 168 种方法中的 41 种。*\n* [`ResponseValidationError`](.\u002Fsrc\u002Fmistralai\u002Fclient\u002Ferrors\u002Fresponsevalidationerror.py)：响应数据与预期的 Pydantic 模型类型不匹配。可通过 `cause` 属性访问 Pydantic 验证错误。\n\n\u003C\u002Fdetails>\n\n\\* 请查看 [方法文档](#available-resources-and-operations) 以了解错误是否适用。\n\u003C!-- 结束错误处理 [errors] -->\n\n\u003C!-- 开始服务器选择 [server] -->\n## 服务器选择\n\n### 按名称选择服务器\n\n您可以通过在初始化 SDK 客户端实例时将服务器名称传递给 `server: str` 可选参数，来全局覆盖默认服务器。所选服务器随后将作为使用它的操作的默认服务器。下表列出了与可用服务器关联的名称：\n\n| 名称 | 服务器                   | 描述          |\n| ---- | ------------------------ | -------------------- |\n| `eu` | `https:\u002F\u002Fapi.mistral.ai` | 欧盟生产服务器 |\n\n#### 示例\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    server=\"eu\",\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # 处理事件\n            print(event, flush=True)\n\n```\n\n### 按客户端覆盖服务器 URL\n\n默认服务器也可以通过在初始化 SDK 客户端实例时将 URL 传递给 `server_url: str` 可选参数来全局覆盖。例如：\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    server_url=\"https:\u002F\u002Fapi.mistral.ai\",\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = 
mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # 处理事件\n            print(event, flush=True)\n\n```\n\u003C!-- 结束服务器选择 [server] -->\n\n\u003C!-- 开始自定义 HTTP 客户端 [http-client] -->\n\n## 自定义 HTTP 客户端\n\nPython SDK 使用 [httpx](https:\u002F\u002Fwww.python-httpx.org\u002F) HTTP 库进行 API 调用。为了提供一种便捷的方式来配置超时、Cookie、代理、自定义头部以及其他底层配置，您可以使用自己的 HTTP 客户端实例来初始化 SDK 客户端。\n根据您使用的是同步版还是异步版 SDK，您可以分别传入 `HttpClient` 或 `AsyncHttpClient` 的实例，它们是 Protocol 的实现，确保客户端具备执行 API 调用所需的方法。\n这使您能够将客户端封装在自己的自定义逻辑中，例如添加自定义头部、日志记录或错误处理；或者您也可以直接传入 `httpx.Client` 或 `httpx.AsyncClient` 的实例。\n\n例如，您可以为该 SDK 发出的每个请求指定一个头部，如下所示：\n```python\nfrom mistralai.client import Mistral\nimport httpx\n\nhttp_client = httpx.Client(headers={\"x-custom-header\": \"someValue\"})\ns = Mistral(client=http_client)\n```\n\n或者，您可以将客户端封装在自己的自定义逻辑中：\n```python\nfrom typing import Any, Optional, Union\n\nfrom mistralai.client import Mistral\nfrom mistralai.client.httpclient import AsyncHttpClient\nimport httpx\n\nclass CustomClient(AsyncHttpClient):\n    client: AsyncHttpClient\n\n    def __init__(self, client: AsyncHttpClient):\n        self.client = client\n\n    async def send(\n        self,\n        request: httpx.Request,\n        *,\n        stream: bool = False,\n        auth: Union[\n            httpx._types.AuthTypes, httpx._client.UseClientDefault, None\n        ] = httpx.USE_CLIENT_DEFAULT,\n        follow_redirects: Union[\n            bool, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n    ) -> httpx.Response:\n        request.headers[\"Client-Level-Header\"] = \"added by client\"\n\n        return await self.client.send(\n            request, stream=stream, auth=auth, follow_redirects=follow_redirects\n        )\n\n    def build_request(\n        self,\n        method: str,\n        url: httpx._types.URLTypes,\n        *,\n        content: Optional[httpx._types.RequestContent] = None,\n        
data: Optional[httpx._types.RequestData] = None,\n        files: Optional[httpx._types.RequestFiles] = None,\n        json: Optional[Any] = None,\n        params: Optional[httpx._types.QueryParamTypes] = None,\n        headers: Optional[httpx._types.HeaderTypes] = None,\n        cookies: Optional[httpx._types.CookieTypes] = None,\n        timeout: Union[\n            httpx._types.TimeoutTypes, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n        extensions: Optional[httpx._types.RequestExtensions] = None,\n    ) -> httpx.Request:\n        return self.client.build_request(\n            method,\n            url,\n            content=content,\n            data=data,\n            files=files,\n            json=json,\n            params=params,\n            headers=headers,\n            cookies=cookies,\n            timeout=timeout,\n            extensions=extensions,\n        )\n\ns = Mistral(async_client=CustomClient(httpx.AsyncClient()))\n```\n\u003C!-- 结束自定义 HTTP 客户端 [http-client] -->\n\n\u003C!-- 开始认证 [security] -->\n## 认证\n\n### 每客户端安全方案\n\n该 SDK 全局支持以下安全方案：\n\n| 名称      | 类型   | 方案       | 环境变量         |\n| --------- | ------ | ---------- | ---------------- |\n| `api_key` | HTTP   | HTTP Bearer | `MISTRAL_API_KEY` |\n\n要通过 API 进行认证，在初始化 SDK 客户端实例时必须设置 `api_key` 参数。例如：\n```python\nfrom mistralai.client import Mistral\nimport os\n\n\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    res = mistral.audio.speech.complete(input=\"\u003Cvalue>\", stream=False, additional_properties={\n\n    })\n\n    with res as event_stream:\n        for event in event_stream:\n            # 处理事件\n            print(event, flush=True)\n\n```\n\u003C!-- 结束认证 [security] -->\n\n\u003C!-- 开始资源管理 [resource-management] -->\n## 资源管理\n\n`Mistral` 类实现了上下文管理协议，并注册了一个终结器函数，用于关闭其内部使用的同步和异步 HTTPX 客户端。这将关闭 HTTP 连接，释放内存，并释放 SDK 占用的其他资源。对于仅执行少量 SDK 方法调用的短生命周期 Python 程序和笔记本而言，资源管理可能不是主要关注点。然而，在长期运行的程序中，通过 [上下文管理器][context-manager] 创建单个 SDK 
实例并在整个应用程序中重复使用它会更有益。\n\n[context-manager]: https:\u002F\u002Fdocs.python.org\u002F3\u002Freference\u002Fdatamodel.html#context-managers\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\ndef main():\n\n    with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n        # 应用程序的其余部分在这里...\n        ...\n\n\n# 或者在使用异步时：\nasync def amain():\n\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n        # 应用程序的其余部分在这里...\n        ...\n```\n\u003C!-- 结束资源管理 [resource-management] -->\n\n\u003C!-- 开始调试 [debug] -->\n## 调试\n\n您可以配置 SDK 以输出 SDK 请求和响应的调试日志。\n\n您可以直接将自己的日志记录器类传递给 SDK。\n```python\nfrom mistralai.client import Mistral\nimport logging\n\nlogging.basicConfig(level=logging.DEBUG)\ns = Mistral(debug_logger=logging.getLogger(\"mistralai.client\"))\n```\n\n您还可以通过将环境变量 `MISTRAL_DEBUG` 设置为 `true` 来启用默认的调试日志记录器。\n\u003C!-- 结束调试 [debug] -->\n\n\u003C!-- 开始 IDE 支持 [idesupport] -->\n## IDE 支持\n\n### PyCharm\n\n通常，SDK 可以很好地与大多数 IDE 无缝配合。然而，在使用 PyCharm 时，通过安装一个额外的插件，您可以享受与 Pydantic 更好的集成。\n\n- [PyCharm Pydantic 插件](https:\u002F\u002Fdocs.pydantic.dev\u002Flatest\u002Fintegrations\u002Fpycharm\u002F)\n\u003C!-- 结束 IDE 支持 [idesupport] -->\n\n\u003C!-- 未来 Speakeasy SDK 章节占位符 -->\n\n# 开发\n\n## 贡献\n\n虽然我们重视对本 SDK 的开源贡献，但此库是由程序自动生成的。任何手动添加到内部文件的更改都将在下一次生成时被覆盖。\n我们期待您的反馈。欢迎随时提交 PR 或包含概念验证的议题，我们将尽力将其纳入未来的版本中。","# Mistral Python Client 快速上手指南\n\n## 环境准备\n\n在开始之前，请确保满足以下要求：\n\n*   **Python 版本**：Python 3.10 或更高版本。\n*   **API Key**：您需要一个 Mistral AI API Key。\n    1.  访问 [Mistral 控制台](https:\u002F\u002Fconsole.mistral.ai) 注册账号并获取 Key。\n    2.  
将 Key 设置为环境变量（以 zsh 为例）：\n        ```bash\n        echo 'export MISTRAL_API_KEY=[your_key_here]' >> ~\u002F.zshenv\n        source ~\u002F.zshenv\n        ```\n*   **包管理器**：推荐使用 `uv`（速度更快），也支持 `pip` 或 `poetry`。\n\n> **注意**：如果您计划使用国内网络环境，建议在安装命令中指定清华或阿里镜像源以提升下载速度（见安装步骤）。\n\n## 安装步骤\n\n您可以选择以下任意一种方式安装 SDK：\n\n### 方式一：使用 uv（推荐）\n`uv` 是现代化的快速 Python 包管理器。\n```bash\nuv add mistralai\n```\n*国内加速方案：*\n```bash\nuv add mistralai --index-url https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n### 方式二：使用 pip\n```bash\npip install mistralai\n```\n*国内加速方案：*\n```bash\npip install mistralai -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n### 方式三：使用 Poetry\n```bash\npoetry add mistralai\n```\n\n### 可选：安装 Agents 额外依赖\n如果您需要使用 Agent 相关功能，请安装额外依赖：\n```bash\npip install \"mistralai[agents]\"\n```\n\n## 基本使用\n\n以下示例展示如何初始化客户端并调用聊天补全接口。SDK 同时支持同步和异步调用。\n\n### 同步调用示例\n\n```python\nfrom mistralai.client import Mistral\nimport os\n\n# 初始化客户端\nwith Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n) as mistral:\n\n    # 调用聊天接口\n    res = mistral.chat.complete(\n        model=\"mistral-large-latest\", \n        messages=[\n            {\n                \"role\": \"user\",\n                \"content\": \"Who is the best French painter? Answer in one short sentence.\",\n            },\n        ], \n        stream=False, \n        response_format={\n            \"type\": \"text\",\n        }\n    )\n\n    # 处理响应\n    print(res)\n```\n\n### 异步调用示例\n\n```python\nimport asyncio\nfrom mistralai.client import Mistral\nimport os\n\nasync def main():\n    async with Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    ) as mistral:\n\n        res = await mistral.chat.complete_async(\n            model=\"mistral-large-latest\", \n            messages=[\n                {\n                    \"role\": \"user\",\n                    \"content\": \"Who is the best French painter? 
Answer in one short sentence.\",\n                },\n            ], \n            stream=False, \n            response_format={\n                \"type\": \"text\",\n            }\n        )\n\n        print(res)\n\nasyncio.run(main())\n```\n\n### 其他常用功能简述\n\n*   **文件上传**：使用 `mistral.files.upload` 方法。\n*   **Embeddings**：使用 `mistral.embeddings.create` 方法生成向量嵌入。\n*   **流式输出**：将 `stream` 参数设为 `True` 即可启用服务器发送事件（SSE）流式传输。","某电商公司的数据团队需要构建一个实时智能客服系统，自动处理用户关于订单状态和退货政策的咨询。\n\n### 没有 client-python 时\n- 开发人员必须手动编写复杂的 HTTP 请求代码来处理鉴权、重试机制和错误捕获，导致核心业务逻辑被大量样板代码淹没。\n- 实现流式回复（Server-sent events）以模拟真人打字效果时，需底层解析 SSE 数据流，开发难度大且容易出现连接中断问题。\n- 每次升级 Mistral AI 接口版本时，团队需要人工对照文档修改请求参数结构，极易因字段不匹配引发线上故障。\n- 缺乏统一的类型提示和 IDE 支持，调试 API 调用参数耗时费力，新成员上手门槛高。\n\n### 使用 client-python 后\n- 只需几行代码即可初始化 `Mistral` 客户端并自动处理 API Key 认证、网络重试及异常管理，让团队专注于对话策略设计。\n- 内置原生流式传输支持，通过简单的迭代器即可轻松实现流畅的打字机效果，显著提升用户交互体验。\n- 库内部已适配最新的 API 规范，配合清晰的迁移指南，确保接口升级时业务代码无需大幅重构即可平滑过渡。\n- 提供完善的类型注解和 IDE 智能补全，开发者在编码阶段即可发现参数错误，大幅缩短调试周期。\n\nclient-python 将繁琐的底层通信细节封装为简洁的 Python 接口，让开发者能高效、稳定地将顶尖大模型能力集成到生产环境中。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fmistralai_client-python_e4fd2541.png","mistralai","Mistral AI","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fmistralai_7094b1e0.png",null,"contact@mistral.ai","mistral.ai","https:\u002F\u002Fgithub.com\u002Fmistralai",[81,85,89],{"name":82,"color":83,"percentage":84},"Python","#3572A5",99.8,{"name":86,"color":87,"percentage":88},"Shell","#89e051",0.2,{"name":90,"color":91,"percentage":92},"Makefile","#427819",0,720,175,"2026-04-14T21:51:55","Apache-2.0","Linux, macOS, Windows","未说明",{"notes":100,"python":101,"dependencies":102},"该工具是 Mistral AI 的 Python 客户端 SDK，用于调用云端 API（包括标准端点、Azure AI 和 Google Cloud Vertex AI），而非在本地运行模型，因此无需本地 GPU 或大内存。使用 Agents 功能需安装额外依赖 `mistralai[agents]`。支持同步和异步请求。需配置 MISTRAL_API_KEY 环境变量或使用云服务商的认证方式。","3.10+",[73,103,104,105],"asyncio","google-auth (可选，用于 GCP)","mistralai-workflows 
(可选)",[35,14,13],"2026-03-27T02:49:30.150509","2026-04-17T08:23:11.341513",[110,115,120,125,130,134],{"id":111,"question_zh":112,"answer_zh":113,"source_url":114},36337,"为什么 open-mixtral-8x22b 和 mistral-small-latest 模型的函数调用（Function Calling）功能突然失效？","这是一个临时的 API 端问题，维护者已在服务端直接部署了修复补丁。用户无需更新 SDK 版本，直接使用当前版本的包即可恢复正常使用。如果之前遇到此问题，现在应该已经解决。","https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fissues\u002F92",{"id":116,"question_zh":117,"answer_zh":118,"source_url":119},36338,"导入 'Mistral' 类时出现 \"cannot import name 'Mistral' from 'mistralai'\" 错误怎么办？","这通常是由于 Jupyter Notebook 或 IDE 的缓存问题导致的。尝试以下解决方案：\n1. 重启 Jupyter Kernel（内核）。\n2. 重启 VSCode 编辑器。\n3. 确保使用正确的 Python 环境安装模块：`\u002Fpath\u002Fto\u002Fyour\u002Fpython -m pip install mistralai`。\n4. 如果上述无效，尝试卸载后重新安装，或从头重建虚拟环境。","https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fissues\u002F211",{"id":121,"question_zh":122,"answer_zh":123,"source_url":124},36339,"Mistral OCR API 返回 \"Document is not a valid PDF\" (错误代码 3740)，但文件在其他阅读器中正常，如何解决？","这通常是因为 PDF 文件中包含某些解析器严格校验的非标准引用（如损坏的 \u002FMatte 对象引用），而其他阅读器会忽略它们。\n解决方案：\n1. 等待官方修复：维护者已针对此类严格校验问题进行了更新，许多之前失败的 PDF 现在可以正常处理。\n2. 临时变通方法：在发送给 OCR API 之前，将 PDF 页面渲染为光栅图像（如 PNG\u002FJPG），然后以图片形式提交处理，从而绕过 PDF 解析器的严格校验。","https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fissues\u002F485",{"id":126,"question_zh":127,"answer_zh":128,"source_url":129},36340,"mistralai 包是否限制了依赖项（如 httpx）的版本上限，导致与其他库冲突？","早期版本确实存在过度限制依赖项上限的问题，但这已被社区反馈并修复。维护者已合并相关 PR 并发布了新版本，移除了不必要的依赖项上限约束（upper bounds），现在只保留下限约束。请确保将 `mistralai` 包升级到最新版本以解决依赖冲突问题。","https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fissues\u002F171",{"id":131,"question_zh":132,"answer_zh":133,"source_url":114},36341,"在使用 tool_choice=\"any\" 时，模型返回格式错误的 JSON 而不是工具调用（Tool Use）怎么办？","这是一个已知但非阻塞性的问题。当强制模型使用工具时，偶尔会返回格式不佳的 JSON。目前建议的做法是：\n1. 在代码中添加健壮的 JSON 解析异常处理逻辑。\n2. 
如果遇到特定复现案例，建议收集具体示例并向维护者提交新的 Issue 以便进一步追踪和优化。",{"id":135,"question_zh":136,"answer_zh":137,"source_url":138},36342,"files.upload 接口在使用 purpose=\"ocr\" 参数时报类型不匹配错误（type mismatch）是什么原因？","这是 SDK 类型定义与实际 API 需求之间的不一致导致的 Bug。该问题已被确认为客户端缺陷。解决方法是检查并更新 `mistralai` 库到最新版本，维护者通常会在后续版本中修正此类类型注解错误。如果最新版仍有问题，可暂时忽略类型检查提示或通过类型转换绕过，直到官方发布修复补丁。","https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fissues\u002F196",[140,145,150,155,160,165,170,175,180,185,190,195,200,205,210,215,220,225,230,235],{"id":141,"version":142,"summary_zh":143,"released_at":144},289123,"v2.4.0rc2","# 由 Speakeasy CLI 生成\n\n## 2026-04-14 15:56:48\n### 更改\n基于：\n- OpenAPI 文档 1.0.0 \n- Speakeasy CLI 1.761.1 (2.879.6) https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\n### 生成\n- [python v2.4.0rc2] .\n### 发布\n- [PyPI v2.4.0rc2] https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.4.0rc2 - .\n\n发布完成","2026-04-14T15:56:51",{"id":146,"version":147,"summary_zh":148,"released_at":149},289124,"v2.4.0rc1","# 由 Speakeasy CLI 生成\n\n## 2026-04-14 12:23:19\n### 变更\n基于：\n- OpenAPI 文档 1.0.0 \n- Speakeasy CLI 1.761.1 (2.879.6) https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\n### 生成\n- [python v2.4.0rc1] .\n### 发布\n- [PyPI v2.4.0rc1] https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.4.0rc1 - .\n\n发布完成","2026-04-14T12:23:22",{"id":151,"version":152,"summary_zh":153,"released_at":154},289125,"v2.3.2","# 由 Speakeasy CLI 生成\n[mistralai 2.3.2](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.3.2)\n## Python SDK 变更：\n* `mistral.chat.complete()`:  `response.choices[].messages[]` **已更改**（破坏性变更 ⚠️）\n* `mistral.agents.complete()`:  `response.choices[].messages[]` **已更改**（破坏性变更 ⚠️）\n* `mistral.fim.complete()`:  `response.choices[].messages[]` **已更改**（破坏性变更 ⚠️）\n* `mistral.beta.observability.datasets.delete()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.delete()`:  `error.detail.error_code` **已更改**\n* 
`mistral.beta.observability.chat_completion_events.judge()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.chat_completion_events.fields.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.chat_completion_events.fields.fetch_options()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.chat_completion_events.fields.fetch_option_counts()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.create()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.fetch()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.delete()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.update()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.judge_conversation()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.create()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.fetch()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.list_records()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.fetch_status()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.list_events()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.create()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.fetch()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.libraries.list()`:  `request` **已更改**\n* `mistral.beta.observability.chat_completion_events.fetch_similar_events()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.update()`:  `error.detail.error_code` **已更改**\n* 
`mistral.beta.observability.datasets.records.update_properties()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.import_from_campaign()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.import_from_explorer()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.import_from_file()`:  `error.","2026-04-10T14:04:55",{"id":156,"version":157,"summary_zh":158,"released_at":159},289126,"v2.3.1","# 由 Speakeasy CLI 生成\n[mistralai 2.3.1](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.3.1)\n## Python SDK 变更：\n* `mistral.ocr.process()`:\n  *  `request` **已更改**（破坏性变更 ⚠️）\n  *  `response.pages[]` **已更改**\n* `mistral.chat.complete()`：`response.choices[]` **已更改**\n* `mistral.fim.complete()`：`response.choices[]` **已更改**\n* `mistral.agents.complete()`：`response.choices[]` **已更改**\n* `mistral.workflows.executions.stream()`：`request.event_source` **已更改**\n\n使用 [Speakeasy CLI 1.761.1](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases) 生成\n\n\n发布完成","2026-04-07T14:48:24",{"id":161,"version":162,"summary_zh":163,"released_at":164},289127,"v2.3.0","# 由 Speakeasy CLI 生成\n[mistralai 2.3.0](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.3.0)\n## Python SDK 变更：\n* `mistral.models.list()`:  `response.data[].union(fine-tuned).job` **已更改**（破坏性变更 ⚠️）\n* `mistral.models.retrieve()`:  `response.union(fine-tuned).job` **已更改**（破坏性变更 ⚠️）\n* `mistral.chat.complete()`: \n  *  `request` **已更改**（破坏性变更 ⚠️）\n  *  `response.choices[].message.tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.chat.stream()`: \n  *  `request` **已更改**（破坏性变更 ⚠️）\n  *  `response.[].data.choices[].delta.tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.fim.complete()`:  `response.choices[].message.tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.fim.stream()`:  `response.[].data.choices[].delta.tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.workflows.get_workflow_registration()`:  
`response.workflow_registration` **已更改**（破坏性变更 ⚠️）\n* `mistral.workflows.execute_workflow_registration()`:  `request.input` **已更改**（破坏性变更 ⚠️）\n* `mistral.workflows.execute_workflow()`:  `request.input` **已更改**（破坏性变更 ⚠️）\n* `mistral.workflows.get_workflow_registrations()`:  `response.workflow_registrations[]` **已更改**（破坏性变更 ⚠️）\n* `mistral.classifiers.classify_chat()`: \n  *  `request.input.union(InstructRequest).messages[].union(assistant).tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.classifiers.moderate_chat()`: \n  *  `request.inputs.union(Array\u003C>)[].union(assistant).tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.agents.stream()`: \n  *  `request` **已更改**（破坏性变更 ⚠️）\n  *  `response.[].data.choices[].delta.tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.agents.complete()`: \n  *  `request` **已更改**（破坏性变更 ⚠️）\n  *  `response.choices[].message.tool_calls[].type` **已更改**（破坏性变更 ⚠️）\n* `mistral.beta.observability.datasets.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.datasets.import_from_explorer()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.fetch()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.delete()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.update()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.judges.judge_conversation()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.create()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.list()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.fetch()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.delete()`:  `error.detail.error_code` **已更改**\n* `mistral.beta.observability.campaigns.fetch_status()`:  `error.detail.error_code` **已更改**\n* 
`mistral.beta.observability.campaigns.list_events()`:  `error.detail.error_code` **已更改","2026-04-03T15:06:04",{"id":166,"version":167,"summary_zh":168,"released_at":169},289128,"v2.2.0","# 由 Speakeasy CLI 生成\n\n## 2026-03-31 11:20:47\n### 变更\n基于：\n- OpenAPI 文档 1.0.0 \n- Speakeasy CLI 1.754.0 (2.862.0) https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\n### 生成\n- [python v2.2.0] .\n### 发布\n- [PyPI v2.2.0] https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.2.0 - .\n\n发布完成","2026-03-31T11:20:49",{"id":171,"version":172,"summary_zh":173,"released_at":174},289129,"v2.2.0rc3","# 由 Speakeasy CLI 生成\n[mistralai 2.2.0rc3](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.2.0rc3)\n## Python SDK 变更：\n* `mistral.workflows.events.send_event()`: **已移除**（重大变更 ⚠️）\n* `mistral.workflows.events.send_events_batch()`: **已移除**（重大变更 ⚠️）\n* `mistral.events.send_event()`: **已移除**（重大变更 ⚠️）\n* `mistral.events.send_events_batch()`: **已移除**（重大变更 ⚠️）\n\n使用 [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases) 生成\n\n\n发布完成","2026-03-30T17:32:31",{"id":176,"version":177,"summary_zh":178,"released_at":179},289130,"v2.2.0rc2","# 由 Speakeasy CLI 生成\n[mistralai 2.2.0rc2](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.2.0rc2)\n## Python SDK 变更：\n* `mistral.workflows.workers.whoami()`: **新增**\n* `mistral.workflows.events.send_event()`: **新增**\n* `mistral.workflows.events.send_events_batch()`: **新增**\n* `mistral.events.send_event()`: **新增**\n* `mistral.events.send_events_batch()`: **新增**\n* `mistral.workflows.events.receive_workflow_event()`: **已移除**（重大变更 ⚠️）\n* `mistral.workflows.events.receive_workflow_events_batch()`: **已移除**（重大变更 ⚠️）\n* `mistral.events.receive_workflow_event()`: **已移除**（重大变更 ⚠️）\n* `mistral.events.receive_workflow_events_batch()`: **已移除**（重大变更 ⚠️）\n* `mistral.workflows.executions.get_workflow_execution_history()`:  `request.decode_payloads` **新增**\n* 
`mistral.workflows.runs.get_run_history()`:  `request.decode_payloads` **新增**\n\n使用 [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases) 生成\n\n\n发布完成","2026-03-30T16:01:38",{"id":181,"version":182,"summary_zh":183,"released_at":184},289131,"v2.2.0rc1","# 由 Speakeasy CLI 生成\n[mistralai 2.2.0rc1](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.2.0rc1)\n## Python SDK 变更：\n* `mistral.models.list()`:  `response.data[].union(fine-tuned).job` **已更改**（破坏性变更 ⚠️）\n* `mistral.beta.connectors.list_tools()`:  `response` **已更改**（破坏性变更 ⚠️）\n* `mistral.models.update()`:  `response` **已更改**（破坏性变更 ⚠️）\n* `mistral.models.retrieve()`:  `response.union(fine-tuned).job` **已更改**（破坏性变更 ⚠️）\n* `mistral.workflows.metrics.get_workflow_metrics()`: **新增**\n* `mistral.workflows.get_workflow()`: **新增**\n* `mistral.workflows.update_workflow()`: **新增**\n* `mistral.workflows.get_workflow_registration()`: **新增**\n* `mistral.workflows.archive_workflow()`: **新增**\n* `mistral.workflows.unarchive_workflow()`: **新增**\n* `mistral.workflows.executions.get_workflow_execution()`: **新增**\n* `mistral.workflows.executions.get_workflow_execution_history()`: **新增**\n* `mistral.workflows.executions.signal_workflow_execution()`: **新增**\n* `mistral.workflows.executions.query_workflow_execution()`: **新增**\n* `mistral.workflows.executions.terminate_workflow_execution()`: **新增**\n* `mistral.workflows.executions.batch_terminate_workflow_executions()`: **新增**\n* `mistral.workflows.executions.cancel_workflow_execution()`: **新增**\n* `mistral.workflows.executions.batch_cancel_workflow_executions()`: **新增**\n* `mistral.workflows.executions.reset_workflow()`: **新增**\n* `mistral.workflows.executions.update_workflow_execution()`: **新增**\n* `mistral.workflows.executions.get_workflow_execution_trace_otel()`: **新增**\n* `mistral.workflows.executions.get_workflow_execution_trace_summary()`: **新增**\n* `mistral.workflows.executions.get_workflow_execution_trace_events()`: 
**新增**\n* `mistral.workflows.executions.stream()`: **新增**\n* `mistral.workflows.runs.get_run()`: **新增**\n* `mistral.batch.jobs.delete()`: **新增**\n* `mistral.workflows.runs.list_runs()`: **新增**\n* `mistral.workflows.runs.get_run_history()`: **新增**\n* `mistral.workflows.schedules.get_schedules()`: **新增**\n* `mistral.workflows.schedules.schedule_workflow()`: **新增**\n* `mistral.workflows.schedules.unschedule_workflow()`: **新增**\n* `mistral.workflows.events.receive_workflow_event()`: **新增**\n* `mistral.workflows.events.receive_workflow_events_batch()`: **新增**\n* `mistral.workflows.events.get_stream_events()`: **新增**\n* `mistral.workflows.events.get_workflow_events()`: **新增**\n* `mistral.workflows.deployments.list_deployments()`: **新增**\n* `mistral.workflows.deployments.get_deployment()`: **新增**\n* `mistral.events.receive_workflow_event()`: **新增**\n* `mistral.events.receive_workflow_events_batch()`: **新增**\n* `mistral.events.get_stream_events()`: **新增**\n* `mistral.events.get_workflow_events()`: **新增**\n* `mistral.audio.voices.list()`:  `request.type` **新增**\n* `mistral.workflows.execute_workflow_registration()`: **新增**\n* `mistral.workflows.execute_workflow()`: **新增**\n* `mistral.workflows.get_wor","2026-03-30T15:11:31",{"id":186,"version":187,"summary_zh":188,"released_at":189},289132,"v2.1.3","# 由 Speakeasy CLI 生成\n[mistralai 2.1.3](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.1.3)\n## Python SDK 变更：\n* `mistral.beta.connectors.list_tools()`: **已添加**\n\n使用 [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases) 生成\n\n\n发布完成","2026-03-23T14:59:44",{"id":191,"version":192,"summary_zh":193,"released_at":194},289133,"v2.1.2","# Generated by Speakeasy CLI\n[mistralai 2.1.2](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.1.2)\n## Python SDK Changes:\n* `mistral.beta.conversations.start()`: \n  *  `request.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.list()`:  
`response.[].union(ModelConversation).tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.get()`:  `response.union(ModelConversation).tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.start_stream()`: \n  *  `request.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.create()`: \n  *  `request.tools[]` **Changed** (Breaking ⚠️)\n  *  `response.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.list()`:  `response.[].tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.get()`:  `response.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.update()`: \n  *  `request.tools[]` **Changed** (Breaking ⚠️)\n  *  `response.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.update_version()`:  `response.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.list_versions()`:  `response.[].tools[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.get_version()`:  `response.tools[]` **Changed** (Breaking ⚠️)\n* `mistral.chat.complete()`: \n  *  `request` **Changed** (Breaking ⚠️)\n  *  `response.choices[]` **Changed** (Breaking ⚠️)\n* `mistral.chat.stream()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.fim.complete()`:  `response.choices[]` **Changed** (Breaking ⚠️)\n* `mistral.agents.complete()`: \n  *  `request` **Changed** (Breaking ⚠️)\n  *  `response.choices[]` **Changed** (Breaking ⚠️)\n* `mistral.agents.stream()`:  `request` **Changed** (Breaking ⚠️)\n\nGenerated with [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-20T15:16:17",{"id":196,"version":197,"summary_zh":198,"released_at":199},289134,"v2.1.1","# Generated by Speakeasy CLI\n[mistralai 2.1.1](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.1.1)\n## Python SDK Changes:\n* `mistral.chat.complete()`:  `response.choices[]` **Changed** (Breaking ⚠️)\n* `mistral.fim.complete()`:  `response.choices[]` **Changed** (Breaking ⚠️)\n* 
`mistral.agents.complete()`:  `response.choices[]` **Changed** (Breaking ⚠️)\n\nGenerated with [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-20T12:22:52",{"id":201,"version":202,"summary_zh":203,"released_at":204},289135,"v2.1.0","# Generated by Speakeasy CLI\n[mistralai 2.1.0](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.1.0)\n## Python SDK Changes:\n* `mistral.beta.agents.update_version()`:  `response` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.get()`:  `response` **Changed** (Breaking ⚠️)\n* `mistral.agents.stream()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.agents.complete()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.start_stream()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.chat.stream()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.chat.complete()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.restart()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.list_versions()`:  `response.[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.start()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.list()`:  `response.[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.get()`:  `response` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.update()`: \n  *  `request` **Changed** (Breaking ⚠️)\n  *  `response` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.get_version()`:  `response` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.list()`:  `response.[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.agents.create()`: \n  *  `request` **Changed** (Breaking ⚠️)\n  *  `response` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.restart_stream()`:  `request` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.append_stream()`: \n  *  `request.completion_args.reasoning_effort` **Added**\n* `mistral.audio.voices.update()`: 
**Added**\n* `mistral.beta.conversations.append()`: \n  *  `request.completion_args.reasoning_effort` **Added**\n* `mistral.audio.voices.list()`: **Added**\n* `mistral.audio.speech.complete()`: **Added**\n* `mistral.models.retrieve()`:  `response.union(base).capabilities.reasoning` **Added**\n* `mistral.models.list()`:  `response.data[].union(base).capabilities.reasoning` **Added**\n* `mistral.audio.voices.get_sample_audio()`: **Added**\n* `mistral.audio.voices.get()`: **Added**\n* `mistral.audio.voices.delete()`: **Added**\n* `mistral.audio.voices.create()`: **Added**\n\nGenerated with [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-20T08:53:22",{"id":206,"version":207,"summary_zh":208,"released_at":209},289136,"v2.0.5","# Generated by Speakeasy CLI\n[mistralai 2.0.5](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.0.5)\n## Python SDK Changes:\n* `mistral.beta.connectors.get_auth_url()`: **Added**\n* `mistral.chat.complete()`:  `request.reasoning_effort` **Added**\n* `mistral.chat.stream()`:  `request.reasoning_effort` **Added**\n* `mistral.agents.complete()`:  `request.reasoning_effort` **Added**\n* `mistral.agents.stream()`:  `request.reasoning_effort` **Added**\n\nGenerated with [Speakeasy CLI 1.754.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-18T15:31:31",{"id":211,"version":212,"summary_zh":213,"released_at":214},289137,"v2.0.4","# Generated by Speakeasy CLI\n[mistralai 2.0.4](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.0.4)\n## Python SDK Changes:\n* `mistral.beta.observability.judges.judge_conversation()`: **Added**\n* `mistral.chat.complete()`:  `request.guardrails` **Added**\n* `mistral.chat.stream()`:  `request.guardrails` **Added**\n* `mistral.agents.complete()`:  `request.guardrails` **Added**\n* `mistral.agents.stream()`:  `request.guardrails` 
**Added**\n\nGenerated with [Speakeasy CLI 1.729.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-16T15:29:39",{"id":216,"version":217,"summary_zh":218,"released_at":219},289138,"v2.0.3","# Generated by Speakeasy CLI\n[mistralai 2.0.3](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.0.3)\n## Python SDK Changes:\n* `mistral.beta.conversations.append_stream()`: \n  *  `request.inputs.union(Array\u003CInputEntries>)[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.[].data.union(message.output.delta).content.union(OutputContentChunks).union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.chat.complete()`: \n  *  `request.messages[].union(system).content.union(Array\u003CSystemMessageContentChunks>)[].union(thinking).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.choices[].message.content.union(Array\u003CContentChunk>)[].union(reference).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.start()`: \n  *  `request.inputs.union(Array\u003CInputEntries>)[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.outputs[].union(MessageOutputEntry).content.union(Array\u003CMessageOutputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.append()`: \n  *  `request.inputs.union(Array\u003CInputEntries>)[].union(MessageOutputEntry).content.union(Array\u003CMessageOutputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  
`response.outputs[].union(MessageOutputEntry).content.union(Array\u003CMessageOutputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.get_history()`:  `response.entries[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.get_messages()`:  `response.messages[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.restart()`: \n  *  `request.inputs.union(Array\u003CInputEntries>)[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.outputs[].union(MessageOutputEntry).content.union(Array\u003CMessageOutputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.start_stream()`: \n  *  `request.inputs.union(Array\u003CInputEntries>)[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.[].data.union(message.output.delta).content.union(OutputContentChunks).union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.classifiers.classify_chat()`: \n  *  `request.input.union(InstructRequest).messages[].union(user).content.union(Array\u003CContentChunk>)[].union(reference).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.classifiers.moderate_chat()`: \n  *  
`request.inputs.union(Array\u003C>)[].union(tool).content.union(Array\u003CContentChunk>)[].union(reference).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.restart_stream()`: \n  *  `request.inputs.union(Array\u003CInputEntries>)[].union(MessageInputEntry).content.union(Array\u003CMessageInputContentChunks>)[].union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.[].data.union(message.output.delta).content.union(OutputContentChunks).union(ThinkChunk).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.chat.stream()`: \n  *  `request.messages[].union(tool).content.union(Array\u003CContentChunk>)[].union(reference).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.[].data.choices[].delta.content.union(Array\u003CContentChunk>)[].union(reference).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.fim.complete()`:  `response.choices[].message.content.union(Array\u003CContentChunk>)[].union(thinking).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.fim.stream()`:  `response.[].data.choices[].delta.content.union(Array\u003CContentChunk>)[].union(thinking).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n* `mistral.agents.complete()`: \n  *  `request.messages[].union(system).content.union(Array\u003CSystemMessageContentChunks>)[].union(thinking).thinking[].union(ReferenceChunk).reference_ids[]` **Changed** (Breaking ⚠️)\n  *  `response.choices[].message.content.union(Array\u003CContentChunk>)[].uni","2026-03-16T11:06:13",{"id":221,"version":222,"summary_zh":223,"released_at":224},289139,"v2.0.2","# Generated by Speakeasy CLI\n[mistralai 2.0.2](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.0.2)\n## Python SDK Changes:\n* `mistral.beta.conversations.start()`:  `response.guardrails[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.append()`:  
`response.guardrails[]` **Changed** (Breaking ⚠️)\n* `mistral.beta.conversations.restart()`:  `response.guardrails[]` **Changed** (Breaking ⚠️)\n\nGenerated with [Speakeasy CLI 1.729.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-13T13:56:04",{"id":226,"version":227,"summary_zh":228,"released_at":229},289140,"v2.0.1","# Generated by Speakeasy CLI\n[mistralai 2.0.1](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.0.1)\n## Python SDK Changes:\n* `mistral.chat.complete()`: \n  *  `request.messages[].union(user).content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n  *  `response.choices[].message.content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.classifiers.classify_chat()`: \n  *  `request.input.union(InstructRequest).messages[].union(user).content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.classifiers.moderate_chat()`: \n  *  `request.inputs.union(Array\u003C>)[].union(user).content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.agents.stream()`: \n  *  `request.messages[].union(user).content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n  *  `response.[].data.choices[].delta.content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.agents.complete()`: \n  *  `request.messages[].union(user).content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n  *  `response.choices[].message.content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.fim.stream()`:  `response.[].data.choices[].delta.content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.fim.complete()`:  
`response.choices[].message.content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.chat.stream()`: \n  *  `request.messages[].union(user).content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n  *  `response.[].data.choices[].delta.content.union(Array\u003CContentChunk>)[].union(audio_url)` **Removed** (Breaking ⚠️)\n* `mistral.beta.conversations.restart_stream()`: \n  *  `request.guardrails[]` **Changed**\n* `mistral.beta.agents.update()`: \n  *  `request` **Changed**\n  *  `response` **Changed**\n* `mistral.beta.agents.update_version()`:  `response` **Changed**\n* `mistral.beta.agents.list_versions()`:  `response.[]` **Changed**\n* `mistral.beta.agents.get_version()`:  `response` **Changed**\n* `mistral.beta.agents.get()`:  `response` **Changed**\n* `mistral.beta.agents.list()`:  `response.[]` **Changed**\n* `mistral.beta.agents.create()`: \n  *  `request` **Changed**\n  *  `response` **Changed**\n* `mistral.beta.conversations.start()`:  `request` **Changed**\n* `mistral.beta.conversations.start_stream()`:  `request` **Changed**\n* `mistral.beta.conversations.restart()`: \n  *  `request.guardrails[]` **Changed**\n* `mistral.beta.conversations.get()`:  `response.union(ModelConversation)` **Changed**\n* `mistral.beta.conversations.list()`:  `response.[].union(ModelConversation)` **Changed**\n\nGenerated with [Speakeasy CLI 1.729.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-12T07:59:21",{"id":231,"version":232,"summary_zh":233,"released_at":234},289141,"v2.0.0","A technical release focused on improving developer experience.\r\n\r\n## What's new\r\n\r\n### Namespace package (`mistralai.*`)\r\n\r\nThe SDK is now a [PEP 420 namespace package](https:\u002F\u002Fpeps.python.org\u002Fpep-0420\u002F). 
All imports move from `mistralai` to `mistralai.client`:\r\n\r\n```python\r\n# v1\r\nfrom mistralai import Mistral\r\n\r\n# v2\r\nfrom mistralai.client import Mistral\r\n```\r\n\r\nAzure and GCP SDKs are now sub-packages under the same namespace:\r\n\r\n```python\r\nfrom mistralai.azure.client import MistralAzure\r\nfrom mistralai.gcp.client import MistralGCP  # renamed from MistralGoogleCloud\r\n```\r\n\r\n### Schema naming conventions\r\n\r\n42 request\u002Fresponse types renamed to follow consistent conventions:\r\n- Request bodies: `{Verb}{Entity}Request` (e.g. `CreateAgentRequest`)\r\n- Responses: `{Verb}{Entity}Response` (e.g. `ListFilesResponse`)\r\n- Entities: `{Entity}` (e.g. `BatchJob`, `Checkpoint`)\r\n\r\nSee [MIGRATION.md](https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fblob\u002Fmain\u002FMIGRATION.md) for the full rename table.\r\n\r\n### Forward compatibility\r\n\r\n- Enums now accept unknown values, so new API values won't crash old SDK versions\r\n- Discriminated unions get an `Unknown` variant for the same reason\r\n\r\n### Automatic schema conflict resolution\r\n\r\nSpeakeasy's `nameResolutionFeb2025` flag is enabled, automatically detecting and resolving ambiguous type names during generation (e.g. 
`Tools` → `ConversationRequestTool`).\r\n\r\n## Breaking changes\r\n\r\n- All import paths changed (see above)\r\n- 42 type renames (see [MIGRATION.md](https:\u002F\u002Fgithub.com\u002Fmistralai\u002Fclient-python\u002Fblob\u002Fmain\u002FMIGRATION.md))\r\n- `FunctionTool.type`: `Optional[FunctionToolType]` → `Literal[\"function\"]`\r\n- GCP class: `MistralGoogleCloud` → `MistralGCP`\r\n","2026-03-10T17:12:04",{"id":236,"version":237,"summary_zh":238,"released_at":239},289142,"v2.0.0rc1","# Generated by Speakeasy CLI\n[mistralai 2.0.0rc1](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmistralai\u002F2.0.0rc1)\n## Python SDK Changes:\n* `mistral.beta.libraries.documents.list()`:  `response.data[].process_status` **Added**\n* `mistral.beta.libraries.documents.upload()`:  `response.process_status` **Added**\n* `mistral.beta.libraries.documents.get()`:  `response.process_status` **Added**\n* `mistral.beta.libraries.documents.update()`:  `response.process_status` **Added**\n* `mistral.beta.libraries.documents.status()`:  `response.process_status` **Added**\n\nGenerated with [Speakeasy CLI 1.729.0](https:\u002F\u002Fgithub.com\u002Fspeakeasy-api\u002Fspeakeasy\u002Freleases)\n\n\nPublishing Completed","2026-03-02T15:39:51"]