[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-googleapis--python-aiplatform":3,"tool-googleapis--python-aiplatform":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",155373,2,"2026-04-14T11:34:08",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",108322,"2026-04-10T11:39:34",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},6121,"gemini-cli","google-gemini\u002Fgemini-cli","gemini-cli 是一款由谷歌推出的开源 AI 命令行工具，它将强大的 Gemini 大模型能力直接集成到用户的终端环境中。对于习惯在命令行工作的开发者而言，它提供了一条从输入提示词到获取模型响应的最短路径，无需切换窗口即可享受智能辅助。\n\n这款工具主要解决了开发过程中频繁上下文切换的痛点，让用户能在熟悉的终端界面内直接完成代码理解、生成、调试以及自动化运维任务。无论是查询大型代码库、根据草图生成应用，还是执行复杂的 Git 操作，gemini-cli 都能通过自然语言指令高效处理。\n\n它特别适合广大软件工程师、DevOps 人员及技术研究人员使用。其核心亮点包括支持高达 100 万 token 的超长上下文窗口，具备出色的逻辑推理能力；内置 Google 搜索、文件操作及 Shell 命令执行等实用工具；更独特的是，它支持 MCP（模型上下文协议），允许用户灵活扩展自定义集成，连接如图像生成等外部能力。此外，个人谷歌账号即可享受免费的额度支持，且项目基于 Apache 2.0 
协议完全开源，是提升终端工作效率的理想助手。",100752,"2026-04-10T01:20:03",[52,13,15,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":32,"last_commit_at":59,"category_tags":60,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":66,"readme_en":67,"readme_zh":68,"quickstart_zh":69,"use_case_zh":70,"hero_image_url":71,"owner_login":72,"owner_name":73,"owner_avatar_url":74,"owner_bio":75,"owner_company":76,"owner_location":76,"owner_email":76,"owner_twitter":76,"owner_website":77,"owner_url":78,"languages":79,"stars":94,"forks":95,"last_commit_at":96,"license":97,"difficulty_score":32,"env_os":98,"env_gpu":98,"env_ram":98,"env_deps":99,"category_tags":107,"github_topics":76,"view_count":32,"oss_zip_url":76,"oss_zip_packed_at":76,"status":17,"created_at":109,"updated_at":110,"faqs":111,"releases":112},7461,"googleapis\u002Fpython-aiplatform","python-aiplatform","A Python SDK for Vertex AI, a fully managed, end-to-end platform for data science and machine learning.","python-aiplatform 是谷歌 Vertex AI 平台的官方 Python 开发工具包，旨在为数据科学家和机器学习工程师提供一套全托管、端到端的模型开发与部署解决方案。它有效解决了从实验原型到生产环境落地过程中常见的环境配置复杂、流程割裂及运维困难等痛点，让用户能通过简洁的代码统一管理工作流。\n\n无论是刚入门的开发者还是资深算法专家，都能利用它灵活构建自定义模型或直接调用 AutoML 能力。其核心亮点在于对生成式 AI（GenAI）的深度支持：用户不仅能轻松实例化客户端来调用如 Gemini 等前沿大模型，还内置了强大的评估框架。通过简单的几行代码，即可基于自定义提示词数据集自动运行推理，并利用精确匹配、ROUGE 
指标或文本质量评分表等多种维度，对模型输出进行量化评估与优化。此外，结合标准的 pip 或现代化的 uv 工具即可快速安装，极大降低了上手门槛，帮助团队更高效地实现从创意验证到规模化应用的跨越。","Vertex AI SDK for Python\n=================================================\n\n|GA| |pypi| |versions| |unit-tests| |system-tests| |sample-tests|\n\n`Vertex AI`_: Google Vertex AI is an integrated suite of machine learning tools and services for building and using ML models with AutoML or custom code. It offers both novices and experts the best workbench for the entire machine learning development lifecycle.\n\n- `Client Library Documentation`_\n- `Product Documentation`_\n\n.. |GA| image:: https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fsupport-ga-gold.svg\n   :target: https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fgoogle-cloud-python\u002Fblob\u002Fmain\u002FREADME.rst#general-availability\n.. |pypi| image:: https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fv\u002Fgoogle-cloud-aiplatform.svg\n   :target: https:\u002F\u002Fpypi.org\u002Fproject\u002Fgoogle-cloud-aiplatform\u002F\n.. |versions| image:: https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fpyversions\u002Fgoogle-cloud-aiplatform.svg\n   :target: https:\u002F\u002Fpypi.org\u002Fproject\u002Fgoogle-cloud-aiplatform\u002F\n.. |unit-tests| image:: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-unit-tests.svg\n   :target: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-unit-tests.html\n.. |system-tests| image:: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-system-tests.svg\n   :target: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-system-tests.html\n.. 
|sample-tests| image:: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-sample-tests.svg\n   :target: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-sample-tests.html\n.. _Vertex AI: https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\n.. _Client Library Documentation: https:\u002F\u002Fcloud.google.com\u002Fpython\u002Fdocs\u002Freference\u002Faiplatform\u002Flatest\n.. _Product Documentation:  https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\n\nInstallation\n~~~~~~~~~~~~\n\n.. code-block:: console\n\n    pip install google-cloud-aiplatform\n\n\nWith :code:`uv`:\n\n.. code-block:: console\n\n    uv pip install google-cloud-aiplatform\n\nGenerative AI in the Vertex AI SDK\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n\nTo use Gen AI features from the Vertex AI SDK, you can instantiate a Vertex SDK client with the following:\n\n.. code-block:: Python\n\n    import vertexai\n    from vertexai import types\n\n    # Instantiate GenAI client from Vertex SDK\n    # Replace with your project ID and location\n    client = vertexai.Client(project='my-project', location='us-central1')\n\nSee the examples below for guidance on how to use specific features supported by the Vertex SDK client.\n\nGen AI Evaluation\n^^^^^^^^^^^^^^^^^\n\nTo run evaluation, first generate model responses from a set of prompts.\n\n.. code-block:: Python\n\n    import pandas as pd\n\n    prompts_df = pd.DataFrame({\n        \"prompt\": [\n            \"What is the capital of France?\",\n            \"Write a haiku about a cat.\",\n            \"Write a Python function to calculate the factorial of a number.\",\n            \"Translate 'How are you?' 
to French.\",\n        ],\n\n        \"reference\": [\n            \"Paris\",\n            \"Sunbeam on the floor,\\nA furry puddle sleeping,\\nTwitching tail tells tales.\",\n            \"def factorial(n):\\n    if n \u003C 0:\\n        return 'Factorial does not exist for negative numbers'\\n    elif n == 0:\\n        return 1\\n    else:\\n        fact = 1\\n        i = 1\\n        while i \u003C= n:\\n            fact *= i\\n            i += 1\\n        return fact\",\n            \"Comment ça va ?\",\n        ]\n    })\n\n    inference_results = client.evals.run_inference(\n        model=\"gemini-2.5-flash-preview-05-20\",\n        src=prompts_df\n    )\n\nThen run evaluation by providing the inference results and specifying the metric types.\n\n.. code-block:: Python\n\n    eval_result = client.evals.evaluate(\n        dataset=inference_results,\n        metrics=[\n            types.Metric(name='exact_match'),\n            types.Metric(name='rouge_l_sum'),\n            types.RubricMetric.TEXT_QUALITY,\n        ]\n    )\n\nAgent Engine with Agent Development Kit (ADK)\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\nFirst, define a function that looks up the exchange rate:\n\n.. 
code-block:: Python\n\n    def get_exchange_rate(\n        currency_from: str = \"USD\",\n        currency_to: str = \"EUR\",\n        currency_date: str = \"latest\",\n    ):\n        \"\"\"Retrieves the exchange rate between two currencies on a specified date.\n\n        Uses the Frankfurter API (https:\u002F\u002Fapi.frankfurter.app\u002F) to obtain\n        exchange rate data.\n\n        Returns:\n            dict: A dictionary containing the exchange rate information.\n                Example: {\"amount\": 1.0, \"base\": \"USD\", \"date\": \"2023-11-24\",\n                    \"rates\": {\"EUR\": 0.95534}}\n        \"\"\"\n        import requests\n        response = requests.get(\n            f\"https:\u002F\u002Fapi.frankfurter.app\u002F{currency_date}\",\n            params={\"from\": currency_from, \"to\": currency_to},\n        )\n        return response.json()\n\nNext, define an ADK Agent:\n\n.. code-block:: Python\n\n    from google.adk.agents import Agent\n    from vertexai.agent_engines import AdkApp\n\n    app = AdkApp(agent=Agent(\n        model=\"gemini-2.0-flash\",        # Required.\n        name='currency_exchange_agent',  # Required.\n        tools=[get_exchange_rate],       # Optional.\n    ))\n\nTest the agent locally using US dollars and Swedish Krona:\n\n.. code-block:: Python\n\n    async for event in app.async_stream_query(\n        user_id=\"user-id\",\n        message=\"What is the exchange rate from US dollars to SEK today?\",\n    ):\n        print(event)\n\nTo deploy the agent to Agent Engine:\n\n.. code-block:: Python\n\n    remote_app = client.agent_engines.create(\n        agent=app,\n        config={\n            \"requirements\": [\"google-cloud-aiplatform[agent_engines,adk]\"],\n        },\n    )\n\nYou can also run queries against the deployed agent:\n\n.. 
code-block:: Python\n\n    async for event in remote_app.async_stream_query(\n        user_id=\"user-id\",\n        message=\"What is the exchange rate from US dollars to SEK today?\",\n    ):\n        print(event)\n\nPrompt optimization\n^^^^^^^^^^^^^^^^^^^\n\nTo do a zero-shot prompt optimization, use the `optimize`\nmethod.\n\n.. code-block:: Python\n\n    prompt = \"Generate system instructions for a question-answering assistant\"\n    response = client.prompts.optimize(prompt=prompt)\n    print(response.raw_text_response)\n    if response.parsed_response:\n      print(response.parsed_response.suggested_prompt)\n\nTo call the data-driven prompt optimization, call the `launch_optimization_job` method.\nIn this case, however, we need to provide a VAPO (Vertex AI Prompt Optimizer) config. This config needs to\ncontain either a service account or a project **number**, plus the config path.\nPlease refer to this `tutorial \u003Chttps:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fgenerative-ai\u002Fdocs\u002Flearn\u002Fprompts\u002Fdata-driven-optimizer>`__\nfor more details on the config parameters.\n\n.. code-block:: Python\n\n    import logging\n\n    from vertexai import types\n\n    project_number = PROJECT_NUMBER # replace with your project number\n    service_account = f\"{project_number}-compute@developer.gserviceaccount.com\"\n\n    vapo_config = types.PromptOptimizerVAPOConfig(\n        config_path=\"gs:\u002F\u002Fyour-bucket\u002Fconfig.json\",\n        service_account_project_number=project_number,\n        wait_for_completion=False\n    )\n\n    # Set up logging to see the progress of the optimization job\n    logging.basicConfig(encoding='utf-8', level=logging.INFO, force=True)\n\n    result = client.prompts.launch_optimization_job(method=types.PromptOptimizerMethod.VAPO, config=vapo_config)\n\nIf you want to use the project number instead of the service account, you can\ninstead use the following config:\n\n.. 
code-block:: Python\n\n    vapo_config = vertexai.types.PromptOptimizerVAPOConfig(\n        config_path=\"gs:\u002F\u002Fyour-bucket\u002Fconfig.json\",\n        service_account_project_number=project_number,\n        wait_for_completion=False\n    )\n\nWe can also call the `launch_optimization_job` method asynchronously.\n\n.. code-block:: Python\n\n    await client.aio.prompts.launch_optimization_job(method=types.PromptOptimizerMethod.VAPO, config=vapo_config)\n\nPrompt Management\n^^^^^^^^^^^^^^^^^\n\nFirst define your prompt as a dictionary or types.Prompt object. Then call `create()`.\n\n.. code-block:: Python\n\n    prompt = {\n        \"prompt_data\": {\n            \"contents\": [{\"parts\": [{\"text\": \"Hello, {name}! How are you?\"}]}],\n            \"system_instruction\": {\"parts\": [{\"text\": \"Please answer in a short sentence.\"}]},\n            \"variables\": [\n                {\"name\": {\"text\": \"Alice\"}},\n            ],\n            \"model\": \"gemini-2.5-flash\",\n        },\n    }\n\n    prompt_resource = client.prompts.create(\n        prompt=prompt,\n    )\n\nNote that you can also use the types.Prompt object to define your prompt. Some of the types used to do this are from the Gen AI SDK.\n\n.. code-block:: Python\n\n    from vertexai import types\n    from google.genai import types as genai_types\n\n    prompt = types.Prompt(\n        prompt_data=types.PromptData(\n          contents=[genai_types.Content(parts=[genai_types.Part(text=\"Hello, {name}! How are you?\")])],\n          system_instruction=genai_types.Content(parts=[genai_types.Part(text=\"Please answer in a short sentence.\")]),\n          variables=[\n            {\"name\": genai_types.Part(text=\"Alice\")},\n          ],\n          model=\"gemini-2.5-flash\",\n        ),\n    )\n\nRetrieve a prompt by calling get() with the prompt_id.\n\n.. 
code-block:: Python\n\n    retrieved_prompt = client.prompts.get(\n        prompt_id=prompt_resource.prompt_id,\n    )\n\nAfter creating or retrieving a prompt, you can call `generate_content()` with that prompt using the Gen AI SDK.\n\nThe following uses a utility function available on Prompt objects to transform a Prompt object into a list of Content objects for use with `generate_content`. To run this, you need to have the Gen AI SDK installed, which you can do via `pip install google-genai`.\n\n.. code-block:: Python\n\n    from google import genai\n    from google.genai import types as genai_types\n\n    # Create a Client in the Gen AI SDK\n    genai_client = genai.Client(vertexai=True, project=\"your-project\", location=\"your-location\")\n\n    # Call generate_content() with the prompt\n    response = genai_client.models.generate_content(\n        model=retrieved_prompt.prompt_data.model,\n        contents=retrieved_prompt.assemble_contents(),\n    )\n\n-----------------------------------------\n\n.. note::\n\n   The following Generative AI modules in the Vertex AI SDK are deprecated as of June 24, 2025 and will be removed on June 24, 2026:\n   `vertexai.generative_models`, `vertexai.language_models`, `vertexai.vision_models`, `vertexai.tuning`, `vertexai.caching`. Please use the\n   `Google Gen AI SDK \u003Chttps:\u002F\u002Fpypi.org\u002Fproject\u002Fgoogle-genai\u002F>`__ to access these features. See\n   `the migration guide \u003Chttps:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fgenerative-ai\u002Fdocs\u002Fdeprecations\u002Fgenai-vertexai-sdk>`__ for details.\n   You can continue using all other Vertex AI SDK modules, as they are the recommended way to use the API.\n\nQuick Start\n-----------\n\nIn order to use this library, you first need to go through the following steps:\n\n1. `Select or create a Cloud Platform project.`_\n2. `Enable billing for your project.`_\n3. `Enable the Vertex AI API.`_\n4. `Setup Authentication.`_\n\n.. 
_Select or create a Cloud Platform project.: https:\u002F\u002Fconsole.cloud.google.com\u002Fproject\n.. _Enable billing for your project.: https:\u002F\u002Fcloud.google.com\u002Fbilling\u002Fdocs\u002Fhow-to\u002Fmodify-project#enable_billing_for_a_project\n.. _Enable the Vertex AI API.:  https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fstart\u002Fuse-vertex-ai-python-sdk\n.. _Setup Authentication.: https:\u002F\u002Fgoogleapis.dev\u002Fpython\u002Fgoogle-api-core\u002Flatest\u002Fauth.html\n\n\nSupported Python Versions\n^^^^^^^^^^^^^^^^^^^^^^^^^\nPython >= 3.9\n\nDeprecated Python Versions\n^^^^^^^^^^^^^^^^^^^^^^^^^^\nPython \u003C= 3.8.\n\nThe last version of this library compatible with Python 3.8 is google-cloud-aiplatform==1.90.0.\n\nThe last version of this library compatible with Python 3.7 is google-cloud-aiplatform==1.31.1.\n\nThe last version of this library compatible with Python 3.6 is google-cloud-aiplatform==1.12.1.\n\nOverview\n~~~~~~~~\nThis section provides a brief overview of the Vertex AI SDK for Python. You can also reference the notebooks in `vertex-ai-samples`_ for examples.\n\n.. _vertex-ai-samples: https:\u002F\u002Fgithub.com\u002FGoogleCloudPlatform\u002Fvertex-ai-samples\u002Ftree\u002Fmain\u002Fnotebooks\u002Fcommunity\u002Fsdk\n\nAll publicly available SDK features can be found in the :code:`google\u002Fcloud\u002Faiplatform` directory.\nUnder the hood, the Vertex AI SDK builds on top of GAPIC, which stands for Google API CodeGen.\nThe GAPIC library code sits in :code:`google\u002Fcloud\u002Faiplatform_v1` and :code:`google\u002Fcloud\u002Faiplatform_v1beta1`,\nand it is auto-generated from Google's service proto files.\n\nFor most programmatic needs, developers can follow these steps to figure out which libraries to import:\n\n1. Look through :code:`google\u002Fcloud\u002Faiplatform` first -- the Vertex AI SDK's APIs will almost always be easier to use and more concise compared with GAPIC\n2. 
If the feature that you are looking for cannot be found there, look through :code:`aiplatform_v1` to see if it's available in GAPIC\n3. If it is still in beta phase, it will be available in :code:`aiplatform_v1beta1`\n\nIf none of the above helps you find the right tools for your task, please feel free to open a GitHub issue and send us a feature request.\n\nImporting\n^^^^^^^^^\nVertex AI SDK resource-based functionality can be used by importing the following namespace:\n\n.. code-block:: Python\n\n    from google.cloud import aiplatform\n\nInitialization\n^^^^^^^^^^^^^^\nInitialize the SDK to store common configurations that you use with the SDK.\n\n.. code-block:: Python\n\n    aiplatform.init(\n        # your Google Cloud Project ID or number\n        # environment default used if not set\n        project='my-project',\n\n        # the Vertex AI region you will use\n        # defaults to us-central1\n        location='us-central1',\n\n        # Google Cloud Storage bucket in same region as location\n        # used to stage artifacts\n        staging_bucket='gs:\u002F\u002Fmy_staging_bucket',\n\n        # custom google.auth.credentials.Credentials\n        # environment default credentials used if not set\n        credentials=my_credentials,\n\n        # customer managed encryption key resource name\n        # will be applied to all Vertex AI resources if set\n        encryption_spec_key_name=my_encryption_key_name,\n\n        # the name of the experiment to use to track\n        # logged metrics and parameters\n        experiment='my-experiment',\n\n        # description of the experiment above\n        experiment_description='my experiment description'\n    )\n\nDatasets\n^^^^^^^^\nVertex AI provides managed tabular, text, image, and video datasets. In the SDK, datasets can be used downstream to\ntrain models.\n\nTo create a tabular dataset:\n\n.. 
code-block:: Python\n\n    my_dataset = aiplatform.TabularDataset.create(\n        display_name=\"my-dataset\", gcs_source=['gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fdataset.csv'])\n\nYou can also create and import a dataset in separate steps:\n\n.. code-block:: Python\n\n    from google.cloud import aiplatform\n\n    my_dataset = aiplatform.TextDataset.create(\n        display_name=\"my-dataset\")\n\n    my_dataset.import_data(\n        gcs_source=['gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fdataset.csv'],\n        import_schema_uri=aiplatform.schema.dataset.ioformat.text.multi_label_classification\n    )\n\nTo get a previously created Dataset:\n\n.. code-block:: Python\n\n  dataset = aiplatform.ImageDataset('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fdatasets\u002F{DATASET_ID}')\n\nVertex AI supports a variety of dataset schemas. References to these schemas are available under the\n:code:`aiplatform.schema.dataset` namespace. For more information on the supported dataset schemas, please refer to the\n`Preparing data docs`_.\n\n.. _Preparing data docs: https:\u002F\u002Fcloud.google.com\u002Fai-platform-unified\u002Fdocs\u002Fdatasets\u002Fprepare\n\nTraining\n^^^^^^^^\nThe Vertex AI SDK for Python allows you to train Custom and AutoML Models.\n\nYou can train custom models using a custom Python script, custom Python package, or container.\n\n**Preparing Your Custom Code**\n\nVertex AI custom training enables you to train on Vertex AI datasets and produce Vertex AI models. To do so, your\nscript must adhere to the following contract:\n\nIt must read datasets from the environment variables populated by the training service:\n\n.. 
code-block:: Python\n\n  os.environ['AIP_DATA_FORMAT']  # provides format of data\n  os.environ['AIP_TRAINING_DATA_URI']  # uri to training split\n  os.environ['AIP_VALIDATION_DATA_URI']  # uri to validation split\n  os.environ['AIP_TEST_DATA_URI']  # uri to test split\n\nPlease visit `Using a managed dataset in a custom training application`_ for a detailed overview.\n\n.. _Using a managed dataset in a custom training application: https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Ftraining\u002Fusing-managed-datasets\n\nIt must write the model artifact to the environment variable populated by the training service:\n\n.. code-block:: Python\n\n  os.environ['AIP_MODEL_DIR']\n\n**Running Training**\n\n.. code-block:: Python\n\n  job = aiplatform.CustomTrainingJob(\n      display_name=\"my-training-job\",\n      script_path=\"training_script.py\",\n      container_uri=\"us-docker.pkg.dev\u002Fvertex-ai\u002Ftraining\u002Ftf-cpu.2-2:latest\",\n      requirements=[\"gcsfs==0.7.1\"],\n      model_serving_container_image_uri=\"us-docker.pkg.dev\u002Fvertex-ai\u002Fprediction\u002Ftf2-cpu.2-2:latest\",\n  )\n\n  model = job.run(my_dataset,\n                  replica_count=1,\n                  machine_type=\"n1-standard-4\",\n                  accelerator_type='NVIDIA_TESLA_K80',\n                  accelerator_count=1)\n\nIn the code block above, `my_dataset` is a managed dataset created in the `Datasets` section above. The `model` variable is a managed Vertex AI model that can be deployed or exported.\n\n\nAutoML\n------\nThe Vertex AI SDK for Python supports AutoML tabular, image, text, video, and forecasting.\n\nTo train an AutoML tabular model:\n\n.. 
code-block:: Python\n\n  dataset = aiplatform.TabularDataset('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fdatasets\u002F{DATASET_ID}')\n\n  job = aiplatform.AutoMLTabularTrainingJob(\n    display_name=\"train-automl\",\n    optimization_prediction_type=\"regression\",\n    optimization_objective=\"minimize-rmse\",\n  )\n\n  model = job.run(\n      dataset=dataset,\n      target_column=\"target_column_name\",\n      training_fraction_split=0.6,\n      validation_fraction_split=0.2,\n      test_fraction_split=0.2,\n      budget_milli_node_hours=1000,\n      model_display_name=\"my-automl-model\",\n      disable_early_stopping=False,\n  )\n\n\nModels\n------\nTo get a model:\n\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n\n\nTo upload a model:\n\n.. code-block:: Python\n\n  model = aiplatform.Model.upload(\n      display_name='my-model',\n      artifact_uri=\"gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fmodel\u002Fdir\",\n      serving_container_image_uri=\"us-docker.pkg.dev\u002Fvertex-ai\u002Fprediction\u002Ftf2-cpu.2-2:latest\",\n  )\n\n\n\nTo deploy a model:\n\n\n.. code-block:: Python\n\n  endpoint = model.deploy(machine_type='n1-standard-4',\n                          min_replica_count=1,\n                          max_replica_count=5,\n                          accelerator_type='NVIDIA_TESLA_K80',\n                          accelerator_count=1)\n\n\nPlease visit `Importing models to Vertex AI`_ for a detailed overview.\n\n.. _Importing models to Vertex AI: https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fgeneral\u002Fimport-model\n\nModel Evaluation\n----------------\n\nThe Vertex AI SDK for Python currently supports getting model evaluation metrics for all AutoML models.\n\nTo list all model evaluations for a model:\n\n.. 
code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  evaluations = model.list_model_evaluations()\n\n\nTo get the model evaluation resource for a given model:\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  # With no arguments, this returns the first evaluation; you can also pass the evaluation ID\n  evaluation = model.get_model_evaluation()\n\n  eval_metrics = evaluation.metrics\n\n\nYou can also create a reference to your model evaluation directly by passing in the resource name of the model evaluation:\n\n.. code-block:: Python\n\n  evaluation = aiplatform.ModelEvaluation(\n    evaluation_name='projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}\u002Fevaluations\u002F{EVALUATION_ID}')\n\nAlternatively, you can create a reference to your evaluation by passing in the model and evaluation IDs:\n\n.. code-block:: Python\n\n  evaluation = aiplatform.ModelEvaluation(\n    evaluation_name='{EVALUATION_ID}',\n    model_id='{MODEL_ID}')\n\n\nBatch Prediction\n----------------\n\nTo create a batch prediction job:\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  batch_prediction_job = model.batch_predict(\n    job_display_name='my-batch-prediction-job',\n    instances_format='csv',\n    machine_type='n1-standard-4',\n    gcs_source=['gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Ffile.csv'],\n    gcs_destination_prefix='gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fbatch_prediction\u002Fresults\u002F',\n    service_account='my-sa@my-project.iam.gserviceaccount.com'\n  )\n\nYou can also create a batch prediction job asynchronously by including the `sync=False` argument:\n\n.. 
code-block:: Python\n\n  batch_prediction_job = model.batch_predict(..., sync=False)\n\n  # wait for resource to be created\n  batch_prediction_job.wait_for_resource_creation()\n\n  # get the state\n  batch_prediction_job.state\n\n  # block until job is complete\n  batch_prediction_job.wait()\n\n\nEndpoints\n---------\n\nTo create an endpoint:\n\n.. code-block:: Python\n\n  endpoint = aiplatform.Endpoint.create(display_name='my-endpoint')\n\nTo deploy a model to a created endpoint:\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  endpoint.deploy(model,\n                  min_replica_count=1,\n                  max_replica_count=5,\n                  machine_type='n1-standard-4',\n                  accelerator_type='NVIDIA_TESLA_K80',\n                  accelerator_count=1)\n\nTo get predictions from endpoints:\n\n.. code-block:: Python\n\n  endpoint.predict(instances=[[6.7, 3.1, 4.7, 1.5], [4.6, 3.1, 1.5, 0.2]])\n\nTo undeploy models from an endpoint:\n\n.. code-block:: Python\n\n  endpoint.undeploy_all()\n\nTo delete an endpoint:\n\n.. code-block:: Python\n\n  endpoint.delete()\n\n\nPipelines\n---------\n\nTo create a Vertex AI Pipeline run and monitor until completion:\n\n.. 
code-block:: Python\n\n  # Instantiate PipelineJob object\n  pl = PipelineJob(\n      display_name=\"My first pipeline\",\n\n      # Whether or not to enable caching\n      # True = always cache pipeline step result\n      # False = never cache pipeline step result\n      # None = defer to cache option for each pipeline component in the pipeline definition\n      enable_caching=False,\n\n      # Local or GCS path to a compiled pipeline definition\n      template_path=\"pipeline.json\",\n\n      # Dictionary containing input parameters for your pipeline\n      parameter_values=parameter_values,\n\n      # GCS path to act as the pipeline root\n      pipeline_root=pipeline_root,\n  )\n\n  # Execute pipeline in Vertex AI and monitor until completion\n  pl.run(\n    # Email address of service account to use for the pipeline run\n    # You must have iam.serviceAccounts.actAs permission on the service account to use it\n    service_account=service_account,\n\n    # Whether this function call should be synchronous (wait for pipeline run to finish before terminating)\n    # or asynchronous (return immediately)\n    sync=True\n  )\n\nTo create a Vertex AI Pipeline without monitoring until completion, use `submit` instead of `run`:\n\n.. 
code-block:: Python\n\n  # Instantiate PipelineJob object\n  pl = PipelineJob(\n      display_name=\"My first pipeline\",\n\n      # Whether or not to enable caching\n      # True = always cache pipeline step result\n      # False = never cache pipeline step result\n      # None = defer to cache option for each pipeline component in the pipeline definition\n      enable_caching=False,\n\n      # Local or GCS path to a compiled pipeline definition\n      template_path=\"pipeline.json\",\n\n      # Dictionary containing input parameters for your pipeline\n      parameter_values=parameter_values,\n\n      # GCS path to act as the pipeline root\n      pipeline_root=pipeline_root,\n  )\n\n  # Submit the Pipeline to Vertex AI\n  pl.submit(\n    # Email address of service account to use for the pipeline run\n    # You must have iam.serviceAccounts.actAs permission on the service account to use it\n    service_account=service_account,\n  )\n\n\nExplainable AI: Get Metadata\n----------------------------\n\nTo get metadata in dictionary format from TensorFlow 1 models:\n\n.. code-block:: Python\n\n  from google.cloud.aiplatform.explain.metadata.tf.v1 import saved_model_metadata_builder\n\n  builder = saved_model_metadata_builder.SavedModelMetadataBuilder(\n            'gs:\u002F\u002Fpython\u002Fto\u002Fmy\u002Fmodel\u002Fdir', tags=[tf.saved_model.tag_constants.SERVING]\n        )\n  generated_md = builder.get_metadata()\n\nTo get metadata in dictionary format from TensorFlow 2 models:\n\n.. code-block:: Python\n\n  from google.cloud.aiplatform.explain.metadata.tf.v2 import saved_model_metadata_builder\n\n  builder = saved_model_metadata_builder.SavedModelMetadataBuilder('gs:\u002F\u002Fpython\u002Fto\u002Fmy\u002Fmodel\u002Fdir')\n  generated_md = builder.get_metadata()\n\nTo use Explanation Metadata in endpoint deployment and model upload:\n\n.. 
code-block:: Python\n\n  explanation_metadata = builder.get_metadata_protobuf()\n\n  # To deploy a model to an endpoint with explanation\n  model.deploy(..., explanation_metadata=explanation_metadata)\n\n  # To deploy a model to a created endpoint with explanation\n  endpoint.deploy(..., explanation_metadata=explanation_metadata)\n\n  # To upload a model with explanation\n  aiplatform.Model.upload(..., explanation_metadata=explanation_metadata)\n\n\nCloud Profiler\n----------------------------\n\nCloud Profiler allows you to profile your remote Vertex AI Training jobs on demand and visualize the results in Vertex AI TensorBoard.\n\nTo start using the profiler with TensorFlow, update your training script to include the following:\n\n.. code-block:: Python\n\n    from google.cloud.aiplatform.training_utils import cloud_profiler\n    ...\n    cloud_profiler.init()\n\nNext, run the job with a Vertex AI TensorBoard instance. For full details on how to do this, visit https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fexperiments\u002Ftensorboard-overview\n\nFinally, visit your TensorBoard in your Google Cloud Console, navigate to the \"Profile\" tab, and click the `Capture Profile` button. This allows you to capture profiling statistics for the running jobs.\n\n\nNext Steps\n~~~~~~~~~~\n\n-  Read the `Client Library Documentation`_ for Vertex AI\n   API to see other available methods on the client.\n-  Read the `Vertex AI API Product documentation`_ to learn\n   more about the product and see How-to Guides.\n-  View this `README`_ to see the full list of Cloud\n   APIs that we cover.\n\n.. _Vertex AI API Product documentation:  https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\n.. 
_README: https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fgoogle-cloud-python\u002Fblob\u002Fmain\u002FREADME.rst\n","Vertex AI Python SDK\n=================================================\n\n|GA| |pypi| |versions| |unit-tests| |system-tests| |sample-tests|\n\n`Vertex AI`_: Google Vertex AI 是一套集成的机器学习工具和服务，用于通过 AutoML 或自定义代码构建和使用机器学习模型。它为初学者和专家提供了覆盖整个机器学习开发生命周期的最佳工作台。\n\n- `客户端库文档`_\n- `产品文档`_\n\n.. |GA| image:: https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fsupport-ga-gold.svg\n   :target: https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fgoogle-cloud-python\u002Fblob\u002Fmain\u002FREADME.rst#general-availability\n.. |pypi| image:: https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fv\u002Fgoogle-cloud-aiplatform.svg\n   :target: https:\u002F\u002Fpypi.org\u002Fproject\u002Fgoogle-cloud-aiplatform\u002F\n.. |versions| image:: https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fpyversions\u002Fgoogle-cloud-aiplatform.svg\n   :target: https:\u002F\u002Fpypi.org\u002Fproject\u002Fgoogle-cloud-aiplatform\u002F\n.. |unit-tests| image:: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-unit-tests.svg\n   :target: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-unit-tests.html\n.. |system-tests| image:: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-system-tests.svg\n   :target: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-system-tests.html\n.. |sample-tests| image:: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-sample-tests.svg\n   :target: https:\u002F\u002Fstorage.googleapis.com\u002Fcloud-devrel-public\u002Fpython-aiplatform\u002Fbadges\u002Fsdk-sample-tests.html\n.. _Vertex AI: https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\n.. 
_客户端库文档: https:\u002F\u002Fcloud.google.com\u002Fpython\u002Fdocs\u002Freference\u002Faiplatform\u002Flatest\n.. _产品文档:  https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\n\n安装\n~~~~~~~~~~~~\n\n.. code-block:: console\n\n    pip install google-cloud-aiplatform\n\n\n使用 :code:`uv`:\n\n.. code-block:: console\n\n    uv pip install google-cloud-aiplatform\n\nVertex AI SDK 中的生成式 AI\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n\n要使用 Vertex AI SDK 中的生成式 AI 功能，您可以按如下方式实例化一个 Vertex SDK 客户端：\n\n.. code-block:: Python\n\n    import vertexai\n    from vertexai import types\n\n    # 从 Vertex SDK 实例化生成式 AI 客户端\n    # 替换为您自己的项目 ID 和位置\n    client = vertexai.Client(project='my-project', location='us-central1')\n\n请参阅下面的示例，了解如何使用 Vertex SDK 客户端支持的特定功能。\n\n生成式 AI 评估\n^^^^^^^^^^^^^^^^^\n\n要运行评估，首先需要根据一组提示生成模型响应。\n\n.. code-block:: Python\n\n    import pandas as pd\n\n    prompts_df = pd.DataFrame({\n        \"prompt\": [\n            \"法国的首都是什么？\",\n            \"写一首关于猫的俳句。\",\n            \"编写一个计算阶乘的 Python 函数。\",\n            \"将‘你好吗？’翻译成法语。\",\n        ],\n\n        \"reference\": [\n            \"巴黎\",\n            \"阳光洒在地板上，\\n毛茸茸的一团沉睡着，\\n尾巴微微抽动，仿佛在诉说故事。\",\n            \"def factorial(n):\\n    if n \u003C 0:\\n        return '负数没有阶乘'\\n    elif n == 0:\\n        return 1\\n    else:\\n        fact = 1\\n        i = 1\\n        while i \u003C= n:\\n            fact *= i\\n            i += 1\\n        return fact\",\n            \"Comment ça va ?\",\n        ]\n    })\n\n    inference_results = client.evals.run_inference(\n        model=\"gemini-2.5-flash-preview-05-20\",\n        src=prompts_df\n    )\n\n然后，提供推理结果并指定指标类型来运行评估。\n\n.. 
code-block:: Python\n\n    eval_result = client.evals.evaluate(\n        dataset=inference_results,\n        metrics=[\n            types.Metric(name='exact_match'),\n            types.Metric(name='rouge_l_sum'),\n            types.RubricMetric.TEXT_QUALITY,\n        ]\n    )\n\n使用 Agent 开发套件 (ADK) 的 Agent Engine\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n首先，定义一个查询汇率的函数：\n\n.. code-block:: Python\n\n    def get_exchange_rate(\n        currency_from: str = \"USD\",\n        currency_to: str = \"EUR\",\n        currency_date: str = \"latest\",\n    ):\n        \"\"\"获取指定日期两种货币之间的汇率。\n\n        使用 Frankfurter API (https:\u002F\u002Fapi.frankfurter.app\u002F) 获取汇率数据。\n\n        返回：\n            dict: 包含汇率信息的字典。\n                示例：{\"amount\": 1.0, \"base\": \"USD\", \"date\": \"2023-11-24\",\n                    \"rates\": {\"EUR\": 0.95534}}\n        \"\"\"\n        import requests\n        response = requests.get(\n            f\"https:\u002F\u002Fapi.frankfurter.app\u002F{currency_date}\",\n            params={\"from\": currency_from, \"to\": currency_to},\n        )\n        return response.json()\n\n接下来，定义一个 ADK Agent：\n\n.. code-block:: Python\n\n    from google.adk.agents import Agent\n    from vertexai.agent_engines import AdkApp\n\n    app = AdkApp(agent=Agent(\n        model=\"gemini-2.0-flash\",        # 必需。\n        name='currency_exchange_agent',  # 必需。\n        tools=[get_exchange_rate],       # 可选。\n    ))\n\n使用美元和瑞典克朗在本地测试该代理：\n\n.. code-block:: Python\n\n    async for event in app.async_stream_query(\n        user_id=\"user-id\",\n        message=\"今天美元兑瑞典克朗的汇率是多少？\",\n    ):\n        print(event)\n\n要将代理部署到 Agent Engine：\n\n.. code-block:: Python\n\n    remote_app = client.agent_engines.create(\n        agent=app,\n        config={\n            \"requirements\": [\"google-cloud-aiplatform[agent_engines,adk]\"],\n        },\n    )\n\n您还可以对已部署的代理进行查询：\n\n.. 
code-block:: Python\n\n    async for event in remote_app.async_stream_query(\n        user_id=\"user-id\",\n        message=\"今天美元兑瑞典克朗的汇率是多少？\",\n    ):\n        print(event)\n\n提示优化\n^^^^^^^^^^^^^^^^^^^\n\n要进行零样本提示优化，可以使用 `optimize` 方法。\n\n.. code-block:: Python\n\n    prompt = \"为问答助手生成系统指令\"\n    response = client.prompts.optimize(prompt=prompt)\n    print(response.raw_text_response)\n    if response.parsed_response:\n      print(response.parsed_response.suggested_prompt)\n\n要调用数据驱动的提示优化，可以调用 `launch_optimization_job` 方法。不过，在这种情况下，我们需要提供一个 VAPO（Vertex AI 提示优化器）配置文件。该配置文件需要包含服务账号或项目 **编号** 以及配置文件路径。有关配置参数的更多详细信息，请参阅此 `教程 <https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fgenerative-ai\u002Fdocs\u002Flearn\u002Fprompts\u002Fdata-driven-optimizer>`_。\n\n.. code-block:: Python\n\n    import logging\n\n    from vertexai import types\n\n    project_number = PROJECT_NUMBER  # 替换为您的项目编号\n    service_account = f\"{project_number}-compute@developer.gserviceaccount.com\"\n\n    vapo_config = vertexai.types.PromptOptimizerVAPOConfig(\n        config_path=\"gs:\u002F\u002Fyour-bucket\u002Fconfig.json\",\n        service_account=service_account,\n        wait_for_completion=False\n    )\n\n    # 设置日志记录，以便查看优化作业的进度\n    logging.basicConfig(encoding='utf-8', level=logging.INFO, force=True)\n\n    result = client.prompts.launch_optimization_job(method=types.PromptOptimizerMethod.VAPO, config=vapo_config)\n\n如果您想使用项目编号而不是服务账号，可以改用以下配置：\n\n.. code-block:: Python\n\n    vapo_config = vertexai.types.PromptOptimizerVAPOConfig(\n        config_path=\"gs:\u002F\u002Fyour-bucket\u002Fconfig.json\",\n        service_account_project_number=project_number,\n        wait_for_completion=False\n    )\n\n我们也可以异步调用 `launch_optimization_job` 方法。\n\n.. code-block:: Python\n\n    await client.aio.prompts.launch_optimization_job(method=types.PromptOptimizerMethod.VAPO, config=vapo_config)\n\n提示管理\n^^^^^^^^^^^^^\n\n首先将您的提示定义为字典或 types.Prompt 对象。然后调用 create_prompt。\n\n.. 
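code-block:: Python\n\n    # 本地示意（并非 SDK 调用，假设占位符语义与 Python 的 str.format 一致）：\n    # prompt_data 中的 {name} 占位符在组装提示时会被 variables 提供的值替换。\n    template = \"你好，{name}！你好吗？\"\n    assembled = template.format(name=\"Alice\")\n    print(assembled)  # 你好，Alice！你好吗？\n\n完整的创建示例如下：\n\n.. 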
code-block:: Python\n\n    prompt = {\n        \"prompt_data\": {\n            \"contents\": [{\"parts\": [{\"text\": \"你好，{name}！你好吗？\"}]}],\n            \"system_instruction\": {\"parts\": [{\"text\": \"请用一句话简短回答。\"}]},\n            \"variables\": [\n                {\"name\": {\"text\": \"Alice\"}},\n            ],\n            \"model\": \"gemini-2.5-flash\",\n        },\n    }\n\n    prompt_resource = client.prompts.create(\n        prompt=prompt,\n    )\n\n请注意，您也可以使用 types.Prompt 对象来定义您的提示。用于此目的的一些类型来自 Gen AI SDK。\n\n.. code-block:: Python\n\n    from vertexai import types\n    from google.genai import types as genai_types\n\n    prompt = types.Prompt(\n        prompt_data=types.PromptData(\n          contents=[genai_types.Content(parts=[genai_types.Part(text=\"你好，{name}！你好吗？\")])],\n          system_instruction=genai_types.Content(parts=[genai_types.Part(text=\"请用一句话简短回答。\")]),\n          variables=[\n            {\"name\": genai_types.Part(text=\"Alice\")},\n          ],\n          model=\"gemini-2.5-flash\",\n        ),\n    )\n\n通过调用 get() 并传入 prompt_id 来检索提示。\n\n.. code-block:: Python\n\n    retrieved_prompt = client.prompts.get(\n        prompt_id=prompt_resource.prompt_id,\n    )\n\n创建或检索提示后，您可以使用 Gen AI SDK 调用 `generate_content()` 方法并传入该提示。\n\n以下代码使用 Prompt 对象上可用的实用函数，将 Prompt 对象转换为 Content 对象列表，以便与 `generate_content` 一起使用。要运行此代码，您需要安装 Gen AI SDK，可以通过 `pip install google-genai` 安装。\n\n.. code-block:: Python\n\n    from google import genai\n    from google.genai import types as genai_types\n\n    # 在 Gen AI SDK 中创建客户端\n    genai_client = genai.Client(vertexai=True, project=\"your-project\", location=\"your-location\")\n\n    # 使用提示调用 generate_content()\n    response = genai_client.models.generate_content(\n        model=retrieved_prompt.prompt_data.model,\n        contents=retrieved_prompt.assemble_contents(),\n    )\n\n-----------------------------------------\n\n.. 
note::\n\n   Vertex AI SDK 中的以下生成式 AI 模块自 2025 年 6 月 24 日起已被弃用，并将于 2026 年 6 月 24 日移除：\n   `vertexai.generative_models`、`vertexai.language_models`、`vertexai.vision_models`、`vertexai.tuning`、`vertexai.caching`。请使用\n   `Google Gen AI SDK <https:\u002F\u002Fpypi.org\u002Fproject\u002Fgoogle-genai\u002F>`_ 访问这些功能。有关详细信息，请参阅\n   `迁移指南 <https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fgenerative-ai\u002Fdocs\u002Fdeprecations\u002Fgenai-vertexai-sdk>`_。\n   您可以继续使用其他所有 Vertex AI SDK 模块，因为它们是使用 API 的推荐方式。\n\n快速入门\n-----------\n\n为了使用本库，您首先需要完成以下步骤：\n\n1. `选择或创建一个 Cloud Platform 项目`_。\n2. `为您的项目启用结算功能`_。\n3. `启用 Vertex AI API`_。\n4. `设置身份验证`_。\n\n.. _选择或创建一个 Cloud Platform 项目: https:\u002F\u002Fconsole.cloud.google.com\u002Fproject\n.. _为您的项目启用结算功能: https:\u002F\u002Fcloud.google.com\u002Fbilling\u002Fdocs\u002Fhow-to\u002Fmodify-project#enable_billing_for_a_project\n.. _启用 Vertex AI API:  https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fstart\u002Fuse-vertex-ai-python-sdk\n.. _设置身份验证: https:\u002F\u002Fgoogleapis.dev\u002Fpython\u002Fgoogle-api-core\u002Flatest\u002Fauth.html\n\n\n支持的 Python 版本\n^^^^^^^^^^^^^^^^^^^^^^^^^\nPython >= 3.9\n\n已弃用的 Python 版本\n^^^^^^^^^^^^^^^^^^^^^^^^^^\nPython \u003C= 3.8。\n\n与 Python 3.8 兼容的本库最后一个版本是 google-cloud-aiplatform==1.90.0。\n\n与 Python 3.7 兼容的本库最后一个版本是 google-cloud-aiplatform==1.31.1。\n\n与 Python 3.6 兼容的本库最后一个版本是 google-cloud-aiplatform==1.12.1。\n\n概述\n~~~~~~~~\n本节提供 Python 版 Vertex AI SDK 的简要概述。您还可以参考 `vertex-ai-samples`_ 中的笔记本以获取示例。\n\n.. _vertex-ai-samples: https:\u002F\u002Fgithub.com\u002FGoogleCloudPlatform\u002Fvertex-ai-samples\u002Ftree\u002Fmain\u002Fnotebooks\u002Fcommunity\u002Fsdk\n\n所有公开可用的 SDK 功能都可以在 :code:`google\u002Fcloud\u002Faiplatform` 目录中找到。\n在底层，Vertex SDK 基于 GAPIC 构建，GAPIC 是 Google API CodeGen 的缩写。\nGAPIC 库代码位于 :code:`google\u002Fcloud\u002Faiplatform_v1` 和 :code:`google\u002Fcloud\u002Faiplatform_v1beta1`，\n它是根据 Google 的服务 proto 文件自动生成的。\n\n对于大多数开发需求，可以按照以下步骤确定要导入哪些库：\n\n1. 
首先查看 :code:`google\u002Fcloud\u002Faiplatform` —— 与 GAPIC 相比，Vertex SDK 的 API 几乎总是更易于使用且更简洁。\n2. 如果您寻找的功能在那里找不到，则查看 :code:`aiplatform_v1` 以确定是否在 GAPIC 中可用。\n3. 如果功能仍处于测试阶段，则会在 :code:`aiplatform_v1beta1` 中提供。\n\n如果以上情况都无法帮助您找到适合任务的工具，请随时在 GitHub 上提交问题并向我们提出功能请求。\n\n导入\n^^^^^^^^^\n可以通过导入以下命名空间来使用 Vertex AI SDK 的基于资源的功能：\n\n.. code-block:: Python\n\n    from google.cloud import aiplatform\n\n初始化\n^^^^^^^^^^^^^^\n初始化 SDK 以存储您在使用 SDK 时常用的配置。\n\n.. code-block:: Python\n\n    aiplatform.init(\n        # 您的 Google Cloud 项目 ID 或编号\n        # 未设置时使用环境默认值\n        project='my-project',\n\n        # 您将使用的 Vertex AI 区域\n        # 默认为 us-central1\n        location='us-central1',\n\n        # 与 location 同区域的 Google Cloud Storage 存储桶\n        # 用于暂存工件\n        staging_bucket='gs:\u002F\u002Fmy_staging_bucket',\n\n        # 自定义 google.auth.credentials.Credentials\n        # 如果未设置，则使用环境默认凭据\n        credentials=my_credentials,\n\n        # 客户管理的加密密钥资源名称\n        # 如果设置，将应用于所有 Vertex AI 资源\n        encryption_spec_key_name=my_encryption_key_name,\n\n        # 用于跟踪记录指标和参数的实验名称\n        experiment='my-experiment',\n\n        # 上述实验的描述\n        experiment_description='my experiment description'\n    )\n\n数据集\n^^^^^^^^\nVertex AI 提供托管的表格、文本、图像和视频数据集。在 SDK 中，数据集可用于下游的模型训练。\n\n要创建一个表格数据集：\n\n.. code-block:: Python\n\n    my_dataset = aiplatform.TabularDataset.create(\n        display_name=\"my-dataset\", gcs_source=['gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fdataset.csv'])\n\n您也可以分步骤创建和导入数据集：\n\n.. code-block:: Python\n\n    from google.cloud import aiplatform\n\n    my_dataset = aiplatform.TextDataset.create(\n        display_name=\"my-dataset\")\n\n    my_dataset.import_data(\n        gcs_source=['gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fdataset.csv'],\n        import_schema_uri=aiplatform.schema.dataset.ioformat.text.multi_label_classification\n    )\n\n要获取先前创建的数据集：\n\n.. 
code-block:: Python\n\n  dataset = aiplatform.ImageDataset('projects\u002Fmy-project\u002Flocation\u002Fus-central1\u002Fdatasets\u002F{DATASET_ID}')\n\nVertex AI 支持多种数据集模式。这些模式的引用可在 :code:`aiplatform.schema.dataset` 命名空间下找到。有关支持的数据集模式的更多信息，请参阅 `准备数据文档`_。\n\n.. _准备数据文档: https:\u002F\u002Fcloud.google.com\u002Fai-platform-unified\u002Fdocs\u002Fdatasets\u002Fprepare\n\n训练\n^^^^^^^^\nVertex AI 的 Python SDK 允许您训练自定义模型和 AutoML 模型。\n\n您可以使用自定义 Python 脚本、自定义 Python 包或容器来训练自定义模型。\n\n**准备您的自定义代码**\n\nVertex AI 自定义训练使您能够在 Vertex AI 数据集上进行训练，并生成 Vertex AI 模型。为此，您的脚本必须遵守以下约定：\n\n它必须从训练服务填充的环境变量中读取数据集：\n\n.. code-block:: Python\n\n  os.environ['AIP_DATA_FORMAT']  # 提供数据格式\n  os.environ['AIP_TRAINING_DATA_URI']  # 训练集的 URI\n  os.environ['AIP_VALIDATION_DATA_URI']  # 验证集的 URI\n  os.environ['AIP_TEST_DATA_URI']  # 测试集的 URI\n\n请访问 `在自定义训练应用中使用托管数据集`_ 以获取详细概述。\n\n.. _在自定义训练应用中使用托管数据集: https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Ftraining\u002Fusing-managed-datasets\n\n它必须将模型工件写入由训练服务填充的环境变量中：\n\n.. code-block:: Python\n\n  os.environ['AIP_MODEL_DIR']\n\n**运行训练**\n\n.. code-block:: Python\n\n  job = aiplatform.CustomTrainingJob(\n      display_name=\"my-training-job\",\n      script_path=\"training_script.py\",\n      container_uri=\"us-docker.pkg.dev\u002Fvertex-ai\u002Ftraining\u002Ftf-cpu.2-2:latest\",\n      requirements=[\"gcsfs==0.7.1\"],\n      model_serving_container_image_uri=\"us-docker.pkg.dev\u002Fvertex-ai\u002Fprediction\u002Ftf2-cpu.2-2:latest\",\n  )\n\n  model = job.run(my_dataset,\n                  replica_count=1,\n                  machine_type=\"n1-standard-4\",\n                  accelerator_type='NVIDIA_TESLA_K80',\n                  accelerator_count=1)\n\n在上面的代码块中，`my_dataset` 是在上述“数据集”部分创建的托管数据集。`model` 变量是一个托管的 Vertex AI 模型，可以部署或导出。\n\n\nAutoMLs\n-------\nVertex AI 的 Python SDK 支持 AutoML 表格、图像、文本、视频和预测模型。\n\n要训练一个 AutoML 表格模型：\n\n.. 
code-block:: Python\n\n  dataset = aiplatform.TabularDataset('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fdatasets\u002F{DATASET_ID}')\n\n  job = aiplatform.AutoMLTabularTrainingJob(\n    display_name=\"train-automl\",\n    optimization_prediction_type=\"regression\",\n    optimization_objective=\"minimize-rmse\",\n  )\n\n  model = job.run(\n      dataset=dataset,\n      target_column=\"target_column_name\",\n      training_fraction_split=0.6,\n      validation_fraction_split=0.2,\n      test_fraction_split=0.2,\n      budget_milli_node_hours=1000,\n      model_display_name=\"my-automl-model\",\n      disable_early_stopping=False,\n  )\n\n\n模型\n------\n要获取模型：\n\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n\n\n要上传模型：\n\n.. code-block:: Python\n\n  model = aiplatform.Model.upload(\n      display_name='my-model',\n      artifact_uri=\"gs:\u002F\u002Fpython\u002Fto\u002Fmy\u002Fmodel\u002Fdir\",\n      serving_container_image_uri=\"us-docker.pkg.dev\u002Fvertex-ai\u002Fprediction\u002Ftf2-cpu.2-2:latest\",\n  )\n\n\n\n要部署模型：\n\n\n.. code-block:: Python\n\n  endpoint = model.deploy(machine_type='n1-standard-4',\n                          min_replica_count=1,\n                          max_replica_count=5,\n                          accelerator_type='NVIDIA_TESLA_K80',\n                          accelerator_count=1)\n\n\n请访问 `将模型导入 Vertex AI`_ 以获取详细概述：\n\n.. _将模型导入 Vertex AI: https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fgeneral\u002Fimport-model\n\n模型评估\n----------------\n\nVertex AI 的 Python SDK 目前支持获取所有 AutoML 模型的评估指标。\n\n要列出某个模型的所有评估结果：\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  evaluations = model.list_model_evaluations()\n\n\n要获取特定模型的评估资源：\n\n.. 
code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  # 不带参数时返回第一个评估，您也可以传入评估 ID\n  evaluation = model.get_model_evaluation()\n\n  eval_metrics = evaluation.metrics\n\n\n您也可以通过直接传入模型评估的资源名称来创建对评估的引用：\n\n.. code-block:: Python\n\n  evaluation = aiplatform.ModelEvaluation(\n    evaluation_name='projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}\u002Fevaluations\u002F{EVALUATION_ID}')\n\n或者，您可以通过传入模型和评估 ID 来创建评估的引用：\n\n.. code-block:: Python\n\n  evaluation = aiplatform.ModelEvaluation(\n    evaluation_name={EVALUATION_ID},\n    model_id={MODEL_ID})\n\n\n批量预测\n----------------\n\n要创建一个批量预测任务：\n\n.. code-block:: Python\n\n  model = aiplatform.Model('projects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  batch_prediction_job = model.batch_predict(\n    job_display_name='my-batch-prediction-job',\n    instances_format='csv',\n    machine_type='n1-standard-4',\n    gcs_source=['gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Ffile.csv'],\n    gcs_destination_prefix='gs:\u002F\u002Fpath\u002Fto\u002Fmy\u002Fbatch_prediction\u002Fresults\u002F',\n    service_account='my-sa@my-project.iam.gserviceaccount.com'\n  )\n\n你也可以通过包含 `sync=False` 参数来异步创建批量预测作业：\n\n.. code-block:: Python\n\n  batch_prediction_job = model.batch_predict(..., sync=False)\n\n  # 等待资源创建完成\n  batch_prediction_job.wait_for_resource_creation()\n\n  # 获取状态\n  batch_prediction_job.state\n\n  # 阻塞直到作业完成\n  batch_prediction_job.wait()\n\n\n端点\n---------\n\n要创建一个端点：\n\n.. code-block:: Python\n\n  endpoint = aiplatform.Endpoint.create(display_name='my-endpoint')\n\n要将模型部署到已创建的端点上：\n\n.. 
code-block:: Python\n\n  model = aiplatform.Model('\u002Fprojects\u002Fmy-project\u002Flocations\u002Fus-central1\u002Fmodels\u002F{MODEL_ID}')\n\n  endpoint.deploy(model,\n                  min_replica_count=1,\n                  max_replica_count=5,\n                  machine_type='n1-standard-4',\n                  accelerator_type='NVIDIA_TESLA_K80',\n                  accelerator_count=1)\n\n要从端点获取预测结果：\n\n.. code-block:: Python\n\n  endpoint.predict(instances=[[6.7, 3.1, 4.7, 1.5], [4.6, 3.1, 1.5, 0.2]])\n\n要从端点卸载模型：\n\n.. code-block:: Python\n\n  endpoint.undeploy_all()\n\n要删除一个端点：\n\n.. code-block:: Python\n\n  endpoint.delete()\n\n\n管道\n---------\n\n要创建一个 Vertex AI 管道运行并监控其完成：\n\n.. code-block:: Python\n\n  # 实例化 PipelineJob 对象\n  pl = PipelineJob(\n      display_name=\"我的第一个管道\",\n\n      # 是否启用缓存\n      # True = 始终缓存管道步骤结果\n      # False = 永远不缓存管道步骤结果\n      # None = 延迟到管道定义中每个组件的缓存选项\n      enable_caching=False,\n\n      # 已编译管道定义的本地或 GCS 路径\n      template_path=\"pipeline.json\",\n\n      # 包含管道输入参数的字典\n      parameter_values=parameter_values,\n\n      # 用作管道根目录的 GCS 路径\n      pipeline_root=pipeline_root,\n  )\n\n  # 在 Vertex AI 中执行管道并监控其完成\n  pl.run(\n    # 用于管道运行的服务账户电子邮件地址\n    # 必须对该服务账户拥有 iam.serviceAccounts.actAs 权限才能使用\n    service_account=service_account,\n\n    # 此函数调用是同步（等待管道运行完成后再终止）还是异步（立即返回）\n    sync=True\n  )\n\n要创建一个 Vertex AI 管道而不监控其完成，可以使用 `submit` 代替 `run`：\n\n.. 
code-block:: Python\n\n  # 实例化 PipelineJob 对象\n  pl = PipelineJob(\n      display_name=\"我的第一个管道\",\n\n      # 是否启用缓存\n      # True = 始终缓存管道步骤结果\n      # False = 永远不缓存管道步骤结果\n      # None = 延迟到管道定义中每个组件的缓存选项\n      enable_caching=False,\n\n      # 已编译管道定义的本地或 GCS 路径\n      template_path=\"pipeline.json\",\n\n      # 包含管道输入参数的字典\n      parameter_values=parameter_values,\n\n      # 用作管道根目录的 GCS 路径\n      pipeline_root=pipeline_root,\n  )\n\n  # 将管道提交到 Vertex AI\n  pl.submit(\n    # 用于管道运行的服务账户电子邮件地址\n    # 必须对该服务账户拥有 iam.serviceAccounts.actAs 权限才能使用\n    service_account=service_account,\n  )\n\n\n可解释 AI：获取元数据\n----------------------------\n\n要从 TensorFlow 1 模型中以字典格式获取元数据：\n\n.. code-block:: Python\n\n  from google.cloud.aiplatform.explain.metadata.tf.v1 import saved_model_metadata_builder\n\n  builder = saved_model_metadata_builder.SavedModelMetadataBuilder(\n            'gs:\u002F\u002Fpython\u002Fto\u002Fmy\u002Fmodel\u002Fdir', tags=[tf.saved_model.tag_constants.SERVING]\n        )\n  generated_md = builder.get_metadata()\n\n要从 TensorFlow 2 模型中以字典格式获取元数据：\n\n.. code-block:: Python\n\n  from google.cloud.aiplatform.explain.metadata.tf.v2 import saved_model_metadata_builder\n\n  builder = saved_model_metadata_builder.SavedModelMetadataBuilder('gs:\u002F\u002Fpython\u002Fto\u002Fmy\u002Fmodel\u002Fdir')\n  generated_md = builder.get_metadata()\n\n要在端点部署和模型上传中使用解释性元数据：\n\n.. code-block:: Python\n\n  explanation_metadata = builder.get_metadata_protobuf()\n\n  # 使用解释性元数据部署模型到端点\n  model.deploy(..., explanation_metadata=explanation_metadata)\n\n  # 使用解释性元数据将模型部署到已创建的端点\n  endpoint.deploy(..., explanation_metadata=explanation_metadata)\n\n  # 使用解释性元数据上传模型\n  aiplatform.Model.upload(..., explanation_metadata=explanation_metadata)\n\n\nCloud Profiler\n----------------------------\n\nCloud Profiler 允许你按需对远程 Vertex AI 训练作业进行性能分析，并在 Vertex AI TensorBoard 中可视化结果。\n\n要开始使用 TensorFlow 的性能分析器，需更新你的训练脚本以包含以下内容：\n\n.. 
code-block:: Python\n\n    from google.cloud.aiplatform.training_utils import cloud_profiler\n    ...\n    cloud_profiler.init()\n\n接下来，使用 Vertex AI TensorBoard 实例运行作业。有关详细操作，请访问 https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fexperiments\u002Ftensorboard-overview\n\n最后，在 Google Cloud 控制台中打开你的 TensorBoard，导航到“Profile”选项卡，然后点击“Capture Profile”按钮。这将允许用户捕获正在运行的作业的性能分析统计数据。\n\n\n下一步\n~~~~~~~~~~\n\n- 阅读 Vertex AI API 的 `客户端库文档`_，以了解客户端提供的其他可用方法。\n- 阅读 `Vertex AI API 产品文档`_，以了解更多关于该产品的信息及操作指南。\n- 查看此 `README`_，以查看我们支持的完整云端 API 列表。\n\n.. _Vertex AI API 产品文档:  https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\n.. _README: https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fgoogle-cloud-python\u002Fblob\u002Fmain\u002FREADME.rst","# Vertex AI Python SDK 快速上手指南\n\n本指南帮助中国开发者快速安装并使用 `google-cloud-aiplatform`（Vertex AI SDK for Python），涵盖环境准备、安装及生成式 AI 核心功能的基本使用。\n\n## 环境准备\n\n在开始之前，请确保满足以下系统要求和前置条件：\n\n*   **Python 版本**：Python >= 3.9（Python 3.8 及以下版本已不再支持）。\n*   **Google Cloud 项目**：\n    1.  创建或选择一个 [Cloud Platform 项目](https:\u002F\u002Fconsole.cloud.google.com\u002Fproject)。\n    2.  为项目 [启用计费](https:\u002F\u002Fcloud.google.com\u002Fbilling\u002Fdocs\u002Fhow-to\u002Fmodify-project#enable_billing_for_a_project)。\n    3.  
[启用 Vertex AI API](https:\u002F\u002Fcloud.google.com\u002Fvertex-ai\u002Fdocs\u002Fstart\u002Fuse-vertex-ai-python-sdk)。\n*   **身份验证**：\n    *   **本地开发**：运行 `gcloud auth application-default login` 进行认证。\n    *   **生产环境**：确保运行环境配置了具有适当权限的服务账号密钥或元数据服务器访问权限。\n\n## 安装步骤\n\n推荐使用 `pip` 或 `uv` 进行安装。国内开发者若遇到下载速度慢的问题，可指定清华或阿里镜像源加速。\n\n### 使用 pip 安装\n\n```bash\n# 官方源\npip install google-cloud-aiplatform\n\n# 国内加速（推荐）\npip install google-cloud-aiplatform -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n### 使用 uv 安装\n\n```bash\nuv pip install google-cloud-aiplatform\n```\n\n> **注意**：部分旧版生成式 AI 模块（如 `vertexai.generative_models`）已于 2025 年 6 月弃用。本指南及最新实践推荐使用新的 `vertexai` 客户端接口或配合 `google-genai` SDK 使用。\n\n## 基本使用\n\n以下示例展示如何初始化客户端并执行核心的生成式 AI 任务（评估、Agent 构建、提示词优化与管理）。\n\n### 1. 初始化客户端\n\n首先实例化 Vertex AI 客户端，替换为您的项目 ID 和区域。\n\n```python\nimport vertexai\nfrom vertexai import types\n\n# 实例化 GenAI 客户端\n# 请替换为您的项目 ID 和位置 (例如: us-central1)\nclient = vertexai.Client(project='my-project', location='us-central1')\n```\n\n### 2. 生成式 AI 评估 (Gen AI Evaluation)\n\n对模型回答进行自动化评估（如精确匹配、ROUGE-L 等）。\n\n```python\nimport pandas as pd\n\n# 准备测试数据\nprompts_df = pd.DataFrame({\n    \"prompt\": [\n        \"What is the capital of France?\",\n        \"Write a haiku about a cat.\",\n    ],\n    \"reference\": [\n        \"Paris\",\n        \"Sunbeam on the floor,\\nA furry puddle sleeping,\\nTwitching tail tells tales.\",\n    ]\n})\n\n# 运行推理\ninference_results = client.evals.run_inference(\n    model=\"gemini-2.5-flash-preview-05-20\",\n    src=prompts_df\n)\n\n# 执行评估\neval_result = client.evals.evaluate(\n    dataset=inference_results,\n    metrics=[\n        types.Metric(name='exact_match'),\n        types.Metric(name='rouge_l_sum'),\n        types.RubricMetric.TEXT_QUALITY,\n    ]\n)\n```\n\n### 3. 
构建 Agent (Agent Engine with ADK)\n\n定义工具函数并创建基于 Agent Development Kit (ADK) 的智能体。\n\n```python\nfrom google.adk.agents import Agent\nfrom vertexai.agent_engines import AdkApp\nimport requests\n\n# 定义工具函数\ndef get_exchange_rate(\n    currency_from: str = \"USD\",\n    currency_to: str = \"EUR\",\n    currency_date: str = \"latest\",\n):\n    \"\"\"获取汇率\"\"\"\n    response = requests.get(\n        f\"https:\u002F\u002Fapi.frankfurter.app\u002F{currency_date}\",\n        params={\"from\": currency_from, \"to\": currency_to},\n    )\n    return response.json()\n\n# 定义 Agent\napp = AdkApp(agent=Agent(\n    model=\"gemini-2.0-flash\",\n    name='currency_exchange_agent',\n    tools=[get_exchange_rate],\n))\n\n# 本地测试\n# async for event in app.async_stream_query(\n#     user_id=\"user-id\",\n#     message=\"What is the exchange rate from US dollars to SEK today?\",\n# ):\n#     print(event)\n\n# 部署到 Agent Engine\nremote_app = client.agent_engines.create(\n    agent=app,\n    config={\n        \"requirements\": [\"google-cloud-aiplatform[agent_engines,adk]\"],\n    },\n)\n```\n\n### 4. 提示词管理 (Prompt Management)\n\n创建结构化提示词模板并调用模型生成内容。\n\n```python\n# 定义提示词对象\nprompt = {\n    \"prompt_data\": {\n        \"contents\": [{\"parts\": [{\"text\": \"Hello, {name}! 
How are you?\"}]}],\n        \"system_instruction\": {\"parts\": [{\"text\": \"Please answer in a short sentence.\"}]},\n        \"variables\": [\n            {\"name\": {\"text\": \"Alice\"}},\n        ],\n        \"model\": \"gemini-2.5-flash\",\n    },\n}\n\n# 创建提示词资源\nprompt_resource = client.prompts.create(prompt=prompt)\n\n# 获取提示词\nretrieved_prompt = client.prompts.get(prompt_id=prompt_resource.prompt_id)\n\n# 结合 google-genai SDK 进行内容生成\n# 需先安装: pip install google-genai\nfrom google import genai\n\ngenai_client = genai.Client(vertexai=True, project=\"my-project\", location=\"us-central1\")\n\nresponse = genai_client.models.generate_content(\n    model=retrieved_prompt.prompt_data.model,\n    contents=retrieved_prompt.assemble_contents(),\n)\n```","某电商公司的数据科学团队需要快速评估多个大语言模型在“智能客服回答质量”上的表现，以便选出最佳模型上线。\n\n### 没有 python-aiplatform 时\n- **手动调用接口繁琐**：开发人员需自行编写复杂的 HTTP 请求代码来调用 Vertex AI 接口，处理认证、重试机制和错误日志耗费大量精力。\n- **评估流程割裂**：生成模型回答与计算评估指标（如精确匹配、ROUGE-L）分为两个独立脚本，数据需要在本地文件间反复导入导出，极易出错。\n- **缺乏统一度量标准**：不同成员使用自定义的评估逻辑，导致无法横向对比不同模型版本的真实性能，决策缺乏数据支撑。\n- **扩展性差**：当测试集从几十条增加到上万条时，本地脚本容易因内存溢出或速率限制而崩溃，难以进行大规模压力测试。\n\n### 使用 python-aiplatform 后\n- **一站式客户端集成**：通过 `vertexai.Client` 实例化即可自动处理认证与连接，直接调用 `run_inference` 批量获取模型回答，代码量减少 70%。\n- **流水线式评估体验**：利用 `client.evals.evaluate` 方法，将推理结果直接传入并指定 `exact_match` 或 `TEXT_QUALITY` 等内置指标，瞬间完成端到端评估。\n- **标准化指标体系**：内置多种权威评估算法和评分规则（Rubric），确保团队所有成员基于同一套标准衡量模型效果，对比结果可信度高。\n- **云原生弹性伸缩**：依托 Vertex AI 后端能力，轻松处理海量测试数据，自动管理并发与资源，无需担心本地环境瓶颈。\n\npython-aiplatform 将原本碎片化、高门槛的模型评估工作转化为简洁的代码调用，让团队能专注于模型优化而非工程基建。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogleapis_python-aiplatform_b5b4a86b.png","googleapis","Google APIs","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fgoogleapis_5b6fe389.png","Clients for Google APIs and tools that help produce 
them.",null,"https:\u002F\u002Fgoogleapis.github.io","https:\u002F\u002Fgithub.com\u002Fgoogleapis",[80,84,88,91],{"name":81,"color":82,"percentage":83},"Python","#3572A5",100,{"name":85,"color":86,"percentage":87},"Shell","#89e051",0,{"name":89,"color":90,"percentage":87},"Dockerfile","#384d54",{"name":92,"color":93,"percentage":87},"Jinja","#a52a22",880,438,"2026-04-14T09:30:19","Apache-2.0","未说明",{"notes":100,"python":101,"dependencies":102},"该工具是 Google Vertex AI 的 Python SDK，主要作为客户端库调用云端 API，因此本地运行无需高性能 GPU 或大内存。使用前需配置 Google Cloud 项目、启用计费及 Vertex AI API，并设置身份验证。注意：vertexai.generative_models 等部分生成式 AI 模块已于 2025 年 6 月弃用，建议迁移至 google-genai SDK。若使用 Agent Engine 或特定评估功能，需安装额外依赖（如 adk）。","3.9+",[103,104,105,106],"google-cloud-aiplatform","pandas","requests","google-genai (可选，用于部分生成式 AI 功能)",[35,14,13,108],"其他","2026-03-27T02:49:30.150509","2026-04-15T06:05:24.029373",[],[113,118,123,128,133,138,143,148,153,158,163,168,173,178,183,188,193,198,203,208],{"id":114,"version":115,"summary_zh":116,"released_at":117},263559,"v1.143.0","## [1.143.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.142.0...v1.143.0) (2026-03-25)\n\n\n### 功能特性\n\n* 添加 AgentEngine 会话模块 ([368a8f8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F368a8f82c73a227b8fb90a36e6dfc1ff3ab91f53))\n* 在创建内存时添加 memory_id 参数 ([2167f36](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F2167f369a45ad1cfd0a701777dea4cdbc08810db))\n* 添加 RAG 元数据和 RAG 数据模式管理 API ([4f0fdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F4f0fdfe5a71e0a9fddfb86aa2c1ef5a492795f0c))\n* 在追加事件时添加 raw_event 参数 ([2167f36](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F2167f369a45ad1cfd0a701777dea4cdbc08810db))\n* GenAI SDK 客户端（多模态）——在所有函数中接受 `gemini_request_read_config` 而不是 
`template_config`。([f138162](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff138162e55e913c632c76a0762d911f795bffcd4))\n* GenAI SDK 客户端（多模态）——支持从 bigframe DataFrame 创建多模态数据集 ([9b7dc29](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9b7dc2916c55a394e8beef7c02ad122147543846))\n\n\n### 错误修复\n\n* 将受污染的 LiteLLM 版本从依赖项中排除，并将其固定为 1.82.6 ([78966da](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F78966dac863b026db0b426f63dbff72271cdda10))\n* 修复 RAG 资源解析问题 ([bc61708](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fbc6170819c35db80238a843d5d658f30e73630d3))，关闭 [#6442](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fissues\u002F6442)","2026-03-25T18:19:29",{"id":119,"version":120,"summary_zh":121,"released_at":122},263555,"v1.147.0","## [1.147.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.146.0...v1.147.0) (2026-04-09)\n\n\n### ⚠ 重大变更\n\n* 已从 GeminiExample 中移除 labels 字段。tools 和 safety_settings 字段已改为列表。\n\n### 功能\n\n* 向 Model Garden 部署方法添加 system_labels 参数。([a196cda](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa196cda777b8f6d9136a7f0b0d7264d09dab1ba9))\n* 为 Agent Engine Task Store Service 添加 delete 方法 ([2f2a211](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F2f2a211a54109e8b5701c0868b1a459a59cac626))\n* GenAI 客户端（评估）：在本地代理抓取中，将 gemini-3 模型流量路由到全球区域 ([e2e81c9](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe2e81c99b46fef3978655fc480edd5c39f098fd6))\n* GenAI SDK 客户端（多模态）：向 GeminiRequestReadConfig 添加 single_turn_template 辅助函数。([0e5037d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0e5037d31bd6566756dd79ba8606d7fdeb9b54ae))\n* 为 A2aAgent 类设置 
agent_framework。([a8085e5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa8085e50741471fa5c17a5b36b69adaab1da4f25))\n\n\n### 错误修复\n\n* 在 async_retrieve_contexts 中添加 SDK 临时解决方案，以处理双重包装的 Any 响应。([bd4983b](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fbd4983b488a3181578c552482af6ec78e8aa408e))\n* GenAI SDK 客户端（多模态）：修复 GeminiExample 类定义与 API 类型之间的不一致。([fad250e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffad250e700cc676ce66d20ca1c410829788a251c))\n\n\n### 其他杂项工作\n\n* 发布 1.147.0 版本 ([4a11370](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F4a113706f1ad9a105b0b04a6cdace9e31942d6e7))","2026-04-09T16:58:02",{"id":124,"version":125,"summary_zh":126,"released_at":127},263556,"v1.146.0","## [1.146.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.145.0...v1.146.0) (2026-04-07)\n\n\n### ⚠ 重大变更\n\n* `to_bigframes` 已从 datasets 模块中移除，并迁移到 `MultimodalDataset` 类中。不再使用 `dataframe = client.datasets.to_bigframes(multimodal_dataset=multimodal_dataset)`，而是改用 `dataframe = multimodal_dataset.to_bigframes()` 来从多模态数据集中创建 BigFrame 实例。\n\n### 功能特性\n\n* 向 Memory Bank 添加整合自定义功能 ([a8948c4](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa8948c4dfcc65b73ea579b93f8c36baa65817f25))\n* 在基于 LLM 的评估指标中添加对自定义结果解析的支持 ([3e0ddff](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F3e0ddff2f5c306d601bf325618b9136f7713ff68))\n* GenAI 客户端（evals）：添加核心数据模型及用于自动损失分析的代码生成映射 ([09794ba](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F09794ba83746377b1a94ed00f7c41f6b9d647cd6))\n* GenAI 客户端（evals）：在多轮代理抓取中应用默认用户角色“Evaluator” ([7002dc5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7002dc5452032d3189eabae9b0952decd73fb8eb))\n* 在 run_query_job 中，将 gcs_bucket 重命名为 
gcs_uri，并允许用户为输出文件设置文件名。([f302d1f](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff302d1f99b79b862a61a41c729ae56a22307bf11))\n* 将首次 bidi_stream_query 请求中的状态传递给 async_create_session ([37b5a0f](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F37b5a0f796984e3af69f8036ca235e096d64e7d1))\n\n\n### 其他杂项\n\n* GenAI SDK 客户端（多模态）：将 `to_bigframes` 方法移至 `MultimodalDataset` 类。([6874b8d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F6874b8db7adfa4022972e0eba2c3ccaffd523cf2))\n* 发布版本 1.146.0 ([aab457d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Faab457dcf8f0552b0e378c5b4d693e4aa8a2e2f1))","2026-04-08T01:01:17",{"id":129,"version":130,"summary_zh":131,"released_at":132},263557,"v1.145.0","## [1.145.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.144.0...v1.145.0)（2026-04-01）\n\n\n### 功能特性\n\n* 在 Create Session 中添加 session_id，以支持自定义会话 ID ([bea67c2](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fbea67c296637c06821df12494ba46b74de5cd2aa))\n* Eval SDK：通过使用 genai SDK 迁移模型调用方法 ([ff5e246](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fff5e24659d399816c19c641674f85ef0c5e65b6b))\n* Eval SDK：在预览文件夹中，通过使用 genai SDK 迁移模型调用方法 ([ad36123](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fad36123f6f9ae35cbe45bd925dde9458ee3cef00))\n* 在 SDK 中将指标注册表的支持限制为仅自定义代码执行指标和基于 LLM 的指标 ([c12aedc](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fc12aedcea7f778bc6cdfd1d871334ef504870c97))\n\n\n### Bug 修复\n\n* 将 VertexRagServiceClient 中 ask_contexts 和 async_retrieve_contexts 的默认超时时间增加至 600 
秒。([3de2c1e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F3de2c1eba1a16775085fb3c46b19ac9e8d5f5f22))","2026-04-01T21:51:39",{"id":134,"version":135,"summary_zh":136,"released_at":137},263558,"v1.144.0","## [1.144.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.143.0...v1.144.0) (2026-03-31)\n\n\n### ⚠ 重大变更\n\n* **实验性方法的重大变更：** `create_from_bigquery` 和 `update_multimodal_dataset` 不再自动为 BigQuery URI 补充缺失的 `bq:\u002F\u002F` 前缀。使用新函数 `MultimodalDataset.set_bigquery_uri` 时，仍会在必要时添加该前缀。\n\n### 功能\n\n* 在 `rag_retrieval.py` 中的所有检索和生成方法中添加 `metadata_filter` 的用法 ([841c597](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F841c597c35d9b6d4f101060284a89131a510a813))\n* 将 `container_spec` 添加到 Reasoning Engine 公开协议缓冲区定义中 ([9a0eefb](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9a0eefb2baa9f4044db349fce2ae5e3991a3cf92))\n* 将 `container_spec` 添加到 Reasoning Engine 公开协议缓冲区定义中 ([9a0eefb](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9a0eefb2baa9f4044db349fce2ae5e3991a3cf92))\n* 在 AgentEngines 中添加对 `container_spec` 的支持 ([da663c0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fda663c0d000deeb5663d53334abc1d997f3575b5))\n* GenAI SDK 客户端（多模态）——向 `MultimodalDataset` 添加元数据辅助工具。([e164b19](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe164b19e16356dc83e23ccd54c9bbe9f9649bad0))\n* 发布客户端批处理配置模式 ([9a0eefb](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9a0eefb2baa9f4044db349fce2ae5e3991a3cf92))\n* 重构评估实例构建并更新 LLM 指标处理器 ([7a3b436](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7a3b436bda89fd3931fa6fee30e6fed10489d200))\n* 更新自定义代码执行指标的接口，同时保留对 remote_custom_function 的支持以确保向后兼容性 
([f7733ec](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff7733ec70f4128ecdb31811c2ff33dfb3132aa98))\n\n\n### 错误修复\n\n* 自动创建的 GCS 暂存存储桶名称可预测性较低 ([1a33ad9](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F1a33ad9a56c09892b12194b7cb8615b2739386ad))\n* GenAI SDK 客户端（多模态）——将 `create_from_pandas` 中的阻塞调用替换为异步版本。([2767273](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F27672738a12c8012534d6350d3315eff95bbbaa4))\n* 为来自 AgentEngine 的出站 A2A 请求引入超时机制 ([78525d2](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F78525d20a1e2f183cabcde222e15db2756d40c2c))\n* 如果已提供 image_spec\u002Fcontainer_spec，则放宽对指定 class_methods 的要求 ([6f7b12c](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F6f7b12c40f5ae3099b997e99d899b0f5e618cfda))\n* 在 AdkApp 中统一使用 app_name ([ee9fbe1](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fee9fbe11c13bf3bb7fe7a70f1484882c056ff63f))\n\n\n### 文档\n\n* 更新 API 常用类型文档 ([9a0eefb](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9a0eefb2baa9f4044db349fce2ae5e3991a3cf92))\n\n\n### 其他杂项工作\n\n* 发布 1.144.0 版本","2026-03-31T22:59:38",{"id":139,"version":140,"summary_zh":141,"released_at":142},263560,"v1.142.0","## [1.142.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.141.0...v1.142.0) (2026-03-20)\n\n\n### 功能特性\n\n* 在 `rag_retrieval.py` 中添加 `retrieve_contexts_async` 和 `ask_contexts` SDK 方法 ([0e0137e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0e0137e120286b07e749c1f3b4beeea0308fdfc6))\n* 为 v1 添加 `VALIDATED` 函数调用模式 ([981a551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F981a551c7438cad87fc52077ddbb5c109d2c62d6))\n* 将 aiohttp 添加到 agent_engines 
的依赖项中。([394253a](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F394253a81f9bccbaef38bdb084da442668695f86))\n* 为 v1 创建会话添加自定义会话 ID 字段 ([981a551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F981a551c7438cad87fc52077ddbb5c109d2c62d6))\n* 为 v1beta1 创建会话添加自定义会话 ID 字段 ([981a551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F981a551c7438cad87fc52077ddbb5c109d2c62d6))\n* 为 Vertex SDK GenAI 评估添加 EvaluationMetric 的创建、获取和列表方法 ([f4b4244](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff4b4244c99d4981ab38f7895e652d986026c2eab))\n* 为消息 `EmbedContentRequest` 添加新的 `embed_content_config` ([981a551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F981a551c7438cad87fc52077ddbb5c109d2c62d6))\n* 为消息 `EmbedContentRequest` 添加新的 `embed_content_config` ([981a551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F981a551c7438cad87fc52077ddbb5c109d2c62d6))\n* 为事件 proto v1 添加 raw_event 字段 ([981a551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F981a551c7438cad87fc52077ddbb5c109d2c62d6))\n* 在评分标准生成中支持在指标内使用 metric_resource_name ([4dbd76c](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F4dbd76c4a701552c91ab940b9596785912105be2))\n* 支持为自定义代码执行指标传递代理数据 ([0c70de8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0c70de8a75328d81b24fe1a79b4e28bdb2863b00))\n* 在评估运行 API 中支持通过资源名称引用已注册的指标 ([76a9558](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F76a9558c6d90d2beecb5817b16e77296560c0e48))\n* 为 SDK 添加 run_query_job 和 check_query_job，用于处理长时间运行的异步任务。([0cff2d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0cff2d8365dc0e51bbe763322d0d1ca635a41c84))\n* 允许在评估中使用已注册指标的资源名称 
([72942a4](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F72942a4030bd897b7a4b07295fd35b989ef2a825))\n* 启用部署到 Agent Engine 的代理的 a2a 流式传输。([ccfd37f](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fccfd37f47bcdd1edcc26dee046bbee8284812146))\n* GenAI 客户端（评估）– 破坏性变更：代理引擎资源名称现在作为单独的 `agent` 参数传递给 `create_evaluation_run` 方法，而不是作为 `AgentInfo` 对象的一部分。如果提供了 `agent_info`，则此参数现在是必填项 ([dab185a](https:\u002F\u002F","2026-03-20T22:40:00",{"id":144,"version":145,"summary_zh":146,"released_at":147},263561,"v1.141.0","## [1.141.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.140.0...v1.141.0) (2026-03-10)\n\n\n### 功能特性\n\n* “global” 端点支持 “grpc” 传输 ([b3bae32](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fb3bae32b8577974a113ad1dc3effb5d3a3db4fe0))\n* 添加 PromptTemplateData，以在从 DataFrame 创建评估运行时支持 `context` 和 `history` 列 ([e887a2e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe887a2e5dc00462793663c647b00666e5b7bae02))\n* GenAI 客户端（评估）：将候选名称添加到本地 ADK 代理抓取中 ([79d8e1c](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F79d8e1c63ef3ec854fec7233f612eea77d5946ef))\n* GenAI 客户端（评估）：在 create_evaluation_run 中为 EvaluationDataset 中的 `agent_data` 添加验证 ([2b0a98c](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F2b0a98cc06ddaa40dab52359ade3dafa0ab07062))\n* GenAI 客户端（评估）：在 create_evaluation_run 中引入 agent_config（代理）配置 ([eacc86c](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Feacc86c1fec17d48cf6b2ec93e8ef03560496f7d))\n* GenAI 客户端（评估）：将代理配置映射修补至代理数据 ([8ba4707](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F8ba4707cab8c33aa63bacf1c56bb6b61a74aea70))\n* GenAI 客户端（评估）：更新以启用评估管理服务中的代理运行功能 
([7a59738](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7a597382a0f793cf8784bb4e2735c756b43886ea))\n* 更新 match_service proto，以支持新的嵌入元数据字段 ([fba5350](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffba53506b8b59746a822824aad5f08975450112c))\n* 更新 match_service proto，以支持新的嵌入元数据字段 ([fba5350](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffba53506b8b59746a822824aad5f08975450112c))\n* 更新 match_service proto，以支持新的嵌入元数据字段 ([fba5350](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffba53506b8b59746a822824aad5f08975450112c))\n\n\n### Bug 修复\n\n* GenAI 客户端（评估）：向 _json_serializer 添加 datetime 和 bytes 序列化 ([7410b1d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7410b1d4361c42e52d6527ed96e69349cbb84378))","2026-03-10T21:56:50",{"id":149,"version":150,"summary_zh":151,"released_at":152},263562,"v1.140.0","## [1.140.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.139.0...v1.140.0) (2026-03-04)\n\n\n### 功能特性\n\n* 在消息 `.google.cloud.aiplatform.v1beta1.Metric` 中新增字段 `computation_based_metric_spec` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在消息 `.google.cloud.aiplatform.v1beta1.EvaluateDatasetRun` 中新增字段 `evaluation_run` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在消息 `.google.cloud.aiplatform.v1beta1.EvaluationConfig` 中新增字段 `inference_generation_config` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在消息 `.google.cloud.aiplatform.v1beta1.Metric` 中新增字段 `llm_based_metric_spec` 
([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在消息 `.google.cloud.aiplatform.v1beta1.EvaluateInstancesResponse` 中新增字段 `metric_results` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在消息 `.google.cloud.aiplatform.v1beta1.Metric` 中新增字段 `predefined_metric_spec` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 新增消息 `ComputationBasedMetricSpec` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 新增消息 `LLMBasedMetricSpec` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 新增消息 `MetricResult` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 新增消息 `PredefinedMetricSpec` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在消息 `GenerateContentResponse.UsageMetadata` 中添加 `traffic_type, tool_use_prompt_tokens_details` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 向 VertexRagService 添加 AskContexts 和 AsyncRetrieveContexts API ([fa610af](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffa610af6e77ab9085ee1cbaab833c310a2706708))\n* 向 VertexRagService 添加 AskContexts 和 AsyncRetrieveContexts API ([fa610af](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffa610af6e77ab9085ee1cbaab833c310a2706708))\n* 为 Agent Engine Task Store Service 的 GenAI 客户端添加异步封装器 
([199a406](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F199a40662a6021910c6f8c403523b720546bb711))\n* 在 Reasoning Engine 公开协议缓冲区中添加 `image_spec` ([e5f71de](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe5f71de169f378ee3ae992b15f8c38b1cf9454a1))\n* 在 Reasoning Engine 公开…","2026-03-04T00:47:12",{"id":154,"version":155,"summary_zh":156,"released_at":157},263563,"v1.139.0","## [1.139.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.138.0...v1.139.0) (2026-02-24)\n\n\n### 功能\n\n* 为定期管道客户端 GA 添加 `max_concurrent_active_run_count` 支持。([a204e74](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa204e742a5bf398cd65f47508c04500432d3d884))\n* GenAI 客户端（评估）- 更新代理数据的 SDK 类型定义 ([6ac28a5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F6ac28a546218feb6c3cd1d29f36cd9e980eba778))","2026-02-25T00:39:43",{"id":159,"version":160,"summary_zh":161,"released_at":162},263564,"v1.138.0","## [1.138.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.137.0...v1.138.0)（2026-02-17）\n\n\n### 功能特性\n\n* 在 AE 部署中添加对 BYO-dockerfile 的支持（[7572601](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7572601e4eb15167c3c6965039994d4f0c069d67)）\n* GenAI SDK 客户端 - 在创建代理引擎沙盒时，使操作轮询间隔可配置（[bf9e0ff](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fbf9e0ffbdb9d4c08817cb54d46f6b22f968f077d)）\n* GenAI SDK 客户端（多模态）- 支持对多模态数据集进行批量预测资源评估。（[0fe5314](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0fe5314995cb07388408587ed843fb5586b797b0)）\n* GenAI SDK 客户端（多模态）- 支持对多模态数据集进行批量预测有效性评估。（[a63e8d5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa63e8d5980cec5851c933e8f446e63b5ca847df5)）\n* GenAI SDK 客户端（多模态）- 
支持对多模态数据集进行调优有效性评估。（[12f5aa5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F12f5aa5799d8e342d8fdf2d1a2b14dc8e05aa0da)）\n* 更新 ADK 模板，当 OTEL_SEMCONV_STABILITY_OPT_IN 设置为“gen_ai_latest_experimental”时，直接将日志导出到 Cloud Logging。（[82db4ad](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F82db4adcf5900bd9bb38f1d7a6949e0cc09a05cd)）\n\n\n### Bug 修复\n\n* 重构 _streaming_agent_run_with_events 中的会话检索回退逻辑。（[8aec754](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F8aec75496dc0aab66dec7012ac6e9f3272ffccfd)）","2026-02-17T22:03:29",{"id":164,"version":165,"summary_zh":166,"released_at":167},263565,"v1.137.0","## [1.137.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.136.0...v1.137.0) (2026-02-11)\n\n\n### Features\n\n* Add filter_groups to PurgeMemories for metadata-based filtering ([6907f89](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F6907f89be1fa40a29f779fa79b785513248e35d6))\n* Deprecate prompt_optimizer.optimize and prompt_optimizer.optimize_prompt in favor of prompts.launch_optimization_job and prompts.optimize ([ff811f5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fff811f5dc2bb82eec2a38c5fe6d69961e9bbe78b))\n\n\n### Bug Fixes\n\n* Use GOOGLE_CLOUD_AGENT_ENGINE_LOCATION env var for service locations. 
([04aacbb](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F04aacbb76c0ce0280daf5a4cd4fd5496e8d3f5e1))","2026-02-11T16:15:48",{"id":169,"version":170,"summary_zh":171,"released_at":172},263566,"v1.136.0","## [1.136.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.135.0...v1.136.0) (2026-02-04)\n\n\n### Features\n\n* Add `fps` to message `VideoMetadata` ([157381a](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F157381a3c402fdca0c226b4f4b3fa4a2b17a2cf9))\n* Add agent_card to agent engine spec ([d685d81](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fd685d8193eea8123f7de933213bd8487faf03f02))\n* Force-delete for RagCorpora, ignoring any external errors and ensuring deletion of the RagCorpus ([157381a](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F157381a3c402fdca0c226b4f4b3fa4a2b17a2cf9))\n* Update data types from discovery doc. 
([a5748fd](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa5748fd4f09239e437cdf11efb24d22746b35174))\n\n\n### Bug Fixes\n\n* Support custom credentials in RAG.upload_file ([66c4d85](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F66c4d85f51a9fb5ca0afc23cec9b20850e7f6f3e)), closes [#4986](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fissues\u002F4986)\n* Test fix internal ([1bbf7bb](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F1bbf7bbd0725a4150558c7cc2980203697f8dafa))","2026-02-04T16:12:05",{"id":174,"version":175,"summary_zh":176,"released_at":177},263567,"v1.135.0","## [1.135.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.134.0...v1.135.0) (2026-01-27)\n\n\n### Features\n\n* Add `ComputationBasedMetricSpec` to support metrics like EXACT_MATCH, BLEU, and ROUGE in EvaluationRun configurations. ([9d32dd5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9d32dd56d56825e60f45fa7d5f60aba365950367))\n* Add autoscaling_target_dcgm_fi_dev_gpu_util, autoscaling_target_vllm_gpu_cache_usage_perc, autoscaling_target_vllm_num_requests_waiting options in model deployment on Endpoint & Model classes. ([0179aa5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0179aa5fd18260ffa9723abb16b1ec2d36571dd2))\n* Add optimize and launch_optimization_job methods to prompts module ([044c3fa](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F044c3fa04d4e595d132a1f76f96dc0a9cec39863))\n* List all Model Garden models ([54260fd](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F54260fdece355f76ed14cb6f0a577dc6c63e2046))\n* RAG - Add Serverless and Spanner modes in preview. 
([79da831](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F79da8316dfcd3d8200d4ec1d2dca624bfc6988a8))\n* Remove experimental warning from sandboxes and add specific warning for non-code execution sandboxes. ([7d31d13](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7d31d137b351201b1e40c649da08f96e8be1094e))\n* Update the v1 service definition to add embedding_metadata. ([8f5bfc5](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F8f5bfc5476e150bc64cc43d46f6a4a2932c07df8))\n\n\n### Documentation\n\n* Update prompt optimizer code in readme to use methods from prompts module ([4636507](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F4636507b398e63512b9559256582c3fdda1ce2e3))","2026-01-28T00:15:55",{"id":179,"version":180,"summary_zh":181,"released_at":182},263568,"v1.134.0","## [1.134.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.133.0...v1.134.0) (2026-01-18)\n\n\n### Features\n\n* Add metadata to memories ([f9fc79d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff9fc79dda6888538486f4fd6a44aa02fa1bcba75))\n* Expose PSC for OpenModel ([feeb54d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ffeeb54d7a227adfadfb7d45a425c16e260dcb16b))\n* GenAI Client(evals) - Add support for `inference_configs` in `create_evaluation_run`. 
([33fe72a](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F33fe72a41de35f43c1ceb905ecf5652d5257b3ac))\n* GenAI SDK client - Support agent engine sandbox http request in genai sdk ([11c23a3](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F11c23a36a2a2e8a7ac6e9a4d6934943d9e8d1aa9))\n* Support metadata filtering for memory retrieval ([f9fc79d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff9fc79dda6888538486f4fd6a44aa02fa1bcba75))\n* Support metadata merge strategies for memory generation ([f9fc79d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff9fc79dda6888538486f4fd6a44aa02fa1bcba75))\n* Support Python 3.14 for reasoning engine. ([394cd1d](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F394cd1d5c29eeca46804fca90f6a9a43ab92206d))\n* Update data types from discovery doc. ([0c6fb66](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0c6fb66ed5f641f60d5d1d14a51a5f4fcfa64aa1))\n* Update data types from discovery doc. 
([a451fa3](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa451fa374f670d2540f654866eb1091948efaf79))\n\n\n### Bug Fixes\n\n* Mistyping of langchain tools causing mypy errors ([0705a37](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F0705a378c6b81fa82a8e77c9c6026130209e57fb))\n* Test fix internal ([b1b900e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fb1b900e953f9391b901cbdbe448a976d63fa3aca))","2026-01-20T18:58:58",{"id":184,"version":185,"summary_zh":186,"released_at":187},263569,"v1.133.0","## [1.133.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.132.0...v1.133.0) (2026-01-08)\n\n\n### Features\n\n* Deprecate tuning public preview SDK in favor of tuning SDK ([35d362c](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F35d362ce8f6c50498f781857e0d8cabd327284be))\n* GenAI SDK client - Enabling Few-shot Prompt Optimization by passing either \"OPTIMIZATION_TARGET_FEW_SHOT_RUBRICS\" or \"OPTIMIZATION_TARGET_FEW_SHOT_TARGET_RESPONSE\" to the `optimize_prompt` method ([715cc5b](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F715cc5b71b996eecde2d97bad71a617274739dcc))\n* GenAI SDK client(memory): Add enable_third_person_memories ([65717fa](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F65717fa0c3d9b8c3105638cf9c75ee415f36b6e0))\n* Support Developer Connect in AE ([04f1771](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F04f1771e16f54a0627ecac1266764ca77f833694))\n\n\n### Bug Fixes\n\n* Add None check for agent_info in evals.py ([c8c0f0f](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fc8c0f0f7eb67696c2e91902af7e6dca20cea2040))\n* GenAI client(evals) - Fix TypeError in _build_generate_content_config 
([be2eaaa](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fbe2eaaa30dbf13a86f6856771eeacd2a51a97806))\n* Make project_number to project_id mapping fail-open. ([f1c8458](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff1c8458dd5e4641cb03ff175f0837b6d6017c131))\n* Replace asyncio.run with create_task in ADK async thread mains. ([83f4076](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F83f4076706d808dcc0e1784219856846540e10da))\n* Replace asyncio.run with create_task in ADK async thread mains. ([8c876ef](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F8c876ef069d0fe6942790ede41e203196cd4a390))\n* Require uri or staging bucket configuration for saving model to Vertex Experiment. ([5448f06](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F5448f065fa30d77c2ee0868249ec0bea6a93a4c0))\n* Return embedding metadata if available ([d9c6eb1](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fd9c6eb199b6ccc1fae417463e1b374574f2ae2f8))\n* Update `examples_dataframe` type to `PandasDataFrame` in Prompt Optimizer. 
([a2564cc](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa2564cc3ea5c4860ee732f14cea9db2c10b52420))","2026-01-08T22:05:06",{"id":189,"version":190,"summary_zh":191,"released_at":192},263570,"v1.132.0","## [1.132.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.131.0...v1.132.0) (2025-12-17)\n\n\n### Features\n\n* Add Lustre support to the Vertex Training Custom Job API ([71747e8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F71747e8cbb028f321fd9499bd77656b083909eb0))\n\n\n### Documentation\n\n* A comment for field `restart_job_on_worker_restart` in message `.google.cloud.aiplatform.v1beta1.Scheduling` is changed ([71747e8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F71747e8cbb028f321fd9499bd77656b083909eb0))\n* A comment for field `timeout` in message `.google.cloud.aiplatform.v1beta1.Scheduling` is changed ([71747e8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F71747e8cbb028f321fd9499bd77656b083909eb0))","2025-12-17T06:03:44",{"id":194,"version":195,"summary_zh":196,"released_at":197},263571,"v1.131.0","## [1.131.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.130.0...v1.131.0) (2025-12-16)\n\n\n### Features\n\n* Allow list of events to be passed to AdkApp.async_stream_query ([dd8840a](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fdd8840a5012b2762f8b8971b6cea4302ac5c648d))\n* GenAI Client(evals) - Support CustomCodeExecution metric in Vertex Gen AI Eval Service ([4114728](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F4114728750b5b12f991a18df87c1f1a570d1b29d))\n* Updates the ADK template to direct structured JSON logs to standard output. 
([a65ec29](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fa65ec297c5b8d99e4d2dfb49473c189197198f97))\n\n\n### Bug Fixes\n\n* Fix RagManagedVertexVectorSearch when using backend_config ([df0976e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fdf0976ed3195dc8313f4728bc5ecb29dda55d467))\n* GenAI Client(evals) - patch for vulnerability in visualization ([8a00d43](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F8a00d43dbd24e95dbab6ea32c63ce0a5a1849480))","2025-12-17T01:01:26",{"id":199,"version":200,"summary_zh":201,"released_at":202},263572,"v1.130.0","## [1.130.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.129.0...v1.130.0) (2025-12-10)\n\n\n### Features\n\n* A new field `min_gpu_driver_version` is added to message `.google.cloud.aiplatform.v1beta1.MachineSpec` ([26dfdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26dfdfec208667cf25ecfd5649af3676586ff139))\n* Adding RagManagedVertexVectorSearch Vector DB option for RAG corpuses to SDK ([da79e21](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fda79e218cacfa39e00d57ada7f0ec2b12fa35a84))\n* Expose FullFineTunedResources for full fine tuned deployments ([26dfdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26dfdfec208667cf25ecfd5649af3676586ff139))\n* Expose zone when creating a FeatureOnlineStore ([26dfdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26dfdfec208667cf25ecfd5649af3676586ff139))\n* GenAI Client(evals) - Add support to local agent run for agent eval ([30e41d0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F30e41d01f3fd0ef08da6ad6eb7f83df34476105e))\n* GenAI SDK client(memory): Add PurgeMemories 
([95eb10f](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F95eb10f7a9572553f2f238e95252619b8a340736))\n* Introduce RagManagedVertexVectorSearch as a new vector db option ([26dfdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26dfdfec208667cf25ecfd5649af3676586ff139))\n\n\n### Documentation\n\n* Update `ReplicatedVoiceConfig.mime_type` comment ([26dfdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26dfdfec208667cf25ecfd5649af3676586ff139))\n* Update `ReplicatedVoiceConfig.mime_type` comment ([26dfdfe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26dfdfec208667cf25ecfd5649af3676586ff139))","2025-12-10T13:55:59",{"id":204,"version":205,"summary_zh":206,"released_at":207},263573,"v1.129.0","## [1.129.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.128.0...v1.129.0) (2025-12-03)\n\n\n### ⚠ BREAKING CHANGES\n\n* An existing field `transfer_to_agent` is removed from message `.google.cloud.aiplatform.v1beta1.EventActions`\n* updating `bigtable_metadata` field name in `FeatureOnlineStore`\n* updating `enableDirectBigtableAccess` field name in `FeatureOnlineStore`\n* updating `bigtable_metadata` field name in `FeatureView`\n\n### Features\n\n* Add `gpu_partition_size` in `machine_spec` v1 api ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add `ReplicatedVoiceConfig` to `VoiceConfig` to enable Gemini TTS voice replication ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add `ReplicatedVoiceConfig` to `VoiceConfig` to enable Gemini TTS voice replication ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* 
Add EmbedContent method v1 ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add EmbedContent method v1beta1 ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add FunctionResponsePart and excluded_predefined_functions in ComputerUse ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add FunctionResponsePart and excluded_predefined_functions in ComputerUse ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add new fields `SUCCESSFULLY_DEPLOYED` and `FAILED_TO_DEPLOY` to `DeploymentStage` ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add new fields `SUCCESSFULLY_DEPLOYED` and `FAILED_TO_DEPLOY` to `DeploymentStage` ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add order_by to list_events ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add support for developer connect based deployment ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Add support for developer connect based deployment ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Continuous Tuning ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Enable Vertex Model Garden Managed OSS Fine Tuning. 
([26b7e51](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F26b7e51603e3e523f41277c697adc50962fb70f0))\n* GenAI Client(evals) - Add location override parameter to run_inference and evaluate methods ([b867043](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fb867043ca506e08c87f753f0c9b81ba007bfcec5))\n* GenAI Client(evals) - support setting autorater generation config for predefined rubric metrics ([9304f15](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9304f156d89b1a36b0794692798ddd5dbda128a5))\n* GenAI SDK client(multimodal) - Support Assess Tuning Resource for multimodal dataset. ([bc26160](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fbc26160c89817a985770608912b565e40127da73))\n* GenAI SDK client(sessions): Add label to Sessions ([837c8ea](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F837c8ea05479ae43847d2e0f9e7d80385f43ba0e))\n\n\n### Bug Fixes\n\n* Add OTel cloud.provider attribute to AdkTemplate ([7d3bcdd](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F7d3bcddf790e747b261089bc3295b8af3c915959))\n* Add support for app in _init_session ([d9f6c58](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fd9f6c58a2315352b41cf763646243229a3fe5059))\n* An existing field `transfer_to_agent` is removed from message `.google.cloud.aiplatform.v1beta1.EventActions` ([e0bc3d8](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe0bc3d83be3f508a500b693d11c7185199a7f454))\n* Correlate traces with logs in Cloud Trace panel on `adk deploy agent_engine` ([9301551](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F9301551181d80b4ef199a6db5daf7fbe76e0fb7a))\n* Enable `from vertexai.types import TypeName` without needing to run `from vertexai import types` first 
([46285bf](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F46285bf11623cfe6ac77011be11996388f67f1d0))\n* Enable `from vertexai.types import TypeName` without needing to run `from vertexai import types` first ([f4a6cbe](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff4a6cbe6530a5a01b2c5fb95f388dfcc8e8e331f))\n* Gen AI SDK client - Fix bug in GCS bucket creation for new agent engines. ([8d4ce38](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F8d4ce38c3b02672e06458554c7fb3475d0d8c052))\n* GenAI SDK client(eval) - Reorder the params to put the Config param at the last pla","2025-12-03T03:06:00",{"id":209,"version":210,"summary_zh":211,"released_at":212},263574,"v1.128.0","## [1.128.0](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcompare\u002Fv1.127.0...v1.128.0) (2025-11-18)\n\n\n### Features\n\n* GenAI Client(evals) - Add `pass_rate` to `AggregatedMetricResult` and calculate it for adaptive rubric metrics. ([1f1f67e](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F1f1f67e548b7616653f6d84954376b1d4e095ca0))\n* GenAI SDK client - Support `build options` in Agent Engine GCS Deployment. ([28499a9](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F28499a92466994232191e5aaf7745180abb4a640))\n* GenAI SDK client - Support `build options` in Agent Engine source-based Deployment. ([f7e718f](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Ff7e718fbe82a82fc301a95105bc8a44d65abd652))\n* GenAI SDK client(multimodal) - Support Assemble feature on the multimodal datasets. 
([2195411](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F21954110fa700f539c17ed6deca3ea0ac789019e))\n\n\n### Bug Fixes\n\n* Fix the change runner behavior back to sync function in streaming_agent_run_with_events ([e9d9c31](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fe9d9c31f8b386e21094fb98820aa263d5ee7224e))\n* GenAI Client(evals) - fix eval visualizations in Vertex Workbench ([c3abe51](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002Fc3abe512332fa9e0a9a23e269ff8998d99e3b703))\n* GenAI Client(evals) - Reformat codebase 1. Remove duplicated code in _evals_utils and _evals_metric_loader 2. Keep metric utils in _evals_metric_loader and data util in _evals_utils ([5f3c655](https:\u002F\u002Fgithub.com\u002Fgoogleapis\u002Fpython-aiplatform\u002Fcommit\u002F5f3c6558691010ee1211f548e8d7ffa254a8fad3))","2025-11-19T01:25:09"]