[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-Undertone0809--promptulate":3,"tool-Undertone0809--promptulate":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",144730,2,"2026-04-07T23:26:32",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107888,"2026-04-06T11:32:50",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 
助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":10,"last_commit_at":59,"category_tags":60,"status":17},4487,"LLMs-from-scratch","rasbt\u002FLLMs-from-scratch","LLMs-from-scratch 是一个基于 PyTorch 的开源教育项目，旨在引导用户从零开始一步步构建一个类似 ChatGPT 的大型语言模型（LLM）。它不仅是同名技术著作的官方代码库，更提供了一套完整的实践方案，涵盖模型开发、预训练及微调的全过程。\n\n该项目主要解决了大模型领域“黑盒化”的学习痛点。许多开发者虽能调用现成模型，却难以深入理解其内部架构与训练机制。通过亲手编写每一行核心代码，用户能够透彻掌握 Transformer 架构、注意力机制等关键原理，从而真正理解大模型是如何“思考”的。此外，项目还包含了加载大型预训练权重进行微调的代码，帮助用户将理论知识延伸至实际应用。\n\nLLMs-from-scratch 特别适合希望深入底层原理的 AI 开发者、研究人员以及计算机专业的学生。对于不满足于仅使用 API，而是渴望探究模型构建细节的技术人员而言，这是极佳的学习资源。其独特的技术亮点在于“循序渐进”的教学设计：将复杂的系统工程拆解为清晰的步骤，配合详细的图表与示例，让构建一个虽小但功能完备的大模型变得触手可及。无论你是想夯实理论基础，还是为未来研发更大规模的模型做准备",90106,"2026-04-06T11:19:32",[35,15,13,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":67,"readme_en":68,"readme_zh":69,"quickstart_zh":70,"use_case_zh":71,"hero_image_url":72,"owner_login":73,"owner_name":74,"owner_avatar_url":75,"owner_bio":76,"owner_company":77,"owner_location":78,"owner_email":79,"owner_twitter":80,"owner_website":81,"owner_url":82,"languages":83,"stars":96,"forks":97,"last_commit_at":98,"license":99,"difficulty_score":100,"env_os":101,"env_gpu":101,"env_ram":101,"env_deps":102,"category_tags":107,"github_topics":108,"view_count":32,"oss_zip_url":81,"oss_zip_packed_at":81,"status":17,"created_at":120,"updated_at":121,"faqs":122,"releases":163},5371,"Undertone0809\u002Fpromptulate","promptulate","🚀Lightweight Large language model automation and Autonomous Language Agents development framework. 
Build your LLM Agent Application in a pythonic way!","Promptulate 是由 Cogit Lab 打造的一款轻量级大语言模型自动化与自主智能体（Agent）开发框架。它致力于让开发者通过纯正的 Python 风格，以极简的代码快速构建复杂的 AI 应用。\n\n在传统开发中，整合大模型、工具链及记忆模块往往需要繁琐的配置和大量的样板代码。Promptulate 有效解决了这一痛点，它将 LLM、Agent、工具调用及 RAG 等核心组件高度封装，大多数任务仅需几行代码即可完成。其独特的技术亮点在于“函数即工具”的设计理念，允许开发者直接将任意 Python 函数转化为智能体可调用的工具；同时提供丰富的生命周期钩子（Hooks）和提示词缓存机制，既增强了灵活性又提升了运行效率。此外，它还无缝兼容 LangChain 等主流生态，并内置了强大的 OpenAI 封装与 Streamlit 组件，进一步降低了集成门槛。\n\n这款框架非常适合希望高效落地 AI 应用的 Python 开发者、算法工程师及技术研究人员。如果你不想花费大量时间学习复杂的框架细节，而是希望通过直观的 `pne.chat()` 接口迅速实现想法，Promptulate 将是","Promptulate 是由 Cogit Lab 打造的一款轻量级大语言模型自动化与自主智能体（Agent）开发框架。它致力于让开发者通过纯正的 Python 风格，以极简的代码快速构建复杂的 AI 应用。\n\n在传统开发中，整合大模型、工具链及记忆模块往往需要繁琐的配置和大量的样板代码。Promptulate 有效解决了这一痛点，它将 LLM、Agent、工具调用及 RAG 等核心组件高度封装，大多数任务仅需几行代码即可完成。其独特的技术亮点在于“函数即工具”的设计理念，允许开发者直接将任意 Python 函数转化为智能体可调用的工具；同时提供丰富的生命周期钩子（Hooks）和提示词缓存机制，既增强了灵活性又提升了运行效率。此外，它还无缝兼容 LangChain 等主流生态，并内置了强大的 OpenAI 封装与 Streamlit 组件，进一步降低了集成门槛。\n\n这款框架非常适合希望高效落地 AI 应用的 Python 开发者、算法工程师及技术研究人员。如果你不想花费大量时间学习复杂的框架细节，而是希望通过直观的 `pne.chat()` 接口迅速实现想法，Promptulate 将是一个得心应手的选择，助你轻松从原型验证迈向生产部署。","\u003Cp align=\"center\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_e84b5053274f.png\" alt=\"promptulate\" style=\"border-radius: 15px;\"\u002F>\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n    \u003Ca target=\"_blank\" href=\"\">\n        \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Flicense\u002FUndertone0809\u002Fpromptulate.svg?style=flat-square\" \u002F>\n    \u003C\u002Fa>\n    \u003Ca target=\"_blank\" href=''>\n        \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Frelease\u002FUndertone0809\u002Fpromptulate\u002Fall.svg?style=flat-square\"\u002F>\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fpypi.org\u002Fproject\u002Fpromptulate\" target=\"_blank\">\n        \u003Cimg 
src=\"https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fpyversions\u002Fpromptulate.svg?color=%2334D058\" alt=\"Supported Python versions\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Ft.me\u002Fzeeland0809\" target=\"_blank\">\n      \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FTelegram-join%20chat-2CA5E0?logo=telegram&logoColor=white\" alt=\"chat on Telegram\">\n    \u003C\u002Fa>\n    \u003Ca target=\"_blank\" href=''>\n        \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_d202bd689978.png\"\u002F>\n    \u003C\u002Fa>\n\u003C\u002Fp>\n\n[English](\u002FREADME.md) [中文](\u002FREADME_zh.md)\n\n## Overview\n\n**Promptulate** is an AI Agent application development framework crafted by **Cogit Lab**, which offers developers an extremely concise and efficient way to build Agent applications through a Pythonic development paradigm. The core philosophy of Promptulate is to borrow and integrate the wisdom of the open-source community, incorporating the highlights of various development frameworks to lower the barrier to entry and unify the consensus among developers. With Promptulate, you can manipulate components like LLM, Agent, Tool, RAG, etc., with the most succinct code, as most tasks can be easily completed with just a few lines of code. 🚀\n\n## 💡 Features\n\n- 🐍 Pythonic Code Style: Embraces the habits of Python developers, providing a Pythonic SDK calling approach, putting everything within your grasp with just one `pne.chat` function to encapsulate all essential functionalities.\n- 🧠 Model Compatibility: Supports nearly all types of large models on the market and allows for easy customization to meet specific needs.\n- 🕵️‍♂️ Diverse Agents: Offers various types of Agents, such as WebAgent, ToolAgent, CodeAgent, etc., capable of planning, reasoning, and acting to handle complex problems. 
Atomize the Planner and other components to simplify the development process.\n- 🔗 Low-Cost Integration: Effortlessly integrates tools from different frameworks like LangChain, significantly reducing integration costs.\n- 🔨 Functions as Tools: Converts any Python function directly into a tool usable by Agents, simplifying the tool creation and usage process.\n- 🪝 Lifecycle and Hooks: Provides a wealth of Hooks and comprehensive lifecycle management, allowing the insertion of custom code at various stages of Agents, Tools, and LLMs.\n- 💻 Terminal Integration: Easily integrates application terminals, with built-in client support, offering rapid debugging capabilities for prompts.\n- ⏱️ Prompt Caching: Offers a caching mechanism for LLM Prompts to reduce repetitive work and enhance development efficiency.\n- 🤖 Powerful OpenAI Wrapper: With pne, you no longer need the OpenAI SDK; the core functions can be replaced with `pne.chat`, which also provides enhanced features that reduce development difficulty.\n- 🧰 Streamlit Component Integration: Quickly build prototypes with many out-of-the-box examples and reusable Streamlit components.\n\nThe following diagram shows the core architecture of `promptulate`:\n\n![promptulate-architecture](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_01f3209ca3b1.png)\n\nThe core concept of Promptulate is to provide a simple, Pythonic, and efficient way to build AI applications, which means you don't need to spend a lot of time learning the framework. We hope that `pne.chat()` can do most of the work, so you can build any AI application with just a few lines of code.\n\n> Below, `pne` is used as shorthand for Promptulate. 
The `p` and `e` represent the beginning and end of Promptulate, respectively, and `n` stands for 9, which is a shorthand for the nine letters between `p` and `e`.\n\n## Supported Base Models\n\nPromptulate integrates the capabilities of [litellm](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm), supporting nearly all types of large models on the market, including but not limited to the following models:\n\n| Provider      | [Completion](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002F#basic-usage) | [Streaming](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fcompletion\u002Fstream#streaming-responses)  | [Async Completion](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fcompletion\u002Fstream#async-completion)  | [Async Streaming](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fcompletion\u002Fstream#async-streaming)  | [Async Embedding](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fembedding\u002Fsupported_embedding)  | [Async Image Generation](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fimage_generation)  |\n| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |\n| [openai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fopenai)  | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [azure](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fazure)  | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [aws - sagemaker](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Faws_sagemaker)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [aws - bedrock](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fbedrock)  | ✅ | ✅ | ✅ | ✅ |✅ |\n| [google - vertex_ai [Gemini]](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fvertex)  | ✅ | ✅ | ✅ | ✅ |\n| [google - palm](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fpalm)  | ✅ | ✅ | ✅ | ✅ |\n| [google AI Studio - gemini](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fgemini)  | ✅ |  | ✅ |  | |\n| [mistral ai 
api](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fmistral)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [cloudflare AI Workers](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fcloudflare_workers)  | ✅ | ✅ | ✅ | ✅ |\n| [cohere](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fcohere)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [anthropic](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fanthropic)  | ✅ | ✅ | ✅ | ✅ |\n| [huggingface](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fhuggingface)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [replicate](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Freplicate)  | ✅ | ✅ | ✅ | ✅ |\n| [together_ai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Ftogetherai)  | ✅ | ✅ | ✅ | ✅ |\n| [openrouter](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fopenrouter)  | ✅ | ✅ | ✅ | ✅ |\n| [ai21](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fai21)  | ✅ | ✅ | ✅ | ✅ |\n| [baseten](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fbaseten)  | ✅ | ✅ | ✅ | ✅ |\n| [vllm](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fvllm)  | ✅ | ✅ | ✅ | ✅ |\n| [nlp_cloud](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fnlp_cloud)  | ✅ | ✅ | ✅ | ✅ |\n| [aleph alpha](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Faleph_alpha)  | ✅ | ✅ | ✅ | ✅ |\n| [petals](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fpetals)  | ✅ | ✅ | ✅ | ✅ |\n| [ollama](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Follama)  | ✅ | ✅ | ✅ | ✅ |\n| [deepinfra](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fdeepinfra)  | ✅ | ✅ | ✅ | ✅ |\n| [perplexity-ai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fperplexity)  | ✅ | ✅ | ✅ | ✅ |\n| [Groq AI](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fgroq)  | ✅ | ✅ | ✅ | ✅ |\n| 
[anyscale](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fanyscale)  | ✅ | ✅ | ✅ | ✅ |\n| [voyage ai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fvoyage)  |  |  |  |  | ✅ |\n| [xinference [Xorbits Inference]](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fxinference)  |  |  |  |  | ✅ |\n\npne's broad model support lets you call nearly any third-party model with ease.\n\nNow let's see how to run a local llama2 model served by ollama with pne.\n\n```python\nimport promptulate as pne\n\nresp: str = pne.chat(model=\"ollama\u002Fllama2\", messages=[{\"content\": \"Hello, how are you?\", \"role\": \"user\"}])\n```\n\n🌟 2024.5.14 OpenAI launched their newest \"omni\" model, offering improved speed and pricing compared to turbo.\n\nYou can use its multimodal capabilities in any of your promptulate applications!\n\n```python\nimport promptulate as pne\n\nmessages = [\n    {\n        \"role\": \"user\",\n        \"content\": [\n            {\"type\": \"text\", \"text\": \"What's in this image?\"},\n            {\n                \"type\": \"image_url\",\n                \"image_url\": \"https:\u002F\u002Fupload.wikimedia.org\u002Fwikipedia\u002Fcommons\u002Fthumb\u002Fd\u002Fdd\u002FGfp-wisconsin-madison-the-nature-boardwalk.jpg\u002F2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\",\n            },\n        ],\n    }\n]\nresp = pne.chat(model=\"gpt-4o\", messages=messages)\nprint(resp)\n```\n\nUse the `provider\u002Fmodel_name` format to specify a model, and you can call any supported third-party model.\n\nFor more models, please visit the [litellm documentation](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders).\n\nYou can also see how to use `pne.chat()` in the [Getting Started\u002FOfficial Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fget_started\u002Fquick_start?id=quick-start).\n\n## 📗 Related Documentation\n\n- [Getting Started\u002FOfficial 
Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n- [Current Development Plan](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fplan)\n- [Contributing\u002FDeveloper's Manual](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fcontribution)\n- [Frequently Asked Questions](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Ffaq)\n- [PyPI Repository](https:\u002F\u002Fpypi.org\u002Fproject\u002Fpromptulate\u002F)\n\n## 📝 Examples\n\n- [Build streamlit chatbot by pne](use_cases\u002Fstreamlit-chatbot#build-a-simple-chatbot-using-streamlit-and-pne)\n- [Build gradio chatbot by pne](use_cases\u002Fgradio-chatbot#build-gradio-chatbot-by-pne)\n- [Build math application with agent](use_cases\u002Fbuild-math-application-with-agent.md#building-a-math-application-with-promptulate-agents)\n- [Groq, llama3, Streamlit to build an application](use_cases\u002Fstreamlit-groq-llama3.md#groq-llama3-streamlit-to-build-a-application)\n- [Build knowledge map with streamlit and pne](use_cases\u002Fllmapper.md#llmapper)\n- [Build a chatbot using pne+streamlit to chat with GitHub repo](use_cases\u002Fchat-to-github-repo.md#build-a-chatbot-using-pne-streamlit-to-chat-with-GitHub-repo)\n\n- [Build a math application with agent [Streamlit, ToolAgent, Hooks].](use_cases\u002Fbuild-math-application-with-agent.md)\n- [A Multimodal Robot Agent framework of ROS2 and Promptulate [Agent]](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002FAthena)\n- [Use streamlit and pne to compare different models in a playground. 
[Streamlit]](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpne-playground-model-comparison)\n- [gcop: Your git AI copilot, based on promptulate](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fgcop)\n\n## 🛠 Quick Start\n\n- Open the terminal and enter the following command to install the framework:\n\n```shell\npip install -U pne\n```\n\n> Note: Your Python version should be 3.8 or higher.\n\nEven though pne provides many modules, in 90% of LLM application development scenarios you only need the `pne.chat()` function. Start with chat to understand how pne is used; when you need additional modules, you can learn about their features and usage later.\n\n### Chat like OpenAI\n\nYou can use `pne.chat()` to chat just like the OpenAI SDK. OpenAI chat API reference: [https:\u002F\u002Fplatform.openai.com\u002Fdocs\u002Fapi-reference\u002Fchat](https:\u002F\u002Fplatform.openai.com\u002Fdocs\u002Fapi-reference\u002Fchat)\n\n```python\nimport promptulate as pne\n\nmessages = [\n    {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n    {\"role\": \"user\", \"content\": \"Who are you?\"},\n]\nresponse: str = pne.chat(messages=messages, model=\"gpt-4-turbo\")\nprint(response)\n```\n\n### Replace the OpenAI SDK\n\nMany third-party providers, such as [Deepseek](https:\u002F\u002Fwww.deepseek.com\u002F), expose their models through the OpenAI SDK. In pne, you can call these models directly with the `pne.chat()` function; there is no need for the OpenAI SDK, and pne provides enhanced features that reduce development difficulty. 
Use the `openai\u002Fxxx` provider prefix in the model name to route such calls through pne's OpenAI-compatible interface.\n\n```python\nimport os\nimport promptulate as pne\n\nos.environ[\"DEEPSEEK_API_KEY\"] = \"your api key\"\n\nmessages = [\n    {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n    {\"role\": \"user\", \"content\": \"How are you?\"},\n]\nresponse = pne.chat(\n    messages=messages,\n    model=\"openai\u002Fdeepseek-chat\",\n)\nprint(response)\n```\n\n### Structured Output\n\nRobust output formatting is fundamental to LLM application development: we want LLMs to return stable, predictable data. With pne, you can easily produce formatted output. In the following example, we use Pydantic's BaseModel to define the data structure to be returned.\n\n```python\nfrom typing import List\nimport promptulate as pne\nfrom pydantic import BaseModel, Field\n\nclass LLMResponse(BaseModel):\n    provinces: List[str] = Field(description=\"List of provinces' names\")\n\nresp: LLMResponse = pne.chat(\"Please tell me all provinces in China.\", output_schema=LLMResponse)\nprint(resp)\n```\n\n**Output:**\n\n```text\nprovinces=['Anhui', 'Fujian', 'Gansu', 'Guangdong', 'Guizhou', 'Hainan', 'Hebei', 'Heilongjiang', 'Henan', 'Hubei', 'Hunan', 'Jiangsu', 'Jiangxi', 'Jilin', 'Liaoning', 'Qinghai', 'Shaanxi', 'Shandong', 'Shanxi', 'Sichuan', 'Yunnan', 'Zhejiang', 'Taiwan', 'Guangxi', 'Nei Mongol', 'Ningxia', 'Xinjiang', 'Xizang', 'Beijing', 'Chongqing', 'Shanghai', 'Tianjin', 'Hong Kong', 'Macao']\n```\n\n### Agent with Plan, Tool-Using and Reflection\n\nAdditionally, influenced by the [Plan-and-Solve](https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.04091) paper, pne also allows developers to build Agents capable of dealing with complex problems through planning, reasoning, and action. 
The Agent's planning abilities can be activated using the `enable_plan` parameter.\n\n![plan-and-execute.png](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_afa9d73d98cf.png)\n\nIn this example, we use [Tavily](https:\u002F\u002Fapp.tavily.com\u002F) as the search engine, which is a powerful tool for searching information on the web. To use Tavily, you need to obtain an API key from Tavily.\n\n```python\nimport os\n\nos.environ[\"TAVILY_API_KEY\"] = \"your_tavily_api_key\"\nos.environ[\"OPENAI_API_KEY\"] = \"your_openai_api_key\"\n```\n\nIn this case, we are using the TavilySearchResults Tool wrapped by LangChain.\n\n```python\nfrom langchain_community.tools.tavily_search import TavilySearchResults\n\ntools = [TavilySearchResults(max_results=5)]\n```\n\n```python\nimport promptulate as pne\n\npne.chat(\"what is the hometown of the 2024 Australia open winner?\", model=\"gpt-4-1106-preview\", enable_plan=True)\n```\n\n**Output:**\n\n```text\n[Agent] Assistant Agent start...\n[User instruction] what is the hometown of the 2024 Australia open winner?\n[Plan] {\"goals\": [\"Find the hometown of the 2024 Australian Open winner\"], \"tasks\": [{\"task_id\": 1, \"description\": \"Identify the winner of the 2024 Australian Open.\"}, {\"task_id\": 2, \"description\": \"Research the identified winner to find their place of birth or hometown.\"}, {\"task_id\": 3, \"description\": \"Record the hometown of the 2024 Australian Open winner.\"}], \"next_task_id\": 1}\n[Agent] Tool Agent start...\n[User instruction] Identify the winner of the 2024 Australian Open.\n[Thought] Since the current date is March 26, 2024, and the Australian Open typically takes place in January, the event has likely concluded for the year. 
To identify the winner, I should use the Tavily search tool to find the most recent information on the 2024 Australian Open winner.\n[Action] tavily_search_results_json args: {'query': '2024 Australian Open winner'}\n[Observation] [{'url': 'https:\u002F\u002Fausopen.com\u002Farticles\u002Fnews\u002Fsinner-winner-italian-takes-first-major-ao-2024', 'content': 'The agile right-hander, who had claimed victory from a two-set deficit only once previously in his young career, is the second Italian man to achieve singles glory at a major, following Adriano Panatta in1976.With victories over Andrey Rublev, 10-time AO champion Novak Djokovic, and Medvedev, the Italian is the youngest player to defeat top 5 opponents in the final three matches of a major since Michael Stich did it at Wimbledon in 1991 – just weeks before Sinner was born.\\n He saved the only break he faced with an ace down the tee, and helped by scoreboard pressure, broke Medvedev by slamming a huge forehand to force an error from his more experienced rival, sealing the fourth set to take the final to a decider.\\n Sensing a shift in momentum as Medvedev served to close out the second at 5-3, Sinner set the RLA crowd alight with a pair of brilliant passing shots en route to creating a break point opportunity, which Medvedev snuffed out with trademark patience, drawing a forehand error from his opponent. 
“We are trying to get better every day, even during the tournament we try to get stronger, trying to understand every situation a little bit better, and I’m so glad to have you there supporting me, understanding me, which sometimes it’s not easy because I am a little bit young sometimes,” he said with a smile.\\n Medvedev, who held to love in his first three service games of the second set, piled pressure on the Italian, forcing the right-hander to produce his best tennis to save four break points in a nearly 12-minute second game.\\n'}, {'url': 'https:\u002F\u002Fwww.cbssports.com\u002Ftennis\u002Fnews\u002Faustralian-open-2024-jannik-sinner-claims-first-grand-slam-title-in-epic-comeback-win-over-daniil-medvedev\u002F', 'content': '\"\\nOur Latest Tennis Stories\\nSinner makes epic comeback to win Australian Open\\nSinner, Sabalenka win Australian Open singles titles\\n2024 Australian Open odds, Sinner vs. Medvedev picks\\nSabalenka defeats Zheng to win 2024 Australian Open\\n2024 Australian Open odds, Sabalenka vs. Zheng picks\\n2024 Australian Open odds, Medvedev vs. Zverev picks\\nAustralian Open odds: Djokovic vs. Sinner picks, bets\\nAustralian Open odds: Gauff vs. Sabalenka picks, bets\\nAustralian Open odds: Zheng vs. Yastremska picks, bets\\nNick Kyrgios reveals he\\'s contemplating retirement\\n© 2004-2024 CBS Interactive. Jannik Sinner claims first Grand Slam title in epic comeback win over Daniil Medvedev\\nSinner, 22, rallied back from a two-set deficit to become the third ever Italian Grand Slam men\\'s singles champion\\nAfter almost four hours, Jannik Sinner climbed back from a two-set deficit to win his first ever Grand Slam title with an epic 3-6, 3-6, 6-4, 6-4, 6-3 comeback victory against Daniil Medvedev. 
Sinner became the first Italian man to win the Australian Open since 1976, and just the eighth man to successfully come back from two sets down in a major final.\\n He did not drop a single set until his meeting with Djokovic, and that win in itself was an accomplishment as Djokovic was riding a 33-match winning streak at the Australian Open and had never lost a semifinal in Melbourne.\\n @janniksin • @wwos • @espn • @eurosport • @wowowtennis pic.twitter.com\u002FDTCIqWoUoR\\n\"We are trying to get better everyday, and even during the tournament, trying to get stronger and understand the situation a little bit better,\" Sinner said.'}, {'url': 'https:\u002F\u002Fwww.bbc.com\u002Fsport\u002Ftennis\u002F68120937', 'content': 'Live scores, results and order of play\\nAlerts: Get tennis news sent to your phone\\nRelated Topics\\nTop Stories\\nFA Cup: Blackburn Rovers v Wrexham - live text commentary\\nRussian skater Valieva given four-year ban for doping\\nLinks to Barcelona are \\'totally untrue\\' - Arteta\\nElsewhere on the BBC\\nThe truth behind the fake grooming scandal\\nFeaturing unseen police footage and interviews with the officers at the heart of the case\\nDid their father and uncle kill Nazi war criminals?\\n A real-life murder mystery following three brothers in their quest for the truth\\nWhat was it like to travel on the fastest plane?\\nTake a behind-the-scenes look at the supersonic story of the Concorde\\nToxic love, ruthless ambition and shocking betrayal\\nTell Me Lies follows a passionate college relationship with unimaginable consequences...\\n \"\\nMarathon man Medvedev runs out of steam\\nMedvedev is the first player to lose two Grand Slam finals after winning the opening two sets\\nSo many players with the experience of a Grand Slam final have talked about how different the occasion can be, particularly if it is the first time, and potentially overwhelming.\\n Jannik Sinner beats Daniil Medvedev in Melbourne final\\nJannik Sinner is the 
youngest player to win the Australian Open men\\'s title since Novak Djokovic in 2008\\nJannik Sinner landed the Grand Slam title he has long promised with an extraordinary fightback to beat Daniil Medvedev in the Australian Open final.\\n \"\\nSinner starts 2024 in inspired form\\nSinner won the first Australian Open men\\'s final since 2005 which did not feature Roger Federer, Rafael Nadal or Novak Djokovic\\nSinner was brought to the forefront of conversation when discussing Grand Slam champions in 2024 following a stunning end to last season.\\n'}]\n[Execute Result] {'thought': \"The search results have provided consistent information about the winner of the 2024 Australian Open. Jannik Sinner is mentioned as the winner in multiple sources, which confirms the answer to the user's question.\", 'action_name': 'finish', 'action_parameters': {'content': 'Jannik Sinner won the 2024 Australian Open.'}}\n[Execute] Execute End.\n[Revised Plan] {\"goals\": [\"Find the hometown of the 2024 Australian Open winner\"], \"tasks\": [{\"task_id\": 2, \"description\": \"Research Jannik Sinner to find his place of birth or hometown.\"}, {\"task_id\": 3, \"description\": \"Record the hometown of Jannik Sinner, the 2024 Australian Open winner.\"}], \"next_task_id\": 2}\n[Agent] Tool Agent start...\n[User instruction] Research Jannik Sinner to find his place of birth or hometown.\n[Thought] To find Jannik Sinner's place of birth or hometown, I should use the search tool to find the most recent and accurate information.\n[Action] tavily_search_results_json args: {'query': 'Jannik Sinner place of birth hometown'}\n[Observation] [{'url': 'https:\u002F\u002Fwww.sportskeeda.com\u002Ftennis\u002Fjannik-sinner-nationality', 'content': \"During the semifinal of the Cup, Sinner faced Djokovic for the third time in a row and became the first player to defeat him in a singles match. 
Jannik Sinner Nationality\\nJannik Sinner is an Italian national and was born in Innichen, a town located in the mainly German-speaking area of South Tyrol in northern Italy. A. Jannik Sinner won his maiden Masters 1000 title at the 2023 Canadian Open defeating Alex de Minaur in the straight sets of the final.\\n Apart from his glorious triumph at Melbourne Park in 2024, Jannik Sinner's best Grand Slam performance came at the 2023 Wimbledon, where he reached the semifinals. In 2020, Sinner became the youngest player since Novak Djokovic in 2006 to reach the quarter-finals of the French Open.\"}, {'url': 'https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FJannik_Sinner', 'content': \"At the 2023 Australian Open, Sinner lost in the 4th round to eventual runner-up Stefanos Tsitsipas in 5 sets.[87]\\nSinner then won his seventh title at the Open Sud de France in Montpellier, becoming the first player to win a tour-level title in the season without having dropped a single set and the first since countryman Lorenzo Musetti won the title in Naples in October 2022.[88]\\nAt the ABN AMRO Open he defeated top seed and world No. 
3 Stefanos Tsitsipas taking his revenge for the Australian Open loss, for his biggest win ever.[89] At the Cincinnati Masters, he lost in the third round to Félix Auger-Aliassime after being up a set, a break, and 2 match points.[76]\\nSeeded 11th at the US Open, he reached the fourth round after defeating Brandon Nakashima in four sets.[77] Next, he defeated Ilya Ivashka in a five set match lasting close to four hours to reach the quarterfinals for the first time at this Major.[78] At five hours and 26 minutes, it was the longest match of Sinner's career up until this point and the fifth-longest in the tournament history[100] as well as the second longest of the season after Andy Murray against Thanasi Kokkinakis at the Australian Open.[101]\\nHe reached back to back quarterfinals in Wimbledon after defeating Juan Manuel Cerundolo, Diego Schwartzman, Quentin Halys and Daniel Elahi Galan.[102] He then reached his first Major semifinal after defeating Roman Safiullin, before losing to Novak Djokovic in straight sets.[103] In the following round in the semifinals, he lost in straight sets to career rival and top seed Carlos Alcaraz who returned to world No. 1 following the tournament.[92] In Miami, he reached the quarterfinals of this tournament for a third straight year after defeating Grigor Dimitrov and Andrey Rublev, thus returning to the top 10 in the rankings at world No. 
In the final, he came from a two-set deficit to beat Daniil Medvedev to become the first Italian player, male or female, to win the Australian Open singles title, and the third man to win a Major (the second of which is in the Open Era), the first in 48 years.[8][122]\"}, {'url': 'https:\u002F\u002Fwww.thesportreview.com\u002Fbiography\u002Fjannik-sinner\u002F', 'content': '• Date of birth: 16 August 2001\\n• Age: 22 years old\\n• Place of birth: San Candido, Italy\\n• Nationality: Italian\\n• Height: 188cm \u002F 6ft 2ins\\n• Weight: 76kg \u002F 167lbs\\n• Plays: Right-handed\\n• Turned Pro: 2018\\n• Career Prize Money: US$ 4,896,338\\n• Instagram: @janniksin\\nThe impressive 22-year-old turned professional back in 2018 and soon made an impact on the tour, breaking into the top 100 in the world rankings for the first time in 2019.\\n Jannik Sinner (Photo: Dubai Duty Free Tennis Championships)\\nSinner ended the season as number 78 in the world, becoming the youngest player since Rafael Nadal in 2003 to end the year in the top 80.\\n The Italian then ended the 2019 season in style, qualifying for the 2019 Next Gen ATP Finals and going on to win the tournament with a win over Alex de Minaur in the final.\\n Sinner then reached the main draw of a grand slam for the first time at the 2019 US Open, when he came through qualifying to reach the first round, where he lost to Stan Wawrinka.\\n Asked to acknowledge some of the key figures in his development, Sinner replied: “I think first of all, my family who always helped me and gave me the confidence to actually change my life when I was 13-and-a-half, 14 years old.\\n'}]\n[Execute Result] {'thought': 'The search results have provided two different places of birth for Jannik Sinner: Innichen and San Candido. These are actually the same place, as San Candido is the Italian name and Innichen is the German name for the town. 
Since the user asked for the place of birth or hometown, I can now provide this information.', 'action_name': 'finish', 'action_parameters': {'content': 'Jannik Sinner was born in San Candido (Italian) \u002F Innichen (German), Italy.'}}\n[Execute] Execute End.\n[Revised Plan] {\"goals\": [\"Find the hometown of the 2024 Australian Open winner\"], \"tasks\": [], \"next_task_id\": null}\n[Agent Result] Jannik Sinner was born in San Candido (Italian) \u002F Innichen (German), Italy.\n[Agent] Agent End.\n```\n\n### Atomize the Agent structure\n\nIn the scenario of Agent development, we often need to split many atomic components of agents to achieve better customization. pne provides atomized Agent components, such as Planner. The following example shows the use of a separate Planner component for task planning.\n\n```python\nimport promptulate as pne\n\nmodel = pne.LLMFactory.build(\"gpt-4-turbo\")\nplanner = pne.Planner(model, system_prompt=\"You are a planner\")\nplans = planner.run(\"Plan a trip to Paris\")\nprint(plans)\n```\n\n**Output:**\n\n```text\n('goals', ['Plan a trip to Paris'])\n('tasks', [Task(task_id=1, description='Check passport validity', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=2, description='Determine travel dates', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=3, description='Research and book flights', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=4, description='Book accommodations', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=5, description='Plan itinerary for the trip', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=6, description='Investigate and purchase travel insurance', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=7, description='Set a budget for the trip', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=8, description='Pack luggage', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=9, description='Notify bank of international travel', status=\u003CTaskStatus.TODO: 'todo'>), 
Task(task_id=10, description='Check weather forecast and pack accordingly', status=\u003CTaskStatus.TODO: 'todo'>)])\n('next_task_id', 1)\n```\n\nFor more detailed information, please check the [Getting Started\u002FOfficial Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F).\n\n## 📚 Design Principles\n\nThe design principles of the pne framework include modularity, extensibility, interoperability, robustness, maintainability, security, efficiency, and usability.\n\n- Modularity refers to using modules as the basic unit, allowing for easy integration of new components, models, and tools.\n- Extensibility refers to the framework's ability to be extended and to scale to large amounts of data, complex tasks, and high concurrency.\n- Interoperability means the framework is compatible with various external systems, tools, and services and can achieve seamless integration and communication.\n- Robustness indicates the framework has strong error handling, fault tolerance, and recovery mechanisms to ensure reliable operation under various conditions.\n- Maintainability means the framework is structured so that it is easy to understand, modify, and extend, keeping long-term upkeep costs low.\n- Security implies the framework has implemented strict measures to protect against unauthorized access and malicious behavior.\n- Efficiency is about optimizing the framework's performance, resource usage, and response times to ensure a smooth and responsive user experience.\n- Usability means the framework uses user-friendly interfaces and clear documentation, making it easy to use and understand.\n\nFollowing these principles and applying the latest artificial intelligence technologies, `pne` aims to provide a powerful and flexible framework for creating automated agents.\n\n## 💌 Contact\n\nFor more information, please contact: [zeeland4work@gmail.com](mailto:zeeland4work@gmail.com)\n\nFor changelog updates and announcements, see the [Telegram channel](https:\u002F\u002Ft.me\u002Fzeeland0809)\n\n## ⭐ Contribution\n\nWe appreciate your interest in contributing to our open-source initiative. 
We have provided a [Developer's Guide](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fcontribution) outlining the steps to contribute to Promptulate. Please refer to this guide to ensure smooth collaboration and successful contributions. Additionally, you can view the [Current Development Plan](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fplan) to see the latest development progress 🤝🚀\n","\u003Cp align=\"center\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_e84b5053274f.png\" alt=\"promptulate\" style=\"border-radius: 15px;\"\u002F>\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n    \u003Ca target=\"_blank\" href=\"\">\n        \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Flicense\u002FUndertone0809\u002Fpromptulate.svg?style=flat-square\" \u002F>\n    \u003C\u002Fa>\n    \u003Ca target=\"_blank\" href=''>\n        \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Frelease\u002FUndertone0809\u002Fpromptulate\u002Fall.svg?style=flat-square\"\u002F>\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fpypi.org\u002Fproject\u002Fpromptulate\" target=\"_blank\">\n        \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fpyversions\u002Fpromptulate.svg?color=%2334D058\" alt=\"支持的 Python 版本\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Ft.me\u002Fzeeland0809\" target=\"_blank\">\n      \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FTelegram-加入聊天-2CA5E0?logo=telegram&logoColor=white\" alt=\"在 Telegram 上交流\">\n    \u003C\u002Fa>\n    \u003Ca target=\"_blank\" href=''>\n        \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_0af6a1a46a0b.png\"\u002F>\n    \u003C\u002Fa>\n\u003C\u002Fp>\n\n[English](\u002FREADME.md) [中文](\u002FREADME_zh.md)\n\n## 概述\n\n**Promptulate** 是由 **Cogit Lab** 打造的 AI Agent 
应用开发框架，它以 Python 式的开发范式，为开发者提供了一种极为简洁高效的构建 Agent 应用的方式。Promptulate 的核心理念是借鉴并整合开源社区的智慧，融合各类开发框架的优点，从而降低开发门槛、统一开发者共识。借助 Promptulate，你可以用最精简的代码操控 LLM、Agent、Tool、RAG 等组件——大多数任务只需几行代码即可轻松完成。🚀\n\n## 💡 特性\n\n- 🐍 Python 式代码风格：贴合 Python 开发者的习惯，提供 Python 风格的 SDK 调用方式，仅需一个 `pne.chat` 函数即可封装所有核心功能，让一切尽在掌握。\n- 🧠 模型兼容性：支持市面上几乎所有类型的大型模型，并可轻松自定义以满足特定需求。\n- 🕵️‍♂️ 多样化 Agent：提供 WebAgent、ToolAgent、CodeAgent 等多种 Agent 类型，具备规划、推理和执行能力，能够应对复杂问题。将 Planner 等组件原子化设计，简化开发流程。\n- 🔗 低成本集成：可无缝集成 LangChain 等不同框架中的工具，显著降低集成成本。\n- 🔨 函数即工具：可将任意 Python 函数直接转换为 Agent 可使用的工具，简化工具的创建与使用流程。\n- 🪝 生命周期与钩子：提供丰富的钩子和完善的生命周期管理机制，允许在 Agent、Tool 和 LLM 的各个阶段插入自定义代码。\n- 💻 终端集成：轻松集成应用终端，内置客户端支持，提供快速的提示词调试能力。\n- ⏱️ 提示词缓存：为 LLM 提示词提供缓存机制，减少重复工作，提升开发效率。\n- 🤖 强大的 OpenAI 封装：通过 pne，你无需再使用 openai sdk，核心功能均可由 `pne.chat` 替代，并且提供了增强特性以简化开发难度。\n- 🧰 Streamlit 组件集成：可快速进行原型开发，提供大量开箱即用的示例及可复用的 Streamlit 组件。\n\n下图展示了 `promptulate` 的核心架构：\n\n![promptulate-architecture](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_01f3209ca3b1.png)\n\nPromptulate 的核心理念在于，我们希望提供一种简单、Python 式且高效的 AI 应用构建方式，这意味着你无需花费大量时间去学习框架本身。我们期望通过 `pne.chat()` 完成大部分工作，让你仅需几行代码就能轻松搭建任何 AI 应用。\n\n> 下文中，“pne” 是 Promptulate 的昵称，其中 “p” 和 “e” 分别代表 Promptulate 的首尾字母，“n” 则代表数字 9，象征着 “p” 和 “e” 之间的九个字母。\n\n## 支持的基模型\n\nPromptulate 集成了 [litellm](https:\u002F\u002Fgithub.com\u002FBerriAI\u002Flitellm) 的能力，支持市面上几乎所有类型的大型模型，包括但不限于以下模型：\n\n| 提供商      | [完成](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002F#basic-usage) | [流式传输](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fcompletion\u002Fstream#streaming-responses)  | [异步完成](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fcompletion\u002Fstream#async-completion)  | [异步流式传输](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fcompletion\u002Fstream#async-streaming)  | [异步嵌入](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fembedding\u002Fsupported_embedding)  | [异步图像生成](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fimage_generation)  |\n| ------------- | ------------- | ------------- 
| ------------- | ------------- | ------------- | ------------- |\n| [openai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fopenai)  | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [azure](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fazure)  | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [aws - sagemaker](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Faws_sagemaker)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [aws - bedrock](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fbedrock)  | ✅ | ✅ | ✅ | ✅ |✅ |\n| [google - vertex_ai [Gemini]](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fvertex)  | ✅ | ✅ | ✅ | ✅ |\n| [google - palm](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fpalm)  | ✅ | ✅ | ✅ | ✅ |\n| [google AI Studio - gemini](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fgemini)  | ✅ |  | ✅ |  | |\n| [mistral ai api](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fmistral)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [cloudflare AI Workers](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fcloudflare_workers)  | ✅ | ✅ | ✅ | ✅ |\n| [cohere](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fcohere)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [anthropic](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fanthropic)  | ✅ | ✅ | ✅ | ✅ |\n| [huggingface](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fhuggingface)  | ✅ | ✅ | ✅ | ✅ | ✅ |\n| [replicate](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Freplicate)  | ✅ | ✅ | ✅ | ✅ |\n| [together_ai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Ftogetherai)  | ✅ | ✅ | ✅ | ✅ |\n| [openrouter](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fopenrouter)  | ✅ | ✅ | ✅ | ✅ |\n| [ai21](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fai21)  | ✅ | ✅ | ✅ | ✅ |\n| [baseten](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fbaseten)  | ✅ | ✅ | ✅ | ✅ |\n| 
[vllm](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fvllm)  | ✅ | ✅ | ✅ | ✅ |\n| [nlp_cloud](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fnlp_cloud)  | ✅ | ✅ | ✅ | ✅ |\n| [aleph alpha](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Faleph_alpha)  | ✅ | ✅ | ✅ | ✅ |\n| [petals](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fpetals)  | ✅ | ✅ | ✅ | ✅ |\n| [ollama](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Follama)  | ✅ | ✅ | ✅ | ✅ |\n| [deepinfra](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fdeepinfra)  | ✅ | ✅ | ✅ | ✅ |\n| [perplexity-ai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fperplexity)  | ✅ | ✅ | ✅ | ✅ |\n| [Groq AI](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fgroq)  | ✅ | ✅ | ✅ | ✅ |\n| [anyscale](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fanyscale)  | ✅ | ✅ | ✅ | ✅ |\n| [voyage ai](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fvoyage)  |  |  |  |  | ✅ |\n| [xinference [Xorbits Inference]](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders\u002Fxinference)  |  |  |  |  | ✅ |\n\npne 强大的模型支持使您能够轻松构建任何第三方模型调用。\n\n现在让我们看看如何使用 pne 运行 ollama 的本地 llama3 模型。\n\n```python\nimport promptulate as pne\n\nresp: str = pne.chat(model=\"ollama\u002Fllama3\", messages=[{\"content\": \"Hello, how are you?\", \"role\": \"user\"}])\nprint(resp)\n```\n\n🌟 2024年5月14日，OpenAI 推出了最新的“omni”模型，相比 turbo 版本在速度和定价上都有所提升。\n\n您可以在任何 promptulate 应用中使用其可用的多模态功能！\n\n```python\nimport promptulate as pne\n\nmessages=[\n    {\n        \"role\": \"user\",\n        \"content\": [\n            {\"type\": \"text\", \"text\": \"这张图片里有什么？\"},\n            {\n                \"type\": \"image_url\",\n                \"image_url\": 
\"https:\u002F\u002Fupload.wikimedia.org\u002Fwikipedia\u002Fcommons\u002Fthumb\u002Fd\u002Fdd\u002FGfp-wisconsin-madison-the-nature-boardwalk.jpg\u002F2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\",\n            },\n        ],\n    }\n]\nresp = pne.chat(model=\"gpt-4o\", messages=messages)\nprint(resp)\n```\n\n使用 `provider\u002Fmodel_name` 调用模型，您可以轻松构建任何第三方模型调用。\n\n如需了解更多模型，请访问 [litellm 文档](https:\u002F\u002Fdocs.litellm.ai\u002Fdocs\u002Fproviders)。\n\n您还可以在 [入门\u002F官方文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fget_started\u002Fquick_start?id=quick-start) 中查看如何使用 `pne.chat()`。\n\n## 📗 相关文档\n\n- [入门\u002F官方文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n- [当前开发计划](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fplan)\n- [贡献\u002F开发者手册](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fcontribution)\n- [常见问题解答](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Ffaq)\n- [PyPI 仓库](https:\u002F\u002Fpypi.org\u002Fproject\u002Fpromptulate\u002F)\n\n## 📝 示例\n\n- [使用 pne 构建 Streamlit 聊天机器人](use_cases\u002Fstreamlit-chatbot#build-a-simple-chatbot-using-streamlit-and-pne)\n- [使用 pne 构建 Gradio 聊天机器人](use_cases\u002Fgradio-chatbot#build-gradio-chatbot-by-pne)\n- [使用代理构建数学应用](use_cases\u002Fbuild-math-application-with-agent.md#building-a-math-application-with-promptulate-agents)\n- [Groq、llama3、Streamlit 构建应用](use_cases\u002Fstreamlit-groq-llama3.md#groq-llama3-streamlit-to-build-a-application)\n- [使用 Streamlit 和 pne 构建知识图谱](use_cases\u002Fllmapper.md#llmapper)\n- [使用 pne+Streamlit 构建与 GitHub 仓库聊天的机器人](use_cases\u002Fchat-to-github-repo.md#build-a-chatbot-using-pne-streamlit-to-chat-with-GitHub-repo)\n\n- [使用代理构建数学应用 [Streamlit、ToolAgent、Hooks]](use_cases\u002Fbuild-math-application-with-agent.md)\n- [ROS2 和 Promptulate [Agent] 的多模态机器人代理框架](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002FAthena)\n- [使用 
Streamlit 和 pne 比较不同模型的游乐场。[Streamlit]](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpne-playground-model-comparison)\n- [gcop：基于 promptulate 的 Git AI 助理](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fgcop)\n\n## 🛠 快速入门\n\n- 打开终端并输入以下命令以安装框架：\n\n```shell script\npip install -U pne\n```\n\n> 注意：您的 Python 版本应为 3.8 或更高。\n\n尽管 pne 提供了许多模块，但在 90% 的 LLM 应用开发场景中，您只需要使用 `pne.chat()` 函数。因此，建议先从聊天功能入手了解 pne 的用法，待需要其他模块时，再进一步学习其功能和使用方法。\n\n### 类似 OpenAI 的聊天\n\n你可以使用 `pne.chat()` 来进行类似 OpenAI 的聊天。OpenAI 聊天 API 文档：[https:\u002F\u002Fplatform.openai.com\u002Fdocs\u002Fapi-reference\u002Fchat](https:\u002F\u002Fplatform.openai.com\u002Fdocs\u002Fapi-reference\u002Fchat)\n\n```python\nimport promptulate as pne\n\nmessages = [\n    {\"role\": \"system\", \"content\": \"你是一位乐于助人的助手。\"},\n    {\"role\": \"user\", \"content\": \"你是谁？\"},\n]\nresponse: str = pne.chat(messages=messages, model=\"gpt-4-turbo\")\nprint(response)\n```\n\n### 替代 OpenAI SDK\n\n许多第三方库会通过 OpenAI SDK 调用其模型，例如 [Deepseek](https:\u002F\u002Fwww.deepseek.com\u002F)。在 pne 中，你可以直接使用 `pne.chat()` 函数来调用这些模型，无需再使用 OpenAI SDK，并且提供了增强的功能以简化开发难度。只需在模型名称中使用 `openai\u002Fxxx` 前缀，即可通过 OpenAI 兼容协议调用这些第三方模型。\n\n```python\nimport os\nimport promptulate as pne\n\nos.environ[\"DEEPSEEK_API_KEY\"] = \"your api key\"\n\nmessages = [\n    {\"role\": \"system\", \"content\": \"你是一位乐于助人的助手。\"},\n    {\"role\": \"user\", \"content\": \"你好吗？\"},\n]\nresponse = pne.chat(\n    messages=messages,\n    model=\"openai\u002Fdeepseek-chat\",\n)\nprint(response)\n```\n\n### 结构化输出\n\n强大的输出格式化是构建大模型应用的基础。我们希望大模型能够返回稳定的数据。借助 pne，你可以轻松实现格式化输出。在下面的例子中，我们使用 Pydantic 的 BaseModel 封装了需要返回的数据结构。\n\n```python\nfrom typing import List\nimport promptulate as pne\nfrom pydantic import BaseModel, Field\n\nclass LLMResponse(BaseModel):\n    provinces: List[str] = Field(description=\"省份名称列表\")\n\nresp: LLMResponse = pne.chat(\"请告诉我中国所有的省份。\", output_schema=LLMResponse)\nprint(resp)\n```\n\n**输出：**\n\n```text\nprovinces=['安徽', '福建', 
'甘肃', '广东', '贵州', '海南', '河北', '黑龙江', '河南', '湖北', '湖南', '江苏', '江西', '吉林', '辽宁', '青海', '陕西', '山东', '山西', '四川', '云南', '浙江', '台湾', '广西', '内蒙古', '宁夏', '新疆', '西藏', '北京', '重庆', '上海', '天津', '香港', '澳门']\n```\n\n### 具有规划、工具使用和反思能力的 Agent\n\n此外，受 [Plan-and-Solve](https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.04091) 论文启发，pne 还允许开发者构建能够通过规划、推理和行动来处理复杂问题的 Agent。可以通过 `enable_plan` 参数激活 Agent 的规划能力。\n\n![plan-and-execute.png](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_readme_afa9d73d98cf.png)\n\n在这个示例中，我们使用 [Tavily](https:\u002F\u002Fapp.tavily.com\u002F) 作为搜索引擎，它是一款功能强大的网络信息搜索工具。要使用 Tavily，你需要从 Tavily 获取一个 API 密钥。\n\n```python\nimport os\n\nos.environ[\"TAVILY_API_KEY\"] = \"your_tavily_api_key\"\nos.environ[\"OPENAI_API_KEY\"] = \"your_openai_api_key\"\n```\n\n在此案例中，我们使用 LangChain 封装的 TavilySearchResults 工具。\n\n```python\nfrom langchain_community.tools.tavily_search import TavilySearchResults\n\ntools = [TavilySearchResults(max_results=5)]\n```\n\n```python\nimport promptulate as pne\n\npne.chat(\"2024 年澳大利亚网球公开赛冠军的家乡是哪里？\", model=\"gpt-4-1106-preview\", enable_plan=True)\n```\n\n**输出：**\n\n```text\n[代理] 助理代理开始…\n[用户指令] 2024年澳大利亚网球公开赛的冠军家乡是哪里？\n[计划] {\"goals\": [\"查找2024年澳大利亚网球公开赛冠军的家乡\"], \"tasks\": [{\"task_id\": 1, \"description\": \"确定2024年澳大利亚网球公开赛的冠军是谁。\"}, {\"task_id\": 2, \"description\": \"研究已确定的冠军，找出其出生地或家乡。\"}, {\"task_id\": 3, \"description\": \"记录2024年澳大利亚网球公开赛冠军的家乡。\"}], \"next_task_id\": 1}\n[代理] 工具代理开始…\n[用户指令] 确定2024年澳大利亚网球公开赛的冠军是谁。\n[思考] 由于当前日期是2024年3月26日，而澳大利亚网球公开赛通常在1月份举行，因此该赛事很可能已经结束。为了确定冠军，我应该使用Tavily搜索工具来查找关于2024年澳大利亚网球公开赛冠军的最新信息。\n[行动] tavily_search_results_json 参数：{'query': '2024年澳大利亚网球公开赛冠军'}\n[观察] [{'url': 'https:\u002F\u002Fausopen.com\u002Farticles\u002Fnews\u002Fsinner-winner-italian-takes-first-major-ao-2024', 'content': 
'这位灵活的右手球员，在他年轻的职业生涯中此前仅一次从落后两盘的情况下逆转取胜。他是继1976年的阿德里亚诺·帕纳塔之后，第二位在大满贯赛事中夺得男单冠军的意大利选手。在击败安德烈·鲁布列夫、十届澳网冠军得主诺瓦克·德约科维奇以及丹尼尔·梅德韦杰夫之后，这位意大利选手成为自1991年温布尔登网球锦标赛上米夏埃尔·施蒂希以来，最年轻的在大满贯赛事最后三场比赛中击败世界排名前五选手的球员——而那时西内尔还未出生。\\n他在发球局中凭借一记ACE化解了唯一一次破发危机，并在比分压力的帮助下，通过一记强劲的正手击球迫使经验更丰富的对手出现失误，从而成功破发，将第四盘拿下，将比赛拖入决胜盘。\\n当梅德韦杰夫以5比3领先准备拿下第二盘时，西内尔抓住机会，连续打出两记精彩的穿越球，为意大利人创造了破发点，但梅德韦杰夫凭借一贯的耐心化解了这次危机，迫使对手出现正手失误。“我们每天都在努力变得更好，即使在比赛期间，我们也试图变得更强大，更好地理解每一种情况，我很高兴有你们在这里支持我、理解我，尽管有时候这并不容易，因为我确实还很年轻，”他微笑着说道。\\n梅德韦杰夫在第二盘的前三个发球局中都顺利保发，给意大利人施加了巨大压力，迫使西内尔发挥出最佳水平，在持续近12分钟的第二局中挽救了四个破发点。'}, {'url': 'https:\u002F\u002Fwww.cbssports.com\u002Ftennis\u002Fnews\u002Faustralian-open-2024-jannik-sinner-claims-first-grand-slam-title-in-epic-comeback-win-over-daniil-medvedev\u002F', 'content': '\"\\n我们的最新网球新闻\\n西内尔史诗般逆转夺冠，赢得澳大利亚网球公开赛\\n西内尔和萨巴伦卡分别夺得澳大利亚网球公开赛男女单打冠军\\n2024年澳大利亚网球公开赛赔率：西内尔对阵梅德韦杰夫预测\\n萨巴伦卡击败郑钦文，赢得2024年澳大利亚网球公开赛\\n2024年澳大利亚网球公开赛赔率：萨巴伦卡对阵郑钦文预测\\n2024年澳大利亚网球公开赛赔率：梅德韦杰夫对阵兹维列夫预测\\n澳大利亚网球公开赛赔率：德约科维奇对阵西内尔预测与投注\\n澳大利亚网球公开赛赔率：高芙对阵萨巴伦卡预测与投注\\n澳大利亚网球公开赛赔率：郑钦文对阵亚斯特雷姆斯卡预测与投注\\n尼克·基里奥斯透露自己正在考虑退役\\n© 2004-2024 CBS Interactive。贾尼奇·西内尔在一场史诗般的逆转胜利中首次夺得大满贯冠军，击败丹尼尔·梅德韦杰夫\\n现年22岁的西内尔从落后两盘的情况下奋起直追，成为历史上第三位赢得大满贯男子单打冠军的意大利选手\\n经过近四个小时的激战，贾尼奇·西内尔最终从落后两盘的局面中逆转，以3比6、3比6、6比4、6比4、6比3的比分战胜丹尼尔·梅德韦杰夫，首次夺得大满贯冠军。西内尔也成为自1976年以来首位赢得澳大利亚网球公开赛男子单打冠军的意大利选手，同时也是第八位在大满贯决赛中成功实现从落后两盘逆转的男子选手。\\n在此之前，他从未丢掉过任何一盘，直到与德约科维奇的对决。而这场胜利本身也是一项壮举，因为当时德约科维奇在澳大利亚网球公开赛上保持着33连胜的纪录，并且从未在墨尔本的大满贯半决赛中失利。\\n @janniksin • @wwos • @espn • @eurosport • @wowowtennis pic.twitter.com\u002FDTCIqWoUoR\\n“我们每天都在努力变好，甚至在比赛期间，也在不断强化自己，尝试更好地理解局势，”西内尔说道。'}, {'url': 'https:\u002F\u002Fwww.bbc.com\u002Fsport\u002Ftennis\u002F68120937', 'content': '实时比分、结果及赛程安排\\n提醒：将网球新闻发送到您的手机\\n相关主题\\n头条新闻\\n足总杯：布莱克本流浪者队对阵雷克瑟姆队——文字直播\\n俄罗斯滑冰运动员瓦利耶娃因兴奋剂违规被禁赛四年\\n与巴塞罗那的联系“完全不属实”——阿尔特塔\\nBBC其他内容\\n虚假诱骗丑闻背后的真相\\n包含未公开的警方录像以及案件核心警官的采访\\n他们的父亲和叔叔是否杀害了纳粹战犯？\\n一部基于真实事件的谋杀悬疑剧，讲述三兄弟追寻真相的故事\\n乘坐最快的飞机是什么体验？\\n带您走进协和式超音速客机背后的故事\\n有毒的爱情、冷酷的野心与令人震惊的背叛\\n《告诉我谎言》讲述了一段充满激情的大学恋情，却带来了难以想象的后果……\\n 
“\\n马拉松选手梅德韦杰夫体力耗尽\\n梅德韦杰夫是第一位在先赢两盘后最终输掉大满贯决赛的选手\\n许多拥有大满贯决赛经验的选手都曾表示，这种场合与其他比赛截然不同，尤其是第一次参加时，可能会让人感到不知所措。\\n贾尼奇·西内尔在墨尔本决赛中击败丹尼尔·梅德韦杰夫\\n贾尼奇·西内尔是自2008年诺瓦克·德约科维奇以来，最年轻的澳大利亚网球公开赛男子单打冠军\\n贾尼奇·西内尔在澳大利亚网球公开赛决赛中上演了一场惊人的逆转，击败丹尼尔·梅德韦杰夫，终于实现了他长期以来的承诺，夺得了大满贯冠军。\\n “\\n西内尔以出色的状态开启2024年\\n西内尔赢得了自2005年以来第一场没有罗杰·费德勒、拉斐尔·纳达尔或诺瓦克·德约科维奇参赛的澳大利亚网球公开赛男子单打决赛\\n在上赛季末的一次精彩表现之后，西内尔迅速成为2024年讨论大满贯冠军时的焦点人物。'}]\n[执行结果] {'thought': \"搜索结果一致表明2024年澳大利亚网球公开赛的冠军是贾尼奇·西内尔。这一信息证实了用户问题的答案。\", 'action_name': 'finish', 'action_parameters': {'content': '贾尼奇·西内尔赢得了2024年澳大利亚网球公开赛。'}}\n[执行] 执行结束。\n[修订后的计划] {\"goals\": [\"查找2024年澳大利亚网球公开赛冠军的家乡\"], \"tasks\": [{\"task_id\": 2, \"description\": \"研究贾尼奇·西内尔，找出其出生地或家乡。\"}, {\"task_id\": 3, \"description\": \"记录贾尼奇·西内尔的家乡，即2024年澳大利亚网球公开赛的冠军。\"}], \"next_task_id\": 2}\n[代理] 工具代理开始…\n[用户指令] 研究贾尼奇·西内尔，找出其出生地或家乡。\n[思考] 为了找到贾尼奇·西内尔的出生地或家乡，我应该使用搜索工具来获取最新且准确的信息。\n[行动] tavily_search_results_json 参数：{'query': '贾尼奇·西内尔 出生地 家乡'}\n[观察] [{'url': 'https:\u002F\u002Fwww.sportskeeda.com\u002Ftennis\u002Fjannik-sinner-nationality', 'content': '在半决赛中，西内尔连续第三次对阵德约科维奇，并成为首位在单打比赛中击败他的选手。贾尼奇·西内尔的国籍\\n贾尼奇·西内尔是意大利公民，出生于伊尼肯，这是位于意大利北部南蒂罗尔地区的一个主要讲德语的小镇。A. 
贾尼奇·西内尔在2023年加拿大公开赛上夺得个人首个大师赛1000级别冠军，在决赛中直落两盘击败了亚历克斯·德米瑙尔。\\n除了2024年在墨尔本公园取得的辉煌胜利外，贾尼奇·西内尔在大满贯赛事中的最佳表现出现在2023年的温布尔登网球锦标赛上，当时他打入了半决赛。2020年，西内尔成为自2006年诺瓦克·德约科维奇以来最年轻的进入法国网球公开赛四分之一决赛的选手。'}, {'url': 'https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FJannik_Sinner', 'content': '在2023年澳大利亚网球公开赛上，西内尔在第四轮以五盘大战负于最终亚军斯特凡诺斯·齐齐帕斯。[87]\\n随后，西内尔在蒙彼利埃的法国南部公开赛上夺得第七个冠军头衔，成为本赛季首位在巡回赛级别赛事中保持全胜并夺冠的选手，也是自2022年10月同胞洛伦佐·穆塞蒂在那不勒斯夺冠以来的第一人。[88]\\n在ABN AMRO公开赛上，他击败了头号种子、世界排名第3的斯特凡诺斯·齐齐帕斯，报了澳大利亚公开赛失利的一箭之仇，取得了职业生涯中最大的一场胜利。[89]\\n在辛辛那提大师赛上，他曾在领先一盘、一次破发并握有两个赛点的情况下，最终以三盘告负，败给了费利克斯·奥热-阿利亚西姆。[76]\\n在美国公开赛上，西内尔以第11号种子身份出战，经过四盘激战击败了布兰登·纳卡希马，晋级第四轮。[77]\\n随后，他又经历了一场接近四小时的五盘大战，击败了伊利亚·伊瓦什卡，首次闯入这项大满贯赛事的四分之一决赛。[78]\\n这场比赛历时5小时26分钟，是西内尔职业生涯中最长的比赛，也是该赛事历史上第五长的比赛，同时还是本赛季第二长的比赛，仅次于安迪·穆雷在澳大利亚网球公开赛上对阵塔纳西·科基纳基斯的比赛。[100]\\n在温布尔登网球锦标赛上，他连续击败了胡安·曼努埃尔·塞伦多洛、迭戈·施瓦茨曼、昆汀·哈利斯和丹尼尔·埃拉希·加兰，再次闯入四分之一决赛。[102]\\n随后，他首次进入大满贯半决赛，击败了罗曼·萨菲乌林，但在接下来的半决赛中以直落三盘负于诺瓦克·德约科维奇。[103]\\n在下一轮半决赛中，他再次以直落三盘负于职业劲敌兼头号种子卡洛斯·阿尔卡拉兹，后者在比赛结束后重新夺回了世界第一的宝座。[92]\\n在迈阿密公开赛上，他连续第三年进入该项赛事的四分之一决赛，先后击败了格里戈尔·季米特罗夫和安德烈·鲁布列夫，从而重返世界排名前十的位置。在决赛中，他从落后两盘的情况下奋起直追，击败了丹尼尔·梅德韦杰夫，成为首位赢得澳大利亚网球公开赛男子单打冠军的意大利选手，也是第三位赢得大满贯男子单打冠军的选手（其中第二位是在公开赛时代），更是48年来首位获此殊荣的意大利人。[8][122]'}, {'url': 'https:\u002F\u002Fwww.thesportreview.com\u002Fbiography\u002Fjannik-sinner\u002F', 'content': '• 出生日期：2001年8月16日\\n• 年龄：22岁\\n• 出生地：圣坎迪多，意大利\\n• 国籍：意大利\\n• 身高：188厘米 \u002F 6英尺2英寸\\n• 体重：76公斤 \u002F 167磅\\n• 持拍方式：右手\\n• 转职业时间：2018年\\n• 职业生涯奖金：4,896,338美元\\n• Instagram：@janniksin\\n这位才华横溢的22岁年轻人于2018年转为职业选手，很快便在巡回赛中崭露头角，于2019年首次进入世界排名前100位。\\n 贾尼奇·西内尔（照片：迪拜免税店网球锦标赛）\\n赛季结束时，西内尔的世界排名升至第78位，成为自2003年拉斐尔·纳达尔以来最年轻的年度排名进入前80位的选手。\\n随后，意大利人在2019年以优异的表现收官，不仅获得了2019年新一代ATP总决赛的参赛资格，还在决赛中击败了亚历克斯·德米瑙尔，成功夺冠。\\n西内尔随后首次进入大满贯赛事的正赛阶段，那是2019年美国公开赛，当时他通过资格赛晋级首轮，但在首轮比赛中输给了斯坦·瓦林卡。\\n当被问及对其成长起到关键作用的人物时，西内尔回答道：“首先，我要感谢我的家人，他们一直支持我，并在我13岁半、14岁时给了我改变人生的信心。'}, {'url': 
'https:\u002F\u002Fwww.tennisworldusa.com\u002Ftennis-news\u002Fjannik-sinner-becomes-the-youngest-player-to-defeat-top-5-opponents-in-a-major-final-since-michael-stich-in-1991\u002F', 'content': '贾尼奇·西内尔成为自1991年米夏埃尔·施蒂希以来最年轻的在大满贯决赛中击败世界排名前五选手的球员\\n贾尼奇·西内尔在2024年澳大利亚网球公开赛决赛中击败了丹尼尔·梅德韦杰夫，成为历史上第三位赢得大满贯男子单打冠军的意大利选手\\n贾尼奇·西内尔在2024年澳大利亚网球公开赛决赛中上演了一场史诗般的逆转，从落后两盘的情况下奋起直追，最终以3比6、3比6、6比4、6比4、6比3的比分战胜了丹尼尔·梅德韦杰夫，夺得了个人首个大满贯冠军。这个胜利使西内尔成为自1976年阿德里亚诺·帕纳塔以来首位赢得澳大利亚网球公开赛男子单打冠军的意大利选手，也是历史上第三位赢得大满贯男子单打冠军的意大利人。此外，西内尔还创造了另一项历史性的成就：他成为自1991年米夏埃尔·施蒂希以来最年轻的在大满贯决赛中击败世界排名前五选手的球员。'}]\n[执行结果] {'thought': '搜索结果给出了贾尼奇·西内尔的两个出生地名称：伊尼肯（Innichen）和圣坎迪多（San Candido）。二者实际上是同一个地方：San Candido 是该镇的意大利语名称，Innichen 是其德语名称。既然用户询问的是出生地或家乡，我现在可以给出这一信息。', 'action_name': 'finish', 'action_parameters': {'content': '贾尼奇·西内尔出生于意大利的圣坎迪多（San Candido，德语名 Innichen）。'}}\n[执行] 执行结束。\n[修订后的计划] {\"goals\": [\"查找2024年澳大利亚网球公开赛冠军的家乡\"], \"tasks\": [], \"next_task_id\": null}\n[代理结果] 贾尼奇·西内尔出生于意大利的圣坎迪多（San Candido，德语名 Innichen）。\n[代理] 代理结束。\n```\n\n### 将 Agent 结构原子化\n\n在 Agent 开发的场景中，我们通常需要拆分 Agent 的许多原子化组件，以实现更好的定制化。pne 提供了原子化的 Agent 组件，例如 Planner。以下示例展示了如何使用独立的 Planner 组件进行任务规划。\n\n```python\nimport promptulate as pne\n\nmodel = pne.LLMFactory.build(\"gpt-4-turbo\")\nplanner = 
pne.Planner(model, system_prompt=\"You are a planner\")\nplans = planner.run(\"Plan a trip to Paris\")\nprint(plans)\n```\n\n**输出：**\n\n```text\n('goals', ['Plan a trip to Paris'])\n('tasks', [Task(task_id=1, description='Check passport validity', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=2, description='Determine travel dates', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=3, description='Research and book flights', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=4, description='Book accommodations', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=5, description='Plan itinerary for the trip', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=6, description='Investigate and purchase travel insurance', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=7, description='Set a budget for the trip', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=8, description='Pack luggage', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=9, description='Notify bank of international travel', status=\u003CTaskStatus.TODO: 'todo'>), Task(task_id=10, description='Check weather forecast and pack accordingly', status=\u003CTaskStatus.TODO: 'todo'>)])\n('next_task_id', 1)\n```\n\n如需了解更多详细信息，请查看 [入门\u002F官方文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)。\n\n## 📚 设计原则\n\npne 框架的设计原则包括模块化、可扩展性、互操作性、鲁棒性、可维护性、安全性、效率和易用性。\n\n- **模块化**：以模块为基本单位，便于集成新的组件、模型和工具。\n- **可扩展性**：框架能够处理大量数据、复杂任务以及高并发场景。\n- **互操作性**：框架与各种外部系统、工具和服务兼容，能够实现无缝集成与通信。\n- **鲁棒性**：框架具备强大的错误处理、容错和恢复机制，确保在各种条件下可靠运行。\n- **可维护性**：框架结构清晰、易于理解与修改，降低长期维护成本。\n- **安全性**：框架实施了严格的措施，以防止未经授权的访问和恶意行为。\n- **效率**：优化框架的性能、资源使用和响应时间，以提供流畅、高效的用户体验。\n- **易用性**：框架采用用户友好的界面和清晰的文档，易于使用和理解。\n\n遵循这些原则并结合最新的人工智能技术，`pne` 致力于提供一个强大而灵活的框架，用于构建自动化 Agent。\n\n## 💌 联系方式\n\n如需更多信息，请联系：[zeeland4work@gmail.com](mailto:zeeland4work@gmail.com)\n\n有关变更日志等信息，请参阅 [Telegram 频道](https:\u002F\u002Ft.me\u002Fzeeland0809)。\n\n## ⭐ 贡献\n\n我们非常感谢您对我们的开源项目感兴趣并愿意贡献力量。我们已提供一份 
[开发者指南](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fcontribution)，详细介绍了如何参与 Promptulate 的开发。请参考该指南，以确保协作顺畅、贡献成功。此外，您还可以查看 [当前开发计划](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002Fother\u002Fplan)，了解最新的开发进展 🤝🚀","# Promptulate 快速上手指南\n\nPromptulate 是由 Cogit Lab 打造的 AI Agent 应用开发框架。它采用 Pythonic 风格，旨在通过极简的代码（核心仅需 `pne.chat`）帮助开发者快速构建包含 LLM、Agent、Tool 和 RAG 等组件的 AI 应用。\n\n## 环境准备\n\n- **操作系统**：Windows \u002F macOS \u002F Linux\n- **Python 版本**：3.8 或更高版本\n- **前置依赖**：无需特殊系统依赖，确保已安装 pip 包管理工具\n\n## 安装步骤\n\n打开终端，执行以下命令安装最新版本的 Promptulate：\n\n```shell script\npip install -U pne\n```\n\n> **提示**：国内用户若下载缓慢，可指定清华镜像源加速安装：\n> ```shell script\n> pip install -U pne -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n> ```\n\n## 基本使用\n\nPromptulate 的核心设计理念是“少即是多”。在 90% 的开发场景中，你只需要使用 `pne.chat()` 函数即可完成大模型调用、多模态交互及第三方模型接入。\n\n### 1. 基础对话（兼容 OpenAI 格式）\n\n直接使用 `pne.chat()` 进行对话，支持标准的 messages 列表格式：\n\n```python\nimport promptulate as pne\n\nmessages = [\n    {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n    {\"role\": \"user\", \"content\": \"Who are you?\"},\n]\n\n# 默认调用 OpenAI 模型，也可替换为其他支持的模型\nresponse: str = pne.chat(messages=messages, model=\"gpt-4-turbo\")\nprint(response)\n```\n\n### 2. 
调用第三方或本地模型\n\nPromptulate 集成了 LiteLLM 能力，支持通过 `provider\u002Fmodel_name` 格式调用几乎所有主流大模型（如 Deepseek、Ollama 本地模型、Groq 等），无需切换 SDK。\n\n**示例：调用本地 Ollama 模型 (Llama3)**\n\n```python\nimport promptulate as pne\n\nresp: str = pne.chat(\n    model=\"ollama\u002Fllama3\", \n    messages=[{\"content\": \"Hello, how are you?\", \"role\": \"user\"}]\n)\nprint(resp)\n```\n\n**示例：调用 Deepseek 模型**\n\n```python\nimport os\nimport promptulate as pne\n\n# 设置 API Key\nos.environ[\"DEEPSEEK_API_KEY\"] = \"your_api_key_here\"\n\nmessages = [\n    {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n    {\"role\": \"user\", \"content\": \"How are you?\"},\n]\n\n# 使用 openai\u002F 前缀兼容调用\nresponse = pne.chat(model=\"openai\u002Fdeepseek-chat\", messages=messages)\nprint(response)\n```\n\n### 3. 多模态交互\n\n直接传入图片 URL 即可使用支持多模态的模型（如 GPT-4o）：\n\n```python\nimport promptulate as pne\n\nmessages=[\n    {\n        \"role\": \"user\",\n        \"content\": [\n            {\"type\": \"text\", \"text\": \"What's in this image?\"},\n            {\n                \"type\": \"image_url\",\n                \"image_url\": \"https:\u002F\u002Fupload.wikimedia.org\u002Fwikipedia\u002Fcommons\u002Fthumb\u002Fd\u002Fdd\u002FGfp-wisconsin-madison-the-nature-boardwalk.jpg\u002F2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\",\n            },\n        ],\n    }\n]\n\nresp = pne.chat(model=\"gpt-4o\", messages=messages)\nprint(resp)\n```\n\n---\n*更多高级功能（如 Agent 编排、工具函数注册、Hooks 生命周期管理等）请参考官方文档。*","某电商公司的数据分析师需要构建一个能自动查询库存、分析销售趋势并生成日报的智能助手，以替代繁琐的人工统计流程。\n\n### 没有 promptulate 时\n- **开发门槛高**：需分别学习不同大模型的 SDK 和复杂的 Agent 框架，编写大量样板代码才能打通模型调用链路。\n- **工具集成难**：将内部的库存查询 API 转化为 Agent 可调用的工具时，需手动封装参数解析与错误处理逻辑，耗时且易出错。\n- **调试效率低**：缺乏原生的终端交互与钩子机制，难以实时观察 Agent 的思考规划过程，排查问题如同“黑盒摸象”。\n- **维护成本大**：代码结构臃肿，一旦更换底层大模型或增加新功能，往往需要重构大量核心逻辑。\n\n### 使用 promptulate 后\n- **极简开发体验**：利用 `pne.chat` 这一 Pythonic 接口，仅需几行代码即可封装完整的对话逻辑，快速拉起应用原型。\n- **函数即工具**：直接将现有的 Python 库存查询函数注册为 Agent 工具，promptulate 自动完成参数映射，无需额外适配层。\n- 
**全链路可观测**：借助内置的生命周期 Hooks 和终端集成，开发者可清晰监控 Agent 的每一步推理与行动，调试效率倍增。\n- **灵活扩展架构**：依托其松耦合设计，无缝切换不同大模型或叠加 RAG 组件，业务迭代无需推倒重来。\n\npromptulate 通过高度抽象的 Pythonic 范式，让开发者从繁琐的框架细节中解放，专注于业务逻辑本身，真正实现“几行代码构建智能体”。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FUndertone0809_promptulate_e84b5053.png","Undertone0809","Zeeland","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002FUndertone0809_0cc9d1e0.png","Creator of Spark Lab, Promptulate. I design SDKs, infra and software for human and the NEW WORLD.","Foundria","Mars","zeeland4work@gmail.com","kfhedRk3lXofRIB",null,"https:\u002F\u002Fgithub.com\u002FUndertone0809",[84,88,92],{"name":85,"color":86,"percentage":87},"Python","#3572A5",79.7,{"name":89,"color":90,"percentage":91},"Jupyter Notebook","#DA5B0B",19.4,{"name":93,"color":94,"percentage":95},"Makefile","#427819",0.9,585,41,"2026-04-05T17:44:15","Apache-2.0",1,"未说明",{"notes":103,"python":104,"dependencies":105},"该工具是一个基于 Python 的 AI Agent 开发框架，主要通过集成 litellm 库来支持多种大模型。安装命令为'pip install -U pne'（包名为 pne）。README 中未提及具体的操作系统、GPU 或内存硬件要求，因为其核心功能侧重于 API 调用和逻辑编排，而非本地模型训练或推理（除非用户自行通过 ollama 等后端在本地运行模型）。","3.8+",[106],"litellm",[35,14,13],[109,110,111,112,113,114,115,116,64,117,118,119],"chatgpt","gpt-4","langchain","llm","python","prompt","prompt-engineering","pne","agent","autogen","language-agent","2026-03-27T02:49:30.150509","2026-04-08T13:00:25.160037",[123,128,133,138,143,148,153,158],{"id":124,"question_zh":125,"answer_zh":126,"source_url":127},24362,"遇到'Failed to get data'错误，但 API Key 和网络代理均正常，如何解决？","该问题可能与环境部署位置有关。有用户反馈即使在美国服务器无 VPN 环境下也会遇到此问题。建议检查代码中 Conversation 和 OpenAI 初始化的具体参数配置，确保模型名称（如 gpt-3.5-turbo）和参数（temperature, top_p 等）正确。如果问题依旧，请提供具体的报错堆栈信息以便进一步排查。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F5",{"id":129,"question_zh":130,"answer_zh":131,"source_url":132},24363,"如何在启用输出模式（output_schema）的同时使用流式传输（stream）？","目前直接设置 output_schema 和 stream=True 会导致错误。解决方案是构建一个 JSON 解析器来解析流式的 JSON 数据。维护者参考了阿里的一种解决方案，建议使用 
AST（抽象语法树）解析器来处理流式返回的 JSON 片段，从而实现在流式模式下获取结构化数据。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F625",{"id":134,"question_zh":135,"answer_zh":136,"source_url":137},24364,"是否有统一的方法初始化不同的大语言模型（LLM），而不需要学习每个模型的特定类？","是的，可以使用 `LLMFactory` 动态初始化不同类型的 LLM。例如：\n```python\nimport promptulate as pne\nmodel_config = {\"temperature\": 0.0}\nllm = pne.LLMFactory(model_name=\"gpt-4-1106-preview\", model_config=model_config)\nllm(\"How is everything going?\")\n```\n对于特定平台（如百度千帆），可以指定 platform 参数：\n```python\nLLMFactory.build(platform=\"qianfan\", model_name=\"llama\")\n```\n这种方式屏蔽了底层不同模型（如 ChatOpenAI, ErnieBot 等）的实现差异，简化了开发流程。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F322",{"id":139,"question_zh":140,"answer_zh":141,"source_url":142},24365,"如何更好地支持 Azure OpenAI 自定义 LLM？","不建议直接仿造原有的 `openai.py` 实现来编写 Azure 支持，因为原有代码存在较多硬编码，对 Azure 支持不友好。维护者建议直接使用官方的 openai 库来编写自定义 LLM 逻辑，或者参考官方文档中的 Custom LLM 部分进行实现，以获得更好的兼容性和灵活性。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F282",{"id":144,"question_zh":145,"answer_zh":146,"source_url":147},24366,"框架是否支持国产大模型（如文心一言、智谱 AI 等）？","框架计划并已经扩展了对国产大模型的支持。目前已包括百度文心系列（文心一言、文心 turbo 及最新的 8k 模型）、清华智谱 AI 以及阿里云的相关模型。为了提升性能和停止机制的表现，部分模型（如文心）已集成专用 SDK 替代原有的请求机制。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F22",{"id":149,"question_zh":150,"answer_zh":151,"source_url":152},24367,"如何使用 Promptulate 构建类似 Claude Artifacts 的功能（如在 Streamlit 中生成并运行小游戏）？","可以通过结合 `pne artifact` 能力和 Streamlit 来实现。基本思路是：用户输入指令（如“做一个贪吃蛇游戏”），ToolAgent 生成对应的 HTML\u002FJS\u002FCSS 代码，将其保存到本地文件，然后在 Streamlit 中加载并渲染该文件以展示游戏。这实现了从自然语言指令到可交互前端应用的完整闭环。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F800",{"id":154,"question_zh":155,"answer_zh":156,"source_url":157},24368,"在 IoT Agent 中使用 LLMFactory 构建 Azure OpenAI 模型时程序崩溃怎么办？","当在 ToolAgent 中通过 `pne.LLMFactory` 传入 Azure OpenAI 实例时，可能会因响应格式解析错误（如缺少 'analysis' 
字段）导致崩溃。这通常是因为 Azure 的返回结构与默认解析逻辑不完全匹配。建议检查 `_parse_llm_response` 方法中的数据处理逻辑，确保能兼容 Azure 的返回格式，或暂时使用默认的 OpenAI 配置直到修复完成。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F730",{"id":159,"question_zh":160,"answer_zh":161,"source_url":162},24369,"如何在 IoT 场景下自定义工具来控制数字信号？","虽然可以直接添加数字信号控制工具，但维护者建议优先通过 Notebook 示例来展示最佳实践。你可以创建一个 Jupyter Notebook，详细演示如何在 IoT 系统中构建自定义工具（Custom Tool），包括如何定义工具函数、注册到 Agent 以及如何通过提示词触发设备控制。这种方式比单纯增加代码更利于理解和复用。","https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fissues\u002F242",[164,169,174,179,184,189,194,199,204,209,213,218,223,228,233,238,243,248,253,258],{"id":165,"version":166,"summary_zh":167,"released_at":168},153911,"v1.18.4","👋 大家好！欢迎阅读 Promptulate v1.18.4 的发布说明。我们很高兴为大家带来一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n想要查看代码、报告问题或参与贡献，请前往我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 添加微信聊天机器人用例 (#933) @Luffy2004-c\n* 文档：更新 README.md (#912) @eltociear\n* 修复：部署站点时的错误 (#901) @Undertone0809\n* 功能：重构文档 (#900) @Undertone0809\n* Hizeros\u002Ffeat 添加 mem0 教程 (#898) @Hizeros\n\n## :arrow_up: 依赖项更新\n\n* :arrow_up: 将 openai 从 1.43.1 升级至 1.50.2 (#958) @dependabot\n\n## :busts_in_silhouette: 贡献者名单\n\n@Hizeros、@Luffy2004-c、@Undertone0809、@dependabot、@dependabot[bot] 和 @eltociear","2024-10-08T23:34:32",{"id":170,"version":171,"summary_zh":172,"released_at":173},153912,"v1.18.3","👋 大家好！欢迎阅读 Promptulate v1.18.3 的发布说明。我们很高兴为大家带来一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n想要查看代码、提交问题或参与贡献，请前往我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* feat: 添加用于 Streamlit 侧边栏的新测试文件（#896）@Undertone0809\n* Hizeros\u002Fmem0（#895）@Hizeros\n* refactor: 移除 
enable_log 配置项（#894）@Undertone0809\n* refactor: 优化 Streamlit 的使用（#893）@Undertone0809\n\n## :busts_in_silhouette: 贡献者名单\n\n@Hizeros 和 @Undertone0809","2024-09-06T15:42:40",{"id":175,"version":176,"summary_zh":177,"released_at":178},153913,"v1.18.2","👋 大家好！欢迎阅读 Promptulate v1.18.2 的发布说明。我们很高兴为大家带来一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n想要查看代码、提交问题或参与贡献，请前往我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 添加 e2b 代码解释器 (#877) @Undertone0809\n* 文档：更新 README 文件 (#866) @Undertone0809\n* 文档：更新 README 文件 (#849) @Undertone0809\n* 文档：添加 Vue 2 支持 (#848) @Undertone0809\n* 文档：更新流式输出的使用方法 (#842) @Undertone0809\n\n## :busts_in_silhouette: 贡献者名单\n\n@Undertone0809\n","2024-08-23T20:04:35",{"id":180,"version":181,"summary_zh":182,"released_at":183},153914,"v1.18.1","👋 大家好！欢迎阅读 Promptulate v1.18.1 的发布说明。我们很高兴为大家带来一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n若想查看代码、报告问题或参与贡献，请访问我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 功能：在测试版中新增 Streamlit 侧边栏组件 (#841) @Undertone0809\n* 优化：优化 Streamlit 聊天机器人 (#840) @Undertone0809\n\n## :busts_in_silhouette: 贡献者名单\n\n@Undertone0809\n","2024-08-05T14:28:14",{"id":185,"version":186,"summary_zh":187,"released_at":188},153915,"v1.18.0","👋 大家好！欢迎阅读 Promptulate v1.18.0 的发布说明。我们非常高兴地向大家介绍一些新功能和改进，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n想要查看源代码、提交问题或参与贡献？请访问我们的 GitHub 仓库：[Promptulate on 
GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n![image](https:\u002F\u002Fgithub.com\u002Fuser-attachments\u002Fassets\u002Fb42190e8-8a66-4226-9a37-e3d096c3126d)\n\n![image](https:\u002F\u002Fgithub.com\u002Fuser-attachments\u002Fassets\u002Fb574f64f-1b5c-4ecf-b60d-fca5805dbc88)\n\n\n## 变更内容\n\n* 添加 Stream JSON 解析器 (#828) @longsihua2003\n* 新特性：在 GitHub 仓库中使用 pne.chat() + Streamlit 实现聊天用例 (#829) @Hizeros\n* 文档：新增将聊天功能集成到 GitHub 仓库的用例教程文档 (#830) @Hizeros\n* 文档：更新用例教程中的 Streamlit 聊天机器人示例图片 (#831) @Hizeros\n* 配置优化：优化 llmapper 的使用方式 (#808) @Undertone0809\n* 文档：更新 AIChat 内存使用说明 (#807) @Undertone0809\n* 文档：新增 llmapper 的用例教程文档 (#805) @Hizeros\n* 新特性：利用 Streamlit + pne + networkx 绘制知识图谱的用例 (#804) @Hizeros\n\n## :busts_in_silhouette: 贡献者名单\n\n@Hizeros、@Undertone0809 和 @longsihua2003","2024-08-04T18:28:32",{"id":190,"version":191,"summary_zh":192,"released_at":193},153916,"v1.17.0","👋 大家好！欢迎阅读 Promptulate v1.17.0 的发布说明。我们非常高兴地向大家介绍一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n想要查看代码、报告问题或参与贡献，请访问我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 功能：AIChat 添加记忆功能 (#803) @Undertone0809\n* Gradio 简单应用 (#797) @Luffy2004-c\n* 样式：优化 pip 安装指令格式 (#729) @Hizeros\n* 文档：为 Streamlit 使用场景添加侧边栏导航 (#726) @Hizeros\n* 样式：将 litellm 版本修改为 1.39.6 (#727) @Hizeros\n* 文档：新增 Streamlit + Groq + Llama3 的使用案例文档 (#713) @Hizeros\n* 文档：添加 Streamlit 应用的文档 (#712) @Hizeros\n* Hizeros\u002FStreamlit (#695) @Hizeros\n* 功能：新增 Streamlit + Groq + Llama3 搭建简单聊天机器人使用案例 (#709) @Hizeros\n* 功能：新增 Gradio + pne.chat 简单聊天应用 (#710) @Luffy2004-c\n* 文档：优化模型使用文档 (#694) @Undertone0809\n\n## :arrow_up: 依赖项更新\n\n* :arrow_up: 将 tenacity 从 8.3.0 升级到 8.5.0 (#787) @dependabot\n* :arrow_up: 将 pytest 从 7.4.4 升级到 8.2.2 (#737) @dependabot\n\n## :busts_in_silhouette: 贡献者名单\n\n@Hizeros、@Luffy2004-c、@Undertone0809、@dependabot 和 
@dependabot[bot]","2024-07-20T18:58:01",{"id":195,"version":196,"summary_zh":197,"released_at":198},153917,"v1.16.7","👋 大家好！欢迎阅读 Promptulate v1.16.7 的发布说明。我们很高兴为大家带来一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n若想查看代码、报告问题或参与贡献，请前往我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n## :wrench: 修复与重构\n\n* pref：优化使用 `define_tool` 定义工具时的性能 (#692) @Undertone0809\n\n## :busts_in_silhouette: 贡献者名单\n\n@Undertone0809\n","2024-05-26T06:13:49",{"id":200,"version":201,"summary_zh":202,"released_at":203},153918,"v1.16.6","👋 大家好！欢迎阅读 Promptulate v1.16.6 的发布说明。我们非常高兴地向大家介绍一些新更新，相信这些更新将进一步提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n若想查看代码、提交问题或参与贡献，请访问我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 性能优化：添加工具类型声明 (#690) @Undertone0809\n\n## :busts_in_silhouette: 贡献者名单\n\n@Undertone0809\n","2024-05-25T10:56:34",{"id":205,"version":206,"summary_zh":207,"released_at":208},153919,"v1.16.5","👋 大家好！欢迎阅读 Promptulate v1.16.5 的发布说明。我们非常高兴地向您介绍一些新功能和改进，相信它们将提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n如果您想查看源代码、报告问题或参与贡献，请访问我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 修复：流式传输中的错误 (#665) @Hizeros\n\n## :busts_in_silhouette: 贡献者名单\n\n@Hizeros 和 @Undertone0809\n","2024-05-18T12:31:23",{"id":210,"version":211,"summary_zh":81,"released_at":212},153920,"v1.16.4","2024-05-17T11:50:06",{"id":214,"version":215,"summary_zh":216,"released_at":217},153921,"v1.16.3","👋 大家好！欢迎阅读 Promptulate v1.16.3 的发布说明。我们非常高兴地向大家介绍一些新更新，相信这些更新将提升您使用 Promptulate 的体验。\n\n如需了解详细信息，请访问我们的官方文档：[Promptulate 
文档](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\n\n若想查看代码、报告问题或参与贡献，请访问我们的 GitHub 仓库：[Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\n\n## 变更内容\n\n* 修复：修改 ChatOpenAI 的用法，改用 LLMFactory (#663) @Hizeros\n\n## :busts_in_silhouette: 贡献者名单\n\n@Hizeros 和 @Undertone0809\n","2024-05-17T09:16:56",{"id":219,"version":220,"summary_zh":221,"released_at":222},153922,"v1.16.2","👋 Hello and welcome to the release notes for Promptulate v1.16.2! We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* feat: add planner (#658) @Undertone0809\r\n* test: update test_tool.py (#664) @eltociear\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Undertone0809 and @eltociear\r\n","2024-05-15T08:11:25",{"id":224,"version":225,"summary_zh":226,"released_at":227},153923,"v1.16.1","👋 Hello and welcome to the release notes for Promptulate v1.16.1! 
We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* docs: optimize hook docs (#649) @Undertone0809\r\n* Hizeros\u002Fhookdocs (#629) @Hizeros\r\n\r\n## :arrow_up: Dependencies updates\r\n\r\n* :arrow_up: Bump langchain-core from 0.1.33 to 0.1.46 (#618) @dependabot\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Hizeros, @Undertone0809 and @dependabot\r\n","2024-05-12T20:34:38",{"id":229,"version":230,"summary_zh":231,"released_at":232},153924,"v1.16.0","👋 Hello and welcome to the release notes for Promptulate v1.16.0! We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* feat: optimize llm factory and pne.chat (#627) @Undertone0809\r\n* test: add an example about building math application with agent (#566) @Hizeros\r\n\r\n## :wrench: Fixes & Refactoring\r\n\r\n* fix:Repair error report of AttributeError: 'StreamIterator' object ha… (#610) @Hizeros\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Hizeros and @Undertone0809\r\n","2024-04-30T08:33:39",{"id":234,"version":235,"summary_zh":236,"released_at":237},153925,"v1.15.3","👋 Hello and welcome to the release notes for Promptulate 
v1.15.3! We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* No changes\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Undertone0809\r\n","2024-04-23T13:49:26",{"id":239,"version":240,"summary_zh":241,"released_at":242},153926,"v1.15.2","👋 Hello and welcome to the release notes for Promptulate v1.15.2! We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* fix: ddg depc error in webagent (#595) @Undertone0809\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Undertone0809\r\n","2024-04-21T17:49:16",{"id":244,"version":245,"summary_zh":246,"released_at":247},153927,"v1.15.1","👋 Hello and welcome to the release notes for Promptulate v1.15.1! 
We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* fix: error when init toolkit in tool manager (#593) @Undertone0809\r\n\r\n## :arrow_up: Dependencies updates\r\n\r\n* :arrow_up: Bump openai from 1.14.2 to 1.16.2 (#574) @dependabot\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Undertone0809, @dependabot and @dependabot[bot]\r\n","2024-04-18T14:42:07",{"id":249,"version":250,"summary_zh":251,"released_at":252},153928,"v1.15.0","👋 Hello and welcome to the release notes for Promptulate v1.15.0! We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## Update\r\n\r\n1. All functionality is now available through `pne.chat()`, making LLM Agent applications easy to build.\r\n\r\n2. We added AssistantAgent, a powerful agent with the capabilities to plan, execute, and reflect. This is heavily inspired by the [Plan-and-Solve](https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.04091) paper as well as the [Baby-AGI](https:\u002F\u002Fgithub.com\u002Fyoheinakajima\u002Fbabyagi) project. 
Now you can see how to use AssistantAgent  in `example\u002Fagent\u002Fassistant_agent_usage.ipynb`\r\n\r\n![image](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate\u002Fassets\u002F72488598\u002F423d1ad1-a422-4b5a-99cf-32ed7c63cc74)\r\n\r\n\r\n## What’s Changed\r\n\r\n## :rocket: Features\r\n\r\n* feat: optimize pne chat and add AIChat (#552) @Undertone0809\r\n* feat:add llm factory (#502) @ruanrongman\r\n* feat: add uacp and AssistantAgent implementation (#530) @Undertone0809\r\n\r\n## :pencil: Documentation\r\n\r\n* feat: optimize pne chat and add AIChat (#552) @Undertone0809\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Undertone0809 and @ruanrongman\r\n","2024-03-30T20:08:46",{"id":254,"version":255,"summary_zh":256,"released_at":257},153929,"v1.14.0","👋 Hello and welcome to the release notes for Promptulate v1.14.0! We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n* add file_operator tools (#466) @longsihua2003\r\n* pref: optimize langchain tool usage (#500) @Undertone0809\r\n* ci: reduce the fre of dependabot (#498) @Undertone0809\r\n\r\n## :rocket: Features\r\n\r\n* feat: add wikipedia tool (#503) @Hizeros\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Hizeros, @Undertone0809 and @longsihua2003\r\n","2024-03-11T10:48:00",{"id":259,"version":260,"summary_zh":261,"released_at":262},153930,"v1.13.2","👋 Hello and welcome to the release notes for Promptulate v1.13.2! 
We're excited to share some new updates that we believe will enhance your experience with Promptulate.\r\n\r\nFor detailed information, please visit our official documentation: [Promptulate Documentation](https:\u002F\u002Fundertone0809.github.io\u002Fpromptulate\u002F#\u002F)\r\n\r\nTo see the code, report issues, or contribute, check out our GitHub repository: [Promptulate on GitHub](https:\u002F\u002Fgithub.com\u002FUndertone0809\u002Fpromptulate)\r\n\r\n## What’s Changed\r\n\r\n## :rocket: Features\r\n\r\n* pref: compatible with pydantic v2 (#483) @Undertone0809\r\n* pref: optimize BaseTool and add unit test for ToolManager (#480) @Undertone0809\r\n\r\n## :arrow_up: Dependencies updates\r\n\r\n* pref: compatible with pydantic v2 (#483) @Undertone0809\r\n* :arrow_up: Bump httpx from 0.26.0 to 0.27.0 (#456) @dependabot\r\n* :arrow_up: Bump litellm from 1.23.5 to 1.23.14 (#433) @dependabot\r\n\r\n## :busts_in_silhouette: List of contributors\r\n\r\n@Undertone0809, @dependabot and @dependabot[bot]\r\n","2024-02-28T15:43:59"]