[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-lancedb--vectordb-recipes":3,"tool-lancedb--vectordb-recipes":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",150037,2,"2026-04-10T23:33:47",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",108322,"2026-04-10T11:39:34",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},6121,"gemini-cli","google-gemini\u002Fgemini-cli","gemini-cli 是一款由谷歌推出的开源 AI 命令行工具，它将强大的 Gemini 大模型能力直接集成到用户的终端环境中。对于习惯在命令行工作的开发者而言，它提供了一条从输入提示词到获取模型响应的最短路径，无需切换窗口即可享受智能辅助。\n\n这款工具主要解决了开发过程中频繁上下文切换的痛点，让用户能在熟悉的终端界面内直接完成代码理解、生成、调试以及自动化运维任务。无论是查询大型代码库、根据草图生成应用，还是执行复杂的 Git 操作，gemini-cli 都能通过自然语言指令高效处理。\n\n它特别适合广大软件工程师、DevOps 人员及技术研究人员使用。其核心亮点包括支持高达 100 万 token 的超长上下文窗口，具备出色的逻辑推理能力；内置 Google 搜索、文件操作及 Shell 命令执行等实用工具；更独特的是，它支持 MCP（模型上下文协议），允许用户灵活扩展自定义集成，连接如图像生成等外部能力。此外，个人谷歌账号即可享受免费的额度支持，且项目基于 Apache 2.0 协议完全开源，是提升终端工作效率的理想助手。",100752,"2026-04-10T01:20:03",[52,13,15,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":32,"last_commit_at":59,"category_tags":60,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":67,"readme_en":68,"readme_zh":69,"quickstart_zh":70,"use_case_zh":71,"hero_image_url":72,"owner_login":73,"owner_name":74,"owner_avatar_url":75,"owner_bio":76,"owner_company":77,"owner_location":77,"owner_email":78,"owner_twitter":73,"owner_website":79,"owner_url":80,"languages":81,"stars":104,"forks":105,"last_commit_at":106,"license":107,"difficulty_score":108,"env_os":109,"env_gpu":109,"env_ram":109,"env_deps":110,"category_tags":116,"github_topics":118,"view_count":32,"oss_zip_url":77,"oss_zip_packed_at":77,"status":17,"created_at":135,"updated_at":136,"faqs":137,"releases":163},4967,"lancedb\u002Fvectordb-recipes","vectordb-recipes","Resource, examples & tutorials for multimodal AI, RAG and agents using vector search and LLMs","vectordb-recipes 是一个专为生成式 AI 开发者打造的开源资源库，汇集了丰富的示例代码、实战应用和教程，旨在帮助用户快速构建基于向量搜索和大语言模型（LLM）的多模态 AI、RAG（检索增强生成）及智能体应用。\n\n面对 GenAI 项目起步难、环境配置复杂等痛点，vectordb-recipes 提供了一条从创意到原型验证的捷径。其核心优势在于深度集成了 LanceDB——一款无需繁琐设置、免费且无服务器的开源向量数据库。这使得开发者能够直接在现有的 Python 数据生态（如 Pandas、Pydantic）中无缝使用，甚至通过原生 TypeScript SDK 在无服务器函数中运行向量搜索，极大降低了技术门槛。\n\n资源库内容结构清晰，涵盖“从零构建”、“多模态处理”、\"RAG 实战”、“智能体协作”及“推荐系统”等多个板块。无论是希望快速上手的新手开发者，还是寻求高效解决方案的研究人员，都能在这里找到对应的交互式笔记（Notebook）和完整应用案例。例如，用户可以直接复用代码来搭建本地 RAG 系统、开发金融领域 AI 智能体，或探索最新的视频搜索技术。vecto","vectordb-recipes 是一个专为生成式 AI 开发者打造的开源资源库，汇集了丰富的示例代码、实战应用和教程，旨在帮助用户快速构建基于向量搜索和大语言模型（LLM）的多模态 AI、RAG（检索增强生成）及智能体应用。\n\n面对 GenAI 项目起步难、环境配置复杂等痛点，vectordb-recipes 提供了一条从创意到原型验证的捷径。其核心优势在于深度集成了 LanceDB——一款无需繁琐设置、免费且无服务器的开源向量数据库。这使得开发者能够直接在现有的 Python 数据生态（如 Pandas、Pydantic）中无缝使用，甚至通过原生 TypeScript SDK 在无服务器函数中运行向量搜索，极大降低了技术门槛。\n\n资源库内容结构清晰，涵盖“从零构建”、“多模态处理”、\"RAG 实战”、“智能体协作”及“推荐系统”等多个板块。无论是希望快速上手的新手开发者，还是寻求高效解决方案的研究人员，都能在这里找到对应的交互式笔记（Notebook）和完整应用案例。例如，用户可以直接复用代码来搭建本地 RAG 系统、开发金融领域 AI 智能体，或探索最新的视频搜索技术。vectordb-recipes 致力于让 AI 应用开发变得更加简单、高效，是进入生成式 AI 领域的理想起点。","# VectorDB-recipes\n\u003Cbr \u002F>\nDive into building GenAI applications!\nThis repository contains examples, applications, starter code, & tutorials to help you kickstart your GenAI projects.\n\n- These are built using LanceDB, a free, open-source, serverless vectorDB that **requires no setup**. \n- It **integrates into Python data ecosystem** so you can simply start using these in your existing data pipelines in pandas, arrow, pydantic etc.\n- LanceDB has **native Typescript SDK** using which you can **run vector search** in serverless functions!\n\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_5fa302fe41e0.png\" height=\"85%\" width=\"85%\" \u002F>\n\n\u003Cbr \u002F>\nJoin our community for support - \u003Ca href=\"https:\u002F\u002Fdiscord.gg\u002FzMM32dvNtd\">Discord\u003C\u002Fa> •\n\u003Ca href=\"https:\u002F\u002Ftwitter.com\u002Flancedb\">Twitter\u003C\u002Fa>\n\n---\n\nThis repository is divided into 2 sections:\n- [Examples](#examples) - Get right into the code with minimal introduction, aimed at getting you from an idea to PoC within minutes!\n- [Applications](#projects--applications) - Ready to use Python and web apps using applied LLMs, VectorDB and GenAI tools\n\n\nThe following examples are organized into different tables to make similar types of examples easily accessible.\n\n### Sections\n\n- [Build from Scratch](#build-from-scratch) - Step-by-step guides to create AI applications from scratch.\n- [Multimodal](#multimodal) - Build apps that process and search across both text and images.\n- [RAG](#rag) - Combine document retrieval with LLM-powered responses.\n- [Vector Search](#vector-search) - Learn to efficiently find relevant documents using vector-based search.\n- [Chatbot](#chatbot) - Create AI chatbots that fetch information and generate intelligent replies.\n- [Evaluation](#evaluation) - Measure the quality and accuracy of AI-generated answers.\n- [AI Agents](#ai-agents) - Build LLM-driven applications where multiple agents collaborate and interact.\n- [Recommender Systems](#recommender-systems) - Develop AI-powered recommendation systems for personalized suggestions.\n- [Concepts](#concepts) - Tutorials and explanations of key techniques used in AI applications.\n\n\n### 🌟 New 🌟 \nStay up to date with the latest projects, tools, and improvements added to the repository.\n- **V-JEPA Video Search** - \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fv-jepa-video-search\u002Fintra-video.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>\n\n### Build from Scratch\n\nStart with the basics! These examples guide you through creating AI applications from the ground up using LanceDB for efficient document retrieval and search.\n\n| Build from Scratch &nbsp; &nbsp;| Interactive Notebook & Scripts &nbsp; | \n|-------- | -------------: |\n|||\n| [Build RAG from Scratch](.\u002Ftutorials\u002FRAG-from-Scratch) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FRAG-from-Scratch\u002FRAG_from_Scratch.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Local RAG from Scratch with Llama3](.\u002Ftutorials\u002FLocal-RAG-from-Scratch) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Ftutorials\u002FLocal-RAG-from-Scratch\u002Frag.py) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Multi-Head RAG from Scratch](.\u002Ftutorials\u002FMulti-Head-RAG-from-Scratch\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Ftutorials\u002FMulti-Head-RAG-from-Scratch\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Fintech AI Agent from Scratch](.\u002Fexamples\u002Ffintech-ai-agent) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Ffintech-ai-agent\u002Ffintech-ai-agent.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)     [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Ffintech-ai-agent\u002F) |\n||||\n\n### MultiModal\n\nSearch across different types of data (text, images, and more). Build powerful search applications that work with diverse inputs.\n\n| Multimodal &nbsp; &nbsp;| Interactive Notebook & Scripts &nbsp; | Blog |\n| --------- | -------------------------- | ----------- |\n||||\n| [V-JEPA Video Search](.\u002Fexamples\u002Fv-jepa-video-search\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fv-jepa-video-search\u002Fintra-video.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> | |\n| [Multimodal CLIP: DiffusionDB](.\u002Fexamples\u002Fmultimodal_clip_diffusiondb\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmultimodal_clip_diffusiondb\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmultimodal_clip_diffusiondb\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fmulti-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939\u002F)|\n| [Multimodal CLIP: Youtube videos](.\u002Fexamples\u002Fmultimodal_video_search\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmultimodal_video_search\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmultimodal_video_search\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fmulti-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939\u002F)|\n| [Cambrian-1: Vision centric exploration of images](https:\u002F\u002Fwww.kaggle.com\u002Fcode\u002Fprasantdixit\u002Fcambrian-1-vision-centric-exploration-of-images\u002F) | [![Kaggle](https:\u002F\u002Fkaggle.com\u002Fstatic\u002Fimages\u002Fopen-in-kaggle.svg)](https:\u002F\u002Fwww.kaggle.com\u002Fcode\u002Fprasantdixit\u002Fcambrian-1-vision-centric-exploration-of-images\u002F) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)   [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fcambrian-1-vision-centric-exploration\u002F)|\n| [Multimodal Jina CLIP-V2 : Food Search ](.\u002Fexamples\u002Fmultimodal_jina_clipv2\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmultimodal_jina_clipv2\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmultimodal_jina_clipv2)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [Multimodal vector search: Voyage AI X LanceDB](.\u002Fexamples\u002Fvoyagexlancedb\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fvoyagexlancedb\u002FVoyage_x_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)||\n||||\n\n### RAG\n\nGenerated Responses by retrieving relevant documents before answering. This section covers different approaches to implementing RAG in your projects.\n\n| RAG &nbsp; &nbsp;| Interactive Notebook & Scripts | Blog |\n| --------- | -------------------------- | ----------- |\n||||\n| [RAG using Deepseek R1 vs OpenAI o1](.\u002Fexamples\u002FDeepseek_R1_VS_GPT_4o) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FDeepseek_R1_VS_GPT_4o\u002FREADME.md)  [![Analysis](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAnalysis-FF3333)](#) |\n| [RAG On PDF](.\u002Fexamples\u002FRAG-On-PDF\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRAG-On-PDF\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [RAG with Contextual Retrieval and Hybrid search](.\u002Fexamples\u002FContextual-RAG\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FContextual-RAG\u002FAnthropic_Contextual_RAG.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fguide-to-use-contextual-retrieval-and-prompt-caching-with-lancedb\u002F) |\n| [RAG with Matryoshka Embeddings and LlamaIndex](.\u002Ftutorials\u002FRAG-with_MatryoshkaEmbed-Llamaindex\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FRAG-with_MatryoshkaEmbed-Llamaindex\u002FRAG_with_MatryoshkaEmbedding_and_Llamaindex.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [RAG with IBM Watsonx](.\u002Fexamples\u002FRAG-with-watsonx\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRAG-with-watsonx\u002FWatsonx_example.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![watsonx LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwatsonx-api-lightblue)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)||\n| [Cognee RAG](.\u002Fexamples\u002Fcognee-RAG\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fcognee-RAG\u002Fcognee_demo.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> ||\n| [Improve RAG with Re-ranking](.\u002Fexamples\u002FRAG_Reranking\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRAG_Reranking\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fsimplest-method-to-improve-rag-pipeline-re-ranking-cf6eaec6d544)|\n[Improve RAG with HyDE](.\u002Fexamples\u002FAdvance-RAG-with-HyDE\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvance-RAG-with-HyDE\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fadvanced-rag-precise-zero-shot-dense-retrieval-with-hyde-0946c54dfdcb)|\n| [Improve RAG with LOTR ](.\u002Fexamples\u002FAdvance_RAG_LOTR\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvance_RAG_LOTR\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbetter-rag-with-lotr-lord-of-retriever-23c8336b9a35)|\n| [Advanced RAG: Context Enrichment Window](.\u002Fexamples\u002FAdvanced_RAG_Context_Enrichment_Window\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvanced_RAG_Context_Enrichment_Window\u002FAdvanced_RAG_Context_Enrichment_Window.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fhttps:\u002F\u002Fblog.lancedb.com\u002Fadvanced-rag-context-enrichment-window\u002F)|\n| [Advanced RAG: Late Chunking](.\u002Fexamples\u002FAdvanced_RAG_Late_Chunking\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvanced_RAG_Late_Chunking\u002FLate_Chunking_(Chunked_Pooling).ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Flate-chunking-aka-chunked-pooling-2\u002F)|\n| [Corrective RAG with Langgraph](.\u002Ftutorials\u002FCorrective-RAG-with_Langgraph\u002F) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FCorrective-RAG-with_Langgraph\u002FCRAG_with_Langgraph.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fimplementing-corrective-rag-in-the-easiest-way-2\u002F)|\n| [Contextual-Compression-with-RAG](.\u002Fexamples\u002FContextual-Compression-with-RAG\u002F) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FContextual-Compression-with-RAG\u002Fmain.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)   [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fenhance-rag-integrate-contextual-compression-and-filtering-for-precision-a29d4a810301\u002F) |\n| [Improve RAG with FLARE](.\u002Fexamples\u002Fbetter-rag-FLAIR) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fbetter-rag-FLAIR\u002Fmain.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbetter-rag-with-active-retrieval-augmented-generation-flare-3b66646e2a9f\u002F) |\n| [Agentic RAG ](.\u002Ftutorials\u002FAgentic_RAG\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FAgentic_RAG\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|\n| [GraphRAG ](.\u002Fexamples\u002FGraphrag\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FGraphrag\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fgraphrag-hierarchical-approach-to-retrieval-augmented-generation\u002F)|\n| [GraphRAG with CSV File ](.\u002Ftutorials\u002FGraphRAG_CSV\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FGraphRAG_CSV\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Faksdesai1998.medium.com\u002Foptimizing-graphrag-with-microsoft-for-csv-data-a-guide-with-lancedb-8e4150b93e37)|\n| [GraphRAG with cognee - Multimedia ](.\u002Ftutorials\u002FGraphRAG_with_cognee\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FGraphRAG_with_cognee\u002Fcognee_multimedia_demo.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n||||\n\n### Vector Search\n\nFind relevant documents quickly! These projects show how to use vector-based search techniques to make AI-powered searches faster and smarter.\n\n| Vector Search &nbsp; &nbsp;| Interactive Notebook & Scripts &nbsp; | Blog |\n| --------- | -------------------------- | ----------- |\n||||\n| [Inbuilt Hybrid Search](.\u002Fexamples\u002FInbuilt-Hybrid-Search) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FInbuilt-Hybrid-Search\u002FInbuilt_Hybrid_Search_with_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)||\n| [Hybrid search BM25 & lancedb ](.\u002Fexamples\u002FHybrid_search_bm25_lancedb\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FHybrid_search_bm25_lancedb\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#) |[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fhybrid-search-combining-bm25-and-semantic-search-for-better-results-with-lan-1358038fe7e6)|\n| [NER powered Semantic Search](.\u002Ftutorials\u002FNER-powered-Semantic-Search) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FNER-powered-Semantic-Search\u002FNER_powered_Semantic_Search_with_LanceDB.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fner-powered-semantic-search-using-lancedb-51051dc3e493) |\n| [Vector Arithmetic with LanceDB](.\u002Fexamples\u002FVector-Arithmetic-with-LanceDB\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FVector-Arithmetic-with-LanceDB\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#) |[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fvector-arithmetic-with-lancedb-an-intro-to-vector-embeddings\u002F)|\n| [Summarize and Search Reddit Posts](.\u002Fexamples\u002FReddit-summarization-and-search\u002F) | \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FReddit-summarization-and-search\u002Fsubreddit_summarization_querying.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [Imagebind demo app](.\u002Fexamples\u002Fimagebind_demo\u002F) | \u003Ca href=\"https:\u002F\u002Fhuggingface.co\u002Fspaces\u002Fraghavd99\u002Fimagebind2\">\u003Cimg src=\"https:\u002F\u002Fhuggingface.co\u002Fdatasets\u002Fhuggingface\u002Fbrand-assets\u002Fresolve\u002Fmain\u002Fhf-logo-with-title.svg\" alt=\"hf spaces\" style=\"width: 80px; vertical-align: middle; background-color: white;\">\u003C\u002Fa>  [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [Search Within Images](.\u002Fexamples\u002Fsearch-within-images-with-sam-and-clip\u002F) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fsearch-within-images-with-sam-and-clip\u002Fmain.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)   [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fsearch-within-an-image-331b54e4285e)|\n| [Zero Shot Object Detection with CLIP](.\u002Fexamples\u002Fzero-shot-object-detection-CLIP\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fzero-shot-object-detection-CLIP\u002Fzero_shot_object_detection_clip.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [Vector Search with TransformersJS](.\u002Fexamples\u002Fjs-transformers\u002F) |[![JS](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002Fjs-transformers\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|  |\n| [Geospatial Recommendation System](.\u002Fexamples\u002FGeospatial-Recommendation-System\u002F) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FGeospatial-Recommendation-System\u002Fgeospatial-recommendation.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [Accelerate Vector Search Applications Using OpenVINO](.\u002Fexamples\u002FAccelerate-Vector-Search-Applications-Using-OpenVINO\u002F) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAccelerate-Vector-Search-Applications-Using-OpenVINO\u002Fclip_text_image_search.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Faccelerate-vector-search-applications-using-openvino-lancedb\u002F)|\n||||\n\n### Chatbot\n\nCreate chatbots that understand user queries and fetch relevant responses using LanceDB’s vector search capabilities.\n\n| Chatbot &nbsp; &nbsp;| Interactive Notebook & Scripts &nbsp; | Blog &nbsp;|\n| --------- | -------------------------- | ----------- |\n||||\n| [Databricks DBRX Website Bot](.\u002Fexamples\u002Fdatabricks_DBRX_website_bot\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fdatabricks_DBRX_website_bot\u002Fmain.py) [![Databricks LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fdatabricks-api-red)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [CLI-based SDK Manual Chatbot with Phidata](.\u002Fexamples\u002FCLI-SDK-Manual-Chatbot-Locally\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FCLI-SDK-Manual-Chatbot-Locally\u002Fassistant.py) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [Youtube transcript search bot](.\u002Fexamples\u002FYoutube-Search-QA-Bot\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FYoutube-Search-QA-Bot\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FYoutube-Search-QA-Bot\u002Fmain.py) [![JS](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002FYoutube-Search-QA-Bot\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [Langchain: Code Docs QA bot](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Fmain.py) [![JS](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [Chatbot with any website using Crawl4AI ](.\u002Fexamples\u002FCrawlerQ&A_website\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FCrawlerQ&A_website\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [Context-Aware Chatbot using Llama 2 & LanceDB](.\u002Ftutorials\u002Fchatbot_using_Llama2_&_lanceDB) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Fchatbot_using_Llama2_&_lanceDB\u002Fmain.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fcontext-aware-chatbot-using-llama-2-lancedb-as-vector-database-4d771d95c755) |\n||||\n\n\n### Evaluation\n\nThese projects provide tools to compare AI-generated responses against reference data and fine-tune accuracy.\n\n| Evaluation &nbsp; &nbsp;| Interactive Notebook & Scripts &nbsp; | Blog |\n| --------- | -------------------------- | ----------- |\n||||\n| [Monitoring and Tracing RAG using HoneyHive](.\u002Fexamples\u002FHoneyHive_x_LanceDB\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FHoneyHive_x_LanceDB\u002FHoneyHive_x_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Evaluating RAG with RAGAs](.\u002Fexamples\u002FEvaluating_RAG_with_RAGAs\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FEvaluating_RAG_with_RAGAs\u002FEvaluating_RAG_with_RAGAs.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|  |\n||||\n\n### AI Agents\n\nBuild applications where multiple AI agents interact to complete tasks efficiently. These projects show how agents can collaborate, exchange data, and automate workflows.\n\n| AI Agents &nbsp; &nbsp;| Interactive Notebook & Scripts &nbsp; | Blog |\n| --------- | -------------------------- | ----------- |\n||||\n| [Trip Planner Swarm style Agent ](.\u002Fexamples\u002FTrip_planner_swarm_style_agent\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FTrip_planner_swarm_style_agent\u002FTrip_planner_agent.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [Multi Source Agent  ](.\u002Fexamples\u002FMulti-source-Agent\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FMulti-source-Agent\u002FMulti_source_RAG_Agent.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [AI email assistant with Composio](.\u002Fexamples\u002FAI-Email-Assistant-with-Composio\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAI-Email-Assistant-with-Composio\u002Fcomposio-lance.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [Assistant Bot with OpenAI Swarm](.\u002Fexamples\u002Fassistance-bot-with-swarm\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fassistance-bot-with-swarm\u002Fassistant_bot_with_swarm.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [AI Trends Searcher with CrewAI](.\u002Fexamples\u002FAI-Trends-with-CrewAI\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAI-Trends-with-CrewAI\u002FCrewAI_AI_Trends.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Ftrack-ai-trends-crewai-agents-rag\u002F)|\n| [SuperAgent Autogen](.\u002Fexamples\u002FSuperAgent_Autogen) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FSuperAgent_Autogen\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [Build autonomous Customer support agent using Langgraph](.\u002Fexamples\u002Fcustomer_support_agent_langgraph\u002FLangGraph_LanceDB.ipynb) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002F\u002Fexamples\u002Fcustomer_support_agent_langgraph\u002FLangGraph_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>     [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fagentic-rag-using-langgraph-building-a-simple-customer-support-autonomous-agent\u002F)|\n| [AI Agents: Reducing Hallucination](.\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002Fmain.py) [![JS](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#) |[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fhow-to-reduce-hallucinations-from-llm-powered-agents-using-long-term-memory-72f262c3cc1f\u002F)|\n| [Multi Document Agentic RAG](.\u002Fexamples\u002Fmulti-document-agentic-rag\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmulti-document-agentic-rag\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)     [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fmulti-document-agentic-rag\u002F)|\n| [RASA: Customer Support Bot](.\u002Fexamples\u002FRASA_Customer-support-bot) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRASA_Customer-support-bot\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)     [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fcustomer-support-bot-rasa-x-lancedb\u002F)|\n||||\n\n### Recommender Systems\n\nPersonalized AI recommendations! These projects help you build recommendation engines that suggest content based on user preferences.\n\n| Recommender Systems | Interactive Notebook & Scripts &nbsp; | Blog |\n| --------- | -------------------------- | ----------- |\n||||\n| [Movie Recommender](.\u002Fexamples\u002Fmovie-recommender\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmovie-recommender\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmovie-recommender\u002Fmain.py) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Product Recommender](.\u002Fexamples\u002Fproduct-recommender\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fproduct-recommender\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fproduct-recommender\u002Fmain.py) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| |\n| [Arxiv paper recommender](.\u002Fexamples\u002Farxiv-recommender) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Farxiv-recommender\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"Open In Colab\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Farxiv-recommender\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)  [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Music Recommender](.\u002Fapplications\u002FMusic_Recommendation\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fapplications\u002FMusic_Recommendation\u002Fapp_music.py) [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| |||||\n||||\n\n### Concepts\n\nLearn the core ideas behind AI applications—including text chunking, retrieval strategies, and optimization techniques—to improve your understanding of vector search and AI pipelines.\n\n| Concepts | Interactive Notebook | Blog |\n| --------- | -------------------------- | ----------- |\n|           |                            |             |\n| [A Primer on Text Chunking and its Types](.\u002Ftutorials\u002Fdifferent-types-text-chunking-in-RAG) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Fdifferent-types-text-chunking-in-RAG\u002FText_Chunking_on_RAG_application_with_LanceDB.ipynb) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fa-primer-on-text-chunking-and-its-types-a420efc96a13) |\n| [Langchain LlamaIndex Chunking](.\u002Ftutorials\u002FLangchain-LlamaIndex-Chunking) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FLangchain-LlamaIndex-Chunking\u002FLangchain_Llamaindex_chunking.ipynb) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fchunking-techniques-with-langchain-and-llamaindex\u002F) |\n| [Create structured dataset using Instructor](.\u002Ftutorials\u002FNER-dataset-with-Instructor\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Ftutorials\u002FNER-dataset-with-Instructor\u002Fmain.py) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| |\n| [Comparing Cohere Rerankers with LanceDB](.\u002Ftutorials\u002Fcohere-reranker) | [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbenchmarking-cohere-reranker-with-lancedb\u002F) |\n| [Product Quantization: Compress High Dimensional Vectors](https:\u002F\u002Fblog.lancedb.com\u002Fbenchmarking-lancedb-92b01032874a-2\u002F) |[![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#) | [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbenchmarking-lancedb-92b01032874a-2\u002F) |\n| [LLMs, RAG, & the missing storage layer for AI](https:\u002F\u002Fblog.lancedb.com\u002Fllms-rag-the-missing-storage-layer-for-ai-28ded35fa984) | [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fllms-rag-the-missing-storage-layer-for-ai-28ded35fa984\u002F) |\n| [Fine-Tuning LLM using PEFT & QLoRA](.\u002Ftutorials\u002Ffine-tuning_LLM_with_PEFT_QLoRA) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Ffine-tuning_LLM_with_PEFT_QLoRA\u002Fmain.ipynb) [![local LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Foptimizing-llms-a-step-by-step-guide-to-fine-tuning-with-peft-and-qlora-22eddd13d25b) |\n| [Extracting Complex tables-text from PDFs using LlamaParse  ](.\u002Ftutorials\u002FAdvace_RAG_LlamaParser) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FAdvace_RAG_LlamaParser\u002Fmain.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![LlamaCloud](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLlama-api-pink)](#) [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [Convert any Image dataset to lance Format](.\u002Ftutorials\u002Fcli-sdk-to-convert-image-datasets-to-lance) | [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Fcli-sdk-to-convert-image-datasets-to-lance\u002Fmain.ipynb) [![advanced](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fpython-package-to-convert-image-datasets-to-lance-type\u002F) |\n||||\n\n## Projects & Applications\nReady-to-use AI applications built with LanceDB! Use these projects as-is, customize them, or integrate them into your own applications.\n\n### Node applications powered by LanceDB\n| Project Name                                        | Description                                                                                                          | Screenshot                                |\n|-----------------------------------------------------|----------------------------------------------------------------------------------------------------------------------|-------------------------------------------|\n| [Writing assistant](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Flanchain_writing_assistant) | Writing assistant app using lanchain.js with LanceDB, allows you to get real time relevant suggestions and facts based on you written text to help you with your writing.                  | ![Writing assistant](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_619937a5141e.png) |\n| [Sentence Auto-Complete](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Fsentance_auto_complete) | Sentance auto complete app using lanchain.js with LanceDB, allows you to get real time relevant auto complete suggestions and facts based on you written text to help you with your writing.You can also upload your data source in the form of a pdf file.You can switch between gpt models to get faster results.                 | ![Sentence auto-complete](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_6f6b1e7b5531.gif) |\n| [Article Recommendation](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Farticle_recommender) | Article Recommender: Explore vast data set of articles with Instant, Context-Aware Suggestions. Leveraging Advanced NLP, Vector Search, and Customizable Datasets, Our App Delivers Real-Time, Precise Article Recommendations. Perfect for Research, Content Curation, and Staying Informed. Unlock Smarter Insights with State-of-the-Art Technology in Content Retrieval and Discovery!\".                 | ![Article Recommendation](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_aca9b696b08d.gif) |\n| [AI Powered Job Search](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002FAI_powered_job_search) | Transform your job search experience with this AI-driven application. Powered by LangChain.js, LanceDB, and advanced semantic search, it provides real-time, highly accurate job listings tailored to your preferences. Featuring customizable datasets and advanced filtering options (e.g., skills, location, job type, and salary range), this app ensures you find the right opportunities quickly and effortlessly. Best suited for job seekers, recruiters, career platforms, custom job boards.                 | ![Job Search](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_013620672e7d.gif) |\n| [AI Powered Multimodal meme search](.\u002Fapplications\u002Fnode\u002Fmutimodal_meme_finder) | An advanced AI-powered meme search engine that allows users to find memes using both text and image queries. By leveraging LanceDB as a high-performance vector database and Roboflow's CLIP model for embedding generation, the platform delivers fast and accurate meme retrieval.     | ![Multimodal meme search](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_e76b53b08d75.gif) |\n| [AI Powered Feedback search and analysis](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002FFeedback_search_and_analysis) | An AI-powered employee feedback analysis platform designed to collect, store, analyze, and retrieve insightful employee feedback. This system leverages LanceDB for high-speed vector-based semantic search, React.js for an interactive UI, Node.js for backend processing, and LangChain.js with an Ambient Agent for intelligent analysis and actionable insights.     | ![AI Powered Feedback search and analysis](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_20d35c3c4218.gif) |\n| [Hierarchical Multi Agent](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Fhierarchical-multi-agent) | The AI-Powered Law Assistant is a **Hierarchical Multi-Agent** System leveraging LangGraph, LangChain, and LanceDB for efficient legal query processing. It features a Supervisor Agent that delegates tasks to specialized agents for IPC and NDPS laws, each with sub-agents for case retrieval and legal summarization. Using LanceDB, it stores and retrieves vectorized legal documents, enabling fast, structured, and context-aware responses for legal professionals, researchers, and law students.     | ![AI Powered Law Assistant](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_d75f618aeb41.gif) |\n||||\n\n| Project Name                                        | Description                                                                                                          | Screenshot                                |\n|-----------------------------------------------------|----------------------------------------------------------------------------------------------------------------------|-------------------------------------------|\n| [YOLOExplorer](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fyoloexplorer) | Iterate on your YOLO \u002F CV datasets using SQL, Vector semantic search, and more within seconds                  | ![YOLOExplorer](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_ba49bbbe4858.png) |\n| [Website Chatbot (Deployable Vercel Template)](https:\u002F\u002Fgithub.com\u002Flancedb\u002Flancedb-vercel-chatbot) | Create a chatbot from the sitemap of any website\u002Fdocs of your choice. Built using vectorDB serverless native javascript package. | ![Chatbot](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_db18523af12d.gif)    |\n| [Advanced Chatbot with Parler TTS ](.\u002Fapplications\u002FChatbot_with_Parler_TTS) | This Chatbot app uses Lancedb Hybrid search, FTS & reranker method with Parlers TTS library.|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_26882b434a7c.png)|\n| [Multi-Modal Search Engine](.\u002Fapplications\u002Fmultimodal-search\u002F) | Create a Multi-modal search engine app, to search images using both images or text | ![Search](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_0a44723eb222.png)|\n| [Evaluate RAG](.\u002Fapplications\u002Fevaluate_RAG\u002F) | A working Streamlit RAG App designed to demonstrate end to to end production grade evaluation using 50+ scores and metrics which include guards, software metrics, traditional metrics and LLM as judge metrics. It uses mixture of specialised deep learning models and LLM as Judge models to do the evaluations |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_4971dc9709da.png)|\n| [Multi-Agent Collaboration Chatbot](.\u002Fapplications\u002FMulti_collabration_chatbot\u002F) | Multi-Agent collaboration chatbot using langgraph for share-market use case using Lancedb & tools such as Polygon ,Tavily |![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_89b8f6ac8775.png)|\n| [Multimodal Myntra Fashion Search Engine](https:\u002F\u002Fgithub.com\u002Fishandutta0098\u002Flancedb-multimodal-myntra-fashion-search-engine) | This app uses OpenAI's CLIP to make a search engine that can understand and deal with both written words and pictures.|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_2698c1035fa1.png)|\n| [Multilingual-RAG](.\u002Fapplications\u002FMultilingual_RAG\u002F) | Multilingual RAG with cohere embedding & support 100+ languages|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_326dc2afe208.png)|\n| [Music Recommender](.\u002Fapplications\u002FMusic_Recommendation\u002F) | Music Recommendation system using audio feature extraction and vector similarity search. By utilizing **LanceDB**, **PANNs** for audio tagging, and **Librosa** for audio feature extraction, the system finds and recommends tracks with similar audio characteristics based on a query song.|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_8f3f44ae8b78.png)|\n| [NoOCR](https:\u002F\u002Fgithub.com\u002Fkyryl-opens-ml\u002Fno-ocr) | End-to-end solution for complex PDFs, powered by **ColPali** and **LanceDB**.|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_0211d734ebfa.gif)|\n\n\n**🌟 New! 🌟 Applied GenAI and VectorDB course on Udacity**\nLearn about GenAI and vectorDBs using LanceDB in the recently launched [Udacity Course](https:\u002F\u002Fwww.udacity.com\u002Fcourse\u002Fbuilding-generative-ai-solutions-with-vector-databases--cd12952)\n\n\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_040285530f18.png\" width=\"80%\" height=\"80%\" \u002F>\n\n\n## Contributing Examples\nIf you're working on some cool applications that you'd like to add to this repo, please open a PR!\n","# VectorDB-recipes\n\u003Cbr \u002F>\n深入构建生成式AI应用！\n本仓库包含示例、应用、入门代码及教程，助您快速启动生成式AI项目。\n\n- 这些内容基于LanceDB构建，LanceDB是一款免费、开源、无服务器的向量数据库，**无需任何设置**。\n- 它**与Python数据生态系统无缝集成**，因此您可以直接在现有的pandas、arrow、pydantic等数据管道中使用。\n- LanceDB还提供**原生的TypeScript SDK**，借助它您可以在无服务器函数中**执行向量搜索**！\n\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_5fa302fe41e0.png\" height=\"85%\" width=\"85%\" \u002F>\n\n\u003Cbr \u002F>\n加入我们的社区获取支持 - \u003Ca href=\"https:\u002F\u002Fdiscord.gg\u002FzMM32dvNtd\">Discord\u003C\u002Fa> •\n\u003Ca href=\"https:\u002F\u002Ftwitter.com\u002Flancedb\">Twitter\u003C\u002Fa>\n\n---\n\n本仓库分为两个部分：\n- [示例](#examples) - 通过极少的介绍即可直接上手代码，旨在让您在几分钟内从想法过渡到概念验证！\n- [应用](#projects--applications) - 可直接使用的Python和Web应用，结合了应用型LLM、向量数据库和生成式AI工具。\n\n\n以下示例按不同类别整理，便于您快速找到相似类型的示例。\n\n### 版块\n\n- [从零开始构建](#build-from-scratch) - 逐步指南，教您从头创建AI应用。\n- [多模态](#multimodal) - 构建可同时处理并检索文本与图像的应用。\n- [RAG](#rag) - 将文档检索与LLM驱动的回答相结合。\n- [向量搜索](#vector-search) - 学习如何利用基于向量的搜索高效地找到相关文档。\n- [聊天机器人](#chatbot) - 创建能够获取信息并生成智能回复的AI聊天机器人。\n- [评估](#evaluation) - 衡量AI生成答案的质量与准确性。\n- [AI代理](#ai-agents) - 构建由多个代理协作互动的LLM驱动应用。\n- [推荐系统](#recommender-systems) - 开发用于个性化推荐的AI驱动系统。\n- [概念](#concepts) - 关于AI应用中关键技巧的教程与解释。\n\n\n### 🌟 新增 🌟 \n随时了解仓库中新增的最新项目、工具及改进。\n- **V-JEPA视频搜索** - \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fv-jepa-video-search\u002Fintra-video.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在Colab中打开\">\u003C\u002Fa>\n\n### 从零开始构建\n\n从基础入手！这些示例将引导您使用LanceDB从零开始构建AI应用，实现高效的文档检索与搜索功能。\n\n| 从零开始构建 &nbsp; &nbsp;| 交互式笔记本与脚本 &nbsp; | \n|-------- | -------------: |\n|||\n| [从零开始构建RAG](.\u002Ftutorials\u002FRAG-from-Scratch) | [![在Colab中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FRAG-from-Scratch\u002FRAG_from_Scratch.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [使用Llama3从零开始构建本地RAG](.\u002Ftutorials\u002FLocal-RAG-from-Scratch) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Ftutorials\u002FLocal-RAG-from-Scratch\u002Frag.py) [![本地LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [从零开始构建多头RAG](.\u002Ftutorials\u002FMulti-Head-RAG-from-Scratch\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Ftutorials\u002FMulti-Head-RAG-from-Scratch\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![本地LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [从零开始构建金融科技AI代理](.\u002Fexamples\u002Ffintech-ai-agent) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Ffintech-ai-agent\u002Ffintech-ai-agent.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在Colab中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)     [![进阶](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Ffintech-ai-agent\u002F) |\n||||\n\n### 多模态\n\n跨不同类型的数据（文本、图像等）进行搜索。构建能够处理多样化输入的强大搜索应用。\n\n| 多模态 &nbsp; &nbsp;| 交互式笔记本与脚本 &nbsp; | 博客 |\n| --------- | -------------------------- | ----------- |\n||||\n| [V-JEPA 视频搜索](.\u002Fexamples\u002Fv-jepa-video-search\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fv-jepa-video-search\u002Fintra-video.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> | |\n| [多模态 CLIP：DiffusionDB](.\u002Fexamples\u002Fmultimodal_clip_diffusiondb\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmultimodal_clip_diffusiondb\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmultimodal_clip_diffusiondb\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fmulti-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939\u002F)|\n| [多模态 CLIP：YouTube 视频](.\u002Fexamples\u002Fmultimodal_video_search\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmultimodal_video_search\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmultimodal_video_search\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fmulti-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939\u002F)|\n| [Cambrian-1：以视觉为中心的图像探索](https:\u002F\u002Fwww.kaggle.com\u002Fcode\u002Fprasantdixit\u002Fcambrian-1-vision-centric-exploration-of-images\u002F) | [![Kaggle](https:\u002F\u002Fkaggle.com\u002Fstatic\u002Fimages\u002Fopen-in-kaggle.svg)](https:\u002F\u002Fwww.kaggle.com\u002Fcode\u002Fprasantdixit\u002Fcambrian-1-vision-centric-exploration-of-images\u002F) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)   [![intermediate](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fcambrian-1-vision-centric-exploration\u002F)|\n| [多模态 Jina CLIP-V2：美食搜索](.\u002Fexamples\u002Fmultimodal_jina_clipv2\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmultimodal_jina_clipv2\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmultimodal_jina_clipv2)    [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [多模态向量搜索：Voyage AI 与 LanceDB](.\u002Fexamples\u002Fvoyagexlancedb\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fvoyagexlancedb\u002FVoyage_x_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![beginner](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|||\n||||\n\n### RAG\n\n通过在回答前检索相关文档来生成响应。本节涵盖了在项目中实现 RAG 的不同方法。\n\n| RAG &nbsp; &nbsp;| 交互式笔记本与脚本 | 博客 |\n| --------- | -------------------------- | ----------- |\n||||\n| [使用 Deepseek R1 对比 OpenAI o1 的 RAG](.\u002Fexamples\u002FDeepseek_R1_VS_GPT_4o) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FDeepseek_R1_VS_GPT_4o\u002FREADME.md)  [![分析](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FAnalysis-FF3333)](#) |\n| [PDF 上的 RAG](.\u002Fexamples\u002FRAG-On-PDF\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRAG-On-PDF\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [基于上下文检索与混合搜索的 RAG](.\u002Fexamples\u002FContextual-RAG\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FContextual-RAG\u002FAnthropic_Contextual_RAG.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fguide-to-use-contextual-retrieval-and-prompt-caching-with-lancedb\u002F) |\n| [使用 Matryoshka 嵌入和 LlamaIndex 的 RAG](.\u002Ftutorials\u002FRAG-with_MatryoshkaEmbed-Llamaindex\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FRAG-with_MatryoshkaEmbed-Llamaindex\u002FRAG_with_MatryoshkaEmbedding_and_Llamaindex.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [使用 IBM Watsonx 的 RAG](.\u002Fexamples\u002FRAG-with-watsonx\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRAG-with-watsonx\u002FWatsonx_example.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![watsonx LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwatsonx-api-lightblue)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)||\n| [Cognee RAG](.\u002Fexamples\u002Fcognee-RAG\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fcognee-RAG\u002Fcognee_demo.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> ||\n| [通过重排序提升 RAG](.\u002Fexamples\u002FRAG_Reranking\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRAG_Reranking\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fsimplest-method-to-improve-rag-pipeline-re-ranking-cf6eaec6d544)|\n[通过 HyDE 提升 RAG](.\u002Fexamples\u002FAdvance-RAG-with-HyDE\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvance-RAG-with-HyDE\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fadvanced-rag-precise-zero-shot-dense-retrieval-with-hyde-0946c54dfdcb)|\n| [通过 LOTR 提升 RAG](.\u002Fexamples\u002FAdvance_RAG_LOTR\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvance_RAG_LOTR\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbetter-rag-with-lotr-lord-of-retriever-23c8336b9a35)|\n| [高级 RAG：上下文增强窗口](.\u002Fexamples\u002FAdvanced_RAG_Context_Enrichment_Window\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvanced_RAG_Context_Enrichment_Window\u002FAdvanced_RAG_Context_Enrichment_Window.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fhttps:\u002F\u002Fblog.lancedb.com\u002Fadvanced-rag-context-enrichment-window\u002F)|\n| [高级 RAG：延迟分块](.\u002Fexamples\u002FAdvanced_RAG_Late_Chunking\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAdvanced_RAG_Late_Chunking\u002FLate_Chunking_(Chunked_Pooling).ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Flate-chunking-aka-chunked-pooling-2\u002F)|\n| [使用 Langgraph 的纠正型 RAG](.\u002Ftutorials\u002FCorrective-RAG-with_Langgraph\u002F) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FCorrective-RAG-with_Langgraph\u002FCRAG_with_Langgraph.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fimplementing-corrective-rag-in-the-easiest-way-2\u002F)|\n| [基于上下文压缩的 RAG](.\u002Fexamples\u002FContextual-Compression-with-RAG\u002F) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FContextual-Compression-with-RAG\u002Fmain.ipynb) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)   [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fenhance-rag-integrate-contextual-compression-and-filtering-for-precision-a29d4a810301\u002F) |\n| [通过 FLARE 提升 RAG](.\u002Fexamples\u002Fbetter-rag-FLAIR) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fbetter-rag-FLAIR\u002Fmain.ipynb) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbetter-rag-with-active-retrieval-augmented-generation-flare-3b66646e2a9f\u002F) |\n| [代理型 RAG](.\u002Ftutorials\u002FAgentic_RAG\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FAgentic_RAG\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|\n| [GraphRAG](.\u002Fexamples\u002FGraphrag\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FGraphrag\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fgraphrag-hierarchical-approach-to-retrieval-augmented-generation\u002F)|\n| [使用 CSV 文件的 GraphRAG](.\u002Ftutorials\u002FGraphRAG_CSV\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FGraphRAG_CSV\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Faksdesai1998.medium.com\u002Foptimizing-graphrag-with-microsoft-for-csv-data-a-guide-with-lancedb-8e4150b93e37)|\n| [结合 cognee 的多媒体 GraphRAG](.\u002Ftutorials\u002FGraphRAG_with_cognee\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FGraphRAG_with_cognee\u002Fcognee_multimedia_demo.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n||||\n\n### 向量搜索\n\n快速找到相关文档！这些项目展示了如何使用基于向量的搜索技术，使人工智能驱动的搜索更快、更智能。\n\n| 向量搜索 &nbsp; &nbsp;| 交互式笔记本与脚本 &nbsp; | 博客 |\n| --------- | -------------------------- | ----------- |\n||||\n| [内置混合搜索](.\u002Fexamples\u002FInbuilt-Hybrid-Search) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FInbuilt-Hybrid-Search\u002FInbuilt_Hybrid_Search_with_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)    [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#) ||\n| [BM25 与 LanceDB 的混合搜索](.\u002Fexamples\u002FHybrid_search_bm25_lancedb\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FHybrid_search_bm25_lancedb\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#) |[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fhybrid-search-combining-bm25-and-semantic-search-for-better-results-with-lan-1358038fe7e6)|\n| [基于 NER 的语义搜索](.\u002Ftutorials\u002FNER-powered-Semantic-Search) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FNER-powered-Semantic-Search\u002FNER_powered_Semantic_Search_with_LanceDB.ipynb) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fner-powered-semantic-search-using-lancedb-51051dc3e493) |\n| [LanceDB 中的向量运算](.\u002Fexamples\u002FVector-Arithmetic-with-LanceDB\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FVector-Arithmetic-with-LanceDB\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#) |[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fvector-arithmetic-with-lancedb-an-intro-to-vector-embeddings\u002F)|\n| [总结并搜索 Reddit 帖子](.\u002Fexamples\u002FReddit-summarization-and-search\u002F) | \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FReddit-summarization-and-search\u002Fsubreddit_summarization_querying.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>    [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [ImageBind 演示应用](.\u002Fexamples\u002Fimagebind_demo\u002F) | \u003Ca href=\"https:\u002F\u002Fhuggingface.co\u002Fspaces\u002Fraghavd99\u002Fimagebind2\">\u003Cimg src=\"https:\u002F\u002Fhuggingface.co\u002Fdatasets\u002Fhuggingface\u002Fbrand-assets\u002Fresolve\u002Fmain\u002Fhf-logo-with-title.svg\" alt=\"hf spaces\" style=\"width: 80px; vertical-align: middle; background-color: white;\">\u003C\u002Fa>  [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [图像内的搜索](.\u002Fexamples\u002Fsearch-within-images-with-sam-and-clip\u002F) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fsearch-within-images-with-sam-and-clip\u002Fmain.ipynb) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)   [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fsearch-within-an-image-331b54e4285e)|\n| [CLIP 的零样本目标检测](.\u002Fexamples\u002Fzero-shot-object-detection-CLIP\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fzero-shot-object-detection-CLIP\u002Fzero_shot_object_detection_clip.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [使用 TransformersJS 进行向量搜索](.\u002Fexamples\u002Fjs-transformers\u002F) |[![JS](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002Fjs-transformers\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|  |\n| [地理空间推荐系统](.\u002Fexamples\u002FGeospatial-Recommendation-System\u002F) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FGeospatial-Recommendation-System\u002Fgeospatial-recommendation.ipynb) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [使用 OpenVINO 加速向量搜索应用](.\u002Fexamples\u002FAccelerate-Vector-Search-Applications-Using-OpenVINO\u002F) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAccelerate-Vector-Search-Applications-Using-OpenVINO\u002Fclip_text_image_search.ipynb) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Faccelerate-vector-search-applications-using-openvino-lancedb\u002F)|\n||||\n\n### 聊天机器人\n\n利用 LanceDB 的向量搜索功能，创建能够理解用户查询并返回相关回答的聊天机器人。\n\n| 聊天机器人 &nbsp; &nbsp;| 交互式笔记本与脚本 &nbsp; | 博客 &nbsp;|\n| --------- | -------------------------- | ----------- |\n||||\n| [Databricks DBRX 网站聊天机器人](.\u002Fexamples\u002Fdatabricks_DBRX_website_bot\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fdatabricks_DBRX_website_bot\u002Fmain.py) [![Databricks LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fdatabricks-api-red)](#)    [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [基于 CLI 的 SDK 手动聊天机器人（本地运行）](.\u002Fexamples\u002FCLI-SDK-Manual-Chatbot-Locally\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FCLI-SDK-Manual-Chatbot-Locally\u002Fassistant.py) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [YouTube 字幕搜索聊天机器人](.\u002Fexamples\u002FYoutube-Search-QA-Bot\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FYoutube-Search-QA-Bot\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FYoutube-Search-QA-Bot\u002Fmain.py) [![JavaScript](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002FYoutube-Search-QA-Bot\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [Langchain：代码文档问答聊天机器人](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Fmain.py) [![JavaScript](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [使用 Crawl4AI 构建任意网站的聊天机器人](.\u002Fexamples\u002FCrawlerQ&A_website\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FCrawlerQ&A_website\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002FCode-Documentation-QA-Bot\u002Fmain.py) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)  [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [使用 Llama 2 和 LanceDB 的上下文感知聊天机器人](.\u002Ftutorials\u002Fchatbot_using_Llama2_&_lanceDB) | [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Fchatbot_using_Llama2_&_lanceDB\u002Fmain.ipynb) [![本地 LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fcontext-aware-chatbot-using-llama-2-lancedb-as-vector-database-4d771d95c755) |\n||||\n\n\n### 评估\n\n这些项目提供了比较 AI 生成的回答与参考数据、并优化准确性的工具。\n\n| 评估 &nbsp; &nbsp;| 交互式笔记本与脚本 &nbsp; | 博客 |\n| --------- | -------------------------- | ----------- |\n||||\n| [使用 HoneyHive 监控和追踪 RAG](.\u002Fexamples\u002FHoneyHive_x_LanceDB\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FHoneyHive_x_LanceDB\u002FHoneyHive_x_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [使用 RAGAs 评估 RAG](.\u002Fexamples\u002FEvaluating_RAG_with_RAGAs\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FEvaluating_RAG_with_RAGAs\u002FEvaluating_RAG_with_RAGAs.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>   [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n||||\n\n### AI 代理\n\n构建多个 AI 代理相互协作以高效完成任务的应用程序。这些项目展示了代理如何协同工作、交换数据并自动化工作流。\n\n| AI 代理 &nbsp; &nbsp;| 交互式笔记本与脚本 &nbsp; | 博客 |\n| --------- | -------------------------- | ----------- |\n||||\n| [旅行计划 Swarm 风格代理](.\u002Fexamples\u002FTrip_planner_swarm_style_agent\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FTrip_planner_swarm_style_agent\u002FTrip_planner_agent.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [多源代理](.\u002Fexamples\u002FMulti-source-Agent\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FMulti-source-Agent\u002FMulti_source_RAG_Agent.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [带有 Composio 的 AI 邮件助手](.\u002Fexamples\u002FAI-Email-Assistant-with-Composio\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAI-Email-Assistant-with-Composio\u002Fcomposio-lance.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|\n| [带有 OpenAI Swarm 的助理机器人](.\u002Fexamples\u002Fassistance-bot-with-swarm\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fassistance-bot-with-swarm\u002Fassistant_bot_with_swarm.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)   [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|\n| [使用 CrewAI 的 AI 趋势搜索器](.\u002Fexamples\u002FAI-Trends-with-CrewAI\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FAI-Trends-with-CrewAI\u002FCrewAI_AI_Trends.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)    [![初级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Ftrack-ai-trends-crewai-agents-rag\u002F)|\n| [SuperAgent Autogen](.\u002Fexamples\u002FSuperAgent_Autogen) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FSuperAgent_Autogen\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)||\n| [使用 Langgraph 构建自主客服代理](.\u002Fexamples\u002Fcustomer_support_agent_langgraph\u002FLangGraph_LanceDB.ipynb) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002F\u002Fexamples\u002Fcustomer_support_agent_langgraph\u002FLangGraph_LanceDB.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>     [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fagentic-rag-using-langgraph-building-a-simple-customer-support-autonomous-agent\u002F)|\n| [AI 代理：减少幻觉](.\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002Fmain.py) [![JS](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjavascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](.\u002Fexamples\u002Freducing_hallucinations_ai_agents\u002Findex.js) [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#) |[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fhow-to-reduce-hallucinations-from-llm-powered-agents-using-long-term-memory-72f262c3cc1f\u002F)|\n| [多文档 Agentic RAG](.\u002Fexamples\u002Fmulti-document-agentic-rag\u002F) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmulti-document-agentic-rag\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)     [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fmulti-document-agentic-rag\u002F)|\n| [RASA：客服机器人](.\u002Fexamples\u002FRASA_Customer-support-bot) |\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002FRASA_Customer-support-bot\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在 Colab 中打开\">\u003C\u002Fa>  [![LLM](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#)     [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)|[![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fcustomer-support-bot-rasa-x-lancedb\u002F)|\n||||\n\n### 推荐系统\n\n个性化AI推荐！这些项目可以帮助你构建基于用户偏好的内容推荐引擎。\n\n| 推荐系统 | 交互式笔记本与脚本 &nbsp; | 博客 |\n| --------- | -------------------------- | ----------- |\n||||\n| [电影推荐](.\u002Fexamples\u002Fmovie-recommender\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fmovie-recommender\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在Colab中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fmovie-recommender\u002Fmain.py) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [产品推荐](.\u002Fexamples\u002Fproduct-recommender\u002F) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Fproduct-recommender\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在Colab中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Fproduct-recommender\u002Fmain.py) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| |\n| [Arxiv论文推荐](.\u002Fexamples\u002Farxiv-recommender) | \u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Fexamples\u002Farxiv-recommender\u002Fmain.ipynb\">\u003Cimg src=\"https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg\" alt=\"在Colab中打开\">\u003C\u002Fa> [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fexamples\u002Farxiv-recommender\u002Fmain.py) [![本地大模型](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#)  [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [音乐推荐](.\u002Fapplications\u002FMusic_Recommendation\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Fapplications\u002FMusic_Recommendation\u002Fapp_music.py) [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| |||||\n||||\n\n### 概念\n\n学习AI应用背后的核心概念——包括文本分块、检索策略和优化技术——以提升你对向量搜索和AI工作流的理解。\n\n| 概念 | 交互式笔记本 | 博客 |\n| --------- | -------------------------- | ----------- |\n|           |                            |             |\n| [文本分块及其类型入门](.\u002Ftutorials\u002Fdifferent-types-text-chunking-in-RAG) | [![在Colab中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Fdifferent-types-text-chunking-in-RAG\u002FText_Chunking_on_RAG_application_with_LanceDB.ipynb) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fa-primer-on-text-chunking-and-its-types-a420efc96a13) |\n| [Langchain与LlamaIndex的分块](.\u002Ftutorials\u002FLangchain-LlamaIndex-Chunking) | [![在Colab中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FLangchain-LlamaIndex-Chunking\u002FLangchain_Llamaindex_chunking.ipynb) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fchunking-techniques-with-langchain-and-llamaindex\u002F) |\n| [使用Instructor创建结构化数据集](.\u002Ftutorials\u002FNER-dataset-with-Instructor\u002F) | [![Python](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fpython-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](.\u002Ftutorials\u002FNER-dataset-with-Instructor\u002Fmain.py) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| |\n| [比较Cohere重排序器与LanceDB](.\u002Ftutorials\u002Fcohere-reranker) | [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbenchmarking-cohere-reranker-with-lancedb\u002F) |\n| [产品量化：压缩高维向量](https:\u002F\u002Fblog.lancedb.com\u002Fbenchmarking-lancedb-92b01032874a-2\u002F) |[![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#) | [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fbenchmarking-lancedb-92b01032874a-2\u002F) |\n| [大模型、RAG与AI缺失的存储层](https:\u002F\u002Fblog.lancedb.com\u002Fllms-rag-the-missing-storage-layer-for-ai-28ded35fa984) | [![中级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fintermediate-FFDA33)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fllms-rag-the-missing-storage-layer-for-ai-28ded35fa984\u002F) |\n| [使用PEFT与QLoRA微调大模型](.\u002Ftutorials\u002Ffine-tuning_LLM_with_PEFT_QLoRA) | [![在Colab中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Ffine-tuning_LLM_with_PEFT_QLoRA\u002Fmain.ipynb) [![本地大模型](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flocal-llm-green)](#) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Foptimizing-llms-a-step-by-step-guide-to-fine-tuning-with-peft-and-qlora-22eddd13d25b) |\n| [使用LlamaParse从PDF中提取复杂表格与文本](.\u002Ftutorials\u002FAdvace_RAG_LlamaParser) | [![在Colab中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002FAdvace_RAG_LlamaParser\u002Fmain.ipynb) [![大模型](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fopenai-api-white)](#) [![LlamaCloud](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLlama-api-pink)](#) [![初学者](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbeginner-B5FF33)](#)|  |\n| [将任意图像数据集转换为Lance格式](.\u002Ftutorials\u002Fcli-sdk-to-convert-image-datasets-to-lance) | [![在Colab中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Flancedb\u002Fvectordb-recipes\u002Fblob\u002Fmain\u002Ftutorials\u002Fcli-sdk-to-convert-image-datasets-to-lance\u002Fmain.ipynb) [![高级](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fadvanced-FF3333)](#)| [![Ghost](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https:\u002F\u002Fblog.lancedb.com\u002Fpython-package-to-convert-image-datasets-to-lance-type\u002F) |\n||||\n\n## 项目与应用\n使用LanceDB构建的即用型AI应用！你可以直接使用这些项目，也可以根据需要进行定制，或将其集成到你自己的应用中。\n\n### 由 LanceDB 驱动的 Node.js 应用程序\n| 项目名称                                        | 描述                                                                                                          | 截图                                |\n|-----------------------------------------------------|----------------------------------------------------------------------------------------------------------------------|-------------------------------------------|\n| [写作助手](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Flanchain_writing_assistant) | 使用 lanchain.js 和 LanceDB 构建的写作助手应用，可根据您撰写的文本实时提供相关建议和事实，帮助您完成写作。                  | ![写作助手](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_619937a5141e.png) |\n| [句子自动补全](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Fsentance_auto_complete) | 使用 lanchain.js 和 LanceDB 构建的句子自动补全应用，可根据您撰写的文本实时提供相关的自动补全建议和事实，帮助您完成写作。您还可以上传 PDF 格式的资料来源，并在不同的 GPT 模型之间切换以获得更快的结果。                 | ![句子自动补全](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_6f6b1e7b5531.gif) |\n| [文章推荐](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Farticle_recommender) | 文章推荐：探索海量文章数据集，享受即时、上下文感知的推荐服务。借助先进的自然语言处理技术、向量搜索以及可定制的数据集，我们的应用能够提供实时、精准的文章推荐。非常适合用于研究、内容策划以及保持信息更新。通过内容检索与发现领域的前沿技术，解锁更智能的洞察！                 | ![文章推荐](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_aca9b696b08d.gif) |\n| [AI 驱动的求职搜索](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002FAI_powered_job_search) | 通过这款 AI 驱动的应用程序，彻底改变您的求职体验。它基于 LangChain.js、LanceDB 和先进的语义搜索技术，能够根据您的偏好实时提供高度精准的职位列表。该应用支持自定义数据集和高级筛选选项（如技能、地点、职位类型和薪资范围），确保您快速且轻松地找到合适的职位机会。最适合求职者、招聘人员、职业平台及定制化招聘网站使用。                 | ![求职搜索](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_013620672e7d.gif) |\n| [AI 驱动的多模态表情包搜索](.\u002Fapplications\u002Fnode\u002Fmutimodal_meme_finder) | 一款先进的 AI 驱动的表情包搜索引擎，允许用户通过文本和图像查询来查找表情包。该平台利用 LanceDB 作为高性能向量数据库，并结合 Roboflow 的 CLIP 模型生成嵌入向量，从而实现快速而准确的表情包检索。     | ![多模态表情包搜索](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_e76b53b08d75.gif) |\n| [AI 驱动的员工反馈搜索与分析](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002FFeedback_search_and_analysis) | 一个 AI 驱动的员工反馈分析平台，旨在收集、存储、分析并检索有价值的员工反馈信息。该系统利用 LanceDB 进行高速的基于向量的语义搜索，采用 React.js 构建交互式用户界面，使用 Node.js 处理后端逻辑，并结合 LangChain.js 和 Ambient Agent 实现智能化分析与可操作的洞察。     | ![AI 驱动的反馈搜索与分析](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_20d35c3c4218.gif) |\n| [层次化多智能体](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Ftree\u002Fmain\u002Fapplications\u002Fnode\u002Fhierarchical-multi-agent) | AI 驱动的法律助手是一个基于 LangGraph、LangChain 和 LanceDB 的 **层次化多智能体** 系统，用于高效处理法律查询。它配备了一个主管智能体，负责将任务委派给专门处理《印度刑法典》和《麻醉药品及精神药物法》的子智能体，每个子智能体又进一步细分为案件检索和法律摘要生成等模块。借助 LanceDB，该系统可以存储和检索经过向量化处理的法律文档，为法律从业者、研究人员和法学院学生提供快速、结构化且具备上下文感知能力的响应。     | ![AI 驱动的法律助手](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_d75f618aeb41.gif) |\n||||\n\n| 项目名称                                        | 描述                                                                                                          | 截图                                |\n|-----------------------------------------------------|----------------------------------------------------------------------------------------------------------------------|-------------------------------------------|\n| [YOLOExplorer](https:\u002F\u002Fgithub.com\u002Flancedb\u002Fyoloexplorer) | 使用 SQL、向量语义搜索等工具，在几秒钟内迭代处理您的 YOLO\u002FCV 数据集                  | ![YOLOExplorer](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_ba49bbbe4858.png) |\n| [网站聊天机器人（可部署的 Vercel 模板）](https:\u002F\u002Fgithub.com\u002Flancedb\u002Flancedb-vercel-chatbot) | 根据任意网站的站点地图或您选择的文档创建聊天机器人。基于向量数据库的无服务器原生 JavaScript 包构建。 | ![Chatbot](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_db18523af12d.gif)    |\n| [带有 Parler TTS 的高级聊天机器人](.\u002Fapplications\u002FChatbot_with_Parler_TTS) | 该聊天机器人应用使用 Lancedb 混合搜索、FTS 和重排序方法，并结合 Parlers TTS 库。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_26882b434a7c.png)|\n| [多模态搜索引擎](.\u002Fapplications\u002Fmultimodal-search\u002F) | 构建一个多模态搜索引擎应用，支持通过图像或文本进行搜索 | ![Search](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_0a44723eb222.png)|\n| [评估 RAG](.\u002Fapplications\u002Fevaluate_RAG\u002F) | 一个可用的 Streamlit RAG 应用，旨在展示端到端的生产级评估，涵盖 50 多种评分和指标，包括安全检查、软件度量、传统指标以及以大模型为评判标准的指标。它结合了专用深度学习模型和大模型作为评判模型来进行评估。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_4971dc9709da.png)|\n| [多智能体协作聊天机器人](.\u002Fapplications\u002FMulti_collabration_chatbot\u002F) | 基于 LangGraph 的多智能体协作聊天机器人，适用于股票市场场景，使用 Lancedb 及 Polygon、Tavily 等工具。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_89b8f6ac8775.png)|\n| [多模态 Myntra 时尚搜索引擎](https:\u002F\u002Fgithub.com\u002Fishandutta0098\u002Flancedb-multimodal-myntra-fashion-search-engine) | 该应用使用 OpenAI 的 CLIP 构建了一个能够理解并处理文字与图片的搜索引擎。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_2698c1035fa1.png)|\n| [多语言 RAG](.\u002Fapplications\u002FMultilingual_RAG\u002F) | 支持 100 多种语言的多语言 RAG，采用 Cohere 嵌入技术。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_326dc2afe208.png)|\n| [音乐推荐系统](.\u002Fapplications\u002FMusic_Recommendation\u002F) | 基于音频特征提取和向量相似性搜索的音乐推荐系统。通过利用 **LanceDB**、用于音频标签的 **PANNs** 以及用于音频特征提取的 **Librosa**，该系统可以根据查询歌曲找到并推荐具有相似音频特征的曲目。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_8f3f44ae8b78.png)|\n| [NoOCR](https:\u002F\u002Fgithub.com\u002Fkyryl-opens-ml\u002Fno-ocr) | 基于 **ColPali** 和 **LanceDB** 的复杂 PDF 全流程解决方案。|![image](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_0211d734ebfa.gif)|\n\n\n**🌟 新！ 🌟 Udacity 上的应用型生成式 AI 和向量数据库课程**\n在最近推出的 [Udacity 课程](https:\u002F\u002Fwww.udacity.com\u002Fcourse\u002Fbuilding-generative-ai-solutions-with-vector-databases--cd12952) 中，使用 LanceDB 学习生成式 AI 和向量数据库。\n\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_readme_040285530f18.png\" width=\"80%\" height=\"80%\" \u002F>\n\n\n## 贡献示例\n如果您正在开发一些很酷的应用程序，并希望将其添加到此仓库中，请提交 PR！","# vectordb-recipes 快速上手指南\n\n`vectordb-recipes` 是一个基于 **LanceDB** 的开源项目集合，旨在帮助开发者快速构建生成式 AI（GenAI）应用。它提供了从基础 RAG、多模态搜索到 AI Agent 的各种示例代码和教程。LanceDB 是一款免费、开源、无服务器（serverless）的向量数据库，无需复杂配置即可嵌入现有的 Python 数据生态（如 pandas, arrow, pydantic）。\n\n## 环境准备\n\n在开始之前，请确保您的开发环境满足以下要求：\n\n*   **操作系统**：Linux, macOS 或 Windows\n*   **Python 版本**：Python 3.9 或更高版本\n*   **包管理工具**：pip 或 conda\n*   **可选依赖**：\n    *   若运行 Jupyter Notebook 示例，建议安装 `jupyter` 或使用 Google Colab。\n    *   部分示例需要 LLM API Key（如 OpenAI, Anthropic 等）或本地大模型环境。\n\n> **国内开发者提示**：建议使用国内镜像源加速 Python 包的安装，例如清华源或阿里源。\n\n## 安装步骤\n\n### 1. 克隆项目仓库\n\n首先，将 `vectordb-recipes` 仓库克隆到本地：\n\n```bash\ngit clone https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes.git\ncd vectordb-recipes\n```\n\n> **加速方案**：如果 GitHub 访问较慢，可使用国内镜像（如 Gitee 镜像，若有）或通过代理加速克隆。\n\n### 2. 创建虚拟环境（推荐）\n\n为了避免依赖冲突，建议创建独立的虚拟环境：\n\n```bash\npython -m venv venv\nsource venv\u002Fbin\u002Factivate  # Windows 用户请使用: venv\\Scripts\\activate\n```\n\n### 3. 安装核心依赖\n\n安装 LanceDB 及项目通用的基础依赖。由于不同示例可能需要特定的额外库（如 `torch`, `transformers`, `openai` 等），建议先安装核心包，再根据具体运行的示例安装额外需求。\n\n```bash\n# 使用国内镜像源加速安装\npip install lancedb -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\npip install pandas pyarrow -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n*注意：运行特定示例（如 `examples\u002FRAG-On-PDF`）时，请进入对应目录查看 `requirements.txt` 并安装特定依赖。*\n\n## 基本使用\n\n以下是一个最简单的示例，展示如何使用 LanceDB 创建向量表并执行向量搜索。此逻辑与仓库中大多数 \"Build from Scratch\" 示例的核心流程一致。\n\n### 示例：快速构建向量搜索\n\n创建一个名为 `quick_start.py` 的文件，写入以下代码：\n\n```python\nimport lancedb\nimport pandas as pd\n\n# 1. 连接数据库 (本地模式，无需服务器)\ndb = lancedb.connect(\".\u002Fdata\u002Flancedb\")\n\n# 2. 准备数据\ndata = pd.DataFrame({\n    \"vector\": [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]], \n    \"text\": [\"apple\", \"banana\", \"fruit\"],\n    \"id\": [1, 2, 3]\n})\n\n# 3. 创建表并插入数据\ntable = db.create_table(\"my_table\", data=data)\n\n# 4. 执行向量搜索\n# 搜索与 [0.9, 0.1] 最相似的向量\nquery_vector = [0.9, 0.1]\nresults = table.search(query_vector).limit(2).to_pandas()\n\nprint(results)\n```\n\n运行脚本：\n\n```bash\npython quick_start.py\n```\n\n### 运行仓库中的示例\n\n您可以直接运行仓库中提供的成熟示例。例如，运行一个简单的 PDF RAG 示例：\n\n1.  进入示例目录：\n    ```bash\n    cd examples\u002FRAG-On-PDF\n    ```\n2.  安装该示例的特定依赖：\n    ```bash\n    pip install -r requirements.txt -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n    ```\n3.  运行 Notebook 或 Python 脚本：\n    ```bash\n    # 如果在本地运行\n    jupyter notebook main.ipynb\n    \n    # 或者直接运行 python 脚本 (如果存在)\n    python main.py\n    ```\n\n通过以上步骤，您即可利用 `vectordb-recipes` 快速启动您的 GenAI 项目原型。更多高级用法（如多模态搜索、Agent 构建）请参考仓库中对应的 `examples` 和 `tutorials` 目录。","某初创金融科技团队急需构建一个能理解财报图表并回答复杂查询的智能投研助手，但面临技术栈复杂和开发周期长的挑战。\n\n### 没有 vectordb-recipes 时\n- **环境配置繁琐**：团队需花费数天时间搭建和运维传统的向量数据库服务器，处理复杂的依赖冲突和网络配置，迟迟无法进入核心代码开发。\n- **多模态开发门槛高**：想要实现“图文混合检索”（如搜索包含特定趋势图的财报段落），缺乏现成的多模态嵌入与检索范例，算法工程师需从零摸索架构。\n- **数据链路割裂**：现有的 Pandas 数据分析流程无法直接对接向量存储，必须编写大量冗余代码进行格式转换，导致数据流转效率低下且易出错。\n- **原型验证缓慢**：从想法到可演示的 PoC（概念验证）通常需要数周时间，难以快速向投资人展示基于本地大模型（如 Llama3）的 RAG 应用效果。\n\n### 使用 vectordb-recipes 后\n- **零配置即时启动**：利用 LanceDB 的无服务器特性，团队无需任何基础设施设置，直接在现有 Python 环境中通过几行代码即可开启向量搜索功能。\n- **多模态案例复用**：直接参考库中的\"Multimodal\"和\"V-JEPA Video Search\"示例，快速复用了图文联合检索逻辑，几天内便实现了财报图表的精准定位。\n- **生态无缝集成**：借助其与 Pandas、Pydantic 的原生集成能力，数据科学家直接在原有分析管道中嵌入向量检索，消除了数据格式转换的中间环节。\n- **极速构建原型**：基于\"Build from Scratch\"系列教程，团队在几分钟内就搭建出支持本地大模型的 RAG 应用框架，将产品验证周期从数周缩短至数小时。\n\nvectordb-recipes 通过提供开箱即用的代码范例和无服务器架构，让开发者能跳过繁琐的基础设施搭建，专注于将创意迅速转化为落地的 GenAI 应用。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Flancedb_vectordb-recipes_aca9b696.gif","lancedb","LanceDB","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Flancedb_d06d20f5.png","",null,"contact@lancedb.com","https:\u002F\u002Flancedb.com\u002F","https:\u002F\u002Fgithub.com\u002Flancedb",[82,86,90,93,97,101],{"name":83,"color":84,"percentage":85},"Jupyter Notebook","#DA5B0B",92.6,{"name":87,"color":88,"percentage":89},"Python","#3572A5",3.7,{"name":91,"color":92,"percentage":10},"JavaScript","#f1e05a",{"name":94,"color":95,"percentage":96},"TypeScript","#3178c6",0.3,{"name":98,"color":99,"percentage":100},"CSS","#663399",0.2,{"name":102,"color":103,"percentage":100},"HTML","#e34c26",940,166,"2026-04-04T10:22:00","Apache-2.0",1,"未说明",{"notes":111,"python":109,"dependencies":112},"该项目是基于 LanceDB 的示例和教程集合，强调无需设置即可使用。支持 Python 数据生态（如 pandas, arrow, pydantic）及 TypeScript SDK。部分示例提供 Google Colab 或 Kaggle 环境可直接运行，具体依赖视各个子项目（如 RAG、多模态搜索等）而定，需参考对应目录下的具体要求。",[73,113,114,115],"pandas","pyarrow","pydantic",[16,15,14,117,35,13],"其他",[119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,73],"ai","deep-learning","llms","machine-learning","multimodal","agents","embeddings","fine-tuning","gpt","langchain","llama-index","openai","rag","vector-database","gpt-4-vision","multimodal-ai","2026-03-27T02:49:30.150509","2026-04-11T18:33:33.063589",[138,143,148,153,158],{"id":139,"question_zh":140,"answer_zh":141,"source_url":142},22562,"运行应用时遇到 422 错误（Field required: event_id）怎么办？","这通常是由于系统环境或 Python 版本不兼容导致的。建议尝试以下解决方案：\n1. 确保使用 Python 3.10 版本。\n2. 创建一个新的虚拟环境并重新安装依赖。\n3. 如果是在 macOS Sonoma (M3 芯片) 上遇到此问题，请检查是否有特定的架构兼容性问题，尝试在干净的 Python 3.10 环境中重试。","https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fissues\u002F88",{"id":144,"question_zh":145,"answer_zh":146,"source_url":147},22563,"无法在 GitHub 上预览 Notebook 文件（显示渲染错误或空白）如何解决？","这是由于 GitHub 对包含不完整 widget 的 Notebook 渲染支持有限导致的。您可以尝试以下几种方法查看内容：\n1. 直接运行仓库 README 文件中提供的对应 Colab 笔记本链接。\n2. 使用变通方法：将 URL 中的 `github` 替换为 `nbsanity`。例如：将 `https:\u002F\u002Fgithub.com\u002F...` 改为 `https:\u002F\u002Fnbsanity.com\u002F...` 即可正常预览。\n3. 维护者已修复部分因 widget 不完整导致的问题，如果是特定文件可重新刷新查看。","https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fissues\u002F329",{"id":149,"question_zh":150,"answer_zh":151,"source_url":152},22564,"写入数据时报错 \"Invalid argument error: Dictionary replacement detected...\" 是什么原因？","该错误表明 Arrow IPC 文件格式不支持同一字段在不同批次中存在多个字典映射。解决方法是确保在写入数据库前对数据进行**扁平化（flatten）**处理，保证数据结构的一致性，避免嵌套或复杂的字典结构导致批次间字典冲突。","https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fissues\u002F35",{"id":154,"question_zh":155,"answer_zh":156,"source_url":157},22565,"运行 CrewAI 示例时代理调用不存在的同事（如 \"Database Manager\"）或报错缺少参数怎么办？","这通常是因为博客文章中的代码示例不完整或本地环境与示例环境不一致导致的。建议：\n1. 不要直接复制博客代码，而是运行仓库中提供的完整 Colab Notebook（例如：`examples\u002FAI-Trends-with-CrewAI\u002FCrewAI_AI_Trends.ipynb`），该版本已包含正确的 Tasks 声明和代理配置。\n2. 检查工具输入参数，确保 `Delegate work to co-worker` 等动作包含了所有必需参数（coworker, task, context）。\n3. 如果使用 SerperSearch 或其他返回 JSON 的工具，注意数据格式是否符合 LanceDB 的预期，避免被误判为数据库更新操作。","https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fissues\u002F184",{"id":159,"question_zh":160,"answer_zh":161,"source_url":162},22566,"如何避免 OpenAI 库版本冲突导致的警告或错误？","建议在项目中明确固定（Pin）OpenAI 库的版本，以避免因自动升级导致的 API 不兼容或警告。可以在 `requirements.txt` 或安装命令中指定具体版本号（例如 `openai==x.x.x`），并确保修复那些可以避免的兼容性警告。","https:\u002F\u002Fgithub.com\u002Flancedb\u002Fvectordb-recipes\u002Fissues\u002F116",[]]