[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-RunanywhereAI--runanywhere-sdks":3,"tool-RunanywhereAI--runanywhere-sdks":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",150720,2,"2026-04-11T11:33:10",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",108322,"2026-04-10T11:39:34",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},6121,"gemini-cli","google-gemini\u002Fgemini-cli","gemini-cli 是一款由谷歌推出的开源 AI 命令行工具，它将强大的 Gemini 大模型能力直接集成到用户的终端环境中。对于习惯在命令行工作的开发者而言，它提供了一条从输入提示词到获取模型响应的最短路径，无需切换窗口即可享受智能辅助。\n\n这款工具主要解决了开发过程中频繁上下文切换的痛点，让用户能在熟悉的终端界面内直接完成代码理解、生成、调试以及自动化运维任务。无论是查询大型代码库、根据草图生成应用，还是执行复杂的 Git 操作，gemini-cli 都能通过自然语言指令高效处理。\n\n它特别适合广大软件工程师、DevOps 人员及技术研究人员使用。其核心亮点包括支持高达 100 万 token 的超长上下文窗口，具备出色的逻辑推理能力；内置 Google 搜索、文件操作及 Shell 命令执行等实用工具；更独特的是，它支持 MCP（模型上下文协议），允许用户灵活扩展自定义集成，连接如图像生成等外部能力。此外，个人谷歌账号即可享受免费的额度支持，且项目基于 Apache 2.0 
协议完全开源，是提升终端工作效率的理想助手。",100752,"2026-04-10T01:20:03",[52,13,15,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":32,"last_commit_at":59,"category_tags":60,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":66,"readme_en":67,"readme_zh":68,"quickstart_zh":69,"use_case_zh":70,"hero_image_url":71,"owner_login":72,"owner_name":73,"owner_avatar_url":74,"owner_bio":75,"owner_company":76,"owner_location":76,"owner_email":76,"owner_twitter":77,"owner_website":78,"owner_url":79,"languages":80,"stars":120,"forks":121,"last_commit_at":122,"license":123,"difficulty_score":32,"env_os":124,"env_gpu":125,"env_ram":126,"env_deps":127,"category_tags":138,"github_topics":141,"view_count":32,"oss_zip_url":76,"oss_zip_packed_at":76,"status":17,"created_at":162,"updated_at":163,"faqs":164,"releases":194},6601,"RunanywhereAI\u002Frunanywhere-sdks","runanywhere-sdks","Production ready toolkit to run AI locally","RunAnywhere 是一套专为多平台设计的开源开发工具包，旨在帮助开发者轻松将人工智能功能直接集成到应用程序中，并完全在用户设备上运行。它支持大型语言模型对话、语音转文字、文字转语音以及图像生成等核心能力，无需依赖云端服务器。\n\n这一方案有效解决了传统 AI 应用面临的数据隐私泄露、网络延迟高以及离线无法使用等痛点。由于所有计算均在本地完成，用户数据永不离开设备，既保障了隐私安全，又实现了极速响应。\n\nRunAnywhere 主要面向移动应用和前端开发者，无论是使用 Swift、Kotlin 构建原生应用，还是通过 React Native、Flutter 或 Web 技术进行跨平台开发，都能快速接入。其独特的技术亮点在于提供了“生产级”的本地推理引擎，内置了从语音识别到大模型思考再到语音合成的完整流水线，并兼容 Llama、Mistral、Whisper 等多种主流开源模型。借助 
RunAnywhere，开发者可以低成本打造出私密、离线且高性能的智能应用，让用户随时随地享受安全的 AI 服务。","\u003Cp align=\"center\">\n  \u003Cimg src=\"examples\u002Flogo.svg\" alt=\"RunAnywhere Logo\" width=\"140\"\u002F>\n\u003C\u002Fp>\n\n\u003Ch1 align=\"center\">RunAnywhere\u003C\u002Fh1>\n\n\u003Cp align=\"center\">\n  \u003Cstrong>On-device AI for every platform.\u003C\u002Fstrong>\u003Cbr\u002F>\n  Run LLMs, speech-to-text, and text-to-speech locally — private, offline, fast.\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fus\u002Fapp\u002Frunanywhere\u002Fid6756506307\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FApp_Store-Download-0D96F6?style=for-the-badge&logo=apple&logoColor=white\" alt=\"Download on App Store\" \u002F>\n  \u003C\u002Fa>\n  &nbsp;\n  \u003Ca href=\"https:\u002F\u002Fplay.google.com\u002Fstore\u002Fapps\u002Fdetails?id=com.runanywhere.runanywhereai\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGoogle_Play-Download-34A853?style=for-the-badge&logo=google-play&logoColor=white\" alt=\"Get it on Google Play\" \u002F>\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fstargazers\">\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002FRunanywhereAI\u002Frunanywhere-sdks?style=flat-square\" alt=\"GitHub Stars\" \u002F>\u003C\u002Fa>\n  \u003Ca href=\"LICENSE\">\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLicense-Apache%202.0-blue?style=flat-square\" alt=\"License\" \u002F>\u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fdiscord.gg\u002FN359FBbDVd\">\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Join-5865F2?style=flat-square&logo=discord&logoColor=white\" alt=\"Discord\" \u002F>\u003C\u002Fa>\n\u003C\u002Fp>\n\n## See It In Action\n\n\u003Cdiv align=\"center\">\n\u003Ctable>\n  \u003Ctr>\n    \u003Ctd 
align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_a1b11fcd74c9.gif\" alt=\"Text Generation\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>Text Generation\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>LLM inference — 100% on-device\u003C\u002Fsub>\n    \u003C\u002Ftd>\n    \u003Ctd width=\"40\">\u003C\u002Ftd>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_db82d7906e5e.gif\" alt=\"Voice AI\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>Voice AI\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>STT → LLM → TTS pipeline — fully offline\u003C\u002Fsub>\n    \u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\u003Ctd colspan=\"3\" height=\"30\">\u003C\u002Ftd>\u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_80711eba4fcc.gif\" alt=\"Image Generation\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>Image Generation\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>On-device diffusion model\u003C\u002Fsub>\n    \u003C\u002Ftd>\n    \u003Ctd width=\"40\">\u003C\u002Ftd>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_e1ce34c8772c.gif\" alt=\"Visual Language Model\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>Visual Language Model\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>Vision + language understanding on-device\u003C\u002Fsub>\n    \u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\u003C\u002Fdiv>\n\n---\n\n## What is RunAnywhere?\n\nRunAnywhere lets you add AI features to your app that run 
entirely on-device:\n\n- **LLM Chat** — Llama, Mistral, Qwen, SmolLM, and more\n- **Speech-to-Text** — Whisper-powered transcription\n- **Text-to-Speech** — Neural voice synthesis\n- **Voice Assistant** — Full STT → LLM → TTS pipeline\n\nNo cloud. No latency. No data leaves the device.\n\n---\n\n## SDKs\n\n| Platform | Status | Installation | Documentation |\n|----------|--------|--------------|---------------|\n| **Swift** (iOS\u002FmacOS) | Stable | [Swift Package Manager](#swift-ios--macos) | [docs.runanywhere.ai\u002Fswift](https:\u002F\u002Fdocs.runanywhere.ai\u002Fswift\u002Fintroduction) |\n| **Kotlin** (Android) | Stable | [Gradle](#kotlin-android) | [docs.runanywhere.ai\u002Fkotlin](https:\u002F\u002Fdocs.runanywhere.ai\u002Fkotlin\u002Fintroduction) |\n| **Web** (Browser) | Beta | [npm](#web-browser) | [SDK README](sdk\u002Frunanywhere-web\u002F) |\n| **React Native** | Beta | [npm](#react-native) | [docs.runanywhere.ai\u002Freact-native](https:\u002F\u002Fdocs.runanywhere.ai\u002Freact-native\u002Fintroduction) |\n| **Flutter** | Beta | [pub.dev](#flutter) | [docs.runanywhere.ai\u002Fflutter](https:\u002F\u002Fdocs.runanywhere.ai\u002Fflutter\u002Fintroduction) |\n\n---\n\n## Quick Start\n\n### Swift (iOS \u002F macOS)\n\n```swift\nimport RunAnywhere\nimport LlamaCPPRuntime\n\n\u002F\u002F 1. Initialize\nLlamaCPP.register()\ntry RunAnywhere.initialize()\n\n\u002F\u002F 2. Load a model\ntry await RunAnywhere.downloadModel(\"smollm2-360m\")\ntry await RunAnywhere.loadModel(\"smollm2-360m\")\n\n\u002F\u002F 3. 
Generate\nlet response = try await RunAnywhere.chat(\"What is the capital of France?\")\nprint(response) \u002F\u002F \"Paris is the capital of France.\"\n```\n\n**Install via Swift Package Manager:**\n\n```\nhttps:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\n```\n\n[Full documentation →](https:\u002F\u002Fdocs.runanywhere.ai\u002Fswift\u002Fintroduction) · [Source code](sdk\u002Frunanywhere-swift\u002F)\n\n---\n\n### Kotlin (Android)\n\n```kotlin\nimport com.runanywhere.sdk.public.RunAnywhere\nimport com.runanywhere.sdk.public.extensions.*\n\n\u002F\u002F 1. Initialize\nLlamaCPP.register()\nRunAnywhere.initialize(environment = SDKEnvironment.DEVELOPMENT)\n\n\u002F\u002F 2. Load a model\nRunAnywhere.downloadModel(\"smollm2-360m\").collect { println(\"${it.progress * 100}%\") }\nRunAnywhere.loadLLMModel(\"smollm2-360m\")\n\n\u002F\u002F 3. Generate\nval response = RunAnywhere.chat(\"What is the capital of France?\")\nprintln(response) \u002F\u002F \"Paris is the capital of France.\"\n```\n\n**Install via Gradle:**\n\n```kotlin\ndependencies {\n    implementation(\"com.runanywhere.sdk:runanywhere-kotlin:0.16.1\")\n    implementation(\"com.runanywhere.sdk:runanywhere-core-llamacpp:0.16.1\")\n}\n```\n\n[Full documentation →](https:\u002F\u002Fdocs.runanywhere.ai\u002Fkotlin\u002Fintroduction) · [Source code](sdk\u002Frunanywhere-kotlin\u002F)\n\n---\n\n### React Native\n\n```typescript\nimport { RunAnywhere, SDKEnvironment } from '@runanywhere\u002Fcore';\nimport { LlamaCPP } from '@runanywhere\u002Fllamacpp';\n\n\u002F\u002F 1. Initialize\nawait RunAnywhere.initialize({ environment: SDKEnvironment.Development });\nLlamaCPP.register();\n\n\u002F\u002F 2. Load a model\nawait RunAnywhere.downloadModel('smollm2-360m');\nawait RunAnywhere.loadModel('smollm2-360m');\n\n\u002F\u002F 3. 
Generate\nconst response = await RunAnywhere.chat('What is the capital of France?');\nconsole.log(response); \u002F\u002F \"Paris is the capital of France.\"\n```\n\n**Install via npm:**\n\n```bash\nnpm install @runanywhere\u002Fcore @runanywhere\u002Fllamacpp\n```\n\n[Full documentation →](https:\u002F\u002Fdocs.runanywhere.ai\u002Freact-native\u002Fintroduction) · [Source code](sdk\u002Frunanywhere-react-native\u002F)\n\n---\n\n### Flutter\n\n```dart\nimport 'package:runanywhere\u002Frunanywhere.dart';\nimport 'package:runanywhere_llamacpp\u002Frunanywhere_llamacpp.dart';\n\n\u002F\u002F 1. Initialize\nawait RunAnywhere.initialize();\nawait LlamaCpp.register();\n\n\u002F\u002F 2. Load a model\nawait RunAnywhere.downloadModel('smollm2-360m');\nawait RunAnywhere.loadModel('smollm2-360m');\n\n\u002F\u002F 3. Generate\nfinal response = await RunAnywhere.chat('What is the capital of France?');\nprint(response); \u002F\u002F \"Paris is the capital of France.\"\n```\n\n**Install via pub.dev:**\n\n```yaml\ndependencies:\n  runanywhere: ^0.16.0\n  runanywhere_llamacpp: ^0.16.0  # LLM text generation\n  # runanywhere_onnx: ^0.16.0   # Add this if you need STT, TTS, or Voice features\n```\n\n[Full documentation →](https:\u002F\u002Fdocs.runanywhere.ai\u002Fflutter\u002Fintroduction) · [Source code](sdk\u002Frunanywhere-flutter\u002F)\n\n---\n\n### Web (Browser)\n\n```typescript\nimport { RunAnywhere, TextGeneration } from '@runanywhere\u002Fweb';\n\n\u002F\u002F 1. Initialize\nawait RunAnywhere.initialize({ environment: 'development' });\n\n\u002F\u002F 2. Load a model\nawait TextGeneration.loadModel('\u002Fmodels\u002Fqwen2.5-0.5b-instruct-q4_0.gguf', 'qwen2.5-0.5b');\n\n\u002F\u002F 3. 
Generate\nconst result = await TextGeneration.generate('What is the capital of France?');\nconsole.log(result.text); \u002F\u002F \"Paris is the capital of France.\"\n```\n\n**Install via npm:**\n\n```bash\nnpm install @runanywhere\u002Fweb\n```\n\n[Full documentation →](sdk\u002Frunanywhere-web\u002F) · [Source code](sdk\u002Frunanywhere-web\u002F)\n\n---\n\n## Sample Apps\n\nFull-featured demo applications demonstrating SDK capabilities:\n\n| Platform | Source Code | Download |\n|----------|-------------|----------|\n| iOS | [examples\u002Fios\u002FRunAnywhereAI](examples\u002Fios\u002FRunAnywhereAI\u002F) | [App Store](https:\u002F\u002Fapps.apple.com\u002Fus\u002Fapp\u002Frunanywhere\u002Fid6756506307) |\n| Android | [examples\u002Fandroid\u002FRunAnywhereAI](examples\u002Fandroid\u002FRunAnywhereAI\u002F) | [Google Play](https:\u002F\u002Fplay.google.com\u002Fstore\u002Fapps\u002Fdetails?id=com.runanywhere.runanywhereai) |\n| Web | [examples\u002Fweb\u002FRunAnywhereAI](examples\u002Fweb\u002FRunAnywhereAI\u002F) | Build from source |\n| React Native | [examples\u002Freact-native\u002FRunAnywhereAI](examples\u002Freact-native\u002FRunAnywhereAI\u002F) | Build from source |\n| Flutter | [examples\u002Fflutter\u002FRunAnywhereAI](examples\u002Fflutter\u002FRunAnywhereAI\u002F) | Build from source |\n\n---\n\n## Starter Examples\n\nMinimal starter projects to get up and running with RunAnywhere on each platform:\n\n| Platform | Repository |\n|----------|------------|\n| Kotlin (Android) | [RunanywhereAI\u002Fkotlin-starter-example](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Fkotlin-starter-example) |\n| Swift (iOS) | [RunanywhereAI\u002Fswift-starter-example](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Fswift-starter-example) |\n| Flutter | [RunanywhereAI\u002Fflutter-starter-example](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Fflutter-starter-example) |\n| React Native | 
[RunanywhereAI\u002Freact-native-starter-app](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Freact-native-starter-app) |\n\n---\n\n## Playground\n\nReal-world projects built with RunAnywhere that push the boundaries of on-device AI. Each one ships as a standalone app you can build and run.\n\n### [Android Use Agent](Playground\u002Fandroid-use-agent\u002F)\n\nA fully on-device autonomous Android agent that controls your phone. Give it a goal like \"Open YouTube and search for lofi music\" and it reads the screen via the Accessibility API, reasons about the next action with an on-device LLM (Qwen3-4B), and executes taps, swipes, and text input -- all without any cloud calls. Includes a Samsung foreground boost that delivers a 15x inference speedup, smart pre-launch via Android intents, and loop detection with automatic recovery. Benchmarked across four LLM models on a Galaxy S24. **[Full benchmarks](Playground\u002Fandroid-use-agent\u002FASSESSMENT.md)**\n\n### [On-Device Browser Agent](Playground\u002Fon-device-browser-agent\u002F)\n\nA Chrome extension that automates browser tasks entirely on-device using WebLLM and WebGPU. Uses a two-agent architecture -- a Planner that breaks down goals into steps and a Navigator that interacts with page elements -- with both DOM-based and vision-based page understanding. Includes site-specific workflows for Amazon, YouTube, and more. All AI inference runs locally on your GPU after the initial model download.\n\n### [Swift Starter App](Playground\u002Fswift-starter-app\u002F)\n\nA full-featured iOS app demonstrating the RunAnywhere SDK's core AI capabilities in a clean SwiftUI interface. Includes LLM chat with on-device language models, Whisper-powered speech-to-text, neural text-to-speech, and a complete voice pipeline that chains STT, LLM, and TTS together with voice activity detection. 
A good starting point for building privacy-first AI features on iOS.\n\n### [Linux Voice Assistant](Playground\u002Flinux-voice-assistant\u002F)\n\nA complete on-device voice AI pipeline for Linux (Raspberry Pi 5, x86_64, ARM64). Say \"Hey Jarvis\" to activate, speak naturally, and get responses -- all running locally with zero cloud dependency. Chains Wake Word detection (openWakeWord), Voice Activity Detection (Silero VAD), Speech-to-Text (Whisper Tiny EN), LLM reasoning (Qwen2.5 0.5B Q4), and Text-to-Speech (Piper neural TTS) in a single C++ binary.\n\n### [OpenClaw Hybrid Assistant](Playground\u002Fopenclaw-hybrid-assistant\u002F)\n\nA hybrid voice assistant that keeps latency-sensitive components on-device (wake word, VAD, STT, TTS) while routing reasoning to a cloud LLM via OpenClaw WebSocket. Supports barge-in (interrupt TTS by saying the wake word), waiting chimes for cloud response feedback, and noise-robust VAD with burst filtering. Built for scenarios where on-device LLMs are too slow but you still want private audio processing.\n\n---\n\n## Features\n\n| Feature | iOS | Android | Web | React Native | Flutter |\n|---------|-----|---------|-----|--------------|---------|\n| LLM Text Generation | ✅ | ✅ | ✅ | ✅ | ✅ |\n| Streaming | ✅ | ✅ | ✅ | ✅ | ✅ |\n| Speech-to-Text | ✅ | ✅ | ✅ | ✅ | ✅ |\n| Text-to-Speech | ✅ | ✅ | ✅ | ✅ | ✅ |\n| Voice Assistant Pipeline | ✅ | ✅ | ✅ | ✅ | ✅ |\n| Vision Language Models | ✅ | — | ✅ | — | — |\n| Model Download + Progress | ✅ | ✅ | ✅ | ✅ | ✅ |\n| Structured Output (JSON) | ✅ | ✅ | ✅ | 🔜 | 🔜 |\n| Tool Calling | ✅ | ✅ | ✅ | — | — |\n| Embeddings | — | — | ✅ | — | — |\n| Apple Foundation Models | ✅ | — | — | — | — |\n\n---\n\n## Supported Models\n\n### LLM (GGUF format via llama.cpp)\n\n| Model | Size | RAM Required | Use Case |\n|-------|------|--------------|----------|\n| SmolLM2 360M | ~400MB | 500MB | Fast, lightweight |\n| Qwen 2.5 0.5B | ~500MB | 600MB | Multilingual |\n| Llama 3.2 1B | ~1GB | 1.2GB | Balanced |\n| 
Mistral 7B Q4 | ~4GB | 5GB | High quality |\n\n### Speech-to-Text (Whisper via ONNX)\n\n| Model | Size | Languages |\n|-------|------|-----------|\n| Whisper Tiny | ~75MB | English |\n| Whisper Base | ~150MB | Multilingual |\n\n### Text-to-Speech (Piper via ONNX)\n\n| Voice | Size | Language |\n|-------|------|----------|\n| Piper US English | ~65MB | English (US) |\n| Piper British English | ~65MB | English (UK) |\n\n---\n\n## Repository Structure\n\n```\nrunanywhere-sdks\u002F\n├── sdk\u002F\n│   ├── runanywhere-swift\u002F          # iOS\u002FmacOS SDK\n│   ├── runanywhere-kotlin\u002F         # Android SDK\n│   ├── runanywhere-web\u002F            # Web SDK (WebAssembly)\n│   ├── runanywhere-react-native\u002F   # React Native SDK\n│   ├── runanywhere-flutter\u002F        # Flutter SDK\n│   └── runanywhere-commons\u002F        # Shared C++ core\n│\n├── examples\u002F\n│   ├── ios\u002FRunAnywhereAI\u002F          # iOS sample app\n│   ├── android\u002FRunAnywhereAI\u002F      # Android sample app\n│   ├── web\u002FRunAnywhereAI\u002F          # Web sample app\n│   ├── react-native\u002FRunAnywhereAI\u002F # React Native sample app\n│   └── flutter\u002FRunAnywhereAI\u002F      # Flutter sample app\n│\n├── Playground\u002F\n│   ├── swift-starter-app\u002F          # iOS AI playground app\n│   ├── on-device-browser-agent\u002F    # Chrome browser automation agent\n│   ├── android-use-agent\u002F          # On-device autonomous Android agent\n│   ├── linux-voice-assistant\u002F      # Linux on-device voice assistant\n│   └── openclaw-hybrid-assistant\u002F  # Hybrid voice assistant (on-device + cloud)\n│\n└── docs\u002F                           # Documentation\n```\n\n---\n\n## Requirements\n\n| Platform | Minimum | Recommended |\n|----------|---------|-------------|\n| iOS | 17.0+ | 17.0+ |\n| macOS | 14.0+ | 14.0+ |\n| Android | API 24 (7.0) | API 28+ |\n| Web | Chrome 96+ \u002F Edge 96+ | Chrome 120+ |\n| React Native | 0.74+ | 0.76+ |\n| Flutter | 3.10+ | 
3.24+ |\n\n**Memory:** 2GB minimum, 4GB+ recommended for larger models\n\n---\n\n## Contributing\n\nWe welcome contributions. See our [Contributing Guide](CONTRIBUTING.md) for details.\n\n```bash\n# Clone the repo\ngit clone https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks.git\n\n# Set up a specific SDK (example: Swift)\ncd runanywhere-sdks\u002Fsdk\u002Frunanywhere-swift\n.\u002Fscripts\u002Fbuild-swift.sh --setup\n\n# Run the sample app\ncd ..\u002F..\u002Fexamples\u002Fios\u002FRunAnywhereAI\nopen RunAnywhereAI.xcodeproj\n```\n\n---\n\n## Support\n\n- **Discord:** [Join our community](https:\u002F\u002Fdiscord.gg\u002FN359FBbDVd)\n- **GitHub Issues:** [Report bugs or request features](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues)\n- **Email:** founders@runanywhere.ai\n- **Twitter:** [@RunanywhereAI](https:\u002F\u002Ftwitter.com\u002FRunanywhereAI)\n\n---\n\n## License\n\nApache 2.0 — see [LICENSE](LICENSE) for details.\n","\u003Cp align=\"center\">\n  \u003Cimg src=\"examples\u002Flogo.svg\" alt=\"RunAnywhere Logo\" width=\"140\"\u002F>\n\u003C\u002Fp>\n\n\u003Ch1 align=\"center\">RunAnywhere\u003C\u002Fh1>\n\n\u003Cp align=\"center\">\n  \u003Cstrong>为所有平台提供设备端AI。\u003C\u002Fstrong>\u003Cbr\u002F>\n  在本地运行大语言模型、语音转文本和文本转语音——私密、离线、快速。\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fus\u002Fapp\u002Frunanywhere\u002Fid6756506307\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FApp_Store-Download-0D96F6?style=for-the-badge&logo=apple&logoColor=white\" alt=\"在App Store下载\" \u002F>\n  \u003C\u002Fa>\n  &nbsp;\n  \u003Ca href=\"https:\u002F\u002Fplay.google.com\u002Fstore\u002Fapps\u002Fdetails?id=com.runanywhere.runanywhereai\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGoogle_Play-Download-34A853?style=for-the-badge&logo=google-play&logoColor=white\" alt=\"在Google Play获取\" \u002F>\n  
\u003C\u002Fa>\n\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fstargazers\">\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002FRunanywhereAI\u002Frunanywhere-sdks?style=flat-square\" alt=\"GitHub Star数\" \u002F>\u003C\u002Fa>\n  \u003Ca href=\"LICENSE\">\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLicense-Apache%202.0-blue?style=flat-square\" alt=\"许可证\" \u002F>\u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fdiscord.gg\u002FN359FBbDVd\">\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDiscord-Join-5865F2?style=flat-square&logo=discord&logoColor=white\" alt=\"Discord\" \u002F>\u003C\u002Fa>\n\u003C\u002Fp>\n\n## 实际演示\n\n\u003Cdiv align=\"center\">\n\u003Ctable>\n  \u003Ctr>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_a1b11fcd74c9.gif\" alt=\"文本生成\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>文本生成\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>LLM推理——100%在设备端\u003C\u002Fsub>\n    \u003C\u002Ftd>\n    \u003Ctd width=\"40\">\u003C\u002Ftd>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_db82d7906e5e.gif\" alt=\"语音AI\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>语音AI\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>STT → LLM → TTS流水线——完全离线\u003C\u002Fsub>\n    \u003C\u002Ftd>\n  \u003C\u002Ftr>\n  \u003Ctr>\u003Ctd colspan=\"3\" height=\"30\">\u003C\u002Ftd>\u003C\u002Ftr>\n  \u003Ctr>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_80711eba4fcc.gif\" alt=\"图像生成\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      
\u003Cstrong>图像生成\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>设备端扩散模型\u003C\u002Fsub>\n    \u003C\u002Ftd>\n    \u003Ctd width=\"40\">\u003C\u002Ftd>\n    \u003Ctd align=\"center\" width=\"50%\">\n      \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_readme_e1ce34c8772c.gif\" alt=\"视觉语言模型\" width=\"240\"\u002F>\u003Cbr\u002F>\u003Cbr\u002F>\n      \u003Cstrong>视觉语言模型\u003C\u002Fstrong>\u003Cbr\u002F>\n      \u003Csub>设备端视觉与语言理解\u003C\u002Fsub>\n    \u003C\u002Ftd>\n  \u003C\u002Ftr>\n\u003C\u002Ftable>\n\u003C\u002Fdiv>\n\n---\n\n## RunAnywhere是什么？\n\nRunAnywhere让你能够为你的应用添加完全在设备端运行的AI功能：\n\n- **LLM聊天**——Llama、Mistral、Qwen、SmolLM等\n- **语音转文本**——Whisper驱动的转录\n- **文本转语音**——神经网络语音合成\n- **语音助手**——完整的STT→LLM→TTS流水线\n\n无需云端。无延迟。数据不会离开设备。\n\n---\n\n## SDK列表\n\n| 平台 | 状态 | 安装方式 | 文档 |\n|----------|--------|--------------|---------------|\n| **Swift** (iOS\u002FmacOS) | 稳定 | [Swift Package Manager](#swift-ios--macos) | [docs.runanywhere.ai\u002Fswift](https:\u002F\u002Fdocs.runanywhere.ai\u002Fswift\u002Fintroduction) |\n| **Kotlin** (Android) | 稳定 | [Gradle](#kotlin-android) | [docs.runanywhere.ai\u002Fkotlin](https:\u002F\u002Fdocs.runanywhere.ai\u002Fkotlin\u002Fintroduction) |\n| **Web** (浏览器) | 测试版 | [npm](#web-browser) | [SDK README](sdk\u002Frunanywhere-web\u002F) |\n| **React Native** | 测试版 | [npm](#react-native) | [docs.runanywhere.ai\u002Freact-native](https:\u002F\u002Fdocs.runanywhere.ai\u002Freact-native\u002Fintroduction) |\n| **Flutter** | 测试版 | [pub.dev](#flutter) | [docs.runanywhere.ai\u002Fflutter](https:\u002F\u002Fdocs.runanywhere.ai\u002Fflutter\u002Fintroduction) |\n\n---\n\n## 快速入门\n\n### Swift (iOS \u002F macOS)\n\n```swift\nimport RunAnywhere\nimport LlamaCPPRuntime\n\n\u002F\u002F 1. 初始化\nLlamaCPP.register()\ntry RunAnywhere.initialize()\n\n\u002F\u002F 2. 加载模型\ntry await RunAnywhere.downloadModel(\"smollm2-360m\")\ntry await RunAnywhere.loadModel(\"smollm2-360m\")\n\n\u002F\u002F 3. 
生成\nlet response = try await RunAnywhere.chat(\"法国的首都是哪里？\")\nprint(response) \u002F\u002F \"巴黎是法国的首都。\"\n```\n\n**通过Swift Package Manager安装：**\n\n```\nhttps:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\n```\n\n[完整文档 →](https:\u002F\u002Fdocs.runanywhere.ai\u002Fswift\u002Fintroduction) · [源代码](sdk\u002Frunanywhere-swift\u002F)\n\n---\n\n### Kotlin (Android)\n\n```kotlin\nimport com.runanywhere.sdk.public.RunAnywhere\nimport com.runanywhere.sdk.public.extensions.*\n\n\u002F\u002F 1. 初始化\nLlamaCPP.register()\nRunAnywhere.initialize(environment = SDKEnvironment.DEVELOPMENT)\n\n\u002F\u002F 2. 加载模型\nRunAnywhere.downloadModel(\"smollm2-360m\").collect { println(\"${it.progress * 100}%\") }\nRunAnywhere.loadLLMModel(\"smollm2-360m\")\n\n\u002F\u002F 3. 生成\nval response = RunAnywhere.chat(\"法国的首都是哪里？\")\nprintln(response) \u002F\u002F \"巴黎是法国的首都。\"\n```\n\n**通过Gradle安装：**\n\n```kotlin\ndependencies {\n    implementation(\"com.runanywhere.sdk:runanywhere-kotlin:0.16.1\")\n    implementation(\"com.runanywhere.sdk:runanywhere-core-llamacpp:0.16.1\")\n}\n```\n\n[完整文档 →](https:\u002F\u002Fdocs.runanywhere.ai\u002Fkotlin\u002Fintroduction) · [源代码](sdk\u002Frunanywhere-kotlin\u002F)\n\n---\n\n### React Native\n\n```typescript\nimport { RunAnywhere, SDKEnvironment } from '@runanywhere\u002Fcore';\nimport { LlamaCPP } from '@runanywhere\u002Fllamacpp';\n\n\u002F\u002F 1. 初始化\nawait RunAnywhere.initialize({ environment: SDKEnvironment.Development });\nLlamaCPP.register();\n\n\u002F\u002F 2. 加载模型\nawait RunAnywhere.downloadModel('smollm2-360m');\nawait RunAnywhere.loadModel('smollm2-360m');\n\n\u002F\u002F 3. 
生成\nconst response = await RunAnywhere.chat('法国的首都是哪里？');\nconsole.log(response); \u002F\u002F \"巴黎是法国的首都。\"\n```\n\n**通过npm安装：**\n\n```bash\nnpm install @runanywhere\u002Fcore @runanywhere\u002Fllamacpp\n```\n\n[完整文档 →](https:\u002F\u002Fdocs.runanywhere.ai\u002Freact-native\u002Fintroduction) · [源代码](sdk\u002Frunanywhere-react-native\u002F)\n\n---\n\n### Flutter\n\n```dart\nimport 'package:runanywhere\u002Frunanywhere.dart';\nimport 'package:runanywhere_llamacpp\u002Frunanywhere_llamacpp.dart';\n\n\u002F\u002F 1. 初始化\nawait RunAnywhere.initialize();\nawait LlamaCpp.register();\n\n\u002F\u002F 2. 加载模型\nawait RunAnywhere.downloadModel('smollm2-360m');\nawait RunAnywhere.loadModel('smollm2-360m');\n\n\u002F\u002F 3. 生成\nfinal response = await RunAnywhere.chat('法国的首都是哪里？');\nprint(response); \u002F\u002F \"巴黎是法国的首都。\"\n```\n\n**通过pub.dev安装：**\n\n```yaml\ndependencies:\n  runanywhere: ^0.16.0\n  runanywhere_llamacpp: ^0.16.0  # LLM文本生成\n  # runanywhere_onnx: ^0.16.0   # 如果需要STT、TTS或语音功能，请添加此依赖\n```\n\n[完整文档 →](https:\u002F\u002Fdocs.runanywhere.ai\u002Fflutter\u002Fintroduction) · [源代码](sdk\u002Frunanywhere-flutter\u002F)\n\n---\n\n### 网页（浏览器）\n\n```typescript\nimport { RunAnywhere, TextGeneration } from '@runanywhere\u002Fweb';\n\n\u002F\u002F 1. 初始化\nawait RunAnywhere.initialize({ environment: 'development' });\n\n\u002F\u002F 2. 加载模型\nawait TextGeneration.loadModel('\u002Fmodels\u002Fqwen2.5-0.5b-instruct-q4_0.gguf', 'qwen2.5-0.5b');\n\n\u002F\u002F 3. 
生成\nconst result = await TextGeneration.generate('法国的首都是哪里？');\nconsole.log(result.text); \u002F\u002F \"巴黎是法国的首都。\"\n```\n\n**通过 npm 安装：**\n\n```bash\nnpm install @runanywhere\u002Fweb\n```\n\n[完整文档 →](sdk\u002Frunanywhere-web\u002F) · [源代码](sdk\u002Frunanywhere-web\u002F)\n\n---\n\n## 示例应用\n\n展示 SDK 功能的全功能演示应用：\n\n| 平台 | 源代码 | 下载 |\n|----------|-------------|----------|\n| iOS | [examples\u002Fios\u002FRunAnywhereAI](examples\u002Fios\u002FRunAnywhereAI\u002F) | [App Store](https:\u002F\u002Fapps.apple.com\u002Fus\u002Fapp\u002Frunanywhere\u002Fid6756506307) |\n| Android | [examples\u002Fandroid\u002FRunAnywhereAI](examples\u002Fandroid\u002FRunAnywhereAI\u002F) | [Google Play](https:\u002F\u002Fplay.google.com\u002Fstore\u002Fapps\u002Fdetails?id=com.runanywhere.runanywhereai) |\n| Web | [examples\u002Fweb\u002FRunAnywhereAI](examples\u002Fweb\u002FRunAnywhereAI\u002F) | 从源码构建 |\n| React Native | [examples\u002Freact-native\u002FRunAnywhereAI](examples\u002Freact-native\u002FRunAnywhereAI\u002F) | 从源码构建 |\n| Flutter | [examples\u002Fflutter\u002FRunAnywhereAI](examples\u002Fflutter\u002FRunAnywhereAI\u002F) | 从源码构建 |\n\n---\n\n## 入门示例\n\n在各个平台上快速上手 RunAnywhere 的最小化入门项目：\n\n| 平台 | 仓库 |\n|----------|------------|\n| Kotlin (Android) | [RunanywhereAI\u002Fkotlin-starter-example](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Fkotlin-starter-example) |\n| Swift (iOS) | [RunanywhereAI\u002Fswift-starter-example](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Fswift-starter-example) |\n| Flutter | [RunanywhereAI\u002Fflutter-starter-example](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Fflutter-starter-example) |\n| React Native | [RunanywhereAI\u002Freact-native-starter-app](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Freact-native-starter-app) |\n\n---\n\n## 实验室\n\n使用 RunAnywhere 构建的、推动设备端 AI 边界的现实世界项目。每个项目都以可构建和运行的独立应用形式提供。\n\n### [Android Use Agent](Playground\u002Fandroid-use-agent\u002F)\n\n一个完全在设备上运行的自主 Android 
助手，能够控制你的手机。只需给它一个目标，比如“打开 YouTube 并搜索 LoFi 音乐”，它就会通过无障碍 API 读取屏幕内容，利用设备上的 LLM（Qwen3-4B）推理下一步操作，并执行点击、滑动和文本输入——全程无需任何云端调用。内置三星前台加速功能，可将推理速度提升 15 倍；支持基于 Android Intent 的智能预启动；并具备循环检测与自动恢复功能。已在 Galaxy S24 上对四种 LLM 模型进行了基准测试。**[完整基准测试](Playground\u002Fandroid-use-agent\u002FASSESSMENT.md)**\n\n### [设备端浏览器助手](Playground\u002Fon-device-browser-agent\u002F)\n\n一款 Chrome 扩展程序，完全在设备上使用 WebLLM 和 WebGPU 自动化浏览器任务。采用双代理架构——规划器负责将目标分解为步骤，导航器则与页面元素交互——同时支持基于 DOM 和视觉的页面理解。包含针对亚马逊、YouTube 等网站的特定工作流。所有 AI 推理在初始模型下载后均在本地 GPU 上运行。\n\n### [Swift 入门应用](Playground\u002Fswift-starter-app\u002F)\n\n一款功能齐全的 iOS 应用，以简洁的 SwiftUI 界面展示了 RunAnywhere SDK 的核心 AI 能力。包括使用设备端语言模型进行 LLM 对话、基于 Whisper 的语音转文字、神经网络文语转换，以及将 STT、LLM 和 TTS 通过语音活动检测串联起来的完整语音流水线。是开发 iOS 上隐私优先 AI 功能的良好起点。\n\n### [Linux 语音助手](Playground\u002Flinux-voice-assistant\u002F)\n\n一套完整的 Linux 设备端语音 AI 流水线（适用于 Raspberry Pi 5、x86_64 和 ARM64）。说出“Hey Jarvis”即可激活，自然对话并获得响应——全程本地运行，零云端依赖。将唤醒词检测（openWakeWord）、语音活动检测（Silero VAD）、语音转文字（Whisper Tiny EN）、LLM 推理（Qwen2.5 0.5B Q4）以及文语转换（Piper 神经网络 TTS）整合进一个 C++ 可执行文件中。\n\n### [OpenClaw 混合助手](Playground\u002Fopenclaw-hybrid-assistant\u002F)\n\n一种混合语音助手，将延迟敏感的组件保留在设备上（唤醒词、VAD、STT、TTS），而将推理部分通过 OpenClaw WebSocket 路由到云端 LLM。支持打断式交互（可通过说出唤醒词中断 TTS）、等待提示音以反馈云端响应，以及具有脉冲滤波功能的抗噪 VAD。专为设备端 LLM 太慢但又希望进行私密音频处理的场景设计。\n\n---\n\n## 功能特性\n\n| 功能 | iOS | Android | Web | React Native | Flutter |\n|---------|-----|---------|-----|--------------|---------|\n| LLM 文本生成 | ✅ | ✅ | ✅ | ✅ | ✅ |\n| 流式输出 | ✅ | ✅ | ✅ | ✅ | ✅ |\n| 语音转文字 | ✅ | ✅ | ✅ | ✅ | ✅ |\n| 文字转语音 | ✅ | ✅ | ✅ | ✅ | ✅ |\n| 语音助手流水线 | ✅ | ✅ | ✅ | ✅ | ✅ |\n| 视觉语言模型 | ✅ | — | ✅ | — | — |\n| 模型下载 + 进度显示 | ✅ | ✅ | ✅ | ✅ | ✅ |\n| 结构化输出（JSON） | ✅ | ✅ | ✅ | 🔜 | 🔜 |\n| 工具调用 | ✅ | ✅ | ✅ | — | — |\n| 嵌入表示 | — | — | ✅ | — | — |\n| Apple 基础模型 | ✅ | — | — | — | — |\n\n---\n\n## 支持的模型\n\n### LLM（GGUF 格式，通过 llama.cpp 加载）\n\n| 模型 | 大小 | 所需内存 | 使用场景 |\n|-------|------|--------------|----------|\n| SmolLM2 360M | ~400MB | 500MB | 快速轻量级 |\n| Qwen 2.5 0.5B | ~500MB | 600MB | 
多语言支持 |\n| Llama 3.2 1B | ~1GB | 1.2GB | 平衡性能 |\n| Mistral 7B Q4 | ~4GB | 5GB | 高质量 |\n\n### 语音转文字（Whisper，通过 ONNX 加载）\n\n| 模型 | 大小 | 语言 |\n|-------|------|-----------|\n| Whisper Tiny | ~75MB | 英语 |\n| Whisper Base | ~150MB | 多语言 |\n\n### 文字转语音（Piper，通过 ONNX 加载）\n\n| 声音 | 大小 | 语言 |\n|-------|------|----------|\n| Piper 美式英语 | ~65MB | 英语（美） |\n| Piper 英式英语 | ~65MB | 英语（英） |\n\n---\n\n## 仓库结构\n\n```\nrunanywhere-sdks\u002F\n├── sdk\u002F\n│   ├── runanywhere-swift\u002F          # iOS\u002FmacOS SDK\n│   ├── runanywhere-kotlin\u002F         # Android SDK\n│   ├── runanywhere-web\u002F            # Web SDK (WebAssembly)\n│   ├── runanywhere-react-native\u002F   # React Native SDK\n│   ├── runanywhere-flutter\u002F        # Flutter SDK\n│   └── runanywhere-commons\u002F        # 共享 C++ 核心\n│\n├── examples\u002F\n│   ├── ios\u002FRunAnywhereAI\u002F          # iOS 示例应用\n│   ├── android\u002FRunAnywhereAI\u002F      # 安卓示例应用\n│   ├── web\u002FRunAnywhereAI\u002F          # Web 示例应用\n│   ├── react-native\u002FRunAnywhereAI\u002F # React Native 示例应用\n│   └── flutter\u002FRunAnywhereAI\u002F      # Flutter 示例应用\n│\n├── Playground\u002F\n│   ├── swift-starter-app\u002F          # iOS AI 演示应用\n│   ├── on-device-browser-agent\u002F    # Chrome 浏览器自动化代理\n│   ├── android-use-agent\u002F          # 设备端自主安卓代理\n│   ├── linux-voice-assistant\u002F      # Linux 设备端语音助手\n│   └── openclaw-hybrid-assistant\u002F  # 混合语音助手（设备端 + 云端）\n│\n└── docs\u002F                           # 文档\n```\n\n---\n\n## 系统要求\n\n| 平台     | 最低版本 | 推荐版本 |\n|----------|----------|----------|\n| iOS      | 17.0+    | 17.0+    |\n| macOS    | 14.0+    | 14.0+    |\n| Android  | API 24 (7.0) | API 28+  |\n| Web      | Chrome 96+ \u002F Edge 96+ | Chrome 120+ |\n| React Native | 0.74+    | 0.76+    |\n| Flutter  | 3.10+    | 3.24+    |\n\n**内存：** 最低 2GB，运行大型模型建议 4GB 以上\n\n---\n\n## 贡献\n\n我们欢迎各类贡献。详情请参阅我们的 [贡献指南](CONTRIBUTING.md)。\n\n```bash\n# 克隆仓库\ngit clone 
https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks.git\n\n# 设置特定 SDK（以 Swift 为例）\ncd runanywhere-sdks\u002Fsdk\u002Frunanywhere-swift\n.\u002Fscripts\u002Fbuild-swift.sh --setup\n\n# 运行示例应用\ncd ..\u002F..\u002Fexamples\u002Fios\u002FRunAnywhereAI\nopen RunAnywhereAI.xcodeproj\n```\n\n---\n\n## 支持\n\n- **Discord：** [加入我们的社区](https:\u002F\u002Fdiscord.gg\u002FN359FBbDVd)\n- **GitHub Issues：** [报告问题或请求功能](https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues)\n- **邮箱：** founders@runanywhere.ai\n- **Twitter：** [@RunanywhereAI](https:\u002F\u002Ftwitter.com\u002FRunanywhereAI)\n\n---\n\n## 许可证\n\nApache 2.0 — 详情请参阅 [LICENSE](LICENSE) 文件。","# RunAnywhere SDKs 快速上手指南\n\nRunAnywhere 是一个致力于在**设备端（On-device）**运行 AI 模型的开源框架。它支持在大语言模型（LLM）、语音转文字（STT）、文字转语音（TTS）及视觉语言模型等场景下，实现完全离线、低延迟且保护隐私的 AI 功能。\n\n## 环境准备\n\n在开始之前，请确保您的开发环境满足以下基本要求：\n\n*   **操作系统**：\n    *   **iOS\u002FmacOS**: macOS 13.0+ (Swift)\n    *   **Android**: Android 8.0 (API 26)+ (Kotlin)\n    *   **Web**: 现代浏览器 (支持 WebGPU\u002FWebAssembly)\n    *   **跨平台**: Flutter 3.x+, React Native 0.70+\n*   **前置依赖**：\n    *   Xcode (iOS\u002FmacOS)\n    *   Android Studio & Gradle (Android)\n    *   Node.js & npm (Web\u002FReact Native)\n    *   Dart & Flutter SDK (Flutter)\n*   **网络说明**：首次运行需下载模型文件（GGUF 格式），建议确保网络连接通畅。国内开发者若遇到模型下载缓慢，可考虑配置代理或使用支持断点续传的网络环境。\n\n---\n\n## 安装步骤\n\n根据您的目标平台，选择对应的安装方式：\n\n### 1. Swift (iOS \u002F macOS)\n通过 **Swift Package Manager** 添加依赖：\n*   Repository URL: `https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks`\n\n### 2. Kotlin (Android)\n在 `build.gradle` (Module level) 的 `dependencies` 块中添加：\n```kotlin\ndependencies {\n    implementation(\"com.runanywhere.sdk:runanywhere-kotlin:0.16.1\")\n    implementation(\"com.runanywhere.sdk:runanywhere-core-llamacpp:0.16.1\")\n}\n```\n\n### 3. React Native\n使用 npm 安装核心包和 LlamaCPP 运行时：\n```bash\nnpm install @runanywhere\u002Fcore @runanywhere\u002Fllamacpp\n```\n\n### 4. 
Flutter\n在 `pubspec.yaml` 的 `dependencies` 中添加：\n```yaml\ndependencies:\n  runanywhere: ^0.16.0\n  runanywhere_llamacpp: ^0.16.0  # 用于 LLM 文本生成\n  # runanywhere_onnx: ^0.16.0   # 如需 STT\u002FTTS 功能请取消注释\n```\n然后运行 `flutter pub get`。\n\n### 5. Web (Browser)\n使用 npm 安装 Web 专用包：\n```bash\nnpm install @runanywhere\u002Fweb\n```\n\n---\n\n## 基本使用\n\n以下示例展示了如何初始化 SDK、下载并加载模型（以 `smollm2-360m` 为例），以及进行简单的对话生成。\n\n### Swift 示例\n```swift\nimport RunAnywhere\nimport LlamaCPPRuntime\n\n\u002F\u002F 1. 初始化\nLlamaCPP.register()\ntry RunAnywhere.initialize()\n\n\u002F\u002F 2. 下载并加载模型\ntry await RunAnywhere.downloadModel(\"smollm2-360m\")\ntry await RunAnywhere.loadModel(\"smollm2-360m\")\n\n\u002F\u002F 3. 生成回复\nlet response = try await RunAnywhere.chat(\"What is the capital of France?\")\nprint(response) \u002F\u002F 输出：\"Paris is the capital of France.\"\n```\n\n### Kotlin (Android) 示例\n```kotlin\nimport com.runanywhere.sdk.public.RunAnywhere\nimport com.runanywhere.sdk.public.extensions.*\n\n\u002F\u002F 1. 初始化\nLlamaCPP.register()\nRunAnywhere.initialize(environment = SDKEnvironment.DEVELOPMENT)\n\n\u002F\u002F 2. 下载并加载模型 (带进度监听)\nRunAnywhere.downloadModel(\"smollm2-360m\").collect { println(\"${it.progress * 100}%\") }\nRunAnywhere.loadLLMModel(\"smollm2-360m\")\n\n\u002F\u002F 3. 生成回复\nval response = RunAnywhere.chat(\"What is the capital of France?\")\nprintln(response) \u002F\u002F 输出：\"Paris is the capital of France.\"\n```\n\n### React Native 示例\n```typescript\nimport { RunAnywhere, SDKEnvironment } from '@runanywhere\u002Fcore';\nimport { LlamaCPP } from '@runanywhere\u002Fllamacpp';\n\nasync function runDemo() {\n  \u002F\u002F 1. 初始化\n  await RunAnywhere.initialize({ environment: SDKEnvironment.Development });\n  LlamaCPP.register();\n\n  \u002F\u002F 2. 下载并加载模型\n  await RunAnywhere.downloadModel('smollm2-360m');\n  await RunAnywhere.loadModel('smollm2-360m');\n\n  \u002F\u002F 3. 
生成回复\n  const response = await RunAnywhere.chat('What is the capital of France?');\n  console.log(response); \u002F\u002F 输出：\"Paris is the capital of France.\"\n}\n```\n\n### Flutter 示例\n```dart\nimport 'package:runanywhere\u002Frunanywhere.dart';\nimport 'package:runanywhere_llamacpp\u002Frunanywhere_llamacpp.dart';\n\nvoid main() async {\n  \u002F\u002F 1. 初始化\n  await RunAnywhere.initialize();\n  await LlamaCpp.register();\n\n  \u002F\u002F 2. 下载并加载模型\n  await RunAnywhere.downloadModel('smollm2-360m');\n  await RunAnywhere.loadModel('smollm2-360m');\n\n  \u002F\u002F 3. 生成回复\n  final response = await RunAnywhere.chat('What is the capital of France?');\n  print(response); \u002F\u002F 输出：\"Paris is the capital of France.\"\n}\n```\n\n### Web 示例\n```typescript\nimport { RunAnywhere, TextGeneration } from '@runanywhere\u002Fweb';\n\nasync function runDemo() {\n  \u002F\u002F 1. 初始化\n  await RunAnywhere.initialize({ environment: 'development' });\n\n  \u002F\u002F 2. 加载模型 (Web 端通常需指定模型文件路径)\n  await TextGeneration.loadModel('\u002Fmodels\u002Fqwen2.5-0.5b-instruct-q4_0.gguf', 'qwen2.5-0.5b');\n\n  \u002F\u002F 3. 
生成回复\n  const result = await TextGeneration.generate('What is the capital of France?');\n  console.log(result.text); \u002F\u002F 输出：\"Paris is the capital of France.\"\n}\n```\n\n> **提示**：首次调用 `downloadModel` 时会从网络拉取模型文件，后续运行将直接使用本地缓存，实现完全离线推理。","一家专注于户外探险的创业团队正在开发一款名为“荒野向导”的 iOS 应用，旨在为无网络信号的深山用户提供实时语音交互和路线建议。\n\n### 没有 runanywhere-sdks 时\n- **功能受限**：由于依赖云端 API，用户在进入无信号区域后，核心的语音问答和路线规划功能完全瘫痪，应用变成“砖头”。\n- **隐私顾虑**：为了在弱网环境下勉强工作，不得不缓存用户录音上传，导致敏感的地理位置和语音数据存在泄露风险，难以通过严苛的隐私合规审查。\n- **体验割裂**：网络波动导致语音转文字（STT）和回答生成延迟高达数秒，甚至频繁超时失败，严重打断用户在紧急情况下的操作流。\n- **成本高昂**：随着用户量增长，云端 GPU 推理费用呈线性激增，且需额外投入运维资源处理高并发请求，压缩了初创团队的利润空间。\n\n### 使用 runanywhere-sdks 后\n- **全场景可用**：集成其 Swift SDK 后，LLM 聊天、语音识别与合成全部在手机本地运行，即使在没有信号的无人区，用户也能流畅获取急救指南和路线建议。\n- **数据主权回归**：所有语音和对话数据仅在用户设备内部处理，绝不离开手机，天然满足最高级别的隐私安全标准，无需担心数据合规问题。\n- **零延迟交互**：得益于端侧推理，语音指令到反馈的延迟降低至毫秒级，实现了真正的“即说即答”，在紧急求救等关键场景中争取了宝贵时间。\n- **架构轻量化**：团队无需搭建和维护复杂的后端推理集群，大幅降低了服务器成本和运维复杂度，让开发者能更专注于业务逻辑创新。\n\nrunanywhere-sdks 通过将强大的 AI 能力彻底本地化，让应用在极端环境下依然保持智能、私密且高效，重新定义了移动端的离线交互体验。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FRunanywhereAI_runanywhere-sdks_db82d790.gif","RunanywhereAI","RunAnywhere","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002FRunanywhereAI_6db9046b.png","ODLM - On Device Language 
Models",null,"runanywhereai","https:\u002F\u002Fwww.runanywhere.ai\u002F","https:\u002F\u002Fgithub.com\u002FRunanywhereAI",[81,85,89,93,97,101,105,109,113,117],{"name":82,"color":83,"percentage":84},"C++","#f34b7d",27.3,{"name":86,"color":87,"percentage":88},"Kotlin","#A97BFF",17.7,{"name":90,"color":91,"percentage":92},"C","#555555",13.6,{"name":94,"color":95,"percentage":96},"Swift","#F05138",12.9,{"name":98,"color":99,"percentage":100},"TypeScript","#3178c6",12.4,{"name":102,"color":103,"percentage":104},"Dart","#00B4AB",9.4,{"name":106,"color":107,"percentage":108},"Shell","#89e051",4.2,{"name":110,"color":111,"percentage":112},"CMake","#DA3434",1.5,{"name":114,"color":115,"percentage":116},"Ruby","#701516",0.3,{"name":118,"color":119,"percentage":116},"Objective-C","#438eff",10345,349,"2026-04-11T07:55:00","NOASSERTION","iOS, macOS, Android, Web (Browser), Linux","非必需（依赖 CPU 运行 GGUF 模型）；Web 端需支持 WebGPU 的显卡；Android 特定场景（如 Samsung 设备）可利用 GPU 加速","未说明（取决于所选模型大小，示例模型为 360M-4B 参数量）",{"notes":128,"python":129,"dependencies":130},"该工具主要作为移动端和 Web 端的 SDK（支持 Swift, Kotlin, React Native, Flutter, Web），而非传统的 Python 服务器环境。核心功能是在设备本地运行 AI 模型（LLM、语音识别、语音合成），无需云端连接。模型格式主要为 GGUF。Linux 支持主要通过 C++ 二进制文件实现（如 Raspberry Pi 5 或 x86_64）。Web 端运行需要浏览器支持 WebGPU。","未说明",[131,132,133,134,135,136,137],"LlamaCPP","Whisper","ONNX Runtime (可选)","Swift Package Manager (iOS\u002FmacOS)","Gradle (Android)","npm (@runanywhere\u002Fcore, @runanywhere\u002Fllamacpp)","pub.dev (Flutter)",[35,139,16,15,14,140],"音频","其他",[142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161],"ios","kotlin","llm","swift","inference","multimodal","on-device-ai","voice-ai","android","edge","llamacpp","apple-intelligence","ollama","cpp","diffusion-models","flutter","react-native","vlm","web","websdk","2026-03-27T02:49:30.150509","2026-04-11T21:39:46.815214",[165,170,175,180,185,190],{"id":166,"question_zh":167,"answer_zh":168,"source_url":169},29816,"iOS 应用在欧盟地区（如德国）无法下载怎么办？","该问题已解决。维护者发布了新的 
iOS 版本，现在应用已在所有地区（包括德国）上架。如果您仍无法下载，请尝试更新 App Store 或重新搜索应用名称。如有问题可重新开启该议题反馈。","https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues\u002F347",{"id":171,"question_zh":172,"answer_zh":173,"source_url":174},29817,"Android SDK 中调用 speak() 函数后为什么没有声音播放？","此前 Android SDK 的 speak() 函数仅合成音频但未集成平台音频播放，导致无声。该问题已通过 PR #288 修复，集成了 AudioPlaybackManager 以支持完整的声音播放功能。请确保升级到包含此修复的最新版本 SDK。","https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues\u002F274",{"id":176,"question_zh":177,"answer_zh":178,"source_url":179},29818,"iOS 应用中显示的 tokens\u002Fsec 指标为何不准确？","此前 iOS 应用通过“字符数\u002F4”估算 token 数量，导致 tokens\u002Fsec 计算偏差。正确做法应基于实际生成的 token 总数除以耗时。该问题已被社区发现并欢迎提交 PR 修复，建议开发者使用最新代码或等待官方更新以获得准确指标。","https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues\u002F339",{"id":181,"question_zh":182,"answer_zh":183,"source_url":184},29819,"文档中的 SDK 版本号过时或示例代码报错如何处理？","文档中存在多处版本过时和代码错误：Flutter 包版本应为 ^0.16.0（非 ^0.15.11），React Native 示例中 modelPath 未定义应改为字符串 ID（如 'smollm2-360m'），Kotlin 版本应为 0.16.1（非 0.1.4）。此外，Flutter 的 pubspec 需注明 runanywhere_onnx 以支持 STT\u002FTTS。相关问题已通过 PR 合并修复，请参考最新文档。","https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues\u002F448",{"id":186,"question_zh":187,"answer_zh":188,"source_url":189},29820,"Web SDK 是否支持完整的 TypeScript 类型定义？","早期 Web SDK (@runanywhere\u002Fweb) 缺乏完整的 TypeScript 类型支持。社区已提出增强计划，包括添加 .d.ts 文件、覆盖模型加载选项、生成参数、环境配置及错误回调等类型。该功能正在开发中，贡献者可参与改进。建议关注后续版本更新以获取完整 TS 支持。","https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fissues\u002F378",{"id":191,"question_zh":192,"answer_zh":193,"source_url":184},29821,"如何为 RunAnywhere 项目贡献代码或报告文档问题？","欢迎用户发现问题后直接开启 Issue 并提交 PR。例如文档版本错误、代码示例缺陷等问题，维护者鼓励社区成员 fork 仓库、测试修改后提交 PR。对于新功能或类型定义改进，也可主动认领任务并在评论中沟通进展。",[195,200,205,210,215,220,225,230,235,240,245,250,255,260,265,270,275,280,285,290],{"id":196,"version":197,"summary_zh":198,"released_at":199},206391,"genie-v0.3.0","## Genie 后端 v0.3.0 - 
原生二进制文件\n\n适用于 Android arm64-v8a 架构的 Genie NPU 后端二进制分发包。\n\n### 新增内容\n- **SM8850 支持**：新增 Snapdragon 8 Elite Gen 5（三星 Galaxy S26）的主 HTP 运行时库 `libQnnHtpV81.so`\n- 更新 Qualcomm QNN 库至 QAIRT SDK v2.42.0\n\n### 平台支持\n| 芯片 | SoC | HTP 版本 | 状态 |\n|------|-----|----------|------|\n| Snapdragon 8 Gen 2 | SM8550 | V73 | 支持 |\n| Snapdragon 8 Gen 3 | SM8650 | V75 | 支持 |\n| Snapdragon 8 Elite | SM8750 | V79 | 支持 |\n| Snapdragon 8 Elite Gen 5 | SM8850 | V81 | **新增** |\n\n### 内容\n- 28 个原生 `.so` 文件（arm64-v8a）\n- 压缩后约 68MB，解压后约 192MB\n\n### 许可证\n根据 RunAnywhere Genie SDK 二进制分发许可协议进行分发（包含 Qualcomm 专有技术）。","2026-03-20T02:35:46",{"id":201,"version":202,"summary_zh":203,"released_at":204},206392,"v0.19.7","## RunAnywhere SDKs v0.19.7\n\n**以隐私为先的设备端 AI SDK**，适用于 iOS、Android、Flutter 和 React Native。\n\n---\n\n### 📦 发布资产\n\n#### 核心库\n| 资产 | 平台 | 大小 |\n|-------|----------|------|\n| `RACommons-ios-v0.19.7.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.7.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.7.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.7.zip` | Android x86_64 | ~64 MB |\n\n#### 后端库（根据需求选择）\n| 资产 | 平台 | 使用场景 |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.7.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.7.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.7.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.7.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.7.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.7.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.7.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.7.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK 包\n| 资产 | 平台 | 说明 |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.7.zip` | iOS\u002FmacOS | Swift SDK |\n| 
`RunAnywhere-Kotlin-SDK-v0.19.7.aar` | Android | Kotlin SDK (AAR) |\n\n> **注意:** Flutter SDK 可在 [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere) 获取，React Native SDK 可在 [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore) 获取。\n\n> 💡 **提示:** 大多数 Android 应用仅需 `arm64-v8a`（覆盖 85% 的设备）。\n\n---\n\n### 🚀 快速入门\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift（iOS\u002FmacOS）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.7\")\n]\n```\n\n或者直接从本版本下载 XCFrameworks。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin（Android）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.7\")\n}\n```\n\n或者从本版本下载 `RunAnywhere-Kotlin-SDK-v0.19.7.aar`。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - 从 pub.dev 安装\ndependencies:\n  runanywhere: ^0.19.7\n```\n\n或者访问 [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# 从 npm 安装\nnpm install @runanywhere\u002Fcore\n```\n\n或者访问 [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)。\n\u003C\u002Fdetails>\n\n---\n\n### ✨ 功能\n\n- 🧠 **LLM**: 通过 llama.cpp 在设备端进行文本生成\n- 🎤 **STT**: 通过 Sherpa-ONNX Whisper 进行语音转文本\n- 🔊 **TTS**: 通过 Sherpa-ONNX Piper 进行文本转语音\n- 🎯 **VAD**: 语音活动检测\n- 🔒 **隐私**: 所有处理均在设备端完成\n\n---\n\n### 📋 构建状态\n\n| 组件 | 状态 |\n|-----------|--------|\n| RACommons | ❌ |\n| 后端 | ❌ |\n| S","2026-03-19T22:40:32",{"id":206,"version":207,"summary_zh":208,"released_at":209},206393,"genie-v0.1.0","## Genie 后端 v0.1.0 - 
原生二进制文件\n\n适用于 Android arm64-v8a 架构的 Genie NPU 后端二进制分发包。\n\n### 平台支持\n| 芯片 | SoC | HTP | 状态 |\n|------|-----|-----|--------|\n| 骁龙 8 第 2 代 | SM8550 | V73 | 支持 |\n| 骁龙 8 第 3 代 | SM8650 | V75 | 支持 |\n| 骁龙 8 Elite | SM8750 | V79 | 支持 |\n\n### 内容\n- 27 个原生 `.so` 文件（arm64-v8a）\n- 压缩后约 63MB，解压后约 178MB\n\n### 许可证\n根据 RunAnywhere Genie SDK 二进制分发许可协议进行分发（包含高通专有技术）。","2026-03-20T02:37:24",{"id":211,"version":212,"summary_zh":213,"released_at":214},206394,"runanywhere-v0.17.2-mac","## RunAnywhere for macOS — 直接下载\n\n设备端 AI 平台演示——大语言模型聊天、语音转文本、文本转语音、语音助手、视觉、扩散模型、RAG 以及基准测试。所有推理均在本地运行。\n\n### 功能\n- 使用 llama.cpp 的设备端大语言模型聊天（通义千问、Llama、Phi 等）\n- 通过 WhisperKit（神经引擎）和 ONNX（CPU）实现语音转文本\n- 文本转语音与语音助手流水线\n- 视觉语言模型、图像生成\n- 文档 RAG、LoRA 适配器、模型基准测试\n- 模型中心，支持一键下载与管理\n\n### 安装\n1. 打开 `.dmg` 文件\n2. 将 **RunAnywhere** 拖拽至 **应用程序** 文件夹\n3. 从“应用程序”中启动\n\n### 系统要求\n- macOS 14.0 或更高版本\n- Apple Silicon 或 Intel 架构的 Mac\n\n> 已使用开发者 ID 签名。首次启动时，若 Gatekeeper 弹出提示，请右键点击 → 打开。","2026-02-25T01:37:39",{"id":216,"version":217,"summary_zh":218,"released_at":219},206395,"yaprun-v0.1-mac","## YapRun for macOS — 直接下载\n\n由 RunAnywhere SDK 提供支持的设备端语音转文字功能。所有语音识别都在本地进行——您的语音永远不会离开您的设备。\n\n**[runanywhere.ai\u002Fyaprun](https:\u002F\u002Frunanywhere.ai\u002Fyaprun)**\n\n### 功能\n- 菜单栏代理，支持全局快捷键语音输入\n- 多种 Whisper 后端（WhisperKit Neural Engine + ONNX CPU）\n- 浮动流程条，显示语音输入状态\n- 模型中心、ASR 体验区和记事本\n- 离线可用——只需下载一次，即可在无网络环境下运行\n\n### 安装\n1. 打开 `.dmg` 文件\n2. 将 **YapRun** 拖拽至 **应用程序** 文件夹\n3. 
从应用程序中启动——引导界面将指导您完成设置\n\n### 系统要求\n- macOS 14.0 及以上版本\n- Apple Silicon 或 Intel 架构的 Mac\n\n> 已使用开发者 ID 签名。首次启动时，若 Gatekeeper 弹出提示，请右键点击 → 打开。","2026-02-24T06:48:58",{"id":221,"version":222,"summary_zh":223,"released_at":224},206396,"v0.19.6","## RunAnywhere SDKs v0.19.6\n\n**以隐私为先的设备端 AI SDK**，适用于 iOS、Android、Flutter 和 React Native。\n\n---\n\n### 📦 发布资产\n\n#### 核心库\n| 资产 | 平台 | 大小 |\n|-------|----------|------|\n| `RACommons-ios-v0.19.6.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.6.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.6.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.6.zip` | Android x86_64 | ~64 MB |\n\n#### 后端库（根据需求选择）\n| 资产 | 平台 | 使用场景 |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.6.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.6.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.6.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.6.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.6.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.6.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.6.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.6.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK 包\n| 资产 | 平台 | 说明 |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.6.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.6.aar` | Android | Kotlin SDK (AAR) |\n\n> **注意:** Flutter SDK 可在 [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere) 上获取，React Native SDK 可在 [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore) 上获取。\n\n> 💡 **提示:** 大多数 Android 应用仅需 `arm64-v8a`（覆盖 85% 的设备）。\n\n---\n\n### 🚀 
快速入门\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift（iOS\u002FmacOS）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.6\")\n]\n```\n\n或者直接从本版本下载 XCFramework。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin（Android）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.6\")\n}\n```\n\n或者从本版本下载 `RunAnywhere-Kotlin-SDK-v0.19.6.aar`。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - 从 pub.dev 安装\ndependencies:\n  runanywhere: ^0.19.6\n```\n\n或者访问 [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# 从 npm 安装\nnpm install @runanywhere\u002Fcore\n```\n\n或者访问 [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)。\n\u003C\u002Fdetails>\n\n---\n\n### ✨ 功能\n\n- 🧠 **LLM**: 通过 llama.cpp 在设备端进行文本生成\n- 🎤 **STT**: 通过 Sherpa-ONNX Whisper 进行语音转文本\n- 🔊 **TTS**: 通过 Sherpa-ONNX Piper 进行文本转语音\n- 🎯 **VAD**: 语音活动检测\n- 🔒 **隐私**: 所有处理均在设备端完成\n\n---\n\n### 📋 构建状态\n\n| 组件 | 状态 |\n|-----------|--------|\n| RACommons | ❌ |\n| 后端 | ❌ |\n| S","2026-02-24T03:43:45",{"id":226,"version":227,"summary_zh":228,"released_at":229},206397,"v0.19.4","## RunAnywhere SDKs v0.19.4\n\n**以隐私为先的设备端 AI SDK**，适用于 iOS、Android、Flutter 和 React Native。\n\n---\n\n### 📦 发布资产\n\n#### 核心库\n| 资产 | 平台 | 大小 |\n|-------|----------|------|\n| `RACommons-ios-v0.19.4.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.4.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.4.zip` | Android armv7 | ~57 MB 
|\n| `RACommons-android-x86_64-v0.19.4.zip` | Android x86_64 | ~64 MB |\n\n#### 后端库（根据需求选择）\n| 资产 | 平台 | 使用场景 |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.4.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.4.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.4.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.4.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.4.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.4.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.4.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.4.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK 包\n| 资产 | 平台 | 说明 |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.4.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.4.aar` | Android | Kotlin SDK (AAR) |\n\n> **注意:** Flutter SDK 可在 [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere) 获取，React Native SDK 可在 [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore) 获取。\n\n> 💡 **提示:** 大多数 Android 应用仅需 `arm64-v8a`（占设备总数的 85%）。\n\n---\n\n### 🚀 快速入门\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift（iOS\u002FmacOS）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.4\")\n]\n```\n\n或者直接从本次发布下载 XCFrameworks。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin（Android）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.4\")\n}\n```\n\n或者从本次发布下载 
`RunAnywhere-Kotlin-SDK-v0.19.4.aar`。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - 从 pub.dev 安装\ndependencies:\n  runanywhere: ^0.19.4\n```\n\n或者访问 [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# 从 npm 安装\nnpm install @runanywhere\u002Fcore\n```\n\n或者访问 [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)。\n\u003C\u002Fdetails>\n\n---\n\n### ✨ 功能特性\n\n- 🧠 **LLM**: 通过 llama.cpp 在设备端进行文本生成\n- 🎤 **STT**: 通过 Sherpa-ONNX Whisper 进行语音转文本\n- 🔊 **TTS**: 通过 Sherpa-ONNX Piper 进行文本转语音\n- 🎯 **VAD**: 语音活动检测\n- 🔒 **隐私**: 所有处理均在设备端完成\n\n---\n\n### 📋 构建状态\n\n| 组件 | 状态 |\n|-----------|--------|\n| RACommons | ✅ |\n| 后端 | ✅ |\n| S","2026-02-23T04:45:22",{"id":231,"version":232,"summary_zh":233,"released_at":234},206398,"v0.19.3","## RunAnywhere SDKs v0.19.3\n\n**以隐私为先的设备端 AI SDK**，适用于 iOS、Android、Flutter 和 React Native。\n\n---\n\n### 📦 发布资产\n\n#### 核心库\n| 资产 | 平台 | 大小 |\n|-------|----------|------|\n| `RACommons-ios-v0.19.3.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.3.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.3.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.3.zip` | Android x86_64 | ~64 MB |\n\n#### 后端库（根据需求选择）\n| 资产 | 平台 | 使用场景 |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.3.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.3.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.3.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.3.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.3.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| 
`RABackendONNX-android-arm64-v8a-v0.19.3.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.3.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.3.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK 包\n| 资产 | 平台 | 说明 |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.3.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.3.aar` | Android | Kotlin SDK (AAR) |\n\n> **注意:** Flutter SDK 可在 [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere) 上获取，React Native SDK 可在 [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore) 上获取。\n\n> 💡 **提示:** 大多数 Android 应用仅需 `arm64-v8a` 版本（覆盖 85% 的设备）。\n\n---\n\n### 🚀 快速入门\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift（iOS\u002FmacOS）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.3\")\n]\n```\n\n或者直接从本次发布下载 XCFrameworks。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin（Android）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.3\")\n}\n```\n\n或者从本次发布下载 `RunAnywhere-Kotlin-SDK-v0.19.3.aar`。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - 从 pub.dev 安装\ndependencies:\n  runanywhere: ^0.19.3\n```\n\n或者访问 [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# 从 npm 安装\nnpm install @runanywhere\u002Fcore\n```\n\n或者访问 
[npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)。\n\u003C\u002Fdetails>\n\n---\n\n### ✨ 功能特性\n\n- 🧠 **LLM**: 通过 llama.cpp 在设备端进行文本生成\n- 🎤 **STT**: 通过 Sherpa-ONNX Whisper 进行语音转文本\n- 🔊 **TTS**: 通过 Sherpa-ONNX Piper 进行文本转语音\n- 🎯 **VAD**: 语音活动检测\n- 🔒 **隐私**: 所有处理均在设备端完成\n\n---\n\n### 📋 构建状态\n\n| 组件 | 状态 |\n|-----------|--------|\n| RACommons | ✅ |\n| 后端 | ✅ |\n| S","2026-02-23T00:55:31",{"id":236,"version":237,"summary_zh":238,"released_at":239},206399,"v0.19.5","## RunAnywhere SDKs v0.19.5\n\n**以隐私为先的设备端 AI SDK**，适用于 iOS、Android、Flutter 和 React Native。\n\n---\n\n### 📦 发布资产\n\n#### 核心库\n| 资产 | 平台 | 大小 |\n|-------|----------|------|\n| `RACommons-ios-v0.19.5.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.5.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.5.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.5.zip` | Android x86_64 | ~64 MB |\n\n#### 后端库（根据需求选择）\n| 资产 | 平台 | 使用场景 |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.5.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.5.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.5.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.5.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.5.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.5.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.5.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.5.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK 包\n| 资产 | 平台 | 说明 |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.5.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.5.aar` | Android | Kotlin SDK (AAR) |\n\n> **注：** Flutter SDK 可在 
[pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere) 上获取，React Native SDK 可在 [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore) 上获取。\n\n> 💡 **提示：** 大多数 Android 应用仅需 `arm64-v8a`（覆盖 85% 的设备）。\n\n---\n\n### 🚀 快速入门\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift（iOS\u002FmacOS）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.5\")\n]\n```\n\n或者直接从本版本下载 XCFrameworks。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin（Android）\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.5\")\n}\n```\n\n或者从本版本下载 `RunAnywhere-Kotlin-SDK-v0.19.5.aar`。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - 从 pub.dev 安装\ndependencies:\n  runanywhere: ^0.19.5\n```\n\n或者访问 [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# 从 npm 安装\nnpm install @runanywhere\u002Fcore\n```\n\n或者访问 [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)。\n\u003C\u002Fdetails>\n\n---\n\n### ✨ 功能\n\n- 🧠 **LLM**：通过 llama.cpp 在设备端进行文本生成\n- 🎤 **STT**：通过 Sherpa-ONNX Whisper 进行语音转文本\n- 🔊 **TTS**：通过 Sherpa-ONNX Piper 进行文本转语音\n- 🎯 **VAD**：语音活动检测\n- 🔒 **隐私**：所有处理均在设备端完成\n\n---\n\n### 📋 构建状态\n\n| 组件 | 状态 |\n|-----------|--------|\n| RACommons | ❌ |\n| 后端 | ❌ |\n| S","2026-02-23T21:21:40",{"id":241,"version":242,"summary_zh":243,"released_at":244},206400,"v0.19.2","## RunAnywhere SDKs v0.19.2\n\n**以隐私为先的设备端 AI SDK**，适用于 iOS、Android、Flutter 和 React Native。\n\n---\n\n### 📦 发布资产\n\n#### 核心库\n| 
资产 | 平台 | 大小 |\n|-------|----------|------|\n| `RACommons-ios-v0.19.2.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.2.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.2.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.2.zip` | Android x86_64 | ~64 MB |\n\n#### 后端库（根据需求选择）\n| 资产 | 平台 | 使用场景 |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.2.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.2.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.2.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.2.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.2.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.2.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.2.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.2.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK 包\n| 资产 | 平台 | 说明 |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.2.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.2.aar` | Android | Kotlin SDK (AAR) |\n\n> **注意:** Flutter SDK 可在 [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere) 获取，React Native SDK 可在 [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore) 获取。\n\n> 💡 **提示:** 大多数 Android 应用仅需 `arm64-v8a`（覆盖 85% 的设备）。\n\n---\n\n### 🚀 快速入门\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.2\")\n]\n```\n\n或者直接从本版本下载 XCFrameworks。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F 
build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.2\")\n}\n```\n\n或者从本版本下载 `RunAnywhere-Kotlin-SDK-v0.19.2.aar`。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - 从 pub.dev 安装\ndependencies:\n  runanywhere: ^0.19.2\n```\n\n或者访问 [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)。\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# 从 npm 安装\nnpm install @runanywhere\u002Fcore\n```\n\n或者访问 [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)。\n\u003C\u002Fdetails>\n\n---\n\n### ✨ 功能\n\n- 🧠 **LLM**: 通过 llama.cpp 在设备端进行文本生成\n- 🎤 **STT**: 通过 Sherpa-ONNX Whisper 进行语音转文本\n- 🔊 **TTS**: 通过 Sherpa-ONNX Piper 进行文本转语音\n- 🎯 **VAD**: 语音活动检测\n- 🔒 **隐私**: 所有处理均在设备端完成\n\n---\n\n### 📋 构建状态\n\n| 组件 | 状态 |\n|-----------|--------|\n| RACommons | ❌ |\n| 后端 | ❌ |\n| S","2026-02-22T08:03:01",{"id":246,"version":247,"summary_zh":248,"released_at":249},206401,"commons-v0.1.6","RACommons v0.1.6 - Rebuilt native binaries with latest changes.\n\n## Changes\n- Fix parameter piping through SDK (#340)\n- Network layer fixes (auth, dev config)\n- API configuration management\n- Keychain store capabilities\n- Updated llama.cpp to b7650\n- Updated Sherpa-ONNX to v1.12.20 (Android, 16KB aligned)\n- Updated Sherpa-ONNX to v1.12.18 (iOS)\n\n## Assets\n- RACommons-ios-v0.1.6.zip: iOS XCFrameworks (RACommons, RABackendLLAMACPP, RABackendONNX)\n- RACommons-android-v0.1.6.zip: Android .so libraries (JNI, LlamaCPP, ONNX backends)","2026-02-15T03:27:11",{"id":251,"version":252,"summary_zh":253,"released_at":254},206402,"v0.19.1","## RunAnywhere SDKs v0.19.1\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core 
Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.19.1.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.1.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.1.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.1.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.1.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.1.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.1.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.1.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.1.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.1.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.1.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.1.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.1.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.1.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.1\")\n]\n```\n\nOr download the XCFrameworks directly from this 
release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.1\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.19.1.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.19.1\n```\n\nOr visit [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ❌ |\n| Backends | ❌ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 11915fafad7b031ec08f18ab5d777c6b3c5ab3a2\n","2026-02-15T00:11:13",{"id":256,"version":257,"summary_zh":258,"released_at":259},206403,"v0.19.0","## RunAnywhere SDKs v0.19.0\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.19.0.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.19.0.zip` | 
Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.19.0.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.19.0.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.19.0.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.19.0.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.19.0.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.19.0.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.19.0.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.19.0.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.19.0.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.19.0.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.19.0.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.19.0.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.19.0\")\n]\n```\n\nOr download the XCFrameworks directly from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n  
  implementation(\"ai.runanywhere:runanywhere-kotlin:0.19.0\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.19.0.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.19.0\n```\n\nOr visit [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ❌ |\n| Backends | ❌ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 593e00ac7201fd6a31408654af6790bbc0e8f983\n","2026-02-14T23:32:09",{"id":261,"version":262,"summary_zh":263,"released_at":264},206404,"v0.18.0","## RunAnywhere SDKs v0.18.0\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### What's New\n\n#### Tool Calling (Function Calling)\n- Register custom tools (weather, calculator, etc.) 
with typed parameters\n- Auto-execute loop: generate → parse → execute → loop\n- Multi-format support: default JSON + LFM2 (Liquid AI models)\n- Actor-based thread-safe tool registry\n- C++ parsing via `\u003Ctool_call>` tag detection (single source of truth)\n\n#### VLM (Vision Language Model)\n- Describe and analyze images with on-device AI\n- Multiple image input formats: file path, RGB pixels, base64, UIImage, CVPixelBuffer\n- Streaming support with real-time token output\n- Camera and photo library integration in example app\n- llama.cpp backend with mmproj (multimodal projection) support\n\n#### Diffusion (Image Generation)\n- On-device image generation via CoreML + Apple Neural Engine\n- 6 model variants: SD 1.5, SD 2.1, SDXL, SDXL Turbo, SDXS (1-step), LCM\n- 3 modes: text-to-image, image-to-image, inpainting\n- 8 schedulers: DPM++ 2M Karras, DDIM, Euler, Euler Ancestral, PNDM, LMS, and more\n- Progress streaming with intermediate image previews\n- Automatic tokenizer download from HuggingFace\n\n#### Structured Output\n- Type-safe JSON generation via `Generatable` protocol\n- Streaming support with token-by-token display + final typed result\n- C++ JSON extraction for reliability\n\n#### Apple Foundation Models (iOS 26+)\n- Integration with Apple Intelligence built-in models\n- 4096 token context window\n- Automatic device eligibility checking\n\n---\n\n### Installation (Swift Package Manager)\n\n```swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.18.0\")\n]\n```\n\n### Assets\n\n| Asset | Size | Description |\n|-------|------|-------------|\n| `RACommons-ios-v0.18.0.zip` | ~1.4 MB | Core SDK framework |\n| `RABackendLLAMACPP-ios-v0.18.0.zip` | ~12 MB | LLM text generation backend |\n| `RABackendONNX-ios-v0.18.0.zip` | ~13 MB | STT\u002FTTS\u002FVAD backend |","2026-02-13T07:04:08",{"id":266,"version":267,"summary_zh":268,"released_at":269},206405,"voice-assistant-v0.1.0","## 
RunAnywhere Voice Assistant v0.1.0\n\nPre-built binaries for Raspberry Pi 5 and other Linux aarch64 systems.\n\n### Contents\n- `runanywhere-server` - OpenAI-compatible LLM inference server\n- `voice-assistant` - Wake word + STT + TTS pipeline\n- Shared libraries (librac_commons, librac_backend_*, libsherpa-onnx, libonnxruntime)\n\n### Quick Install\n```bash\n# Download and extract\ncurl -fsSL https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Freleases\u002Fdownload\u002Fvoice-assistant-v0.1.0\u002Frunanywhere-voice-assistant-linux-aarch64.tar.gz | tar -xzf - -C \u002Ftmp\n\n# Install\ncd \u002Ftmp\u002Frunanywhere-release && .\u002Finstall.sh\n\n# Download AI models (~2.5GB)\ncurl -fsSL https:\u002F\u002Fraw.githubusercontent.com\u002FRunanywhereAI\u002Frunanywhere-sdks\u002Fsmonga\u002Frasp\u002Fplayground\u002Flinux-voice-assistant\u002Fscripts\u002Fdownload-models.sh | bash\n\n# Run\n~\u002F.local\u002Frunanywhere\u002Frun.sh\n```\n\n### With Moltbot (Full Experience)\n```bash\ncurl -fsSL https:\u002F\u002Fraw.githubusercontent.com\u002FRunanywhereAI\u002Fclawdbot\u002Fmain\u002Fscripts\u002Fquickstart.sh | bash\n```\n\nSay **Hey Jarvis** to activate!\n","2026-01-30T00:38:04",{"id":271,"version":272,"summary_zh":273,"released_at":274},206406,"v0.17.5","## RunAnywhere SDKs v0.17.5\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.17.5.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.17.5.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.17.5.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.17.5.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.17.5.zip` | iOS\u002FmacOS | LLM |\n| 
`RABackendLLAMACPP-android-arm64-v8a-v0.17.5.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.17.5.zip` | Android armv7 | LLM |\n| `RABackendLLAMACPP-android-x86_64-v0.17.5.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.17.5.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.17.5.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.17.5.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.17.5.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.17.5.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.17.5.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.17.5\")\n]\n```\n\nOr download the XCFrameworks directly from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.17.5\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.17.5.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.17.5\n```\n\nOr 
visit [pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ✅ |\n| Backends | ✅ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 1901c9411703bcf5864e6c1340fe68d33796ad66\n","2026-01-26T00:48:47",{"id":276,"version":277,"summary_zh":278,"released_at":279},206407,"v0.17.4","## RunAnywhere SDKs v0.17.4\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.17.4.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.17.4.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.17.4.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.17.4.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.17.4.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.17.4.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.17.4.zip` | Android armv7 | LLM |\n| 
`RABackendLLAMACPP-android-x86_64-v0.17.4.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.17.4.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.17.4.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.17.4.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.17.4.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.17.4.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.17.4.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.17.4\")\n]\n```\n\nOr download the XCFrameworks directly from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.17.4\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.17.4.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.17.4\n```\n\nOr visit 
[pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ✅ |\n| Backends | ✅ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 03085d9c77e87eccdb0aab013ce6818eeb808f44\n","2026-01-22T12:04:32",{"id":281,"version":282,"summary_zh":283,"released_at":284},206408,"v0.17.3","## RunAnywhere SDKs v0.17.3\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.17.3.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.17.3.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.17.3.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.17.3.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.17.3.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.17.3.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.17.3.zip` | Android armv7 | LLM |\n| 
`RABackendLLAMACPP-android-x86_64-v0.17.3.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.17.3.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.17.3.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.17.3.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.17.3.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.17.3.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.17.3.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.17.3\")\n]\n```\n\nOr download the XCFrameworks directly from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.17.3\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.17.3.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.17.3\n```\n\nOr visit 
[pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ✅ |\n| Backends | ✅ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 72a9e3eaefc6ce84e6334d207c79704886569b6b\n","2026-01-22T11:50:11",{"id":286,"version":287,"summary_zh":288,"released_at":289},206409,"v0.17.2","## RunAnywhere SDKs v0.17.2\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.17.2.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.17.2.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.17.2.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.17.2.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.17.2.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.17.2.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.17.2.zip` | Android armv7 | LLM |\n| 
`RABackendLLAMACPP-android-x86_64-v0.17.2.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.17.2.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.17.2.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.17.2.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.17.2.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.17.2.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.17.2.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.17.2\")\n]\n```\n\nOr download the XCFrameworks directly from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.17.2\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.17.2.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.17.2\n```\n\nOr visit 
[pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ✅ |\n| Backends | ✅ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 4d3493016955e56eb391ea060a35812bd715ceda\n","2026-01-22T11:45:10",{"id":291,"version":292,"summary_zh":293,"released_at":294},206410,"v0.17.1","## RunAnywhere SDKs v0.17.1\n\n**Privacy-first, on-device AI SDKs** for iOS, Android, Flutter, and React Native.\n\n---\n\n### 📦 Release Assets\n\n#### Core Libraries\n| Asset | Platform | Size |\n|-------|----------|------|\n| `RACommons-ios-v0.17.1.zip` | iOS\u002FmacOS | ~2 MB |\n| `RACommons-android-arm64-v8a-v0.17.1.zip` | Android arm64 | ~61 MB |\n| `RACommons-android-armeabi-v7a-v0.17.1.zip` | Android armv7 | ~57 MB |\n| `RACommons-android-x86_64-v0.17.1.zip` | Android x86_64 | ~64 MB |\n\n#### Backend Libraries (pick what you need)\n| Asset | Platform | Use Case |\n|-------|----------|----------|\n| `RABackendLLAMACPP-ios-v0.17.1.zip` | iOS\u002FmacOS | LLM |\n| `RABackendLLAMACPP-android-arm64-v8a-v0.17.1.zip` | Android arm64 | LLM |\n| `RABackendLLAMACPP-android-armeabi-v7a-v0.17.1.zip` | Android armv7 | LLM |\n| 
`RABackendLLAMACPP-android-x86_64-v0.17.1.zip` | Android x86_64 | LLM |\n| `RABackendONNX-ios-v0.17.1.zip` | iOS\u002FmacOS | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-arm64-v8a-v0.17.1.zip` | Android arm64 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-armeabi-v7a-v0.17.1.zip` | Android armv7 | STT\u002FTTS\u002FVAD |\n| `RABackendONNX-android-x86_64-v0.17.1.zip` | Android x86_64 | STT\u002FTTS\u002FVAD |\n\n#### SDK Packages\n| Asset | Platform | Description |\n|-------|----------|-------------|\n| `RunAnywhere-Swift-SDK-v0.17.1.zip` | iOS\u002FmacOS | Swift SDK |\n| `RunAnywhere-Kotlin-SDK-v0.17.1.aar` | Android | Kotlin SDK (AAR) |\n\n> **Note:** Flutter SDK is available on [pub.dev](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere), React Native SDK is available on [npm](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\n> 💡 **Tip:** Most Android apps only need `arm64-v8a` (85% of devices)\n\n---\n\n### 🚀 Quick Start\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Swift (iOS\u002FmacOS)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```swift\n\u002F\u002F Package.swift\ndependencies: [\n    .package(url: \"https:\u002F\u002Fgithub.com\u002FRunanywhereAI\u002Frunanywhere-sdks\", from: \"0.17.1\")\n]\n```\n\nOr download the XCFrameworks directly from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Kotlin (Android)\u003C\u002Fb>\u003C\u002Fsummary>\n\n```kotlin\n\u002F\u002F build.gradle.kts\ndependencies {\n    implementation(\"ai.runanywhere:runanywhere-kotlin:0.17.1\")\n}\n```\n\nOr download `RunAnywhere-Kotlin-SDK-v0.17.1.aar` from this release.\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>Flutter\u003C\u002Fb>\u003C\u002Fsummary>\n\n```yaml\n# pubspec.yaml - Install from pub.dev\ndependencies:\n  runanywhere: ^0.17.1\n```\n\nOr visit 
[pub.dev\u002Fpackages\u002Frunanywhere](https:\u002F\u002Fpub.dev\u002Fpackages\u002Frunanywhere)\n\u003C\u002Fdetails>\n\n\u003Cdetails>\n\u003Csummary>\u003Cb>React Native\u003C\u002Fb>\u003C\u002Fsummary>\n\n```bash\n# Install from npm\nnpm install @runanywhere\u002Fcore\n```\n\nOr visit [npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore](https:\u002F\u002Fwww.npmjs.com\u002Fpackage\u002F@runanywhere\u002Fcore)\n\u003C\u002Fdetails>\n\n---\n\n### ✨ Features\n\n- 🧠 **LLM**: On-device text generation via llama.cpp\n- 🎤 **STT**: Speech-to-text via Sherpa-ONNX Whisper\n- 🔊 **TTS**: Text-to-speech via Sherpa-ONNX Piper\n- 🎯 **VAD**: Voice activity detection\n- 🔒 **Privacy**: All processing happens on-device\n\n---\n\n### 📋 Build Status\n\n| Component | Status |\n|-----------|--------|\n| RACommons | ✅ |\n| Backends | ✅ |\n| Swift SDK | ❌ |\n| Kotlin SDK | ❌ |\n\n---\n\n### 🔐 Verification\n\n```bash\nshasum -a 256 -c checksums.sha256\n```\n\n---\n\nBuilt from commit: 60902eeff0363d8b7b591c95f3619859c930a364\n","2026-01-22T06:58:51"]