[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-google-deepmind--mujoco_menagerie":3,"tool-google-deepmind--mujoco_menagerie":65},[4,23,32,40,49,57],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":22},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",84991,2,"2026-04-05T10:45:23",[13,14,15,16,17,18,19,20,21],"图像","数据工具","视频","插件","Agent","其他","语言模型","开发框架","音频","ready",{"id":24,"name":25,"github_repo":26,"description_zh":27,"stars":28,"difficulty_score":29,"last_commit_at":30,"category_tags":31,"status":22},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,3,"2026-04-04T04:44:48",[17,13,20,19,18],{"id":33,"name":34,"github_repo":35,"description_zh":36,"stars":37,"difficulty_score":29,"last_commit_at":38,"category_tags":39,"status":22},519,"PaddleOCR","PaddlePaddle\u002FPaddleOCR","PaddleOCR 是一款基于百度飞桨框架开发的高性能开源光学字符识别工具包。它的核心能力是将图片、PDF 等文档中的文字提取出来，转换成计算机可读取的结构化数据，让机器真正“看懂”图文内容。\n\n面对海量纸质或电子文档，PaddleOCR 解决了人工录入效率低、数字化成本高的问题。尤其在人工智能领域，它扮演着连接图像与大型语言模型（LLM）的桥梁角色，能将视觉信息直接转化为文本输入，助力智能问答、文档分析等应用场景落地。\n\nPaddleOCR 适合开发者、算法研究人员以及有文档自动化需求的普通用户。其技术优势十分明显：不仅支持全球 100 多种语言的识别，还能在 Windows、Linux、macOS 等多个系统上运行，并灵活适配 CPU、GPU、NPU 等各类硬件。作为一个轻量级且社区活跃的开源项目，PaddleOCR 既能满足快速集成的需求，也能支撑前沿的视觉语言研究，是处理文字识别任务的理想选择。",74913,"2026-04-05T10:44:17",[19,13,20,18],{"id":41,"name":42,"github_repo":43,"description_zh":44,"stars":45,"difficulty_score":46,"last_commit_at":47,"category_tags":48,"status":22},3215,"awesome-machine-learning","josephmisiti\u002Fawesome-machine-learning","awesome-machine-learning 是一份精心整理的机器学习资源清单，汇集了全球优秀的机器学习框架、库和软件工具。面对机器学习领域技术迭代快、资源分散且难以甄选的痛点，这份清单按编程语言（如 Python、C++、Go 等）和应用场景（如计算机视觉、自然语言处理、深度学习等）进行了系统化分类，帮助使用者快速定位高质量项目。\n\n它特别适合开发者、数据科学家及研究人员使用。无论是初学者寻找入门库，还是资深工程师对比不同语言的技术选型，都能从中获得极具价值的参考。此外，清单还延伸提供了免费书籍、在线课程、行业会议、技术博客及线下聚会等丰富资源，构建了从学习到实践的全链路支持体系。\n\n其独特亮点在于严格的维护标准：明确标记已停止维护或长期未更新的项目，确保推荐内容的时效性与可靠性。作为机器学习领域的“导航图”，awesome-machine-learning 以开源协作的方式持续更新，旨在降低技术探索门槛，让每一位从业者都能高效地站在巨人的肩膀上创新。",72149,1,"2026-04-03T21:50:24",[20,18],{"id":50,"name":51,"github_repo":52,"description_zh":53,"stars":54,"difficulty_score":46,"last_commit_at":55,"category_tags":56,"status":22},2234,"scikit-learn","scikit-learn\u002Fscikit-learn","scikit-learn 是一个基于 Python 构建的开源机器学习库，依托于 SciPy、NumPy 等科学计算生态，旨在让机器学习变得简单高效。它提供了一套统一且简洁的接口，涵盖了从数据预处理、特征工程到模型训练、评估及选择的全流程工具，内置了包括线性回归、支持向量机、随机森林、聚类等在内的丰富经典算法。\n\n对于希望快速验证想法或构建原型的数据科学家、研究人员以及 Python 开发者而言，scikit-learn 是不可或缺的基础设施。它有效解决了机器学习入门门槛高、算法实现复杂以及不同模型间调用方式不统一的痛点，让用户无需重复造轮子，只需几行代码即可调用成熟的算法解决分类、回归、聚类等实际问题。\n\n其核心技术亮点在于高度一致的 API 
设计风格，所有估算器（Estimator）均遵循相同的调用逻辑，极大地降低了学习成本并提升了代码的可读性与可维护性。此外，它还提供了强大的模型选择与评估工具，如交叉验证和网格搜索，帮助用户系统地优化模型性能。作为一个由全球志愿者共同维护的成熟项目，scikit-learn 以其稳定性、详尽的文档和活跃的社区支持，成为连接理论学习与工业级应用的最佳桥梁。",65628,"2026-04-05T10:10:46",[20,18,14],{"id":58,"name":59,"github_repo":60,"description_zh":61,"stars":62,"difficulty_score":10,"last_commit_at":63,"category_tags":64,"status":22},3364,"keras","keras-team\u002Fkeras","Keras 是一个专为人类设计的深度学习框架，旨在让构建和训练神经网络变得简单直观。它解决了开发者在不同深度学习后端之间切换困难、模型开发效率低以及难以兼顾调试便捷性与运行性能的痛点。\n\n无论是刚入门的学生、专注算法的研究人员，还是需要快速落地产品的工程师，都能通过 Keras 轻松上手。它支持计算机视觉、自然语言处理、音频分析及时间序列预测等多种任务。\n\nKeras 3 的核心亮点在于其独特的“多后端”架构。用户只需编写一套代码，即可灵活选择 TensorFlow、JAX、PyTorch 或 OpenVINO 作为底层运行引擎。这一特性不仅保留了 Keras 一贯的高层易用性，还允许开发者根据需求自由选择：利用 JAX 或 PyTorch 的即时执行模式进行高效调试，或切换至速度最快的后端以获得最高 350% 的性能提升。此外，Keras 具备强大的扩展能力，能无缝从本地笔记本电脑扩展至大规模 GPU 或 TPU 集群，是连接原型开发与生产部署的理想桥梁。",63927,"2026-04-04T15:24:37",[20,14,18],{"id":66,"github_repo":67,"name":68,"description_en":69,"description_zh":70,"ai_summary_zh":71,"readme_en":72,"readme_zh":73,"quickstart_zh":74,"use_case_zh":75,"hero_image_url":76,"owner_login":77,"owner_name":78,"owner_avatar_url":79,"owner_bio":80,"owner_company":81,"owner_location":81,"owner_email":81,"owner_twitter":81,"owner_website":82,"owner_url":83,"languages":84,"stars":97,"forks":98,"last_commit_at":99,"license":100,"difficulty_score":10,"env_os":101,"env_gpu":102,"env_ram":102,"env_deps":103,"category_tags":107,"github_topics":108,"view_count":29,"oss_zip_url":81,"oss_zip_packed_at":81,"status":22,"created_at":110,"updated_at":111,"faqs":112,"releases":148},900,"google-deepmind\u002Fmujoco_menagerie","mujoco_menagerie","A collection of high-quality models for the MuJoCo physics engine, curated by Google DeepMind.","mujoco_menagerie 是一个由 Google DeepMind 维护的高质量机器人及物理模型集合，专为 MuJoCo 物理引擎设计。它提供了大量经过精心调试和验证的模型，涵盖从工业机械臂、灵巧手到四足机器人、仿生机器人等多种类型，用户可以直接在 MuJoCo 仿真环境中加载使用，无需从头建模。\n\n这个工具主要解决了物理仿真中模型质量参差不齐的问题。在 MuJoCo 这样功能强大的仿真平台中，自行构建模型往往涉及复杂的参数调整，容易导致模型行为异常或仿真不稳定。mujoco_menagerie 通过提供一系列“开箱即用”、符合物理规律的可靠模型，让研究人员和开发者能跳过繁琐的建模阶段，快速开展机器人控制、强化学习或运动规划算法的实验。\n\n它非常适合机器人学、强化学习领域的研究人员与工程师使用。无论是学术机构进行算法验证，还是工业界快速搭建仿真原型，都可以从中受益。其模型结构清晰，并注重仿真实用性，例如包含完整的关节限位、碰撞网格和合理的驱动参数。\n\n技术亮点在于其模型的“即用性”和高质量。每个模型都经过人工检查和测试，确保在 MuJoCo 中能够稳定运行，行为符合预期。集合持续更新，涵盖了当前机器人社区中许多热门和经典的平台模型，为仿真实验提供了坚实的基础。","mujoco_menagerie 是一个由 Google DeepMind 维护的高质量机器人及物理模型集合，专为 MuJoCo 物理引擎设计。它提供了大量经过精心调试和验证的模型，涵盖从工业机械臂、灵巧手到四足机器人、仿生机器人等多种类型，用户可以直接在 MuJoCo 仿真环境中加载使用，无需从头建模。\n\n这个工具主要解决了物理仿真中模型质量参差不齐的问题。在 MuJoCo 这样功能强大的仿真平台中，自行构建模型往往涉及复杂的参数调整，容易导致模型行为异常或仿真不稳定。mujoco_menagerie 通过提供一系列“开箱即用”、符合物理规律的可靠模型，让研究人员和开发者能跳过繁琐的建模阶段，快速开展机器人控制、强化学习或运动规划算法的实验。\n\n它非常适合机器人学、强化学习领域的研究人员与工程师使用。无论是学术机构进行算法验证，还是工业界快速搭建仿真原型，都可以从中受益。其模型结构清晰，并注重仿真实用性，例如包含完整的关节限位、碰撞网格和合理的驱动参数。\n\n技术亮点在于其模型的“即用性”和高质量。每个模型都经过人工检查和测试，确保在 MuJoCo 中能够稳定运行，行为符合预期。集合持续更新，涵盖了当前机器人社区中许多热门和经典的平台模型，为仿真实验提供了坚实的基础。","\u003Ch1>\n  \u003Ca href=\"#\">\u003Cimg alt=\"MuJoCo Menagerie\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_aabd161b0e13.png\" width=\"100%\">\u003C\u002Fa>\n\u003C\u002Fh1>\n\n\u003Cp>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Factions\u002Fworkflows\u002Fbuild.yml?query=branch%3Amain\" alt=\"GitHub Actions\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fbuild.yml?branch=main\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fmodels.html\" alt=\"Documentation\">\n    \u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_13d664e1afd7.png\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fblob\u002Fmain\u002FCONTRIBUTING.md\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPRs-welcome-green.svg\" alt=\"PRs\" height=\"20\">\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n**Menagerie** is a collection of high-quality models for the\n[MuJoCo](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco) physics engine, curated by\nGoogle DeepMind.\n\nA physics simulator is only as good as the model it is simulating, and in a\npowerful simulator like MuJoCo with many modeling options, it is easy to create\n\"bad\" models which do not behave as expected. The goal of this collection is to\nprovide the community with a curated library of well-designed models that work\nwell right out of the gate.\n\n### Gallery\n\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_43dba392b538.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_a31e4a16f6e8.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_e24179345020.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_b01b4b8caac9.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bf2617a7e544.png' width=100>|\n| :---: | :---: | :---: | :---: | :---: |\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_8b39130ea16e.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_0ab7c15e16c4.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_5a0b677e876e.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_1842c191ae78.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_06b2462445d9.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_67a673b6e410.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_15bfbd894b3c.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_667c8b79594d.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_b1b27048dbeb.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_95d1f8147347.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_78c5b4b43cc0.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_6f3c1033c3b0.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_1759e7c04327.png' width=100>|\u003Cimg 
src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_5ed9457378a2.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_3863b5c97e1c.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_8c188c5412a3.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_81eeeded9728.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_b25c835536f1.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_f8ed30800ed3.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bc2c6fe56cc5.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_9557bf5a065e.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_e2e14a5ed97f.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_e1b234028b42.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_7ec347125372.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_74af9faac866.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_29fa3acdbccd.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_28c2e04a127b.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_c0aee5583cb9.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_d4ff79dd7bd1.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_5420da487fcd.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_962095a52fc3.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bc1167ac27c1.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_ba8c8423a7e4.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bf060c60f52f.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_45378eed2bc4.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_74b4371934bc.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_fc18ea1fcf0a.png' width=100>||||\n\n\n- [Getting Started](#getting-started)\n  - [Prerequisites](#prerequisites)\n  - [Overview](#overview)\n  - [Usage](#usage)\n    - [Via `robot-descriptions`](#via-robot-descriptions)\n    - [Via `git clone`](#via-git-clone)\n- 
[Model Quality and Contributing](#model-quality-and-contributing)\n- [Menagerie Models](#menagerie-models)\n- [Citing Menagerie](#citing-menagerie)\n- [Acknowledgments](#acknowledgments)\n- [Changelog](#changelog)\n- [License and Disclaimer](#license-and-disclaimer)\n\n## Getting Started\n\n### Prerequisites\n\nThe minimum required MuJoCo version for each model is specified in its\nrespective README. You can download prebuilt binaries for MuJoCo from the GitHub\n[releases page](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Freleases\u002F), or if you\nare working with Python, you can install the native bindings from\n[PyPI](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmujoco\u002F) via `pip install mujoco`. For\nalternative installation instructions, see\n[here](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco#installation).\n\n### Overview\n\nThe structure of Menagerie is illustrated below. For brevity, we have only\nincluded one model directory since all others follow the exact same pattern.\n\n```bash\n├── unitree_go2\n│   ├── assets\n│   │   ├── base_0.obj\n│   │   ├── ...\n│   ├── go2.png\n│   ├── go2.xml\n│   ├── LICENSE\n│   ├── README.md\n│   ├── scene.xml\n│   ├── go2_mjx.xml\n│   └── scene_mjx.xml\n```\n\n- `assets`: stores the 3D meshes (.stl or .obj) of the model used for visual and\n  collision purposes\n- `LICENSE`: describes the copyright and licensing terms of the model\n- `README.md`: contains detailed steps describing how the model's MJCF XML file\n  was generated\n- `\u003Cmodel>.xml`: contains the MJCF definition of the model\n- `scene.xml`: includes `\u003Cmodel>.xml` with a plane, a light source and\n  potentially other objects\n- `\u003Cmodel>.png`: a PNG image of `scene.xml`\n- `\u003Cmodel>_mjx.xml`: contains an MJX-compatible version of the model. Not all\n  models have an MJX variant (see [Menagerie Models](#menagerie-models) for more\n  information).\n- `scene_mjx.xml`: same as `scene.xml` but loads the MJX variant\n\nNote that `\u003Cmodel>.xml` solely describes the model, i.e., no other entity is\ndefined in the kinematic tree. We leave additional body definitions for the\n`scene.xml` file, as can be seen in the Shadow Hand\n[`scene.xml`](shadow_hand\u002Fscene_right.xml).\n\n### Usage\n\n#### Via `robot-descriptions`\n\nYou can use the open-source\n[`robot_descriptions`](https:\u002F\u002Fgithub.com\u002Frobot-descriptions\u002Frobot_descriptions.py)\npackage to load any model in Menagerie. It is available on PyPI and can be\ninstalled via `pip install robot_descriptions`.\n\nOnce installed, you can load a model of your choice as follows:\n\n```python\nimport mujoco\n\n# Loading a specific model description as an imported module.\nfrom robot_descriptions import panda_mj_description\nmodel = mujoco.MjModel.from_xml_path(panda_mj_description.MJCF_PATH)\n\n# Directly loading an instance of MjModel.\nfrom robot_descriptions.loaders.mujoco import load_robot_description\nmodel = load_robot_description(\"panda_mj_description\")\n\n# Loading a variant of the model, e.g. 
panda without a gripper.\nmodel = load_robot_description(\"panda_mj_description\", variant=\"panda_nohand\")\n```\n\n#### Via `git clone`\n\nYou can also directly clone this repository in the directory of your choice:\n\n```bash\ngit clone https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie.git\n```\n\nYou can then interactively explore the model using the Python viewer:\n\n```bash\npython -m mujoco.viewer --mjcf mujoco_menagerie\u002Funitree_go2\u002Fscene.xml\n```\n\nIf you have further questions, please check out our [FAQ](FAQ.md).\n\n## Model Quality and Contributing\n\nOur goal is to eventually make all Menagerie models as faithful as possible to\nthe real system they are being modeled after. Improving model quality is an\nongoing effort, and the current state of many models is not necessarily\nas good as it could be.\n\nHowever, by releasing Menagerie in its current state, we hope to consolidate\nand increase visibility for community contributions. To help Menagerie users\nset proper expectations around the quality of each model, we introduce the\nfollowing grading system:\n\n| Grade | Description                                                 |\n|-------|-------------------------------------------------------------|\n| A+    | Values are the product of proper system identification      |\n| A     | Values are realistic, but have not been properly identified |\n| B     | Stable, but some values are unrealistic                     |\n| C     | Conditionally stable, can be significantly improved         |\n\nThe grading system will be applied to each model once a proper system\nidentification toolbox is created. We are currently planning to release\nthis toolbox later this year.\n\nFor more information regarding contributions, for example to add a new model to\nMenagerie, see [CONTRIBUTING](CONTRIBUTING.md).\n\n## Menagerie Models\n\n**Arms.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| ARX L5 | ARX Robotics | 7 | [BSD-3-Clause](arx_l5\u002FLICENSE) |✖️|\n| PiPER | AgileX | 7 | [MIT](agilex_piper\u002FLICENSE) |✖️|\n| FR3 | Franka Robotics | 7 | [Apache-2.0](franka_fr3\u002FLICENSE) |✖️|\n| iiwa14 | KUKA | 7 | [BSD-3-Clause](kuka_iiwa_14\u002FLICENSE) |✖️|\n| Lite6 | UFACTORY | 6 | [BSD-3-Clause](ufactory_lite6\u002FLICENSE) |✖️|\n| Panda | Franka Robotics | 7 | [BSD-3-Clause](franka_emika_panda\u002FLICENSE) |✔️|\n| Rizon4 | Flexiv Robotics | 7 | [Apache-2.0](flexiv_rizon4\u002FLICENSE) |✖️|\n| Sawyer | Rethink Robotics | 7 | [Apache-2.0](rethink_robotics_sawyer\u002FLICENSE) |✖️|\n| Unitree Z1 | Unitree Robotics | 6 | [BSD-3-Clause](unitree_z1\u002FLICENSE) |✖️|\n| UR5e | Universal Robots | 6 | [BSD-3-Clause](universal_robots_ur5e\u002FLICENSE) |✖️|\n| UR10e | Universal Robots | 6 | [BSD-3-Clause](universal_robots_ur10e\u002FLICENSE) |✖️|\n| ViperX 300 | Trossen Robotics | 8 | [BSD-3-Clause](trossen_vx300s\u002FLICENSE) |✖️|\n| WidowX 250 | Trossen Robotics | 8 | [BSD-3-Clause](trossen_wx250s\u002FLICENSE) |✖️|\n| xarm7 | UFACTORY | 7 | [BSD-3-Clause](ufactory_xarm7\u002FLICENSE) |✖️|\n| Gen3 | Kinova Robotics | 7 | [BSD-3-Clause](kinova_gen3\u002FLICENSE) |✖️|\n| SO-ARM100 | The Robot Studio | 5 | [Apache-2.0](trs_so_arm100\u002FLICENSE) |✖️|\n| Koch v1.1 Low-Cost Robot | Hugging Face | 5 | [Apache-2.0](low_cost_robot_arm\u002FLICENSE) |✖️|\n| YAM | I2RT Robotics | 7 | [MIT](i2rt_yam\u002FLICENSE) |✖️|\n\n**Bipeds.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| 
Cassie | Agility Robotics | 28 | [BSD-3-Clause](agility_cassie\u002FLICENSE) |✖️|\n\n**Dual Arms.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| ALOHA 2 | Trossen Robotics, Google DeepMind | 16 | [BSD-3-Clause](aloha\u002FLICENSE) |✔️|\n\n**Drones.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| Crazyflie 2 | Bitcraze | 0 | [MIT](bitcraze_crazyflie_2\u002FLICENSE) |✖️|\n| Skydio X2 | Skydio | 0 | [Apache-2.0](skydio_x2\u002FLICENSE) |✖️|\n\n**End-effectors.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| Allegro Hand V3 | Wonik Robotics | 16 | [BSD-2-Clause](wonik_allegro\u002FLICENSE) |✖️|\n| UMI Gripper | Stanford University | 1 | [MIT](umi_gripper\u002FLICENSE) |✖️|\n| LEAP Hand | Carnegie Mellon University | 16 | [MIT](leap_hand\u002FLICENSE) |✖️|\n| Robotiq 2F-85 | Robotiq | 8 | [BSD-2-Clause](robotiq_2f85\u002FLICENSE) |✖️|\n| Shadow Hand EM35 | Shadow Robot Company | 24 | [Apache-2.0](shadow_hand\u002FLICENSE) |✖️|\n| Shadow DEX-EE Hand | Shadow Robot Company | 12 | [Apache-2.0](shadow_dexee\u002FLICENSE) |✖️|\n\n**Mobile Manipulators.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| Google Robot | Google DeepMind | 9 | [Apache-2.0](google_robot\u002FLICENSE) |✖️|\n| Stanford TidyBot | Stanford University | 11 | [MIT](stanford_tidybot\u002FLICENSE) |✖️|\n| Stretch 2 | Hello Robot | 17 | [Clear BSD](hello_robot_stretch\u002FLICENSE) |✖️|\n| Stretch 3 | Hello Robot | 17 | [Apache-2.0](hello_robot_stretch_3\u002FLICENSE) |✖️|\n| PAL Tiago | PAL Robotics | 12 | [Apache-2.0](pal_tiago\u002FLICENSE) |✖️|\n| PAL Tiago Dual | PAL Robotics | 21 | [Apache-2.0](pal_tiago_dual\u002FLICENSE) |✖️|\n\n**Mobile Bases.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| Omniwheel Soccer Robot | Robot Soccer Kit | 4 | [MIT](robot_soccer_kit\u002FLICENSE) |✖️|\n\n**Humanoids.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| Adam Lite | PNDbotics | 25 | [MIT](pndbotics_adam_lite\u002FLICENSE) |✖️|\n| Apptronik Apollo | Apptronik | 32 | [Apache-2.0](apptronik_apollo\u002FLICENSE) |✔️|\n| Berkeley Humanoid | Hybrid Robotics | 12 | [BSD-3-Clause](berkeley_humanoid\u002FLICENSE) |✖️|\n| Booster T1 | Booster Robotics | 23 | [Apache-2.0](booster_t1\u002FLICENSE) |✖️|\n| Fourier N1 | Fourier Robotics | 30 | [Apache-2.0](fourier_n1\u002FLICENSE) |✖️|\n| Robotis OP3 | Robotis | 20 | [Apache-2.0](robotis_op3\u002FLICENSE) |✖️|\n| TALOS | PAL Robotics | 32 | [Apache-2.0](pal_talos\u002FLICENSE) |✖️|\n| Unitree G1 | Unitree Robotics | 37 | [BSD-3-Clause](unitree_g1\u002FLICENSE) |✔️|\n| Unitree H1 | Unitree Robotics | 19 | [BSD-3-Clause](unitree_h1\u002FLICENSE) |✖️|\n| ToddlerBot 2XC | Stanford University | 30 | [MIT](toddlerbot_2xc\u002FLICENSE) |✔️|\n| ToddlerBot 2XM | Stanford University | 30 | [MIT](toddlerbot_2xm\u002FLICENSE) |✔️|\n\n**Quadrupeds.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| ANYmal B | ANYbotics | 12 | [BSD-3-Clause](anybotics_anymal_b\u002FLICENSE) |✖️|\n| ANYmal C | ANYbotics | 12 | [BSD-3-Clause](anybotics_anymal_c\u002FLICENSE) |✔️|\n| Spot | Boston Dynamics | 12 | [BSD-3-Clause](boston_dynamics_spot\u002FLICENSE) |✖️|\n| Unitree A1 | Unitree Robotics | 12 | [BSD-3-Clause](unitree_a1\u002FLICENSE) |✖️|\n| Unitree Go1 | Unitree Robotics | 12 | 
[BSD-3-Clause](unitree_go1\u002FLICENSE) |✖️|\n| Unitree Go2 | Unitree Robotics | 12 | [BSD-3-Clause](unitree_go2\u002FLICENSE) |✔️|\n| Google Barkour v0 | Google DeepMind | 12 | [Apache-2.0](google_barkour_v0\u002FLICENSE) |✔️|\n| Google Barkour vB | Google DeepMind | 12 | [Apache-2.0](google_barkour_vb\u002FLICENSE) |✔️|\n\n**Biomechanical.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| IIT Softfoot | IIT Softbots | 92 | [BSD-3-Clause](iit_softfoot\u002FLICENSE) |✖️|\n| flybody | Google DeepMind, HHMI Janelia Research Campus | 102 | [Apache-2.0](flybody\u002FLICENSE) |✖️|\n\n**Miscellaneous.**\n\n| Name | Maker | DoFs    | License | MJX |\n|------|-------|---------|---------|-----|\n| D435i | Intel Realsense | 0 | [Apache-2.0](realsense_d435i\u002FLICENSE) |✖️|\n\n## Citing Menagerie\n\nIf you use Menagerie in your work, please use the following citation:\n\n```bibtex\n@software{menagerie2022github,\n  author = {Zakka, Kevin and Tassa, Yuval and {MuJoCo Menagerie Contributors}},\n  title = {{MuJoCo Menagerie: A collection of high-quality simulation models for MuJoCo}},\n  url = {http:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie},\n  year = {2022},\n}\n```\n\n## Acknowledgments\n\nThe models in this repository are based on third-party models designed by many talented people, and would not have been possible without their generous open-source contributions. We would like to acknowledge all the designers and engineers who made MuJoCo Menagerie possible.\n\nWe'd like to thank Pedro Vergani for his help with visuals and design.\n\nThe main effort required to make this repository publicly available was undertaken by [Kevin Zakka](https:\u002F\u002Fkzakka.com\u002F), with help from the Robotics Simulation team at Google DeepMind.\n\nThis project has also benefited from contributions by members of the broader community — see the [CONTRIBUTORS.md](.\u002FCONTRIBUTORS.md) for a full list.\n\n## Changelog\n\nFor a summary of key updates across the repository, see the [global CHANGELOG.md](.\u002FCHANGELOG.md).\n\nEach individual model also includes its own `CHANGELOG.md` file with model-specific updates, linked directly from the corresponding README.\n\n## License and Disclaimer\n\nXML and asset files in each individual model directory of this repository are\nsubject to different license terms. Please consult the `LICENSE` files under\neach specific model subdirectory for the relevant license and copyright\ninformation.\n\nAll other content is Copyright 2022 DeepMind Technologies Limited and licensed\nunder the Apache License, Version 2.0. 
A copy of this license is provided in the\ntop-level LICENSE file in this repository.\nYou can also obtain it from https:\u002F\u002Fwww.apache.org\u002Flicenses\u002FLICENSE-2.0.\n\nThis is not an officially supported Google product.\n","\u003Ch1>\n  \u003Ca href=\"#\">\u003Cimg alt=\"MuJoCo Menagerie\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_aabd161b0e13.png\" width=\"100%\">\u003C\u002Fa>\n\u003C\u002Fh1>\n\n\u003Cp>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Factions\u002Fworkflows\u002Fbuild.yml?query=branch%3Amain\" alt=\"GitHub Actions\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fbuild.yml?branch=main\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fmodels.html\" alt=\"Documentation\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_13d664e1afd7.png\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fblob\u002Fmain\u002FCONTRIBUTING.md\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPRs-welcome-green.svg\" alt=\"PRs\" height=\"20\">\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n**Menagerie** 是一个由 Google DeepMind 精心整理的、用于 [MuJoCo](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco) 物理引擎的高质量模型集合。\n\n一个物理模拟器的好坏取决于它所模拟的模型。在像 MuJoCo 这样功能强大、建模选项众多的模拟器中，很容易创建出行为不符合预期的“坏”模型。本集合的目标是为社区提供一个精心设计的模型库，这些模型开箱即用，性能良好。\n\n### 模型画廊\n\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_43dba392b538.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_a31e4a16f6e8.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_e24179345020.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_b01b4b8caac9.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bf2617a7e544.png' width=100>|\n| :---: | :---: | :---: | :---: | :---: |\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_8b39130ea16e.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_0ab7c15e16c4.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_5a0b677e876e.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_1842c191ae78.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_06b2462445d9.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_67a673b6e410.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_15bfbd894b3c.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_667c8b79594d.png' width=100>|\u003Cimg 
src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_b1b27048dbeb.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_95d1f8147347.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_78c5b4b43cc0.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_6f3c1033c3b0.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_1759e7c04327.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_5ed9457378a2.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_3863b5c97e1c.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_8c188c5412a3.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_81eeeded9728.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_b25c835536f1.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_f8ed30800ed3.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bc2c6fe56cc5.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_9557bf5a065e.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_e2e14a5ed97f.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_e1b234028b42.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_7ec347125372.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_74af9faac866.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_29fa3acdbccd.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_28c2e04a127b.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_c0aee5583cb9.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_d4ff79dd7bd1.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_5420da487fcd.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_962095a52fc3.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bc1167ac27c1.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_ba8c8423a7e4.png' width=100>|\u003Cimg 
src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_bf060c60f52f.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_45378eed2bc4.png' width=100>|\n|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_74b4371934bc.png' width=100>|\u003Cimg src='https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_readme_fc18ea1fcf0a.png' width=100>||||\n\n\n- [快速开始](#快速开始)\n  - [先决条件](#先决条件)\n  - [概览](#概览)\n  - [使用方法](#使用方法)\n    - [通过 `robot-descriptions`](#通过-robot-descriptions)\n    - [通过 `git clone`](#通过-git-clone)\n- [模型质量与贡献](#模型质量与贡献)\n- [Menagerie 模型列表](#menagerie-模型列表)\n- [引用 Menagerie](#引用-menagerie)\n- [致谢](#致谢)\n- [更新日志](#更新日志)\n- [许可与免责声明](#许可与免责声明)\n\n## 快速开始\n\n### 先决条件\n\n每个模型所需的最低 MuJoCo 版本在其各自的 README 文件中指定。你可以从 GitHub [发布页面](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Freleases\u002F) 下载 MuJoCo 的预编译二进制文件。如果你使用 Python，也可以通过 `pip install mujoco` 从 [PyPI](https:\u002F\u002Fpypi.org\u002Fproject\u002Fmujoco\u002F) 安装原生绑定。其他安装说明请参见 [此处](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco#installation)。\n\n### 概览\n\nMenagerie 的结构如下所示。为简洁起见，我们只包含了一个模型目录，因为所有其他目录都遵循完全相同的模式。\n\n```bash\n├── unitree_go2\n│   ├── assets\n│   │   ├── base_0.obj\n│   │   ├── ...\n│   ├── go2.png\n│   ├── go2.xml\n│   ├── LICENSE\n│   ├── README.md\n│   ├── scene.xml\n│   ├── go2_mjx.xml\n│   └── scene_mjx.xml\n```\n\n- `assets`: 存储用于视觉和碰撞目的的模型 3D 网格文件 (.stl 或 .obj)\n- `LICENSE`: 描述模型的版权和许可条款\n- `README.md`: 包含详细步骤，描述模型的 MJCF XML 文件是如何生成的\n- `\u003Cmodel>.xml`: 包含模型的 MJCF 定义\n- `scene.xml`: 包含 `\u003Cmodel>.xml`，并添加了一个平面、一个光源，可能还有其他物体\n- `\u003Cmodel>.png`: `scene.xml` 的 PNG 图像\n- `\u003Cmodel>_mjx.xml`: 包含与 MJX 兼容的模型版本。并非所有模型都有 MJX 变体（更多信息请参见 [Menagerie 模型列表](#menagerie-模型列表)）。\n- `scene_mjx.xml`: 与 `scene.xml` 相同，但加载的是 MJX 变体\n\n请注意，`\u003Cmodel>.xml` 仅描述模型本身，即运动学树中没有定义其他实体。我们将额外的物体定义留给 `scene.xml` 文件，例如 Shadow Hand 的 [`scene.xml`](shadow_hand\u002Fscene_right.xml)。\n\n### 使用方法\n\n#### 通过 `robot-descriptions`\n\n你可以使用开源的 [`robot_descriptions`](https:\u002F\u002Fgithub.com\u002Frobot-descriptions\u002Frobot_descriptions.py) 包来加载 Menagerie 中的任何模型。该包已发布在 PyPI 上，可通过 `pip install robot_descriptions` 安装。\n\n安装后，你可以按如下方式加载你选择的模型：\n\n```python\nimport mujoco\n\n# 以导入模块的方式加载特定的模型描述。\nfrom robot_descriptions import panda_mj_description\nmodel = mujoco.MjModel.from_xml_path(panda_mj_description.MJCF_PATH)\n\n# 直接加载 MjModel 实例。\nfrom robot_descriptions.loaders.mujoco import load_robot_description\nmodel = load_robot_description(\"panda_mj_description\")\n\n# 加载模型的变体，例如不带夹爪的 Panda 模型。\nmodel = load_robot_description(\"panda_mj_description\", variant=\"panda_nohand\")\n```\n\n#### 通过 `git clone`\n\n你也可以直接在所选目录中克隆此仓库：\n\n```bash\ngit clone https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie.git\n```\n\n然后，你可以使用 Python 查看器交互式地探索模型：\n\n```bash\npython -m mujoco.viewer --mjcf mujoco_menagerie\u002Funitree_go2\u002Fscene.xml\n```
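\n\n作为补充，下面给出一个最小的脚本加载草图（假设已按上文克隆仓库到当前目录，并通过 `pip install mujoco` 安装了 Python 绑定；`unitree_go2` 可替换为任意模型目录）：\n\n```python\nimport mujoco\n\n# 从克隆的仓库中加载 Go2 场景（scene.xml 含地面与光源）。\nmodel = mujoco.MjModel.from_xml_path(\"mujoco_menagerie\u002Funitree_go2\u002Fscene.xml\")\ndata = mujoco.MjData(model)\n\n# 推进若干个仿真步，验证模型可以稳定运行。\nfor _ in range(1000):\n    mujoco.mj_step(model, data)\n\nprint(\"仿真时间（秒）:\", data.time)\n```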
\n\n如果你有更多问题，请查看我们的 [常见问题解答](FAQ.md)。\n\n## 模型质量与贡献\n\n我们的目标是最终使所有 Menagerie 模型尽可能逼真地模拟它们所对应的真实系统。提高模型质量是一项持续的努力，目前许多模型的状态未必能达到最佳水平。\n\n然而，通过以当前状态发布 Menagerie，我们希望能整合并提高社区贡献的可见度。为了帮助 Menagerie 用户对每个模型的质量建立适当的期望，我们引入了以下分级系统：\n\n| 等级 | 描述 |\n|-------|-------------------------------------------------------------|\n| A+    | 数值是经过恰当系统辨识的产物 |\n| A     | 数值是真实的，但尚未经过恰当辨识 |\n| B     | 稳定，但部分数值不真实 |\n| C     | 条件性稳定，有显著改进空间 |\n\n一旦创建了恰当的系统辨识工具箱，该分级系统将应用于每个模型。我们目前计划在今年晚些时候发布这个工具箱。\n\n有关贡献的更多信息，例如如何向 Menagerie 添加新模型，请参阅 [贡献指南](CONTRIBUTING.md)。\n\n## Menagerie 模型列表\n\n**机械臂。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| ARX L5 | ARX Robotics | 7 | [BSD-3-Clause](arx_l5\u002FLICENSE) |✖️|\n| PiPER | AgileX | 7 | [MIT](agilex_piper\u002FLICENSE) |✖️|\n| FR3 | Franka Robotics | 7 | [Apache-2.0](franka_fr3\u002FLICENSE) |✖️|\n| iiwa14 | KUKA | 7 | [BSD-3-Clause](kuka_iiwa_14\u002FLICENSE) |✖️|\n| Lite6 | UFACTORY | 6 | [BSD-3-Clause](ufactory_lite6\u002FLICENSE) |✖️|\n| Panda | Franka Robotics | 7 | [BSD-3-Clause](franka_emika_panda\u002FLICENSE) |✔️|\n| Rizon4 | Flexiv Robotics | 7 | [Apache-2.0](flexiv_rizon4\u002FLICENSE) |✖️|\n| Sawyer | Rethink Robotics | 7 | [Apache-2.0](rethink_robotics_sawyer\u002FLICENSE) |✖️|\n| Unitree Z1 | Unitree Robotics | 6 | [BSD-3-Clause](unitree_z1\u002FLICENSE) |✖️|\n| UR5e | Universal Robots | 6 | [BSD-3-Clause](universal_robots_ur5e\u002FLICENSE) |✖️|\n| UR10e | Universal Robots | 6 | [BSD-3-Clause](universal_robots_ur10e\u002FLICENSE) |✖️|\n| ViperX 300 | Trossen Robotics | 8 | [BSD-3-Clause](trossen_vx300s\u002FLICENSE) |✖️|\n| WidowX 250 | Trossen Robotics | 8 | [BSD-3-Clause](trossen_wx250s\u002FLICENSE) |✖️|\n| xarm7 | UFACTORY | 7 | [BSD-3-Clause](ufactory_xarm7\u002FLICENSE) |✖️|\n| Gen3 | Kinova Robotics | 7 | [BSD-3-Clause](kinova_gen3\u002FLICENSE) |✖️|\n| SO-ARM100 | The Robot Studio | 5 | [Apache-2.0](trs_so_arm100\u002FLICENSE) |✖️|\n| Koch v1.1 Low-Cost Robot | Hugging Face | 5 | [Apache-2.0](low_cost_robot_arm\u002FLICENSE) |✖️|\n| YAM | I2RT Robotics | 7 | [MIT](i2rt_yam\u002FLICENSE) |✖️|\n\n**双足机器人。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| Cassie | Agility Robotics | 28 | [BSD-3-Clause](agility_cassie\u002FLICENSE) |✖️|\n\n**双臂机器人。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| ALOHA 2 | Trossen Robotics, Google DeepMind | 16 | [BSD-3-Clause](aloha\u002FLICENSE) |✔️|\n\n**无人机。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| Crazyflie 2 | Bitcraze | 0 | [MIT](bitcraze_crazyflie_2\u002FLICENSE) |✖️|\n| Skydio X2 | Skydio | 0 | [Apache-2.0](skydio_x2\u002FLICENSE) |✖️|\n\n**末端执行器。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| Allegro Hand V3 | Wonik Robotics | 16 | [BSD-2-Clause](wonik_allegro\u002FLICENSE) |✖️|\n| UMI Gripper | Stanford University | 1 | [MIT](umi_gripper\u002FLICENSE) |✖️|\n| LEAP Hand | Carnegie Mellon University | 16 | [MIT](leap_hand\u002FLICENSE) |✖️|\n| Robotiq 2F-85 | Robotiq | 8 | [BSD-2-Clause](robotiq_2f85\u002FLICENSE) |✖️|\n| Shadow Hand EM35 | Shadow Robot Company | 24 | [Apache-2.0](shadow_hand\u002FLICENSE) |✖️|\n| Shadow DEX-EE Hand | Shadow Robot Company | 12 | [Apache-2.0](shadow_dexee\u002FLICENSE) |✖️|\n\n**移动操作机器人。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| Google Robot | Google DeepMind | 9 | [Apache-2.0](google_robot\u002FLICENSE) |✖️|\n| Stanford TidyBot | Stanford University | 11 | [MIT](stanford_tidybot\u002FLICENSE) |✖️|\n| Stretch 2 | Hello Robot | 17 | [Clear BSD](hello_robot_stretch\u002FLICENSE) |✖️|\n| Stretch 3 | Hello Robot | 17 | [Apache-2.0](hello_robot_stretch_3\u002FLICENSE) |✖️|\n| PAL Tiago | PAL Robotics | 12 | [Apache-2.0](pal_tiago\u002FLICENSE) |✖️|\n| PAL Tiago Dual | PAL Robotics | 21 | 
[Apache-2.0](pal_tiago_dual\u002FLICENSE) |✖️|\n\n**移动底盘。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| Omniwheel Soccer Robot | Robot Soccer Kit | 4 | [MIT](robot_soccer_kit\u002FLICENSE) |✖️|\n\n**人形机器人。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| Adam Lite | PNDbotics | 25 | [MIT](pndbotics_adam_lite\u002FLICENSE) |✖️|\n| Apptronik Apollo | Apptronik | 32 | [Apache-2.0](apptronik_apollo\u002FLICENSE) |✔️|\n| Berkeley Humanoid | Hybrid Robotics | 12 | [BSD-3-Clause](berkeley_humanoid\u002FLICENSE) |✖️|\n| Booster T1 | Booster Robotics | 23 | [Apache-2.0](booster_t1\u002FLICENSE) |✖️|\n| Fourier N1 | Fourier Robotics | 30 | [Apache-2.0](fourier_n1\u002FLICENSE) |✖️|\n| Robotis OP3 | Robotis | 20 | [Apache-2.0](robotis_op3\u002FLICENSE) |✖️|\n| TALOS | PAL Robotics | 32 | [Apache-2.0](pal_talos\u002FLICENSE) |✖️|\n| Unitree G1 | Unitree Robotics | 37 | [BSD-3-Clause](unitree_g1\u002FLICENSE) |✔️|\n| Unitree H1 | Unitree Robotics | 19 | [BSD-3-Clause](unitree_h1\u002FLICENSE) |✖️|\n| ToddlerBot 2XC | Stanford University | 30 | [MIT](toddlerbot_2xc\u002FLICENSE) |✔️|\n| ToddlerBot 2XM | Stanford University | 30 | [MIT](toddlerbot_2xm\u002FLICENSE) |✔️|\n\n**四足机器人。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| ANYmal B | ANYbotics | 12 | [BSD-3-Clause](anybotics_anymal_b\u002FLICENSE) |✖️|\n| ANYmal C | ANYbotics | 12 | [BSD-3-Clause](anybotics_anymal_c\u002FLICENSE) |✔️|\n| Spot | Boston Dynamics | 12 | [BSD-3-Clause](boston_dynamics_spot\u002FLICENSE) |✖️|\n| Unitree A1 | Unitree Robotics | 12 | [BSD-3-Clause](unitree_a1\u002FLICENSE) |✖️|\n| Unitree Go1 | Unitree Robotics | 12 | [BSD-3-Clause](unitree_go1\u002FLICENSE) |✖️|\n| Unitree Go2 | Unitree Robotics | 12 | [BSD-3-Clause](unitree_go2\u002FLICENSE) |✔️|\n| Google Barkour v0 | Google DeepMind | 12 | [Apache-2.0](google_barkour_v0\u002FLICENSE) |✔️|\n| Google Barkour vB | Google DeepMind | 12 | [Apache-2.0](google_barkour_vb\u002FLICENSE) |✔️|\n\n**生物力学模型。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| IIT Softfoot | IIT Softbots | 92 | [BSD-3-Clause](iit_softfoot\u002FLICENSE) |✖️|\n| flybody | Google DeepMind, HHMI Janelia Research Campus | 102 | [Apache-2.0](flybody\u002FLICENSE) |✖️|\n\n**其他。**\n\n| 名称 | 制造商 | 自由度 (DoFs) | 许可证 | MJX |\n|------|-------|---------|---------|-----|\n| D435i | Intel Realsense | 0 | [Apache-2.0](realsense_d435i\u002FLICENSE) |✖️|\n\n## 引用 Menagerie\n\n如果您在您的工作中使用了 Menagerie，请使用以下引用：\n\n```bibtex\n@software{menagerie2022github,\n  author = {Zakka, Kevin and Tassa, Yuval and {MuJoCo Menagerie Contributors}},\n  title = {{MuJoCo Menagerie: A collection of high-quality simulation models for MuJoCo}},\n  url = {http:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie},\n  year = {2022},\n}\n```\n\n## 致谢\n\n本仓库中的模型基于许多才华横溢的人设计的第三方模型，没有他们慷慨的开源贡献，这一切都不可能实现。我们要感谢所有使 MuJoCo Menagerie 成为可能的设计师和工程师。\n\n我们要感谢 Pedro Vergani 在视觉效果和设计方面提供的帮助。\n\n使此仓库公开可用的主要工作由 [Kevin Zakka](https:\u002F\u002Fkzakka.com\u002F) 承担，并得到了 Google DeepMind 机器人仿真团队的帮助。\n\n该项目也受益于更广泛社区成员的贡献——完整列表请参见 [CONTRIBUTORS.md](.\u002FCONTRIBUTORS.md)。\n\n## 更新日志\n\n有关整个仓库关键更新的摘要，请参阅 [全局 CHANGELOG.md](.\u002FCHANGELOG.md)。\n\n每个单独的模型也包含其自己的 `CHANGELOG.md` 文件，其中记录了特定于模型的更新，可直接从相应的 README 中链接访问。\n\n## 许可与免责声明\n\n本仓库中每个独立模型目录下的 XML 和资产文件遵循不同的许可条款。请查阅每个具体模型子目录下的 `LICENSE` 文件以获取相关许可和版权信息。\n\n所有其他内容的版权归 DeepMind Technologies Limited 2022 所有，并依据 Apache License, Version 2.0 进行许可。本仓库的顶级 LICENSE 文件中提供了该许可的副本。\n您也可以从 https:\u002F\u002Fwww.apache.org\u002Flicenses\u002FLICENSE-2.0 获取该许可。\n\n这不是 Google 官方支持的产品。","# mujoco_menagerie 快速上手指南\n\n## 环境准备\n\n### 系统要求\n- 支持 MuJoCo 物理引擎的操作系统（如 Linux, macOS, Windows）。\n- 确保已安装 Python（推荐 3.8 及以上版本）。\n\n### 前置依赖\n必须安装 **MuJoCo** 物理引擎。每个模型所需的最低 MuJoCo 版本在其各自的 `README.md` 中有说明。\n\n**安装 MuJoCo：**\n- **方式一（推荐，通过 pip 安装 Python 绑定）：**\n    ```bash\n    pip install mujoco\n    ```\n- **方式二（下载预编译二进制文件）：**\n    从 MuJoCo 的 GitHub [发布页面](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Freleases\u002F) 下载并安装。\n\n## 安装步骤\n\n选择以下任一方式获取模型文件。\n\n### 方式一：使用 `robot-descriptions` 包（推荐）\n这是一个开源 Python 包，可以方便地加载 Menagerie 中的模型。\n\n1.  安装 `robot_descriptions`：\n    ```bash\n    pip install robot_descriptions\n    ```\n\n### 方式二：直接克隆仓库\n如果你想直接访问所有模型文件，可以克隆整个仓库。\n\n1.  克隆仓库到本地：\n    ```bash\n    git clone https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie.git\n    ```\n\n## 基本使用\n\n### 使用 `robot-descriptions` 加载模型\n安装 `robot_descriptions` 后，你可以在 Python 代码中轻松加载模型：\n\n```python\nimport mujoco\n\n# 方法1：导入特定模型的描述模块\nfrom robot_descriptions import panda_mj_description\nmodel = mujoco.MjModel.from_xml_path(panda_mj_description.MJCF_PATH)\n\n# 方法2：使用加载器直接获取 MjModel 实例\nfrom robot_descriptions.loaders.mujoco import load_robot_description\nmodel = load_robot_description(\"panda_mj_description\")\n\n# 加载模型的变体，例如不带夹爪的 Panda\nmodel = load_robot_description(\"panda_mj_description\", variant=\"panda_nohand\")\n```
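\n\n如果希望在脚本中直接打开交互窗口，可以参考下面的最小草图（假设本机支持图形界面，且已安装 mujoco 与 robot_descriptions；`launch_passive` 是 MuJoCo Python 绑定提供的被动查看器）：\n\n```python\nimport time\n\nimport mujoco\nimport mujoco.viewer\nfrom robot_descriptions.loaders.mujoco import load_robot_description\n\n# 加载 Panda 模型并创建仿真数据。\nmodel = load_robot_description(\"panda_mj_description\")\ndata = mujoco.MjData(model)\n\n# 以被动模式打开查看器，由脚本自行控制步进节奏。\nwith mujoco.viewer.launch_passive(model, data) as viewer:\n    while viewer.is_running():\n        mujoco.mj_step(model, data)\n        viewer.sync()\n        time.sleep(model.opt.timestep)\n```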
\n\n### 使用克隆的仓库查看模型\n如果你克隆了仓库，可以使用 MuJoCo 的 Python 查看器交互式地查看模型：\n\n```bash\n# 进入克隆的目录，或使用绝对路径\npython -m mujoco.viewer --mjcf mujoco_menagerie\u002Funitree_go2\u002Fscene.xml\n```\n将 `unitree_go2` 替换为你感兴趣的模型文件夹名称（如 `franka_emika_panda`）。\n\n### 模型文件结构说明\n每个模型文件夹（例如 `unitree_go2`）通常包含以下文件：\n- `\u003Cmodel>.xml`: 模型的核心 MJCF 定义文件。\n- `scene.xml`: 包含模型、地面、光源等的完整场景文件，适合直接用于仿真或查看。\n- `assets\u002F`: 存放模型视觉和碰撞用的 3D 网格文件（.obj 或 .stl）。\n- `README.md`: 该模型的详细说明和生成步骤。\n- `LICENSE`: 该模型的许可证文件。","一名机器人学研究生正在使用 MuJoCo 物理引擎开发一个四足机器人（如 ANYmal）的强化学习控制算法，用于复杂地形行走任务。\n\n### 没有 mujoco_menagerie 时\n- **模型获取困难**：需要从机器人制造商官网、研究论文或开源社区中零散地寻找 ANYmal 的模型文件，过程耗时且版本混乱。\n- **模型质量参差不齐**：找到的模型可能存在几何尺寸错误、关节轴定义不准确或质量\u002F惯性参数不合理等问题，导致仿真中的机器人行为怪异，与真实物理不符。\n- **调试成本高昂**：大量时间被浪费在排查和修复模型本身的错误上，而非专注于控制算法研究。例如，需要反复调整模型参数才能使机器人正常站立。\n- **协作与复现障碍**：由于每个人使用的模型来源和版本可能不同，实验室同学间难以共享和复现彼此的仿真实验代码，增加了沟通成本。\n\n### 使用 mujoco_menagerie 后\n- **一站式获取高质量模型**：直接从 mujoco_menagerie 库中导入经过 Google DeepMind 验证的 `anybotics_anymal_b` 或 `anybotics_anymal_c` 模型，几分钟内即可获得一个可直接运行的、高保真的仿真环境。\n- **开箱即用的仿真保真度**：模型经过精心校准，几何、动力学参数准确，机器人能表现出符合预期的物理行为，研究者可以立即信任仿真结果，并在此基础上设计控制策略。\n- **聚焦核心算法开发**：节省了数天甚至数周的模型调试与验证时间，可以将全部精力投入到强化学习算法设计、训练和调优上，加速研究迭代。\n- **促进标准化与协作**：团队内部统一使用 menagerie 中的标准模型，确保了实验基础的一致性，代码共享和结果复现变得简单可靠，提升了团队研究效率。\n\nmujoco_menagerie 通过提供一系列即用、可靠的标准化机器人模型，将研究者从繁琐、易错的模型构建工作中解放出来，使其能专注于算法创新这一核心价值。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_menagerie_43dba392.png","google-deepmind","Google DeepMind","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fgoogle-deepmind_06b1dd17.png","",null,"https:\u002F\u002Fwww.deepmind.com\u002F","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind",[85,89,93],{"name":86,"color":87,"percentage":88},"Python","#3572A5",75.7,{"name":90,"color":91,"percentage":92},"Jupyter Notebook","#DA5B0B",23.8,{"name":94,"color":95,"percentage":96},"OpenSCAD","#e5cd45",0.5,3248,456,"2026-04-05T02:34:42","NOASSERTION","Linux, macOS, Windows","未说明",{"notes":104,"python":102,"dependencies":105},"1. 
核心依赖为 MuJoCo 物理引擎，需单独安装。\n2. 可通过 pip 安装 mujoco 的 Python 原生绑定，或从 GitHub 下载预编译二进制文件。\n3. 每个模型所需的最低 MuJoCo 版本在其各自的 README 中指定。\n4. 提供两种使用方式：通过 robot_descriptions 包（Python）或直接克隆仓库。\n5. 模型文件包含 XML 定义、3D 网格资产及许可证文件。\n6. 部分模型提供 MJX（JAX 兼容）变体。",[106],"mujoco",[18],[106,109],"robotics","2026-03-27T02:49:30.150509","2026-04-06T05:36:41.957898",[113,118,123,128,133,138,143],{"id":114,"question_zh":115,"answer_zh":116,"source_url":117},3913,"如何从 MuJoCo Menagerie 的模型中提取或获取 Denavit-Hartenberg (D-H) 参数？","MuJoCo 模型本身不直接存储 D-H 参数。D-H 参数是机器人运动学的一种特定表示法。要计算正向运动学，你应该直接使用 MuJoCo 提供的 API，例如通过 `data.xpos` 和 `data.xmat` 来获取身体（body）的位置和旋转矩阵，而不是尝试提取 D-H 参数。模型尺寸通常是忠于原始设计的。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F85",{"id":119,"question_zh":120,"answer_zh":121,"source_url":122},3907,"是否有使用 MuJoCo Menagerie 模型的示例代码或教程？","是的，现在有一个非常简洁的教程笔记本（tutorial notebook）可供参考。此外，建议先学习两个 MuJoCo 官方教程（tutorial 1 和 tutorial 2），然后查看 dm_control、dm_robotics 和 mujoco_mpc 等项目，了解如何使用这些模型进行强化学习或最优控制。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F11",{"id":124,"question_zh":125,"answer_zh":126,"source_url":127},3908,"UR5e 模型的正向运动学（Forward Kinematics）计算结果与官方 D-H 参数不一致，如何解决？","问题可能源于世界坐标系和基座坐标系之间存在旋转。一个解决方案是修改模型的朝向。有用户已经创建了一个修改版本的分支，其中调整了方向。你可以参考该分支的改动，或者考虑添加一个与控制器箱（controlbox）坐标系对应的 site\u002Fbody，因为当前“base body”的坐标系并不对应控制器箱的“base frame”。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F75",{"id":129,"question_zh":130,"answer_zh":131,"source_url":132},3909,"ALOHA2 模型中的碰撞检测似乎失效，机器人会穿过物体，如何解决？","碰撞物理问题通常与仿真时间步长（timestep）设置有关。尝试将仿真时间步长降低到 0.002，这可以解决抖动和碰撞失效的问题。此外，检查场景中物体的惯性参数是否合理，不真实的惯性（例如过轻的立方体）会导致物体容易滚动。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F99",{"id":134,"question_zh":135,"answer_zh":136,"source_url":137},3910,"如何为项目贡献新的机器人模型（例如 UFactory Lite6）？","首先，确保上游模型（如 URDF）有明确的许可证（例如 BSD 3-Clause）。然后，使用 `compile` 命令将 URDF 转换为 MuJoCo 模型。之后需要对模型参数（特别是执行器参数）进行调优和系统辨识（sysID）。维护者鼓励社区贡献，并会提供支持将模型集成到仓库中。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F30",{"id":139,"question_zh":140,"answer_zh":141,"source_url":142},3911,"如何为项目贡献像 TIAGo 这样的复杂机器人模型？","贡献流程包括：提供上游 URDF 和许可证信息（如 Apache 2.0），使用 `compile` 命令进行转换，为模型添加执行器（actuators）并进行增益调优以确保稳定性。需要注意避免关节或 body 的命名重复问题。作为机器人原厂团队（如 PAL Robotics）的成员参与开发和维护是受到欢迎的。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F19",{"id":144,"question_zh":145,"answer_zh":146,"source_url":147},3912,"MuJoCo 中是否支持镜子（mirror）效果，或者如何模拟 UMI-Gripper 中的镜子？","MuJoCo 
本身可能不直接支持镜子效果。一个可行的替代方案是：在镜子所在的位置放置摄像头，然后通过后处理将摄像头输出的图像叠加到主视图上，以此来模拟镜面反射。这需要利用渲染和图像合成技术。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco_menagerie\u002Fissues\u002F97",[149,152,155,158,161,164,167,170,173],{"id":150,"version":151,"summary_zh":81,"released_at":81},103378,"universal_robots_ur5e_v0",{"id":153,"version":154,"summary_zh":81,"released_at":81},103379,"unitree_a1_v0",{"id":156,"version":157,"summary_zh":81,"released_at":81},103380,"shadow_hand_v1",{"id":159,"version":160,"summary_zh":81,"released_at":81},103381,"shadow_hand_v0",{"id":162,"version":163,"summary_zh":81,"released_at":81},103382,"robotiq_2f85_v0",{"id":165,"version":166,"summary_zh":81,"released_at":81},103383,"franka_emika_panda_v0",{"id":168,"version":169,"summary_zh":81,"released_at":81},103384,"anybotics_anymal_c_v0",{"id":171,"version":172,"summary_zh":81,"released_at":81},103385,"anybotics_anymal_b_v0",{"id":174,"version":175,"summary_zh":81,"released_at":81},103386,"agility_cassie_v0"]