[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-google-deepmind--mujoco":3,"tool-google-deepmind--mujoco":65},[4,23,32,40,49,57],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":22},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 将是理想的起点。",85092,2,"2026-04-10T11:13:16",[13,14,15,16,17,18,19,20,21],"图像","数据工具","视频","插件","Agent","其他","语言模型","开发框架","音频","ready",{"id":24,"name":25,"github_repo":26,"description_zh":27,"stars":28,"difficulty_score":29,"last_commit_at":30,"category_tags":31,"status":22},5784,"funNLP","fighting41love\u002FfunNLP","funNLP 是一个专为中文自然语言处理（NLP）打造的超级资源库，被誉为“NLP 民工的乐园”。它并非单一的软件工具，而是一个汇集了海量开源项目、数据集、预训练模型和实用代码的综合性平台。\n\n面对中文 NLP 领域资源分散、入门门槛高以及特定场景数据匮乏的痛点，funNLP 提供了“一站式”解决方案。这里不仅涵盖了分词、命名实体识别、情感分析、文本摘要等基础任务的标准工具，还独特地收录了丰富的垂直领域资源，如法律、医疗、金融行业的专用词库与数据集，甚至包含古诗词生成、歌词创作等趣味应用。其核心亮点在于极高的全面性与实用性，从基础的字典词典到前沿的 BERT、GPT-2 模型代码，再到高质量的标注数据和竞赛方案，应有尽有。\n\n无论是刚刚踏入 NLP 领域的学生、需要快速验证想法的算法工程师，还是从事人工智能研究的学者，都能在这里找到急需的“武器弹药”。对于开发者而言，它能大幅减少寻找数据和复现模型的时间；对于研究者，它提供了丰富的基准测试资源和前沿技术参考。funNLP 以开放共享的精神，极大地降低了中文自然语言处理的开发与研究成本，是中文 AI 社区不可或缺的宝藏仓库。",79857,1,"2026-04-08T20:11:31",[19,14,18],{"id":33,"name":34,"github_repo":35,"description_zh":36,"stars":37,"difficulty_score":29,"last_commit_at":38,"category_tags":39,"status":22},5773,"cs-video-courses","Developer-Y\u002Fcs-video-courses","cs-video-courses 
是一个精心整理的计算机科学视频课程清单，旨在为自学者提供系统化的学习路径。它汇集了全球知名高校（如加州大学伯克利分校、新南威尔士大学等）的完整课程录像，涵盖从编程基础、数据结构与算法，到操作系统、分布式系统、数据库等核心领域，并深入延伸至人工智能、机器学习、量子计算及区块链等前沿方向。\n\n面对网络上零散且质量参差不齐的教学资源，cs-video-courses 解决了学习者难以找到成体系、高难度大学级别课程的痛点。该项目严格筛选内容，仅收录真正的大学层级课程，排除了碎片化的简短教程或商业广告，确保用户能接触到严谨的学术内容。\n\n这份清单特别适合希望夯实计算机基础的开发者、需要补充特定领域知识的研究人员，以及渴望像在校生一样系统学习计算机科学的自学者。其独特的技术亮点在于分类极其详尽，不仅包含传统的软件工程与网络安全，还细分了生成式 AI、大语言模型、计算生物学等新兴学科，并直接链接至官方视频播放列表，让用户能一站式获取高质量的教育资源，免费享受世界顶尖大学的课堂体验。",79792,"2026-04-08T22:03:59",[18,13,14,20],{"id":41,"name":42,"github_repo":43,"description_zh":44,"stars":45,"difficulty_score":46,"last_commit_at":47,"category_tags":48,"status":22},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,3,"2026-04-04T04:44:48",[17,13,20,19,18],{"id":50,"name":51,"github_repo":52,"description_zh":53,"stars":54,"difficulty_score":46,"last_commit_at":55,"category_tags":56,"status":22},519,"PaddleOCR","PaddlePaddle\u002FPaddleOCR","PaddleOCR 是一款基于百度飞桨框架开发的高性能开源光学字符识别工具包。它的核心能力是将图片、PDF 等文档中的文字提取出来，转换成计算机可读取的结构化数据，让机器真正“看懂”图文内容。\n\n面对海量纸质或电子文档，PaddleOCR 解决了人工录入效率低、数字化成本高的问题。尤其在人工智能领域，它扮演着连接图像与大型语言模型（LLM）的桥梁角色，能将视觉信息直接转化为文本输入，助力智能问答、文档分析等应用场景落地。\n\nPaddleOCR 适合开发者、算法研究人员以及有文档自动化需求的普通用户。其技术优势十分明显：不仅支持全球 100 多种语言的识别，还能在 Windows、Linux、macOS 等多个系统上运行，并灵活适配 CPU、GPU、NPU 等各类硬件。作为一个轻量级且社区活跃的开源项目，PaddleOCR 
既能满足快速集成的需求，也能支撑前沿的视觉语言研究，是处理文字识别任务的理想选择。",75569,"2026-04-14T10:38:48",[19,13,20,18],{"id":58,"name":59,"github_repo":60,"description_zh":61,"stars":62,"difficulty_score":29,"last_commit_at":63,"category_tags":64,"status":22},3215,"awesome-machine-learning","josephmisiti\u002Fawesome-machine-learning","awesome-machine-learning 是一份精心整理的机器学习资源清单，汇集了全球优秀的机器学习框架、库和软件工具。面对机器学习领域技术迭代快、资源分散且难以甄选的痛点，这份清单按编程语言（如 Python、C++、Go 等）和应用场景（如计算机视觉、自然语言处理、深度学习等）进行了系统化分类，帮助使用者快速定位高质量项目。\n\n它特别适合开发者、数据科学家及研究人员使用。无论是初学者寻找入门库，还是资深工程师对比不同语言的技术选型，都能从中获得极具价值的参考。此外，清单还延伸提供了免费书籍、在线课程、行业会议、技术博客及线下聚会等丰富资源，构建了从学习到实践的全链路支持体系。\n\n其独特亮点在于严格的维护标准：明确标记已停止维护或长期未更新的项目，确保推荐内容的时效性与可靠性。作为机器学习领域的“导航图”，awesome-machine-learning 以开源协作的方式持续更新，旨在降低技术探索门槛，让每一位从业者都能高效地站在巨人的肩膀上创新。",72149,"2026-04-03T21:50:24",[20,18],{"id":66,"github_repo":67,"name":68,"description_en":69,"description_zh":70,"ai_summary_zh":70,"readme_en":71,"readme_zh":72,"quickstart_zh":73,"use_case_zh":74,"hero_image_url":75,"owner_login":76,"owner_name":77,"owner_avatar_url":78,"owner_bio":79,"owner_company":80,"owner_location":80,"owner_email":80,"owner_twitter":80,"owner_website":81,"owner_url":82,"languages":83,"stars":122,"forks":123,"last_commit_at":124,"license":125,"difficulty_score":29,"env_os":126,"env_gpu":127,"env_ram":128,"env_deps":129,"category_tags":133,"github_topics":134,"view_count":10,"oss_zip_url":80,"oss_zip_packed_at":80,"status":22,"created_at":137,"updated_at":138,"faqs":139,"releases":168},7569,"google-deepmind\u002Fmujoco","mujoco","Multi-Joint dynamics with Contact. 
A general purpose physics simulator.","MuJoCo（全称 Multi-Joint dynamics with Contact）是一款由 Google DeepMind 维护的通用物理引擎，专为需要快速、精准模拟关节结构与环境交互的场景而设计。它核心解决了复杂机械系统动力学计算及接触力处理的难题，能够高效仿真机器人、生物力学模型以及各类铰接结构的运动状态。\n\n这款工具主要面向研究人员和开发者，广泛应用于机器人学、机器学习、图形动画等领域。MuJoCo 的独特亮点在于其极致的性能优化：底层采用预分配的 C 语言数据结构，确保运行时模拟速度最大化；同时提供原生 OpenGL 交互式可视化界面，方便实时调试。除了强大的 C 接口，它还提供了便捷的 Python 绑定和 Unity 插件，降低了使用门槛。对于希望快速上手的用户，官方准备了丰富的 Google Colab 教程笔记，涵盖从基础操作、模型编辑到基于 JAX 的可微分物理训练等进阶内容。无论是进行控制算法验证，还是开发复杂的物理仿真环境，MuJoCo 都是业界公认的高效选择。","\u003Ch1>\n  \u003Ca href=\"#\">\u003Cimg alt=\"MuJoCo\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_readme_1fc57ab6abcc.png\" width=\"100%\"\u002F>\u003C\u002Fa>\n\u003C\u002Fh1>\n\n\u003Cp>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Factions\u002Fworkflows\u002Fbuild.yml?query=branch%3Amain\" alt=\"GitHub Actions\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002Fgoogle-deepmind\u002Fmujoco\u002Fbuild.yml?branch=main\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fmujoco.readthedocs.io\u002F\" alt=\"Documentation\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_readme_13d664e1afd7.png\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002FLICENSE\" alt=\"License\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Flicense\u002Fgoogle-deepmind\u002Fmujoco\">\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n**MuJoCo** stands for **Mu**lti-**Jo**int dynamics with **Co**ntact. 
It is a\ngeneral purpose physics engine that aims to facilitate research and development\nin robotics, biomechanics, graphics and animation, machine learning, and other\nareas which demand fast and accurate simulation of articulated structures\ninteracting with their environment.\n\nThis repository is maintained by [Google DeepMind](https:\u002F\u002Fwww.deepmind.com\u002F).\n\nMuJoCo has a C API and is intended for researchers and developers. The runtime\nsimulation module is tuned to maximize performance and operates on low-level\ndata structures that are preallocated by the built-in XML compiler. The library\nincludes interactive visualization with a native GUI, rendered in OpenGL. MuJoCo\nfurther exposes a large number of utility functions for computing\nphysics-related quantities.\n\nWe also provide [Python bindings] and a plug-in for the [Unity] game engine.\n\n## Documentation\n\nMuJoCo's documentation can be found at [mujoco.readthedocs.io]. Upcoming\nfeatures due for the next release can be found in the [changelog] in the\n\"latest\" branch.\n\n## Getting Started\n\nThere are two easy ways to get started with MuJoCo:\n\n1. **Run `simulate` on your machine.**\n[This video](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=P83tKA1iz2Y) shows a screen capture\nof `simulate`, MuJoCo's native interactive viewer. Follow the steps described in\nthe [Getting Started] section of the documentation to get `simulate` running on\nyour machine.\n\n2. 
**Explore our online IPython notebooks.**\nIf you are a Python user, you might want to start with our tutorial notebooks\nrunning on Google Colab:\n\n - The **introductory** tutorial teaches MuJoCo basics:\n   [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Ftutorial.ipynb)\n - The **Model Editing** tutorial shows how to create and edit models procedurally:\n   [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Fmjspec.ipynb)\n - The **rollout** tutorial shows how to use the multithreaded `rollout` module:\n   [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Frollout.ipynb)\n - The **LQR** tutorial synthesizes a linear-quadratic controller, balancing a\n   humanoid on one leg:\n   [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002FLQR.ipynb)\n - The **least-squares** tutorial explains how to use the Python-based nonlinear\n   least-squares solver:\n   [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Fleast_squares.ipynb)\n - The **MJX** tutorial provides usage examples of\n   [MuJoCo XLA](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Fmjx.html), a branch of MuJoCo written in JAX:\n   [![Open In 
Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fmjx\u002Ftutorial.ipynb)\n - The **differentiable physics** tutorial trains locomotion policies with\n   analytical gradients automatically derived from MuJoCo's physics step:\n   [![Open In Colab](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fmjx\u002Ftraining_apg.ipynb)\n\n## Installation\n\n### Prebuilt binaries\n\nVersioned releases are available as precompiled binaries from the GitHub\n[releases page], built for Linux (x86-64 and AArch64), Windows (x86-64 only),\nand macOS (universal). This is the recommended way to use the software.\n\n### Building from source\n\nUsers who wish to build MuJoCo from source should consult the [build from\nsource] section of the documentation. However, note that the commit at\nthe tip of the `main` branch may be unstable.\n\n### Python (>= 3.10)\n\nThe native Python bindings, which come pre-packaged with a copy of MuJoCo, can\nbe installed from [PyPI] via:\n\n```bash\npip install mujoco\n```\n\nNote that Pre-built Linux wheels target `manylinux2014`, see\n[here](https:\u002F\u002Fgithub.com\u002Fpypa\u002Fmanylinux) for compatible distributions. For more\ninformation such as building the bindings from source, see the [Python bindings]\nsection of the documentation.\n\n## Versioning\n\nWe aim to release MuJoCo in the first week of each month. Our versioning\nstandards changed to modified Semantic Versioning in 3.5.0,\nsee [versioning](VERSIONING.md) for details.\n\n## Contributing\n\nWe welcome community engagement: questions, requests for help, bug reports and\nfeature requests. 
To read more about bug reports, feature requests and more\nambitious contributions, please see our [contributors guide](CONTRIBUTING.md)\nand [style guide](STYLEGUIDE.md).\n\n## Asking Questions\n\nQuestions and requests for help are welcome as a GitHub\n[\"Asking for Help\" Discussion](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fdiscussions\u002Fcategories\u002Fasking-for-help)\nand should focus on a specific problem or question.\n\n## Bug reports and feature requests\n\nGitHub [Issues](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fissues) are reserved\nfor bug reports, feature requests and other development-related subjects.\n\n## Related software\nMuJoCo is the backbone for numerous environment packages. Below we list several\nbindings and converters.\n\n### Bindings\n\nThese packages give users of various languages access to MuJoCo functionality:\n\n#### First-party bindings:\n\n- [Python bindings](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Fpython.html)\n  - [dm_control](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fdm_control), Google\n    DeepMind's related environment stack, includes\n    [PyMJCF](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fdm_control\u002Fblob\u002Fmain\u002Fdm_control\u002Fmjcf\u002FREADME.md),\n    a module for procedural manipulation of MuJoCo models.\n- [JavaScript bindings and WebAssembly support](\u002Fwasm\u002FREADME.md) (inspired [stillonearth](https:\u002F\u002Fgithub.com\u002Fstillonearth) and [zalo](https:\u002F\u002Fgithub.com\u002Fzalo)'s community projects; [mjswan](https:\u002F\u002Fgithub.com\u002Fttktjmt\u002Fmjswan) extends these with real-time policy control, interactive force\napplication, and more).\n- [C# bindings and Unity plug-in](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Funity.html)\n\n#### Third-party bindings:\n\n- **MATLAB Simulink**: [Simulink Blockset for MuJoCo 
Simulator](https:\u002F\u002Fgithub.com\u002Fmathworks-robotics\u002Fmujoco-simulink-blockset)\n  by [Manoj Velmurugan](https:\u002F\u002Fgithub.com\u002Fvmanoj1996).\n- **Swift**: [swift-mujoco](https:\u002F\u002Fgithub.com\u002Fliuliu\u002Fswift-mujoco)\n- **Java**: [mujoco-java](https:\u002F\u002Fgithub.com\u002FCommonWealthRobotics\u002Fmujoco-java)\n- **Julia**: [MuJoCo.jl](https:\u002F\u002Fgithub.com\u002FJamieMair\u002FMuJoCo.jl)\n- **Rust**: [MuJoCo-rs](https:\u002F\u002Fgithub.com\u002Fdavidhozic\u002Fmujoco-rs)\n\n### Converters\n\n- **OpenSim**: [MyoConverter](https:\u002F\u002Fgithub.com\u002FMyoHub\u002Fmyoconverter) converts\n  OpenSim models to MJCF.\n- **SDFormat**: [gz-mujoco](https:\u002F\u002Fgithub.com\u002Fgazebosim\u002Fgz-mujoco\u002F) is a\n  two-way SDFormat \u003C-> MJCF conversion tool.\n- **OBJ**: [obj2mjcf](https:\u002F\u002Fgithub.com\u002Fkevinzakka\u002Fobj2mjcf)\n  a script for converting composite OBJ files into a loadable MJCF model.\n- **onshape**: [Onshape to Robot](https:\u002F\u002Fgithub.com\u002Frhoban\u002Fonshape-to-robot)\n  Converts [onshape](https:\u002F\u002Fwww.onshape.com\u002Fen\u002F) CAD assemblies to MJCF.\n\n## Citation\n\nIf you use MuJoCo for published research, please cite:\n\n```\n@inproceedings{todorov2012mujoco,\n  title={MuJoCo: A physics engine for model-based control},\n  author={Todorov, Emanuel and Erez, Tom and Tassa, Yuval},\n  booktitle={2012 IEEE\u002FRSJ International Conference on Intelligent Robots and Systems},\n  pages={5026--5033},\n  year={2012},\n  organization={IEEE},\n  doi={10.1109\u002FIROS.2012.6386109}\n}\n```\n\n## License and Disclaimer\n\nCopyright 2021 DeepMind Technologies Limited.\n\nBox collision code ([`engine_collision_box.c`](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fsrc\u002Fengine\u002Fengine_collision_box.c))\nis Copyright 2016 Svetoslav Kolev.\n\nReStructuredText documents, images, and videos in the `doc` directory are 
made\navailable under the terms of the Creative Commons Attribution 4.0 (CC BY 4.0)\nlicense. You may obtain a copy of the License at\nhttps:\u002F\u002Fcreativecommons.org\u002Flicenses\u002Fby\u002F4.0\u002Flegalcode.\n\nSource code is licensed under the Apache License, Version 2.0. You may obtain a\ncopy of the License at https:\u002F\u002Fwww.apache.org\u002Flicenses\u002FLICENSE-2.0.\n\nThis is not an officially supported Google product.\n\n[build from source]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fprogramming#building-from-source\n[Getting Started]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fprogramming#getting-started\n[Unity]: https:\u002F\u002Funity.com\u002F\n[releases page]: https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Freleases\n[mujoco.readthedocs.io]: https:\u002F\u002Fmujoco.readthedocs.io\n[changelog]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fchangelog.html\n[Python bindings]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Fpython.html#python-bindings\n[PyPI]: https:\u002F\u002Fpypi.org\u002Fproject\u002Fmujoco\u002F\n","\u003Ch1>\n  \u003Ca href=\"#\">\u003Cimg alt=\"MuJoCo\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_readme_1fc57ab6abcc.png\" width=\"100%\"\u002F>\u003C\u002Fa>\n\u003C\u002Fh1>\n\n\u003Cp>\n  \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Factions\u002Fworkflows\u002Fbuild.yml?query=branch%3Amain\" alt=\"GitHub Actions\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002Fgoogle-deepmind\u002Fmujoco\u002Fbuild.yml?branch=main\">\n  \u003C\u002Fa>\n  \u003Ca href=\"https:\u002F\u002Fmujoco.readthedocs.io\u002F\" alt=\"Documentation\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_readme_13d664e1afd7.png\">\n  \u003C\u002Fa>\n  \u003Ca 
href=\"https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002FLICENSE\" alt=\"License\">\n    \u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Flicense\u002Fgoogle-deepmind\u002Fmujoco\">\n  \u003C\u002Fa>\n\u003C\u002Fp>\n\n**MuJoCo** 是 **Mu**lti-**Jo**int dynamics with **Co**ntact 的缩写。它是一个通用的物理引擎，旨在促进机器人技术、生物力学、图形与动画、机器学习以及其他需要快速且精确地模拟与环境交互的多关节结构领域的研究与开发。\n\n本仓库由 [Google DeepMind](https:\u002F\u002Fwww.deepmind.com\u002F) 维护。\n\nMuJoCo 提供 C API，主要面向研究人员和开发者。其运行时仿真模块经过优化以实现最高性能，并基于内置 XML 编译器预先分配的低层数据结构进行操作。该库还包含使用 OpenGL 渲染的原生 GUI 交互式可视化功能。此外，MuJoCo 还提供了大量用于计算物理相关量的实用函数。\n\n我们还提供 [Python 绑定] 和适用于 [Unity] 游戏引擎的插件。\n\n## 文档\n\nMuJoCo 的文档可在 [mujoco.readthedocs.io] 找到。即将在下一次发布中加入的新特性可以在“latest”分支的 [changelog] 中查看。\n\n## 入门指南\n\n有两种简单的方法可以开始使用 MuJoCo：\n\n1. **在您的机器上运行 `simulate`。**\n[这段视频](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=P83tKA1iz2Y) 展示了 MuJoCo 原生交互式查看器 `simulate` 的屏幕录制。请按照文档中“入门指南”部分的步骤，在您的机器上启动并运行 `simulate`。\n\n2. **探索我们的在线 IPython 笔记本。**\n如果您是 Python 用户，可以从我们在 Google Colab 上运行的教程笔记本开始：\n\n - **入门**教程教授 MuJoCo 的基础知识：\n   [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Ftutorial.ipynb)\n - **模型编辑**教程展示了如何以程序化方式创建和编辑模型：\n   [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Fmjspec.ipynb)\n - **rollout**教程介绍了如何使用多线程的 `rollout` 模块：\n   [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Frollout.ipynb)\n - **LQR**教程通过合成一个线性二次控制器，使一个人形机器人单腿站立平衡：\n   [![在 Colab 
中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002FLQR.ipynb)\n - **最小二乘法**教程解释了如何使用基于 Python 的非线性最小二乘求解器：\n   [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fpython\u002Fleast_squares.ipynb)\n - **MJX**教程提供了使用 [MuJoCo XLA](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Fmjx.html) 的示例，这是用 JAX 编写的 MuJoCo 分支：\n   [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fmjx\u002Ftutorial.ipynb)\n - **可微分物理**教程利用从 MuJoCo 物理步长自动推导出的解析梯度来训练运动策略：\n   [![在 Colab 中打开](https:\u002F\u002Fcolab.research.google.com\u002Fassets\u002Fcolab-badge.svg)](https:\u002F\u002Fcolab.research.google.com\u002Fgithub\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fmjx\u002Ftraining_apg.ipynb)\n\n## 安装\n\n### 预编译二进制文件\n\n版本化的发布可通过 GitHub 的 [releases 页面] 获取预编译的二进制文件，分别针对 Linux（x86-64 和 AArch64）、Windows（仅 x86-64）以及 macOS（通用）。这是推荐的使用方式。\n\n### 从源代码构建\n\n希望从源代码构建 MuJoCo 的用户，请参阅文档中的“从源代码构建”章节。请注意，`main` 分支顶端的提交可能不稳定。\n\n### Python (>= 3.10)\n\n原生 Python 绑定随 MuJoCo 一起打包，可以通过 [PyPI] 安装：\n\n```bash\npip install mujoco\n```\n\n请注意，预编译的 Linux wheel 针对 `manylinux2014`，兼容的发行版请参见 [这里](https:\u002F\u002Fgithub.com\u002Fpypa\u002Fmanylinux)。有关从源代码构建绑定等更多信息，请参阅文档中的“Python 绑定”部分。\n\n## 版本控制\n\n我们计划每月的第一周发布 MuJoCo 的新版本。自 3.5.0 版起，我们的版本控制标准已改为修改后的语义化版本号，详情请参阅 [VERSIONING.md]。\n\n## 贡献\n\n我们欢迎社区参与：提问、寻求帮助、报告 bug 和提出功能请求。如需了解更多关于 bug 报告、功能请求及更高级别的贡献信息，请参阅我们的 [贡献者指南](CONTRIBUTING.md) 和 [风格指南](STYLEGUIDE.md)。\n\n## 提问\n\n问题和求助请求可通过 GitHub 的 [\"Asking for Help\" 
Discussion](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fdiscussions\u002Fcategories\u002Fasking-for-help) 提出，内容应聚焦于具体的问题或疑问。\n\n## Bug 报告和功能请求\n\nGitHub 的 [Issues](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fissues) 专门用于 bug 报告、功能请求及其他开发相关主题。\n\n## 相关软件\nMuJoCo 是许多环境包的核心基础。以下列出了一些绑定和转换工具。\n\n### 绑定\n\n这些软件包为不同编程语言的用户提供了访问 MuJoCo 功能的途径：\n\n#### 第一方绑定：\n\n- [Python 绑定](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Fpython.html)\n  - [dm_control](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fdm_control)，Google DeepMind 的相关环境栈，包含\n    [PyMJCF](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fdm_control\u002Fblob\u002Fmain\u002Fdm_control\u002Fmjcf\u002FREADME.md),\n    一个用于程序化操作 MuJoCo 模型的模块。\n- [JavaScript 绑定及 WebAssembly 支持](\u002Fwasm\u002FREADME.md)（受 [stillonearth](https:\u002F\u002Fgithub.com\u002Fstillonearth) 和 [zalo](https:\u002F\u002Fgithub.com\u002Fzalo) 的社区项目启发；[mjswan](https:\u002F\u002Fgithub.com\u002Fttktjmt\u002Fmjswan) 在此基础上扩展了实时策略控制、交互式力应用等功能）。\n- [C# 绑定及 Unity 插件](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Funity.html)\n\n#### 第三方绑定：\n\n- **MATLAB Simulink**：由 [Manoj Velmurugan](https:\u002F\u002Fgithub.com\u002Fvmanoj1996) 开发的\n  [Simulink 块库 for MuJoCo 仿真器](https:\u002F\u002Fgithub.com\u002Fmathworks-robotics\u002Fmujoco-simulink-blockset)。\n- **Swift**：[swift-mujoco](https:\u002F\u002Fgithub.com\u002Fliuliu\u002Fswift-mujoco)\n- **Java**：[mujoco-java](https:\u002F\u002Fgithub.com\u002FCommonWealthRobotics\u002Fmujoco-java)\n- **Julia**：[MuJoCo.jl](https:\u002F\u002Fgithub.com\u002FJamieMair\u002FMuJoCo.jl)\n- **Rust**：[MuJoCo-rs](https:\u002F\u002Fgithub.com\u002Fdavidhozic\u002Fmujoco-rs)\n\n### 转换工具\n\n- **OpenSim**：[MyoConverter](https:\u002F\u002Fgithub.com\u002FMyoHub\u002Fmyoconverter) 可将 OpenSim 模型转换为 MJCF 格式。\n- **SDFormat**：[gz-mujoco](https:\u002F\u002Fgithub.com\u002Fgazebosim\u002Fgz-mujoco\u002F) 是一个双向的 SDFormat ↔ MJCF 转换工具。\n- 
**OBJ**：[obj2mjcf](https:\u002F\u002Fgithub.com\u002Fkevinzakka\u002Fobj2mjcf)\n  是一个脚本，用于将复合 OBJ 文件转换为可加载的 MJCF 模型。\n- **onshape**：[Onshape to Robot](https:\u002F\u002Fgithub.com\u002Frhoban\u002Fonshape-to-robot)\n  可将 [onshape](https:\u002F\u002Fwww.onshape.com\u002Fen\u002F) CAD 装配体转换为 MJCF 格式。\n\n## 引用\n\n如果您在已发表的研究中使用 MuJoCo，请引用以下文献：\n\n```\n@inproceedings{todorov2012mujoco,\n  title={MuJoCo: A physics engine for model-based control},\n  author={Todorov, Emanuel and Erez, Tom and Tassa, Yuval},\n  booktitle={2012 IEEE\u002FRSJ International Conference on Intelligent Robots and Systems},\n  pages={5026--5033},\n  year={2012},\n  organization={IEEE},\n  doi={10.1109\u002FIROS.2012.6386109}\n}\n```\n\n## 许可与免责声明\n\n版权所有 © 2021 DeepMind Technologies Limited。\n\n盒状碰撞代码（[`engine_collision_box.c`](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fblob\u002Fmain\u002Fsrc\u002Fengine\u002Fengine_collision_box.c)）\n归 Svetoslav Kolev 所有，版权 © 2016。\n\n`doc` 目录中的 ReStructuredText 文档、图片和视频根据知识共享署名 4.0（CC BY 4.0）许可协议提供。您可以在\nhttps:\u002F\u002Fcreativecommons.org\u002Flicenses\u002Fby\u002F4.0\u002Flegalcode 获取该许可协议的副本。\n\n源代码采用 Apache 许可协议第 2.0 版授权。您可以在 https:\u002F\u002Fwww.apache.org\u002Flicenses\u002FLICENSE-2.0 获取该许可协议的副本。\n\n本产品并非 Google 官方支持的产品。\n\n[从源码构建]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fprogramming#building-from-source\n[入门指南]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fprogramming#getting-started\n[Unity]: https:\u002F\u002Funity.com\u002F\n[releases 页面]: https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Freleases\n[mujoco.readthedocs.io]: https:\u002F\u002Fmujoco.readthedocs.io\n[changelog]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fchangelog.html\n[Python 绑定]: https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Fstable\u002Fpython.html#python-bindings\n[PyPI]: https:\u002F\u002Fpypi.org\u002Fproject\u002Fmujoco\u002F\n[VERSIONING.md]: VERSIONING.md","# MuJoCo 快速上手指南\n\nMuJoCo (Multi-Joint dynamics with Contact) 是一款由 Google DeepMind 
维护的高性能物理引擎，专为机器人学、生物力学、机器学习等领域的研究而设计，擅长处理带有接触力的关节结构仿真。\n\n## 环境准备\n\n### 系统要求\nMuJoCo 支持以下操作系统：\n- **Linux**: x86-64 或 AArch64 架构（推荐 `manylinux2014` 兼容发行版，如 Ubuntu 18.04+）\n- **Windows**: x86-64 架构\n- **macOS**: 通用二进制（Intel 和 Apple Silicon）\n\n### 前置依赖\n- **Python 版本**: 若使用 Python 绑定，需安装 Python 3.10 或更高版本。\n- **编译器**: 若需从源码构建，需安装 C\u002FC++ 编译器（如 GCC, Clang, MSVC）及 CMake。\n- **图形库**: 原生查看器需要 OpenGL 支持。\n\n## 安装步骤\n\n对于大多数开发者，推荐使用预编译的 Python 包，这是最快捷的方式。\n\n### 方法一：通过 PyPI 安装（推荐）\n直接使用 pip 安装，该包已内置 MuJoCo 核心库副本，无需单独下载引擎二进制文件。\n\n```bash\npip install mujoco\n```\n\n> **国内加速提示**：如果遇到下载速度慢的问题，可以使用国内镜像源：\n> ```bash\n> pip install mujoco -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n> ```\n\n### 方法二：使用预编译二进制文件（非 Python 用户）\n如果您需要使用 C API 或原生查看器而不通过 Python：\n1. 访问 GitHub [Releases 页面](https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Freleases)。\n2. 下载对应您操作系统的最新版本压缩包。\n3. 解压并将 `bin` 目录添加到系统环境变量 `PATH` 中。\n\n### 方法三：从源码构建（高级用户）\n仅当您需要修改引擎核心或使用最新不稳定特性时采用。具体步骤请参考官方文档的 [Build from Source](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fprogramming#building-from-source) 章节。\n\n## 基本使用\n\n安装完成后，您可以通过 Python 快速加载模型并进行仿真。以下是一个最简单的示例，展示如何从 XML 字符串加载一个简单模型并运行仿真步。\n\n### Python 使用示例\n\n确保已安装 `mujoco`。\n\n```python\nimport mujoco\n\n# 1. 从 XML 字符串加载一个简单模型（平面上方的一个自由下落小球）\nmodel = mujoco.MjModel.from_xml_string(\"\"\"\n\u003Cmujoco>\n  \u003Cworldbody>\n    \u003Clight pos=\"0 0 3\"\u002F>\n    \u003Cgeom name=\"floor\" type=\"plane\" size=\"1 1 0.1\"\u002F>\n    \u003Cbody pos=\"0 0 1\">\n      \u003Cjoint type=\"free\"\u002F>\n      \u003Cgeom type=\"sphere\" size=\"0.1\"\u002F>\n    \u003C\u002Fbody>\n  \u003C\u002Fworldbody>\n\u003C\u002Fmujoco>\n\"\"\")\n\n# 2. 创建数据结构\ndata = mujoco.MjData(model)\n\n# 3. 
运行仿真循环\nprint(\"开始仿真...\")\nfor i in range(1000):\n    # 执行一步物理计算\n    mujoco.mj_step(model, data)\n    \n    # 每 100 步打印一次质心高度\n    if i % 100 == 0:\n        print(f\"Step {i}: Body height = {data.qpos[2]:.4f}\")\n\nprint(\"仿真结束。\")\n```\n\n### 启动原生查看器\n如果您安装了预编译二进制文件（方法二），可以直接在终端运行以下命令来交互式查看模型：\n\n```bash\nsimulate \u003Cpath_to_your_model.xml>\n```\n\n对于 Python 用户，结合 `mujoco.viewer` 模块可实现实时可视化（需本地显示环境支持）。","某机器人实验室团队正在研发一款双足人形机器人，需要在将算法部署到昂贵的实体硬件前，验证其复杂步态控制策略在真实物理环境中的稳定性。\n\n### 没有 mujoco 时\n- **接触模拟失真**：传统引擎难以精确处理脚底与地面的复杂摩擦和碰撞，导致机器人在仿真中频繁“穿模”或无故摔倒，无法反映真实受力情况。\n- **开发周期冗长**：由于仿真可信度低，团队不得不依赖实机调试，每次迭代都面临硬件磨损风险且耗时数天，严重拖慢研发进度。\n- **计算效率低下**：在进行大规模强化学习训练时，原有模拟器运算速度慢，无法支持高并发的数据采样，模型收敛需要数周时间。\n- **建模门槛高**：缺乏高效的 XML 编译器和可视化工具，构建精细的关节连杆模型需手动编写大量底层代码，极易出错。\n\n### 使用 mujoco 后\n- **物理反馈精准**：mujoco 独有的接触动力学算法完美还原了行走时的地面反作用力与滑动摩擦，仿真步态与实机表现高度一致。\n- **安全快速迭代**：团队可在虚拟环境中完成 90% 的极端场景测试（如推搡、滑倒恢复），大幅减少实机试错成本，将迭代周期缩短至小时级。\n- **训练加速显著**：借助 mujoco 的高性能运行时模块及 MJX (JAX) 加速特性，强化学习策略的训练速度提升数十倍，几天内即可完成模型收敛。\n- **工作流更流畅**：利用内置的 XML 编译器和原生 OpenGL 可视化界面，研究人员能快速搭建并直观调试复杂的多关节生物力学模型。\n\nmujoco 通过提供高保真、高效率的物理仿真环境，让人形机器人的算法研发从“高风险试错”转变为“可预测的数字化演进”。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fgoogle-deepmind_mujoco_a80b900c.png","google-deepmind","Google DeepMind","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002Fgoogle-deepmind_06b1dd17.png","",null,"https:\u002F\u002Fwww.deepmind.com\u002F","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind",[84,88,92,96,100,104,107,111,115,119],{"name":85,"color":86,"percentage":87},"C++","#f34b7d",36.8,{"name":89,"color":90,"percentage":91},"Jupyter 
Notebook","#DA5B0B",26.2,{"name":93,"color":94,"percentage":95},"Python","#3572A5",15.6,{"name":97,"color":98,"percentage":99},"C","#555555",14.3,{"name":101,"color":102,"percentage":103},"C#","#178600",5,{"name":105,"color":106,"percentage":29},"CMake","#DA3434",{"name":108,"color":109,"percentage":110},"TypeScript","#3178c6",0.7,{"name":112,"color":113,"percentage":114},"HTML","#e34c26",0.2,{"name":116,"color":117,"percentage":118},"Objective-C++","#6866fb",0.1,{"name":120,"color":121,"percentage":118},"Shell","#89e051",12811,1434,"2026-04-14T15:33:36","Apache-2.0","Linux, macOS, Windows","未说明（原生 GUI 基于 OpenGL 渲染，非强制要求独立显卡）","未说明",{"notes":130,"python":131,"dependencies":132},"预编译二进制支持 Linux (x86-64, AArch64)、Windows (x86-64) 和 macOS (通用架构)。Linux 预编译包针对 manylinux2014 标准。提供原生 C API、Python 绑定、Unity 插件及 JavaScript\u002FWASM 支持。MJX 模块基于 JAX，若使用需额外配置 JAX 环境。",">=3.10",[],[18],[135,136,68],"robotics","physics","2026-03-27T02:49:30.150509","2026-04-15T07:03:08.243113",[140,145,150,154,159,163],{"id":141,"question_zh":142,"answer_zh":143,"source_url":144},33918,"如何在 Unity 中正确加载 MuJoCo 插件库文件（.so）？","确保 MuJoCo 库文件命名为 `libmujoco.so`，而不是带有版本号的名称（如 `libmujoco.so.3.1.3`）。虽然 Unity 在界面中可能显示为 `mujoco.so`，但如果文件扩展名包含版本号，Unity 可能会忽略该文件。请将文件名截断为仅保留 `.so` 扩展名即可解决识别问题。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fissues\u002F1637",{"id":146,"question_zh":147,"answer_zh":148,"source_url":149},33919,"MuJoCo 支持哪些版本的 GMSH (.msh) 文件格式？","只要文件中只包含单个实体，MuJoCo 通常支持 GMSH 2.2 和 4.1 格式。如果遇到 'Only GMSH file format 4.1 supported' 错误，请检查文件内容是否纯净。最新的清理代码可以生成这两种格式，相关工具可在 GitHub 上的 GMSHConverter 项目找到。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fissues\u002F1543",{"id":151,"question_zh":152,"answer_zh":153,"source_url":149},33920,"如何使用命令行工具将 STL 文件转换为 MuJoCo 可用的体积网格？","首先需要构建 `ftetwild` 工具。构建完成后，在 `ftetwild` 文件夹下运行以下命令来清理并生成体积网格：\n`.\u002Fbuild\u002FFloatTetwild_bin -i \u003C文件路径>\u002Fyour_file.stl`\n这将把输入的 STL 文件转换为适合 MuJoCo `flexcomp` 
使用的格式。",{"id":155,"question_zh":156,"answer_zh":157,"source_url":158},33921,"在集群环境中编译 MuJoCo Python 绑定遇到 '#pragma is not allowed here' 错误怎么办？","该错误通常与编译器配置或版本有关。建议在使用 `pip wheel` 编译时添加 `-v` 参数（即 `pip wheel mujoco-x.x.x.tar.gz -v`），以查看 CMake 输出的前几行编译器识别信息。这有助于确认是否正确加载了预期的 GCC 版本。此外，确保使用的是与 MuJoCo 版本兼容的 GCC（如 GCC 11.x），并检查是否有冲突的环境模块加载。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fissues\u002F379",{"id":160,"question_zh":161,"answer_zh":162,"source_url":144},33922,"如何在 Quest VR 设备上运行 MuJoCo Unity 插件？","目前 standalone（独立）构建可能尚不支持 Android 兼容的 MuJoCo 版本。作为临时解决方案，可以通过 USB 使用 Quest Link，并在 Unity 编辑器中直接运行应用，将 VR 场景流式传输到头显中进行测试。这种方法避免了打包 standalone 应用的复杂性。",{"id":164,"question_zh":165,"answer_zh":166,"source_url":167},33923,"MuJoCo 是否支持通过 API 动态创建模型或在运行时修改场景？","目前 MuJoCo 主要设计为从文件（XML\u002FMJCF、URDF 或二进制 MJB）加载模型，原生 API 不直接支持完全通过代码动态构建模型或在仿真运行时动态添加新物体（如身体、关节等）而无需重新加载整个模型。变通方法包括：1. 在内部表示中生成 URDF 字符串并通过 `from_xml_string` 加载；2. 预先在场景中创建隐藏对象，需要时再激活它们，但这会增加不必要的计算开销。官方已将此列为功能请求，但尚未实现完全的动态场景编辑 
API。","https:\u002F\u002Fgithub.com\u002Fgoogle-deepmind\u002Fmujoco\u002Fissues\u002F364",[169,174,179,184,189,194,199,204,209,214,219,224,229,234,239,244,249,254,259,264],{"id":170,"version":171,"summary_zh":172,"released_at":173},263798,"3.7.0","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.7.0\u002Fchangelog.html)","2026-04-14T13:03:47",{"id":175,"version":176,"summary_zh":177,"released_at":178},263799,"3.6.0","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.6.0\u002Fchangelog.html)","2026-03-11T01:53:34",{"id":180,"version":181,"summary_zh":182,"released_at":183},263800,"3.5.0","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.5.0\u002Fchangelog.html)。","2026-02-13T01:20:28",{"id":185,"version":186,"summary_zh":187,"released_at":188},263801,"3.4.0","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.4.0\u002Fchangelog.html)。","2025-12-05T23:17:44",{"id":190,"version":191,"summary_zh":192,"released_at":193},263802,"3.3.7","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.7\u002Fchangelog.html)。","2025-10-14T17:38:32",{"id":195,"version":196,"summary_zh":197,"released_at":198},263803,"3.3.6","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.6\u002Fchangelog.html)。","2025-09-16T14:28:40",{"id":200,"version":201,"summary_zh":202,"released_at":203},263804,"3.3.5","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.5\u002Fchangelog.html)。","2025-08-08T23:02:07",{"id":205,"version":206,"summary_zh":207,"released_at":208},263805,"3.3.4","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.4\u002Fchangelog.html)。","2025-07-09T20:14:25",{"id":210,"version":211,"summary_zh":212,"released_at":213},263806,"3.3.3","请参阅[变更日志](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.3\u002Fchangelog.html)。","2025-06-11T20:16:46",{"id":215,"version":216,"summary_zh":217,"released_at":218},263807,"3.3.2","请参阅[变更日志](https:\u002F\u002Fmujoco.readthed
ocs.io\u002Fen\u002F3.3.2\u002Fchangelog.html)。","2025-04-28T22:23:42",{"id":220,"version":221,"summary_zh":222,"released_at":223},263808,"3.3.1","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.1\u002Fchangelog.html).","2025-04-10T16:52:51",{"id":225,"version":226,"summary_zh":227,"released_at":228},263809,"3.3.0","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.3.0\u002Fchangelog.html).","2025-02-27T22:42:47",{"id":230,"version":231,"summary_zh":232,"released_at":233},263810,"3.2.7","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.7\u002Fchangelog.html).","2025-01-15T00:04:28",{"id":235,"version":236,"summary_zh":237,"released_at":238},263811,"3.2.6","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.6\u002Fchangelog.html).","2024-12-02T22:52:14",{"id":240,"version":241,"summary_zh":242,"released_at":243},263812,"3.2.5","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.5\u002Fchangelog.html).","2024-11-05T11:27:27",{"id":245,"version":246,"summary_zh":247,"released_at":248},263813,"3.2.4","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.4\u002Fchangelog.html).","2024-10-16T19:48:24",{"id":250,"version":251,"summary_zh":252,"released_at":253},263814,"3.2.3","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.3\u002Fchangelog.html).","2024-09-16T23:02:46",{"id":255,"version":256,"summary_zh":257,"released_at":258},263815,"3.2.2","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.2\u002Fchangelog.html).","2024-08-08T16:41:32",{"id":260,"version":261,"summary_zh":262,"released_at":263},263816,"3.2.1","See the [changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.1\u002Fchangelog.html).","2024-08-06T18:12:44",{"id":265,"version":266,"summary_zh":267,"released_at":268},263817,"3.2.0","See the 
[changelog](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002F3.2.0\u002Fchangelog.html).\r\n\r\nIntroduced a major new feature: **procedural model creation and editing**, using a new top-level data-structure [mjSpec](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002FAPIreference\u002FAPItypes.html#mjspec). See the [Model Editing](https:\u002F\u002Fmujoco.readthedocs.io\u002Fen\u002Flatest\u002Fprogramming\u002Fmodeledit.html) chapter for details. Note that as of this release this feature is still in testing and subject to future breaking changes. Fixes #364.","2024-07-15T23:10:38"]