[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-wildminder--AI-windows-whl":3,"tool-wildminder--AI-windows-whl":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",144730,2,"2026-04-07T23:26:32",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107888,"2026-04-06T11:32:50",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 
助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":10,"last_commit_at":59,"category_tags":60,"status":17},4487,"LLMs-from-scratch","rasbt\u002FLLMs-from-scratch","LLMs-from-scratch 是一个基于 PyTorch 的开源教育项目，旨在引导用户从零开始一步步构建一个类似 ChatGPT 的大型语言模型（LLM）。它不仅是同名技术著作的官方代码库，更提供了一套完整的实践方案，涵盖模型开发、预训练及微调的全过程。\n\n该项目主要解决了大模型领域“黑盒化”的学习痛点。许多开发者虽能调用现成模型，却难以深入理解其内部架构与训练机制。通过亲手编写每一行核心代码，用户能够透彻掌握 Transformer 架构、注意力机制等关键原理，从而真正理解大模型是如何“思考”的。此外，项目还包含了加载大型预训练权重进行微调的代码，帮助用户将理论知识延伸至实际应用。\n\nLLMs-from-scratch 特别适合希望深入底层原理的 AI 开发者、研究人员以及计算机专业的学生。对于不满足于仅使用 API，而是渴望探究模型构建细节的技术人员而言，这是极佳的学习资源。其独特的技术亮点在于“循序渐进”的教学设计：将复杂的系统工程拆解为清晰的步骤，配合详细的图表与示例，让构建一个虽小但功能完备的大模型变得触手可及。无论你是想夯实理论基础，还是为未来研发更大规模的模型做准备",90106,"2026-04-06T11:19:32",[35,15,13,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":67,"readme_en":68,"readme_zh":69,"quickstart_zh":70,"use_case_zh":71,"hero_image_url":72,"owner_login":73,"owner_name":74,"owner_avatar_url":75,"owner_bio":76,"owner_company":77,"owner_location":78,"owner_email":77,"owner_twitter":79,"owner_website":77,"owner_url":80,"languages":77,"stars":81,"forks":82,"last_commit_at":83,"license":77,"difficulty_score":32,"env_os":84,"env_gpu":85,"env_ram":86,"env_deps":87,"category_tags":100,"github_topics":101,"view_count":32,"oss_zip_url":77,"oss_zip_packed_at":77,"status":17,"created_at":105,"updated_at":106,"faqs":107,"releases":142},5449,"wildminder\u002FAI-windows-whl","AI-windows-whl","Pre-compiled Python whl for Flash-attention, SageAttention, NATTEN, xFormer etc","AI-windows-whl 是一个专为 Windows 用户打造的开源项目，致力于提供一系列难以安装的 AI 与机器学习库的预编译 Python 安装包（.whl 文件）。在 Windows 系统上，像 Flash-Attention、xFormers、SageAttention、NATTEN 以及 bitsandbytes 等高性能加速库，往往因为缺乏官方预构建版本，迫使开发者必须配置复杂的编译环境并从源码手动构建，这一过程不仅耗时且极易出错。\n\n该项目通过集中整理并维护这些库的直接下载链接，让用户无需安装 C++ 编译器或 CUDA 工具链，只需简单的 pip 命令即可完成安装，极大地降低了部署门槛。它不仅涵盖了主流的 PyTorch 生态组件，还包含了 Triton 的 Windows 
分支等前沿技术成果，确保了技术栈的完整性与时效性。\n\n无论是正在调试大模型本地部署的 AI 研究人员、需要快速搭建开发环境的算法工程师，还是希望在个人电脑上体验最新生成式 AI 技术的爱好者，都能从中受益。AI-windows-whl 让 Windows 平台上的高性能 AI 开发变得像在其他系","AI-windows-whl 是一个专为 Windows 用户打造的开源项目，致力于提供一系列难以安装的 AI 与机器学习库的预编译 Python 安装包（.whl 文件）。在 Windows 系统上，像 Flash-Attention、xFormers、SageAttention、NATTEN 以及 bitsandbytes 等高性能加速库，往往因为缺乏官方预构建版本，迫使开发者必须配置复杂的编译环境并从源码手动构建，这一过程不仅耗时且极易出错。\n\n该项目通过集中整理并维护这些库的直接下载链接，让用户无需安装 C++ 编译器或 CUDA 工具链，只需简单的 pip 命令即可完成安装，极大地降低了部署门槛。它不仅涵盖了主流的 PyTorch 生态组件，还包含了 Triton 的 Windows 分支等前沿技术成果，确保了技术栈的完整性与时效性。\n\n无论是正在调试大模型本地部署的 AI 研究人员、需要快速搭建开发环境的算法工程师，还是希望在个人电脑上体验最新生成式 AI 技术的爱好者，都能从中受益。AI-windows-whl 让 Windows 平台上的高性能 AI 开发变得像在其他系统上一样简单流畅，是解决“依赖地狱”问题的实用利器。","\u003C!-- Improved compatibility of back to top link: See: https:\u002F\u002Fgithub.com\u002Fothneildrew\u002FBest-README-Template\u002Fpull\u002F73 -->\r\n\u003C!-- PROJECT LOGO -->\r\n\u003Ca id=\"readme-top\">\u003C\u002Fa>\r\n\u003Cdiv align=\"center\">\r\n  \u003Ch1 align=\"center\">Windows AI Wheels\u003C\u002Fh1>\r\n\r\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_14d6df726b07.png\" alt=\"AI-windows-whl logo\">\r\n\r\n  \u003Cp align=\"center\">\r\n    A curated collection of pre-compiled Python wheels for difficult-to-install AI\u002FML libraries on Windows.\r\n    \u003Cbr \u002F>\r\n    \u003Cbr \u002F>\r\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fwildminder\u002FAI-windows-whl\u002Fissues\u002Fnew?labels=bug&template=bug-report---.md\">Report a Broken Link\u003C\u002Fa>\r\n    ·\r\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fwildminder\u002FAI-windows-whl\u002Fissues\u002Fnew?labels=enhancement&template=feature-request---.md\">Request a New Wheel\u003C\u002Fa>\r\n  \u003C\u002Fp>\r\n\u003C\u002Fdiv>\r\n\r\n\r\n\u003C!-- TABLE OF CONTENTS -->\r\n\u003Cdetails>\r\n  \u003Csummary>Table of Contents\u003C\u002Fsummary>\r\n  \u003Col>\r\n    \u003Cli>\u003Ca href=\"#about-the-project\">About The 
Project\u003C\u002Fa>\u003C\u002Fli>\r\n    \u003Cli>\r\n      \u003Ca href=\"#getting-started\">Getting Started\u003C\u002Fa>\r\n      \u003Cul>\r\n        \u003Cli>\u003Ca href=\"#prerequisites\">Prerequisites\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#installation\">Installation\u003C\u002Fa>\u003C\u002Fli>\r\n      \u003C\u002Ful>\r\n    \u003C\u002Fli>\r\n    \u003Cli>\u003Ca href=\"#available-wheels\">Available Wheels\u003C\u002Fa>\r\n      \u003Cul>\r\n        \u003Cli>\u003Ca href=\"#pytorch\">PyTorch\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#flash-attention\">Flash Attention\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#xformers\">xformers\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#sageattention\">SageAttention\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#natten\">NATTEN\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#triton\">Triton (Windows Fork)\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#bitsandbytes\">bitsandbytes\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#other-packages\">Other Packages\u003C\u002Fa>\u003C\u002Fli>\r\n      \u003C\u002Ful>\r\n    \u003C\u002Fli>\r\n  \u003C\u002Fol>\r\n\u003C\u002Fdetails>\r\n\r\n\u003Cdiv align=\"center\">\r\n\u003Ca href=\"#pytorch\">\u003Cimg width=\"120\" height=\"52\" alt=\"PyTorch\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_6a1342911344.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#torchaudio\">\u003Cimg width=\"120\" height=\"52\" alt=\"Torchaudio\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_68d96e09a63d.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#flash-attention\">\u003Cimg width=\"120\" height=\"52\" alt=\"Flash Attention\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_80422bbcc74a.png\" 
\u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#xformers\">\u003Cimg width=\"120\" height=\"52\" alt=\"xFormers\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_db27055f33a9.png\" \u002F>\u003C\u002Fa>    \r\n  \u003Ca href=\"#sageattention\">\u003Cimg width=\"120\" height=\"52\" alt=\"SageAttention\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_ad1c8deaaa68.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#nunchaku\">\u003Cimg width=\"120\" height=\"52\" alt=\"Nunchaku\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_62cdd845629b.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#natten\">\u003Cimg width=\"120\" height=\"52\" alt=\"Natten\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_174f731582ae.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#triton\">\u003Cimg width=\"120\" height=\"52\" alt=\"triton\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_31e63ea4da5e.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#spargeattn\">\u003Cimg width=\"120\" height=\"52\" alt=\"SpargeAttn\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_82636fdbed65.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#bitsandbytes\">\u003Cimg width=\"120\" height=\"52\" alt=\"bitsandbytes\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_1f13c2b4224c.png\" \u002F>\u003C\u002Fa>  \r\n\u003C\u002Fdiv>\r\n\r\n\r\n\r\n\u003C!-- ABOUT THE PROJECT -->\r\n## About The Project\r\n\r\nThis repository was created to address a common pain point for AI enthusiasts and developers on the Windows platform: **building complex Python packages from source.** Libraries like `flash-attention` and `xformers` are essential for high-performance AI tasks but often lack official pre-built wheels for Windows, forcing 
users into a complicated and error-prone compilation process.\r\n\r\nThe goal here is to provide a centralized, up-to-date collection of direct links to pre-compiled `.whl` files for these libraries, primarily for the **ComfyUI** community and other PyTorch users on Windows. This saves you time and lets you focus on what's important: creating amazing things with AI.\r\n\r\n### Find Windows AI Wheels\r\nTo make life even easier, you can use the **[Find Windows AI Wheels](https:\u002F\u002Fwildminder.github.io\u002FAI-windows-whl\u002F)** page to quickly search for the packages you need. \r\n\u003Cdiv align=\"center\">\r\n\u003Ca href=\"https:\u002F\u002Fwildminder.github.io\u002FAI-windows-whl\u002F\">\r\n\u003Cimg width=\"70%\" alt=\"image\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_23a1f7f7a854.png\" \u002F>\r\n\u003C\u002Fa>  \r\n\u003C\u002Fdiv>\r\n\r\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\r\n\r\n\r\n\u003C!-- GETTING STARTED -->\r\n## Getting Started\r\n\r\nFollow these simple steps to use the wheels from this repository.\r\n\r\n### Prerequisites\r\n\r\n1.  **Python for Windows**: Ensure you have a compatible Python version installed (PyTorch currently supports **Python 3.9 - 3.14** on Windows). You can get it from the [official Python website](https:\u002F\u002Fwww.python.org\u002Fdownloads\u002Fwindows\u002F).\r\n\r\n\r\n### Installation\r\n\r\nTo install a wheel, use `pip` with the direct URL to the `.whl` file. 
Make sure to enclose the URL in quotes.\r\n\r\n```sh\r\n# Example of installing a specific flash-attention wheel\r\npip install \"https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp312-cp312-win_amd64.whl\"\r\n```\r\n\r\n> [!TIP]\r\n> Find the package you need in the [Available Wheels](#available-wheels) section below, locate the row that matches your environment (Python, PyTorch, CUDA version), and copy the link for the `pip install` command.\r\n\r\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\r\n\r\n\r\n\u003C!-- AVAILABLE WHEELS -->\r\n## Available Wheels\r\n\r\nHere is the list of tracked packages.\r\n\r\n\u003Ca id=\"pytorch\">\u003C\u002Fa>\r\n### 🛠 PyTorch\r\nThe foundation of everything. Install this first from the official source.\r\n*   **Official Install Page**: [https:\u002F\u002Fpytorch.org\u002Fget-started\u002Flocally\u002F](https:\u002F\u002Fpytorch.org\u002Fget-started\u002Flocally\u002F)\r\n\r\nFor convenience, here are direct installation commands for specific versions on Windows with an NVIDIA GPU. 
For other configurations (CPU, macOS, ROCm), please use the official install page.\r\n\r\n#### Stable Version (2.11.0)\r\nThis is the recommended version for most users.\r\n\r\n| CUDA Version | Pip Install Command                                                                                      |\r\n|:-------------|:---------------------------------------------------------------------------------------------------------|\r\n| **CUDA 13.0**  | `pip install torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\r\n| **CUDA 12.8**  | `pip install torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\r\n| **CUDA 12.6**  | `pip install torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\r\n\r\n\u003Cdetails>\r\n  \u003Csummary>Previous Stable Version\u003C\u002Fsummary>\r\n\r\n#### Previous Stable Version (2.10.0)\r\n\r\n| CUDA Version | Pip Install Command                                                                                      |\r\n|:-------------|:---------------------------------------------------------------------------------------------------------|\r\n| **CUDA 13.0**  | `pip install \"torch>=2.10.0.dev,\u003C2.11.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\r\n| **CUDA 12.8**  | `pip install \"torch>=2.10.0.dev,\u003C2.11.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\r\n| **CUDA 12.6**  | `pip install \"torch>=2.10.0.dev,\u003C2.11.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\r\n\r\n#### Previous Version (2.9.1)\r\n\r\n| CUDA Version | Pip Install Command                                                                                      
|\r\n|:-------------|:---------------------------------------------------------------------------------------------------------|\r\n| **CUDA 13.0**  | `pip install \"torch>=2.9.0.dev,\u003C2.10.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\r\n| **CUDA 12.8**  | `pip install \"torch>=2.9.0.dev,\u003C2.10.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\r\n| **CUDA 12.6**  | `pip install \"torch>=2.9.0.dev,\u003C2.10.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\r\n\r\n#### Previous Stable Version (2.8.0)  \r\n| CUDA Version | Pip Install Command                                                              |\r\n|:-------------|:---------------------------------------------------------------------------------|\r\n| **CUDA 12.9**  | `pip install \"torch>=2.8.0.dev,\u003C2.9.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu129`           |\r\n| **CUDA 12.8**  | `pip install \"torch>=2.8.0.dev,\u003C2.9.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128`           |\r\n| **CUDA 12.6**  | `pip install \"torch>=2.8.0.dev,\u003C2.9.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126`           |\r\n\r\n#### Previous Stable Version (2.7.1)\r\n| CUDA Version | Pip Install Command                                                                                      |\r\n|:-------------|:---------------------------------------------------------------------------------------------------------|\r\n| **CUDA 12.8**  | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\r\n| **CUDA 12.6**  | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\r\n| **CUDA 11.8**  | `pip install 
torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu118` |\r\n| **CPU only**   | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcpu`      |\r\n\r\n\u003C\u002Fdetails>\r\n\r\n---\r\n\r\n#### Nightly Versions\r\nUse these for access to the latest features, but expect potential instability.\r\n\r\n**PyTorch 2.12 (Nightly)**\r\n| CUDA Version | Pip Install Command                                                                                      |\r\n|:-------------|:---------------------------------------------------------------------------------------------------------|\r\n| **CUDA 13.0**  | `pip install --pre torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fnightly\u002Fcu130` |\r\n| **CUDA 12.8**  | `pip install --pre torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fnightly\u002Fcu128` |\r\n| **CUDA 12.6**  | `pip install --pre torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fnightly\u002Fcu126` |\r\n\r\n\u003Cp id=\"torchaudio\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Torchaudio\r\n\u003C!-- START_TORCHAUDIO_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `2.11.0a0` | `2.12.0` | `3.14` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.12.0cxx11abi1-cp314-cp314-win_amd64.whl) |\r\n| `2.11.0a0` | `2.12.0` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.12.0cxx11abi1-cp313-cp313-win_amd64.whl) |\r\n| `2.11.0a0` | `2.11.0` | `3.14` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.11.0cxx11abi1-cp314-cp314-win_amd64.whl) |\r\n| `2.11.0a0` | `2.11.0` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.11.0cxx11abi1-cp313-cp313-win_amd64.whl) |\r\n| `2.11.0a0` | `2.10.0` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260121.cu130torch2.10.0cxx11abi1-cp313-cp313-win_amd64.whl) |\r\n| `2.11.0a0` | `2.10.0` | `3.12` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260121.cu130torch2.10.0cxx11abi1-cp312-cp312-win_amd64.whl) |\r\n| `2.11.0a0` | `2.10.0` | `3.13` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+cu128torch2.10.0cxx11abi1-cp313-cp313-win_amd64.whl) |\r\n| `2.8.0a0` | `2.9.0` | `3.12` | `12.8` | 
[Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.8.0a0+cu128torch2.9.0cxx11abi1-cp312-cp312-win_amd64.whl) |\r\n\u003C!-- END_TORCHAUDIO_TABLE -->\r\n\r\n```sh\r\n# Torchcodec\r\npip install torchcodec\r\n```\r\n\r\n\u003Cp id=\"flash-attention\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Flash Attention\r\nHigh-performance attention implementation.\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDao--AILab-flash--attention-blue?style=flat)](https:\u002F\u002Fgithub.com\u002FDao-AILab\u002Fflash-attention)\r\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flldacing-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Ftree\u002Fmain)\r\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWildminder-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Ftree\u002Fmain)\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmjun0812-Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels)\r\n\r\n\u003C!-- START_FLASHATTENTION_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | CXX11 ABI | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|:---:|\r\n| `2.8.4` | `2.12.0` | `3.14` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.12.0cxx11abiTRUE-cp314-cp314-win_amd64.whl) |\r\n| `2.8.4` | `2.12.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.12.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.4` | `2.11.0` | `3.14` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.11.0cxx11abiTRUE-cp314-cp314-win_amd64.whl) |\r\n| `2.8.4` | `2.11.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.11.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.11.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.11.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.11.0` | `3.12` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bd20260120.cu130torch2.11.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.3` | `2.10.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bd20260121.cu130torch2.10.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.10.0` | `3.13` | `13.0` | ✓ | 
[Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.10.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.10.0` | `3.12` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bd20260121.cu130torch2.10.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.3` | `2.10.0` | `3.12` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.10.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.3` | `2.10.0` | `3.13` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu128torch2.10.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.9.1` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu130torch2.9.1cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.9.1` | `3.12` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu130torch2.9.1cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.3` | `2.9.1` | `3.13` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu128torch2.9.1cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.9.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.9.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.9.0` | `3.12` | `13.0` | ✓ | 
[Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu130torch2.9.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.3` | `2.9.0` | `3.13` | `12.9` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu129torch2.9.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `2.8.3` | `2.9.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu128torch2.9.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.3` | `2.8.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\r\n| `2.8.2` | `2.9.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.2%2Bcu128torch2.9.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.8.2` | `2.8.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.2%2Bcu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.8.2` | `2.8.0` | `3.11` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.8-cp311-cp311-win_amd64.whl) |\r\n| `2.8.2` | `2.8.0` | `3.10` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.8-cp310-cp310-win_amd64.whl) |\r\n| `2.8.2` | `2.7.0` | `3.12` | `12.8` | ✗ | 
[Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.7-cp312-cp312-win_amd64.whl) |\r\n| `2.8.2` | `2.7.0` | `3.11` | `12.8` | ✗ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.7-cp311-cp311-win_amd64.whl) |\r\n| `2.8.2` | `2.7.0` | `3.10` | `12.8` | ✗ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.7-cp310-cp310-win_amd64.whl) |\r\n| `2.8.1` | `2.8.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.1%2Bcu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.8.0.post2` | `2.8.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.0.post2+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.7.4.post1` | `2.8.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.7.4.post1` | `2.8.0` | `3.10` | `12.8` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.8.0cxx11abiTRUE-cp310-cp310-win_amd64.whl?download=true) |\r\n| `2.7.4.post1` | `2.7.0` | `3.12` | `12.8` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.7.4.post1` | `2.7.0` | `3.11` | `12.8` | ✗ | 
[Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp311-cp311-win_amd64.whl?download=true) |\r\n| `2.7.4.post1` | `2.7.0` | `3.10` | `12.8` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp310-cp310-win_amd64.whl?download=true) |\r\n| `2.7.4` | `2.8.0` | `3.12` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.8-cp312-cp312-win_amd64.whl) |\r\n| `2.7.4` | `2.8.0` | `3.11` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.8-cp311-cp311-win_amd64.whl) |\r\n| `2.7.4` | `2.8.0` | `3.10` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.8-cp310-cp310-win_amd64.whl) |\r\n| `2.7.4` | `2.7.0` | `3.12` | `12.8` | ✗ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.7-cp312-cp312-win_amd64.whl) |\r\n| `2.7.4` | `2.7.0` | `3.11` | `12.8` | ✗ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.9\u002Fflash_attn-2.7.4+cu128torch2.7-cp311-cp311-win_amd64.whl) |\r\n| `2.7.4` | `2.7.0` | `3.10` | `12.8` | ✗ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.7-cp310-cp310-win_amd64.whl) |\r\n| `2.7.4` | `2.6.0` | `3.12` | `12.6` | ✗ | 
[Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu126torch2.6.0cxx11abiFALSE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.7.4` | `2.6.0` | `3.11` | `12.6` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu126torch2.6.0cxx11abiFALSE-cp311-cp311-win_amd64.whl?download=true) |\r\n| `2.7.4` | `2.6.0` | `3.10` | `12.6` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu126torch2.6.0cxx11abiFALSE-cp310-cp310-win_amd64.whl?download=true) |\r\n| `2.7.4` | `2.6.0` | `3.12` | `12.4` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu124torch2.6.0cxx11abiFALSE-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.7.4` | `2.6.0` | `3.11` | `12.4` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu124torch2.6.0cxx11abiFALSE-cp311-cp311-win_amd64.whl?download=true) |\r\n| `2.7.4` | `2.6.0` | `3.10` | `12.4` | ✗ | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu124torch2.6.0cxx11abiFALSE-cp310-cp310-win_amd64.whl?download=true) |\r\n\u003C!-- END_FLASHATTENTION_TABLE -->\r\n\r\n\u003Cp id=\"flash-attention-3\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Flash Attention 3\r\nNext-generation Flash Attention with improved performance and 
features.\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwindreamer-FA3%20Wheels-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention-3-wheels-windows)\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmjun0812-FA3%20Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-3-wheels-windows)\r\n\r\n\u003C!-- START_FLASHATTENTION3_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | CXX11 ABI | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|:---:|\r\n| `3.0.0` | `2.10` | `3.9+` | `13.0` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.9.3\u002Fflash_attn_3-3.0.0+cu130torch2.10gite2743ab-cp39-abi3-win_amd64.whl) |\r\n| `3.0.0` | `2.10` | `3.9+` | `13.0` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention3-wheels\u002Freleases\u002Fdownload\u002F2026.03.19-850211f\u002Fflash_attn_3-3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl) |\r\n| `3.0.0` | `2.10` | `3.9+` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention3-wheels\u002Freleases\u002Fdownload\u002F2026.03.19-850211f\u002Fflash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl) |\r\n| `3.0.0` | `2.8` | `3.9+` | `12.8` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention3-wheels\u002Freleases\u002Fdownload\u002F2026.03.19-850211f\u002Fflash_attn_3-3.0.0+20260318.cu128torch280cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl) |\r\n| `3.0.0` | `2.9` | `3.9+` | `13.0` | ✓ | [Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.9.3\u002Fflash_attn_3-3.0.0+cu130torch2.9gite2743ab-cp39-abi3-win_amd64.whl) |\r\n| `3.0.0` | `2.9` | `3.9+` | `12.8` | ✓ | 
[Link](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.9.3\u002Fflash_attn_3-3.0.0+cu128torch2.9gite2743ab-cp39-abi3-win_amd64.whl) |\r\n\u003C!-- END_FLASHATTENTION3_TABLE -->\r\n\r\n\u003Cp id=\"flash-attention-4\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Flash Attention 4\r\nLatest Flash Attention implementation with cutting-edge optimizations.\r\n\r\n\u003C!-- START_FLASHATTENTION4_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n*(No wheels available - package not tracked)*\r\n\u003C!-- END_FLASHATTENTION4_TABLE -->\r\n\r\n\u003Cp id=\"xformers\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 xformers\r\nAnother library for memory-efficient attention and other optimizations.\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Ffacebookresearch-xformers-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Fxformers\u002Freleases)\r\n[![PyTorch](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPyTorch-Wheels-red?style=flat)](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fxformers\u002F)\r\n> [!NOTE]\r\n> PyTorch provides official pre-built wheels for xformers. In most cases you can install it with `pip install xformers`, using the index URL that matches your CUDA version.\r\n\r\n| CUDA Version | Install |\r\n|:---:|:---|\r\n| **CUDA 12.6** | `pip3 install -U xformers --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\r\n| **CUDA 12.8** | `pip3 install -U xformers --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\r\n| **CUDA 13.0** | `pip3 install -U xformers --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\r\n\r\nThe wheels below are abi3 builds and work with any Python from 3.9 to 3.12.\r\n\r\n\u003C!-- START_XFORMERS_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `0.0.34` | `2.11` | `3.9+` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.34+torch2.11cu130-cp39-abi3-win_amd64.whl) |\r\n| `0.0.34` | `2.10` | `3.9+` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.34%2Bd20260123.cu130torch2.10-cp39-abi3-win_amd64.whl) |\r\n| `0.0.34` | `2.10` | `3.9+` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.34+torch2.10cu130-cp39-abi3-win_amd64.whl) |\r\n| `0.0.33` | `2.10` | `3.9+` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.33%2Bcu130torch2.10-cp39-abi3-win_amd64.whl) |\r\n| `0.0.33` | `2.9` | `3.9+` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.33%2Bcu130torch2.9-cp39-abi3-win_amd64.whl) |\r\n| `0.0.32.post2` | `2.8.0` | `3.9+` | `12.9` | [Link](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu129\u002Fxformers-0.0.32.post2-cp39-abi3-win_amd64.whl) |\r\n| `0.0.32.post2` | `2.8.0` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128\u002Fxformers-0.0.32.post2-cp39-abi3-win_amd64.whl) |\r\n| `0.0.32.post2` | `2.8.0` | `3.9+` | `12.6` | [Link](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126\u002Fxformers-0.0.32.post2-cp39-abi3-win_amd64.whl) |\r\n\u003C!-- END_XFORMERS_TABLE --> \r\n\r\n\u003Cp id=\"sageattention\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 
SageAttention\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fthu--ml-SageAttention-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fthu-ml\u002FSageAttention)\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwoct0rdho-Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases)\r\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWildminder-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Ftree\u002Fmain)\r\n\r\n\u003C!-- START_SAGEATTENTION2_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `2.1.1` | `2.8.0` | `3.12` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.1.1+cu128torch2.8.0-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.1.1` | `2.7.0` | `3.10` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu128torch2.7.0-cp310-cp310-win_amd64.whl) |\r\n| `2.1.1` | `2.6.0` | `3.13` | `12.6` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp313-cp313-win_amd64.whl) |\r\n| `2.1.1` | `2.6.0` | `3.12` | `12.6` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp312-cp312-win_amd64.whl) |\r\n| `2.1.1` | `2.6.0` | `3.12` | `12.6` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.1.1+cu126torch2.6.0-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.1.1` | `2.6.0` | `3.11` | `12.6` | 
[Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp311-cp311-win_amd64.whl) |\r\n| `2.1.1` | `2.6.0` | `3.10` | `12.6` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp310-cp310-win_amd64.whl) |\r\n| `2.1.1` | `2.6.0` | `3.9` | `12.6` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp39-cp39-win_amd64.whl) |\r\n| `2.1.1` | `2.5.1` | `3.12` | `12.4` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp312-cp312-win_amd64.whl) |\r\n| `2.1.1` | `2.5.1` | `3.11` | `12.4` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp311-cp311-win_amd64.whl) |\r\n| `2.1.1` | `2.5.1` | `3.10` | `12.4` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp310-cp310-win_amd64.whl) |\r\n| `2.1.1` | `2.5.1` | `3.9` | `12.4` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp39-cp39-win_amd64.whl) |\r\n\u003C!-- END_SAGEATTENTION2_TABLE -->\r\n\r\n◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇\r\n\r\n#### 🛠 SageAttention 2.2 (SageAttention2++)\r\n> [!NOTE]\r\n> Only supports CUDA >= 12.8, therefore PyTorch >= 2.7.\r\n\r\n\u003C!-- START_SAGEATTENTION22_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `2.2.0.post4` | `2.9.0+` | `3.9+` | `13.0` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post4\u002Fsageattention-2.2.0+cu130torch2.9.0andhigher.post4-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post4` | `2.9.0+` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post4\u002Fsageattention-2.2.0+cu128torch2.9.0andhigher.post4-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.10.0` | `3.12` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu130torch2.10.0-cp312-cp312-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.10.0` | `3.13` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.10.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.10.0` | `3.12` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.10.0-cp312-cp312-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.9.0` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu130torch2.9.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.9.0` | `3.13` | `12.9` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu129torch2.9.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0.post3` 
| `2.9.0` | `3.13` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.9.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.9.0` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu128torch2.9.0.post3-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.8.0` | `3.13` | `12.9` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu129torch2.8.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.8.0` | `3.13` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.8.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.8.0` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu128torch2.8.0.post3-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.7.1` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu128torch2.7.1.post3-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.6.0` | `3.9+` | `12.6` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu126torch2.6.0.post3-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post3` | `2.5.1` | `3.9+` | `12.4` | 
[Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu124torch2.5.1.post3-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post2` | `2.9.0` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0%2Bcu128torch2.9.0cxx11abi1-cp312-cp312-win_amd64.whl?download=true) |\r\n| `2.2.0.post2` | `2.8.0` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu128torch2.8.0.post2-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post2` | `2.7.1` | `3.9+` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu128torch2.7.1.post2-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post2` | `2.6.0` | `3.9+` | `12.6` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu126torch2.6.0.post2-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0.post2` | `2.5.1` | `3.9+` | `12.4` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu124torch2.5.1.post2-cp39-abi3-win_amd64.whl) |\r\n| `2.2.0` | `2.8.0` | `3.13` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0` | `2.8.0` | `3.12` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp312-cp312-win_amd64.whl) |\r\n| `2.2.0` | `2.8.0` | `3.11` | `12.8` | 
[Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp311-cp311-win_amd64.whl) |\r\n| `2.2.0` | `2.8.0` | `3.10` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp310-cp310-win_amd64.whl) |\r\n| `2.2.0` | `2.8.0` | `3.9` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp39-cp39-win_amd64.whl) |\r\n| `2.2.0` | `2.7.1` | `3.13` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp313-cp313-win_amd64.whl) |\r\n| `2.2.0` | `2.7.1` | `3.12` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp312-cp312-win_amd64.whl) |\r\n| `2.2.0` | `2.7.1` | `3.11` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp311-cp311-win_amd64.whl) |\r\n| `2.2.0` | `2.7.1` | `3.10` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp310-cp310-win_amd64.whl) |\r\n| `2.2.0` | `2.7.1` | `3.9` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp39-cp39-win_amd64.whl) |\r\n\u003C!-- END_SAGEATTENTION22_TABLE -->\r\n\r\n#### 🛠 SageAttention 
3\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmengqin-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention)\r\n\r\n\u003C!-- START_SAGEATTN3_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `1.0.0` | `2.9.1` | `3.13` | `13.0` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu130torch291-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.9.1` | `3.12` | `13.0` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu130torch291-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.8.0` | `3.13` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch280-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.8.0` | `3.12` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch280-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.8.0` | `3.11` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch280-cp311-cp311-win_amd64.whl) |\r\n| `1.0.0` | `2.7.1` | `3.13` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch271-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.7.1` | `3.12` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch271-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.7.1` | `3.11` | `12.8` | 
[Link](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch271-cp311-cp311-win_amd64.whl) |\r\n\u003C!-- END_SAGEATTN3_TABLE -->\r\n\r\n\u003Cp id=\"nunchaku\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Nunchaku\r\n*   **Official Repo**: [mit-han-lab\u002Fnunchaku](https:\u002F\u002Fgithub.com\u002Fmit-han-lab\u002Fnunchaku\u002Freleases)\r\n\u003C!-- START_NUNCHAKU_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | PyTorch Ver | Python Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|\r\n| `1.2.0` | `2.11` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp313-cp313-win_amd64.whl) |\r\n| `1.2.0` | `2.11` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp312-cp312-win_amd64.whl) |\r\n| `1.2.0` | `2.11` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp311-cp311-win_amd64.whl) |\r\n| `1.2.0` | `2.11` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp310-cp310-win_amd64.whl) |\r\n| `1.2.0` | `2.9` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp313-cp313-win_amd64.whl) |\r\n| `1.2.0` | `2.9` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp312-cp312-win_amd64.whl) |\r\n| `1.2.0` | `2.9` | `3.11` | 
[Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp311-cp311-win_amd64.whl) |\r\n| `1.2.0` | `2.9` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp310-cp310-win_amd64.whl) |\r\n| `1.2.0` | `2.8` | `3.13` | [Link](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.8-cp313-cp313-win_amd64.whl) |\r\n| `1.2.0` | `2.8` | `3.12` | [Link](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.8-cp312-cp312-win_amd64.whl) |\r\n| `1.2.0` | `2.8` | `3.11` | [Link](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.8-cp311-cp311-win_amd64.whl) |\r\n| `1.2.0` | `2.7` | `3.13` | [Link](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp313-cp313-win_amd64.whl) |\r\n| `1.2.0` | `2.7` | `3.12` | [Link](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp312-cp312-win_amd64.whl) |\r\n| `1.2.0` | `2.7` | `3.11` | [Link](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp311-cp311-win_amd64.whl) |\r\n| `1.0.2` | `2.10` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp313-cp313-win_amd64.whl) |\r\n| `1.0.2` | `2.10` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp312-cp312-win_amd64.whl) |\r\n| `1.0.2` | `2.10` | 
`3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp311-cp311-win_amd64.whl) |\r\n| `1.0.2` | `2.10` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp310-cp310-win_amd64.whl) |\r\n| `1.0.2` | `2.9` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp313-cp313-win_amd64.whl) |\r\n| `1.0.2` | `2.9` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp312-cp312-win_amd64.whl) |\r\n| `1.0.2` | `2.9` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp311-cp311-win_amd64.whl) |\r\n| `1.0.2` | `2.9` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp310-cp310-win_amd64.whl) |\r\n| `1.0.2` | `2.8` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp313-cp313-win_amd64.whl) |\r\n| `1.0.2` | `2.8` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp312-cp312-win_amd64.whl) |\r\n| `1.0.2` | `2.8` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp311-cp311-win_amd64.whl) |\r\n| `1.0.2` | `2.8` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp310-cp310-win_amd64.whl) |\r\n| `1.0.2` | `2.7` | `3.13` 
| [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp313-cp313-win_amd64.whl) |\r\n| `1.0.2` | `2.7` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp312-cp312-win_amd64.whl) |\r\n| `1.0.2` | `2.7` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp311-cp311-win_amd64.whl) |\r\n| `1.0.2` | `2.7` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp310-cp310-win_amd64.whl) |\r\n| `1.0.1` | `2.10` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp313-cp313-win_amd64.whl) |\r\n| `1.0.1` | `2.10` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.10` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp311-cp311-win_amd64.whl) |\r\n| `1.0.1` | `2.10` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp310-cp310-win_amd64.whl) |\r\n| `1.0.1` | `2.9` | `3.13` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu128torch2.9-cp313-cp313-win_amd64.whl) |\r\n| `1.0.1` | `2.9` | `3.13` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu130torch2.9-cp313-cp313-win_amd64.whl) |\r\n| `1.0.1` | `2.9` | `3.12` | 
[Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu128torch2.9-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.9` | `3.12` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu130torch2.9-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.8` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp313-cp313-win_amd64.whl) |\r\n| `1.0.1` | `2.8` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.8` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp311-cp311-win_amd64.whl) |\r\n| `1.0.1` | `2.8` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp310-cp310-win_amd64.whl) |\r\n| `1.0.1` | `2.7` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp313-cp313-win_amd64.whl) |\r\n| `1.0.1` | `2.7` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.7` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp311-cp311-win_amd64.whl) |\r\n| `1.0.1` | `2.7` | `3.10` | 
[Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp310-cp310-win_amd64.whl) |\r\n| `1.0.1` | `2.6` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp313-cp313-win_amd64.whl) |\r\n| `1.0.1` | `2.6` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.6` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp311-cp311-win_amd64.whl) |\r\n| `1.0.1` | `2.6` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp310-cp310-win_amd64.whl) |\r\n| `1.0.1` | `2.5` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.5-cp312-cp312-win_amd64.whl) |\r\n| `1.0.1` | `2.5` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.5-cp311-cp311-win_amd64.whl) |\r\n| `1.0.1` | `2.5` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.5-cp310-cp310-win_amd64.whl) |\r\n| `1.0.0` | `2.9` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.9` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.9` | `3.11` | 
[Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp311-cp311-win_amd64.whl) |\r\n| `1.0.0` | `2.9` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp310-cp310-win_amd64.whl) |\r\n| `1.0.0` | `2.8` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.8` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.8` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp311-cp311-win_amd64.whl) |\r\n| `1.0.0` | `2.8` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp310-cp310-win_amd64.whl) |\r\n| `1.0.0` | `2.7` | `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.7` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.7` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp311-cp311-win_amd64.whl) |\r\n| `1.0.0` | `2.7` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp310-cp310-win_amd64.whl) |\r\n| `1.0.0` | `2.6` | `3.13` | 
[Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp313-cp313-win_amd64.whl) |\r\n| `1.0.0` | `2.6` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.6` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp311-cp311-win_amd64.whl) |\r\n| `1.0.0` | `2.6` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp310-cp310-win_amd64.whl) |\r\n| `1.0.0` | `2.5` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.5-cp312-cp312-win_amd64.whl) |\r\n| `1.0.0` | `2.5` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.5-cp311-cp311-win_amd64.whl) |\r\n| `1.0.0` | `2.5` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.5-cp310-cp310-win_amd64.whl) |\r\n| `0.3.2` | `2.9` | `3.12` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-0.3.2%2Btorch2.9-cp312-cp312-win_amd64.whl?download=true) |\r\n| `0.3.2` | `2.8` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.8-cp312-cp312-win_amd64.whl) |\r\n| `0.3.2` | `2.8` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.8-cp311-cp311-win_amd64.whl) |\r\n| `0.3.2` | `2.8` | `3.10` | 
[Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.8-cp310-cp310-win_amd64.whl) |\r\n| `0.3.2` | `2.7` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.7-cp312-cp312-win_amd64.whl) |\r\n| `0.3.2` | `2.7` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.7-cp311-cp311-win_amd64.whl) |\r\n| `0.3.2` | `2.7` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.7-cp310-cp310-win_amd64.whl) |\r\n| `0.3.2` | `2.6` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.6-cp312-cp312-win_amd64.whl) |\r\n| `0.3.2` | `2.6` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.6-cp311-cp311-win_amd64.whl) |\r\n| `0.3.2` | `2.6` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.6-cp310-cp310-win_amd64.whl) |\r\n| `0.3.2` | `2.5` | `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.5-cp312-cp312-win_amd64.whl) |\r\n| `0.3.2` | `2.5` | `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.5-cp311-cp311-win_amd64.whl) |\r\n| `0.3.2` | `2.5` | `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.5-cp310-cp310-win_amd64.whl) |\r\n\u003C!-- END_NUNCHAKU_TABLE -->\r\n  
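The wheel filenames in the tables above encode the CPython tag (`cp310` … `cp314`) alongside the PyTorch/CUDA build they target, and `pip` will refuse a wheel whose tag does not match the running interpreter. A minimal stdlib-only sketch for printing the tag of your own interpreter, so you can match it against the "Python Ver" column by hand (the filename in the comment is one example row from the table above):

```python
import sys

# Wheel names such as nunchaku-1.0.1+torch2.8-cp312-cp312-win_amd64.whl
# embed the CPython version tag. This prints the tag of the interpreter
# you are running, so you can pick the matching row from the tables.
abi_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
print(abi_tag)  # e.g. cp312 on Python 3.12
```

This is only a convenience for choosing a link; `pip` performs the same compatibility check itself at install time.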
\r\n\u003Cp id=\"natten\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 NATTEN\r\nNeighborhood Attention Transformer.\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FSHI--Labs-NATTEN-blue?style=flat)](https:\u002F\u002Fgithub.com\u002FSHI-Labs\u002FNATTEN)\r\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flldacing-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Ftree\u002Fmain)\r\n\r\n\u003C!-- START_NATTEN_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `0.17.5` | `2.7.0` | `3.12` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch270cu128-cp312-cp312-win_amd64.whl) |\r\n| `0.17.5` | `2.7.0` | `3.11` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch270cu128-cp311-cp311-win_amd64.whl) |\r\n| `0.17.5` | `2.7.0` | `3.10` | `12.8` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch270cu128-cp310-cp310-win_amd64.whl) |\r\n| `0.17.5` | `2.6.0` | `3.12` | `12.6` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch260cu126-cp312-cp312-win_amd64.whl) |\r\n| `0.17.5` | `2.6.0` | `3.11` | `12.6` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch260cu126-cp311-cp311-win_amd64.whl) |\r\n| `0.17.5` | `2.6.0` | `3.10` | `12.6` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch260cu126-cp310-cp310-win_amd64.whl) |\r\n| `0.17.3` | `2.5.1` | `3.12` | `12.4` | 
[Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch251cu124-cp312-cp312-win_amd64.whl) |\r\n| `0.17.3` | `2.5.1` | `3.11` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch251cu124-cp311-cp311-win_amd64.whl) |\r\n| `0.17.3` | `2.5.1` | `3.10` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch251cu124-cp310-cp310-win_amd64.whl) |\r\n| `0.17.3` | `2.5.0` | `3.12` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch250cu124-cp312-cp312-win_amd64.whl) |\r\n| `0.17.3` | `2.5.0` | `3.11` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch250cu124-cp311-cp311-win_amd64.whl) |\r\n| `0.17.3` | `2.5.0` | `3.10` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch250cu124-cp310-cp310-win_amd64.whl) |\r\n| `0.17.3` | `2.4.1` | `3.12` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch241cu124-cp312-cp312-win_amd64.whl) |\r\n| `0.17.3` | `2.4.1` | `3.11` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch241cu124-cp311-cp311-win_amd64.whl) |\r\n| `0.17.3` | `2.4.1` | `3.10` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch241cu124-cp310-cp310-win_amd64.whl) |\r\n| `0.17.3` | `2.4.0` | `3.12` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch240cu124-cp312-cp312-win_amd64.whl) |\r\n| `0.17.3` | `2.4.0` | `3.11` | `12.4` | 
[Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch240cu124-cp311-cp311-win_amd64.whl) |\r\n| `0.17.3` | `2.4.0` | `3.10` | `12.4` | [Link](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch240cu124-cp310-cp310-win_amd64.whl) |\r\n\u003C!-- END_NATTEN_TABLE -->\r\n\u003Cp id=\"triton\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Triton (Windows Fork)\r\nTriton is a language and compiler for writing highly efficient custom deep-learning primitives. Not officially supported on Windows, but a fork provides pre-built wheels.\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Ftriton--lang-triton--windows-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Ftriton-lang\u002Ftriton-windows)\r\n\r\n**Supported GPUs**:\r\n> [!NOTE]\r\n> Different GPU architectures require different Triton versions due to compute capability support.\r\n\r\n| Triton Version | Supported GPUs | Compute Capability |\r\n|:---:|:---|:---:|\r\n| `3.6.x` | RTX 50xx (Blackwell), RTX 40xx, Ada Lovelace, Hopper | SM 8.9, 9.0, 10.0 |\r\n| `3.5.x` | RTX 30xx, 40xx, Ada Lovelace, Hopper | SM 8.0, 8.9, 9.0 |\r\n| `3.4.x` | RTX 20xx, 30xx, 40xx, Ada Lovelace, Hopper | SM 7.5, 8.0, 8.9, 9.0 |\r\n| `\u003C= 3.2.x` | GTX\u002FRTX 16xx, RTX 20xx, 30xx, 40xx, Ada Lovelace, Hopper | SM 7.0, 7.5, 8.0, 8.9, 9.0 |\r\n\r\n**Installation**:\r\n\r\n| Package Version | PyTorch Ver | Compute Capability | Install |\r\n|:---:|:---:|:---:|:---|\r\n| `3.6.x` | >= 2.9 | SM 8.9+ | `pip install -U \"triton-windows\u003C3.7\"` |\r\n| `3.5.x` | >= 2.9 | SM 8.0+ | `pip install -U \"triton-windows\u003C3.6\"` |\r\n| `3.4.x` | >= 2.8 | SM 7.5+ | `pip install -U \"triton-windows\u003C3.5\"` |\r\n\r\n**Python libs**:\r\n> [!IMPORTANT]\r\n> Triton requires additional Python development libraries for building CUDA kernels. 
Download the package matching your Python version, extract the ZIP file, and copy the `include` and `libs` folders to your Python installation directory.\r\n\r\n| Python Ver | Download |\r\n|:---:|:---:|\r\n| `3.13` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.13.2_include_libs.zip) |\r\n| `3.12` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.12.7_include_libs.zip) |\r\n| `3.11` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.11.9_include_libs.zip) |\r\n| `3.10` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.10.11_include_libs.zip) |\r\n| `3.9` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.9.13_include_libs.zip) |\r\n\r\n\u003Cp id=\"bitsandbytes\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 bitsandbytes\r\nA lightweight wrapper around CUDA custom functions, particularly for 8-bit optimizers, matrix multiplication (LLM.int8()), and quantization functions.\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbitsandbytes--foundation-bitsandbytes-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fbitsandbytes-foundation\u002Fbitsandbytes)\r\n\r\n\u003Cp align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 RadialAttention for ComfyUI\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwoct0rdho-ComfyUI--RadialAttn-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FComfyUI-RadialAttn)\r\n\r\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\r\n\r\n\u003Cp 
id=\"spargeattn\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 SpargeAttn\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fthu--ml-SpargeAttn-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fthu-ml\u002FSpargeAttn)\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwoct0rdho-Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSpargeAttn\u002Freleases)\r\n\u003C!-- START_SPARGEATTN_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | PyTorch Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|\r\n| `0.1.0.post1` | `2.8.0` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSpargeAttn\u002Freleases\u002Fdownload\u002Fv0.1.0-windows.post1\u002Fspas_sage_attn-0.1.0+cu128torch2.8.0.post1-cp39-abi3-win_amd64.whl) |\r\n| `0.1.0.post1` | `2.7.1` | `12.8` | [Link](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSpargeAttn\u002Freleases\u002Fdownload\u002Fv0.1.0-windows.post1\u002Fspas_sage_attn-0.1.0+cu128torch2.7.1.post1-cp39-abi3-win_amd64.whl) |\r\n\u003C!-- END_SPARGEATTN_TABLE -->\r\n\r\n\u003Cp id=\"block_sparse_attn\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Block Sparse Attention\r\n\r\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmit--han--lab-Block%20Sparse-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fmit-han-lab\u002FBlock-Sparse-Attention\u002F)\r\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWildminder-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Ftree\u002Fmain)\r\n\u003C!-- START_BLOCKSPARSEATTN_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|\r\n| `0.0.2.post1` | `2.11` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fblock_sparse_attn-0.0.2.post1+cu130torch2.11cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `0.0.2.post1` | `2.10` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fblock_sparse_attn-0.0.2.post1+cu130torch2.10cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n| `0.0.2.post1` | `2.9.1` | `3.13` | `13.0` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fblock_sparse_attn-0.0.2.post1+cu130torch2.9.1cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\r\n\u003C!-- END_BLOCKSPARSEATTN_TABLE -->\r\n\r\n\u003Cp id=\"deepspeed\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 DeepSpeed\r\n* A deep learning optimization library \r\n* **Official Repo**: [https:\u002F\u002Fgithub.com\u002Fdeepspeedai\u002FDeepSpeed](https:\u002F\u002Fgithub.com\u002Fdeepspeedai\u002FDeepSpeed)\r\n\u003C!-- START_DEEPSPEED_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | Python Ver | Download Link |\r\n|:---:|:---:|:---:|\r\n| `0.18.6` | `3.13` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fdeepspeed\u002Fdeepspeed-0.18.6+0ccb2bb6-cp313-cp313-win_amd64.whl) |\r\n\u003C!-- END_DEEPSPEED_TABLE -->\r\n\r\n\u003Cp id=\"fairseq\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 Fairseq\r\n* Facebook AI Research Sequence-to-Sequence Toolkit \r\n* **Official Repo**: [https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Ffairseq](https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Ffairseq)\r\n\u003C!-- START_FAIRSEQ_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. -->\r\n| Package Version | Python Ver | Download Link |\r\n|:---:|:---:|:---:|\r\n| `0.12.2` | `3.13` | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ffairseq\u002Ffairseq-0.12.2-cp313-cp313-win_amd64.whl) |\r\n\u003C!-- END_FAIRSEQ_TABLE -->\r\n\r\n\u003Cp id=\"causalconv1d\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\r\n\r\n### 🛠 causal_conv1d\r\n\u003C!-- START_CAUSALCONV1D_TABLE -->\r\n\u003C!-- This table is auto-generated. Do not edit manually. 
-->\r\n| Package Version | PyTorch Ver | Python Ver | CUDA Ver | CXX11 ABI | Download Link |\r\n|:---:|:---:|:---:|:---:|:---:|:---:|\r\n| `1.6.1` | `2.11.0` | `3.14` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fcausal_conv1d\u002Fcausal_conv1d-1.6.1+d20260310.cu130torch2.11.0cxx11abi1-cp314-cp314-win_amd64.whl) |\r\n| `1.6.1` | `2.11.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fcausal_conv1d\u002Fcausal_conv1d-1.6.1+d20260310.cu130torch2.11.0cxx11abi1-cp313-cp313-win_amd64.whl) |\r\n| `1.6.1` | `2.10.0` | `3.13` | `13.0` | ✓ | [Link](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fcausal_conv1d\u002Fcausal_conv1d-1.6.1+d20260310.cu130torch2.10.0cxx11abi1-cp313-cp313-win_amd64.whl) |\r\n\u003C!-- END_CAUSALCONV1D_TABLE -->\r\n\r\n\r\n\r\n\r\n\u003Cp align=\"center\">▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀\u003C\u002Fp>\r\n\r\n\u003C!-- DATA ACCESS -->\r\n## 🌐 Accessing Data Programmatically (wheels.json)\r\n\r\nAll wheel information in this repository is managed in the `wheels.json` file, which serves as the single source of truth. 
The tables in this README are automatically generated from this file.\r\n\r\nThis provides a stable, structured JSON endpoint for any external tool or application that needs to access this data without parsing Markdown.\r\n\r\n### ➤ How to Use\r\n\r\nYou can access the raw JSON file directly via the following URL:\r\n\r\n```\r\nhttps:\u002F\u002Fraw.githubusercontent.com\u002Fwildminder\u002FAI-windows-whl\u002Fmain\u002Fwheels.json\r\n```\r\n\r\n**Example using `curl`:**\r\n```sh\r\ncurl -L -o wheels.json https:\u002F\u002Fraw.githubusercontent.com\u002Fwildminder\u002FAI-windows-whl\u002Fmain\u002Fwheels.json\r\n```\r\n\r\nThe file contains a list of `packages`, each with its metadata and an array of `wheels`, where each wheel object contains version details and a direct download `url`.\r\n\r\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\r\n\r\n\u003Cp align=\"center\">▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀\u003C\u002Fp>\r\n\r\n\u003C!-- CONTRIBUTING -->\r\n## ➤ Contributing\r\n\r\nContributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**.\r\n\r\nIf you have found a new pre-built wheel or a reliable source, please fork the repo and create a pull request, or simply open an issue with the link.\r\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\r\n\r\n\r\n\r\n\u003C!-- ACKNOWLEDGMENTS -->\r\n## ➤ Acknowledgments\r\n\r\nThis repository is simply a collection of links. 
Huge thanks to the individuals and groups who do the hard work of building and hosting these wheels for the community:\r\n\r\n\r\n\u003C!-- MARKDOWN LINKS & IMAGES -->\r\n[contributors-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\r\n[contributors-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fgraphs\u002Fcontributors\r\n[forks-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\r\n[forks-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fnetwork\u002Fmembers\r\n[stars-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\r\n[stars-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fstargazers\r\n[issues-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\r\n[issues-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fissues\r\n","\u003C!-- Improved compatibility of the back-to-top link; see https:\u002F\u002Fgithub.com\u002Fothneildrew\u002FBest-README-Template\u002Fpull\u002F73 -->\r\n\u003C!-- PROJECT LOGO -->\r\n\u003Ca id=\"readme-top\">\u003C\u002Fa>\r\n\u003Cdiv align=\"center\">\r\n  \u003Ch1 align=\"center\">Windows AI Wheels\u003C\u002Fh1>\r\n\r\n\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_14d6df726b07.png\" alt=\"AI-windows-whl logo\">\r\n\r\n  \u003Cp align=\"center\">\r\n    A curated collection of pre-built Python wheels for AI\u002FML libraries that are hard to install on Windows.\r\n    \u003Cbr \u002F>\r\n    \u003Cbr \u002F>\r\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fwildminder\u002FAI-windows-whl\u002Fissues\u002Fnew?labels=bug&template=bug-report---.md\">Report Broken Link\u003C\u002Fa>\r\n    ·\r\n    \u003Ca 
href=\"https:\u002F\u002Fgithub.com\u002Fwildminder\u002FAI-windows-whl\u002Fissues\u002Fnew?labels=enhancement&template=feature-request---.md\">Request New Wheel\u003C\u002Fa>\r\n  \u003C\u002Fp>\r\n\u003C\u002Fdiv>\r\n\r\n\r\n\u003C!-- TABLE OF CONTENTS -->\r\n\u003Cdetails>\r\n  \u003Csummary>Table of Contents\u003C\u002Fsummary>\r\n  \u003Col>\r\n    \u003Cli>\u003Ca href=\"#about-the-project\">About The Project\u003C\u002Fa>\u003C\u002Fli>\r\n    \u003Cli>\r\n      \u003Ca href=\"#getting-started\">Getting Started\u003C\u002Fa>\r\n      \u003Cul>\r\n        \u003Cli>\u003Ca href=\"#prerequisites\">Prerequisites\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#installation\">Installation\u003C\u002Fa>\u003C\u002Fli>\r\n      \u003C\u002Ful>\r\n    \u003C\u002Fli>\r\n    \u003Cli>\u003Ca href=\"#available-wheels\">Available Wheels\u003C\u002Fa>\r\n      \u003Cul>\r\n        \u003Cli>\u003Ca href=\"#pytorch\">PyTorch\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#flash-attention\">Flash Attention\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#xformers\">xformers\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#sageattention\">SageAttention\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#natten\">NATTEN\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#triton\">Triton (Windows Fork)\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#bitsandbytes\">bitsandbytes\u003C\u002Fa>\u003C\u002Fli>\r\n        \u003Cli>\u003Ca href=\"#other-packages\">Other Packages\u003C\u002Fa>\u003C\u002Fli>\r\n      \u003C\u002Ful>\r\n    \u003C\u002Fli>\r\n  \u003C\u002Fol>\r\n\u003C\u002Fdetails>\r\n\r\n\u003Cdiv align=\"center\">\r\n\u003Ca href=\"#pytorch\">\u003Cimg width=\"120\" height=\"52\" alt=\"PyTorch\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_6a1342911344.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#torchaudio\">\u003Cimg width=\"120\" height=\"52\" alt=\"Torchaudio\" 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_68d96e09a63d.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#flash-attention\">\u003Cimg width=\"120\" height=\"52\" alt=\"Flash Attention\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_80422bbcc74a.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#xformers\">\u003Cimg width=\"120\" height=\"52\" alt=\"xFormers\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_db27055f33a9.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#sageattention\">\u003Cimg width=\"120\" height=\"52\" alt=\"SageAttention\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_ad1c8deaaa68.png\" \u002F>\u003C\u002Fa>\r\n  \u003Ca href=\"#nunchaku\">\u003Cimg width=\"120\" height=\"52\" alt=\"Nunchaku\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_62cdd845629b.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#natten\">\u003Cimg width=\"120\" height=\"52\" alt=\"Natten\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_174f731582ae.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#triton\">\u003Cimg width=\"120\" height=\"52\" alt=\"triton\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_31e63ea4da5e.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#spargeattn\">\u003Cimg width=\"120\" height=\"52\" alt=\"SpargeAttn\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_82636fdbed65.png\" \u002F>\u003C\u002Fa>\r\n\u003Ca href=\"#bitsandbytes\">\u003Cimg width=\"120\" height=\"52\" alt=\"bitsandbytes\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_1f13c2b4224c.png\" \u002F>\u003C\u002Fa>\r\n\u003C\u002Fdiv>\r\n\r\n\r\n\r\n\u003C!-- ABOUT THE PROJECT -->\r\n## 
About The Project\n\nThis repository tackles a pain point shared by AI enthusiasts and developers on Windows: **building complex Python packages from source.** Libraries such as `flash-attention` and `xformers` are essential for high-performance AI workloads, yet they usually lack official pre-built wheels for Windows, forcing users through a complicated, error-prone compilation process.\n\nOur goal is to provide a centralized, regularly updated collection of direct download links to pre-built `.whl` files, aimed primarily at the **ComfyUI** community and other PyTorch users on Windows. This saves you time for what matters most: creating amazing things with AI.\n\n### Find Windows AI Wheels\nFor convenience, the **[Find Windows AI Wheels](https:\u002F\u002Fwildminder.github.io\u002FAI-windows-whl\u002F)** page lets you quickly search for the package you need.\n\u003Cdiv align=\"center\">\n\u003Ca  href=\"https:\u002F\u002Fwildminder.github.io\u002FAI-windows-whl\u002F\">\n\u003Cimg width=\"70%\" alt=\"image\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002Fwildminder_AI-windows-whl_readme_23a1f7f7a854.png\" \u002F>\n\u003C\u002Fa>  \n\u003C\u002Fdiv>\n\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\n\n\n\u003C!-- GETTING STARTED -->\n## Getting Started\n\nFollow these simple steps to use the wheels in this repository.\n\n### Prerequisites\n\n1.  **Python for Windows**: Make sure a compatible Python version is installed (PyTorch currently supports **Python 3.9 - 3.14** on Windows). You can get it from the [official Python website](https:\u002F\u002Fwww.python.org\u002Fdownloads\u002Fwindows\u002F).\n\n### Installation\n\nTo install a wheel, simply point `pip` at the direct URL of the `.whl` file. Note that the URL must be wrapped in quotes.\n\n```sh\n# Example: installing a specific flash-attention wheel\npip install \"https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp312-cp312-win_amd64.whl\"\n```\n\n> [!TIP]\n> Find the package you need in the [Available Wheels](#available-wheels) section below, pick the row matching your environment (Python, PyTorch, and CUDA versions), and copy that link into the `pip install` command.\n\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">back to top\u003C\u002Fa>)\u003C\u002Fp>\n\n\n\u003C!-- AVAILABLE WHEELS -->\n## Available Wheels\n\nBelow is the list of packages currently included.\n\n\u003Ca id=\"pytorch\">\u003C\u002Fa>\n\n### 🛠 PyTorch\nThe foundation of everything. Install it from the official source first.\n*   **Official install page**: [https:\u002F\u002Fpytorch.org\u002Fget-started\u002Flocally\u002F](https:\u002F\u002Fpytorch.org\u002Fget-started\u002Flocally\u002F)\n\nFor convenience, here are direct install commands for specific versions on Linux\u002FWSL systems with an NVIDIA GPU. For other configurations (CPU, macOS, ROCm), use the official install page.\n\n#### Stable (2.11.0)\nThis is the recommended version for most users.\n\n| CUDA Version | pip install command |\n|:-------------|:--------------------|\n| **CUDA 13.0**  | `pip install torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\n| **CUDA 12.8**  | `pip install torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\n| **CUDA 12.6**  | `pip install torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\n\n\u003Cdetails>\n  \u003Csummary>Previous stable versions\u003C\u002Fsummary>\n\n#### Stable (2.10.0)\n\n| CUDA Version | pip install command |\n|:-------------|:--------------------|\n| **CUDA 13.0**  | `pip install \"torch>=2.10.0.dev,\u003C2.11.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\n| **CUDA 12.8**  | `pip install \"torch>=2.10.0.dev,\u003C2.11.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\n| **CUDA 12.6**  | `pip install \"torch>=2.10.0.dev,\u003C2.11.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\n\n#### Previous version (2.9.1)\n\n| CUDA Version | pip install command |\n|:-------------|:--------------------|\n| **CUDA 13.0**  | `pip install \"torch>=2.9.0.dev,\u003C2.10.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\n| **CUDA 12.8**  | `pip install \"torch>=2.9.0.dev,\u003C2.10.0\" torchvision 
--index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\n| **CUDA 12.6**  | `pip install \"torch>=2.9.0.dev,\u003C2.10.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\n\n#### Previous stable (2.8.0)\n| CUDA Version | pip install command |\n|:-------------|:--------------------|\n| **CUDA 12.9**  | `pip install \"torch>=2.8.0.dev,\u003C2.9.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu129` |\n| **CUDA 12.8**  | `pip install \"torch>=2.8.0.dev,\u003C2.9.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\n| **CUDA 12.6**  | `pip install \"torch>=2.8.0.dev,\u003C2.9.0\" torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\n\n#### Previous stable (2.7.1)\n| CUDA Version | pip install command |\n|:-------------|:--------------------|\n| **CUDA 12.8**  | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\n| **CUDA 12.6**  | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\n| **CUDA 11.8**  | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu118` |\n| **CPU only**   | `pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcpu` |\n\n\u003C\u002Fdetails>\n\n---\n\n#### Nightly Builds\nUse these to try the latest features, at the possible cost of stability.\n\n**PyTorch 2.12 (Nightly)**\n| CUDA Version | pip install command 
                 |\n|:-------------|:---------------------------------------------------------------------------------------------------------|\n| **CUDA 13.0**  | `pip install --pre torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fnightly\u002Fcu130` |\n| **CUDA 12.8**  | `pip install --pre torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fnightly\u002Fcu128` |\n| **CUDA 12.6**  | `pip install --pre torch torchvision --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fnightly\u002Fcu126` |\n\n\u003Cp id=\"torchaudio\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Torchaudio\n\u003C!-- START_TORCHAUDIO_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|\n| `2.11.0a0` | `2.12.0` | `3.14` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.12.0cxx11abi1-cp314-cp314-win_amd64.whl) |\n| `2.11.0a0` | `2.12.0` | `3.13` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.12.0cxx11abi1-cp313-cp313-win_amd64.whl) |\n| `2.11.0a0` | `2.11.0` | `3.14` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.11.0cxx11abi1-cp314-cp314-win_amd64.whl) |\n| `2.11.0a0` | `2.11.0` | `3.13` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260219.cu130torch2.11.0cxx11abi1-cp313-cp313-win_amd64.whl) |\n| `2.11.0a0` | `2.10.0` | `3.13` | `13.0` | 
[链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260121.cu130torch2.10.0cxx11abi1-cp313-cp313-win_amd64.whl) |\n| `2.11.0a0` | `2.10.0` | `3.12` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+d20260121.cu130torch2.10.0cxx11abi1-cp312-cp312-win_amd64.whl) |\n| `2.11.0a0` | `2.10.0` | `3.13` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.11.0a0+cu128torch2.10.0cxx11abi1-cp313-cp313-win_amd64.whl) |\n| `2.8.0a0` | `2.9.0` | `3.12` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ftorchaudio\u002Ftorchaudio-2.8.0a0+cu128torch2.9.0cxx11abi1-cp312-cp312-win_amd64.whl) |\n\u003C!-- END_TORCHAUDIO_TABLE -->\n\n```sh\n# Torchcodec\npip install torchcodec\n```\n\n\u003Cp id=\"flash-attention\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Flash 
Attention\n高性能注意力机制实现。\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDao--AILab-flash--attention-blue?style=flat)](https:\u002F\u002Fgithub.com\u002FDao-AILab\u002Fflash-attention)\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flldacing-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Ftree\u002Fmain)\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWildminder-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Ftree\u002Fmain)\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmjun0812-Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels)\n\n\u003C!-- START_FLASHATTENTION_TABLE -->\n\u003C!-- 本表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | CXX11 ABI | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|:---:|\n| `2.8.4` | `2.12.0` | `3.14` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.12.0cxx11abiTRUE-cp314-cp314-win_amd64.whl) |\n| `2.8.4` | `2.12.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.12.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.4` | `2.11.0` | `3.14` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.11.0cxx11abiTRUE-cp314-cp314-win_amd64.whl) |\n| `2.8.4` | `2.11.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.4+d20260328cu130torch2.11.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.11.0` | `3.13` | `13.0` | ✓ | 
[链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.11.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.11.0` | `3.12` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bd20260120.cu130torch2.11.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| `2.8.3` | `2.10.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bd20260121.cu130torch2.10.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.10.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.10.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.10.0` | `3.12` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bd20260121.cu130torch2.10.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| `2.8.3` | `2.10.0` | `3.12` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.10.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| `2.8.3` | `2.10.0` | `3.13` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu128torch2.10.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.9.1` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu130torch2.9.1cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.9.1` | `3.12` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu130torch2.9.1cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| 
`2.8.3` | `2.9.1` | `3.13` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu128torch2.9.1cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.9.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu130torch2.9.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.9.0` | `3.12` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3%2Bcu130torch2.9.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| `2.8.3` | `2.9.0` | `3.13` | `12.9` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu129torch2.9.0cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `2.8.3` | `2.9.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu128torch2.9.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| `2.8.3` | `2.8.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.3+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl) |\n| `2.8.2` | `2.9.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.2%2Bcu128torch2.9.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.8.2` | `2.8.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.2%2Bcu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.8.2` | `2.8.0` | `3.11` | `12.8` | ✓ | 
[链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.8-cp311-cp311-win_amd64.whl) |\n| `2.8.2` | `2.8.0` | `3.10` | `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.8-cp310-cp310-win_amd64.whl) |\n| `2.8.2` | `2.7.0` | `3.12` | `12.8` | ✗ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.7-cp312-cp312-win_amd64.whl) |\n| `2.8.2` | `2.7.0` | `3.11` | `12.8` | ✗ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.7-cp311-cp311-win_amd64.whl) |\n| `2.8.2` | `2.7.0` | `3.10` | `12.8` | ✗ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.8.2+cu128torch2.7-cp310-cp310-win_amd64.whl) |\n| `2.8.1` | `2.8.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.1%2Bcu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.8.0.post2` | `2.8.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.8.0.post2+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.7.4.post1` | `2.8.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.7.4.post1` | `2.8.0` | `3.10` | `12.8` | ✓ | 
[链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.8.0cxx11abiTRUE-cp310-cp310-win_amd64.whl?download=true) |\n| `2.7.4.post1` | `2.7.0` | `3.12` | `12.8` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.7.4.post1` | `2.7.0` | `3.11` | `12.8` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp311-cp311-win_amd64.whl?download=true) |\n| `2.7.4.post1` | `2.7.0` | `3.10` | `12.8` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp310-cp310-win_amd64.whl?download=true) |\n| `2.7.4` | `2.8.0` | `3.12` | `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.8-cp312-cp312-win_amd64.whl) |\n| `2.7.4` | `2.8.0` | `3.11` | `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.8-cp311-cp311-win_amd64.whl) |\n| `2.7.4` | `2.8.0` | `3.10` | `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.8-cp310-cp310-win_amd64.whl) |\n| `2.7.4` | `2.7.0` | `3.12` | `12.8` | ✗ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.7-cp312-cp312-win_amd64.whl) |\n| `2.7.4` | `2.7.0` | `3.11` | `12.8` | ✗ | 
[链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.9\u002Fflash_attn-2.7.4+cu128torch2.7-cp311-cp311-win_amd64.whl) |\n| `2.7.4` | `2.7.0` | `3.10` | `12.8` | ✗ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.4.10\u002Fflash_attn-2.7.4+cu128torch2.7-cp310-cp310-win_amd64.whl) |\n| `2.7.4` | `2.6.0` | `3.12` | `12.6` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu126torch2.6.0cxx11abiFALSE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.7.4` | `2.6.0` | `3.11` | `12.6` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu126torch2.6.0cxx11abiFALSE-cp311-cp311-win_amd64.whl?download=true) |\n| `2.7.4` | `2.6.0` | `3.10` | `12.6` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu126torch2.6.0cxx11abiFALSE-cp310-cp310-win_amd64.whl?download=true) |\n| `2.7.4` | `2.6.0` | `3.12` | `12.4` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu124torch2.6.0cxx11abiFALSE-cp312-cp312-win_amd64.whl?download=true) |\n| `2.7.4` | `2.6.0` | `3.11` | `12.4` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu124torch2.6.0cxx11abiFALSE-cp311-cp311-win_amd64.whl?download=true) |\n| `2.7.4` | `2.6.0` | `3.10` | `12.4` | ✗ | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002Fflash-attention-windows-wheel\u002Fresolve\u002Fmain\u002Fflash_attn-2.7.4+cu124torch2.6.0cxx11abiFALSE-cp310-cp310-win_amd64.whl?download=true) |\n\u003C!-- END_FLASHATTENTION_TABLE -->\n\n\u003Cp id=\"flash-attention-3\" 
align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Flash Attention 3\n下一代 Flash Attention，性能和功能均有所提升。\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwindreamer-FA3%20Wheels-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention-3-wheels-windows)\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmjun0812-FA3%20Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-3-wheels-windows)\n\n\u003C!-- START_FLASHATTENTION3_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | CXX11 ABI | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|:---:|\n| `3.0.0` | `2.10` | `3.9+` | `13.0` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.9.3\u002Fflash_attn_3-3.0.0+cu130torch2.10gite2743ab-cp39-abi3-win_amd64.whl) |\n| `3.0.0` | `2.10` | `3.9+` | `13.0` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention3-wheels\u002Freleases\u002Fdownload\u002F2026.03.19-850211f\u002Fflash_attn_3-3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl) |\n| `3.0.0` | `2.10` | `3.9+` | `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention3-wheels\u002Freleases\u002Fdownload\u002F2026.03.19-850211f\u002Fflash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl) |\n| `3.0.0` | `2.10` | `3.9+` | `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fwindreamer\u002Fflash-attention3-wheels\u002Freleases\u002Fdownload\u002F2026.03.19-850211f\u002Fflash_attn_3-3.0.0+20260318.cu128torch280cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl) |\n| `3.0.0` | `2.9` | `3.9+` | `13.0` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.9.3\u002Fflash_attn_3-3.0.0+cu130torch2.9gite2743ab-cp39-abi3-win_amd64.whl) |\n| `3.0.0` | `2.9` | `3.9+` 
| `12.8` | ✓ | [链接](https:\u002F\u002Fgithub.com\u002Fmjun0812\u002Fflash-attention-prebuild-wheels\u002Freleases\u002Fdownload\u002Fv0.9.3\u002Fflash_attn_3-3.0.0+cu128torch2.9gite2743ab-cp39-abi3-win_amd64.whl) |\n\u003C!-- END_FLASHATTENTION3_TABLE -->\n\n\u003Cp id=\"flash-attention-4\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Flash Attention 4\n最新版 Flash Attention 实现，采用前沿优化技术。\n\n\u003C!-- START_FLASHATTENTION4_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n*(暂无预编译轮子可用 - 未跟踪该软件包)*\n\u003C!-- END_FLASHATTENTION4_TABLE -->\n\n\u003Cp id=\"xformers\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 xformers\n另一款用于内存高效注意力机制及其他优化的库。\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Ffacebookresearch-xformers-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Fxformers\u002Freleases)\n[![PyTorch](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPyTorch-Wheels-red?style=flat)](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fxformers\u002F)\n> [!NOTE]\n> PyTorch 提供了 xformers 的官方预编译轮子。通常可以直接使用 `pip install xformers` 进行安装。\n\n| CUDA 版本 | 安装命令 |\n|:---:|:---:|\n| **CUDA 12.6** | `pip3 install -U xformers --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126` |\n| **CUDA 12.8** | `pip3 install -U xformers --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128` |\n| **CUDA 13.0** | `pip3 install -U xformers --index-url https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu130` |\n\nABI3 版本，适用于 Python 3.9–3.12\n\n\u003C!-- START_XFORMERS_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|\n| `0.0.34` | `2.11` | `3.9+` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.34+torch2.11cu130-cp39-abi3-win_amd64.whl) |\n| `0.0.34` | `2.10` | `3.9+` | `13.0` | 
[链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.34%2Bd20260123.cu130torch2.10-cp39-abi3-win_amd64.whl) |\n| `0.0.34` | `2.10` | `3.9+` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.34+torch2.10cu130-cp39-abi3-win_amd64.whl) |\n| `0.0.33` | `2.10` | `3.9+` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.33%2Bcu130torch2.10-cp39-abi3-win_amd64.whl) |\n| `0.0.33` | `2.9` | `3.9+` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fxformers-0.0.33%2Bcu130torch2.9-cp39-abi3-win_amd64.whl) |\n| `0.0.32.post2` | `2.8.0` | `3.9+` | `12.9` | [链接](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu129\u002Fxformers-0.0.32.post2-cp39-abi3-win_amd64.whl) |\n| `0.0.32.post2` | `2.8.0` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu128\u002Fxformers-0.0.32.post2-cp39-abi3-win_amd64.whl) |\n| `0.0.32.post2` | `2.8.0` | `3.9+` | `12.6` | [链接](https:\u002F\u002Fdownload.pytorch.org\u002Fwhl\u002Fcu126\u002Fxformers-0.0.32.post2-cp39-abi3-win_amd64.whl) |\n\u003C!-- END_XFORMERS_TABLE --> \n\n\u003Cp id=\"sageattention\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 SageAttention\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fthu--ml-SageAttention-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fthu-ml\u002FSageAttention)\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwoct0rdho-Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases)\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWildminder-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Ftree\u002Fmain)\n\n\u003C!-- 
START_SAGEATTENTION2_TABLE -->\n\u003C!-- 本表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|\n| `2.1.1` | `2.8.0` | `3.12` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.1.1+cu128torch2.8.0-cp312-cp312-win_amd64.whl?download=true) |\n| `2.1.1` | `2.7.0` | `3.10` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu128torch2.7.0-cp310-cp310-win_amd64.whl) |\n| `2.1.1` | `2.6.0` | `3.13` | `12.6` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp313-cp313-win_amd64.whl) |\n| `2.1.1` | `2.6.0` | `3.12` | `12.6` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp312-cp312-win_amd64.whl) |\n| `2.1.1` | `2.6.0` | `3.12` | `12.6` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.1.1+cu126torch2.6.0-cp312-cp312-win_amd64.whl?download=true) |\n| `2.1.1` | `2.6.0` | `3.11` | `12.6` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp311-cp311-win_amd64.whl) |\n| `2.1.1` | `2.6.0` | `3.10` | `12.6` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp310-cp310-win_amd64.whl) |\n| `2.1.1` | `2.6.0` | `3.9` | `12.6` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu126torch2.6.0-cp39-cp39-win_amd64.whl) |\n| `2.1.1` | `2.5.1` | `3.12` | `12.4` | 
[链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp312-cp312-win_amd64.whl) |\n| `2.1.1` | `2.5.1` | `3.11` | `12.4` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp311-cp311-win_amd64.whl) |\n| `2.1.1` | `2.5.1` | `3.10` | `12.4` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp310-cp310-win_amd64.whl) |\n| `2.1.1` | `2.5.1` | `3.9` | `12.4` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.1.1-windows\u002Fsageattention-2.1.1+cu124torch2.5.1-cp39-cp39-win_amd64.whl) |\n\u003C!-- END_SAGEATTENTION2_TABLE -->\n\n◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇ ◇\n\n#### 🛠 SageAttention 2.2 (SageAttention2++)\n> [!NOTE]\n> 仅支持 CUDA >= 12.8，因此 PyTorch >= 2.7。\n\n\u003C!-- START_SAGEATTENTION22_TABLE -->\n\u003C!-- 本表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|\n| `2.2.0.post4` | `2.9.0+` | `3.9+` | `13.0` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post4\u002Fsageattention-2.2.0+cu130torch2.9.0andhigher.post4-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post4` | `2.9.0+` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post4\u002Fsageattention-2.2.0+cu128torch2.9.0andhigher.post4-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post3` | `2.10.0` | `3.12` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu130torch2.10.0-cp312-cp312-win_amd64.whl) |\n| `2.2.0.post3` | `2.10.0` | `3.13` | `12.8` | 
[链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.10.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0.post3` | `2.10.0` | `3.12` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.10.0-cp312-cp312-win_amd64.whl) |\n| `2.2.0.post3` | `2.9.0` | `3.13` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu130torch2.9.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0.post3` | `2.9.0` | `3.13` | `12.9` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu129torch2.9.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0.post3` | `2.9.0` | `3.13` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.9.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0.post3` | `2.9.0` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu128torch2.9.0.post3-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post3` | `2.8.0` | `3.13` | `12.9` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu129torch2.8.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0.post3` | `2.8.0` | `3.13` | 
`12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0.post3+cu128torch2.8.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0.post3` | `2.8.0` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu128torch2.8.0.post3-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post3` | `2.7.1` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu128torch2.7.1.post3-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post3` | `2.6.0` | `3.9+` | `12.6` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu126torch2.6.0.post3-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post3` | `2.5.1` | `3.9+` | `12.4` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post3\u002Fsageattention-2.2.0+cu124torch2.5.1.post3-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post2` | `2.9.0` | `3.12` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fsageattention-2.2.0%2Bcu128torch2.9.0cxx11abi1-cp312-cp312-win_amd64.whl?download=true) |\n| `2.2.0.post2` | `2.8.0` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu128torch2.8.0.post2-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post2` | `2.7.1` | `3.9+` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu128torch2.7.1.post2-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post2` | `2.6.0` | `3.9+` | `12.6` | 
[链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu126torch2.6.0.post2-cp39-abi3-win_amd64.whl) |\n| `2.2.0.post2` | `2.5.1` | `3.9+` | `12.4` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows.post2\u002Fsageattention-2.2.0+cu124torch2.5.1.post3-cp39-abi3-win_amd64.whl) |\n| `2.2.0` | `2.8.0` | `3.13` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp313-cp313-win_amd64.whl) |\n| `2.2.0` | `2.8.0` | `3.12` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp312-cp312-win_amd64.whl) |\n| `2.2.0` | `2.8.0` | `3.11` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp311-cp311-win_amd64.whl) |\n| `2.2.0` | `2.8.0` | `3.10` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp310-cp310-win_amd64.whl) |\n| `2.2.0` | `2.8.0` | `3.9` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.8.0-cp39-cp39-win_amd64.whl) |\n| `2.2.0` | `2.7.1` | `3.13` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp313-cp313-win_amd64.whl) |\n| `2.2.0` | `2.7.1` | `3.12` | `12.8` | 
[链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp312-cp312-win_amd64.whl) |\n| `2.2.0` | `2.7.1` | `3.11` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp311-cp311-win_amd64.whl) |\n| `2.2.0` | `2.7.1` | `3.10` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp310-cp310-win_amd64.whl) |\n| `2.2.0` | `2.7.1` | `3.9` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSageAttention\u002Freleases\u002Fdownload\u002Fv2.2.0-windows\u002Fsageattention-2.2.0+cu128torch2.7.1-cp39-cp39-win_amd64.whl) |\n\u003C!-- END_SAGEATTENTION22_TABLE -->\n\n#### 🛠 SageAttention 3\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmengqin-SageAttention-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention)\n\n\u003C!-- START_SAGEATTN3_TABLE -->\n\u003C!-- 本表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|\n| `1.0.0` | `2.9.1` | `3.13` | `13.0` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu130torch291-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.9.1` | `3.12` | `13.0` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu130torch291-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.8.0` | `3.13` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch2.8.0-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.8.0` | `3.12` | `12.8` | 
[链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch2.8.0-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.8.0` | `3.11` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch2.8.0-cp311-cp311-win_amd64.whl) |\n| `1.0.0` | `2.7.1` | `3.13` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch2.7.1-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.7.1` | `3.12` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch2.7.1-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.7.1` | `3.11` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fmengqin\u002FSageAttention\u002Freleases\u002Fdownload\u002F20251229\u002Fsageattn3-1.0.0+cu128torch2.7.1-cp311-cp311-win_amd64.whl) |\n\u003C!-- END_SAGEATTN3_TABLE -->\n\n\u003Cp id=\"nunchaku\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Nunchaku\n*   **官方仓库**: [mit-han-lab\u002Fnunchaku](https:\u002F\u002Fgithub.com\u002Fmit-han-lab\u002Fnunchaku\u002Freleases)\n\u003C!-- START_NUNCHAKU_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|\n| `1.2.0` | `2.11` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp313-cp313-win_amd64.whl) |\n| `1.2.0` | `2.11` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp312-cp312-win_amd64.whl) |\n| `1.2.0` | `2.11` | `3.11` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp311-cp311-win_amd64.whl) |\n| `1.2.0` | `2.11` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.11-cp310-cp310-win_amd64.whl) |\n| `1.2.0` | `2.9` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp313-cp313-win_amd64.whl) |\n| `1.2.0` | `2.9` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp312-cp312-win_amd64.whl) |\n| `1.2.0` | `2.9` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp311-cp311-win_amd64.whl) |\n| `1.2.0` | `2.9` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-ai\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.2.0\u002Fnunchaku-1.2.0+torch2.9-cp310-cp310-win_amd64.whl) |\n| `1.2.0` | `2.8` | `3.13` | [链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.8-cp313-cp313-win_amd64.whl) |\n| `1.2.0` | `2.8` | `3.12` | [链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.8-cp312-cp312-win_amd64.whl) |\n| `1.2.0` | `2.8` | `3.11` | [链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.8-cp311-cp311-win_amd64.whl) |\n| `1.2.0` | `2.7` | `3.13` | [链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp313-cp313-win_amd64.whl) |\n| `1.2.0` | `2.7` | `3.12` | 
[链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp312-cp312-win_amd64.whl) |\n| `1.2.0` | `2.7` | `3.11` | [链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp311-cp311-win_amd64.whl) |\n| `1.2.0` | `2.7` | `3.10` | [链接](https:\u002F\u002Fhuggingface.co\u002FJusteLeo\u002FNunchaku-Zimage-Win-Wheels\u002Fresolve\u002Fmain\u002Fnunchaku-1.2.0%2Btorch2.7-cp310-cp310-win_amd64.whl) |\n| `1.0.2` | `2.10` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp313-cp313-win_amd64.whl) |\n| `1.0.2` | `2.10` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp312-cp312-win_amd64.whl) |\n| `1.0.2` | `2.10` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp311-cp311-win_amd64.whl) |\n| `1.0.2` | `2.10` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.10-cp310-cp310-win_amd64.whl) |\n| `1.0.2` | `2.9` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp313-cp313-win_amd64.whl) |\n| `1.0.2` | `2.9` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp312-cp312-win_amd64.whl) |\n| `1.0.2` | `2.9` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp311-cp311-win_amd64.whl) |\n| `1.0.2` | `2.9` | `3.10` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.9-cp310-cp310-win_amd64.whl) |\n| `1.0.2` | `2.8` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp313-cp313-win_amd64.whl) |\n| `1.0.2` | `2.8` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp312-cp312-win_amd64.whl) |\n| `1.0.2` | `2.8` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp311-cp311-win_amd64.whl) |\n| `1.0.2` | `2.8` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.8-cp310-cp310-win_amd64.whl) |\n| `1.0.2` | `2.7` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp313-cp313-win_amd64.whl) |\n| `1.0.2` | `2.7` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp312-cp312-win_amd64.whl) |\n| `1.0.2` | `2.7` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp311-cp311-win_amd64.whl) |\n| `1.0.2` | `2.7` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.2\u002Fnunchaku-1.0.2+torch2.7-cp310-cp310-win_amd64.whl) |\n| `1.0.1` | `2.10` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp313-cp313-win_amd64.whl) |\n| `1.0.1` | `2.10` | `3.12` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.10` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp311-cp311-win_amd64.whl) |\n| `1.0.1` | `2.10` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.10-cp310-cp310-win_amd64.whl) |\n| `1.0.1` | `2.9` | `3.13` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu128torch2.9-cp313-cp313-win_amd64.whl) |\n| `1.0.1` | `2.9` | `3.13` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu130torch2.9-cp313-cp313-win_amd64.whl) |\n| `1.0.1` | `2.9` | `3.12` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu128torch2.9-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.9` | `3.12` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-1.0.1%2Bcu130torch2.9-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.8` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp313-cp313-win_amd64.whl) |\n| `1.0.1` | `2.8` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.8` | `3.11` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp311-cp311-win_amd64.whl) |\n| `1.0.1` | `2.8` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.8-cp310-cp310-win_amd64.whl) |\n| `1.0.1` | `2.7` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp313-cp313-win_amd64.whl) |\n| `1.0.1` | `2.7` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.7` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp311-cp311-win_amd64.whl) |\n| `1.0.1` | `2.7` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.7-cp310-cp310-win_amd64.whl) |\n| `1.0.1` | `2.6` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp313-cp313-win_amd64.whl) |\n| `1.0.1` | `2.6` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.6` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp311-cp311-win_amd64.whl) |\n| `1.0.1` | `2.6` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.6-cp310-cp310-win_amd64.whl) |\n| `1.0.1` | `2.5` | `3.12` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.5-cp312-cp312-win_amd64.whl) |\n| `1.0.1` | `2.5` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.5-cp311-cp311-win_amd64.whl) |\n| `1.0.1` | `2.5` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.1\u002Fnunchaku-1.0.1+torch2.5-cp310-cp310-win_amd64.whl) |\n| `1.0.0` | `2.9` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.9` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.9` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp311-cp311-win_amd64.whl) |\n| `1.0.0` | `2.9` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.9-cp310-cp310-win_amd64.whl) |\n| `1.0.0` | `2.8` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.8` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.8` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp311-cp311-win_amd64.whl) |\n| `1.0.0` | `2.8` | `3.10` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.8-cp310-cp310-win_amd64.whl) |\n| `1.0.0` | `2.7` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.7` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.7` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp311-cp311-win_amd64.whl) |\n| `1.0.0` | `2.7` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.7-cp310-cp310-win_amd64.whl) |\n| `1.0.0` | `2.6` | `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp313-cp313-win_amd64.whl) |\n| `1.0.0` | `2.6` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.6` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp311-cp311-win_amd64.whl) |\n| `1.0.0` | `2.6` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.6-cp310-cp310-win_amd64.whl) |\n| `1.0.0` | `2.5` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.5-cp312-cp312-win_amd64.whl) |\n| `1.0.0` | `2.5` | `3.11` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.5-cp311-cp311-win_amd64.whl) |\n| `1.0.0` | `2.5` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv1.0.0\u002Fnunchaku-1.0.0+torch2.5-cp310-cp310-win_amd64.whl) |\n| `0.3.2` | `2.9` | `3.12` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fnunchaku-0.3.2%2Btorch2.9-cp312-cp312-win_amd64.whl?download=true) |\n| `0.3.2` | `2.8` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.8-cp312-cp312-win_amd64.whl) |\n| `0.3.2` | `2.8` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.8-cp311-cp311-win_amd64.whl) |\n| `0.3.2` | `2.8` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.8-cp310-cp310-win_amd64.whl) |\n| `0.3.2` | `2.7` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.7-cp312-cp312-win_amd64.whl) |\n| `0.3.2` | `2.7` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.7-cp311-cp311-win_amd64.whl) |\n| `0.3.2` | `2.7` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.7-cp310-cp310-win_amd64.whl) |\n| `0.3.2` | `2.6` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.6-cp312-cp312-win_amd64.whl) |\n| `0.3.2` | `2.6` | `3.11` | 
[链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.6-cp311-cp311-win_amd64.whl) |\n| `0.3.2` | `2.6` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.6-cp310-cp310-win_amd64.whl) |\n| `0.3.2` | `2.5` | `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.5-cp312-cp312-win_amd64.whl) |\n| `0.3.2` | `2.5` | `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.5-cp311-cp311-win_amd64.whl) |\n| `0.3.2` | `2.5` | `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fnunchaku-tech\u002Fnunchaku\u002Freleases\u002Fdownload\u002Fv0.3.2\u002Fnunchaku-0.3.2+torch2.5-cp310-cp310-win_amd64.whl) |\n\u003C!-- END_NUNCHAKU_TABLE -->\n  \n\u003Cp id=\"natten\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 NATTEN\n邻域注意力Transformer。\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FSHI--Labs-NATTEN-blue?style=flat)](https:\u002F\u002Fgithub.com\u002FSHI-Labs\u002FNATTEN)\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Flldacing-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Ftree\u002Fmain)\n\n\u003C!-- START_NATTEN_TABLE -->\n\u003C!-- 本表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|\n| `0.17.5` | `2.7.0` | `3.12` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch270cu128-cp312-cp312-win_amd64.whl) |\n| `0.17.5` | `2.7.0` | `3.11` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch270cu128-cp311-cp311-win_amd64.whl) 
|\n| `0.17.5` | `2.7.0` | `3.10` | `12.8` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch270cu128-cp310-cp310-win_amd64.whl) |\n| `0.17.5` | `2.6.0` | `3.12` | `12.6` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch260cu126-cp312-cp312-win_amd64.whl) |\n| `0.17.5` | `2.6.0` | `3.11` | `12.6` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch260cu126-cp311-cp311-win_amd64.whl) |\n| `0.17.5` | `2.6.0` | `3.10` | `12.6` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.5+torch260cu126-cp310-cp310-win_amd64.whl) |\n| `0.17.3` | `2.5.1` | `3.12` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch251cu124-cp312-cp312-win_amd64.whl) |\n| `0.17.3` | `2.5.1` | `3.11` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch251cu124-cp311-cp311-win_amd64.whl) |\n| `0.17.3` | `2.5.1` | `3.10` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch251cu124-cp310-cp310-win_amd64.whl) |\n| `0.17.3` | `2.5.0` | `3.12` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch250cu124-cp312-cp312-win_amd64.whl) |\n| `0.17.3` | `2.5.0` | `3.11` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch250cu124-cp311-cp311-win_amd64.whl) |\n| `0.17.3` | `2.5.0` | `3.10` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch250cu124-cp310-cp310-win_amd64.whl) |\n| `0.17.3` | `2.4.1` | `3.12` | 
`12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch241cu124-cp312-cp312-win_amd64.whl) |\n| `0.17.3` | `2.4.1` | `3.11` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch241cu124-cp311-cp311-win_amd64.whl) |\n| `0.17.3` | `2.4.1` | `3.10` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch241cu124-cp310-cp310-win_amd64.whl) |\n| `0.17.3` | `2.4.0` | `3.12` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch240cu124-cp312-cp312-win_amd64.whl) |\n| `0.17.3` | `2.4.0` | `3.11` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch240cu124-cp311-cp311-win_amd64.whl) |\n| `0.17.3` | `2.4.0` | `3.10` | `12.4` | [链接](https:\u002F\u002Fhuggingface.co\u002Flldacing\u002FNATTEN-windows\u002Fblob\u002Fmain\u002Fnatten-0.17.3+torch240cu124-cp310-cp310-win_amd64.whl) |\n\u003C!-- END_NATTEN_TABLE -->\n\u003Cp id=\"triton\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Triton（Windows 分支）\nTriton 是一种用于编写高效自定义深度学习原语的语言和编译器。虽然官方不支持 Windows，但有一个分支提供了预构建的 wheel 包。\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Ftriton--lang-triton--windows-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Ftriton-lang\u002Ftriton-windows)\n\n**支持的 GPU**:\n> [!NOTE]\n> 不同的 GPU 架构由于计算能力的支持不同，需要使用不同的 Triton 版本。\n\n| Triton 版本 | 支持的 GPU | 计算能力 |\n|:---:|:---|:---:|\n| `3.6.x` | RTX 50xx（Blackwell）、RTX 40xx、Ada Lovelace、Hopper | SM 8.9、9.0、10.0 |\n| `3.5.x` | RTX 30xx、40xx、Ada Lovelace、Hopper | SM 8.0、8.9、9.0 |\n| `3.4.x` | RTX 20xx、30xx、40xx、Ada Lovelace、Hopper | SM 7.5、8.0、8.9、9.0 |\n| `\u003C= 3.2.x` | GTX\u002FRTX 16xx、RTX 20xx、30xx、40xx、Ada Lovelace、Hopper | SM 7.0、7.5、8.0、8.9、9.0 
|\n\n**安装**:\n\n| 软件包版本 | PyTorch 版本 | 计算能力 | 安装命令 |\n|:---:|:---:|:---:|:---|\n| `3.6.x` | >= 2.9 | SM 8.9+ | `pip install -U \"triton-windows\u003C3.7\"` |\n| `3.5.x` | >= 2.9 | SM 8.0+ | `pip install -U \"triton-windows\u003C3.6\"` |\n| `3.4.x` | >= 2.8 | SM 7.5+ | `pip install -U \"triton-windows\u003C3.5\"` |\n\n**Python 库**:\n> [!IMPORTANT]\n> Triton 需要额外的 Python 开发库来构建 CUDA 内核。请下载与您的 Python 版本匹配的包，解压 ZIP 文件，并将 `include` 和 `libs` 文件夹复制到您的 Python 安装目录中。\n\n| Python 版本 | 下载链接 |\n|:---:|:---:|\n| `3.13` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.13.2_include_libs.zip) |\n| `3.12` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.12.7_include_libs.zip) |\n| `3.11` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.11.9_include_libs.zip) |\n| `3.10` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.10.11_include_libs.zip) |\n| `3.9` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002Ftriton-windows\u002Freleases\u002Fdownload\u002Fv3.0.0-windows.post1\u002Fpython_3.9.13_include_libs.zip) |\n\n\u003Cp id=\"bitsandbytes\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 bitsandbytes\n一个轻量级的 CUDA 自定义函数封装库，特别适用于 8 位优化器、矩阵乘法（LLM.int8()）以及量化功能。\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fbitsandbytes--foundation-bitsandbytes-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fbitsandbytes-foundation\u002Fbitsandbytes)\n\n\u003Cp align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 RadialAttention for 
ComfyUI\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwoct0rdho-ComfyUI--RadialAttn-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FComfyUI-RadialAttn)\n\n\u003Cp align=\"right\">(\u003Ca href=\"#readme-top\">返回顶部\u003C\u002Fa>)\u003C\u002Fp>\n\n\u003Cp id=\"spargeattn\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 SpargeAttn\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fthu--ml-SpargeAttn-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fthu-ml\u002FSpargeAttn)\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fwoct0rdho-Wheels-green?style=flat)](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSpargeAttn\u002Freleases)\n\u003C!-- START_SPARGEATTN_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | CUDA 版本 | 下载链接 |\n|:---:|:---:|:---:|:---:|\n| `0.1.0.post1` | `2.8.0` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSpargeAttn\u002Freleases\u002Fdownload\u002Fv0.1.0-windows.post1\u002Fspas_sage_attn-0.1.0+cu128torch2.8.0.post1-cp39-abi3-win_amd64.whl) |\n| `0.1.0.post1` | `2.7.1` | `12.8` | [链接](https:\u002F\u002Fgithub.com\u002Fwoct0rdho\u002FSpargeAttn\u002Freleases\u002Fdownload\u002Fv0.1.0-windows.post1\u002Fspas_sage_attn-0.1.0+cu128torch2.7.1.post1-cp39-abi3-win_amd64.whl) |\n\u003C!-- END_SPARGEATTN_TABLE -->\n\n\u003Cp id=\"block_sparse_attn\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Block Sparse Attention（块稀疏注意力）\n\n[![GitHub](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fmit--han--lab-Block%20Sparse-blue?style=flat)](https:\u002F\u002Fgithub.com\u002Fmit-han-lab\u002FBlock-Sparse-Attention\u002F)\n[![HuggingFace](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FWildminder-HF%20Wheels-orange?style=flat)](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Ftree\u002Fmain)\n\u003C!-- START_BLOCKSPARSEATTN_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | 下载链接 
|\n|:---:|:---:|:---:|:---:|:---:|\n| `0.0.2.post1` | `2.11` | `3.13` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fblock_sparse_attn-0.0.2.post1+cu130torch2.11cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `0.0.2.post1` | `2.10` | `3.13` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fblock_sparse_attn-0.0.2.post1+cu130torch2.10cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n| `0.0.2.post1` | `2.9.1` | `3.13` | `13.0` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fblock_sparse_attn-0.0.2.post1+cu130torch2.9.1cxx11abiTRUE-cp313-cp313-win_amd64.whl) |\n\u003C!-- END_BLOCKSPARSEATTN_TABLE -->\n\n\u003Cp id=\"deepspeed\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 DeepSpeed\n* 一个深度学习优化库\n* **官方仓库**: [https:\u002F\u002Fgithub.com\u002Fdeepspeedai\u002FDeepSpeed](https:\u002F\u002Fgithub.com\u002Fdeepspeedai\u002FDeepSpeed)\n\u003C!-- START_DEEPSPEED_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | Python 版本 | 下载链接 |\n|:---:|:---:|:---:|\n| `0.18.6` | `3.13` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fdeepspeed\u002Fdeepspeed-0.18.6+0ccb2bb6-cp313-cp313-win_amd64.whl) |\n\u003C!-- END_DEEPSPEED_TABLE -->\n\n\u003Cp id=\"fairseq\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 Fairseq\n* Facebook AI Research 序列到序列工具包\n* **官方仓库**: [https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Ffairseq](https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Ffairseq)\n\u003C!-- START_FAIRSEQ_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | Python 版本 | 下载链接 |\n|:---:|:---:|:---:|\n| `0.12.2` | `3.13` | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Ffairseq\u002Ffairseq-0.12.2-cp313-cp313-win_amd64.whl) |\n\u003C!-- 
END_FAIRSEQ_TABLE -->\n\n\u003Cp id=\"causalconv1d\" align=\"center\">▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲▼▲\u003C\u002Fp>\n\n### 🛠 causal_conv1d\n\u003C!-- START_CAUSALCONV1D_TABLE -->\n\u003C!-- 此表格由程序自动生成，请勿手动编辑。 -->\n| 软件包版本 | PyTorch 版本 | Python 版本 | CUDA 版本 | CXX11 ABI | 下载链接 |\n|:---:|:---:|:---:|:---:|:---:|:---:|\n| `1.6.1` | `2.11.0` | `3.14` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fcausal_conv1d\u002Fcausal_conv1d-1.6.1+d20260310.cu130torch2.11.0cxx11abi1-cp314-cp314-win_amd64.whl) |\n| `1.6.1` | `2.11.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fcausal_conv1d\u002Fcausal_conv1d-1.6.1+d20260310.cu130torch2.11.0cxx11abi1-cp313-cp313-win_amd64.whl) |\n| `1.6.1` | `2.10.0` | `3.13` | `13.0` | ✓ | [链接](https:\u002F\u002Fhuggingface.co\u002FWildminder\u002FAI-windows-whl\u002Fresolve\u002Fmain\u002Fcausal_conv1d\u002Fcausal_conv1d-1.6.1+d20260310.cu130torch2.10.0cxx11abi1-cp313-cp313-win_amd64.whl) |\n\u003C!-- END_CAUSALCONV1D_TABLE -->\n\n\n\n\n\u003Cp align=\"center\">▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀\u003C\u002Fp>\n\n\u003C!-- 数据访问 -->\n## 🌐 以编程方式访问数据 (wheels.json)\n\n本仓库中的所有轮子信息都存储在 `wheels.json` 文件中，该文件是唯一的真实数据源。此 README 中的表格均由此文件自动生成。\n\n这为任何需要访问这些数据而无需解析 Markdown 的外部工具或应用程序提供了一个稳定且结构化的 JSON 端点。\n\n### ➤ 使用方法\n\n您可以通过以下 URL 直接访问原始 JSON 文件：\n\n```\nhttps:\u002F\u002Fraw.githubusercontent.com\u002Fwildminder\u002FAI-windows-whl\u002Fmain\u002Fwheels.json\n```\n\n**使用 `curl` 的示例：**\n```sh\ncurl -L -o wheels.json https:\u002F\u002Fraw.githubusercontent.com\u002Fwildminder\u002FAI-windows-whl\u002Fmain\u002Fwheels.json\n```\n\n该文件包含一个 `packages` 列表，每个软件包都有其元数据和一个 `wheels` 数组，其中每个轮子对象包含版本信息和直接下载的 `url`。\n\n\u003Cp align=\"right\">(回到顶部)\u003C\u002Fp>\n\n\u003Cp align=\"center\">▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀\u003C\u002Fp>\n\n\u003C!-- 贡献 -->\n## ➤ 
贡献\n\n贡献使开源社区成为一个令人惊叹的学习、启发和创作之地。您的任何贡献都将受到**高度赞赏**。\n\n如果您发现了一个新的预构建轮子或可靠的来源，请 fork 该仓库并创建一个 pull request，或者直接开一个包含链接的问题。\n\u003Cp align=\"right\">(回到顶部)\u003C\u002Fp>\n\n\n\n\u003C!-- 致谢 -->\n## ➤ 致谢\n\n此仓库只是一个链接集合。非常感谢那些为社区构建并托管这些轮子的个人和组织：\n\n\n\u003C!-- Markdown 链接与图片 -->\n[contributors-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\n[contributors-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fgraphs\u002Fcontributors\n[forks-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\n[forks-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fnetwork\u002Fmembers\n[stars-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\n[stars-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fstargazers\n[issues-shield]: https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fissues\u002FYOUR_USERNAME\u002FWindows-AI-Wheels.svg?style=for-the-badge\n[issues-url]: https:\u002F\u002Fgithub.com\u002FYOUR_USERNAME\u002FWindows-AI-Wheels\u002Fissues","# AI-windows-whl 快速上手指南\n\n本指南旨在帮助 Windows 用户快速安装那些通常难以编译的 AI\u002FML 库（如 `flash-attention`、`xformers` 等）的预编译版本，特别适用于 ComfyUI 和 PyTorch 开发者。\n\n## 1. 
Prerequisites

Before you begin, make sure your system meets the following requirements:

*   **Operating system**: Windows 10/11 (64-bit)
*   **Python**: a version between **Python 3.9 and 3.14** installed.
    *   Download: [python.org](https://www.python.org/downloads/windows/)
    *   *Note: be sure to check "Add Python to PATH" during installation.*
*   **GPU driver**: a compatible NVIDIA driver and matching CUDA Toolkit installed (PyTorch usually bundles the CUDA runtime, but the driver must support the target CUDA version).
*   **Package manager**: make sure `pip` is up to date:
    ```sh
    python -m pip install --upgrade pip
    ```

> **💡 Tip**: If you are unsure which version you need, use the official search tool **[Find Windows AI Wheels](https://wildminder.github.io/AI-windows-whl/)** to quickly look up the link matching your environment (Python/PyTorch/CUDA).

## 2. Installation

This project does not offer a single `pip install` command; instead, it provides direct links to version-specific `.whl` files. Pick the link that matches your environment.

### Step 1: Install the base PyTorch
First install the base PyTorch, TorchVision, and Torchaudio from the official index.

**Recommended stable release (CUDA 12.8 shown as an example):**
```sh
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128
```

*For other CUDA versions (e.g. 12.6, 13.0), see the [PyTorch website for the matching command](https://pytorch.org/get-started/locally/).*

### Step 2: Install the hard-to-build libraries (direct .whl links)
For libraries such as `flash-attention` and `xformers`, point `pip` directly at the HuggingFace download link provided by this project. Note that direct downloads from HuggingFace use `/resolve/` paths, not `/blob/` page links. **The link must be wrapped in double quotes.**

**Example: installing flash-attention (for Python 3.12, PyTorch 2.7.0, CUDA 12.8)**
```sh
pip install "https://huggingface.co/lldacing/flash-attention-windows-wheel/resolve/main/flash_attn-2.7.4.post1+cu128torch2.7.0cxx11abiFALSE-cp312-cp312-win_amd64.whl"
```

**Installation reference for common libraries:**
Find the row matching your environment in the [Available Wheels](#available-wheels) list or on the search page, copy the `Download Link`, and substitute it for the URL in the command below:

*   **Torchaudio (specific build)**:
    ```sh
    pip install "https://huggingface.co/Wildminder/AI-windows-whl/resolve/main/torchaudio/torchaudio-2.11.0a0+d20260219.cu130torch2.12.0cxx11abi1-cp314-cp314-win_amd64.whl"
    ```
*   **xformers / bitsandbytes / Triton**:
    Same procedure: find the matching link and run `pip install "URL"`.

> **⚠️ Note**: Since the files are hosted on 
HuggingFace, users in mainland China with slow downloads can try a local proxy or a mirror-acceleration tool, but keep the original link itself unchanged so that file-hash verification still passes.

## 3. Basic Usage

Once installed, these libraries can be imported normally in your Python environment, with no source compilation required.

**Verification example (flash-attention):**

```python
import torch
from flash_attn import flash_attn_func

# Create test data (requires a CUDA-capable GPU)
q = torch.randn(2, 4, 8, 32, dtype=torch.float16, device='cuda')
k = torch.randn(2, 4, 8, 32, dtype=torch.float16, device='cuda')
v = torch.randn(2, 4, 8, 32, dtype=torch.float16, device='cuda')

# Run the attention kernel
output = flash_attn_func(q, k, v)

print(f"Flash Attention ran successfully! Output shape: {output.shape}")
```

If this runs without errors, the pre-built wheel is installed and matches your environment. You can now focus on building AI applications or on workflows such as ComfyUI.

# Usage Scenario

A deep-learning developer on Windows is trying to deploy the latest Stable Diffusion WebUI locally, using Flash-Attention and xFormers to speed up image generation and reduce VRAM usage.

### Without AI-windows-whl
- **Environment-setup hell**: Installing `flash-attention` or `xformers` means manually installing specific versions of the Visual Studio Build Tools and the CUDA Toolkit, a tedious and error-prone process.
- **Frequent build failures**: Because Windows support for some operators is incomplete, source builds often fail with hard-to-diagnose C++ compile or link errors, wasting hours of effort.
- **Dependency version conflicts**: Manually finding a pre-built package that exactly matches the installed PyTorch version is nearly impossible; forcing an install often ends in `DLL load failed` errors or incompatibility crashes.
- **Motivation drain**: Time goes into environment setup instead of model work; many developers give up on Windows for a Linux dual-boot, or shelve the project entirely.

### With AI-windows-whl
- **Fast, one-step installs**: Download a pre-built `.whl` that exactly matches your Python and PyTorch versions and install it in seconds with `pip install`, with no compilation step at all.
- **Stable out of the box**: The packages are pre-tested and tuned for Windows, sidestepping the platform-compatibility pitfalls of building from source.
- **Exact version alignment**: The repository clearly lists the PyTorch and CUDA versions for every wheel, so dependencies can be pinned and version-mismatch crashes eliminated.
- **Focus on the real work**: The hours (or days) once lost to environment debugging go into prompt engineering, model fine-tuning, and actual image generation.

By providing high-quality pre-built operator libraries, AI-windows-whl removes the technical barrier Windows users face in high-performance AI deployment and puts local large-model applications within easy reach.

## Environment Notes

- **Platform**: Windows. An NVIDIA GPU is required for CUDA acceleration (no specific VRAM requirement is stated). Supported CUDA versions: 11.8, 12.6, 12.8, 12.9, 13.0.
- **Scope**: This project targets Windows 
users and provides pre-built wheel files for AI/ML libraries that are hard to compile from source (e.g. flash-attention, xformers). To install, point pip directly at the URL of the specific .whl file. PyTorch itself should preferably be installed from the official index; the other extension libraries must be chosen to match the installed Python, PyTorch, and CUDA versions. Some packages (such as Triton) are Windows fork builds.
- **Supported Python**: 3.9 - 3.14
- **Covered packages**: torch, torchvision, torchaudio, flash-attention, xformers, sageattention, natten, triton, bitsandbytes

## FAQ

**Q: What should I do about 'DLL load failed' or 'The specified module could not be found' errors when importing SageAttention or FlashAttention?**

A: This is usually caused by the Triton Windows build missing required include/lib files, or by a CUDA version mismatch. Suggested fixes:
1. Make sure the correct CUDA version is installed (CUDA 13.0 is recommended rather than 13.1).
2. Follow the special notes for Triton on Windows and make sure the required header and library files are present: https://github.com/woct0rdho/triton-windows?tab=readme-ov-file#8-special-notes-for-comfyui-with-embeded-python
3. If the problem persists, try recompiling, or use the universal build, which works for Torch >= 2.9 and Python > 3.9.

([issue #18](https://github.com/wildminder/AI-windows-whl/issues/18))

**Q: FlashAttention fails to load under PyTorch 2.11 + CUDA 13.0 + Python 3.13 with 'The specified procedure could not be found'. How do I fix it?**

A: This is likely related to the CXX11ABI setting. The failing wheels are typically marked 'CXX11ABI=true', while the working ones are marked 'CXX11ABI=false'. Suggested steps:
1. Try a pre-built wheel with CXX11ABI=false (for example, the builds from the mjun0812 repository).
2. If you must compile it yourself, start from a clean environment: fresh installs of Python 3.13, Git, the matching CUDA Toolkit, and Visual Studio.
3. 
Run the following commands to build:
   - `pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu130`
   - `python -m pip install --upgrade pip setuptools wheel`
   - `python -m pip install "setuptools<82"`
   - `pip install ninja packaging`
   - `git config --system core.longpaths true`
   - Clone the flash-attention repository and initialize its submodules.
   - Set the environment variable `$env:MAX_JOBS = "2"` (use 1 if you have less than 64 GB of RAM), then run the install.

([issue #22](https://github.com/wildminder/AI-windows-whl/issues/22))

**Q: How can I automate installing SageAttention without parsing the README?**

A: The repository now ships a structured JSON file for automation scripts. Fetch and use the `wheels.json` file in the repository root; it contains structured data for every available wheel, avoiding the errors that come with parsing the README. ([issue #8](https://github.com/wildminder/AI-windows-whl/issues/8))

**Q: Why do some wheels claim to support multiple PyTorch versions (e.g. >=2.9) but fail on a specific one (e.g. 2.10)?**

A: Because "universal build" wheels are provided. These are designed to be compatible with PyTorch >= 2.9 and Python > 3.9. If a specific minor version (such as 2.10.0) fails, check whether you accidentally grabbed a non-universal build, or simply try a wheel marked as universal; these usually resolve compatibility issues caused by small version differences. ([issue #18](https://github.com/wildminder/AI-windows-whl/issues/18))

**Q: How do I find a wheel for a specific Python/PyTorch/CUDA combination (e.g. Python 3.13, Torch 2.9, CUDA 13.0)?**

A: The maintainer usually adds specific builds on request. For SageAttention 2.2.0, wheels already exist for Torch >2.9.0, CUDA 13.0, and Python >3.9 (e.g. sage-2.2.0.post4). If the combination you need is missing, open an issue to request it and the maintainer will compile and add it when possible (Block-Sparse-Attention and specific SageAttention builds were added this way). ([issue #15](https://github.com/wildminder/AI-windows-whl/issues/15))

**Q: Is there a Windows wheel for SageAttention3?**

A: Not yet. SageAttention3 has no official public release; it is still in pre-release, so no official wheel is available. Support will be added once it is formally released. ([issue #1](https://github.com/wildminder/AI-windows-whl/issues/1))

**Q: What if I can't find a specific version combination (e.g. Flash Attention 2.7.4 + PyTorch 2.8 + Python 3.11) 
as a wheel?**

A: The maintainer has added 2.7.4 builds, but upgrading straight to a newer stable release (such as 2.8.2) is recommended. Also note that some wheels are ABI3 builds, meaning they install on any Python >= 3.9, so there is no need to hunt for a file matching your exact Python minor version. ([issue #3](https://github.com/wildminder/AI-windows-whl/issues/3))
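
### Example: decoding a wheel filename

Several of the answers above come down to matching the version tags baked into a wheel's filename (CUDA build, PyTorch version, CXX11 ABI flag, CPython tag) against your local environment. The sketch below decodes those tags from the naming scheme used in this repository's tables. It is a best-effort regex for the common pattern, not an official parser; unusually named wheels may need the pattern widened.

```python
import re

# Best-effort pattern for wheel filenames listed in this repo, e.g.
# causal_conv1d-1.6.1+d20260310.cu130torch2.11.0cxx11abi1-cp314-cp314-win_amd64.whl
WHEEL_RE = re.compile(
    r"^(?P<pkg>\w+)-(?P<version>[\w.]+)"            # package name and version
    r"\+(?:d\d+\.)?cu(?P<cuda>\d+)"                 # optional date tag, CUDA build
    r"torch(?P<torch>[\d.]+)cxx11abi(?P<abi>\w+)"   # PyTorch version and ABI flag
    r"-(?P<pytag>cp\d+)-cp\d+-win_amd64\.whl$"      # CPython tag and platform
)

def parse_wheel(filename: str) -> dict:
    """Decode the environment tags embedded in a wheel filename."""
    m = WHEEL_RE.match(filename)
    if not m:
        raise ValueError(f"unrecognized wheel filename: {filename}")
    return m.groupdict()

info = parse_wheel(
    "causal_conv1d-1.6.1+d20260310.cu130torch2.11.0cxx11abi1-cp314-cp314-win_amd64.whl"
)
# e.g. info["torch"] == "2.11.0" and info["pytag"] == "cp314"
```

Comparing `info["torch"]`, `info["cuda"]`, and `info["pytag"]` against your installed PyTorch, CUDA, and Python versions catches most mismatches before `pip install` runs.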