[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-iMoonLab--DeepHypergraph":3,"tool-iMoonLab--DeepHypergraph":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",155373,2,"2026-04-14T11:34:08",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",108322,"2026-04-10T11:39:34",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},6121,"gemini-cli","google-gemini\u002Fgemini-cli","gemini-cli 是一款由谷歌推出的开源 AI 命令行工具，它将强大的 Gemini 大模型能力直接集成到用户的终端环境中。对于习惯在命令行工作的开发者而言，它提供了一条从输入提示词到获取模型响应的最短路径，无需切换窗口即可享受智能辅助。\n\n这款工具主要解决了开发过程中频繁上下文切换的痛点，让用户能在熟悉的终端界面内直接完成代码理解、生成、调试以及自动化运维任务。无论是查询大型代码库、根据草图生成应用，还是执行复杂的 Git 操作，gemini-cli 都能通过自然语言指令高效处理。\n\n它特别适合广大软件工程师、DevOps 人员及技术研究人员使用。其核心亮点包括支持高达 100 万 token 的超长上下文窗口，具备出色的逻辑推理能力；内置 Google 搜索、文件操作及 Shell 命令执行等实用工具；更独特的是，它支持 MCP（模型上下文协议），允许用户灵活扩展自定义集成，连接如图像生成等外部能力。此外，个人谷歌账号即可享受免费的额度支持，且项目基于 Apache 2.0 
协议完全开源，是提升终端工作效率的理想助手。",100752,"2026-04-10T01:20:03",[52,13,15,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":32,"last_commit_at":59,"category_tags":60,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":66,"readme_en":67,"readme_zh":68,"quickstart_zh":69,"use_case_zh":70,"hero_image_url":71,"owner_login":72,"owner_name":73,"owner_avatar_url":74,"owner_bio":75,"owner_company":76,"owner_location":76,"owner_email":76,"owner_twitter":76,"owner_website":77,"owner_url":78,"languages":79,"stars":84,"forks":85,"last_commit_at":86,"license":87,"difficulty_score":32,"env_os":88,"env_gpu":88,"env_ram":88,"env_deps":89,"category_tags":100,"github_topics":101,"view_count":32,"oss_zip_url":76,"oss_zip_packed_at":76,"status":17,"created_at":106,"updated_at":107,"faqs":108,"releases":139},7561,"iMoonLab\u002FDeepHypergraph","DeepHypergraph","A pytorch library for graph and hypergraph computation.","DeepHypergraph 是一个基于 PyTorch 构建的深度学习库，专为图神经网络与超图神经网络的研发而设计。它致力于解决传统工具在处理复杂高阶关系时的局限性，不仅支持普通的图结构（如定向图、二部图），更能高效处理包含多对多关系的超图结构。通过提供从顶点到顶点、顶点到超边乃至跨域消息传递等灵活的高阶交互机制，DeepHypergraph 让研究者能够轻松建模现实世界中复杂的关联数据。\n\n这款工具非常适合人工智能领域的研究人员、算法工程师及高校师生使用。其核心亮点在于集成了多种前沿（SOTA）模型与丰富的数据集，并内置了强大的可视化工具，帮助用户直观理解低阶与高阶结构的演化。此外，DeepHypergraph 独有的 `dhg.experiments` 模块基于 Optuna 
实现了自动化机器学习（Auto-ML）功能，可自动调优模型超参数，显著降低实验门槛并提升模型性能。无论是进行学术探索还是开发复杂的图分析应用，DeepHypergraph 都能提供一个通用且高效的框架，助您快速验证想法并突破性能瓶颈。","\u003Cp align=\"center\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_b14d0ec29cf8.png\" height=\"200\">\n\u003C\u002Fp>\n\n![Release version](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fv\u002Frelease\u002FiMoonLab\u002FDeepHypergraph)\n[![PyPI version](https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fv\u002Fdhg?color=purple)](https:\u002F\u002Fpypi.org\u002Fproject\u002Fdhg\u002F)\n[![Website Build Status](https:\u002F\u002Fgithub.com\u002Fyifanfeng97\u002Fdhg-page-source\u002Factions\u002Fworkflows\u002Fwebsite.yml\u002Fbadge.svg)](https:\u002F\u002Fdeephypergraph.com\u002F)\n[![Documentation Status](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_13d664e1afd7.png)](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002F)\n[![Downloads](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_9b4ac0d32667.png)](https:\u002F\u002Fpepy.tech\u002Fproject\u002Fdhg)\n[![Visits Badge](https:\u002F\u002Fvisitor-badge.glitch.me\u002Fbadge?page_id=iMoonLab.DeepHypergraph)](https:\u002F\u002Fvisitor-badge.glitch.me\u002F)\n[![license](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Flicense\u002Fimoonlab\u002FDeepHypergraph)](LICENSE)\n\u003C!-- [![Code style: Black](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fcode%20style-Black-000000.svg)](https:\u002F\u002Fgithub.com\u002Fpsf\u002Fblack) -->\n\u003C!-- [![Supported Python versions](https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fpyversions\u002Fdhg)](https:\u002F\u002Fpypi.org\u002Fproject\u002Fdhg\u002F) -->\n\n\n**[Website](https:\u002F\u002Fdeephypergraph.com\u002F)** | **[Documentation](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002F)** | **[Tutorials](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Ftutorial\u002Foverview.html)** | 
**[中文文档](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fzh\u002Foverview.html)** | **[Official Examples](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fexamples\u002Fvertex_cls\u002Findex.html)** | **[Discussions](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fdiscussions)**\n\n\n## News\n- 2025-09-01 -> **v0.9.5** is now available! Fixed some bugs and migrated to the UV package manager with the PEP 621 format!\n- 2025-09-01 -> **v0.9.5** 正式发布！ 修复了若干bug，并迁移到UV包管理器，采用PEP 621格式！\n- 2024-01-31 -> **v0.9.4** is now available! Fixed some bugs and included more datasets!\n- 2024-01-31 -> **v0.9.4** 正式发布！ 修复了若干bug，包含更多数据集！\n- 2022-12-28 -> **v0.9.3** is now available! More datasets and hypergraph operations are included!\n- 2022-12-28 -> **v0.9.3** 正式发布！ 包含更多数据集和超图操作！\n- 2022-09-25 -> **v0.9.2** is now available! More datasets, SOTA models, and visualizations are included!\n- 2022-09-25 -> **v0.9.2** 正式发布！ 包含更多数据集、最新模型和可视化功能！\n- 2022-08-25 -> DHG's first version **v0.9.1** is now available!\n- 2022-08-25 -> DHG的第一个版本 **v0.9.1** 正式发布！\n\n\n**DHG** *(DeepHypergraph)* is a deep learning library built upon [PyTorch](https:\u002F\u002Fpytorch.org) for learning with both Graph Neural Networks and Hypergraph Neural Networks. It is a general framework that supports both low-order and high-order message passing, such as **from vertex to vertex**, **from vertex in one domain to vertex in another domain**, **from vertex to hyperedge**, **from hyperedge to vertex**, and **from vertex set to vertex set**.\n\nIt supports a wide variety of structures, including low-order structures (graph, directed graph, bipartite graph, etc.) and high-order structures (hypergraph, etc.). Various spectral-based operations (like Laplacian-based smoothing) and spatial-based operations (like message passing from domain to domain) are integrated inside different structures. It provides multiple common metrics for performance evaluation on different tasks. 
Many state-of-the-art models are implemented and can be easily used for research. We also provide various visualization tools for both low-order structures and high-order structures.\n\nIn addition, DHG's [dhg.experiments](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fexperiments.html) module (which implements **Auto-ML** upon [Optuna](https:\u002F\u002Foptuna.org)) can help you automatically tune the hyper-parameters of your models in training and easily outperform the state-of-the-art models.\n\n![Framework of DHG Structures](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_2c4e47682afa.jpg)\n\n![Framework of DHG Function Library](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_b037f574904a.jpg)\n\n* [Highlights](#highlights)\n* [Installation](#installation)\n* [Dependencies](#dependencies)\n* [Quick Start](#quick-start)\n* [Examples](#examples)\n* [Datasets](#datasets)\n* [Metrics](#metrics)\n* [Implemented Models](#implemented-models)\n\n\n---------------------------------------------------------------\n\n## Highlights\n\n- **Support High-Order Message Passing on Structure**:\nDHG supports pair-wise message passing on the graph structure and beyond-pair-wise message passing on the hypergraph structure.\n\n- **Shared Ecosystem with the PyTorch Framework**:\nDHG is built upon PyTorch, and any PyTorch-based models can be integrated into DHG. If you are familiar with PyTorch, you can easily use DHG.\n\n- **Powerful API for Designing GNNs and HGNNs**:\nDHG provides various Laplacian matrices and message passing functions to help build your spectral\u002Fspatial-based models, respectively.\n\n- **Visualization of Graphs and Hypergraphs**:\nDHG provides a powerful visualization tool for graphs and hypergraphs. 
You can easily visualize the structure of your graphs and hypergraphs.\n\n- **Bridge the Gap between Graphs and Hypergraphs**:\nDHG provides functions to build a hypergraph from a graph and a graph from a hypergraph. Promoting a graph to a hypergraph can exploit potential high-order connections and may improve the performance of your model.\n\n- **Attach Spectral\u002FSpatial-Based Operations to Structure**:\nIn DHG, those Laplacian matrices and message passing functions are attached to the graph\u002Fhypergraph structure. As soon as you build a structure with DHG, those functions are ready to be used in the process of building your model.\n\n- **Comprehensive, Flexible, and Convenient**:\nDHG provides random graph\u002Fhypergraph generators, various state-of-the-art graph\u002Fhypergraph convolutional layers and models, various public graph\u002Fhypergraph datasets, and various evaluation metrics.\n\n- **Support Tuning Structure and Model with Auto-ML**:\nThe Optuna library endows DHG with the Auto-ML ability. DHG supports automatically searching the optimal configurations for the construction of graph\u002Fhypergraph structure and the optimal hyper-parameters for your model and training.\n\n## Installation\n\n\nCurrently, the stable version of **DHG** is 0.9.5. You can install it with ``pip`` as follows:\n\n```bash\npip install dhg\n```\n\nYou can also try the nightly version (0.9.6) of the **DHG** library with ``pip`` as follows:\n\n```bash\npip install git+https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph.git\n```\n\nThe nightly version is the development version of **DHG**. 
It may include the latest SOTA methods and datasets, but it can also be unstable and not fully tested.\nIf you find any bugs, please report them to us in [GitHub Issues](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues).\n\n### Dependencies\n\n**DHG** requires the following dependencies:\n\n- Python >= 3.8\n- PyTorch >= 1.12.1, \u003C 2.0\n- scipy >= 1.8\n- matplotlib >= 3.7.0\n- numpy\n- scikit-learn\n- optuna\n- requests\n\nFor visualization features, matplotlib 3.7.0 or higher is required to properly render 3D plots.\n\n## Quick Start\n\n### Visualization\n\nYou can draw graphs, hypergraphs, directed graphs, and bipartite graphs with DHG's visualization tool. See the [Tutorial](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Ftutorial\u002Fvis_structure.html) for more details.\n\n\n![Visualization of graph and hypergraph](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_5cdf49ae88f3.png)\n\n```python\nimport matplotlib.pyplot as plt\nimport dhg\n# draw a graph\ng = dhg.random.graph_Gnm(10, 12)\ng.draw()\n# draw a hypergraph\nhg = dhg.random.hypergraph_Gnm(10, 8)\nhg.draw()\n# show figures\nplt.show()\n```\n\n![Visualization of directed graph and bipartite graph](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_b3325efa8141.png)\n\n```python\nimport matplotlib.pyplot as plt\nimport dhg\n# draw a directed graph\ng = dhg.random.digraph_Gnm(12, 18)\ng.draw()\n# draw a bipartite graph\ng = dhg.random.bigraph_Gnm(30, 40, 20)\ng.draw()\n# show figures\nplt.show()\n```\n\n### Learning on Low-Order Structures\n\nOn graph structures, you can smooth given vertex features with GCN's Laplacian matrix by:\n\n```python\nimport torch\nimport dhg\ng = dhg.random.graph_Gnm(5, 8)\nX = torch.rand(5, 2)\nX_ = g.smoothing_with_GCN(X)\n```\n\nOn graph structures, you can pass messages from vertex to vertex with `mean` aggregation by:\n\n```python\nimport torch\nimport 
dhg\ng = dhg.random.graph_Gnm(5, 8)\nX = torch.rand(5, 2)\nX_ = g.v2v(X, aggr=\"mean\")\n```\n\nOn directed graph structures, you can pass messages from vertex to vertex with `mean` aggregation by:\n\n```python\nimport torch\nimport dhg\ng = dhg.random.digraph_Gnm(5, 8)\nX = torch.rand(5, 2)\nX_ = g.v2v(X, aggr=\"mean\")\n```\n\nOn bipartite graph structures, you can smooth vertex features with GCN's Laplacian matrix by:\n\n```python\nimport torch\nimport dhg\ng = dhg.random.bigraph_Gnm(3, 5, 8)\nX_u, X_v = torch.rand(3, 2), torch.rand(5, 2)\nX = torch.cat([X_u, X_v], dim=0)\nX_ = g.smoothing_with_GCN(X, aggr=\"mean\")\n```\n\nOn bipartite graph structures, you can pass messages from vertices in the `U` set to vertices in the `V` set with `mean` aggregation by:\n\n```python\nimport torch\nimport dhg\ng = dhg.random.bigraph_Gnm(3, 5, 8)\nX_u, X_v = torch.rand(3, 2), torch.rand(5, 2)\nX_u_ = g.v2u(X_v, aggr=\"mean\")\nX_v_ = g.u2v(X_u, aggr=\"mean\")\n```\n\n### Learning on High-Order Structures\n\nOn hypergraph structures, you can smooth given vertex features with HGNN's Laplacian matrix by:\n\n```python\nimport torch\nimport dhg\nhg = dhg.random.hypergraph_Gnm(5, 4)\nX = torch.rand(5, 2)\nX_ = hg.smoothing_with_HGNN(X)\n```\n\nOn hypergraph structures, you can pass messages from vertex to hyperedge with `mean` aggregation by:\n\n```python\nimport torch\nimport dhg\nhg = dhg.random.hypergraph_Gnm(5, 4)\nX = torch.rand(5, 2)\nY_ = hg.v2e(X, aggr=\"mean\")\n```\nThen, you can pass messages from hyperedge to vertex with `mean` aggregation by:\n\n```python\nX_ = hg.e2v(Y_, aggr=\"mean\")\n```\nOr, you can pass messages from vertex set to vertex set with `mean` aggregation by:\n\n```python\nX_ = hg.v2v(X, aggr=\"mean\")\n```\n\n## Examples\n\n### Building the Convolution Layer of GCN\n\n```python\nclass GCNConv(nn.Module):\n    def __init__(self):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, g: dhg.Graph) -> 
torch.Tensor:\n        # apply the trainable parameters ``theta`` to the input ``X``\n        X = self.theta(X)\n        # smooth the input ``X`` with the GCN's Laplacian\n        X = g.smoothing_with_GCN(X)\n        X = F.relu(X)\n        return X\n```\n\n### Building the Convolution Layer of GAT\n\n```python\nclass GATConv(nn.Module):\n    def __init__(self):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, g: dhg.Graph) -> torch.Tensor:\n        # apply the trainable parameters ``theta`` to the input ``X``\n        X = self.theta(X)\n        # compute attention weights for each edge\n        x_for_src = self.atten_src(X)\n        x_for_dst = self.atten_dst(X)\n        e_atten_score = x_for_src[g.e_src] + x_for_dst[g.e_dst]\n        e_atten_score = F.leaky_relu(e_atten_score).squeeze()\n        # apply ``e_atten_score`` to each edge in the graph ``g``, aggregate neighbor messages\n        #  with ``softmax_then_sum``, and perform vertex->vertex message passing in graph\n        #  with message passing function ``v2v()``\n        X = g.v2v(X, aggr=\"softmax_then_sum\", e_weight=e_atten_score)\n        X = F.elu(X)\n        return X\n```\n\n### Building the Convolution Layer of HGNN\n\n```python\nclass HGNNConv(nn.Module):\n    def __init__(self):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, hg: dhg.Hypergraph) -> torch.Tensor:\n        # apply the trainable parameters ``theta`` to the input ``X``\n        X = self.theta(X)\n        # smooth the input ``X`` with the HGNN's Laplacian\n        X = hg.smoothing_with_HGNN(X)\n        X = F.relu(X)\n        return X\n```\n\n\n### Building the Convolution Layer of HGNN $^+$\n\n```python\nclass HGNNPConv(nn.Module):\n    def __init__(self):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, hg: dhg.Hypergraph) -> 
torch.Tensor:\n        # apply the trainable parameters ``theta`` to the input ``X``\n        X = self.theta(X)\n        # perform vertex->hyperedge->vertex message passing in hypergraph\n        #  with message passing function ``v2v``, which is the combination\n        #  of message passing function ``v2e()`` and ``e2v()``\n        X = hg.v2v(X, aggr=\"mean\")\n        X = F.relu(X)\n        return X\n```\n\n\n## Datasets\n\nCurrently, we have added the following datasets:\n\n- **[Cora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Cora.html#dhg.data.Cora)**: A citation network dataset for vertex classification task.\n\n- **[PubMed](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Pubmed.html#dhg.data.Pubmed)**: A citation network dataset for vertex classification task.\n\n- **[Citeseer](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Citeseer.html#dhg.data.Citeseer)**: A citation network dataset for vertex classification task.\n\n- **[BlogCatalog](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.BlogCatalog.html#dhg.data.BlogCatalog)**: A social network dataset for vertex classification task.\n\n- **[Flickr](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Flickr.html#dhg.data.Flickr)**: A social network dataset for vertex classification task.\n\n- **[Github](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Github.html#dhg.data.Github)**: A collaboration network dataset for vertex classification task.\n\n- **[Facebook](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Facebook.html#dhg.data.Facebook)**: A social network dataset for vertex classification task.\n\n- 
**[MovieLens1M](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.MovieLens1M.html#dhg.data.MovieLens1M)**: A movie dataset for user-item recommendation task.\n\n- **[AmazonBook](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.AmazonBook.html#dhg.data.AmazonBook)**: An Amazon dataset for user-item recommendation task.\n\n- **[Yelp2018](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Yelp2018.html#dhg.data.Yelp2018)**: A restaurant review dataset for user-item recommendation task.\n\n- **[Gowalla](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Gowalla.html#dhg.data.Gowalla)**: A location-based check-in dataset for user-item recommendation task.\n\n- **[TencentBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.TencentBiGraph.html#dhg.data.TencentBiGraph)**: A social network dataset for vertex classification task.\n\n- **[CoraBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CoraBiGraph.html#dhg.data.CoraBiGraph)**: A citation network dataset for vertex classification task.\n\n- **[PubmedBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.PubmedBiGraph.html#dhg.data.PubmedBiGraph)**: A citation network dataset for vertex classification task.\n\n- **[CiteseerBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CiteseerBiGraph.html#dhg.data.CiteseerBiGraph)**: A citation network dataset for vertex classification task.\n\n- **[Cooking200](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Cooking200.html#dhg.data.Cooking200)**: A cooking recipe dataset for vertex classification task.\n\n- 
**[CoauthorshipCora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CoauthorshipCora.html#dhg.data.CoauthorshipCora)**: A citation network dataset for vertex classification task.\n\n- **[CoauthorshipDBLP](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CoauthorshipDBLP.html#dhg.data.CoauthorshipDBLP)**: A citation network dataset for vertex classification task.\n\n- **[CocitationCora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CocitationCora.html#dhg.data.CocitationCora)**: A citation network dataset for vertex classification task.\n\n- **[CocitationPubmed](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CocitationPubmed.html#dhg.data.CocitationPubmed)**: A citation network dataset for vertex classification task.\n\n- **[CocitationCiteseer](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CocitationCiteseer.html#dhg.data.CocitationCiteseer)**: A citation network dataset for vertex classification task.\n\n- **[YelpRestaurant](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.YelpRestaurant.html#dhg.data.YelpRestaurant)**: A restaurant-review network dataset for vertex classification task.\n\n- **[WalmartTrips](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.WalmartTrips.html#dhg.data.WalmartTrips)**: A user-product network dataset for vertex classification task.\n\n- **[HouseCommittees](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.HouseCommittees.html#dhg.data.HouseCommittees)**: A committee network dataset for vertex classification task.\n\n- 
**[News20](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.News20.html#dhg.data.News20)**: A newsgroup network dataset for vertex classification task.\n\n- **[DBLP8k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.DBLP8k.html#dhg.data.DBLP8k)**: The DBLP-8k dataset is a citation network dataset for link prediction task.\n\n- **[DBLP4k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.DBLP4k.html#dhg.data.DBLP4k)**: The DBLP-4k dataset is a citation network dataset for vertex classification task.\n\n- **[IMDB4k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.IMDB4k.html#dhg.data.IMDB4k)**: The IMDB-4k dataset is a movie dataset for vertex classification task.\n\n- **[Recipe100k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Recipe100k.html#dhg.data.Recipe100k)**: The Recipe100k dataset is a recipe-ingredient network dataset for vertex classification task.\n\n- **[Recipe200k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Recipe200k.html#dhg.data.Recipe200k)**: The Recipe200k dataset is a recipe-ingredient network dataset for vertex classification task.\n\n- **[Yelp3k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Yelp3k.html#dhg.data.Yelp3k)**: The Yelp3k dataset is a subset of the Yelp-Restaurant dataset for vertex classification task.\n\n- **[Tencent2k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Tencent2k.html#dhg.data.Tencent2k)**: The Tencent2k dataset is a social network dataset for vertex classification task.\n\n## Metrics\n\n### Classification Metrics\n\n- 
**[Accuracy](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.classification.accuracy)**: Calculates the accuracy of the predictions.\n\n- **[F1-Score](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.classification.f1_score)**: Calculates the F1-score of the predictions.\n\n- **[Confusion Matrix](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.classification.confusion_matrix)**: Calculates the confusion matrix of the predictions.\n\n### Recommender Metrics\n\n- **[Precision@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.recommender.precision)**: Calculates the precision@k of the predictions.\n\n- **[Recall@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002F_modules\u002Fdhg\u002Fmetrics\u002Frecommender.html#recall)**: Calculates the recall@k of the predictions.\n\n- **[NDCG@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.recommender.ndcg)**: Calculates the normalized discounted cumulative gain@k of the predictions.\n\n### Retrieval Metrics\n\n- **[Precision@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.precision)**: Calculates the precision@k of the predictions.\n\n- **[Recall@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.recall)**: Calculates the recall@k of the predictions.\n\n- **[mAP@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.map)**: Calculates the mAP@k of the predictions.\n\n- **[NDCG@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.ndcg)**: 
Calculates the normalized Discounted Cumulative Gain@k of the predictions.\n\n- **[mRR@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.mrr)**: Calculates the mean Reciprocal Rank@k of the predictions.\n\n- **[PR-Curve](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.pr_curve)**: Calculates the precision-recall curve of the predictions.\n\n## Implemented Models\n\n### On Low-Order Structures\n\n- **[GCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GCN.html#dhg.models.GCN)** model of [Semi-Supervised Classification with Graph Convolutional Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1609.02907) paper (ICLR 2017).\n\n- **[GraphSAGE](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GraphSAGE.html#dhg.models.GraphSAGE)** model of [Inductive Representation Learning on Large Graphs](https:\u002F\u002Fcs.stanford.edu\u002Fpeople\u002Fjure\u002Fpubs\u002Fgraphsage-nips17.pdf) paper (NeurIPS 2017).\n\n- **[GAT](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GAT.html#dhg.models.GAT)** model of [Graph Attention Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1710.10903) paper (ICLR 2018).\n\n- **[GIN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GIN.html#dhg.models.GIN)** model of [How Powerful are Graph Neural Networks?](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1810.00826) paper (ICLR 2019).\n\n- **[NGCF](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.NGCF.html#dhg.models.NGCF)** model of [Neural Graph Collaborative Filtering](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1905.08108) paper (SIGIR 2019).\n\n- 
**[LightGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.LightGCN.html#dhg.models.LightGCN)** model of [LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2002.02126) paper (SIGIR 2020).\n\n- **[BGNN-Adv](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.BGNN_Adv.html#dhg.models.BGNN_Adv)** model of [Cascade-BGNN: Toward Efficient Self-supervised Representation Learning on Large-scale Bipartite Graphs](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1906.11994.pdf) paper (TNNLS 2020).\n\n- **[BGNN-MLP](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.BGNN_MLP.html#dhg.models.BGNN_MLP)** model of [Cascade-BGNN: Toward Efficient Self-supervised Representation Learning on Large-scale Bipartite Graphs](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1906.11994.pdf) paper (TNNLS 2020).\n\n\n### On High-Order Structures\n\n- **[HGNN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HGNN.html#dhg.models.HGNN)** model of [Hypergraph Neural Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1809.09401) paper (AAAI 2019).\n\n- **[HGNN+](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HGNNP.html#dhg.models.HGNNP)** model of [HGNN+: General Hypergraph Neural Networks](https:\u002F\u002Fieeexplore.ieee.org\u002Fdocument\u002F9795251) paper (IEEE T-PAMI 2022).\n\n- **[HyperGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HyperGCN.html#dhg.models.HyperGCN)** model of [HyperGCN: A New Method of Training Graph Convolutional Networks on Hypergraphs](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2019\u002Ffile\u002F1efa39bcaec6f3900149160693694536-Paper.pdf) paper (NeurIPS 2019).\n\n- 
**[DHCF](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.DHCF.html#dhg.models.DHCF)** model of [Dual Channel Hypergraph Collaborative Filtering](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002F10.1145\u002F3394486.3403253) paper (KDD 2020).\n\n- **[HNHN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HNHN.html#dhg.models.HNHN)** model of [HNHN: Hypergraph Networks with Hyperedge Neurons](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2006.12278.pdf) paper (ICML 2020).\n\n- **[UniGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniGCN.html#dhg.models.UniGCN)** model of [UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2105.00956.pdf) paper (IJCAI 2021).\n\n- **[UniGAT](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniGAT.html#dhg.models.UniGAT)** model of [UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2105.00956.pdf) paper (IJCAI 2021).\n\n- **[UniSAGE](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniSAGE.html#dhg.models.UniSAGE)** model of [UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2105.00956.pdf) paper (IJCAI 2021).\n\n- **[UniGIN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniGIN.html#dhg.models.UniGIN)** model of [UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2105.00956.pdf) paper (IJCAI 2021).\n\n\n\n## Citing\nIf you find **DHG** is useful in your research, please consider citing:\n\n```\n@article{gao2022hgnn,\n  title={HGNN $\\^{}+ $: General Hypergraph Neural Networks},\n  author={Gao, Yue and 
Feng, Yifan and Ji, Shuyi and Ji, Rongrong},\n  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},\n  year={2022},\n  publisher={IEEE}\n}\n```\n```\n@inproceedings{feng2019hypergraph,\n  title={Hypergraph neural networks},\n  author={Feng, Yifan and You, Haoxuan and Zhang, Zizhao and Ji, Rongrong and Gao, Yue},\n  booktitle={Proceedings of the AAAI conference on artificial intelligence},\n  volume={33},\n  number={01},\n  pages={3558--3565},\n  year={2019}\n}\n```\n\n## The DHG Team\n\nDHG is developed by DHG's core team including [Yifan Feng](http:\u002F\u002Ffengyifan.site\u002F), [Xinwei Zhang](https:\u002F\u002Fgithub.com\u002Fzhangxwww), [Jielong Yan](https:\u002F\u002Fgithub.com\u002FJasonYanjl), [Xiangmin Han](https:\u002F\u002Fscholar.google.com\u002Fcitations?user=Y96h0t0AAAAJ&hl=zh-CN&oi=ao), [Yue Gao](http:\u002F\u002Fmoon-lab.tech\u002F), and [Qionghai Dai](https:\u002F\u002Fysg.ckcest.cn\u002Fhtml\u002Fdetails\u002F8058\u002Findex.html). It is maintained by the [iMoon-Lab](http:\u002F\u002Fmoon-lab.tech\u002F), Tsinghua University. 
You can contact us at [email](mailto:evanfeng97@gmail.com).\n\n\n## License\n\nDHG uses Apache License 2.0.\n","\u003Cp align=\"center\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_b14d0ec29cf8.png\" height=\"200\">\n\u003C\u002Fp>\n\n![发布版本](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fv\u002Frelease\u002FiMoonLab\u002FDeepHypergraph)\n[![PyPI版本](https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fv\u002Fdhg?color=purple)](https:\u002F\u002Fpypi.org\u002Fproject\u002Fdhg\u002F)\n[![网站构建状态](https:\u002F\u002Fgithub.com\u002Fyifanfeng97\u002Fdhg-page-source\u002Factions\u002Fworkflows\u002Fwebsite.yml\u002Fbadge.svg)](https:\u002F\u002Fdeephypergraph.com\u002F)\n[![文档状态](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_13d664e1afd7.png)](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002F)\n[![下载量](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_9b4ac0d32667.png)](https:\u002F\u002Fpepy.tech\u002Fproject\u002Fdhg)\n[![访问量徽章](https:\u002F\u002Fvisitor-badge.glitch.me\u002Fbadge?page_id=iMoonLab.DeepHypergraph)](https:\u002F\u002Fvisitor-badge.glitch.me\u002F)\n[![许可证](https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Flicense\u002Fimoonlab\u002FDeepHypergraph)](LICENSE)\n\u003C!-- [![代码风格：Black](https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fcode%20style-Black-000000.svg)](https:\u002F\u002Fgithub.com\u002Fpsf\u002Fblack) -->\n\u003C!-- [![支持的Python版本](https:\u002F\u002Fimg.shields.io\u002Fpypi\u002Fpyversions\u002Fdhg)](https:\u002F\u002Fpypi.org\u002Fproject\u002Fdhg\u002F) -->\n\n\n**[官网](https:\u002F\u002Fdeephypergraph.com\u002F)** | **[文档](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002F)** | **[教程](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Ftutorial\u002Foverview.html)** | **[中文文档](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fzh\u002Foverview.html)** | 
**[官方示例](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fexamples\u002Fvertex_cls\u002Findex.html)** | **[讨论区](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fdiscussions)**\n\n\n## 新闻\n- 2025-09-01 -> **v0.9.5** 现已发布！修复了一些bug，并迁移到UV包管理器，采用PEP 621格式！\n- 2024-01-31 -> **v0.9.4** 现已发布！修复了若干bug，增加了更多数据集！\n- 2022-12-28 -> **v0.9.3** 现已发布！包含更多数据集和超图操作！\n- 2022-09-25 -> **v0.9.2** 现已发布！包含了更多数据集、最新模型和可视化功能！\n- 2022-08-25 -> DHG的第一个版本 **v0.9.1** 正式发布！\n\n\n**DHG** *(DeepHypergraph)* 是一个基于 [PyTorch](https:\u002F\u002Fpytorch.org) 的深度学习库，用于同时进行图神经网络和超图神经网络的学习。它是一个通用框架，支持低阶和高阶的消息传递，例如 **从顶点到顶点**、**从一个域的顶点到另一个域的顶点**、**从顶点到超边**、**从超边到顶点**、**从顶点集合到顶点集合**。\n\n该库支持多种结构，包括低阶结构（如普通图、有向图、二分图等）和高阶结构（如超图等）。在不同结构中集成了多种基于谱的运算（如基于拉普拉斯矩阵的平滑）和基于空间的运算（如域与域之间的消息传递）。此外，它还提供了多种常用指标来评估不同任务的性能。许多最先进的模型已被实现，便于研究人员使用。我们还提供了针对低阶和高阶结构的多种可视化工具。\n\n另外，DHG 的 [dhg.experiments](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fexperiments.html) 模块（基于 [Optuna](https:\u002F\u002Foptuna.org) 实现的 **Auto-ML**）可以帮助您自动调整模型训练中的超参数，轻松超越现有最先进模型的性能。\n\n![DHG结构框架](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_2c4e47682afa.jpg)\n\n![DHG函数库框架](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_b037f574904a.jpg)\n\n* [亮点](#highlights)\n* [安装](#installation)\n* [依赖项](#dependencies)\n* [快速入门](#quick-start)\n* [示例](#examples)\n* [数据集](#datasets)\n* [指标](#metrics)\n* [已实现的模型](#implemented-models)\n\n\n---------------------------------------------------------------\n\n## 亮点\n\n- **支持结构上的高阶消息传递**：\nDHG 支持在图结构上进行两两消息传递，而在超图结构上则支持多于两两的消息传递。\n\n- **与 PyTorch 生态系统共享**：\nDHG 基于 PyTorch 构建，任何基于 PyTorch 的模型都可以集成到 DHG 中。如果您熟悉 PyTorch，那么使用 DHG 就会非常容易。\n\n- **强大的 API 用于设计 GNN 和 HGNN**：\nDHG 提供了多种拉普拉斯矩阵和消息传递函数，分别帮助构建基于谱和基于空间的模型。\n\n- **图和超图的可视化**\nDHG 提供了一个功能强大的可视化工具，可以轻松地展示您的图和超图结构。\n\n- **连接图与超图的桥梁**：\nDHG 提供了从图构建超图以及从超图构建图的功能。将图升级为超图或许能够挖掘出潜在的高阶连接，从而提升模型性能。\n\n- 
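上面提到的"将图升级为超图"，一种常见的构造方式是把每个顶点与其 1 跳邻居合成一条超边（与 HGNN 论文中按近邻构造超边的思路类似）。下面是一个纯 Python 的示意草图，仅说明思路，并非 DHG 的具体 API：

```python
# 示意：把普通图提升为超图 —— 每个顶点与其 1 跳邻居构成一条超边。
# （纯 Python 草图，并非 dhg 的实际接口）
def graph_to_hypergraph(num_v, edges):
    neighbors = [set() for _ in range(num_v)]
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    # 每条超边 = {顶点自身} ∪ {它的邻居}；排序去重后返回
    hyperedges = {tuple(sorted({v} | neighbors[v])) for v in range(num_v)}
    return sorted(hyperedges)

print(graph_to_hypergraph(3, [(0, 1), (1, 2)]))
# [(0, 1), (0, 1, 2), (1, 2)]
```

这样得到的超边天然编码了"一个顶点及其整个邻域"这种高阶关系，这正是普通成对边所丢失的信息。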
**将基于谱\u002F空间的运算附加到结构上**：\n在 DHG 中，这些拉普拉斯矩阵和消息传递函数都被附加到了图\u002F超图结构上。一旦您使用 DHG 构建了某种结构，这些功能就可以直接用于模型的构建过程中。\n\n- **全面、灵活且便捷**：\nDHG 提供随机图\u002F超图生成器、各种最先进的图\u002F超图卷积层和模型、多种公开的图\u002F超图数据集以及多种评估指标。\n\n- **支持通过 Auto-ML 调优结构和模型**：\n借助 Optuna 库，DHG 具备了自动机器学习的能力。它可以自动搜索构建图\u002F超图结构的最佳配置，以及您模型和训练过程中的最优超参数。\n\n## 安装\n\n\n目前，**DHG** 的稳定版本是 0.9.5。您可以使用 ``pip`` 进行安装：\n\n```bash\npip install dhg\n```\n\n您也可以尝试使用 ``pip`` 安装 **DHG** 库的夜间版本（0.9.6）：\n\n```bash\npip install git+https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph.git\n```\n\n夜间版本是 **DHG** 的开发版本，可能包含最新的最先进方法和数据集，但也可能不够稳定，尚未经过充分测试。如果您发现任何问题，请在 [GitHub Issues](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues) 中向我们报告。\n\n### 依赖项\n\n**DHG** 需要以下依赖项：\n\n- Python >= 3.8\n- PyTorch >= 1.12.1, \u003C 2.0\n- scipy >= 1.8\n- matplotlib >= 3.7.0\n- numpy\n- scikit-learn\n- optuna\n- requests\n\n对于可视化功能，需要 matplotlib 3.7.0 或更高版本才能正确渲染 3D 图形。\n\n## 快速入门\n\n### 可视化\n\n您可以使用 DHG 的可视化工具绘制图、超图、有向图和二分图。更多详细信息请参阅 [教程](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Ftutorial\u002Fvis_structure.html)。\n\n![图和超图的可视化](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_5cdf49ae88f3.png)\n\n```python\nimport matplotlib.pyplot as plt\nimport dhg\n# 绘制一个图\ng = dhg.random.graph_Gnm(10, 12)\ng.draw()\n# 绘制一个超图\nhg = dhg.random.hypergraph_Gnm(10, 8)\nhg.draw()\n# 显示图形\nplt.show()\n```\n\n![有向图和二分图的可视化](https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_readme_b3325efa8141.png)\n\n```python\nimport matplotlib.pyplot as plt\nimport dhg\n# 绘制一个有向图\ng = dhg.random.digraph_Gnm(12, 18)\ng.draw()\n# 绘制一个二分图\ng = dhg.random.bigraph_Gnm(30, 40, 20)\ng.draw()\n# 显示图形\nplt.show()\n```\n\n### 低阶结构上的学习\n\n在图结构上，您可以通过 GCN 的拉普拉斯矩阵对给定的节点特征进行平滑处理：\n\n```python\nimport torch\nimport dhg\ng = dhg.random.graph_Gnm(5, 8)\nX = torch.rand(5, 2)\nX_ = g.smoothing_with_GCN(X)\n```\n\n在图结构上，您还可以通过 `mean` 聚合方式实现节点之间的消息传递：\n\n```python\nimport 
torch\nimport dhg\ng = dhg.random.graph_Gnm(5, 8)\nX = torch.rand(5, 2)\nX_ = g.v2v(X, aggr=\"mean\")\n```\n\n在有向图结构上，同样可以使用 `mean` 聚合方式进行节点间的消息传递：\n\n```python\nimport torch\nimport dhg\ng = dhg.random.digraph_Gnm(5, 8)\nX = torch.rand(5, 2)\nX_ = g.v2v(X, aggr=\"mean\")\n```\n\n在二分图结构上，您也可以利用 GCN 的拉普拉斯矩阵对节点特征进行平滑处理：\n\n```python\nimport torch\nimport dhg\ng = dhg.random.bigraph_Gnm(3, 5, 8)\nX_u, X_v = torch.rand(3, 2), torch.rand(5, 2)\nX = torch.cat([X_u, X_v], dim=0)\nX_ = g.smoothing_with_GCN(X, aggr=\"mean\")\n```\n\n此外，在二分图结构上，您还可以通过 `mean` 聚合方式实现从 `U` 集合到 `V` 集合的节点间消息传递：\n\n```python\nimport torch\nimport dhg\ng = dhg.random.bigraph_Gnm(3, 5, 8)\nX_u, X_v = torch.rand(3, 2), torch.rand(5, 2)\nX_u_ = g.v2u(X_v, aggr=\"mean\")\nX_v_ = g.u2v(X_u, aggr=\"mean\")\n```\n\n### 高阶结构上的学习\n\n在超图结构上，您可以通过 HGNN 的拉普拉斯矩阵对给定的节点特征进行平滑处理：\n\n```python\nimport torch\nimport dhg\nhg = dhg.random.hypergraph_Gnm(5, 4)\nX = torch.rand(5, 2)\nX_ = hg.smoothing_with_HGNN(X)\n```\n\n在超图结构上，您还可以通过 `mean` 聚合方式实现从节点到超边的消息传递：\n\n```python\nimport torch\nimport dhg\nhg = dhg.random.hypergraph_Gnm(5, 4)\nX = torch.rand(5, 2)\nY_ = hg.v2e(X, aggr=\"mean\")\n```\n\n随后，您可以通过 `mean` 聚合方式将超边中的信息传递回节点：\n\n```python\nX_ = hg.e2v(Y_, aggr=\"mean\")\n```\n\n或者，您也可以通过 `mean` 聚合方式实现节点集之间的消息传递：\n\n```python\nX_ = hg.v2v(X, aggr=\"mean\")\n```\n\n## 示例\n\n### 构建 GCN 的卷积层\n\n```python\nclass GCNConv(nn.Module):\n    def __init__(self,):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, g: dhg.Graph) -> torch.Tensor:\n        # 将可训练参数 ``theta`` 应用于输入 ``X``\n        X = self.theta(X)\n        # 使用 GCN 的拉普拉斯矩阵对输入 ``X`` 进行平滑处理\n        X = g.smoothing_with_GCN(X)\n        X = F.relu(X)\n        return X\n```\n\n### 构建 GAT 的卷积层\n\n```python\nclass GATConv(nn.Module):\n    def __init__(self,):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, g: dhg.Graph) -> torch.Tensor:\n   
     # 将可训练参数 ``theta`` 应用于输入 ``X``\n        X = self.theta(X)\n        # 计算每条边的注意力权重\n        x_for_src = self.atten_src(X)\n        x_for_dst = self.atten_dst(X)\n        e_atten_score = x_for_src[g.e_src] + x_for_dst[g.e_dst]\n        e_atten_score = F.leaky_relu(e_atten_score).squeeze()\n        # 将 ``e_atten_score`` 应用于图中每一条边，以 ``softmax_then_sum`` 方式聚合邻居消息，\n        # 并通过 ``v2v()`` 消息传递函数在图中完成节点间的消息传递\n        X = g.v2v(X, aggr=\"softmax_then_sum\", e_weight=e_atten_score)\n        X = F.elu(X)\n        return X\n```\n\n### 构建 HGNN 的卷积层\n\n```python\nclass HGNNConv(nn.Module):\n    def __init__(self,):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, hg: dhg.Hypergraph) -> torch.Tensor:\n        # 将可训练参数 ``theta`` 应用于输入 ``X``\n        X = self.theta(X)\n        # 使用 HGNN 的拉普拉斯矩阵对输入 ``X`` 进行平滑处理\n        X = hg.smoothing_with_HGNN(X)\n        X = F.relu(X)\n        return X\n```\n\n### 构建 HGNN $^+$ 的卷积层\n\n```python\nclass HGNNPConv(nn.Module):\n    def __init__(self,):\n        super().__init__()\n        ...\n        self.reset_parameters()\n\n    def forward(self, X: torch.Tensor, hg: dhg.Hypergraph) -> torch.Tensor:\n        # 将可训练参数 ``theta`` 应用于输入 ``X``\n        X = self.theta(X)\n        # 在超图中通过 ``v2v()`` 消息传递函数实现节点→超边→节点的消息传递，\n        # 该函数结合了 ``v2e()`` 和 ``e2v()`` 两个步骤\n        X = hg.v2v(X, aggr=\"mean\")\n        X = F.relu(X)\n        return X\n```\n\n## 数据集\n\n目前，我们已添加以下数据集：\n\n- **[Cora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Cora.html#dhg.data.Cora)**：用于节点分类任务的引用网络数据集。\n\n- **[PubMed](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Pubmed.html#dhg.data.Pubmed)**：用于节点分类任务的引用网络数据集。\n\n- **[Citeseer](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Citeseer.html#dhg.data.Citeseer)**：用于节点分类任务的引用网络数据集。\n\n- 
**[BlogCatalog](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.BlogCatalog.html#dhg.data.BlogCatalog)**：用于节点分类任务的社交网络数据集。\n\n- **[Flickr](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Flickr.html#dhg.data.Flickr)**：用于节点分类任务的社交网络数据集。\n\n- **[Github](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Github.html#dhg.data.Github)**：用于节点分类任务的合作网络数据集。\n\n- **[Facebook](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Facebook.html#dhg.data.Facebook)**：用于节点分类任务的社交网络数据集。\n\n- **[MovieLens1M](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.MovieLens1M.html#dhg.data.MovieLens1M)**：用于用户-物品推荐任务的电影数据集。\n\n- **[AmazonBook](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.AmazonBook.html#dhg.data.AmazonBook)**：用于用户-物品推荐任务的亚马逊数据集。\n\n- **[Yelp2018](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Yelp2018.html#dhg.data.Yelp2018)**：用于用户-物品推荐任务的餐厅评论数据集。\n\n- **[Gowalla](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Gowalla.html#dhg.data.Gowalla)**：用于用户-物品推荐任务的位置反馈数据集。\n\n- **[TencentBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.TencentBiGraph.html#dhg.data.TencentBiGraph)**：用于节点分类任务的社交网络数据集。\n\n- **[CoraBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CoraBiGraph.html#dhg.data.CoraBiGraph)**：用于节点分类任务的引用网络数据集。\n\n- **[PubmedBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.PubmedBiGraph.html#dhg.data.PubmedBiGraph)**：用于节点分类任务的引用网络数据集。\n\n- 
**[CiteseerBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CiteseerBiGraph.html#dhg.data.CiteseerBiGraph)**：用于节点分类任务的引用网络数据集。\n\n- **[Cooking200](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Cooking200.html#dhg.data.Cooking200)**：用于节点分类任务的烹饪食谱数据集。\n\n- **[CoauthorshipCora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CoauthorshipCora.html#dhg.data.CoauthorshipCora)**：用于节点分类任务的引用网络数据集。\n\n- **[CoauthorshipDBLP](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CoauthorshipDBLP.html#dhg.data.CoauthorshipDBLP)**：用于节点分类任务的引用网络数据集。\n\n- **[CocitationCora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CocitationCora.html#dhg.data.CocitationCora)**：用于节点分类任务的引用网络数据集。\n\n- **[CocitationPubmed](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CocitationPubmed.html#dhg.data.CocitationPubmed)**：用于节点分类任务的引用网络数据集。\n\n- **[CocitationCiteseer](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.CocitationCiteseer.html#dhg.data.CocitationCiteseer)**：用于节点分类任务的引用网络数据集。\n\n- **[YelpRestaurant](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.YelpRestaurant.html#dhg.data.YelpRestaurant)**：用于节点分类任务的餐厅评论网络数据集。\n\n- **[WalmartTrips](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.WalmartTrips.html#dhg.data.WalmartTrips)**：用于节点分类任务的用户-商品网络数据集。\n\n- **[HouseCommittees](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.HouseCommittees.html#dhg.data.HouseCommittees)**：用于节点分类任务的委员会网络数据集。\n\n- 
**[News20](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.News20.html#dhg.data.News20)**：用于节点分类任务的报纸网络数据集。\n\n- **[DBLP8k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.DBLP8k.html#dhg.data.DBLP8k)**：DBLP-8k数据集是一个用于链接预测任务的引用网络数据集。\n\n- **[DBLP4k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.DBLP4k.html#dhg.data.DBLP4k)**：DBLP-4k数据集是一个用于节点分类任务的引用网络数据集。\n\n- **[IMDB4k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.IMDB4k.html#dhg.data.IMDB4k)**：IMDB-4k数据集是一个用于节点分类任务的电影数据集。\n\n- **[Recipe100k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Recipe100k.html#dhg.data.Recipe100k)**：Recipe100k数据集是一个用于节点分类任务的食谱-食材网络数据集。\n\n- **[Recipe200k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Recipe200k.html#dhg.data.Recipe200k)**：Recipe200k数据集是一个用于节点分类任务的食谱-食材网络数据集。\n\n- **[Yelp3k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Yelp3k.html#dhg.data.Yelp3k)**：Yelp3k数据集是Yelp-Restaurant数据集的一个子集，用于节点分类任务。\n\n- **[Tencent2k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Tencent2k.html#dhg.data.Tencent2k)**：Tencent2k数据集是一个用于节点分类任务的社交网络数据集。\n\n## 评估指标\n\n### 分类任务指标\n\n- **[准确率](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.classification.accuracy)**：计算预测的准确率。\n\n- **[F1分数](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.classification.f1_score)**：计算预测的F1分数。\n\n- **[混淆矩阵](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.classification.confusion_matrix)**：计算预测的混淆矩阵。\n\n### 推荐系统指标\n\n- 
**[Precision@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.recommender.precision)**：计算预测的 Precision@k。\n\n- **[Recall@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002F_modules\u002Fdhg\u002Fmetrics\u002Frecommender.html#recall)**：计算预测的 Recall@k。\n\n- **[NDCG@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.recommender.ndcg)**：计算预测的归一化折损累计增益@k。\n\n### 检索系统指标\n\n- **[Precision@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.precision)**：计算预测的 Precision@k。\n\n- **[Recall@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.recall)**：计算预测的 Recall@k。\n\n- **[mAP@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.map)**：计算预测的 mAP@k。\n\n- **[NDCG@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.ndcg)**：计算预测的归一化折损累计增益@k。\n\n- **[mRR@k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.mrr)**：计算预测的平均倒数排名@k。\n\n- **[PR-Curve](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fapi\u002Fmetrics.html#dhg.metrics.retrieval.pr_curve)**：计算预测的精确率-召回率曲线。\n\n## 已实现模型\n\n### 基于低阶结构\n\n- **[GCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GCN.html#dhg.models.GCN)** 模型，出自论文《基于图卷积网络的半监督分类》（ICLR 2017）。\n\n- **[GraphSAGE](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GraphSAGE.html#dhg.models.GraphSAGE)** 模型，出自论文《大规模图上的归纳式表示学习》（NeurIPS 2017）。\n\n- 
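上述 GCN 一类模型的核心是对称归一化平滑 X' = D̂^{-1/2}(A + I)D̂^{-1/2}X（即前文 `smoothing_with_GCN` 所实现的传播规则）。下面用纯 Python 写出这一数学含义的示意草图（仅作说明，并非 DHG 的实现）：

```python
# 示意：GCN 的对称归一化平滑 X' = D^{-1/2} (A + I) D^{-1/2} X
# （纯 Python 草图，说明数学含义，并非 dhg 的实现）
def gcn_smoothing(num_v, edges, X):
    adj = [{v} for v in range(num_v)]        # 加自环后的邻接表
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = [len(a) for a in adj]              # 含自环的度
    dim = len(X[0])
    X_out = [[0.0] * dim for _ in range(num_v)]
    for v in range(num_v):
        for u in adj[v]:
            w = 1.0 / (deg[v] * deg[u]) ** 0.5   # 对称归一化权重
            for f in range(dim):
                X_out[v][f] += w * X[u][f]
    return X_out

print(gcn_smoothing(2, [(0, 1)], [[1.0], [3.0]]))  # [[2.0], [2.0]]
```

每个顶点的新特征是邻居特征按度归一化后的加权和，相邻顶点因此被"拉平"，这就是 GCN 平滑的直观效果。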
**[GAT](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GAT.html#dhg.models.GAT)** 模型，出自论文《图注意力网络》（ICLR 2018）。\n\n- **[GIN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.GIN.html#dhg.models.GIN)** 模型，出自论文《图神经网络有多强大？》（ICLR 2019）。\n\n- **[NGCF](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.NGCF.html#dhg.models.NGCF)** 模型，出自论文《神经图协同过滤》（SIGIR 2019）。\n\n- **[LightGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.LightGCN.html#dhg.models.LightGCN)** 模型，出自论文《LightGCN：轻量级图卷积网络》（SIGIR 2020）。\n\n- **[BGNN-Adv](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.BGNN_Adv.html#dhg.models.BGNN_Adv)** 模型，出自论文《Cascade-BGNN：面向大规模二分图的高效自监督表示学习》（TNNLS 2020）。\n\n- **[BGNN-MLP](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.BGNN_MLP.html#dhg.models.BGNN_MLP)** 模型，出自论文《Cascade-BGNN：面向大规模二分图的高效自监督表示学习》（TNNLS 2020）。\n\n\n### 基于高阶结构\n\n- **[HGNN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HGNN.html#dhg.models.HGNN)** 模型，出自论文《超图神经网络》（AAAI 2019）。\n\n- **[HGNN+](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HGNNP.html#dhg.models.HGNNP)** 模型，出自论文《HGNN+：通用超图神经网络》（IEEE T-PAMI 2022）。\n\n- **[HyperGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HyperGCN.html#dhg.models.HyperGCN)** 模型，出自论文《HyperGCN：一种在超图上训练图卷积网络的新方法》（NeurIPS 2019）。\n\n- **[DHCF](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.DHCF.html#dhg.models.DHCF)** 模型，出自论文《双通道超图协同过滤》（KDD 2020）。\n\n- 
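上述超图模型普遍采用"顶点 → 超边 → 顶点"的两段式消息传递（对应前文的 `v2e()` 与 `e2v()`）。下面用纯 Python 写一个 `mean` 聚合的示意草图，仅说明计算过程，并非 DHG 的实现：

```python
# 示意：超图上的两段式消息传递（mean 聚合），对应 v2e() 与 e2v()
# （纯 Python 草图，并非 dhg 的实现）
def v2e_mean(hyperedges, X):
    # 每条超边的消息 = 其包含顶点特征的平均
    dim = len(X[0])
    return [[sum(X[v][f] for v in e) / len(e) for f in range(dim)]
            for e in hyperedges]

def e2v_mean(num_v, hyperedges, Y):
    # 每个顶点的新特征 = 其所属各超边消息的平均
    dim = len(Y[0])
    X_out = [[0.0] * dim for _ in range(num_v)]
    cnt = [0] * num_v
    for i, e in enumerate(hyperedges):
        for v in e:
            cnt[v] += 1
            for f in range(dim):
                X_out[v][f] += Y[i][f]
    return [[x / c for x in row] if c else row
            for row, c in zip(X_out, cnt)]

edges = [(0, 1), (1, 2)]
X = [[1.0], [2.0], [3.0]]
Y = v2e_mean(edges, X)        # [[1.5], [2.5]]
print(e2v_mean(3, edges, Y))  # [[1.5], [2.0], [2.5]]
```

把两步连起来就是 `hg.v2v(X, aggr="mean")` 所描述的顶点到顶点传递：信息先汇聚到超边，再分发回顶点。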
**[HNHN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.HNHN.html#dhg.models.HNHN)** 模型，出自论文《HNHN：带有超边神经元的超图网络》（ICML 2020）。\n\n- **[UniGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniGCN.html#dhg.models.UniGCN)** 模型，出自论文《UniGNN：图与超图神经网络的统一框架》（IJCAI 2021）。\n\n- **[UniGAT](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniGAT.html#dhg.models.UniGAT)** 模型，出自论文《UniGNN：图与超图神经网络的统一框架》（IJCAI 2021）。\n\n- **[UniSAGE](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniSAGE.html#dhg.models.UniSAGE)** 模型，出自论文《UniGNN：图与超图神经网络的统一框架》（IJCAI 2021）。\n\n- **[UniGIN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.models.UniGIN.html#dhg.models.UniGIN)** 模型，出自论文《UniGNN：图与超图神经网络的统一框架》（IJCAI 2021）。\n\n\n\n## 引用\n如果您在研究中发现 **DHG** 非常有用，请考虑引用以下文献：\n\n```\n@article{gao2022hgnn,\n  title={HGNN $\\^{}+ $: General Hypergraph Neural Networks},\n  author={Gao, Yue and Feng, Yifan and Ji, Shuyi and Ji, Rongrong},\n  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},\n  year={2022},\n  publisher={IEEE}\n}\n```\n```\n@inproceedings{feng2019hypergraph,\n  title={Hypergraph neural networks},\n  author={Feng, Yifan and You, Haoxuan and Zhang, Zizhao and Ji, Rongrong and Gao, Yue},\n  booktitle={Proceedings of the AAAI conference on artificial intelligence},\n  volume={33},\n  number={01},\n  pages={3558--3565},\n  year={2019}\n}\n```\n\n## DHG 团队\n\nDHG 由 DHG 核心团队开发，成员包括 [冯一凡](http:\u002F\u002Ffengyifan.site\u002F)、[张新伟](https:\u002F\u002Fgithub.com\u002Fzhangxwww)、[严杰龙](https:\u002F\u002Fgithub.com\u002FJasonYanjl)、[韩相敏](https:\u002F\u002Fscholar.google.com\u002Fcitations?user=Y96h0t0AAAAJ&hl=zh-CN&oi=ao)、[高岳](http:\u002F\u002Fmoon-lab.tech\u002F) 和 
[戴琼海](https:\u002F\u002Fysg.ckcest.cn\u002Fhtml\u002Fdetails\u002F8058\u002Findex.html)。该项目由清华大学的 [iMoon 实验室](http:\u002F\u002Fmoon-lab.tech\u002F) 维护。如需联系，请发送邮件至 [evanfeng97@gmail.com](mailto:evanfeng97@gmail.com)。\n\n\n## 许可证\n\nDHG 采用 Apache License 2.0 许可证。","# DeepHypergraph (DHG) 快速上手指南\n\nDeepHypergraph (DHG) 是一个基于 PyTorch 的深度学习库，专为图神经网络 (GNN) 和超图神经网络 (HGNN) 设计。它支持低阶结构（如图、有向图、二分图）和高阶结构（如超图）上的消息传递，并内置了多种 SOTA 模型、数据集及自动超参数优化功能。\n\n## 环境准备\n\n在开始之前，请确保您的开发环境满足以下要求：\n\n*   **操作系统**: Linux, macOS 或 Windows\n*   **Python 版本**: >= 3.8\n*   **核心依赖**:\n    *   PyTorch: >= 1.12.1, \u003C 2.0\n    *   scipy: >= 1.8\n    *   matplotlib: >= 3.7.0 (用于渲染 3D 可视化图表)\n    *   numpy, scikit-learn, optuna, requests\n\n> **注意**: 建议先安装好对应 CUDA 版本的 PyTorch，再安装 DHG。\n\n## 安装步骤\n\n### 方式一：安装稳定版（推荐）\n\n使用 `pip` 直接安装最新稳定版本 (v0.9.5)：\n\n```bash\npip install dhg\n```\n\n**国内加速方案**：\n如果您在中国大陆地区，建议使用清华源或阿里源以加快下载速度：\n\n```bash\npip install dhg -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n### 方式二：安装开发版（夜间版）\n\n如果您需要体验最新的 SOTA 方法或数据集（可能包含未完全测试的功能），可以安装 GitHub 上的开发版本 (v0.9.6)：\n\n```bash\npip install git+https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph.git\n```\n\n## 基本使用\n\nDHG 的使用风格与 PyTorch 高度一致。以下是几个最核心的快速示例。\n\n### 1. 结构可视化\n\nDHG 内置了强大的可视化工具，可一键绘制图、超图、有向图和二分图。\n\n```python\nimport matplotlib.pyplot as plt\nimport dhg\n\n# 绘制随机图 (Graph)\ng = dhg.random.graph_Gnm(10, 12)\ng.draw()\n\n# 绘制随机超图 (Hypergraph)\nhg = dhg.random.hypergraph_Gnm(10, 8)\nhg.draw()\n\n# 显示图像\nplt.show()\n```\n\n### 2. 低阶结构学习 (图神经网络)\n\n在普通图上，您可以直接使用内置方法进行特征平滑或顶点到顶点的消息传递。\n\n**特征平滑 (基于 GCN Laplacian):**\n```python\nimport torch\nimport dhg\n\n# 创建一个随机图\ng = dhg.random.graph_Gnm(5, 8)\n# 随机生成顶点特征\nX = torch.rand(5, 2)\n\n# 使用 GCN 拉普拉斯矩阵进行平滑\nX_ = g.smoothing_with_GCN(X)\n```\n\n**顶点到顶点消息传递 (Vertex-to-Vertex):**\n```python\nimport torch\nimport dhg\n\ng = dhg.random.graph_Gnm(5, 8)\nX = torch.rand(5, 2)\n\n# 使用 \"mean\" 聚合策略进行消息传递\nX_ = g.v2v(X, aggr=\"mean\")\n```\n\n### 3. 
高阶结构学习 (超图神经网络)\n\nDHG 的核心优势在于处理超图结构，支持顶点与超边之间的双向消息传递。\n\n**特征平滑 (基于 HGNN Laplacian):**\n```python\nimport torch\nimport dhg\n\n# 创建一个随机超图\nhg = dhg.random.hypergraph_Gnm(5, 4)\nX = torch.rand(5, 2)\n\n# 使用 HGNN 拉普拉斯矩阵进行平滑\nX_ = hg.smoothing_with_HGNN(X)\n```\n\n**两阶段消息传递 (Vertex -> Hyperedge -> Vertex):**\n```python\nimport torch\nimport dhg\n\nhg = dhg.random.hypergraph_Gnm(5, 4)\nX = torch.rand(5, 2)\n\n# 第一步：从顶点传递消息到超边 (Vertex to Hyperedge)\nY_ = hg.v2e(X, aggr=\"mean\")\n\n# 第二步：从超边传递消息回顶点 (Hyperedge to Vertex)\nX_ = hg.e2v(Y_, aggr=\"mean\")\n\n# 或者直接进行顶点到顶点的传递 (内部自动处理超边连接)\nX_direct = hg.v2v(X, aggr=\"mean\")\n```\n\n### 4. 构建自定义卷积层\n\n您可以像编写普通 PyTorch 模块一样，结合 DHG 的结构对象构建 GCN 或 HGNN 层。\n\n**构建一个简单的 GCN 卷积层:**\n```python\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport dhg\n\nclass GCNConv(nn.Module):\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n        self.theta = nn.Linear(in_channels, out_channels)\n\n    def forward(self, X: torch.Tensor, g: dhg.Graph) -> torch.Tensor:\n        # 应用可学习参数\n        X = self.theta(X)\n        # 使用图的拉普拉斯矩阵进行平滑\n        X = g.smoothing_with_GCN(X)\n        return F.relu(X)\n```\n\n**构建一个简单的 HGNN 卷积层:**\n```python\nclass HGNNConv(nn.Module):\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n        self.theta = nn.Linear(in_channels, out_channels)\n\n    def forward(self, X: torch.Tensor, hg: dhg.Hypergraph) -> torch.Tensor:\n        # 应用可学习参数\n        X = self.theta(X)\n        # 使用超图的拉普拉斯矩阵进行平滑\n        X = hg.smoothing_with_HGNN(X)\n        return F.relu(X)\n```\n\n更多高级用法、完整模型示例及自动超参数调优 (`dhg.experiments`) 请参考 [官方文档](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002F)。","某生物制药公司的算法团队正在构建药物 - 靶点相互作用预测模型，需要处理包含多种分子基团复杂关联的高阶数据结构。\n\n### 没有 DeepHypergraph 时\n- 研究人员需手动编写复杂的矩阵运算代码来实现超图消息传递，极易在“顶点到超边”或“跨域传递”逻辑中引入难以排查的数学错误。\n- 面对药物分子中天然存在的多对多高阶关系，传统图神经网络库只能强行拆解为普通图，导致关键的群体协同特征严重丢失。\n- 
缺乏内置的超图可视化与标准评估指标，团队需花费数周时间自行开发辅助工具，严重拖慢了从实验验证到论文发表的周期。\n- 模型超参数调优完全依赖人工经验试错，难以在有限时间内找到最优配置，导致模型性能始终无法突破现有基准。\n\n### 使用 DeepHypergraph 后\n- 直接调用 DeepHypergraph 封装好的高阶消息传递接口，几行代码即可精准实现从顶点集到顶点集的复杂交互，彻底消除底层数学实现风险。\n- 原生支持超图结构建模，完整保留药物分子基团间的多元关联信息，显著提升了模型对复杂生化反应的表征能力。\n- 利用内置的可视化模块和常用评估指标，团队能即时生成结构图谱并量化性能，将实验迭代效率提升了数倍。\n- 借助集成 Optuna 的 `dhg.experiments` 模块自动执行超参数搜索，轻松复现并超越当前最先进（SOTA）模型的准确率。\n\nDeepHypergraph 通过标准化高阶图计算流程与自动化调优能力，让科研团队从繁琐的底层实现中解放，专注于挖掘数据背后的深层生物规律。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FiMoonLab_DeepHypergraph_5cdf49ae.png","iMoonLab","iMoon: Intelligent Media and Cognition Group","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002FiMoonLab_a7b2d59b.png","",null,"https:\u002F\u002Fmoon-lab.tech\u002F","https:\u002F\u002Fgithub.com\u002FiMoonLab",[80],{"name":81,"color":82,"percentage":83},"Python","#3572A5",100,847,85,"2026-04-13T15:15:50","Apache-2.0","未说明",{"notes":90,"python":91,"dependencies":92},"该工具基于 PyTorch 构建，支持图神经网络和超图神经网络。若需使用可视化功能（特别是 3D 绘图），必须安装 matplotlib 3.7.0 或更高版本。最新稳定版为 v0.9.5，已迁移至 UV 包管理器并采用 PEP 621 格式。",">=3.8",[93,94,95,96,97,98,99],"torch>=1.12.1,\u003C2.0","scipy>=1.8","matplotlib>=3.7.0","numpy","scikit-learn","optuna","requests",[14],[102,103,104,105],"deep-learning","graph-neural-networks","hypergraph-neural-networks","pytorch","2026-03-27T02:49:30.150509","2026-04-15T07:10:18.063876",[109,114,119,124,129,134],{"id":110,"question_zh":111,"answer_zh":112,"source_url":113},33883,"使用 pip install dhg 安装时出现 'sklearn' 包已弃用的错误怎么办？","该问题是由于依赖项中使用了已弃用的 'sklearn' 包名（应使用 'scikit-learn'）导致的。解决方案有两种：\n1. 安装最新的开发版本（已修复此问题）：运行命令 `pip install git+https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph.git`。\n2. 
等待官方发布 v0.9.6 或更高版本。\n此外，也可以尝试手动先安装所需的基础库（如 torch, matplotlib, numpy），然后再安装 dhg，或者尝试降低 Python 版本至 3.8 或 3.9。","https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues\u002F60",{"id":115,"question_zh":116,"answer_zh":117,"source_url":118},33884,"在超图监督学习中，如何对负样本超边进行采样？","由于超边可以连接超过两个节点，导致预测空间呈指数级增长，负样本采样较为困难。建议采取以下策略：\n1. 添加约束条件，例如只预测包含固定数量节点（如三个点）的超边。\n2. 采用破坏正样本的方法构造负样本：将正样本超边中的某一个节点替换为随机选择的其他节点，以此生成负样本。","https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues\u002F28",{"id":120,"question_zh":121,"answer_zh":122,"source_url":123},33885,"如果输入数据的节点标识不是连续的整数（例如字符串或不连续的 ID），该如何构建超图？","DeepHypergraph 提供了一些工具函数，可以将字符串 ('str') 或不连续的整数映射为有序且连续的整数范围。你可以使用 `dhg.utils.remap_edge_list` 函数来处理边列表或邻接表输入。该函数支持简单图和二分图结构，能自动完成节点类型的转换（如 'str' -> 'int'），以便顺利构建超图。","https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues\u002F15",{"id":125,"question_zh":126,"answer_zh":127,"source_url":128},33886,"DBLP 合作者网络数据集的来源和具体信息是什么？","该数据集是根据 DBLP API 爬取并经过筛选得到的。筛选条件包括特定的会议\u002F期刊 venue 以及发表年份（2018 年至 2022 年）。数据集包含 6498 位作者和 2603 篇论文。在 DeepHypergraph 库中，该数据集被归类为超图数据集，名称暂定为 DBLP8K。","https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues\u002F21",{"id":130,"question_zh":131,"answer_zh":132,"source_url":133},33887,"发现文档中存在描述错误（如将“二分图”误写为“有向图”），如何处理？","如果您发现文档中有此类笔误，可以在 GitHub Issues 中反馈。维护者通常会迅速核实并修正文档内容。例如，之前的“二分图”误写问题已在版本 0.9.2 的文档中得到修正。欢迎用户多提问题和批评指正，共同推动项目发展。","https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues\u002F18",{"id":135,"question_zh":136,"answer_zh":137,"source_url":138},33888,"计算得到的超图顶点度矩阵 D_v 与论文定义不符，是否是 Bug？","是的，这曾是一个已知问题。根据论文《Hypergraph Neural Networks》的定义，顶点度应为关联超边权重之和。该计算逻辑错误已在 DeepHypergraph v0.9.3 版本中修复。请确保您安装的版本不低于 0.9.3（可通过 `pip install dhg` 安装最新版），以获取正确的顶点度矩阵计算结果。","https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fissues\u002F25",[140,145,150,155,160],{"id":141,"version":142,"summary_zh":143,"released_at":144},263749,"v0.9.5","## 变更内容\n* 修复平均度数的 bug，由 
@HPUtx813 in https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fpull\u002F51\n* Refactor: migrate from Poetry to uv and update the pyproject.toml file, by @yifanfeng97 in https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fpull\u002F67\n* Fix visualization-related bugs, by @AuroraMaster in https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fpull\u002F65\n\n## New Contributors\n* @HPUtx813 made their first contribution in https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fpull\u002F51\n* @yifanfeng97 made their first contribution in https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fpull\u002F67\n* @AuroraMaster made their first contribution in https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fpull\u002F65\n\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcompare\u002Fv0.9.4...v0.9.5","2025-09-01T07:36:00",{"id":146,"version":147,"summary_zh":148,"released_at":149},263750,"v0.9.4","v0.9.4 is now released! In v0.9.4, we add 6 hypergraph datasets and fix some known bugs.\n\nNew datasets:\n- [DBLP-4k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.4\u002Fgenerated\u002Fdhg.data.DBLP4k.html) @yifanfeng97\n- [IMDB4k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.IMDB4k.html) @yifanfeng97\n- [Recipe100k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Recipe100k.html) @yifanfeng97\n- [Recipe200k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Recipe200k.html) @yifanfeng97\n- [Yelp3k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Yelp3k.html) @yifanfeng97\n- [Tencent2k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002Flatest\u002Fgenerated\u002Fdhg.data.Tencent2k.html) @yifanfeng97\n\nBug fixes:\n- [Fix bug of BN in the last layer](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002F27f944f6aa5c66da7f395124ecb72f62c186b53b) @yifanfeng97\n- [Fix 
device errors in graph, di_graph, bi_graph, and hypergraph Laplacian matrix computation](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002F834634904fed8e0055a6f3e11c616d4a35e7a5f5) @yifanfeng97\n\nNew features:\n- [Add custom hypergraph generator](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002F6278f0da6c34e6c40c6f92544c50f010afe757f0) @yifanfeng97","2024-01-31T05:01:37",{"id":151,"version":152,"summary_zh":153,"released_at":154},263751,"v0.9.3","v0.9.3 is now released! In v0.9.3, we add one hypergraph dataset, fix some known bugs, and add several hypergraph operations.\n\nNew datasets:\n- [DBLP-8k](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.3\u002Fgenerated\u002Fdhg.data.DBLP8k.html), contributed by @mgao97\n\nBug fixes:\n- [Fix hypergraph D_v bug (h[v, e] -> w[e]*h[v, e])](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002F7786e2d3a54951fb4bad93fae471bbe18656cfe4), @yifanfeng97\n- [Fix group_name and test bugs](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002Fbd8a8dd5c27f5493fe928737be272b1ae15eab99), @yifanfeng97\n- [Fix NaN in ndcg and recall](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002Fe124154fba503e885bd49dd8b6c6016a78fd9bbb), @yifanfeng97\n\nNew features:\n- [Add hypergraph vertex weights (v_weight and W_v)](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002Fe17fc95460472c6e8af074285a316d1983f7d463), @yifanfeng97\n\nOthers:\n- [Update DHCF](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002F7293527b266ecce3a8c60a26d35b042486d5d697)\n- [Constrain the PyTorch dependency to >=1.12.1 and \u003C2.0](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002Fa4c7e467000ec8fd5341f0d9d0e0d1447d420dd8), @Starrah\n- [Update sklearn to scikit-learn in the dependency list](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002Fe1eabc52859350dae0caa43e8433e52949c410d3), @Starrah\n- [Fix typo: directed graph -> 
bipartite graph](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDeepHypergraph\u002Fcommit\u002F7e559b9e821aba5d0c280bbecf44fb2b3d4a5e0a), @yifanfeng97","2022-12-28T07:07:39",{"id":156,"version":157,"summary_zh":158,"released_at":159},263752,"v0.9.2","v0.9.2 is now released! In v0.9.2, we add 21 datasets, 6 SOTA methods, and structure and feature visualization.\n\nNew datasets:\n\n- [BlogCatalog](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.BlogCatalog.html#dhg.data.BlogCatalog)\n- [Flickr](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.Flickr.html#dhg.data.Flickr)\n- [Github](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.Github.html#dhg.data.Github)\n- [Facebook](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.Facebook.html#dhg.data.Facebook)\n- [TencentBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.TencentBiGraph.html#dhg.data.TencentBiGraph)\n- [CoraBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CoraBiGraph.html#dhg.data.CoraBiGraph)\n- [PubmedBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.PubmedBiGraph.html#dhg.data.PubmedBiGraph)\n- [CiteseerBiGraph](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CiteseerBiGraph.html#dhg.data.CiteseerBiGraph)\n- [CoauthorshipCora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CoauthorshipCora.html#dhg.data.CoauthorshipCora)\n- [CoauthorshipDBLP](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CoauthorshipDBLP.html#dhg.data.CoauthorshipDBLP)\n- 
[CocitationCora](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CocitationCora.html#dhg.data.CocitationCora)\n- [CocitationCiteseer](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CocitationCiteseer.html#dhg.data.CocitationCiteseer)\n- [CocitationPubmed](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.CocitationPubmed.html#dhg.data.CocitationPubmed)\n- [YelpRestaurant](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.YelpRestaurant.html#dhg.data.YelpRestaurant)\n- [WalmartTrips](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.WalmartTrips.html#dhg.data.WalmartTrips)\n- [HouseCommittees](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.HouseCommittees.html#dhg.data.HouseCommittees)\n- [News20](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.data.News20.html#dhg.data.News20)\n\nNew SOTA methods:\n\n- [BGNN_Adv](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.models.BGNN_Adv.html#dhg.models.BGNN_Adv)\n- [BGNN_MLP](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.models.BGNN_MLP.html#dhg.models.BGNN_MLP)\n- [UniGCN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.models.UniGCN.html#dhg.models.UniGCN)\n- [UniGAT](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.models.UniGAT.html#dhg.models.UniGAT)\n- [UniSAGE](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.models.UniSAGE.html#dhg.models.UniSAGE)\n- 
[UniGIN](https:\u002F\u002Fdeephypergraph.readthedocs.io\u002Fen\u002F0.9.2\u002Fgenerated\u002Fdhg.models.UniGIN.html#dhg.models.UniGIN)","2022-09-24T12:30:50",{"id":161,"version":162,"summary_zh":163,"released_at":164},263753,"v0.9.1","The first release of DHG! The accompanying packages can be installed via pip.","2022-08-26T02:48:30"]
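The negative-sampling strategy described in the FAQ above (corrupt a positive hyperedge by replacing one of its vertices with a random other vertex) can be sketched in plain Python. This is a minimal illustration, not part of the dhg API: `corrupt_hyperedge`, the vertex count, and the toy hyperedges are all made-up names and data.

```python
import random

def corrupt_hyperedge(edge, num_vertices, rng):
    """Return a negative hyperedge built from a positive one: a single
    vertex of `edge` is replaced by a random vertex not already in it."""
    edge = list(edge)
    idx = rng.randrange(len(edge))                       # position to corrupt
    candidates = sorted(set(range(num_vertices)) - set(edge))
    edge[idx] = rng.choice(candidates)                   # swap in an outside vertex
    return tuple(sorted(edge))

# Toy data: two positive 3-vertex hyperedges over 6 vertices.
positives = {(0, 1, 2), (1, 3, 4)}
rng = random.Random(42)

negatives = []
while len(negatives) < 4:
    pos = rng.choice(sorted(positives))
    neg = corrupt_hyperedge(pos, num_vertices=6, rng=rng)
    if neg not in positives:                             # reject accidental positives
        negatives.append(neg)
```

Note that this samples with replacement, so duplicate negatives are possible; deduplicate if the training setup requires distinct samples. The rejection test also implements the FAQ's first recommendation implicitly: every negative has the same fixed size (three vertices) as its positive source.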