[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-SpaceLearner--Awesome-DynamicGraphLearning":3,"tool-SpaceLearner--Awesome-DynamicGraphLearning":64},[4,17,27,35,43,56],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":16},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,3,"2026-04-05T11:01:52",[13,14,15],"开发框架","图像","Agent","ready",{"id":18,"name":19,"github_repo":20,"description_zh":21,"stars":22,"difficulty_score":23,"last_commit_at":24,"category_tags":25,"status":16},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",138956,2,"2026-04-05T11:33:21",[13,15,26],"语言模型",{"id":28,"name":29,"github_repo":30,"description_zh":31,"stars":32,"difficulty_score":23,"last_commit_at":33,"category_tags":34,"status":16},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 
图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107662,"2026-04-03T11:11:01",[13,14,15],{"id":36,"name":37,"github_repo":38,"description_zh":39,"stars":40,"difficulty_score":23,"last_commit_at":41,"category_tags":42,"status":16},3704,"NextChat","ChatGPTNextWeb\u002FNextChat","NextChat 是一款轻量且极速的 AI 助手，旨在为用户提供流畅、跨平台的大模型交互体验。它完美解决了用户在多设备间切换时难以保持对话连续性，以及面对众多 AI 模型不知如何统一管理的痛点。无论是日常办公、学习辅助还是创意激发，NextChat 都能让用户随时随地通过网页、iOS、Android、Windows、MacOS 或 Linux 端无缝接入智能服务。\n\n这款工具非常适合普通用户、学生、职场人士以及需要私有化部署的企业团队使用。对于开发者而言，它也提供了便捷的自托管方案，支持一键部署到 Vercel 或 Zeabur 等平台。\n\nNextChat 的核心亮点在于其广泛的模型兼容性，原生支持 Claude、DeepSeek、GPT-4 及 Gemini Pro 等主流大模型，让用户在一个界面即可自由切换不同 AI 能力。此外，它还率先支持 MCP（Model Context Protocol）协议，增强了上下文处理能力。针对企业用户，NextChat 提供专业版解决方案，具备品牌定制、细粒度权限控制、内部知识库整合及安全审计等功能，满足公司对数据隐私和个性化管理的高标准要求。",87618,"2026-04-05T07:20:52",[13,26],{"id":44,"name":45,"github_repo":46,"description_zh":47,"stars":48,"difficulty_score":23,"last_commit_at":49,"category_tags":50,"status":16},2268,"ML-For-Beginners","microsoft\u002FML-For-Beginners","ML-For-Beginners 是由微软推出的一套系统化机器学习入门课程，旨在帮助零基础用户轻松掌握经典机器学习知识。这套课程将学习路径规划为 12 周，包含 26 节精炼课程和 52 道配套测验，内容涵盖从基础概念到实际应用的完整流程，有效解决了初学者面对庞大知识体系时无从下手、缺乏结构化指导的痛点。\n\n无论是希望转型的开发者、需要补充算法背景的研究人员，还是对人工智能充满好奇的普通爱好者，都能从中受益。课程不仅提供了清晰的理论讲解，还强调动手实践，让用户在循序渐进中建立扎实的技能基础。其独特的亮点在于强大的多语言支持，通过自动化机制提供了包括简体中文在内的 50 多种语言版本，极大地降低了全球不同背景用户的学习门槛。此外，项目采用开源协作模式，社区活跃且内容持续更新，确保学习者能获取前沿且准确的技术资讯。如果你正寻找一条清晰、友好且专业的机器学习入门之路，ML-For-Beginners 
将是理想的起点。",84991,"2026-04-05T10:45:23",[14,51,52,53,15,54,26,13,55],"数据工具","视频","插件","其他","音频",{"id":57,"name":58,"github_repo":59,"description_zh":60,"stars":61,"difficulty_score":10,"last_commit_at":62,"category_tags":63,"status":16},3128,"ragflow","infiniflow\u002Fragflow","RAGFlow 是一款领先的开源检索增强生成（RAG）引擎，旨在为大语言模型构建更精准、可靠的上下文层。它巧妙地将前沿的 RAG 技术与智能体（Agent）能力相结合，不仅支持从各类文档中高效提取知识，还能让模型基于这些知识进行逻辑推理和任务执行。\n\n在大模型应用中，幻觉问题和知识滞后是常见痛点。RAGFlow 通过深度解析复杂文档结构（如表格、图表及混合排版），显著提升了信息检索的准确度，从而有效减少模型“胡编乱造”的现象，确保回答既有据可依又具备时效性。其内置的智能体机制更进一步，使系统不仅能回答问题，还能自主规划步骤解决复杂问题。\n\n这款工具特别适合开发者、企业技术团队以及 AI 研究人员使用。无论是希望快速搭建私有知识库问答系统，还是致力于探索大模型在垂直领域落地的创新者，都能从中受益。RAGFlow 提供了可视化的工作流编排界面和灵活的 API 接口，既降低了非算法背景用户的上手门槛，也满足了专业开发者对系统深度定制的需求。作为基于 Apache 2.0 协议开源的项目，它正成为连接通用大模型与行业专有知识之间的重要桥梁。",77062,"2026-04-04T04:44:48",[15,14,13,26,54],{"id":65,"github_repo":66,"name":67,"description_en":68,"description_zh":69,"ai_summary_zh":69,"readme_en":70,"readme_zh":71,"quickstart_zh":72,"use_case_zh":73,"hero_image_url":74,"owner_login":75,"owner_name":76,"owner_avatar_url":77,"owner_bio":78,"owner_company":79,"owner_location":79,"owner_email":79,"owner_twitter":79,"owner_website":79,"owner_url":80,"languages":81,"stars":86,"forks":87,"last_commit_at":88,"license":79,"difficulty_score":89,"env_os":90,"env_gpu":91,"env_ram":91,"env_deps":92,"category_tags":95,"github_topics":96,"view_count":23,"oss_zip_url":79,"oss_zip_packed_at":79,"status":16,"created_at":104,"updated_at":105,"faqs":106,"releases":107},2761,"SpaceLearner\u002FAwesome-DynamicGraphLearning","Awesome-DynamicGraphLearning","Awesome papers about machine learning (deep learning) on dynamic (temporal) graphs (networks \u002F knowledge graphs).","Awesome-DynamicGraphLearning 是一个专注于动态图（或称时序图）机器学习领域的开源资源合集。它系统性地整理了关于动态网络、动态知识图谱及其在推荐系统等场景应用的前沿论文与代码实现。\n\n在现实世界中，许多数据关系（如社交互动、金融交易）随时间不断变化，传统静态图模型难以捕捉这种演化规律。Awesome-DynamicGraphLearning 正是为了解决这一痛点而生，它帮助研究者和开发者快速定位处理时空依赖性的最新算法，从早期的基础综述到 2025 
年结合大语言模型（LLM4DyG）、可学习时间编码等创新成果，涵盖了从理论基准到实际落地的完整链条。\n\n该资源特别适合人工智能领域的研究人员、算法工程师及高校师生使用。无论是希望入门动态图学习的新手，还是寻求最新技术突破的资深专家，都能从中高效获取经过筛选的高质量文献和复现代码。其独特亮点在于不仅收录了经典的图神经网络综述，更持续追踪如“长程传播”、“自适应邻域”等前沿技术方向，是探索动态图智能分析不可或缺的导航工具。","# Awesome-DynamicGraphLearning [![Awesome](https:\u002F\u002Fawesome.re\u002Fbadge.svg)](https:\u002F\u002Fawesome.re)\n\nAwesome papers (codes) about machine learning (deep learning) on dynamic (temporal) graphs (networks \u002F knowledge graphs) and their applications (i.e. Recommender Systems).\n\n## Survey\n\n* Deep learning for dynamic graphs: models and benchmarks (**TNNLS, 2024**) [[paper](https:\u002F\u002Fieeexplore.ieee.org\u002Fdocument\u002F10490120)][[code](https:\u002F\u002Fgithub.com\u002Fgravins\u002Fdynamic_graph_benchmark)]\n* Graph Neural Networks for temporal graphs: State of the art, open challenges, and opportunities (**ARXIV, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2302.01018.pdf)]\n* Graph Neural Networks Designed for Different Graph Types: A Survey (**ARXIV, 2022**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2204.03080.pdf)]\n* Representation Learning for Dynamic Graphs: A Survey (**JMLR, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1905.11485.pdf)]\n* A Survey on Embedding Dynamic Graphs (**ARXIV, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2101.01229v1.pdf)]\n* Relational Representation Learning for Dynamic (Knowledge) Graphs: A Survey (**ARXIV, 2019**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1905.11485v1.pdf)]\n* Nonlinearity + Networks: A 2020 Vision (**ARXIV, 2019**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1911.03805v1.pdf)]\n* Temporal networks (**Physics Report, 2012**) 
[[paper](https:\u002F\u002Fwww.sciencedirect.com\u002Fscience\u002Farticle\u002Fpii\u002FS0370157312000841)]\n\n## Papers\n\n### 2025\n\n* Rethinking Time Encoding via Learnable Transformation Functions (**ICML, 2025**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2505.00887)][[code](https:\u002F\u002Fgithub.com\u002Fchenxi1228\u002FLeTE)]\n* Dynamic Graph Transformer with Correlated Spatial-Temporal Positional Encoding (**WSDM, 2025**) [[paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F2407.16959)][[code](https:\u002F\u002Fgithub.com\u002Fwangz3066\u002FCorDGT)]\n\n### 2024\n\n* Long Range Propagation on Continuous-Time Dynamic Graphs (**ICML, 2024**) [[paper](https:\u002F\u002Fproceedings.mlr.press\u002Fv235\u002Fgravina24a.html)][[code](https:\u002F\u002Fgithub.com\u002Fgravins\u002Fnon-dissipative-propagation-CTDGs)]\n* LLM4DyG: Can Large Language Models Solve Spatial-Temporal Problems on Dynamic Graphs? (**SIGKDD, 2024**) [[paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F2310.17110)][[code](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002FLLM4DyG)]\n* Towards Adaptive Neighborhood for Advancing Temporal Interaction Graph Modeling (**SIGKDD, 2024**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2406.11891)]\n* SLADE: Detecting Dynamic Anomalies in Edge Streams without Labels via Self-Supervised Learning (**SIGKDD, 2024**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2402.11933)]\n* Predicting Long-term Dynamics of Complex Networks via Identifying Skeleton in Hyperbolic Space (**SIGKDD, 2024**) [[code](https:\u002F\u002Fgithub.com\u002Ftsinghua-fib-lab\u002FDiskNet)]\n* Latent Conditional Diffusion-based Data Augmentation for Continuous-Time Dynamic Graph Model (**SIGKDD, 2024**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2407.08500)][[code]()]\n* MemMap: An Adaptive and Latent Memory Structure for Dynamic Graph Learning (**SIGKDD, 2024**)\n* TASER: Temporal Adaptive Sampling for Fast and Accurate Dynamic Graph Representation Learning (**IPDPS, 2024**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.05396)][[code](https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Ftaser-tgnn)]\n* Mayfly: a Neural Data Structure for Graph Stream Summarization (**ICLR, 2024, Spotlight**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=n7Sr8SW4bn&name=pdf)]\n* Causality-Inspired Spatial-Temporal Explanations for Dynamic Graph Neural Networks (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=AJBkfwXh3u&name=pdf)][[code](https:\u002F\u002Fgithub.com\u002Fkesenzhao\u002FDyGNNExplainer)]\n* FreeDyG: Frequency Enhanced Continuous-Time Dynamic Graph Model for Link Prediction (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=82Mc5ilInM&name=pdf)][[code](https:\u002F\u002Fgithub.com\u002FTianxzzz\u002FFreeDyG)]\n* PRES: Toward Scalable Memory-Based Dynamic Graph Neural Networks (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=gjXor87Xfy&name=pdf)][[code](https:\u002F\u002Fgithub.com\u002Fjwsu825\u002FMDGNN_BS)]\n* Hypergraph Dynamic System (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=NLbRvr840Q&name=pdf)]\n* Deep Temporal Graph Clustering (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=ViNe1fjGME&name=pdf)][[code](https:\u002F\u002Fgithub.com\u002FMGitHubL\u002FDeep-Temporal-Graph-Clustering)]\n* GraphPulse: Topological representations for temporal graph property prediction (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=DZqic2sPTY&name=pdf)][[code](https:\u002F\u002Fgithub.com\u002Fkiarashamsi\u002FGraphPulse)]\n* Beyond Spatio-Temporal Representations: Evolving Fourier Transform for Temporal Graphs (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=uvFhCUPjtI&name=pdf)][[code](https:\u002F\u002Fgithub.com\u002Fansonb\u002FEFT)]\n* HOPE: High-order Graph ODE 
For Modeling Interacting Dynamics (**ICML, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=9iChKP4k32&name=pdf)]\n* Temporal Generalization Estimation in Evolving Graphs (**ICLR, 2024, Poster**) [[paper](https:\u002F\u002Fopenreview.net\u002Fattachment?id=HFtrXBfNru&name=pdf)]\n* Dynamic Graph Information Bottleneck (**WWW, 2024**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2402.06716.pdf)][[code](https:\u002F\u002Fgithub.com\u002FRingBDStack\u002FDGIB)]\n* On the Feasibility of Simple Transformer for Dynamic Graph Modeling (**WWW, 2024**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2401.14009.pdf)]\n* Temporal Conformity-aware Hawkes Graph Network for Recommendations (**WWW, 2024**) \n* IME: Integrating Multi-curvature Shared and Specific Embedding for Temporal Knowledge Graph Completion (**WWW, 2024**)\n* TATKC: A Temporal Graph Neural Network for Fast Approximate Temporal Katz Centrality Ranking (**WWW, 2024**)\n* Efficient exact and approximate betweenness centrality computation for temporal graphs (**WWW, 2024**)\n* Temporal Graph ODEs for Irregularly-Sampled Time Series (**IJCAI, 2024**) [[paper](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2024\u002F445)][[code](https:\u002F\u002Fgithub.com\u002Fgravins\u002FTG-ODE)]\n* Large Language Models-guided Dynamic Adaptation for Temporal Knowledge Graph Reasoning (**Neurips 2024 Submission**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2405.14170)][[code](https:\u002F\u002Fanonymous.4open.science\u002Fr\u002FLLM-DA-1E6D)]\n* Anomaly Detection in Continuous-Time Temporal Provenance Graphs (**Temporal Graph Learning Workshop @ NeurIPS, 2023**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=88tGIxxhsfn)][[code](https:\u002F\u002Fgithub.com\u002FJakubReha\u002FProvCTDG)]\n\n### 2023\n\n* Spectral Invariant Learning for Dynamic Graphs under Distribution Shifts (**Neurips, 2023**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F2403.05026)][[code](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002Fsild)]\n* DistTGL: Distributed Memory-Based Temporal Graph Neural Network Training (**SC, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fabs\u002F2307.07649)][[code](https:\u002F\u002Fgithub.com\u002Famazon-science\u002Fdisttgl)]\n* Towards Better Dynamic Graph Learning: New Architecture and Unified Library (**ARXIV, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2303.13047.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fyule-BUAA\u002FDyGLib)]\n* SUREL+: Moving from Walks to Sets for Scalable Subgraph-based Graph Representation Learning (**ARXIV, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2303.03379.pdf)][[code](https:\u002F\u002Fgithub.com\u002FGraph-COM\u002FSUREL_Plus)]\n* Towards Open Temporal Graph Neural Networks (**ICLR, 2023**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=N9Pk5iSCzAn)][[code](https:\u002F\u002Fgithub.com\u002Ftulerfeng\u002FOTGNet)]\n* Do We Really Need Complicated Model Architectures For Temporal Networks? 
(**ICLR, 2023**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=ayPPc0SyLv1)][[code](https:\u002F\u002Fgithub.com\u002FCongWeilin\u002FGraphMixer)]\n* Zebra: When Temporal Graph Neural Networks Meet Temporal Personalized PageRank (**VLDB, 2023**) [[paper](https:\u002F\u002Fwww.vldb.org\u002Fpvldb\u002Fvol16\u002Fp1332-li.pdf)][[code](https:\u002F\u002Fgithub.com\u002FLuckyLYM\u002FZebra)]\n* Temporal SIR-GN: Efficient and Effective Structural Representation Learning for Temporal Graphs (**VLDB, 2023**) [[paper](https:\u002F\u002Fwww.vldb.org\u002Fpvldb\u002Fvol16\u002Fp2075-layne.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fjanetlayne2\u002FTemporal-SIR-GN)]\n* SEIGN: A Simple and Efficient Graph Neural Network for Large Dynamic Graphs (**ICDE, 2023**) [[paper](https:\u002F\u002Fieeexplore.ieee.org\u002Fabstract\u002Fdocument\u002F10184567)]\n* A Higher-Order Temporal H-Index for Evolving Networks (**KDD, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2305.16001.pdf)]\n* Using Motif Transitions for Temporal Graph Generation (**KDD, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2306.11190.pdf)]\n* Temporal Dynamics Aware Adversarial Attacks on Discrete-Time Graph Models (**KDD, 2023**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=yUY15QBERj)][[code](https:\u002F\u002Fgithub.com\u002FerdemUB\u002FKDD23-MTM)]\n* Fairness-Aware Continuous Predictions of Multiple Analytics Targets in Dynamic Networks (**KDD, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2209.01678.pdf)]\n* DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph (**KDD, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2210.10592.pdf)]\n* WinGNN: Dynamic Graph Neural Networks with Random Gradient Aggregation Window (**KDD, 2023**) \n* Community-based Dynamic Graph Learning for Popularity Prediction (**KDD, 2023**)\n* An Attentional Multi-scale Co-evolving Model for Dynamic Link Prediction (**WWW, 2023**) 
[[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3543507.3583396)][[code](https:\u002F\u002Fgithub.com\u002Ftsinghua-fib-lab\u002FAMCNet)]\n* TIGER: Temporal Interaction Graph Embedding with Restarts (**WWW, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2302.06057.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fyzhang1918\u002Fwww2023tiger)]\n* HGWaveNet: A Hyperbolic Graph Neural Network for Temporal Link Prediction (**WWW, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2304.07302.pdf)][[code](https:\u002F\u002Fgithub.com\u002FTaiLvYuanLiang\u002FHGWaveNet)]\n* Expressive and Efficient Representation Learning for Ranking Links in Temporal Graphs (**WWW, 2023**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3543507.3583476)][[code](https:\u002F\u002Fgithub.com\u002Fsusheels\u002Ftgrank)]\n* Local Edge Dynamics and Opinion Polarization (**WSDM, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2111.14020.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fadamlechowicz\u002Fopinion-polarization\u002F)]\n* Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs (**WSDM, 2023**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2211.08568.pdf)][[code](https:\u002F\u002Fgithub.com\u002FRManLuo\u002FGSNOP)]\n* Interpretable Research Interest Shift Detection with Temporal Heterogeneous Graphs (**WSDM, 2023**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3539597.3570453)]\n* Dynamic Heterogeneous Graph Attention Neural Architecture Search (**AAAI, 2023**) [[paper](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F26338)][[code](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002FDHGAS)]\n* Scaling Up Dynamic Graph Representation Learning via Spiking Neural Networks (**AAAI, 2023**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2208.10364.pdf)][[code](https:\u002F\u002Fgithub.com\u002FEdisonLeeeee\u002FSpikeNet)]\n* Hidden Markov Models for Temporal Graph Representation Learning (**ESANN, 2023**) [[paper](https:\u002F\u002Fwww.esann.org\u002Fsites\u002Fdefault\u002Ffiles\u002Fproceedings\u002F2023\u002FES2023-35.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fnec-research\u002Fhidden_markov_model_temporal_graphs)]\n\n### 2022\n\n* TGL: A General Framework for Temporal GNN Training on Billion-Scale Graphs (**VLDB, 2022**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2203.14883.pdf)][[code](https:\u002F\u002Fgithub.com\u002Famazon-science\u002Ftgl)]\n* Neural Temporal Walks: Motif-Aware Representation Learning on Continuous-Time Dynamic Graphs (**Neurips, 2022**) [[paper](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Ffile\u002F7dadc855cef7494d5d956a8d28add871-Paper-Conference.pdf)][[code](https:\u002F\u002Fgithub.com\u002FKimMeen\u002FNeural-Temporal-Walks)]\n* Dynamic Graph Neural Networks Under Spatio-Temporal Distribution Shift (**Neurips, 2022**) [[paper](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F2857242c9e97de339ce642e75b15ff24-Abstract-Conference.html)][[code](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002FDIDA)]\n* Adaptive Data Augmentation on Temporal Graphs (**Neurips, 2022**) [[paper](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2021\u002Ffile\u002F0b0b0994d12ad343511adfbfc364256e-Paper.pdf)]\n* Parameter-free Dynamic Graph Embedding for Link Prediction (**Neurips, 2022**) [[paper](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Ffile\u002Fb14d7175755b180dc2163e15e3110cb6-Paper-Conference.pdf)][[code](https:\u002F\u002Fgithub.com\u002FFudanCISL\u002FFreeGEM)]\n* Instant Graph Neural Networks for Dynamic Graphs (**KDD, 2022**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2206.01379.pdf)][[code]()]\n* Disentangled Dynamic Heterogeneous Graph Learning for Opioid Overdose Prediction (**KDD, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3534678.3539279)][[code]()]\n* ROLAND: Graph Learning Framework for Dynamic Graphs (**KDD, 2022**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2208.07239.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fsnap-stanford\u002Froland)]\n* Subset Node Anomaly Tracking over Large Dynamic Graphs (**KDD, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3534678.3539389)][[code](https:\u002F\u002Fgithub.com\u002Fzjlxgxz\u002FDynAnom)]\n* Streaming Graph Neural Networks via Generative Replay (**KDD, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3534678.3539336)][[code](https:\u002F\u002Fgithub.com\u002FJunshan-Wang\u002FSGNN-GR)]\n* Neighborhood-aware Scalable Temporal Network Representation Learning (**LoG, 2022**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=EPUtNe7a9ta)][[code](https:\u002F\u002Fgithub.com\u002FGraph-COM\u002FNeighborhood-Aware-Temporal-Network)]\n* DisenCTR: Dynamic Graph-based Disentangled Representation for Click-Through Rate Prediction (**SIGIR, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3477495.3531851)][[code](https:\u002F\u002Fgithub.com\u002FFang6ang\u002FDisenCTR)]\n* STAM: A Spatiotemporal Aggregation Method for Graph Neural Network-based Recommendation (**WWW, 2022**) [[paper](https:\u002F\u002Fkeg.cs.tsinghua.edu.cn\u002Fjietang\u002Fpublications\u002FWWW22-Yang%20et%20al.-STAM-GNN.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fzyang-16\u002FSTAM)]\n* Neural Predicting Higher-order Patterns in Temporal Networks (**WWW, 2022**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2106.06039.pdf)][[code](https:\u002F\u002Fgithub.com\u002FGraph-COM\u002FNeural_Higher-order_Pattern_Prediction)]\n* TREND: TempoRal Event and Node Dynamics for Graph Representation Learning (**WWW, 2022**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2203.14303.pdf)][[code](https:\u002F\u002Fgithub.com\u002FWenZhihao666\u002FTREND)]\n* A Viral Marketing-Based Model For Opinion Dynamics in Online Social Networks (**WWW, 2022**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2202.03573.pdf)]\n* EvoKG: Jointly Modeling Event Time and Network Structure for Reasoning over Temporal Knowledge Graphs (**WSDM, 2022**) [[paper](http:\u002F\u002Fkeg.cs.tsinghua.edu.cn\u002Fyuxiao\u002Fpapers\u002FWSDM22-park-evokg.pdf)][[code](https:\u002F\u002Fgithub.com\u002FNamyongPark\u002FEvoKG)]\n* Finding a Concise, Precise, and Exhaustive Set of Near Bi-Cliques in Dynamic Graphs (**WSDM, 2022**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2110.14875.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fhyeonjeong1\u002Fcutnpeel)]\n* Few-shot Link Prediction in Dynamic Networks (**WSDM, 2022**) [[paper](http:\u002F\u002Fwww.shichuan.org\u002Fdoc\u002F120.pdf)]\n* On Generalizing Static Node Embedding to Dynamic Settings (**WSDM, 2022**) [[paper](https:\u002F\u002Fgemslab.github.io\u002Fpapers\u002Fdijin-2021-trg.pdf)]\n* Along the Time: Timeline-traced Embedding for Temporal Knowledge Graph Completion (**CIKM, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3511808.3557233)][[code](https:\u002F\u002Fgithub.com\u002Fzhangfw123\u002FTLT-KGE)]\n* DA-Net: Distributed Attention Network for Temporal Knowledge Graph Reasoning (**CIKM, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3511808.3557280)]\n* A Self-supervised Riemannian GNN with Time Varying Curvature for Temporal Graph Learning (**CIKM, 2022**) 
[[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2208.14073.pdf)]\n* Dynamic Hypergraph Learning for Collaborative Filtering (**CIKM, 2022**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3511808.3557301)]\n\n### 2021\n\n* Inductive Representation Learning in Temporal Networks via Causal Anonymous Walks (**ICLR, 2021**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=KYPz4YsCPj)][[code](https:\u002F\u002Fgithub.com\u002Fsnap-stanford\u002FCAW)]\n* Coupled Graph ODE for Learning Interacting System Dynamics (**KDD, 2021**) [[paper](http:\u002F\u002Fweb.cs.ucla.edu\u002F~yzsun\u002Fpapers\u002F2021_KDD_CG_ODE.pdf)][[code](https:\u002F\u002Fgithub.com\u002FZijieH\u002FCG-ODE)]\n* Subset Node Representation Learning over Large Dynamic Graphs (**KDD, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2106.01570.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fzjlxgxz\u002FDynamicPPE)]\n* Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning in Hyperbolic Space (**KDD, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2107.03767.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fmarlin-codes\u002FHTGN-KDD21)]\n* Learning to Walk across Time for Temporal Knowledge Graph Completion (**KDD, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2012.10595v1.pdf)]\n* Forecasting Interaction Order on Temporal Graphs (**KDD, 2021**) \n* Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning (**SIGIR, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2104.10353.pdf)][[code](https:\u002F\u002Fgithub.com\u002FLee-zix\u002FRE-GCN)]\n* Inductive Representation Learning in Temporal Networks via Mining Neighborhood and Community Influences (**SIGIR, 2021**)\n* TIE: A Framework for Embedding-based Incremental Temporal Knowledge Graph Completion (**SIGIR, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2104.08419.pdf)]\n* SDG: A Simplified and Dynamic Graph Neural Network (**SIGIR SHORT, 2021**) 
[[paper](https:\u002F\u002Fgithub.com\u002FDongqiFu\u002FSDG\u002Fblob\u002Fmain\u002Fpaper\u002FSDG_A%20Simplified%20and%20Dynamic%20Graph%20Neural%20Network.pdf)][[code](https:\u002F\u002Fgithub.com\u002FDongqiFu\u002FSDG)]\n* Temporal Augmented Graph Neural Networks for Session-Based Recommendations (**SIGIR SHORT, 2021**) [[paper](https:\u002F\u002Fwww4.comp.polyu.edu.hk\u002F~xiaohuang\u002Fdocs\u002FHuachi_sigir2021.pdf)]\n* HINTS: Citation Time Series Prediction for New Publications via Dynamic Heterogeneous Information Network Embedding (**WWW, 2021**) [[paper](http:\u002F\u002Fweb.cs.ucla.edu\u002F~yzsun\u002Fpapers\u002F2021_WWW_HINTS.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fsongjiang0909\u002FHINTS_code)]\n* TEDIC: Neural Modeling of Behavioral Patterns in Dynamic Social Interaction Networks (**WWW, 2021**) [[paper](http:\u002F\u002Fsnap.stanford.edu\u002Ftedic\u002Ffiles\u002Fwww21_tedic.pdf)]\n* Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs (**AAAI, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2104.02228.pdf)]\n* Interpretable Clustering on Dynamic Graphs with Recurrent Graph Neural Networks (**AAAI, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2012.08740.pdf)][[code](https:\u002F\u002Fgithub.com\u002FInterpretableClustering\u002FInterpretableClustering)]\n* Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay (**AAAI, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2003.09908.pdf)]\n* Learning and Updating Node Embedding on Dynamic Heterogeneous Information Network (**WSDM, 2021**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3437963.3441745)]\n* F-FADE: Frequency Factorization for Anomaly Detection in Edge Streams (**WSDM, 2021**) [[paper](https:\u002F\u002Fcs.stanford.edu\u002Fpeople\u002Fjure\u002Fpubs\u002Fffade-wsdm21.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fsnap-stanford\u002FF-FADE)]\n* Cache-based GNN System 
for Dynamic Graphs (**CIKM, 2021**) [[paper]]\n* Self-supervised Representation Learning on Dynamic Graphs (**CIKM, 2021**) [[paper]]\n* Continuous-Time Sequential Recommendation with Temporal Graph Collaborative Transformer (**CIKM, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2108.06625.pdf)][[code](https:\u002F\u002Fgithub.com\u002FDyGRec\u002FTGSRec)]\n* Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs (**CIKM, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2005.07427.pdf)]\n* Neural Higher-order Pattern (Motif) Prediction in Temporal Networks (**ARXIV, 2021**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2106.06039.pdf)]\n\n### 2020\n\n* Inductive Representation Learning on Temporal Graphs (**ICLR, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2002.07962.pdf)][[code](https:\u002F\u002Fgithub.com\u002FStatsDLMathsRecomSys\u002FInductive-representation-learning-on-temporal-graphs)]\n* Temporal Graph Networks for Deep Learning on Dynamic Graphs (**ICML Workshop, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2006.10637v1.pdf)][[code](https:\u002F\u002Fgithub.com\u002Ftwitter-research\u002Ftgn)]\n* A Data-Driven Graph Generative Model for Temporal Interaction Networks (**KDD, 2020**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3394486.3403082)][[code](https:\u002F\u002Fgithub.com\u002Fdavidchouzdw\u002FTagGen)]\n* Dynamic Knowledge Graph based Multi-Event Forecasting (**KDD, 2020**) [[paper](https:\u002F\u002Fyue-ning.github.io\u002Fdocs\u002FKDD20-glean.pdf)][[code](https:\u002F\u002Fgithub.com\u002Famy-deng\u002Fglean)]\n* Laplacian Change Point Detection for Dynamic Graphs (**KDD, 2020**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3394486.3403077)][[code](https:\u002F\u002Fgithub.com\u002FshenyangHuang\u002FLAD)]\n* Algorithmic Aspects of Temporal Betweenness (**KDD, 2020**) 
[[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3394486.3403259)][[code](https:\u002F\u002Ffpt.akt.tu-berlin.de\u002Fsoftware\u002Ftemporal_betweenness\u002F)]\n* Heterogeneous Graph Transformer (**WWW, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2003.01332.pdf)][[code](https:\u002F\u002Fgithub.com\u002Facbull\u002FpyHGT)]\n* Streaming Graph Neural Network (**SIGIR, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1810.10627.pdf)][[code](https:\u002F\u002Fgithub.com\u002Falge24\u002FDyGNN)]\n* Next-item Recommendation with Sequential Hypergraphs (**SIGIR, 2020**) [[paper](http:\u002F\u002Fwww.public.asu.edu\u002F~kding9\u002Fpdf\u002FSIGIR2020_HyperRec.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fwangjlgz\u002FHyperRec)]\n* Temporal Network Embedding with High-Order Nonlinear Information (**AAAI, 2020**) [[paper](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F5993)]\n* Motif-Preserving Temporal Network Embedding (**IJCAI, 2020**) [[paper](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2020\u002F0172.pdf)]\n* Dynamic Graph Collaborative Filtering (**ICDM, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2101.02844.pdf)][[code](https:\u002F\u002Fgithub.com\u002FCRIPAC-DIG\u002FDGCF)]\n* DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks (**WSDM, 2020**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3336191.3371845)][[code](https:\u002F\u002Fgithub.com\u002Faravindsankar28\u002FDySAT)]\n* Continuous-Time Dynamic Graph Learning via Neural Interaction Processes (**CIKM, 2020**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3340531.3411946)]\n* tdGraphEmbed: Temporal Dynamic 
Graph-Level Embedding (**CIKM, 2020**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3340531.3411953)][[code](https:\u002F\u002Fgithub.com\u002Fmoranbel\u002FtdGraphEmbed)]\n* Streaming Graph Neural Network via Continual Learning (**CIKM, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2009.10951.pdf)][[code](https:\u002F\u002Fgithub.com\u002FJunshan-Wang\u002FContinualGNN)]\n* Disentangle-based Continual Graph Representation Learning (**EMNLP, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2010.02565.pdf)][[code](https:\u002F\u002Fgithub.com\u002FKXY-PUBLIC\u002FDiCGRL)]\n* TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion (**EMNLP, 2020**) [[paper](https:\u002F\u002Faclanthology.org\u002F2020.emnlp-main.462.pdf)][[code](https:\u002F\u002Fgithub.com\u002FJiapengWu\u002FTeMP)]\n* Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs (**EMNLP, 2020**) [[paper](https:\u002F\u002Faclanthology.org\u002F2020.emnlp-main.541.pdf)][[code](https:\u002F\u002Fgithub.com\u002FINK-USC\u002FRE-Net)]\n* EPNE: Evolutionary Pattern Preserving Network Embedding (**ECAI, 2020**) [[paper](http:\u002F\u002Fecai2020.eu\u002Fpapers\u002F528_paper.pdf)]\n* GloDyNE: Global Topology Preserving Dynamic Network Embedding (**TKDE, 2020**) [[paper](https:\u002F\u002Fieeexplore.ieee.org\u002Fstamp\u002Fstamp.jsp?tp=&arnumber=9302718)][[code](https:\u002F\u002Fgithub.com\u002Fhouchengbin\u002FGloDyNE)]\n* Dynamic Heterogeneous Information Network Embedding with Meta-path based Proximity (**TKDE, 2020**) [[paper](https:\u002F\u002Fyuanfulu.github.io\u002Fpublication\u002FTKDE-DyHNE.pdf)][[code](https:\u002F\u002Fgithub.com\u002Frootlu\u002FDyHNE)]\n* Lifelong Graph Learning (**ARXIV, 2020**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2009.00647.pdf)]\n\n\n\n### 2019\n\n* Variational Graph Recurrent Neural Networks (**NeurIPS, 2019**) 
[[paper](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2019\u002Ffile\u002Fa6b8deb7798e7532ade2a8934477d3ce-Paper.pdf)][[code](https:\u002F\u002Fgithub.com\u002FVGraphRNN\u002FVGRNN)]\n* Recurrent Space-time Graph Neural Networks (**NeurIPS, 2019**) [[paper](http:\u002F\u002Fexport.arxiv.org\u002Fpdf\u002F1904.05582#:~:text=Our%20recurrent%20neural%20graph%20ef%EF%AC%81ciently%20processes%20information%20in,in%20space-time%20using%20a%20backbone%20deep%20neural%20network.)][[code](https:\u002F\u002Fgithub.com\u002FIuliaDuta\u002FRSTG)]\n* DyRep: Learning Representations over Dynamic Graphs (**ICLR, 2019**) [[paper](https:\u002F\u002Fopenreview.net\u002Fpdf?id=HyePrhR5KX)]\n* Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks (**KDD, 2019**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1908.01207.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fsrijankr\u002Fjodie)]\n* Learning Dynamic Context Graphs for Predicting Social Events (**KDD, 2019**) [[paper](https:\u002F\u002Fyue-ning.github.io\u002Fdocs\u002FKDD19-dengA.pdf)][[code](https:\u002F\u002Fgithub.com\u002Famy-deng\u002FDynamicGCN)]\n* EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs (**AAAI, 2019**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1902.10191.pdf)][[code](https:\u002F\u002Fgithub.com\u002FIBM\u002FEvolveGCN)]\n* Hierarchical Temporal Convolutional Networks for Dynamic Recommender Systems (**WWW, 2019**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1904.04381.pdf)]\n* Real-Time Streaming Graph Embedding Through Local Actions (**WWW, 2019**) [[paper](https:\u002F\u002Fnickduffield.net\u002Fdownload\u002Fpapers\u002FDL4G-SDE-2019.pdf)]\n* Dynamic Hypergraph Neural Networks (**IJCAI, 2019**) 
[[paper](https:\u002F\u002Fwww.ijcai.org\u002FProceedings\u002F2019\u002F0366.pdf)][[code](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDHGNN#:~:text=%20DHGNN%3A%20Dynamic%20Hypergraph%20Neural%20Networks%20%201,%28Zhilin%20Yang%2C%20William%20W.%20-%20Cohen%2C...%20More%20)]\n* Node Embedding over Temporal Graphs (**IJCAI, 2019**) [[paper](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2019\u002F0640.pdf)][[code](https:\u002F\u002Fgithub.com\u002Furielsinger\u002FtNodeEmbed#:~:text=Node%20Embedding%20over%20Temporal%20Graphs.%20Uriel%20Singer%2C%20Ido,for%20nodes%20in%20any%20%28un%29directed%2C%20%28un%29weighted%20temporal%20graph.)]\n* Temporal Network Embedding with Micro- and Macro-dynamics (**CIKM, 2019**) [[paper](https:\u002F\u002Fpar.nsf.gov\u002Fservlets\u002Fpurl\u002F10148548)][[code](https:\u002F\u002Fgithub.com\u002Frootlu\u002FMMDNE)]\n\n\n### 2018\n\n* NetWalk: A Flexible Deep Embedding Approach for Anomaly Detection in Dynamic Networks (**KDD, 2018**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3219819.3220024)][[code](https:\u002F\u002Fgithub.com\u002Fkdmsit\u002FNetWalk)]\n* Embedding Temporal Network via Neighborhood Formation (**KDD, 2018**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3219819.3220054)][[code]()]\n* Dynamic Network Embedding by Modeling Triadic Closure Process (**AAAI, 2018**) [[paper](http:\u002F\u002Fyangy.org\u002Fworks\u002Fdynamictriad\u002Fdynamic_triad.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fluckiezhou\u002FDynamicTriad)]\n* Continuous-Time Dynamic Network Embeddings (**WWW, 2018**) [[paper](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3184558.3191526)][[code](https:\u002F\u002Fgithub.com\u002FShubhranshu-Shekhar\u002Fctdne)]\n* Dynamic Network Embedding : An Extended Approach for Skip-gram based Network Embedding (**IJCAI, 2018**) [[paper](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2018\u002F0288.pdf)]\n\n### 
2017\n\n* Know-Evolve: Deep Temporal Reasoning for Dynamic Knowledge Graphs (**ICML, 2017**) [[paper](http:\u002F\u002Fproceedings.mlr.press\u002Fv70\u002Ftrivedi17a\u002Ftrivedi17a.pdf)][[code](https:\u002F\u002Fgithub.com\u002Frstriv\u002FKnow-Evolve)]\n* The Co-Evolution Model for Social Network Evolving and Opinion Migration (**KDD, 2017**) [[paper](http:\u002F\u002Fweb.cs.ucla.edu\u002F~yzsun\u002Fpapers\u002F2017_kdd_coevolution.pdf)][[code]()]\n* Attributed Network Embedding for Learning in a Dynamic Environment (**CIKM, 2017**) [[paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1706.01860.pdf)][[code](https:\u002F\u002Fgithub.com\u002Fgaoghc\u002FDANE)]\n\n## Tools\n\n### General Graph Learning\n* [Deep Graph Library](https:\u002F\u002Fwww.dgl.ai\u002F)\n* [Pytorch Geometric](https:\u002F\u002Fpytorch-geometric.readthedocs.io\u002Fen\u002Flatest\u002F)\n* [Pytorch Geometric Temporal](https:\u002F\u002Fpytorch-geometric-temporal.readthedocs.io\u002Fen\u002Flatest\u002Fnotes\u002Fintroduction.html)\n* [Stellar Graph](https:\u002F\u002Fstellargraph.readthedocs.io\u002Fen\u002Fstable\u002F)\n* [GraphVite](https:\u002F\u002Fgraphvite.io\u002F)\n\n### Knowledge Graph\n* [DGL-KE](https:\u002F\u002Fdglke.dgl.ai\u002Fdoc\u002F)\n* [OpenKE](https:\u002F\u002Fgithub.com\u002Fthunlp\u002FOpenKE)\n\n### Recommender System\n* [RecBole](https:\u002F\u002Fwww.recbole.io\u002F)\n\n","# 令人惊叹的动态图学习 [![Awesome](https:\u002F\u002Fawesome.re\u002Fbadge.svg)](https:\u002F\u002Fawesome.re)\n\n关于动态（时序）图（网络\u002F知识图谱）及其应用（即推荐系统）的机器学习（深度学习）优秀论文（代码）集合。\n\n## 综述\n\n* 动态图上的深度学习：模型与基准测试（**TNNLS, 2024**）[[论文](https:\u002F\u002Fieeexplore.ieee.org\u002Fdocument\u002F10490120)][[代码](https:\u002F\u002Fgithub.com\u002Fgravins\u002Fdynamic_graph_benchmark)]\n* 时序图上的图神经网络：现状、开放挑战与机遇（**ARXIV, 2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2302.01018.pdf)]\n* 针对不同图类型的图神经网络综述（**ARXIV, 2022**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2204.03080.pdf)]\n* 动态图表示学习综述（**JMLR, 
2020**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1905.11485.pdf)]\n* 动态图嵌入综述（**ARXIV, 2021**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2101.01229v1.pdf)]\n* 动态（知识）图上的关系表示学习综述（**ARXIV, 2019**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1905.11485v1.pdf)]\n* 非线性 + 网络：2020年的愿景（**ARXIV, 2019**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1911.03805v1.pdf)]\n* 时序网络（**Physics Report, 2012**）[[论文](https:\u002F\u002Fpdf.sciencedirectassets.com\u002F271542\u002F1-s2.0-S0370157312X00309\u002F1-s2.0-S0370157312000841\u002Fmain.pdf?X-Amz-Security-Token=IQoJb3JpZ2luX2VjEIL%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaCXVzLWVhc3QtMSJHMEUCIQCKOucVujQP07wkAKxMuMLMSsHqFv%2FFb%2BYWtNrMjUENTwIgWQ3afs%2FTBaPhDmEiZPlrVllfwmAmEN9OMs0ReONp0PIq%2BgMIOxAEGgwwNTkwMDM1NDY4NjUiDHz5nNBj0grdhs2zQirXA6ElLZy%2FDlDSDU86oA74varszz0ma8Dbz0g92LEczi64XvbCAHQmPQIiskdJpYzeQQCoHHQwirdve6OhcF6pTJwGbJ2lL84oSrywuWhi6Z0e9kdtLUw2deUEHp2La7FUeebH%2FnHaHV3BpYfl2%2BXA0Y1zI67VWbtXv6MALP6e9THRpRmS6omIAgiB9u6bOm3NDQ4hC7Cp%2F22gUvRvSOm14Y%2F9s2kk7QcqRxTMDTW94Dbtty2O8Pw54CJulxcOo7Nby7%2FXrarewlMgFBxCwhNteoXaVviFrgl91rtQTq5EnU9HEBntgE0r8z%2F0e%2FGh1JuYvd0aK5FzC2ZTGjFHNq7bx%2BdscwV1QiLkiVHsNKc2CURzGvUx0dRFIud8w3PkH7aZVESvKlNvLyKa%2FgL4TU%2B0n5j92ppZHbC3DfB8kwZV1I1QFzB7mFmhdpoAWFlXXY2xxPPkQqsV1%2BsPanWb9JIgkpBnu5ZO2xmVHPRSlvL%2BUTKvD3Jq0LmqEYo1tFy3F4sYEmGV4vw0RKKo7tOYb8SFgdTw26SVhera5aeLIwSFZYAvv0wRb%2BsXgoPJK51YLI1XTXnNep%2FWNLv1Gem993YkpKZdgmoEpPheKv1%2B85mELU1J82NJeExBXTDumIiTBjqlAUQbPzomGso8OdXiqdTW8V8WaRL%2B0ZgHpf3Kzcb3k2W%2FWSKiA91fU6BA%2FTFUUd2iafG1k%2Bgm8Yvli8YBEroGEXmUOH5IdiIyFTIUL7BfcyvedVwWbBCVoGyLxs48G6KWaVwowy2XYP%2BXHfbDghl6NNzdaUwlWc1blXx%2BkkoUq1ZIwsAzhLrwLthvrB%2BLeKT8IdYOWCDhjSdy%2BsHs6t%2F4WEMwQVHScg%3D%3D&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20220422T030440Z&X-Amz-SignedHeaders=host&X-Amz-Expires=300&X-Amz-Credential=ASIAQ3PHCVTY3GYD4RGC%2F20220422%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=e6c67c850d5b9141253837519a558f2d56ee022ea163233ccf17b88f9815df4f&hash=5c12a8966a05652f2c4464fe
a3e79f3c7f669cf94a84365674e2e0647371a431&host=68042c943591013ac2b2430a89b270f6af2c76d8dfd086a07176afe7c76c2c61&pii=S0370157312000841&tid=spdf-a3f8bf90-014a-4f05-ba9e-eebf11a58a9a&sid=55bff1c97359b9450988c4c294161bf31292gxrqa&type=client&ua=4c00050055575755530304&rr=6ffb22261cbf968e)]\n\n## 论文\n\n### 2025年\n\n* 通过可学习变换函数重新思考时间编码（**ICML, 2025**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2505.00887)][[代码](https:\u002F\u002Fgithub.com\u002Fchenxi1228\u002FLeTE)]\n* 具有相关时空位置编码的动态图Transformer（**WSDM, 2025**）[[论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F2407.16959)][[代码](https:\u002F\u002Fgithub.com\u002Fwangz3066\u002FCorDGT)]\n\n### 2024\n\n* 连续时间动态图上的长程传播 (**ICML, 2024**) [[论文](https:\u002F\u002Fproceedings.mlr.press\u002Fv235\u002Fgravina24a.html)][[代码](https:\u002F\u002Fgithub.com\u002Fgravins\u002Fnon-dissipative-propagation-CTDGs)]\n* LLM4DyG：大型语言模型能否解决动态图上的时空问题？(**SIGKDD, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F2310.17110)][[代码](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002FLLM4DyG)]\n* 面向自适应邻域的时序交互图建模研究 (**SIGKDD, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2406.11891)]\n* SLADE：基于自监督学习的无标签边流动态异常检测 (**SIGKDD, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2402.11933)][[代码](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2402.11933)]\n* 基于双曲空间骨架识别的复杂网络长期动态预测 (**SIGKDD, 2024**) [[代码](https:\u002F\u002Fgithub.com\u002Ftsinghua-fib-lab\u002FDiskNet)]\n* 基于潜在条件扩散的连续时间动态图模型数据增强 (**SIGKDD, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2407.08500)][[代码]()]\n* MemMap：用于动态图学习的自适应潜在记忆结构 (**SIGKDD, 2024**)\n* TASER：面向快速准确的动态图表示学习的时序自适应采样方法 (**IPDPS, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.05396)][[代码](https:\u002F\u002Fgithub.com\u002Ffacebookresearch\u002Ftaser-tgnn)]\n* Mayfly：用于图流摘要的神经数据结构 (**ICLR, 2024, Spotlight**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=n7Sr8SW4bn&name=pdf)]\n* 受因果启发的动态图神经网络时空解释方法 (**ICLR, 2024, Poster**) 
[[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=AJBkfwXh3u&name=pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fkesenzhao\u002FDyGNNExplainer)]\n* FreeDyG：面向链接预测的频率增强型连续时间动态图模型 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=82Mc5ilInM&name=pdf)][[代码](https:\u002F\u002Fgithub.com\u002FTianxzzz\u002FFreeDyG)]\n* PRES：迈向可扩展的基于内存的动态图神经网络 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=gjXor87Xfy&name=pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fjwsu825\u002FMDGNN_BS)]\n* 超图动态系统 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=NLbRvr840Q&name=pdf)]\n* 深度时序图聚类 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=ViNe1fjGME&name=pdf)][[代码](https:\u002F\u002Fgithub.com\u002FMGitHubL\u002FDeep-Temporal-Graph-Clustering)]\n* GraphPulse：用于时序图属性预测的拓扑表示方法 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=DZqic2sPTY&name=pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fkiarashamsi\u002FGraphPulse)]\n* 超越时空表示：面向时序图的进化傅里叶变换 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=uvFhCUPjtI&name=pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fansonb\u002FEFT)]\n* HOPE：高阶图微分方程用于建模交互动力学 (**ICML, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=9iChKP4k32&name=pdf)]\n* 演化图中的时序泛化能力估计 (**ICLR, 2024, Poster**) [[论文](https:\u002F\u002Fopenreview.net\u002Fattachment?id=HFtrXBfNru&name=pdf)]\n* 动态图信息瓶颈理论 (**WWW, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2402.06716.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FRingBDStack\u002FDGIB)]\n* 简单Transformer在动态图建模中的可行性研究 (**WWW, 2024**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2401.14009.pdf)]\n* 基于时序一致性感知的霍克斯图网络推荐系统 (**WWW, 2024**)\n* IME：融合多曲率共享与特定嵌入的时序知识图谱补全方法 (**WWW, 2024**)\n* TATKC：用于快速近似时序卡茨中心性排序的时序图神经网络 (**WWW, 2024**)\n* 针对时序图的有效精确及近似介数中心性计算方法 (**WWW, 2024**)\n* 不规则采样时间序列的时序图微分方程模型 (**IJCAI, 2024**) 
[[论文](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2024\u002F445)][[代码](https:\u002F\u002Fgithub.com\u002Fgravins\u002FTG-ODE)]\n* 大型语言模型引导的时序知识图谱推理动态适应方法 (**NeurIPS 2024 提交**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2405.14170)][[代码](https:\u002F\u002Fanonymous.4open.science\u002Fr\u002FLLM-DA-1E6D)]\n* 连续时间时序溯源图中的异常检测 (**NeurIPS 2023 时序图学习研讨会**) [[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=88tGIxxhsfn)][[代码](https:\u002F\u002Fgithub.com\u002FJakubReha\u002FProvCTDG)]\n\n### 2023年\n\n* 面对分布偏移的动态图谱不变量学习（**NeurIPS，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F2403.05026)][[代码](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002Fsild)]\n* DistTGL：基于分布式内存的时序图神经网络训练（**SC，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fabs\u002F2307.07649)][[代码](https:\u002F\u002Fgithub.com\u002Famazon-science\u002Fdisttgl)]\n* 向更优的动态图学习迈进：新架构与统一库（**ARXIV，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2303.13047.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fyule-BUAA\u002FDyGLib)]\n* SUREL+：从游走走向集合，实现可扩展的子图基图表示学习（**ARXIV，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2303.03379.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FGraph-COM\u002FSUREL_Plus)]\n* 走向开放的时序图神经网络（**ICLR，2023**）[[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=N9Pk5iSCzAn)][[代码](https:\u002F\u002Fgithub.com\u002Ftulerfeng\u002FOTGNet)]\n* 我们真的需要复杂的时序网络模型架构吗？（**ICLR，2023**）[[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=ayPPc0SyLv1)][[代码](https:\u002F\u002Fgithub.com\u002FCongWeilin\u002FGraphMixer)]\n* Zebra：当时序图神经网络遇上时间感知个性化PageRank（**VLDB，2023**）[[论文](https:\u002F\u002Fwww.vldb.org\u002Fpvldb\u002Fvol16\u002Fp1332-li.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FLuckyLYM\u002FZebra)]\n* Temporal SIR-GN：高效且有效的时序图结构表示学习（**VLDB，2023**）[[论文](https:\u002F\u002Fwww.vldb.org\u002Fpvldb\u002Fvol16\u002Fp2075-layne.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fjanetlayne2\u002FTemporal-SIR-GN)]\n* 
SEIGN：一种简单高效的大型动态图神经网络（**ICDE，2023**）[[论文](https:\u002F\u002Fieeexplore.ieee.org\u002Fabstract\u002Fdocument\u002F10184567)]\n* 用于演化网络的高阶时间H指数（**KDD，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2305.16001.pdf)]\n* 利用模体转移进行时序图生成（**KDD，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2306.11190.pdf)]\n* 针对离散时间图模型的时间动态感知对抗攻击（**KDD，2023**）[[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=yUY15QBERj)][[代码](https:\u002F\u002Fgithub.com\u002FerdemUB\u002FKDD23-MTM)]\n* 动态网络中面向多个分析目标的公平性感知连续预测（**KDD，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2209.01678.pdf)]\n* DyTed：面向离散时间动态图的解耦表示学习（**KDD，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2210.10592.pdf)]\n* WinGNN：具有随机梯度聚合窗口的动态图神经网络（**KDD，2023**）\n* 基于社区的动态图学习用于热度预测（**KDD，2023**）\n* 一种注意力多尺度协同演化模型用于动态链接预测（**WWW，2023**）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3543507.3583396)][[代码](https:\u002F\u002Fgithub.com\u002Ftsinghua-fib-lab\u002FAMCNet)]\n* TIGER：带重启机制的时间交互图嵌入（**WWW，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2302.06057.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fyzhang1918\u002Fwww2023tiger)]\n* HGWaveNet：用于时间链接预测的双曲图神经网络（**WWW，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2304.07302.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FTaiLvYuanLiang\u002FHGWaveNet)]\n* 针对时序图中链接排序的富有表现力且高效的表示学习（**WWW，2023**）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3543507.3583476)][[代码](https:\u002F\u002Fgithub.com\u002Fsusheels\u002Ftgrank)]\n* 局部边动态与观点极化（**WSDM，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2111.14020.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fadamlechowicz\u002Fopinion-polarization\u002F)]\n* 用于动态稀疏图链接预测的图序列神经ODE过程（**WSDM，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2211.08568.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FRManLuo\u002FGSNOP)]\n* 基于时序异质图的可解释性研究兴趣变化检测（**WSDM，2023**）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3539597.3570453)]\n* 
动态异质图注意力神经架构搜索（**AAAI，2023**）[[论文](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F26338)][[代码](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002FDHGAS)]\n* 通过脉冲神经网络扩展动态图表示学习（**AAAI，2023**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2208.10364.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FEdisonLeeeee\u002FSpikeNet)]\n* 隐马尔可夫模型用于时序图表示学习（**ESANN，2023**）[[论文](https:\u002F\u002Fwww.esann.org\u002Fsites\u002Fdefault\u002Ffiles\u002Fproceedings\u002F2023\u002FES2023-35.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fnec-research\u002Fhidden_markov_model_temporal_graphs)]\n\n### 2022年\n\n* TGL：用于亿级图的时序图神经网络训练通用框架（VLDB，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2203.14883.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Famazon-science\u002Ftgl)]\n* 神经时序游走：连续时间动态图上的基元感知表示学习（NeurIPS，2022）[[论文](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Ffile\u002F7dadc855cef7494d5d956a8d28add871-Paper-Conference.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FKimMeen\u002FNeural-Temporal-Walks)]\n* 面向时空分布漂移的动态图神经网络（NeurIPS，2022）[[论文](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F2857242c9e97de339ce642e75b15ff24-Abstract-Conference.html)][[代码](https:\u002F\u002Fgithub.com\u002Fwondergo2017\u002FDIDA)]\n* 时序图上的自适应数据增强（NeurIPS，2022）[[论文](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2021\u002Ffile\u002F0b0b0994d12ad343511adfbfc364256e-Paper.pdf)]\n* 用于链接预测的无参数动态图嵌入（NeurIPS，2022）[[论文](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Ffile\u002Fb14d7175755b180dc2163e15e3110cb6-Paper-Conference.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FFudanCISL\u002FFreeGEM)]\n* 面向动态图的即时图神经网络（KDD，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2206.01379.pdf)][[代码]()]\n* 用于阿片类药物过量预测的解耦动态异质图学习（KDD，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3534678.3539279)][[代码]()]\n* 
ROLAND：面向动态图的图学习框架（KDD，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2208.07239.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fsnap-stanford\u002Froland)]\n* 大型动态图上的子集节点异常跟踪（KDD，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3534678.3539389)][[代码](https:\u002F\u002Fgithub.com\u002Fzjlxgxz\u002FDynAnom)]\n* 基于生成式回放的流式图神经网络（KDD，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3534678.3539336)][[代码](https:\u002F\u002Fgithub.com\u002FJunshan-Wang\u002FSGNN-GR)]\n* 邻域感知的可扩展时序网络表示学习（LoG，2022）[[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=EPUtNe7a9ta)][[代码](https:\u002F\u002Fgithub.com\u002FGraph-COM\u002FNeighborhood-Aware-Temporal-Network)]\n* DisenCTR：基于动态图的解耦表示用于点击率预测（SIGIR，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3477495.3531851)][[代码](https:\u002F\u002Fgithub.com\u002FFang6ang\u002FDisenCTR)]\n* STAM：一种用于图神经网络推荐的时空聚合方法（WWW，2022）[[论文](https:\u002F\u002Fkeg.cs.tsinghua.edu.cn\u002Fjietang\u002Fpublications\u002FWWW22-Yang%20et%20al.-STAM-GNN.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fzyang-16\u002FSTAM)]\n* 神经网络预测时序网络中的高阶模式（WWW，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2106.06039.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FGraph-COM\u002FNeural_Higher-order_Pattern_Prediction)]\n* TREND：用于图表示学习的时序事件与节点动态（WWW，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2203.14303.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FWenZhihao666\u002FTREND)]\n* 基于病毒式营销的在线社交网络观点演化模型（WWW，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2202.03573.pdf)]\n* EvoKG：联合建模事件时间和网络结构以进行时序知识图谱推理（WSDM，2022）[[论文](http:\u002F\u002Fkeg.cs.tsinghua.edu.cn\u002Fyuxiao\u002Fpapers\u002FWSDM22-park-evokg.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FNamyongPark\u002FEvoKG)]\n* 在动态图中寻找简洁、精确且完备的近双团集合（WSDM，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2110.14875.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fhyeonjeong1\u002Fcutnpeel)]\n* 
动态网络中的少样本链接预测（WSDM，2022）[[论文](http:\u002F\u002Fwww.shichuan.org\u002Fdoc\u002F120.pdf)]\n* 关于将静态节点嵌入推广到动态场景的研究（WSDM，2022）[[论文](https:\u002F\u002Fgemslab.github.io\u002Fpapers\u002Fdijin-2021-trg.pdf)]\n* 沿着时间轴：用于时序知识图谱补全的时间线追踪嵌入（CIKM，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3511808.3557233)][[代码](https:\u002F\u002Fgithub.com\u002Fzhangfw123\u002FTLT-KGE)]\n* DA-Net：用于时序知识图谱推理的分布式注意力网络（CIKM，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3511808.3557280)]\n* 具有时变曲率的自监督黎曼图神经网络用于时序图学习（CIKM，2022）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2208.14073.pdf)]\n* 用于协同过滤的动态超图学习（CIKM，2022）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3511808.3557301)]\n\n### 2021年\n\n* 基于因果匿名游走的时序网络归纳式表示学习 (**ICLR, 2021**) [[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=KYPz4YsCPj)][[代码](https:\u002F\u002Fgithub.com\u002Fsnap-stanford\u002FCAW)]\n* 用于学习相互作用系统动力学的耦合图微分方程 (**KDD, 2021**) [[论文](http:\u002F\u002Fweb.cs.ucla.edu\u002F~yzsun\u002Fpapers\u002F2021_KDD_CG_ODE.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FZijieH\u002FCG-ODE)]\n* 大型动态图上的子集节点表示学习 (**KDD, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2106.01570.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fzjlxgxz\u002FDynamicPPE)]\n* 基于双曲空间中隐式层次学习的离散时间时序网络嵌入 (**KDD, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2107.03767.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fmarlin-codes\u002FHTGN-KDD21)]\n* 为时序知识图谱补全而学习跨时间游走 (**KDD, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2012.10595v1.pdf)]\n* 时序图上交互顺序的预测 (**KDD, 2021**)\n* 基于演化表示学习的时序知识图谱推理 (**SIGIR, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2104.10353.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FLee-zix\u002FRE-GCN)]\n* 基于邻域与社区影响力挖掘的时序网络归纳式表示学习 (**SIGIR, 2021**)\n* TIE：基于嵌入的增量式时序知识图谱补全框架 [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2104.08419.pdf)]\n* SDG：一种简化且动态的图神经网络 (**SIGIR SHORT, 2021**) 
[[论文](https:\u002F\u002Fgithub.com\u002FDongqiFu\u002FSDG\u002Fblob\u002Fmain\u002Fpaper\u002FSDG_A%20Simplified%20and%20Dynamic%20Graph%20Neural%20Network.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FDongqiFu\u002FSDG)]\n* 用于会话推荐的时序增强图神经网络 (**SIGIR SHORT, 2021**) [[论文](https:\u002F\u002Fwww4.comp.polyu.edu.hk\u002F~xiaohuang\u002Fdocs\u002FHuachi_sigir2021.pdf)]\n* HINTS：通过动态异质信息网络嵌入预测新发表文献的引用时间序列 (**WWW, 2021**) [[论文](http:\u002F\u002Fweb.cs.ucla.edu\u002F~yzsun\u002Fpapers\u002F2021_WWW_HINTS.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fsongjiang0909\u002FHINTS_code)]\n* TEDIC：动态社交交互网络中行为模式的神经建模 (**WWW, 2021**) [[论文](http:\u002F\u002Fsnap.stanford.edu\u002Ftedic\u002Ffiles\u002Fwww21_tedic.pdf)]\n* 用于建模动态图的双曲变分图神经网络 (**AAAI, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2104.02228.pdf)]\n* 基于循环图神经网络的动态图可解释聚类 (**AAAI, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2012.08740.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FInterpretableClustering\u002FInterpretableClustering)]\n* 利用经验回放克服图神经网络中的灾难性遗忘 (**AAAI, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2003.09908.pdf)]\n* 动态异质信息网络上的节点嵌入学习与更新 (**WSDM, 2021**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3437963.3441745)]\n* F-FADE：用于边流异常检测的频率因子分解 (**WSDM, 2021**) [[论文](https:\u002F\u002Fcs.stanford.edu\u002Fpeople\u002Fjure\u002Fpubs\u002Fffade-wsdm21.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fsnap-stanford\u002FF-FADE)]\n* 面向动态图的基于缓存的图神经网络系统 (**CIKM 2021**) [[论文]]\n* 动态图上的自监督表示学习 (**CIKM 2021**) [[论文]]\n* 基于时序图协作Transformer的连续时间序列推荐 [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2108.06625.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FDyGRec\u002FTGSRec)]\n* 用于动态图异常检测的结构化时序图神经网络 (**CIKM 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2005.07427.pdf)]\n* 时序网络中高阶模式（基元）的神经预测 (**ARXIV, 2021**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2106.06039.pdf)]\n\n### 2020年\n\n* 时间图上的归纳表示学习 (**ICLR, 2020**) 
[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2002.07962.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FStatsDLMathsRecomSys\u002FIsnductive-representation-learning-on-temporal-graphs)]\n* 用于动态图深度学习的时间图网络 (**ICML Workshop, 2020**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2006.10637v1.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Ftwitter-research\u002Ftgn)]\n* 面向 temporal interaction networks 的数据驱动图生成模型 (**KDD, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3394486.3403082)][[代码](https:\u002F\u002Fgithub.com\u002Fdavidchouzdw\u002FTagGen)]\n* 基于动态知识图谱的多事件预测 (**KDD, 2020**) [[论文](https:\u002F\u002Fyue-ning.github.io\u002Fdocs\u002FKDD20-glean.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Famy-deng\u002Fglean)]\n* 动态图的拉普拉斯变化点检测 (**KDD, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3394486.3403077)][[代码](https:\u002F\u002Fgithub.com\u002FshenyangHuang\u002FLAD)]\n* 时间介数的算法方面 (**KDD, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3394486.3403259)][[代码](https:\u002F\u002Ffpt.akt.tu-berlin.de\u002Fsoftware\u002Ftemporal_betweenness\u002F)]\n* 异构图Transformer (**WWW, 2020**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2003.01332.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Facbull\u002FpyHGT)]\n* 流式图神经网络 (**SIGIR, 2020**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1810.10627.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Falge24\u002FDyGNN)]\n* 基于序列超图的下一项推荐 (**SIGIR, 2020**) [[论文](http:\u002F\u002Fwww.public.asu.edu\u002F~kding9\u002Fpdf\u002FSIGIR2020_HyperRec.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fwangjlgz\u002FHyperRec)]\n* 具有高阶非线性信息的时间网络嵌入 (**AAAI, 2020**) [[论文](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F5993)]\n* 保持基元结构的时间网络嵌入 (**IJCAI, 2020**) [[论文](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2020\u002F0172.pdf)]\n* 动态图协同过滤 (**ICDM, 2020**) 
[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2101.02844.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FCRIPAC-DIG\u002FDGCF)]\n* DySAT：通过自注意力网络在动态图上进行深度神经表示学习 (**WSDM, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3336191.3371845)][[代码](https:\u002F\u002Fgithub.com\u002Faravindsankar28\u002FDySAT)]\n* 在动态异构信息网络上学习和更新节点嵌入 (**WSDM, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3437963.3441745)][[代码]()]\n* 通过神经交互过程进行连续时间动态图学习 (**CIKM, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3340531.3411946)]\n* tdGraphEmbed：时间动态图级别的嵌入 (**CIKM, 2020**) [[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3340531.3411953)][[代码](https:\u002F\u002Fgithub.com\u002Fmoranbel\u002FtdGraphEmbed)]\n* 通过持续学习的流式图神经网络 (**CIKM, 2020**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2009.10951.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FJunshan-Wang\u002FContinualGNN)]\n* 基于解耦的持续图表示学习 (**EMNLP, 2020**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2010.02565.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FKXY-PUBLIC\u002FDiCGRL)]\n* TeMP：用于时间知识图谱补全的时间消息传递 (**EMNLP, 2020**) [[论文](https:\u002F\u002Faclanthology.org\u002F2020.emnlp-main.462.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FJiapengWu\u002FTeMP)]\n* 循环事件网络：时间知识图谱上的自回归结构推断 (**EMNLP, 2020**) [[论文](https:\u002F\u002Faclanthology.org\u002F2020.emnlp-main.541.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FINK-USC\u002FRE-Net)]\n* EPNE：进化模式保持网络嵌入 (**ECAI, 2020**) [[论文](http:\u002F\u002Fecai2020.eu\u002Fpapers\u002F528_paper.pdf)]\n* GloDyNE：全局拓扑保持的动态网络嵌入 (**TKDE, 2020**) [[论文](https:\u002F\u002Fieeexplore.ieee.org\u002Fstamp\u002Fstamp.jsp?tp=&arnumber=9302718)][[代码](https:\u002F\u002Fgithub.com\u002Fhouchengbin\u002FGloDyNE)]\n* 基于元路径邻近性的动态异构信息网络嵌入 (**TKDE, 2020**) [[论文](https:\u002F\u002Fyuanfulu.github.io\u002Fpublication\u002FTKDE-DyHNE.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Frootlu\u002FDyHNE)]\n* 终身图学习 (**ARXIV, 
2020**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2009.00647.pdf)]\n\n\n\n### 2019年\n\n* 变分图循环神经网络 (**NeurIPS, 2019**) [[论文](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2019\u002Ffile\u002Fa6b8deb7798e7532ade2a8934477d3ce-Paper.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FVGraphRNN\u002FVGRNN)]\n* 循环时空图神经网络 (**NeurIPS, 2019**) [[论文](http:\u002F\u002Fexport.arxiv.org\u002Fpdf\u002F1904.05582#:~:text=Our%20recurrent%20neural%20graph%20ef%EF%AC%81ciently%20processes%20information%20in,in%20space-time%20using%20a%20backbone%20deep%20neural%20network.)][[代码](https:\u002F\u002Fgithub.com\u002FIuliaDuta\u002FRSTG)]\n* DyRep：在动态图上学习表示 (**ICLR, 2019**) [[论文](https:\u002F\u002Fopenreview.net\u002Fpdf?id=HyePrhR5KX)]\n* 预测 temporal interaction networks 中的动态嵌入轨迹 (**KDD, 2019**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1908.01207.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fsrijankr\u002Fjodie)]\n* 学习动态上下文图以预测社交事件 (**KDD, 2019**) [[论文](https:\u002F\u002Fyue-ning.github.io\u002Fdocs\u002FKDD19-dengA.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Famy-deng\u002FDynamicGCN)]\n* EvolveGCN：面向动态图的演化图卷积网络 (**AAAI, 2019**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1902.10191.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FIBM\u002FEvolveGCN)]\n* 用于动态推荐系统的层次化时间卷积网络 (**WWW, 2019**) [[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1904.04381.pdf)]\n* 通过局部动作进行实时流式图嵌入 (**WWW, 2019**) [[论文](https:\u002F\u002Fnickduffield.net\u002Fdownload\u002Fpapers\u002FDL4G-SDE-2019.pdf)]\n* 动态超图神经网络 (**IJCAI, 2019**) [[论文](https:\u002F\u002Fwww.ijcai.org\u002FProceedings\u002F2019\u002F0366.pdf)][[代码](https:\u002F\u002Fgithub.com\u002FiMoonLab\u002FDHGNN#:~:text=%20DHGNN%3A%20Dynamic%20Hypergraph%20Neural%20Networks%20%201,%28Zhilin%20Yang%2C%20William%20W.%20-%20Cohen%2C...%20More%20)]\n* 时间图上的节点嵌入 (**IJCAI, 2019**) 
[[论文](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2019\u002F0640.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Furielsinger\u002FtNodeEmbed#:~:text=Node%20Embedding%20over%20Temporal%20Graphs.%20Uriel%20Singer%2C%20Ido,for%20nodes%20in%20any%20%28un%29directed%2C%20%28un%29weighted%20temporal%20graph.)]\n* 具有微观和宏观动态的时间网络嵌入 (**CIKM, 2019**) [[论文](https:\u002F\u002Fpar.nsf.gov\u002Fservlets\u002Fpurl\u002F10148548)][[代码](https:\u002F\u002Fgithub.com\u002Frootlu\u002FMMDNE)]\n\n### 2018年\n\n* NetWalk：一种用于动态网络异常检测的灵活深度嵌入方法（**KDD，2018**）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3219819.3220024)][[代码](https:\u002F\u002Fgithub.com\u002Fkdmsit\u002FNetWalk)]\n* 基于邻居结构的时序网络嵌入（**KDD，2018**）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3219819.3220054)][[代码]()]\n* 通过建模三元闭包过程的动态网络嵌入（**AAAI，2018**）[[论文](http:\u002F\u002Fyangy.org\u002Fworks\u002Fdynamictriad\u002Fdynamic_triad.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fluckiezhou\u002FDynamicTriad)]\n* 连续时间动态网络嵌入（**WWW，2018**）[[论文](https:\u002F\u002Fdl.acm.org\u002Fdoi\u002Fpdf\u002F10.1145\u002F3184558.3191526)][[代码](https:\u002F\u002Fgithub.com\u002FShubhranshu-Shekhar\u002Fctdne)]\n* 动态网络嵌入：一种基于Skip-gram的网络嵌入扩展方法（**IJCAI，2018**）[[论文](https:\u002F\u002Fwww.ijcai.org\u002Fproceedings\u002F2018\u002F0288.pdf)]\n\n### 2017年\n\n* Know-Evolve：面向动态知识图谱的深度时序推理（**ICML，2017**）[[论文](http:\u002F\u002Fproceedings.mlr.press\u002Fv70\u002Ftrivedi17a\u002Ftrivedi17a.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Frstriv\u002FKnow-Evolve)]\n* 用于社会网络演化与观点迁移的共演化模型（**KDD，2017**）[[论文](http:\u002F\u002Fweb.cs.ucla.edu\u002F~yzsun\u002Fpapers\u002F2017_kdd_coevolution.pdf)][[代码]()]\n* 用于动态环境学习的属性化网络嵌入（**CIKM，2017**）[[论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F1706.01860.pdf)][[代码](https:\u002F\u002Fgithub.com\u002Fgaoghc\u002FDANE)]\n\n## 工具\n\n### 通用图学习\n* [Deep Graph Library](https:\u002F\u002Fwww.dgl.ai\u002F)\n* [PyTorch 
Geometric](https:\u002F\u002Fpytorch-geometric.readthedocs.io\u002Fen\u002Flatest\u002F)\n* [PyTorch Geometric Temporal](https:\u002F\u002Fpytorch-geometric-temporal.readthedocs.io\u002Fen\u002Flatest\u002Fnotes\u002Fintroduction.html)\n* [Stellar Graph](https:\u002F\u002Fstellargraph.readthedocs.io\u002Fen\u002Fstable\u002F)\n* [GraphVite](https:\u002F\u002Fgraphvite.io\u002F)\n\n### 知识图谱\n* [DGL-KE](https:\u002F\u002Fdglke.dgl.ai\u002Fdoc\u002F)\n* [OpenKE](https:\u002F\u002Fgithub.com\u002Fthunlp\u002FOpenKE)\n\n### 推荐系统\n* [RecBole](https:\u002F\u002Fwww.recbole.io\u002F)","# Awesome-DynamicGraphLearning 快速上手指南\n\n`Awesome-DynamicGraphLearning` 并非一个单一的 Python 库或框架，而是一个**精选资源列表**，汇集了动态图（时序图）学习领域的顶级论文、综述及对应的开源代码实现。本指南将帮助你快速利用该列表找到适合的工具并运行示例。\n\n## 环境准备\n\n由于列表中包含多个独立的开源项目，环境需求取决于你选择的具体模型（如 `DyGLib`, `GraphMixer`, `LLM4DyG` 等）。以下是通用的基础环境要求：\n\n*   **操作系统**: Linux (推荐 Ubuntu 20.04+), macOS, 或 Windows (WSL2 推荐)\n*   **Python 版本**: 3.8 - 3.10 (大多数图深度学习项目在此范围兼容最好)\n*   **核心依赖**:\n    *   `PyTorch` >= 1.9.0\n    *   `CUDA` (如需 GPU 加速，建议版本 11.1+)\n    *   `Git`\n*   **常用辅助库**: `numpy`, `pandas`, `scikit-learn`, `networkx`\n\n> **提示**：建议在开始之前创建一个独立的虚拟环境（如使用 `conda` 或 `venv`），以避免不同项目间的依赖冲突。\n\n## 安装步骤\n\n由于这是一个资源索引，你需要先克隆仓库浏览列表，然后针对感兴趣的具体项目进行安装。\n\n### 1. 获取资源列表\n克隆主仓库以查看最新的论文和代码链接：\n\n```bash\ngit clone https:\u002F\u002Fgithub.com\u002Fyour-target-repo\u002FAwesome-DynamicGraphLearning.git\ncd Awesome-DynamicGraphLearning\n```\n*(注：请替换为实际的仓库地址，此处基于通用开源习惯)*\n\n### 2. 
选择并安装具体项目\n在 `README` 的 \"Papers\" 部分找到你需要的模型（例如 2023 年的统一库 **DyGLib** 或 **GraphMixer**），点击其 `[code]` 链接进入对应仓库。\n\n以 **DyGLib** (一个统一的动态图学习库) 为例，安装步骤如下：\n\n```bash\n# 克隆具体项目代码\ngit clone https:\u002F\u002Fgithub.com\u002Fyule-BUAA\u002FDyGLib.git\ncd DyGLib\n\n# 创建并激活虚拟环境 (推荐)\nconda create -n dyglib python=3.9\nconda activate dyglib\n\n# 安装依赖 (国内用户推荐使用清华源加速)\npip install -r requirements.txt -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n\n# 安装项目本身\npip install -e .\n```\n\n对于其他独立项目（如 `GraphMixer`），通常遵循相同的 `git clone` -> `pip install -r requirements.txt` 流程。\n\n## 基本使用\n\n以下以 **DyGLib** 为例，展示如何加载数据并训练一个动态图模型。大多数列表中的项目都遵循类似的“数据加载 -> 模型定义 -> 训练”流程。\n\n### 1. 准备数据\n确保数据目录结构符合项目要求（通常位于 `data\u002F` 文件夹下）。许多项目内置了标准数据集（如 GDELT, Wikipedia, MOOC）。\n\n### 2. 运行训练脚本\n大多数项目提供了直接的训练脚本。以下是一个典型的启动命令：\n\n```bash\n# 使用 DyGLib 训练 TGN 模型 (在 Wikipedia 数据集上)\npython main.py \\\n    --model tgn \\\n    --dataset wikipedia \\\n    --gpu 0 \\\n    --epochs 50\n```\n\n### 3. 代码调用示例 (Python API)\n如果你选择的项目支持作为库导入（如 `DyGLib`），可以在 Python 脚本中直接调用：\n\n```python\nfrom dyglib import get_dataset, get_model\n\n# 1. 加载动态图数据集\ndata = get_dataset(name='wikipedia')\n\n# 2. 初始化模型 (例如 TGN)\nmodel = get_model(model_name='tgn', data=data)\n\n# 3. 训练模型\nmodel.train()\n\n# 4. 
评估与预测\nauc_score = model.evaluate()\nprint(f\"Link Prediction AUC: {auc_score}\")\n```\n\n> **注意**：具体参数（如 `--model`, `--dataset`）需参考你所选具体项目仓库中的 `README.md` 或 `args.py` 文件，因为每个模型的实现细节有所不同。","某电商平台的算法团队正致力于优化实时推荐系统，需要精准捕捉用户与商品之间随时间快速变化的交互关系。\n\n### 没有 Awesome-DynamicGraphLearning 时\n- **文献检索如大海捞针**：面对海量关于动态图神经网络的论文，工程师难以区分哪些是针对连续时间图的最新突破，哪些仅适用于静态快照，调研效率极低。\n- **复现成本高昂且易错**：缺乏统一的代码索引，团队需花费数周在杂乱的仓库中寻找可运行的基线模型，常因版本不兼容或文档缺失而中途放弃。\n- **技术选型盲目**：由于缺少系统的综述指引（如 TNNLS 2024 或 JMLR 2020 的权威总结），难以判断\"LLM 结合动态图”等新兴方向是否适合当前业务场景。\n- **错过关键基准测试**：无法快速获取权威的动态图基准数据集和评估标准，导致自研模型的效果对比缺乏公信力，难以说服业务方上线。\n\n### 使用 Awesome-DynamicGraphLearning 后\n- **前沿技术一键直达**：直接定位到 ICML 2025 关于“可学习时间编码”的最新论文及代码，迅速掌握处理时序信号的最优解，将调研周期从数周缩短至几天。\n- **开箱即用的基线库**：通过收录的 CorDGT (WSDM 2025) 等项目的官方代码链接，团队当天即可搭建起高性能的动态图 Transformer 基线，大幅降低试错成本。\n- **清晰的演进路线图**：借助列表中高质量的 Survey 文章，团队明确了从传统 RNN 到自适应邻域聚合的技术演进路径，制定了符合业务阶段的研发路线。\n- **标准化的效果验证**：利用项目中提供的标准 Benchmark 和评测脚本，快速验证新模型在长程传播任务上的优势，用扎实数据推动了推荐策略的迭代上线。\n\nAwesome-DynamicGraphLearning 将分散的动态图学习资源转化为结构化的知识引擎，帮助团队在瞬息万变的时序数据中快速锁定最优技术方案。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FSpaceLearner_Awesome-DynamicGraphLearning_e5b6743a.png","SpaceLearner","NLNR","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002FSpaceLearner_e5a719a0.png","Enjoy coding life.",null,"https:\u002F\u002Fgithub.com\u002FSpaceLearner",[82],{"name":83,"color":84,"percentage":85},"Shell","#89e051",100,703,84,"2026-03-17T07:22:38",5,"","未说明",{"notes":93,"python":91,"dependencies":94},"该仓库是一个动态图学习领域的论文和代码资源列表（Awesome List），本身不是一个独立的软件工具，因此没有统一的运行环境需求。列表中包含了多个不同作者开发的独立项目（如 LeTE, CorDGT, DyGLib 等），每个项目都有各自特定的环境配置要求。用户需点击具体项目的代码链接（GitHub 仓库）查看其独立的 README 文件以获取详细的安装和运行指南。",[],[13],[97,98,99,100,101,102,103],"deep-learning","graph-neural-network","graph-neural-networks","dynamic-network-embedding","dynamic-graph-embedding","temporal-network","temporal-graph","2026-03-27T02:49:30.150509","2026-04-06T05:15:18.763513",[],[]]
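To make the "Prepare Data" step of the guide concrete: the temporal interaction datasets used throughout the list (Wikipedia, MOOC, and other JODIE-style logs) are chronologically ordered `(user, item, timestamp)` events, and dynamic graph benchmarks split them by time rather than at random. Below is a minimal, dependency-free sketch of such a split; all names are illustrative, not the API of DyGLib or any listed project.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interaction:
    """One timestamped user-item event, as in JODIE-style logs."""
    user: int
    item: int
    timestamp: float

def chronological_split(events, train_ratio=0.7, val_ratio=0.15):
    """Split interaction events by time, as dynamic graph benchmarks
    typically do: train on the past, evaluate on the future."""
    events = sorted(events, key=lambda e: e.timestamp)
    n_train = int(len(events) * train_ratio)
    n_val = int(len(events) * val_ratio)
    return (events[:n_train],
            events[n_train:n_train + n_val],
            events[n_train + n_val:])

# Example: six interactions arriving out of chronological order
log = [Interaction(0, 10, 3.0), Interaction(1, 11, 1.0),
       Interaction(0, 12, 5.0), Interaction(2, 10, 2.0),
       Interaction(1, 13, 4.0), Interaction(2, 11, 6.0)]
train, val, test = chronological_split(log, train_ratio=0.5, val_ratio=0.25)
print(len(train), len(val), len(test))  # 3 1 2
```

Splitting on timestamps rather than shuffling is what makes the evaluation "dynamic": every training interaction strictly precedes every test interaction.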
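The papers collected in the list fall broadly into snapshot-based (discrete-time) methods such as EvolveGCN and event-based (continuous-time) methods such as DyRep or JODIE. As a dependency-free illustration of the snapshot view, here is a sketch that buckets an interaction log into fixed-width time windows; the function and variable names are hypothetical and not taken from any listed project.

```python
from collections import defaultdict

def to_snapshots(events, window):
    """Bucket (user, item, timestamp) interactions into fixed-width time
    windows, yielding one adjacency mapping per window -- the input shape
    expected by snapshot-based (discrete-time) dynamic graph models."""
    snapshots = defaultdict(lambda: defaultdict(set))
    for user, item, t in events:
        snapshots[int(t // window)][user].add(item)
    # Return plain dicts, ordered by window index
    return {k: dict(v) for k, v in sorted(snapshots.items())}

log = [(0, 10, 0.5), (1, 11, 1.2), (0, 12, 2.7), (2, 10, 2.9)]
snaps = to_snapshots(log, window=1.0)
print(sorted(snaps))  # [0, 1, 2]
```

Continuous-time models instead consume the raw event stream directly, which is why projects like DyGLib keep interactions as an ordered log rather than pre-aggregated snapshots.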