[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"similar-WenjieDu--PyPOTS":3,"tool-WenjieDu--PyPOTS":61},[4,18,26,36,44,53],{"id":5,"name":6,"github_repo":7,"description_zh":8,"stars":9,"difficulty_score":10,"last_commit_at":11,"category_tags":12,"status":17},4358,"openclaw","openclaw\u002Fopenclaw","OpenClaw 是一款专为个人打造的本地化 AI 助手，旨在让你在自己的设备上拥有完全可控的智能伙伴。它打破了传统 AI 助手局限于特定网页或应用的束缚，能够直接接入你日常使用的各类通讯渠道，包括微信、WhatsApp、Telegram、Discord、iMessage 等数十种平台。无论你在哪个聊天软件中发送消息，OpenClaw 都能即时响应，甚至支持在 macOS、iOS 和 Android 设备上进行语音交互，并提供实时的画布渲染功能供你操控。\n\n这款工具主要解决了用户对数据隐私、响应速度以及“始终在线”体验的需求。通过将 AI 部署在本地，用户无需依赖云端服务即可享受快速、私密的智能辅助，真正实现了“你的数据，你做主”。其独特的技术亮点在于强大的网关架构，将控制平面与核心助手分离，确保跨平台通信的流畅性与扩展性。\n\nOpenClaw 非常适合希望构建个性化工作流的技术爱好者、开发者，以及注重隐私保护且不愿被单一生态绑定的普通用户。只要具备基础的终端操作能力（支持 macOS、Linux 及 Windows WSL2），即可通过简单的命令行引导完成部署。如果你渴望拥有一个懂你",349277,3,"2026-04-06T06:32:30",[13,14,15,16],"Agent","开发框架","图像","数据工具","ready",{"id":19,"name":20,"github_repo":21,"description_zh":22,"stars":23,"difficulty_score":10,"last_commit_at":24,"category_tags":25,"status":17},3808,"stable-diffusion-webui","AUTOMATIC1111\u002Fstable-diffusion-webui","stable-diffusion-webui 是一个基于 Gradio 构建的网页版操作界面，旨在让用户能够轻松地在本地运行和使用强大的 Stable Diffusion 图像生成模型。它解决了原始模型依赖命令行、操作门槛高且功能分散的痛点，将复杂的 AI 绘图流程整合进一个直观易用的图形化平台。\n\n无论是希望快速上手的普通创作者、需要精细控制画面细节的设计师，还是想要深入探索模型潜力的开发者与研究人员，都能从中获益。其核心亮点在于极高的功能丰富度：不仅支持文生图、图生图、局部重绘（Inpainting）和外绘（Outpainting）等基础模式，还独创了注意力机制调整、提示词矩阵、负向提示词以及“高清修复”等高级功能。此外，它内置了 GFPGAN 和 CodeFormer 等人脸修复工具，支持多种神经网络放大算法，并允许用户通过插件系统无限扩展能力。即使是显存有限的设备，stable-diffusion-webui 也提供了相应的优化选项，让高质量的 AI 艺术创作变得触手可及。",162132,"2026-04-05T11:01:52",[14,15,13],{"id":27,"name":28,"github_repo":29,"description_zh":30,"stars":31,"difficulty_score":32,"last_commit_at":33,"category_tags":34,"status":17},1381,"everything-claude-code","affaan-m\u002Feverything-claude-code","everything-claude-code 是一套专为 AI 编程助手（如 Claude Code、Codex、Cursor 等）打造的高性能优化系统。它不仅仅是一组配置文件，而是一个经过长期实战打磨的完整框架，旨在解决 AI 
代理在实际开发中面临的效率低下、记忆丢失、安全隐患及缺乏持续学习能力等核心痛点。\n\n通过引入技能模块化、直觉增强、记忆持久化机制以及内置的安全扫描功能，everything-claude-code 能显著提升 AI 在复杂任务中的表现，帮助开发者构建更稳定、更智能的生产级 AI 代理。其独特的“研究优先”开发理念和针对 Token 消耗的优化策略，使得模型响应更快、成本更低，同时有效防御潜在的攻击向量。\n\n这套工具特别适合软件开发者、AI 研究人员以及希望深度定制 AI 工作流的技术团队使用。无论您是在构建大型代码库，还是需要 AI 协助进行安全审计与自动化测试，everything-claude-code 都能提供强大的底层支持。作为一个曾荣获 Anthropic 黑客大奖的开源项目，它融合了多语言支持与丰富的实战钩子（hooks），让 AI 真正成长为懂上",142651,2,"2026-04-06T23:34:12",[14,13,35],"语言模型",{"id":37,"name":38,"github_repo":39,"description_zh":40,"stars":41,"difficulty_score":32,"last_commit_at":42,"category_tags":43,"status":17},2271,"ComfyUI","Comfy-Org\u002FComfyUI","ComfyUI 是一款功能强大且高度模块化的视觉 AI 引擎，专为设计和执行复杂的 Stable Diffusion 图像生成流程而打造。它摒弃了传统的代码编写模式，采用直观的节点式流程图界面，让用户通过连接不同的功能模块即可构建个性化的生成管线。\n\n这一设计巧妙解决了高级 AI 绘图工作流配置复杂、灵活性不足的痛点。用户无需具备编程背景，也能自由组合模型、调整参数并实时预览效果，轻松实现从基础文生图到多步骤高清修复等各类复杂任务。ComfyUI 拥有极佳的兼容性，不仅支持 Windows、macOS 和 Linux 全平台，还广泛适配 NVIDIA、AMD、Intel 及苹果 Silicon 等多种硬件架构，并率先支持 SDXL、Flux、SD3 等前沿模型。\n\n无论是希望深入探索算法潜力的研究人员和开发者，还是追求极致创作自由度的设计师与资深 AI 绘画爱好者，ComfyUI 都能提供强大的支持。其独特的模块化架构允许社区不断扩展新功能，使其成为当前最灵活、生态最丰富的开源扩散模型工具之一，帮助用户将创意高效转化为现实。",107888,"2026-04-06T11:32:50",[14,15,13],{"id":45,"name":46,"github_repo":47,"description_zh":48,"stars":49,"difficulty_score":32,"last_commit_at":50,"category_tags":51,"status":17},4721,"markitdown","microsoft\u002Fmarkitdown","MarkItDown 是一款由微软 AutoGen 团队打造的轻量级 Python 工具，专为将各类文件高效转换为 Markdown 格式而设计。它支持 PDF、Word、Excel、PPT、图片（含 OCR）、音频（含语音转录）、HTML 乃至 YouTube 链接等多种格式的解析，能够精准提取文档中的标题、列表、表格和链接等关键结构信息。\n\n在人工智能应用日益普及的今天，大语言模型（LLM）虽擅长处理文本，却难以直接读取复杂的二进制办公文档。MarkItDown 恰好解决了这一痛点，它将非结构化或半结构化的文件转化为模型“原生理解”且 Token 效率极高的 Markdown 格式，成为连接本地文件与 AI 分析 pipeline 的理想桥梁。此外，它还提供了 MCP（模型上下文协议）服务器，可无缝集成到 Claude Desktop 等 LLM 应用中。\n\n这款工具特别适合开发者、数据科学家及 AI 研究人员使用，尤其是那些需要构建文档检索增强生成（RAG）系统、进行批量文本分析或希望让 AI 
助手直接“阅读”本地文件的用户。虽然生成的内容也具备一定可读性，但其核心优势在于为机器",93400,"2026-04-06T19:52:38",[52,14],"插件",{"id":54,"name":55,"github_repo":56,"description_zh":57,"stars":58,"difficulty_score":10,"last_commit_at":59,"category_tags":60,"status":17},4487,"LLMs-from-scratch","rasbt\u002FLLMs-from-scratch","LLMs-from-scratch 是一个基于 PyTorch 的开源教育项目，旨在引导用户从零开始一步步构建一个类似 ChatGPT 的大型语言模型（LLM）。它不仅是同名技术著作的官方代码库，更提供了一套完整的实践方案，涵盖模型开发、预训练及微调的全过程。\n\n该项目主要解决了大模型领域“黑盒化”的学习痛点。许多开发者虽能调用现成模型，却难以深入理解其内部架构与训练机制。通过亲手编写每一行核心代码，用户能够透彻掌握 Transformer 架构、注意力机制等关键原理，从而真正理解大模型是如何“思考”的。此外，项目还包含了加载大型预训练权重进行微调的代码，帮助用户将理论知识延伸至实际应用。\n\nLLMs-from-scratch 特别适合希望深入底层原理的 AI 开发者、研究人员以及计算机专业的学生。对于不满足于仅使用 API，而是渴望探究模型构建细节的技术人员而言，这是极佳的学习资源。其独特的技术亮点在于“循序渐进”的教学设计：将复杂的系统工程拆解为清晰的步骤，配合详细的图表与示例，让构建一个虽小但功能完备的大模型变得触手可及。无论你是想夯实理论基础，还是为未来研发更大规模的模型做准备",90106,"2026-04-06T11:19:32",[35,15,13,14],{"id":62,"github_repo":63,"name":64,"description_en":65,"description_zh":66,"ai_summary_zh":66,"readme_en":67,"readme_zh":68,"quickstart_zh":69,"use_case_zh":70,"hero_image_url":71,"owner_login":72,"owner_name":73,"owner_avatar_url":74,"owner_bio":75,"owner_company":76,"owner_location":77,"owner_email":78,"owner_twitter":79,"owner_website":80,"owner_url":81,"languages":82,"stars":91,"forks":92,"last_commit_at":93,"license":94,"difficulty_score":95,"env_os":96,"env_gpu":96,"env_ram":96,"env_deps":97,"category_tags":103,"github_topics":107,"view_count":32,"oss_zip_url":123,"oss_zip_packed_at":123,"status":17,"created_at":124,"updated_at":125,"faqs":126,"releases":156},4745,"WenjieDu\u002FPyPOTS","PyPOTS","A Python toolkit\u002Flibrary for reality-centric machine\u002Fdeep learning and data mining on partially-observed time series, including SOTA neural network models for scientific analysis tasks of imputation\u002Fclassification\u002Fclustering\u002Fforecasting\u002Fanomaly detection\u002Fcleaning on incomplete industrial (irregularly-sampled) multivariate TS with NaN missing values","PyPOTS 是一个专为处理“部分观测时间序列”而设计的 Python 
开源工具箱。在工业监控、医疗记录等现实场景中，传感器故障或数据录入遗漏常导致时间序列数据出现缺失值（NaN）或不规则采样，传统机器学习模型往往难以直接应对此类不完整数据。PyPOTS 正是为了解决这一痛点而生，它提供了一套完整的解决方案，支持对含缺失值的多元时间序列进行填补、分类、聚类、预测、异常检测及数据清洗等核心任务。\n\n该工具特别适合数据科学家、算法研究人员以及从事工业数据分析的开发者使用。PyPOTS 的最大亮点在于其集成了多种针对不完整数据优化的最先进（SOTA）深度学习模型，并基于 PyTorch 构建，确保了高效性与灵活性。它不仅屏蔽了处理缺失数据的复杂底层逻辑，让使用者能专注于业务分析，还保持了高度的代码规范性和可维护性，拥有完善的文档与活跃的社区支持。无论是学术研究中的算法验证，还是实际生产环境中的数据挖掘，PyPOTS 都能帮助用户轻松从不完整的数据中提取有价值的洞察。","\u003Cimg src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Ficons\u002Ftranslate.svg\" width=\"18\"> [简体中文](\u002FREADME_zh.md) | English\n\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\">\n    \u003Cimg src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FPyPOTS\u002Flogo_FFBG.svg\" width=\"200\" align=\"right\">\n\u003C\u002Fa>\n\n\u003Ch3 align=\"center\">Welcome to PyPOTS\u003C\u002Fh3>\n\n\u003Cp align=\"center\">\u003Ci>a Python toolbox for machine learning on Partially-Observed Time Series\u003C\u002Fi>\u003C\u002Fp>\n\n\u003Cp align=\"center\">\n    \u003Ca href=\"https:\u002F\u002Fdocs.pypots.com\u002Fen\u002Flatest\u002Finstall.html#reasons-of-version-limitations-on-dependencies\">\n       \u003Cimg alt=\"Python version\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPython-v3.8+-F8C6B5?logo=python&logoColor=white\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Flandscape.pytorch.org\u002F?item=modeling--specialized--pypots\">\n        \u003Cimg alt=\"Pytorch landscape\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPyTorch%20Landscape-EE4C2C?logo=pytorch&logoColor=white\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fblob\u002Fmain\u002FLICENSE\">\n        \u003Cimg alt=\"BSD-3 license\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLicense-BSD--3-E9BB41?logo=opensourceinitiative&logoColor=white\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS?tab=coc-ov-file\">\n        
\u003Cimg alt=\"Code of Conduct\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FContributor_Covenant-2.1-4baaaa\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS#-community\">\n        \u003Cimg alt=\"Community\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjoin_us-community!-C8A062\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Freleases\">\n        \u003Cimg alt=\"the latest release version\" src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fv\u002Frelease\u002Fwenjiedu\u002Fpypots?color=EE781F&include_prereleases&label=Release&logo=github&logoColor=white\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fgraphs\u002Fcontributors\">\n        \u003Cimg alt=\"GitHub contributors\" src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002Fwenjiedu\u002Fpypots?color=D8E699&label=Contributors&logo=GitHub\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fstar-history.com\u002F#wenjiedu\u002Fpypots\">\n        \u003Cimg alt=\"GitHub Repo stars\" src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002Fwenjiedu\u002Fpypots?logo=None&color=6BB392&label=%E2%98%85%20Stars\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fnetwork\u002Fmembers\">\n        \u003Cimg alt=\"GitHub Repo forks\" src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fwenjiedu\u002Fpypots?logo=forgejo&logoColor=black&label=Forks\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fsonarcloud.io\u002Fcomponent_measures?id=WenjieDu_PyPOTS&metric=sqale_rating&view=list\">\n        \u003Cimg alt=\"maintainability\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_46aae964756d.png\">\n    \u003C\u002Fa>\n    \u003Ca 
href=\"https:\u002F\u002Fcoveralls.io\u002Fgithub\u002FWenjieDu\u002FPyPOTS?branch=full_test\">\n        \u003Cimg alt=\"Coveralls coverage\" src=\"https:\u002F\u002Fimg.shields.io\u002FcoverallsCoverage\u002Fgithub\u002FWenjieDu\u002FPyPOTS?branch=full_test&logo=coveralls&color=75C1C4&label=Coverage\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Factions\u002Fworkflows\u002Ftesting_ci.yml\">\n        \u003Cimg alt=\"GitHub Testing\" src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002Fwenjiedu\u002Fpypots\u002Ftesting_ci.yml?logo=circleci&color=C8D8E1&label=CI\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fdocs.pypots.com\">\n        \u003Cimg alt=\"Docs building\" src=\"https:\u002F\u002Fimg.shields.io\u002Freadthedocs\u002Fpypots?logo=readthedocs&label=Docs&logoColor=white&color=395260\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fdeepwiki.com\u002FWenjieDu\u002FPyPOTS\">\n        \u003Cimg alt=\"Ask DeepWiki\" src=\"https:\u002F\u002Fdeepwiki.com\u002Fbadge.svg\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpsf\u002Fblack\">\n        \u003Cimg alt=\"Code Style\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCode_Style-black-000000\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fanaconda.org\u002Fconda-forge\u002Fpypots\">\n        \u003Cimg alt=\"Conda downloads\" src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fdownloads_badges\u002Fconda_pypots_downloads.svg\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fpepy.tech\u002Fproject\u002Fpypots\">\n        \u003Cimg alt=\"PyPI downloads\" src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fdownloads_badges\u002Fpypi_pypots_downloads.svg\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.18811\">\n        \u003Cimg alt=\"arXiv DOI\" 
src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDOI-10.48550\u002FarXiv.2305.18811-F8F7F0\">\n    \u003C\u002Fa>\n\u003C\u002Fp>\n\n⦿ `Motivation`: Due to all kinds of reasons like failure of collection sensors, communication error,\nand unexpected malfunction, missing values are common to see in time series from the real-world environment.\nThis makes partially-observed time series (POTS) a pervasive problem in open-world modeling and prevents advanced\ndata analysis. Although this problem is important, the area of machine learning on POTS still lacks a dedicated toolkit.\nPyPOTS is created to fill in this blank.\n\n⦿ `Mission`: PyPOTS (pronounced \"Pie Pots\") is born to become a handy toolbox that is going to make machine learning on\nPOTS easy rather than tedious, to help engineers and researchers focus more on the core problems in their hands rather\nthan on how to deal with the missing parts in their data. PyPOTS will keep integrating classical and the latest\nstate-of-the-art machine learning algorithms for partially-observed multivariate time series. For sure, besides various\nalgorithms, PyPOTS is going to have unified APIs together with detailed documentation and interactive examples across\nalgorithms as tutorials.\n\n🤗 **Please** star this repo to help others notice PyPOTS if you think it is a useful toolkit.\n**Please** kindly [cite PyPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS#-citing-pypots) in your publications if it helps with\nyour research.\nThis really means a lot to our open-source research. 
Thank you!\n\nThe rest of this readme file is organized as follows:\n[**❖ Available Algorithms**](#-available-algorithms),\n[**❖ PyPOTS Ecosystem**](#-pypots-ecosystem),\n[**❖ Installation**](#-installation),\n[**❖ Usage**](#-usage),\n[**❖ Citing PyPOTS**](#-citing-pypots),\n[**❖ Contribution**](#-contribution),\n[**❖ Community**](#-community).\n\n## ❖ Available Algorithms\n\nPyPOTS supports imputation, classification, clustering, forecasting, and anomaly detection tasks on multivariate\npartially-observed time series with missing values. The table below shows the availability of each algorithm\n(sorted by Year) in PyPOTS for different tasks. The symbol `✅` indicates the algorithm is available for the\ncorresponding task (note that models will be continuously updated in the future to handle tasks that are not\ncurrently supported. Stay tuned❗️).\n\n🌟 Since **v0.2**, all neural-network models in PyPOTS has got hyperparameter-optimization support.\nThis functionality is implemented with the [Microsoft NNI](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fnni) framework. You may want to\nrefer to our time-series imputation survey and benchmark\nrepo [Awesome_Imputation](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FAwesome_Imputation)\nto see how to config and tune the hyperparameters.\n\n🔥 Note that all models whose name with `🧑‍🔧` in the table (e.g. Transformer, iTransformer, Informer etc.) are not\noriginally proposed as algorithms for POTS data in their papers, and they cannot directly accept time series with\nmissing values as input, let alone imputation. 
**To make them applicable to POTS data, we specifically apply the\nembedding strategy and training approach (ORT+MIT) the same as we did in\n[the SAITS paper](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2202.08516)[^1].**\n\nThe task types are abbreviated as follows:\n**`IMPU`**: Imputation;\n**`FORE`**: Forecasting;\n**`CLAS`**: Classification;\n**`CLUS`**: Clustering;\n**`ANOD`**: Anomaly Detection.\nIn addition to the 5 tasks, PyPOTS also provides TS2Vec[^48] for time series representation learning and vectorization.\nThe paper references and links are all listed at the bottom of this file.\n\n| **Type**      | **Algo**                                                                                                                                        | **IMPU** | **FORE** | **CLAS** | **CLUS** | **ANOD** | **Year - Venue**                                                                                         |\n|:--------------|:------------------------------------------------------------------------------------------------------------------------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:---------------------------------------------------------------------------------------------------------|\n| LLM&TSFM      | \u003Ca href=\"https:\u002F\u002Ftime-series.ai\">\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_a3729f3a1cbe.png\" width=\"26px\" align=\"center\">Time-Series.AI\u003C\u002Fa>  [^36] |    ✅     |    ✅     |    ✅     |    ✅     |    ✅     | \u003Ca href=\"https:\u002F\u002Fdocs.google.com\u002Fforms\u002Fd\u002F1Ff2ndYUFQEL3tIcwtcR8lWeopQ2vTXX6D_x8WGFKH6E\">Join waitlist\u003C\u002Fa> |\n| Neural Net    | MixLinear🧑‍🔧[^52]                                                                                                                             |          |    ✅     |          |          |          | `2026 - ICLR`                                                           
                                 |\n| Neural Net    | SegRNN🧑‍🔧[^43]                                                                                                                                |    ✅     |    ✅     |          |          |    ✅     | `2026 - IoT-J`                                                                                           |\n| Neural Net    | TEFN🧑‍🔧[^39]                                                                                                                                  |    ✅     |    ✅     |    ✅     |          |    ✅     | `2025 - TPAMI`                                                                                           |\n| Neural Net    | TimeMixer++[^49]                                                                                                                                |    ✅     |    ✅     |          |          |    ✅     | `2025 - ICLR`                                                                                            |\n| LLM           | Time-LLM🧑‍🔧[^45]                                                                                                                              |    ✅     |    ✅     |          |          |          | `2024 - ICLR`                                                                                            |\n| TSFM          | MOMENT[^47]                                                                                                                                     |    ✅     |    ✅     |          |          |          | `2024 - ICML`                                                                                            |\n| Neural Net    | TSLANet[^51]                                                                                                                                    |    ✅     |          |          |          |          | `2024 - ICML`                                                                                            |\n| 
Neural Net    | FITS🧑‍🔧[^41]                                                                                                                                  |    ✅     |    ✅     |          |          |          | `2024 - ICLR`                                                                                            |\n| Neural Net    | TimeMixer[^37]                                                                                                                                  |    ✅     |    ✅     |          |          |     ✅     | `2024 - ICLR`                                                                                            |\n| Neural Net    | iTransformer🧑‍🔧[^24]                                                                                                                          |    ✅     |          |    ✅     |          |    ✅      | `2024 - ICLR`                                                                                            |\n| Neural Net    | ModernTCN[^38]                                                                                                                                  |    ✅     |    ✅     |          |          |          | `2024 - ICLR`                                                                                            |\n| Neural Net    | ImputeFormer🧑‍🔧[^34]                                                                                                                          |    ✅     |          |          |          |    ✅     | `2024 - KDD`                                                                                             |\n| Neural Net    | TOTEM[^50]                                                                                                                                      |    ✅     |          |          |          |          | `2024 - TMLR`                                                                                            |\n| Neural Net    | TKAN[^54]                 
                                                                                                                      |    ✅     |          |          |          |          | `2024 - arXiv`                                                                                           |\n| Neural Net    | SAITS[^1]                                                                                                                                       |    ✅     |          |    ✅     |          |    ✅     | `2023 - ESWA`                                                                                            |\n| LLM           | GPT4TS[^46]                                                                                                                                     |    ✅     |    ✅     |          |          |          | `2023 - NeurIPS`                                                                                         |\n| Neural Net    | FreTS🧑‍🔧[^23]                                                                                                                                 |    ✅     |          |          |          |          | `2023 - NeurIPS`                                                                                         |\n| Neural Net    | Koopa🧑‍🔧[^29]                                                                                                                                 |    ✅     |          |          |          |          | `2023 - NeurIPS`                                                                                         |\n| Neural Net    | Crossformer🧑‍🔧[^16]                                                                                                                           |    ✅     |          |          |          |     ✅     | `2023 - ICLR`                                                                                            |\n| Neural Net    | TimesNet[^14]                                                        
                                                                           |    ✅     |    ✅     |    ✅     |          |    ✅     | `2023 - ICLR`                                                                                            |\n| Neural Net    | PatchTST🧑‍🔧[^18]                                                                                                                              |    ✅     |          |    ✅     |          |    ✅     | `2023 - ICLR`                                                                                            |\n| Neural Net    | ETSformer🧑‍🔧[^19]                                                                                                                             |    ✅     |          |          |          |     ✅     | `2023 - ICLR`                                                                                            |\n| Neural Net    | MICN🧑‍🔧[^27]                                                                                                                                  |    ✅     |    ✅     |          |          |          | `2023 - ICLR`                                                                                            |\n| Neural Net    | DLinear🧑‍🔧[^17]                                                                                                                               |    ✅     |    ✅     |          |          |    ✅     | `2023 - AAAI`                                                                                            |\n| Neural Net    | TiDE🧑‍🔧[^28]                                                                                                                                  |    ✅     |          |          |          |          | `2023 - TMLR`                                                                                            |\n| Neural Net    | CSAI[^42]                                                                                                           
                            |    ✅     |          |    ✅     |          |          | `2023 - arXiv`                                                                                           |\n| Neural Net    | TS2Vec[^48]                                                                                                                                     |          |          |    ✅     |          |          | `2022 - AAAI`                                                                                            |\n| Neural Net    | SCINet🧑‍🔧[^30]                                                                                                                                |    ✅     |          |          |          |    ✅     | `2022 - NeurIPS`                                                                                         |\n| Neural Net    | Nonstationary Tr.🧑‍🔧[^25]                                                                                                                     |    ✅     |          |          |          |     ✅     | `2022 - NeurIPS`                                                                                         |\n| Neural Net    | FiLM🧑‍🔧[^22]                                                                                                                                  |    ✅     |    ✅     |          |          |     ✅     | `2022 - NeurIPS`                                                                                         |\n| Neural Net    | RevIN_SCINet🧑‍🔧[^31]                                                                                                                          |    ✅     |          |          |          |          | `2022 - ICLR`                                                                                            |\n| Neural Net    | Pyraformer🧑‍🔧[^26]                                                                                                                            |    ✅     |      
    |          |          |     ✅     | `2022 - ICLR`                                                                                            |\n| Neural Net    | Raindrop[^5]                                                                                                                                    |          |          |    ✅     |          |          | `2022 - ICLR`                                                                                            |\n| Neural Net    | FEDformer🧑‍🔧[^20]                                                                                                                             |    ✅     |          |          |          |     ✅     | `2022 - ICML`                                                                                            |\n| Neural Net    | Autoformer🧑‍🔧[^15]                                                                                                                            |    ✅     |          |    ✅     |          |    ✅     | `2021 - NeurIPS`                                                                                         |\n| Neural Net    | CSDI[^12]                                                                                                                                       |    ✅     |    ✅     |          |          |          | `2021 - NeurIPS`                                                                                         |\n| Neural Net    | Informer🧑‍🔧[^21]                                                                                                                              |    ✅     |          |          |          |    ✅      | `2021 - AAAI`                                                                                            |\n| Neural Net    | US-GAN[^10]                                                                                                                                     |    ✅     |          |          |          |          | 
`2021 - AAAI`                                                                                            |\n| Neural Net    | CRLI[^6]                                                                                                                                        |          |          |          |    ✅     |          | `2021 - AAAI`                                                                                            |\n| Probabilistic | BTTF[^8]                                                                                                                                        |          |    ✅     |          |          |          | `2021 - TPAMI`                                                                                           |\n| Neural Net    | StemGNN🧑‍🔧[^33]                                                                                                                               |    ✅     |          |          |          |          | `2020 - NeurIPS`                                                                                         |\n| Neural Net    | SeFT[^53]                                                                                                                                       |          |          |    ✅     |          |          | `2020 - ICML`                                                                                            |\n| Neural Net    | Reformer🧑‍🔧[^32]                                                                                                                              |    ✅     |          |          |          |    ✅     | `2020 - ICLR`                                                                                            |\n| Neural Net    | GP-VAE[^11]                                                                                                                                     |    ✅     |          |          |          |          | `2020 - AISTATS`                          
                                                               |\n| Neural Net    | VaDER[^7]                                                                                                                                       |          |          |          |    ✅     |          | `2019 - GigaSci.`                                                                                        |\n| Neural Net    | M-RNN[^9]                                                                                                                                       |    ✅     |          |          |          |          | `2019 - TBME`                                                                                            |\n| Neural Net    | BRITS[^3]                                                                                                                                       |    ✅     |          |    ✅     |          |          | `2018 - NeurIPS`                                                                                         |\n| Neural Net    | GRU-D[^4]                                                                                                                                       |    ✅     |          |    ✅     |          |          | `2018 - Sci. 
Rep.`                                                                                       |\n| Neural Net    | TCN🧑‍🔧[^35]                                                                                                                                   |    ✅     |          |          |          |          | `2018 - arXiv`                                                                                           |\n| Neural Net    | Transformer🧑‍🔧[^2]                                                                                                                            |    ✅     |    ✅     |          |          |     ✅     | `2017 - NeurIPS`                                                                                         |\n| MF            | TRMF[^44]                                                                                                                                       |    ✅     |          |          |          |          | `2016 - NeurIPS`                                                                                         |\n| Naive         | Lerp[^40]                                                                                                                                       |    ✅     |          |          |          |          |                                                                                                          |\n| Naive         | LOCF\u002FNOCB                                                                                                                                       |    ✅     |          |          |          |          |                                                                                                          |\n| Naive         | Mean                                                                                                                                            |    ✅     |          |          |          |          |                                                  
                                                        |\n| Naive         | Median                                                                                                                                          |    ✅     |          |          |          |          |                                                                                                          |\n\n🙋 Differences between `LLM (Large Language Model)` and `TSFM (Time-Series Foundation Model)` in the above table:\n`LLM` refers to models pre-trained on large-scale text data that can be fine-tuned for specific tasks.\n`TSFM` refers to models pre-trained on large-scale time series data, inspired by recent achievements\nof foundation models in CV and NLP.\n\n💯 Contribute your model right now to increase your research impact! PyPOTS downloads are increasing rapidly\n(**[1M+ in total and 2k+ daily on PyPI so far](https:\u002F\u002Fwww.pepy.tech\u002Fprojects\u002Fpypots)**),\nand your work will be widely used and cited by the community.\nRefer to the [contribution guide](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS#-contribution) to see how to include your model in\nPyPOTS.\n\n## ❖ PyPOTS Ecosystem\n\nAt PyPOTS, everything revolves around coffee, something we know well. Yes, this is a coffee universe!\nAs you can see, there is a coffee pot in the PyPOTS logo. And what else? Please read on ;-)\n\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FTSDB">\n    \u003Cimg src="https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FTSDB\u002Flogo_FFBG.svg" align="left" width="140" alt="TSDB logo"\u002F>\n\u003C\u002Fa>\n\n👈 Time series datasets are taken as coffee beans at PyPOTS, and POTS datasets are incomplete coffee beans with missing\nparts that have their own meanings. 
To make various public time-series datasets readily available to users,\n\u003Ci>Time Series Data Beans (TSDB)\u003C\u002Fi> is created to make loading time-series datasets super easy!\nVisit [TSDB](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FTSDB) right now to learn more about this handy tool 🛠; it now supports a\ntotal of 172 open-source datasets!\n\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyGrinder">\n    \u003Cimg src="https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FPyGrinder\u002Flogo_FFBG.svg" align="right" width="140" alt="PyGrinder logo"\u002F>\n\u003C\u002Fa>\n\n👉 To simulate real-world data beans with missingness, the ecosystem library\n[PyGrinder](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyGrinder), a toolkit that grinds your coffee beans into incomplete ones, is\ncreated. Missing patterns fall into three categories according to Rubin's theory[^13]:\nMCAR (missing completely at random), MAR (missing at random), and MNAR (missing not at random).\nPyGrinder supports all of them, along with additional missingness-related functionality.\nWith PyGrinder, you can introduce synthetic missing values into your datasets with a single line of code.\n\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS">\n    \u003Cimg src="https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FBenchPOTS\u002Flogo_FFBG.svg" align="left" width="140" alt="BenchPOTS logo"\u002F>\n\u003C\u002Fa>\n\n👈 To fairly evaluate the performance of PyPOTS algorithms, the benchmarking suite\n[BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS) is created, which provides standard and unified data-preprocessing\npipelines to prepare datasets for measuring the performance of different POTS algorithms on various tasks.\n\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBrewPOTS">\n    \u003Cimg 
src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FBrewPOTS\u002Flogo_FFBG.svg\" align=\"right\" width=\"140\" alt=\"BrewPOTS logo\"\u002F>\n\u003C\u002Fa>\n\n👉 Now the beans, grinder, and pot are ready, please have a seat on the bench and let's think about how to brew us a cup\nof coffee. Tutorials are necessary! Considering the future workload, PyPOTS tutorials are released in a single repo,\nand you can find them in [BrewPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBrewPOTS).\nTake a look at it now, and learn how to brew your POTS datasets!\n\n\u003Cp align=\"center\">\n\u003Ca href=\"https:\u002F\u002Fpypots.com\u002Fecosystem\u002F\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_cff90719f8b8.png\" width=\"95%\"\u002F>\n\u003C\u002Fa>\n\u003Cbr>\n\u003Cb> ☕️ Welcome to the universe of PyPOTS. Enjoy it and have fun!\u003C\u002Fb>\n\u003C\u002Fp>\n\n## ❖ Installation\n\nYou can refer to [the installation instruction](https:\u002F\u002Fdocs.pypots.com\u002Fen\u002Flatest\u002Finstall.html) in PyPOTS documentation\nfor a guideline with more details.\n\nPyPOTS is available on both [PyPI](https:\u002F\u002Fpypi.python.org\u002Fpypi\u002Fpypots)\nand [Anaconda](https:\u002F\u002Fanaconda.org\u002Fconda-forge\u002Fpypots).\nYou can install PyPOTS like below as well as\n[TSDB](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FTSDB),[PyGrinder](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyGrinder),\n[BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS), and [AI4TS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FAI4TS):\n\n``` bash\n# via pip\npip install pypots            # the first time installation\npip install pypots --upgrade  # update pypots to the latest version\n# install from the latest source code with the latest features but may be not officially released yet\npip install https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Farchive\u002Fmain.zip\n\n# via 
conda\nconda install conda-forge::pypots  # first-time installation\nconda update conda-forge::pypots   # update pypots to the latest version\n\n# via docker\ndocker run -it --name pypots wenjiedu\u002Fpypots  # docker will auto-pull our built image and run an instance for you\n# once the container is up, you can run python inside it to access the well-configured environment for running pypots\n# if you'd like to detach from the container, press ctrl-P + ctrl-Q\n# run `docker attach pypots` to enter the container again\n```\n\n## ❖ Usage\n\nBesides [BrewPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBrewPOTS), you can also find a simple quick-start tutorial notebook\non Google Colab\n\u003Ca href="https:\u002F\u002Fcolab.research.google.com\u002Fdrive\u002F1HEFjylEy05-r47jRy0H9jiS_WhD0UWmQ">\n\u003Cimg src="https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGoogleColab-PyPOTS_Tutorials-F9AB00?logo=googlecolab&logoColor=white" alt="Colab tutorials" align="center"\u002F>\n\u003C\u002Fa>. 
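Before the full SAITS walkthrough below, here is a minimal self-contained NumPy sketch (the toy arrays are made up for illustration) of the evaluation-mask idea the tutorials rely on: positions that are NaN in the corrupted series but observed in the original are exactly the artificially-masked values, so error metrics are computed only there.

``` python
import numpy as np

# Hypothetical toy series: X_ori is the original with one genuinely missing
# value (index 1); X is a corrupted copy where index 2 is artificially masked.
X_ori = np.array([1.0, np.nan, 3.0, 4.0])
X = np.array([1.0, np.nan, np.nan, 4.0])

# XOR of the NaN masks marks values missing in X but present in X_ori,
# i.e. the artificially-removed positions that have ground truth.
indicating_mask = np.isnan(X) ^ np.isnan(X_ori)
print(indicating_mask)  # [False False  True False]

# A made-up imputation result; error is measured only where the mask is True.
imputation = np.array([1.0, 2.0, 2.5, 4.0])
mae = np.abs(imputation - np.nan_to_num(X_ori))[indicating_mask].mean()
print(mae)  # 0.5, from |2.5 - 3.0| at the single artificially-masked position
```

The same XOR construction appears as `indicating_mask` in the full example below, just applied to 3-dimensional `(n_samples, n_steps, n_features)` arrays.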
If you have further questions, please refer to the PyPOTS documentation at [docs.pypots.com](https:\u002F\u002Fdocs.pypots.com).\nYou can also [raise an issue](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues) or [ask in our community](#-community).\n\nBelow is a usage example of imputing missing values in time series with PyPOTS; click it to view.\n\n\u003Cdetails open>\n\u003Csummary>\u003Cb>Click here to see an example applying SAITS on PhysioNet2012 for imputation:\u003C\u002Fb>\u003C\u002Fsummary>\n\n``` python\nimport numpy as np\nfrom sklearn.preprocessing import StandardScaler\nfrom pygrinder import mcar, calc_missing_rate\nfrom benchpots.datasets import preprocess_physionet2012\ndata = preprocess_physionet2012(subset='set-a', rate=0.1)  # our ecosystem libs will automatically download and extract it\ntrain_X, val_X, test_X = data["train_X"], data["val_X"], data["test_X"]\nprint(train_X.shape)  # (n_samples, n_steps, n_features)\nprint(val_X.shape)  # samples (n_samples) in train set and val set are different, but they have the same sequence len (n_steps) and feature dim (n_features)\nprint(f"We have {calc_missing_rate(train_X):.1%} values missing in train_X")\ntrain_set = {"X": train_X}  # in the training set, simply put the incomplete time series into it\nval_set = {\n    "X": val_X,\n    "X_ori": data["val_X_ori"],  # in the validation set, we need ground truth for evaluation and picking the best model checkpoint\n}\ntest_set = {"X": test_X}  # in the test set, only give the incomplete time series for the model to impute\ntest_X_ori = data["test_X_ori"]  # test_X_ori bears ground truth for evaluation\nindicating_mask = np.isnan(test_X) ^ np.isnan(test_X_ori)  # mask indicates the values that are missing in X but not in X_ori, i.e. 
where the ground-truth values are\n\nfrom pypots.imputation import SAITS  # import the model you want to use\nfrom pypots.nn.functional import calc_mae\nsaits = SAITS(n_steps=train_X.shape[1], n_features=train_X.shape[2], n_layers=2, d_model=256, n_heads=4, d_k=64, d_v=64, d_ffn=128, dropout=0.1, epochs=5)\nsaits.fit(train_set, val_set)  # train the model on the dataset\nimputation = saits.impute(test_set)  # impute the originally-missing values and artificially-missing values\nmae = calc_mae(imputation, np.nan_to_num(test_X_ori), indicating_mask)  # calculate mean absolute error on the ground truth (artificially-missing values)\nsaits.save("save_it_here\u002Fsaits_physionet2012.pypots")  # save the model for future use\nsaits.load("save_it_here\u002Fsaits_physionet2012.pypots")  # reload the serialized model file for subsequent imputation or training\n```\n\n\u003C\u002Fdetails>\n\n## ❖ Citing PyPOTS\n\n> [!TIP]\n> **[Updates in Jun 2024]** 😎 The first comprehensive time-series imputation benchmark paper\n> [TSI-Bench: Benchmarking Time Series Imputation](https:\u002F\u002Farxiv.org\u002Fabs\u002F2406.12747) is now publicly available.\n> The code is open-sourced in the repo [Awesome_Imputation](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FAwesome_Imputation).\n> With nearly 35,000 experiments, we provide a comprehensive benchmarking study on 28 imputation methods, 3 missing\n> patterns (points, sequences, blocks),\n> various missing rates, and 8 real-world datasets.\n>\n> **[Updates in Feb 2024]** 🎉 Our survey\n> paper [Deep Learning for Multivariate Time Series Imputation: A Survey](https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.04059) has been\n> released on arXiv.\n> We comprehensively review the literature on state-of-the-art deep-learning imputation methods for time series,\n> provide a taxonomy for them, and discuss the challenges and future directions in this field.\n\nThe paper introducing PyPOTS is available [on
arXiv](https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.18811),\nand a short version of it was accepted by the 9th SIGKDD international workshop on Mining and Learning from Time\nSeries ([MiLeTS'23](https:\u002F\u002Fkdd-milets.github.io\u002Fmilets2023\u002F)).\n**Additionally**, PyPOTS has been included as a [PyTorch Ecosystem](https:\u002F\u002Flandscape.pytorch.org\u002F?item=modeling--specialized--pypots) project.\nWe are pursuing publication in prestigious academic venues, e.g. JMLR (track for\n[Machine Learning Open Source Software](https:\u002F\u002Fwww.jmlr.org\u002Fmloss\u002F)). If you use PyPOTS in your work,\nplease cite it as below and 🌟 star this repository to help others notice this library. 🤗\n\nScientific research projects are already using PyPOTS and referencing it in their papers.\nHere is [an incomplete list of them](https:\u002F\u002Fscholar.google.com\u002Fscholar?as_ylo=2022&q=%E2%80%9CPyPOTS%E2%80%9D&hl=en).\n\n```bibtex\n@article{du2023pypots,\ntitle = {{PyPOTS: A Python Toolkit for Data Mining on Partially-Observed Time Series}},\nauthor = {Wenjie Du},\njournal = {KDD 2023 MiLeTS},\nyear = {2023},\n}\n```\n\n```bibtex\n@article{du2025pypots,\ntitle = {{PyPOTS v1: A Python Toolkit for Machine Learning on Partially-Observed Time Series}},\nauthor = {Wenjie Du and Yiyuan Yang and Linglong Qian and Jun Wang and Qingsong Wen},\nyear = {2025},\n}\n```\n\n## ❖ Contribution\n\nYou're very welcome to contribute to this exciting project!\n\nBy committing your code, you'll\n\n1. 
make your well-established model available out-of-the-box for PyPOTS users to run,\n   and help your work obtain more exposure and impact.\n   Take a look at our [inclusion criteria](https:\u002F\u002Fdocs.pypots.com\u002Fen\u002Flatest\u002Ffaq.html#inclusion-criteria).\n   You can utilize the `template` folder in each task package (e.g.\n   [pypots\u002Fimputation\u002Ftemplate](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Ftree\u002Fmain\u002Fpypots\u002Fimputation\u002Ftemplate)) to get\n   started quickly;\n2. become one of [PyPOTS contributors](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fgraphs\u002Fcontributors) and\n   be listed as a volunteer developer [on the PyPOTS website](https:\u002F\u002Fpypots.com\u002Fabout\u002F#volunteer-developers);\n3. get mentioned in PyPOTS [release notes](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Freleases).\n\nYou can also contribute to PyPOTS by simply starring 🌟 this repo to help more people notice it.\nYour star is your recognition of PyPOTS, and it matters!\n\n\u003Cdetails open>\n\u003Csummary>\n    \u003Cb>\u003Ci>\n    👏 Click here to view PyPOTS stargazers and forkers.\u003Cbr>\n    We're so proud to have more and more awesome users, as well as more bright ✨stars:\n    \u003C\u002Fi>\u003C\u002Fb>\n\u003C\u002Fsummary>\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fstargazers">\n    \u003Cimg alt="PyPOTS stargazers" src="https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_860c71108f75.png">\n\u003C\u002Fa>\n\u003Cbr>\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fnetwork\u002Fmembers">\n    \u003Cimg alt="PyPOTS forkers" src="https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_ac4649852c23.png">\n\u003C\u002Fa>\n\u003C\u002Fdetails>\n\n👀 Check out a full list of our users' affiliations [on the PyPOTS website](https:\u002F\u002Fpypots.com\u002Fusers\u002F)!\n\n## ❖ 
Community\n\nWe care about feedback from our users, so we're building the PyPOTS community on\n\n- [Slack](https:\u002F\u002Fjoin.slack.com\u002Ft\u002Fpypots-org\u002Fshared_invite\u002Fzt-1gq6ufwsi-p0OZdW~e9UW_IA4_f1OfxA). General discussion,\n  Q&A, and our development team are here;\n- [LinkedIn](https:\u002F\u002Fwww.linkedin.com\u002Fcompany\u002Fpypots). Official announcements and news are here;\n- [WeChat (微信公众号)](https:\u002F\u002Fmp.weixin.qq.com\u002Fs\u002FX3ukIgL1QpNH8ZEXq1YifA). We also run a group chat on WeChat,\n  and you can get the QR code from the official account after following it.\n\nIf you have any suggestions, want to contribute ideas, or would like to share time-series related papers, join us and tell us.\nThe PyPOTS community is open, transparent, and friendly. Let's work together to build and improve PyPOTS!\n\n\n[\u002F\u002F]: # (Use APA reference style below)\n[^1]: Du, W., Cote, D., & Liu, Y. (2023).\n[SAITS: Self-Attention-based Imputation for Time Series](https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.eswa.2023.119619).\n*Expert systems with applications*.\n[^2]: Vaswani, A., Shazeer, N.M., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, L., & Polosukhin, I. (2017).\n[Attention is All you Need](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2017\u002Fhash\u002F3f5ee243547dee91fbd053c1c4a845aa-Abstract.html).\n*NeurIPS 2017*.\n[^3]: Cao, W., Wang, D., Li, J., Zhou, H., Li, L., & Li, Y. (2018).\n[BRITS: Bidirectional Recurrent Imputation for Time Series](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2018\u002Fhash\u002F734e6bfcd358e25ac1db0a4241b95651-Abstract.html).\n*NeurIPS 2018*.\n[^4]: Che, Z., Purushotham, S., Cho, K., Sontag, D.A., & Liu, Y. (2018).\n[Recurrent Neural Networks for Multivariate Time Series with Missing Values](https:\u002F\u002Fwww.nature.com\u002Farticles\u002Fs41598-018-24271-9).\n*Scientific Reports*.\n[^5]: Zhang, X., Zeman, M., Tsiligkaridis, T., & Zitnik, M. 
(2022).\n[Graph-Guided Network for Irregularly Sampled Multivariate Time Series](https:\u002F\u002Farxiv.org\u002Fabs\u002F2110.05357).\n*ICLR 2022*.\n[^6]: Ma, Q., Chen, C., Li, S., & Cottrell, G. W. (2021).\n[Learning Representations for Incomplete Time Series Clustering](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F17070).\n*AAAI 2021*.\n[^7]: Jong, J.D., Emon, M.A., Wu, P., Karki, R., Sood, M., Godard, P., Ahmad, A., Vrooman, H.A., Hofmann-Apitius, M., &\nFröhlich, H. (2019).\n[Deep learning for clustering of multivariate clinical patient trajectories with missing values](https:\u002F\u002Facademic.oup.com\u002Fgigascience\u002Farticle\u002F8\u002F11\u002Fgiz134\u002F5626377).\n*GigaScience*.\n[^8]: Chen, X., & Sun, L. (2021).\n[Bayesian Temporal Factorization for Multidimensional Time Series Prediction](https:\u002F\u002Farxiv.org\u002Fabs\u002F1910.06366).\n*IEEE transactions on pattern analysis and machine intelligence*.\n[^9]: Yoon, J., Zame, W. R., & van der Schaar, M. (2019).\n[Estimating Missing Data in Temporal Data Streams Using Multi-Directional Recurrent Neural Networks](https:\u002F\u002Fieeexplore.ieee.org\u002Fdocument\u002F8485748).\n*IEEE Transactions on Biomedical Engineering*.\n[^10]: Miao, X., Wu, Y., Wang, J., Gao, Y., Mao, X., & Yin, J. (2021).\n[Generative Semi-supervised Learning for Multivariate Time Series Imputation](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F17086).\n*AAAI 2021*.\n[^11]: Fortuin, V., Baranchuk, D., Raetsch, G. & Mandt, S. (2020).\n[GP-VAE: Deep Probabilistic Time Series Imputation](https:\u002F\u002Fproceedings.mlr.press\u002Fv108\u002Ffortuin20a.html).\n*AISTATS 2020*.\n[^12]: Tashiro, Y., Song, J., Song, Y., & Ermon, S. 
(2021).\n[CSDI: Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2021\u002Fhash\u002Fcfe8504bda37b575c70ee1a8276f3486-Abstract.html).\n*NeurIPS 2021*.\n[^13]: Rubin, D. B. (1976).\n[Inference and missing data](https:\u002F\u002Facademic.oup.com\u002Fbiomet\u002Farticle-abstract\u002F63\u002F3\u002F581\u002F270932).\n*Biometrika*.\n[^14]: Wu, H., Hu, T., Liu, Y., Zhou, H., Wang, J., & Long, M. (2023).\n[TimesNet: Temporal 2d-variation modeling for general time series analysis](https:\u002F\u002Fopenreview.net\u002Fforum?id=ju_Uqw384Oq).\n*ICLR 2023*.\n[^15]: Wu, H., Xu, J., Wang, J., & Long, M. (2021).\n[Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2021\u002Fhash\u002Fbcc0d400288793e8bdcd7c19a8ac0c2b-Abstract.html).\n*NeurIPS 2021*.\n[^16]: Zhang, Y., & Yan, J. (2023).\n[Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting](https:\u002F\u002Fopenreview.net\u002Fforum?id=vSVLM2j9eie).\n*ICLR 2023*.\n[^17]: Zeng, A., Chen, M., Zhang, L., & Xu, Q. (2023).\n[Are transformers effective for time series forecasting?](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F26317).\n*AAAI 2023*.\n[^18]: Nie, Y., Nguyen, N. H., Sinthong, P., & Kalagnanam, J. (2023).\n[A time series is worth 64 words: Long-term forecasting with transformers](https:\u002F\u002Fopenreview.net\u002Fforum?id=Jbdc0vTOcol).\n*ICLR 2023*.\n[^19]: Woo, G., Liu, C., Sahoo, D., Kumar, A., & Hoi, S. (2023).\n[ETSformer: Exponential Smoothing Transformers for Time-series Forecasting](https:\u002F\u002Fopenreview.net\u002Fforum?id=5m_3whfo483).\n*ICLR 2023*.\n[^20]: Zhou, T., Ma, Z., Wen, Q., Wang, X., Sun, L., & Jin, R. 
(2022).\n[FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting](https:\u002F\u002Fproceedings.mlr.press\u002Fv162\u002Fzhou22g.html).\n*ICML 2022*.\n[^21]: Zhou, H., Zhang, S., Peng, J., Zhang, S., Li, J., Xiong, H., & Zhang, W. (2021).\n[Informer: Beyond efficient transformer for long sequence time-series forecasting](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F17325).\n*AAAI 2021*.\n[^22]: Zhou, T., Ma, Z., Wen, Q., Sun, L., Yao, T., Yin, W., & Jin, R. (2022).\n[FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F524ef58c2bd075775861234266e5e020-Abstract-Conference.html).\n*NeurIPS 2022*.\n[^23]: Yi, K., Zhang, Q., Fan, W., Wang, S., Wang, P., He, H., An, N., Lian, D., Cao, L., & Niu, Z. (2023).\n[Frequency-domain MLPs are More Effective Learners in Time Series Forecasting](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2023\u002Fhash\u002Ff1d16af76939f476b5f040fd1398c0a3-Abstract-Conference.html).\n*NeurIPS 2023*.\n[^24]: Liu, Y., Hu, T., Zhang, H., Wu, H., Wang, S., Ma, L., & Long, M. (2024).\n[iTransformer: Inverted Transformers Are Effective for Time Series Forecasting](https:\u002F\u002Fopenreview.net\u002Fforum?id=JePfAI8fah).\n*ICLR 2024*.\n[^25]: Liu, Y., Wu, H., Wang, J., & Long, M. (2022).\n[Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F4054556fcaa934b0bf76da52cf4f92cb-Abstract-Conference.html).\n*NeurIPS 2022*.\n[^26]: Liu, S., Yu, H., Liao, C., Li, J., Lin, W., Liu, A. X., & Dustdar, S. 
(2022).\n[Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting](https:\u002F\u002Fopenreview.net\u002Fforum?id=0EXmFzUn5I).\n*ICLR 2022*.\n[^27]: Wang, H., Peng, J., Huang, F., Wang, J., Chen, J., & Xiao, Y. (2023).\n[MICN: Multi-scale Local and Global Context Modeling for Long-term Series Forecasting](https:\u002F\u002Fopenreview.net\u002Fforum?id=zt53IDUR1U).\n*ICLR 2023*.\n[^28]: Das, A., Kong, W., Leach, A., Mathur, S., Sen, R., & Yu, R. (2023).\n[Long-term Forecasting with TiDE: Time-series Dense Encoder](https:\u002F\u002Fopenreview.net\u002Fforum?id=pCbC3aQB5W).\n*TMLR 2023*.\n[^29]: Liu, Y., Li, C., Wang, J., & Long, M. (2023).\n[Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2023\u002Fhash\u002F28b3dc0970fa4624a63278a4268de997-Abstract-Conference.html).\n*NeurIPS 2023*.\n[^30]: Liu, M., Zeng, A., Chen, M., Xu, Z., Lai, Q., Ma, L., & Xu, Q. (2022).\n[SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F266983d0949aed78a16fa4782237dea7-Abstract-Conference.html).\n*NeurIPS 2022*.\n[^31]: Kim, T., Kim, J., Tae, Y., Park, C., Choi, J. H., & Choo, J. (2022).\n[Reversible Instance Normalization for Accurate Time-Series Forecasting against Distribution Shift](https:\u002F\u002Fopenreview.net\u002Fforum?id=cGDAkQo1C0p).\n*ICLR 2022*.\n[^32]: Kitaev, N., Kaiser, Ł., & Levskaya, A. (2020).\n[Reformer: The Efficient Transformer](https:\u002F\u002Fopenreview.net\u002Fforum?id=rkgNKkHtvB).\n*ICLR 2020*.\n[^33]: Cao, D., Wang, Y., Duan, J., Zhang, C., Zhu, X., Huang, C., Tong, Y., Xu, B., Bai, J., Tong, J., & Zhang, Q. 
(2020).\n[Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2020\u002Fhash\u002Fcdf6581cb7aca4b7e19ef136c6e601a5-Abstract.html).\n*NeurIPS 2020*.\n[^34]: Nie, T., Qin, G., Mei, Y., & Sun, J. (2024).\n[ImputeFormer: Low Rankness-Induced Transformers for Generalizable Spatiotemporal Imputation](https:\u002F\u002Farxiv.org\u002Fabs\u002F2312.01728).\n*KDD 2024*.\n[^35]: Bai, S., Kolter, J. Z., & Koltun, V. (2018).\n[An empirical evaluation of generic convolutional and recurrent networks for sequence modeling](https:\u002F\u002Farxiv.org\u002Fabs\u002F1803.01271).\n*arXiv 2018*.\n[^36]: Project Gungnir, the world's first LLM for time-series multitask modeling, will meet you soon. 🚀 Missing values and\nvariable lengths in your datasets? Hard to perform multitask learning with your time series? Not a problem any longer.\nJoin our waitlist right now to receive the latest news and be the first to try it when it's released!\n\u003Ca href="https:\u002F\u002Ftime-series.ai">\u003Cimg src="https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_a3729f3a1cbe.png" width="20px">Time-Series.AI\u003C\u002Fa>\n[^37]: Wang, S., Wu, H., Shi, X., Hu, T., Luo, H., Ma, L., ... & Zhou, J. (2024).\n[TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting](https:\u002F\u002Fopenreview.net\u002Fforum?id=7oLshfEIC2).\n*ICLR 2024*.\n[^38]: Luo, D., & Wang, X. (2024).\n[ModernTCN: A Modern Pure Convolution Structure for General Time Series Analysis](https:\u002F\u002Fopenreview.net\u002Fforum?id=vpJMJerXHU).\n*ICLR 2024*.\n[^39]: Zhan, T., He, Y., Deng, Y., Li, Z., Du, W., & Wen, Q. 
(2025).\n[Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting](https:\u002F\u002Fdoi.org\u002F10.1109\u002FTPAMI.2025.3596905).\n*TPAMI 2025*.\n[^40]: [Wikipedia: Linear interpolation](https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FLinear_interpolation)\n[^41]: Xu, Z., Zeng, A., & Xu, Q. (2024).\n[FITS: Modeling Time Series with 10k parameters](https:\u002F\u002Fopenreview.net\u002Fforum?id=bWcnvZ3qMb).\n*ICLR 2024*.\n[^42]: Qian, L., Ibrahim, Z., Ellis, H. L., Zhang, A., Zhang, Y., Wang, T., & Dobson, R. (2023).\n[Knowledge Enhanced Conditional Imputation for Healthcare Time-series](https:\u002F\u002Farxiv.org\u002Fabs\u002F2312.16713).\n*arXiv 2023*.\n[^43]: Lin, S., Lin, W., Wu, W., Zhao, F., Mo, R., & Zhang, H. (2026).\n[SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting](https:\u002F\u002Farxiv.org\u002Fabs\u002F2308.11200).\n*IEEE IoT-J 2026*.\n[^44]: Yu, H. F., Rao, N., & Dhillon, I. S. (2016).\n[Temporal regularized matrix factorization for high-dimensional time series prediction](https:\u002F\u002Fpapers.nips.cc\u002Fpaper_files\u002Fpaper\u002F2016\u002Fhash\u002F85422afb467e9456013a2a51d4dff702-Abstract.html).\n*NeurIPS 2016*.\n[^45]: Jin, M., Wang, S., Ma, L., Chu, Z., Zhang, J. Y., Shi, X., ... & Wen, Q. (2024).\n[Time-LLM: Time Series Forecasting by Reprogramming Large Language Models](https:\u002F\u002Fopenreview.net\u002Fforum?id=Unb5CVPtae).\n*ICLR 2024*.\n[^46]: Zhou, T., Niu, P., Sun, L., & Jin, R. (2023).\n[One Fits All: Power General Time Series Analysis by Pretrained LM](https:\u002F\u002Fopenreview.net\u002Fforum?id=gMS6FVZvmF).\n*NeurIPS 2023*.\n[^47]: Goswami, M., Szafer, K., Choudhry, A., Cai, Y., Li, S., & Dubrawski, A. (2024).\n[MOMENT: A Family of Open Time-series Foundation Models](https:\u002F\u002Fproceedings.mlr.press\u002Fv235\u002Fgoswami24a.html).\n*ICML 2024*.\n[^48]: Yue, Z., Wang, Y., Duan, J., Yang, T., Huang, C., Tong, Y., & Xu, B. 
(2022).\n[TS2Vec: Towards Universal Representation of Time Series](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F20881).\n*AAAI 2022*.\n[^49]: Wang, S., Li, J., Shi, X., Ye, Z., Mo, B., Lin, W., Ju, S., Chu, Z. & Jin, M. (2025).\n[TimeMixer++: A General Time Series Pattern Machine for Universal Predictive Analysis](https:\u002F\u002Fopenreview.net\u002Fforum?id=1CLzLXSFNn).\n*ICLR 2025*.\n[^50]: Talukder, S., Yue, Y., & Gkioxari, G. (2024).\n[TOTEM: TOkenized Time Series EMbeddings for General Time Series Analysis](https:\u002F\u002Fopenreview.net\u002Fforum?id=QlTLkH6xRC).\n*TMLR 2024*.\n[^51]: Eldele, E., Ragab, M., Chen, Z., Wu, M., & Li, X. (2024).\n[TSLANet: Rethinking Transformers for Time Series Representation Learning](https:\u002F\u002Fproceedings.mlr.press\u002Fv235\u002Feldele24a.html).\n*ICML 2024*.\n[^52]: Ma, A., Luo, D., & Sha, M. (2026).\n[MixLinear: Extreme Low Resource Multivariate Time Series Forecasting with 0.1K Parameters](https:\u002F\u002Fopenreview.net\u002Fforum?id=QUj0KuCumD).\n*ICLR 2026*.\n[^53]: Horn, M., Moor, M., Bock, C., Rieck, B., & Borgwardt, K. (2020).\n[Set Functions for Time Series](https:\u002F\u002Fproceedings.mlr.press\u002Fv119\u002Fhorn20a).\n*ICML 2020*.\n[^54]: Genet, R., & Inzirillo, H. 
(2024).\n[TKAN: Temporal Kolmogorov-Arnold Networks](https:\u002F\u002Farxiv.org\u002Fabs\u002F2405.07344).\n*arXiv 2024*.","\u003Cimg src="https:\u002F\u002Fpypots.com\u002Ffigs\u002Ficons\u002Ftranslate.svg" width="18"> [简体中文](\u002FREADME_zh.md) | English\n\n\u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS">\n    \u003Cimg src="https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FPyPOTS\u002Flogo_FFBG.svg" width="200" align="right">\n\u003C\u002Fa>\n\n\u003Ch3 align="center">Welcome to PyPOTS\u003C\u002Fh3>\n\n\u003Cp align="center">\u003Ci>A Python toolbox for machine learning on partially-observed time series\u003C\u002Fi>\u003C\u002Fp>\n\n\u003Cp align="center">\n    \u003Ca href="https:\u002F\u002Fdocs.pypots.com\u002Fen\u002Flatest\u002Finstall.html#reasons-of-version-limitations-on-dependencies">\n       \u003Cimg alt="Python version" src="https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPython-v3.8+-F8C6B5?logo=python&logoColor=white">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Flandscape.pytorch.org\u002F?item=modeling--specialized--pypots">\n        \u003Cimg alt="PyTorch Landscape" src="https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FPyTorch%20Landscape-EE4C2C?logo=pytorch&logoColor=white">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fblob\u002Fmain\u002FLICENSE">\n        \u003Cimg alt="BSD-3 license" src="https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FLicense-BSD--3-E9BB41?logo=opensourceinitiative&logoColor=white">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS?tab=coc-ov-file">\n        \u003Cimg alt="Code of Conduct" src="https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FContributor_Covenant-2.1-4baaaa">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS#-community">\n        \u003Cimg alt="Community" src="https:\u002F\u002Fimg.shields.io\u002Fbadge\u002Fjoin_us-community!-C8A062">\n    \u003C\u002Fa>\n 
   \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Freleases">\n        \u003Cimg alt="Latest release" src="https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fv\u002Frelease\u002Fwenjiedu\u002Fpypots?color=EE781F&include_prereleases&label=Release&logo=github&logoColor=white">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fgraphs\u002Fcontributors">\n        \u003Cimg alt="GitHub contributors" src="https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fcontributors\u002Fwenjiedu\u002Fpypots?color=D8E699&label=Contributors&logo=GitHub">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fstar-history.com\u002F#wenjiedu\u002Fpypots">\n        \u003Cimg alt="GitHub stars" src="https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fstars\u002Fwenjiedu\u002Fpypots?logo=None&color=6BB392&label=%E2%98%85%20Stars">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fnetwork\u002Fmembers">\n        \u003Cimg alt="GitHub forks" src="https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Fforks\u002Fwenjiedu\u002Fpypots?logo=forgejo&logoColor=black&label=Forks">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fsonarcloud.io\u002Fcomponent_measures?id=WenjieDu_PyPOTS&metric=sqale_rating&view=list">\n        \u003Cimg alt="Maintainability" src="https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_46aae964756d.png">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fcoveralls.io\u002Fgithub\u002FWenjieDu\u002FPyPOTS?branch=full_test">\n        \u003Cimg alt="Coveralls coverage" src="https:\u002F\u002Fimg.shields.io\u002FcoverallsCoverage\u002Fgithub\u002FWenjieDu\u002FPyPOTS?branch=full_test&logo=coveralls&color=75C1C4&label=Coverage">\n    \u003C\u002Fa>\n    \u003Ca href="https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Factions\u002Fworkflows\u002Ftesting_ci.yml">\n        \u003Cimg alt="GitHub tests" 
src=\"https:\u002F\u002Fimg.shields.io\u002Fgithub\u002Factions\u002Fworkflow\u002Fstatus\u002Fwenjiedu\u002Fpypots\u002Ftesting_ci.yml?logo=circleci&color=C8D8E1&label=CI\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fdocs.pypots.com\">\n        \u003Cimg alt=\"文档构建\" src=\"https:\u002F\u002Fimg.shields.io\u002Freadthedocs\u002Fpypots?logo=readthedocs&label=Docs&logoColor=white&color=395260\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fdeepwiki.com\u002FWenjieDu\u002FPyPOTS\">\n        \u003Cimg alt=\"Ask DeepWiki\" src=\"https:\u002F\u002Fdeepwiki.com\u002Fbadge.svg\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fpsf\u002Fblack\">\n        \u003Cimg alt=\"代码风格\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FCode_Style-black-000000\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fanaconda.org\u002Fconda-forge\u002Fpypots\">\n        \u003Cimg alt=\"Conda下载量\" src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fdownloads_badges\u002Fconda_pypots_downloads.svg\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Fpepy.tech\u002Fproject\u002Fpypots\">\n        \u003Cimg alt=\"PyPI下载量\" src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fdownloads_badges\u002Fpypi_pypots_downloads.svg\">\n    \u003C\u002Fa>\n    \u003Ca href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.18811\">\n        \u003Cimg alt=\"arXiv DOI\" src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FDOI-10.48550\u002FarXiv.2305.18811-F8F7F0\">\n    \u003C\u002Fa>\n\u003C\u002Fp>\n\n⦿ `动机`: 由于采集传感器故障、通信错误以及意外故障等各种原因，真实世界环境中的时间序列中经常出现缺失值。这使得部分观测时间序列（POTS）成为开放世界建模中普遍存在的问题，并阻碍了高级数据分析的进行。尽管这一问题非常重要，但针对 POTS 的机器学习领域仍然缺乏专门的工具包。PyPOTS 的诞生正是为了填补这一空白。\n\n⦿ `使命`: PyPOTS（发音为“派·波特”）旨在成为一个便捷的工具箱，让 POTS 上的机器学习变得简单而非繁琐，帮助工程师和研究人员将更多精力集中在手头的核心问题上，而不是如何处理数据中的缺失部分。PyPOTS 将持续整合经典及最新的先进机器学习算法，用于部分观测的多变量时间序列。当然，除了各种算法之外，PyPOTS 还将提供统一的 API，以及详细的文档和跨算法的交互式示例教程。\n\n🤗 **请**为本仓库点赞，如果您认为 PyPOTS 是一个有用的工具箱，请让更多人注意到它。\n**请**在您的出版物中友好地[引用 
PyPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS#-citing-pypots)，如果它对您的研究有所帮助。\n这对我们的开源研究意义重大。非常感谢！\n\n本自述文件的其余部分按以下顺序组织：\n[**❖ 可用算法**](#-available-algorithms),\n[**❖ PyPOTS 生态系统**](#-pypots-ecosystem),\n[**❖ 安装**](#-installation),\n[**❖ 使用**](#-usage),\n[**❖ 引用 PyPOTS**](#-citing-pypots),\n[**❖ 贡献**](#-contribution),\n[**❖ 社区**](#-community).\n\n## ❖ 可用算法\n\nPyPOTS 支持对含有缺失值的多变量部分观测时间序列进行插补、分类、聚类、预测和异常检测等任务。下表展示了 PyPOTS 中各算法（按年份排序）在不同任务中的可用性。符号 `✅` 表示该算法可用于相应任务（请注意，未来模型将持续更新以支持当前不支持的任务。敬请期待❗️）。\n\n🌟 自 **v0.2** 起，PyPOTS 中的所有神经网络模型均已支持超参数优化。此功能通过 [Microsoft NNI](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fnni) 框架实现。您可参考我们的时间序列插补综述与基准库 [Awesome_Imputation](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FAwesome_Imputation)，了解如何配置和调优超参数。\n\n🔥 请注意，表格中名称带有 `🧑‍🔧` 的所有模型（例如 Transformer、iTransformer、Informer 等）在其原论文中并非专门针对 POTS 数据提出的方法，且这些模型无法直接接受含有缺失值的时间序列作为输入，更不用说进行插补了。**为了使它们能够应用于 POTS 数据，我们特别采用了与 [SAITS 论文](https:\u002F\u002Farxiv.org\u002Fpdf\u002F2202.08516)[^1] 中相同的嵌入策略和训练方法（ORT+MIT）。**\n\n任务类型缩写如下：\n**`IMPU`**：插补；\n**`FORE`**：预测；\n**`CLAS`**：分类；\n**`CLUS`**：聚类；\n**`ANOD`**：异常检测。\n除了这 5 项任务外，PyPOTS 还提供了 TS2Vec[^48]，用于时间序列的表示学习和向量化。本文的参考文献及链接均列于本文件底部。\n\n| **类型**      | **算法**                                                                                                                                        | **IMPU** | **FORE** | **CLAS** | **CLUS** | **ANOD** | **年份 - 地点**                                                                                         |\n|:--------------|:------------------------------------------------------------------------------------------------------------------------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:---------------------------------------------------------------------------------------------------------|\n| LLM&TSFM      | \u003Ca href=\"https:\u002F\u002Ftime-series.ai\">\u003Cimg 
src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_a3729f3a1cbe.png\" width=\"26px\" align=\"center\">Time-Series.AI\u003C\u002Fa>  [^36] |    ✅     |    ✅     |    ✅     |    ✅     |    ✅     | \u003Ca href=\"https:\u002F\u002Fdocs.google.com\u002Fforms\u002Fd\u002F1Ff2ndYUFQEL3tIcwtcR8lWeopQ2vTXX6D_x8WGFKH6E\">加入等待名单\u003C\u002Fa> |\n| 神经网络    | MixLinear🧑‍🔧[^52]                                                                                                                             |          |    ✅     |          |          |          | `2026 - ICLR`                                                                                            |\n| 神经网络    | SegRNN🧑‍🔧[^43]                                                                                                                                |    ✅     |    ✅     |          |          |    ✅     | `2026 - IoT-J`                                                                                           |\n| 神经网络    | TEFN🧑‍🔧[^39]                                                                                                                                  |    ✅     |    ✅     |    ✅     |          |    ✅     | `2025 - TPAMI`                                                                                           |\n| 神经网络    | TimeMixer++[^49]                                                                                                                                |    ✅     |    ✅     |          |          |    ✅     | `2025 - ICLR`                                                                                            |\n| LLM           | Time-LLM🧑‍🔧[^45]                                                                                                                              |    ✅     |    ✅     |          |          |          | `2024 - ICLR`                                                                                            |\n| TSFM          | 
MOMENT[^47]                                                                                                                                     |    ✅     |    ✅     |          |          |          | `2024 - ICML`                                                                                            |\n| 神经网络    | TSLANet[^51]                                                                                                                                    |    ✅     |          |          |          |          | `2024 - ICML`                                                                                            |\n| 神经网络    | FITS🧑‍🔧[^41]                                                                                                                                  |    ✅     |    ✅     |          |          |          | `2024 - ICLR`                                                                                            |\n| 神经网络    | TimeMixer[^37]                                                                                                                                  |    ✅     |    ✅     |          |          |     ✅     | `2024 - ICLR`                                                                                            |\n| 神经网络    | iTransformer🧑‍🔧[^24]                                                                                                                          |    ✅     |          |    ✅     |          |    ✅      | `2024 - ICLR`                                                                                            |\n| 神经网络    | ModernTCN[^38]                                                                                                                                  |    ✅     |    ✅     |          |          |          | `2024 - ICLR`                                                                                            |\n| 神经网络    | ImputeFormer🧑‍🔧[^34]                                                        
                                                                  |    ✅     |          |          |          |    ✅     | `2024 - KDD`                                                                                             |\n| 神经网络    | TOTEM[^50]                                                                                                                                      |    ✅     |          |          |          |          | `2024 - TMLR`                                                                                            |\n| 神经网络    | TKAN[^54]                                                                                                                                       |    ✅     |          |          |          |          | `2024 - arXiv`                                                                                           |\n| 神经网络    | SAITS[^1]                                                                                                                                       |    ✅     |          |    ✅     |          |    ✅     | `2023 - ESWA`                                                                                            |\n| LLM           | GPT4TS[^46]                                                                                                                                     |    ✅     |    ✅     |          |          |          | `2023 - NeurIPS`                                                                                         |\n| 神经网络    | FreTS🧑‍🔧[^23]                                                                                                                                 |    ✅     |          |          |          |          | `2023 - NeurIPS`                                                                                         |\n| 神经网络    | Koopa🧑‍🔧[^29]                                                                                                                                 |    
✅     |          |          |          |          | `2023 - NeurIPS`                                                                                         |\n| 神经网络    | Crossformer🧑‍🔧[^16]                                                                                                                           |    ✅     |          |          |          |     ✅     | `2023 - ICLR`                                                                                            |\n| 神经网络    | TimesNet[^14]                                                                                                                                   |    ✅     |    ✅     |    ✅     |          |    ✅     | `2023 - ICLR`                                                                                            |\n| 神经网络    | PatchTST🧑‍🔧[^18]                                                                                                                              |    ✅     |          |    ✅     |          |    ✅     | `2023 - ICLR`                                                                                            |\n| 神经网络    | ETSformer🧑‍🔧[^19]                                                                                                                             |    ✅     |          |          |          |     ✅     | `2023 - ICLR`                                                                                            |\n| 神经网络    | MICN🧑‍🔧[^27]                                                                                                                                  |    ✅     |    ✅     |          |          |          | `2023 - ICLR`                                                                                            |\n| 神经网络    | DLinear🧑‍🔧[^17]                                                                                                                               |    ✅     |    ✅     |          |          |    ✅     | `2023 - AAAI`                 
                                                                           |\n| 神经网络    | TiDE🧑‍🔧[^28]                                                                                                                                  |    ✅     |          |          |          |          | `2023 - TMLR`                                                                                            |\n| 神经网络    | CSAI[^42]                                                                                                                                       |    ✅     |          |    ✅     |          |          | `2023 - arXiv`                                                                                           |\n| 神经网络    | TS2Vec[^48]                                                                                                                                     |          |          |    ✅     |          |          | `2022 - AAAI`                                                                                            |\n| 神经网络    | SCINet🧑‍🔧[^30]                                                                                                                                |    ✅     |          |          |          |    ✅     | `2022 - NeurIPS`                                                                                         |\n| 神经网络    | Nonstationary Tr.🧑‍🔧[^25]                                                                                                                     |    ✅     |          |          |          |     ✅     | `2022 - NeurIPS`                                                                                         |\n| 神经网络    | FiLM🧑‍🔧[^22]                                                                                                                                  |    ✅     |    ✅     |          |          |     ✅     | `2022 - NeurIPS`                                                                                         |\n| 
神经网络    | RevIN_SCINet🧑‍🔧[^31]                                                                                                                          |    ✅     |          |          |          |          | `2022 - ICLR`                                                                                            |\n| 神经网络    | Pyraformer🧑‍🔧[^26]                                                                                                                            |    ✅     |          |          |          |     ✅     | `2022 - ICLR`                                                                                            |\n| 神经网络    | Raindrop[^5]                                                                                                                                    |          |          |    ✅     |          |          | `2022 - ICLR`                                                                                            |\n| 神经网络    | FEDformer🧑‍🔧[^20]                                                                                                                             |    ✅     |          |          |          |     ✅     | `2022 - ICML`                                                                                            |\n| 神经网络    | Autoformer🧑‍🔧[^15]                                                                                                                            |    ✅     |          |    ✅     |          |    ✅     | `2021 - NeurIPS`                                                                                         |\n| 神经网络    | CSDI[^12]                                                                                                                                       |    ✅     |    ✅     |          |          |          | `2021 - NeurIPS`                                                                                         |\n| 神经网络    | Informer🧑‍🔧[^21]                                                      
                                                                        |    ✅     |          |          |          |    ✅      | `2021 - AAAI`                                                                                            |\n| 神经网络    | US-GAN[^10]                                                                                                                                     |    ✅     |          |          |          |          | `2021 - AAAI`                                                                                            |\n| 神经网络    | CRLI[^6]                                                                                                                                        |          |          |          |    ✅     |          | `2021 - AAAI`                                                                                            |\n| 概率模型    | BTTF[^8]                                                                                                                                        |          |    ✅     |          |          |          | `2021 - TPAMI`                                                                                           |\n| 神经网络    | StemGNN🧑‍🔧[^33]                                                                                                                               |    ✅     |          |          |          |          | `2020 - NeurIPS`                                                                                         |\n| 神经网络    | SeFT[^53]                                                                                                                                       |          |          |    ✅     |          |          | `2020 - ICML`                                                                                            |\n| 神经网络    | Reformer🧑‍🔧[^32]                                                                                                                              |    
✅     |          |          |          |    ✅     | `2020 - ICLR`                                                                                            |\n| 神经网络    | GP-VAE[^11]                                                                                                                                     |    ✅     |          |          |          |          | `2020 - AISTATS`                                                                                         |\n| 神经网络    | VaDER[^7]                                                                                                                                       |          |          |          |    ✅     |          | `2019 - GigaSci.`                                                                                        |\n| 神经网络    | M-RNN[^9]                                                                                                                                       |    ✅     |          |          |          |          | `2019 - TBME`                                                                                            |\n| 神经网络    | BRITS[^3]                                                                                                                                       |    ✅     |          |    ✅     |          |          | `2018 - NeurIPS`                                                                                         |\n| 神经网络    | GRU-D[^4]                                                                                                                                       |    ✅     |          |    ✅     |          |          | `2018 - Sci. 
Rep.`                                                                                       |\n| 神经网络    | TCN🧑‍🔧[^35]                                                                                                                                   |    ✅     |          |          |          |          | `2018 - arXiv`                                                                                           |\n| 神经网络    | Transformer🧑‍🔧[^2]                                                                                                                            |    ✅     |    ✅     |          |          |     ✅     | `2017 - NeurIPS`                                                                                         |\n| MF            | TRMF[^44]                                                                                                                                       |    ✅     |          |          |          |          | `2016 - NeurIPS`                                                                                         |\n| 简单方法    | Lerp[^40]                                                                                                                                       |    ✅     |          |          |          |          |                                                                                                          |\n| 简单方法    | LOCF\u002FNOCB                                                                                                                                       |    ✅     |          |          |          |          |                                                                                                          |\n| 简单方法    | 均值                                                                                                                                            |    ✅     |          |          |          |          |                                                                                  
                        |\n| 简单方法    | 中位数                                                                                                                                          |    ✅     |          |          |          |          |                                                                                                          |\n\n🙋 上表中 `LLM（大型语言模型）` 与 `TSFM（时间序列基础模型）` 的区别：\n`LLM` 指的是在大规模文本数据上预训练，并可针对特定任务进行微调的模型。\n`TSFM` 则是指受计算机视觉和自然语言处理领域基础模型最新成果启发，在大规模时间序列数据上预训练的模型。\n\n💯 立即贡献你的模型，提升你的研究影响力！PyPOTS 的下载量正迅速增长\n(**截至目前，PyPI 总下载量已超过 100 万次，日均下载量超过 2000 次**),\n你的工作将被社区广泛使用并引用。请参阅 [贡献指南](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS#-contribution),\n了解如何将你的模型纳入 PyPOTS 中。\n\n\n\n## ❖ PyPOTS 生态系统\n\n在 PyPOTS，一切都与我们熟悉的咖啡息息相关。没错，这里就是一个咖啡宇宙！\n正如你所见，PyPOTS 的 logo 中就有一个咖啡壶。那还有什么呢？请继续阅读 ;-)\n\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FTSDB\">\n    \u003Cimg src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FTSDB\u002Flogo_FFBG.svg\" align=\"left\" width=\"140\" alt=\"TSDB logo\"\u002F>\n\u003C\u002Fa>\n\n👈 在 PyPOTS 中，时间序列数据集被视为咖啡豆，而 POTS 数据集则是带有缺失部分的不完整咖啡豆，\n这些缺失本身也具有独特的意义。为了使各种公开的时间序列数据集更易于用户获取，\n\u003Ci>时间序列数据豆（TSDB）\u003C\u002Fi> 应运而生，让加载时间序列数据集变得无比简单！\n立即访问 [TSDB](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FTSDB) 了解更多关于这个便捷工具的信息 🛠，它目前已支持总计 172 个开源数据集！\n\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyGrinder\">\n    \u003Cimg src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FPyGrinder\u002Flogo_FFBG.svg\" align=\"right\" width=\"140\" alt=\"PyGrinder logo\"\u002F>\n\u003C\u002Fa>\n\n👉 为了模拟现实世界中存在缺失的数据，生态系统库\n[PyGrinder](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyGrinder) 被创建出来，它是一个帮助将你的咖啡豆研磨成不完整状态的工具包。\n根据 Robin 的理论[^13]，缺失模式可分为三类：MCAR（完全随机缺失）、MAR（随机缺失）和 MNAR（非随机缺失）。\nPyGrinder 支持所有这三种类型，同时还提供与缺失相关的其他功能。\n借助 PyGrinder，你只需一行代码就能为你的数据集引入合成缺失值。\n\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS\">\n    \u003Cimg 
src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FBenchPOTS\u002Flogo_FFBG.svg\" align=\"left\" width=\"140\" alt=\"BenchPOTS logo\"\u002F>\n\u003C\u002Fa>\n\n👈 为了公平评估 PyPOTS 算法的性能，基准测试套件\n[BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS) 被创建，它提供了标准且统一的数据预处理流程，\n用于准备数据集，以便在不同任务上衡量各类 POTS 算法的性能。\n\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBrewPOTS\">\n    \u003Cimg src=\"https:\u002F\u002Fpypots.com\u002Ffigs\u002Fpypots_logos\u002FBrewPOTS\u002Flogo_FFBG.svg\" align=\"right\" width=\"140\" alt=\"BrewPOTS logo\"\u002F>\n\u003C\u002Fa>\n\n👉 现在，咖啡豆、研磨机和咖啡壶都已准备就绪，请坐到长椅上，让我们一起思考如何为我们冲泡一杯咖啡吧。\n教程是必不可少的！考虑到未来的工作量，PyPOTS 教程被整合在一个仓库中发布，\n你可以在 [BrewPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBrewPOTS) 中找到它们。\n现在就来看看吧，学习如何“冲泡”你的 POTS 数据集！\n\n\u003Cp align=\"center\">\n\u003Ca href=\"https:\u002F\u002Fpypots.com\u002Fecosystem\u002F\">\n    \u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_cff90719f8b8.png\" width=\"95%\"\u002F>\n\u003C\u002Fa>\n\u003Cbr>\n\u003Cb> ☕️ 欢迎来到 PyPOTS 的宇宙。尽情享受，玩得开心吧！\u003C\u002Fb>\n\u003C\u002Fp>\n\n## ❖ 安装\n\n你可以参考 PyPOTS 文档中的 [安装说明](https:\u002F\u002Fdocs.pypots.com\u002Fen\u002Flatest\u002Finstall.html),\n以获取更详细的指导。\n\nPyPOTS 同时在 [PyPI](https:\u002F\u002Fpypi.python.org\u002Fpypi\u002Fpypots)\n和 [Anaconda](https:\u002F\u002Fanaconda.org\u002Fconda-forge\u002Fpypots) 上提供。\n你可以像安装 [TSDB](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FTSDB)、[PyGrinder](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyGrinder)、\n[BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS) 和 [AI4TS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FAI4TS) 一样安装 PyPOTS：\n\n``` bash\n# 使用 pip\npip install pypots            # 首次安装\npip install pypots --upgrade  # 更新至最新版本\n# 从最新源代码安装，包含尚未正式发布的最新功能\npip install https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Farchive\u002Fmain.zip\n\n# 使用 conda\nconda install conda-forge::pypots  # 
首次安装\nconda update  conda-forge::pypots  # 更新至最新版本\n\n# 使用 docker\ndocker run -it --name pypots wenjiedu\u002Fpypots  # docker 会自动拉取我们构建好的镜像并为你运行一个实例\n# 当一切就绪后，你可以在容器内运行 Python，进入已配置好环境的 PyPOTS 运行环境\n# 如果你想退出容器，按下 ctrl-P + ctrl-Q 即可\n# 再次输入 `docker attach pypots` 即可重新进入容器。\n```\n\n## ❖ 使用方法\n\n除了 [BrewPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBrewPOTS) 之外，您还可以在 Google Colab 上找到一个简单快捷的入门教程笔记本：\n\u003Ca href=\"https:\u002F\u002Fcolab.research.google.com\u002Fdrive\u002F1HEFjylEy05-r47jRy0H9jiS_WhD0UWmQ\">\n\u003Cimg src=\"https:\u002F\u002Fimg.shields.io\u002Fbadge\u002FGoogleColab-PyPOTS_Tutorials-F9AB00?logo=googlecolab&logoColor=white\" alt=\"Colab tutorials\" align=\"center\"\u002F>\n\u003C\u002Fa>。如果您有更多疑问，请参阅 PyPOTS 文档 [docs.pypots.com](https:\u002F\u002Fdocs.pypots.com)。您也可以 [提交问题](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues) 或在我们的社区中提问。\n\n下面我们将展示一个使用 PyPOTS 对时间序列进行缺失值插补的示例，您可以点击查看。\n\n\u003Cdetails open>\n\u003Csummary>\u003Cb>点击此处查看在 PhysioNet2012 数据集上应用 SAITS 进行插补的示例：\u003C\u002Fb>\u003C\u002Fsummary>\n\n``` python\nimport numpy as np\nfrom sklearn.preprocessing import StandardScaler\nfrom pygrinder import mcar, calc_missing_rate\nfrom benchpots.datasets import preprocess_physionet2012\ndata = preprocess_physionet2012(subset='set-a',rate=0.1) # 我们的生态库会自动下载并解压数据\ntrain_X, val_X, test_X = data[\"train_X\"], data[\"val_X\"], data[\"test_X\"]\nprint(train_X.shape)  # (n_samples, n_steps, n_features)\nprint(val_X.shape)  # 训练集和验证集中的样本数不同，但它们具有相同的序列长度 (n_steps) 和特征维度 (n_features)\nprint(f\"训练集中有 {calc_missing_rate(train_X):.1%} 的值缺失\")  \ntrain_set = {\"X\": train_X}  # 在训练集中，只需将不完整的时间序列放入其中\nval_set = {\n    \"X\": val_X,\n    \"X_ori\": data[\"val_X_ori\"],  # 在验证集中，我们需要真实标签来进行评估并选择最佳模型检查点\n}\ntest_set = {\"X\": test_X}  # 在测试集中，只需提供待插补的不完整时间序列\ntest_X_ori = data[\"test_X_ori\"]  # test_X_ori 包含用于评估的真实标签\nindicating_mask = np.isnan(test_X) ^ np.isnan(test_X_ori)  # 掩码指示那些在 X 中缺失但在 X_ori 中存在的值，即真实标签所在的位置\n\nfrom 
pypots.imputation import SAITS  # 导入您想要使用的模型\nfrom pypots.nn.functional import calc_mae\nsaits = SAITS(n_steps=train_X.shape[1], n_features=train_X.shape[2], n_layers=2, d_model=256, n_heads=4, d_k=64, d_v=64, d_ffn=128, dropout=0.1, epochs=5)\nsaits.fit(train_set, val_set)  # 在数据集上训练模型\nimputation = saits.impute(test_set)  # 插补原始缺失值和人为制造的缺失值\nmae = calc_mae(imputation, np.nan_to_num(test_X_ori), indicating_mask)  # 计算在真实标签上的平均绝对误差（针对人为制造的缺失值）\nsaits.save(\"save_it_here\u002Fsaits_physionet2012.pypots\")  # 保存模型以备将来使用\nsaits.load(\"save_it_here\u002Fsaits_physionet2012.pypots\")  # 加载序列化的模型文件，以便后续插补或训练\n```\n\n\u003C\u002Fdetails>\n\n## ❖ 引用 PyPOTS\n\n> [!TIP]\n> **[2024年6月更新]** 😎 第一篇全面的时间序列插补基准论文\n[TSI-Bench: Benchmarking Time Series Imputation](https:\u002F\u002Farxiv.org\u002Fabs\u002F2406.12747) 现已公开发布。\n> 代码已在仓库 [Awesome_Imputation](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FAwesome_Imputation) 中开源。\n> 通过近 35,000 次实验，我们对 28 种插补方法、3 种缺失模式（点、序列、块）、\n> 不同的缺失率以及 8 个真实世界数据集进行了全面的基准研究。\n>\n> **[2024年2月更新]** 🎉 我们的综述论文\n[Deep Learning for Multivariate Time Series Imputation: A Survey](https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.04059) 已在 arXiv 上发表。\n> 我们全面回顾了当前最先进的深度学习时间序列插补方法文献，\n> 为这些方法提供了分类体系，并讨论了该领域的挑战与未来发展方向。\n\n介绍 PyPOTS 的论文已在 [arXiv](https:\u002F\u002Farxiv.org\u002Fabs\u002F2305.18811) 上发布，\n其简短版本已被第九届 SIGKDD 国际时间序列挖掘与学习研讨会 ([MiLeTS'23](https:\u002F\u002Fkdd-milets.github.io\u002Fmilets2023\u002F)) 接受。\n**此外**，PyPOTS 已被纳入 [PyTorch 生态系统](https:\u002F\u002Flandscape.pytorch.org\u002F?item=modeling--specialized--pypots) 项目。\n我们正努力将其发表在诸如 JMLR（[机器学习开源软件](https:\u002F\u002Fwww.jmlr.org\u002Fmloss\u002F) 专题）等知名学术期刊上。如果您在工作中使用了 PyPOTS，请按照以下方式引用，并🌟星标本仓库，让更多人注意到这个库。🤗\n\n目前已有科学研究项目使用了 PyPOTS，并在其论文中进行了引用。\n以下是 [一份不完全列表](https:\u002F\u002Fscholar.google.com\u002Fscholar?as_ylo=2022&q=%E2%80%9CPyPOTS%E2%80%9D&hl=en)。\n\n```bibtex\n@article{du2023pypots,\ntitle = {{PyPOTS: A Python Toolbox for Data Mining on Partially-Observed Time Series}},\nauthor = {Wenjie Du},\njournal = {KDD 2023 MiLeTS},\nyear = 
{2023},\n}\n```\n\n```bibtex\n@article{du2025pypots,\ntitle = {{PyPOTS v1: A Python Toolbox for Machine Learning on Partially-Observed Time Series}},\nauthor = {Wenjie Du and Yiyuan Yang and Linglong Qian and Jun Wang and Qingsong Wen},\nyear = {2025},\n}\n```\n\n## ❖ 贡献\n\n非常欢迎您为这个令人兴奋的项目做出贡献！\n\n通过提交您的代码，您将：\n\n1. 将您成熟的模型直接集成到 PyPOTS 中，供用户开箱即用，\n   并帮助您的工作获得更多的曝光和影响力。请参阅我们的 [入选标准](https:\u002F\u002Fdocs.pypots.com\u002Fen\u002Flatest\u002Ffaq.html#inclusion-criteria)。\n   您可以利用每个任务包中的 `template` 文件夹（例如\n   [pypots\u002Fimputation\u002Ftemplate](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Ftree\u002Fmain\u002Fpypots\u002Fimputation\u002Ftemplate)）快速上手；\n2. 成为 [PyPOTS 贡献者](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fgraphs\u002Fcontributors) 之一，\n   并在 [PyPOTS 官网](https:\u002F\u002Fpypots.com\u002Fabout\u002F#volunteer-developers) 上列为志愿者开发者；\n3. 在 PyPOTS 的 [发布说明](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Freleases) 中被提及。\n\n您也可以通过简单地🌟星标本仓库来为 PyPOTS 做出贡献，这有助于让更多人注意到它。\n您的星标是对 PyPOTS 的认可，意义重大！\n\n\u003Cdetails open>\n\u003Csummary>\n    \u003Cb>\u003Ci>\n    👏 点击此处查看 PyPOTS 的 Star 用户与 Fork。\u003Cbr>\n    我们为越来越多优秀的用户感到自豪，同时也涌现出更多闪耀的 ✨明星：\n    \u003C\u002Fi>\u003C\u002Fb>\n\u003C\u002Fsummary>\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fstargazers\">\n    \u003Cimg alt=\"PyPOTS 星标用户\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_860c71108f75.png\">\n\u003C\u002Fa>\n\u003Cbr>\n\u003Ca href=\"https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fnetwork\u002Fmembers\">\n    \u003Cimg alt=\"PyPOTS Fork 用户\" src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_ac4649852c23.png\">\n\u003C\u002Fa>\n\u003C\u002Fdetails>\n\n👀 请访问 [PyPOTS 官网](https:\u002F\u002Fpypots.com\u002Fusers\u002F) 查看我们用户的完整名单！\n\n## ❖ 社区\n\n我们非常重视用户的反馈，因此正在构建 PyPOTS 社区。\n\n- 
[Slack](https:\u002F\u002Fjoin.slack.com\u002Ft\u002Fpypots-org\u002Fshared_invite\u002Fzt-1gq6ufwsi-p0OZdW~e9UW_IA4_f1OfxA)。这里进行社区讨论、问答交流，我们的开发团队也在其中；\n- [LinkedIn](https:\u002F\u002Fwww.linkedin.com\u002Fcompany\u002Fpypots)。官方公告和最新动态发布在这里；\n- [微信公众号](https:\u002F\u002Fmp.weixin.qq.com\u002Fs\u002FX3ukIgL1QpNH8ZEXq1YifA)。我们还在微信上建立了交流群，关注公众号后即可获取二维码。\n\n如果你有任何建议、想贡献创意或分享时序相关论文，欢迎加入我们并告诉我们！\nPyPOTS 社区开放、透明且友好。让我们携手共建、共同完善 PyPOTS 吧！\n\n[\u002F\u002F]: # (请使用以下APA引用格式)\n[^1]: Du, W., Cote, D., & Liu, Y. (2023).\n[SAITS: Self-Attention-based Imputation for Time Series](https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.eswa.2023.119619).\n*Expert Systems with Applications*.\n[^2]: Vaswani, A., Shazeer, N.M., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, L., & Polosukhin, I. (2017).\n[Attention is All you Need](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2017\u002Fhash\u002F3f5ee243547dee91fbd053c1c4a845aa-Abstract.html).\n*NeurIPS 2017*.\n[^3]: Cao, W., Wang, D., Li, J., Zhou, H., Li, L., & Li, Y. (2018).\n[BRITS: Bidirectional Recurrent Imputation for Time Series](https:\u002F\u002Fpapers.nips.cc\u002Fpaper\u002F2018\u002Fhash\u002F734e6bfcd358e25ac1db0a4241b95651-Abstract.html).\n*NeurIPS 2018*.\n[^4]: Che, Z., Purushotham, S., Cho, K., Sontag, D.A., & Liu, Y. (2018).\n[Recurrent Neural Networks for Multivariate Time Series with Missing Values](https:\u002F\u002Fwww.nature.com\u002Farticles\u002Fs41598-018-24271-9).\n*Scientific Reports*.\n[^5]: Zhang, X., Zeman, M., Tsiligkaridis, T., & Zitnik, M. (2022).\n[Graph-Guided Network for Irregularly Sampled Multivariate Time Series](https:\u002F\u002Farxiv.org\u002Fabs\u002F2110.05357).\n*ICLR 2022*.\n[^6]: Ma, Q., Chen, C., Li, S., & Cottrell, G. W. (2021).\n[Learning Representations for Incomplete Time Series Clustering](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F17070).\n*AAAI 2021*.\n[^7]: Jong, J.D., Emon, M.A., Wu, P., Karki, R., Sood, M., Godard, P., Ahmad, A., Vrooman, H.A., Hofmann-Apitius, M., &\nFröhlich, H. 
(2019).\n[深度学习在含缺失值的多变量临床患者轨迹聚类中的应用](https:\u002F\u002Facademic.oup.com\u002Fgigascience\u002Farticle\u002F8\u002F11\u002Fgiz134\u002F5626377)。\n*GigaScience*。\n[^8]: Chen, X., & Sun, L. (2021).\n[用于多维时间序列预测的贝叶斯时序分解](https:\u002F\u002Farxiv.org\u002Fabs\u002F1910.06366)。\n*IEEE Transactions on Pattern Analysis and Machine Intelligence*。\n[^9]: Yoon, J., Zame, W. R., & van der Schaar, M. (2019).\n[利用多方向循环神经网络估计时间数据流中的缺失数据](https:\u002F\u002Fieeexplore.ieee.org\u002Fdocument\u002F8485748)。\n*IEEE Transactions on Biomedical Engineering*。\n[^10]: Miao, X., Wu, Y., Wang, J., Gao, Y., Mao, X., & Yin, J. (2021).\n[用于多变量时间序列插补的生成式半监督学习](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F17086)。\n*AAAI 2021*。\n[^11]: Fortuin, V., Baranchuk, D., Raetsch, G., & Mandt, S. (2020).\n[GP-VAE：深度概率时间序列插补](https:\u002F\u002Fproceedings.mlr.press\u002Fv108\u002Ffortuin20a.html)。\n*AISTATS 2020*。\n[^12]: Tashiro, Y., Song, J., Song, Y., & Ermon, S. (2021).\n[CSDI：用于概率性时间序列插补的条件得分扩散模型](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2021\u002Fhash\u002Fcfe8504bda37b575c70ee1a8276f3486-Abstract.html)。\n*NeurIPS 2021*。\n[^13]: Rubin, D. B. (1976).\n[推断与缺失数据](https:\u002F\u002Facademic.oup.com\u002Fbiomet\u002Farticle-abstract\u002F63\u002F3\u002F581\u002F270932)。\n*Biometrika*。\n[^14]: Wu, H., Hu, T., Liu, Y., Zhou, H., Wang, J., & Long, M. (2023).\n[TimesNet：用于通用时间序列分析的时序二维变体建模](https:\u002F\u002Fopenreview.net\u002Fforum?id=ju_Uqw384Oq)。\n*ICLR 2023*。\n[^15]: Wu, H., Xu, J., Wang, J., & Long, M. (2021).\n[Autoformer：具有自相关机制的分解Transformer，用于长期序列预测](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2021\u002Fhash\u002Fbcc0d400288793e8bdcd7c19a8ac0c2b-Abstract.html)。\n*NeurIPS 2021*。\n[^16]: Zhang, Y., & Yan, J. (2023).\n[Crossformer：利用跨维度依赖关系进行多变量时间序列预测的Transformer](https:\u002F\u002Fopenreview.net\u002Fforum?id=vSVLM2j9eie)。\n*ICLR 2023*。\n[^17]: Zeng, A., Chen, M., Zhang, L., & Xu, Q. 
(2023).\n[Transformer是否适用于时间序列预测？](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F26317)。\n*AAAI 2023*。\n[^18]: Nie, Y., Nguyen, N. H., Sinthong, P., & Kalagnanam, J. (2023).\n[一个时间序列相当于64个词：使用Transformer进行长期预测](https:\u002F\u002Fopenreview.net\u002Fforum?id=Jbdc0vTOcol)。\n*ICLR 2023*。\n[^19]: Woo, G., Liu, C., Sahoo, D., Kumar, A., & Hoi, S. (2023).\n[ETSformer：用于时间序列预测的指数平滑Transformer](https:\u002F\u002Fopenreview.net\u002Fforum?id=5m_3whfo483)。\n*ICLR 2023*。\n[^20]: Zhou, T., Ma, Z., Wen, Q., Wang, X., Sun, L., & Jin, R. (2022).\n[FEDformer：用于长期序列预测的频率增强型分解Transformer](https:\u002F\u002Fproceedings.mlr.press\u002Fv162\u002Fzhou22g.html)。\n*ICML 2022*。\n[^21]: Zhou, H., Zhang, S., Peng, J., Zhang, S., Li, J., Xiong, H., & Zhang, W. (2021).\n[Informer：超越高效Transformer的长序列时间序列预测](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F17325)。\n*AAAI 2021*。\n[^22]: Zhou, T., Ma, Z., Wen, Q., Sun, L., Yao, T., Yin, W., & Jin, R. (2022).\n[FiLM：用于长期时间序列预测的频率改进型勒让德记忆模型](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F524ef58c2bd075775861234266e5e020-Abstract-Conference.html)。\n*NeurIPS 2022*。\n[^23]: Yi, K., Zhang, Q., Fan, W., Wang, S., Wang, P., He, H., An, N., Lian, D., Cao, L., & Niu, Z. (2023).\n[频域MLP在时间序列预测中是更有效的学习者](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2023\u002Fhash\u002Ff1d16af76939f476b5f040fd1398c0a3-Abstract-Conference.html)。\n*NeurIPS 2023*。\n[^24]: Liu, Y., Hu, T., Zhang, H., Wu, H., Wang, S., Ma, L., & Long, M. (2024).\n[iTransformer：反转Transformer在时间序列预测中的有效性](https:\u002F\u002Fopenreview.net\u002Fforum?id=JePfAI8fah)。\n*ICLR 2024*。\n[^25]: Liu, Y., Wu, H., Wang, J., & Long, M. 
(2022).\n[非平稳Transformer：探索时间序列预测中的平稳性](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F4054556fcaa934b0bf76da52cf4f92cb-Abstract-Conference.html)。\n*NeurIPS 2022*。\n[^26]: Liu, S., Yu, H., Liao, C., Li, J., Lin, W., Liu, A. X., & Dustdar, S. (2022).\n[Pyraformer：用于长距离时间序列建模和预测的低复杂度金字塔注意力](https:\u002F\u002Fopenreview.net\u002Fforum?id=0EXmFzUn5I)。\n*ICLR 2022*。\n[^27]: Wang, H., Peng, J., Huang, F., Wang, J., Chen, J., & Xiao, Y. (2023).\n[MICN：用于长期序列预测的多尺度局部与全局上下文建模](https:\u002F\u002Fopenreview.net\u002Fforum?id=zt53IDUR1U)。\n*ICLR 2023*。\n[^28]: Das, A., Kong, W., Leach, A., Mathur, S., Sen, R., & Yu, R. (2023).\n[使用TiDE：时间序列密集编码器进行长期预测](https:\u002F\u002Fopenreview.net\u002Fforum?id=pCbC3aQB5W)。\n*TMLR 2023*。\n[^29]: Liu, Y., Li, C., Wang, J., & Long, M. (2023).\n[Koopa：利用库普曼预测器学习非平稳时间序列动态](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2023\u002Fhash\u002F28b3dc0970fa4624a63278a4268de997-Abstract-Conference.html)。\n*NeurIPS 2023*。\n[^30]: Liu, M., Zeng, A., Chen, M., Xu, Z., Lai, Q., Ma, L., & Xu, Q. (2022).\n[SCINet：利用样本卷积与交互进行时间序列建模和预测](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper_files\u002Fpaper\u002F2022\u002Fhash\u002F266983d0949aed78a16fa4782237dea7-Abstract-Conference.html)。\n*NeurIPS 2022*。\n[^31]: Kim, T., Kim, J., Tae, Y., Park, C., Choi, J. H., & Choo, J. (2022).\n[可逆实例归一化用于应对分布漂移的精确时间序列预测](https:\u002F\u002Fopenreview.net\u002Fforum?id=cGDAkQo1C0p)。\n*ICLR 2022*。\n[^32]: Kitaev, N., Kaiser Ł., & Levskaya A. (2020).\n[Reformer：高效的Transformer](https:\u002F\u002Fopenreview.net\u002Fforum?id=rkgNKkHtvB)。\n*ICLR 2020*。\n[^33]: Cao, D., Wang, Y., Duan, J., Zhang, C., Zhu, X., Huang, C., Tong, Y., Xu, B., Bai, J., Tong, J., & Zhang, Q. (\n2020).\n[用于多变量时间序列预测的谱时序图神经网络](https:\u002F\u002Fproceedings.neurips.cc\u002Fpaper\u002F2020\u002Fhash\u002Fcdf6581cb7aca4b7e19ef136c6e601a5-Abstract.html)。\n*NeurIPS 2020*。\n[^34]: Nie, T., Qin, G., Mei, Y., & Sun, J. 
(2024).\n[ImputeFormer：由低秩性诱导的Transformer，用于可泛化的时空插补](https:\u002F\u002Farxiv.org\u002Fabs\u002F2312.01728)。\n*KDD 2024*。\n[^35]: Bai, S., Kolter, J. Z., & Koltun, V. (2018).\n[对通用卷积和循环网络用于序列建模的实证评估](https:\u002F\u002Farxiv.org\u002Fabs\u002F1803.01271)。\n*arXiv 2018*。\n[^36]: 项目“冈尼尔”，全球首个用于时间序列多任务建模的大语言模型，即将与您见面。🚀 您的数据集中存在缺失值和不同长度的序列吗？难以用您的时间序列进行多任务学习吗？这些问题将不再困扰您。立即加入我们的等待名单，获取最新消息，并在发布时第一时间试用！\n\u003Ca href=\"https:\u002F\u002Ftime-series.ai\">\u003Cimg src=\"https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_readme_a3729f3a1cbe.png\" width=\"20px\">Time-Series.AI\u003C\u002Fa>\n[^37]: Wang, S., Wu, H., Shi, X., Hu, T., Luo, H., Ma, L., ... & ZHOU, J. (2024).\n[TimeMixer：用于时间序列预测的可分解多尺度混合](https:\u002F\u002Fopenreview.net\u002Fforum?id=7oLshfEIC2)。\n*ICLR 2024*。\n[^38]: Luo, D., & Wang X. (2024).\n[ModernTCN：一种用于通用时间序列分析的现代纯卷积结构](https:\u002F\u002Fopenreview.net\u002Fforum?id=vpJMJerXHU)。\n*ICLR 2024*。\n[^39]: Zhan, T., He, Y., Deng, Y., Li, Z., Du, W., & Wen, Q. (2025).\n[时间证据融合网络：长期时间序列预测中的多源视角](https:\u002F\u002Fdoi.org\u002F10.1109\u002FTPAMI.2025.3596905)。\n*TPAMI 2025*。\n[^40]: [维基百科：线性插值](https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FLinear_interpolation)\n[^41]: Xu, Z., Zeng, A., & Xu, Q. (2024).\n[FITS：用1万个参数建模时间序列](https:\u002F\u002Fopenreview.net\u002Fforum?id=bWcnvZ3qMb)。\n*ICLR 2024*。\n[^42]: Qian, L., Ibrahim, Z., Ellis, H. L., Zhang, A., Zhang, Y., Wang, T., & Dobson, R. (2023).\n[知识增强型医疗时间序列条件插补](https:\u002F\u002Farxiv.org\u002Fabs\u002F2312.16713)。\n*arXiv 2023*。\n[^43]: Lin, S., Lin, W., Wu, W., Zhao, F., Mo, R., & Zhang, H. (2026).\n[SegRNN：用于长期时间序列预测的分段循环神经网络](https:\u002F\u002Farxiv.org\u002Fabs\u002F2308.11200)。\n*IEEE IoT-J 2026*。\n[^44]: Yu, H. F., Rao, N., & Dhillon, I. S. (2016).\n[用于高维时间序列预测的时序正则化矩阵分解](https:\u002F\u002Fpapers.nips.cc\u002Fpaper_files\u002Fpaper\u002F2016\u002Fhash\u002F85422afb467e9456013a2a51d4dff702-Abstract.html)。\n*NeurIPS 2016*。\n[^45]: Jin, M., Wang, S., Ma, L., Chu, Z., Zhang, J. 
Y., Shi, X., ... & Wen, Q. (2024).\n[Time-LLM：通过重编程大型语言模型进行时间序列预测](https:\u002F\u002Fopenreview.net\u002Fforum?id=Unb5CVPtae)。\n*ICLR 2024*。\n[^46]: Zhou, T., Niu, P., Sun, L., & Jin, R. (2023).\n[One Fits All：通过预训练的语言模型实现强大的通用时间序列分析](https:\u002F\u002Fopenreview.net\u002Fforum?id=gMS6FVZvmF)。\n*NeurIPS 2023*。\n[^47]: Goswami, M., Szafer, K., Choudhry, A., Cai, Y., Li, S., & Dubrawski, A. (2024).\n[MOMENT：一系列开放的时间序列基础模型](https:\u002F\u002Fproceedings.mlr.press\u002Fv235\u002Fgoswami24a.html)。\n*ICML 2024*。\n[^48]: Yue, Z., Wang, Y., Duan, J., Yang, T., Huang, C., Tong, Y., & Xu, B. (2022).\n[TS2Vec：迈向时间序列的通用表示](https:\u002F\u002Fojs.aaai.org\u002Findex.php\u002FAAAI\u002Farticle\u002Fview\u002F20881)。\n*AAAI 2022*。\n[^49]: Wang, S., Li, J., Shi, X., Ye, Z., Mo, B., Lin, W., Ju, S., Chu, Z. & Jin, M. (2025).\n[TimeMixer++：一种用于通用预测分析的通用时间序列模式机器](https:\u002F\u002Fopenreview.net\u002Fforum?id=1CLzLXSFNn)。\n*ICLR 2025*。\n[^50]: Talukder, S., Yue, Y., & Gkioxari G. (2024).\n[TOTEM：用于通用时间序列分析的标记化时间序列嵌入](https:\u002F\u002Fopenreview.net\u002Fforum?id=QlTLkH6xRC)。\n*TMLR 2024*。\n[^51]: Eldele, E., Ragab, M., Chen, Z., Wu, M., & Li, X. (2024).\n[TSLANet：重新思考Transformer在时间序列表征学习中的作用](https:\u002F\u002Fproceedings.mlr.press\u002Fv235\u002Feldele24a.html)。\n*ICML 2024*。\n[^52]: Ma, A., Luo, D., & Sha, M. (2026).\n[MixLinear：用0.1千个参数进行极端低资源的多变量时间序列预测](https:\u002F\u002Fopenreview.net\u002Fforum?id=QUj0KuCumD)。\n*ICLR 2026*。\n[^53]: Horn, M., Moor, M., Bock, C., Rieck, B., & Borgwardt, K. (2020).\n[用于时间序列的集合函数](https:\u002F\u002Fproceedings.mlr.press\u002Fv119\u002Fhorn20a.html)。\n*ICML 2020*。\n[^54]: Genet, R., & Inzirillo, H. (2024).\n[TKAN：时间科尔莫戈罗夫-阿诺德网络](https:\u002F\u002Farxiv.org\u002Fabs\u002F2405.07344)。\n*arXiv 2024*。","# PyPOTS 快速上手指南\n\nPyPOTS 是一个专为**部分观测时间序列（Partially-Observed Time Series, POTS）**设计的 Python 机器学习工具箱。它旨在解决现实世界中因传感器故障、通信错误等原因导致的时间序列数据缺失问题，提供统一的 API 支持缺失值填补、分类、聚类、预测和异常检测等任务。\n\n## 1. 
环境准备\n\n在开始之前，请确保您的开发环境满足以下要求：\n\n*   **操作系统**: Linux, macOS 或 Windows\n*   **Python 版本**: 3.8 或更高版本 (`v3.8+`)\n*   **核心依赖**:\n    *   PyTorch (深度学习框架)\n    *   NumPy, Pandas, Scikit-learn (基础数据处理)\n*   **可选依赖**: 若需使用超参数优化功能，需安装 [Microsoft NNI](https:\u002F\u002Fgithub.com\u002Fmicrosoft\u002Fnni)。\n\n> **提示**: 建议使用虚拟环境（如 `conda` 或 `venv`）以避免依赖冲突。\n\n## 2. 安装步骤\n\n您可以通过 `pip` 或 `conda` 进行安装。国内用户推荐使用清华或阿里镜像源以加速下载。\n\n### 方式一：使用 pip 安装（推荐）\n\n```bash\n# 使用阿里云镜像源加速安装\npip install pypots -i https:\u002F\u002Fmirrors.aliyun.com\u002Fpypi\u002Fsimple\u002F\n\n# 或者使用清华镜像源\npip install pypots -i https:\u002F\u002Fpypi.tuna.tsinghua.edu.cn\u002Fsimple\n```\n\n### 方式二：使用 conda 安装\n\n```bash\nconda install -c conda-forge pypots\n```\n\n### 验证安装\n\n安装完成后，运行以下代码验证是否成功：\n\n```python\nimport pypots\nprint(pypots.__version__)\n```\n\n## 3. 基本使用\n\nPyPOTS 采用了面向对象的统一 API 设计。以下是一个最简单的**缺失值填补（Imputation）**示例，使用经典的 `SAITS` 模型处理带有缺失值的时间序列数据。\n\n### 示例：使用 SAITS 进行缺失值填补\n\n```python\nimport numpy as np\nfrom pygrinder import mcar  # PyGrinder 随 PyPOTS 一同安装，用于人为制造缺失\nfrom pypots.imputation import SAITS\nfrom pypots.utils.metrics import calc_mae\n\n# 1. 生成模拟数据 (实际使用时请替换为您的数据集)\n# data: 原始完整数据, X: 含缺失值(NaN)的数据, missing_mask: 人工缺失位置的掩码\ndata = np.random.randn(100, 24, 5)  # 100 个样本, 24 个时间步, 5 个特征\nX = mcar(data, 0.1)  # 随机遮蔽 10% 的观测值, 缺失位置置为 NaN\nmissing_mask = np.isnan(X).astype(float)  # 1 表示该位置被遮蔽, 需要填补\n\n# 2. 初始化模型\n# n_steps: 时间步长, n_features: 特征数量\nmodel = SAITS(\n    n_steps=24,\n    n_features=5,\n    n_layers=2,\n    d_model=256,\n    d_ffn=512,\n    n_heads=4,\n    d_k=64,\n    d_v=64,\n    dropout=0.1,\n    epochs=10,\n    patience=5,\n    device=\"cpu\"  # 如果有 GPU 可改为 \"cuda\"\n)\n\n# 3. 训练模型\n# 输入为字典格式, 键 'X' 对应含缺失值(NaN)的数据, 观测掩码由 PyPOTS 内部自动计算\ntrain_data = {\"X\": X}\nmodel.fit(train_data)\n\n# 4. 执行填补\nimputed_data = model.impute({\"X\": X})\n\n# 5. 
评估结果 (在被遮蔽的位置上计算平均绝对误差)\nmae = calc_mae(imputed_data, data, missing_mask)\nprint(f\"填补结果的 MAE: {mae:.4f}\")\n```\n\n### 关键说明\n*   **数据格式**: PyPOTS 的模型输入通常为字典，至少包含键 `X` (numpy array 或 torch tensor)，缺失位置以 `NaN` 表示，观测掩码会由库在内部自动计算。\n*   **任务扩展**: 除了填补 (`imputation`)，PyPOTS 还支持 `classification` (分类), `clustering` (聚类), `forecasting` (预测) 和 `anomaly_detection` (异常检测)，使用方法类似，只需导入对应的模块类即可。\n*   **预训练模型**: 部分模型支持加载预训练权重，具体请参考官方文档。","某工业物联网团队正在处理风力发电机传感器回传的多变量时间序列数据，旨在预测设备故障，但数据因网络波动存在大量不规则缺失值（NaN）。\n\n### 没有 PyPOTS 时\n- 工程师需手动编写复杂的插值代码（如线性插值或均值填充），不仅耗时且难以处理不规则采样数据，导致关键故障特征被平滑丢失。\n- 面对分类、聚类和异常检测等多种任务，必须分别寻找并适配不同的开源模型，重复造轮子导致开发周期长达数周。\n- 缺乏针对“部分观测”数据的专用深度学习架构，直接丢弃含缺失值的样本造成数据浪费，或强行填充引入巨大噪声，最终模型预测准确率低下。\n- 实验过程难以复现，不同成员使用的数据清洗逻辑不一致，导致团队协作混乱，无法科学评估算法优劣。\n\n### 使用 PyPOTS 后\n- 直接调用 PyPOTS 内置的 SOTA 神经网络模型（如 SAITS 或 BRITS），自动高效地完成不规则时间序列的高精度缺失值填补，完整保留故障前兆特征。\n- 利用统一的 API 接口，在同一框架下无缝切换进行故障分类、运行状态聚类及异常点检测，将原本数周的开发工作压缩至几天内完成。\n- 原生支持含 NaN 的不完整多变量数据输入，无需预先粗暴清洗，显著提升了数据利用率，使故障预警模型的准确率大幅提升。\n- 依托标准化的实验流程和完善的文档，团队成员可快速复现彼此结果，专注于策略优化而非底层数据预处理细节。\n\nPyPOTS 通过提供一站式、面向真实残缺数据的深度学习解决方案，将工业时间序列分析从繁琐的数据清洗泥潭中解放出来，实现了从“勉强可用”到“精准智能”的跨越。","https:\u002F\u002Foss.gittoolsai.com\u002Fimages\u002FWenjieDu_PyPOTS_3d2b9dd5.png","WenjieDu","Wenjie Du","https:\u002F\u002Foss.gittoolsai.com\u002Favatars\u002FWenjieDu_a86fc77e.jpg","AI Researcher \u003C​Time Series, FDA 510k Regulatory​> More awesome private repos will open source! 
Follow me to get notified ;-)","@TimeSeries-AI","where time series is observed & valued","wdu@time-series.ai","_W_DU_","https:\u002F\u002FTime-Series.AI","https:\u002F\u002Fgithub.com\u002FWenjieDu",[83,87],{"name":84,"color":85,"percentage":86},"Python","#3572A5",100,{"name":88,"color":89,"percentage":90},"Shell","#89e051",0,1982,183,"2026-04-05T06:32:22","BSD-3-Clause",1,"未说明",{"notes":98,"python":99,"dependencies":100},"该工具包专注于部分观测时间序列（POTS）的机器学习任务。自 v0.2 版本起，所有神经网络模型均支持通过 Microsoft NNI 框架进行超参数优化。部分原本不支持缺失值输入的模型（如 Transformer, iTransformer 等）已通过特定的嵌入策略和训练方法（ORT+MIT）进行了适配，使其能够处理 POTS 数据。","3.8+",[101,102],"torch","nni",[16,104,15,105,14,106,35],"视频","其他","音频",[108,109,110,111,112,113,114,115,116,117,118,119,120,121,122],"time-series","data-mining","deep-learning","missing-values","anomaly-detection","classification","clustering","forecasting","generation","imputation","machine-learning","pytorch","data-analysis","data-science","neural-networks",null,"2026-03-27T02:49:30.150509","2026-04-07T09:49:49.026500",[127,132,137,142,147,151],{"id":128,"question_zh":129,"answer_zh":130,"source_url":131},21560,"如何在多 GPU 环境下正确指定设备以避免模型和数据不在同一设备上？","当传递的设备参数不是 `torch.device` 对象（例如字符串 'cuda:2' 或整数 2）时，旧版本代码可能无法正确将数据移动到指定设备，导致模型在 cuda:2 而数据在 cuda:0 从而崩溃。该问题已在 PR #631 中修复。建议确保传入正确的设备类型，或升级到已修复该问题的最新版本。参考修复代码：https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F631\u002Ffiles","https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues\u002F575",{"id":133,"question_zh":134,"answer_zh":135,"source_url":136},21561,"如何自定义数据集以适配 PyPOTS 中的 SOTA 插补模型？","用户可以参考 SAITS 模型的具体实现代码来了解如何适配自己的数据集。官方建议仔细阅读文档并查看模型实现源码（https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FSAITS\u002Fblob\u002Fmain\u002Fmodeling\u002Fsaits.py#L32-L224）作为参考。如果有针对特定模型（如 SAITS）的进一步问题，也可以在其专属仓库中提出 Issue。","https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues\u002F141",{"id":138,"question_zh":139,"answer_zh":140,"source_url":141},21562,"PyPOTS 
中的数据掩码（indicating_mask）、原始数据（X_ori）和输入数据（X）之间有什么区别？","`X_ori` 代表原始数据（Original），包含数据集中原本就存在的缺失值；`X` 是在 `X_ori` 基础上额外添加了人工缺失数据后的版本，用于作为模型的输入；`indicating_mask` 用于标识 `X` 和 `X_ori` 之间的差异，即指出哪些部分是人为掩码（人工缺失）的。`X_ori` 主要用于误差计算和模型验证。","https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues\u002F431",{"id":143,"question_zh":144,"answer_zh":145,"source_url":146},21563,"如何从聚类算法（如 VaDER 和 CRLI）中提取潜在表示（latent representation）？","PyPOTS 已更新 VaDER 和 CRLI 模型，使其能够返回用于聚类的潜在表示。用户可以直接调用相关方法获取这些低维表示，以便进行下游分析（如计算轮廓系数等内部聚类验证指标）。具体的单元测试示例可参考：https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fblob\u002F09b494d7f26275a465339b903a83e031e6c32cef\u002Ftests\u002Fclustering\u002Fvader.py#L66-L76","https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues\u002F177",{"id":148,"question_zh":149,"answer_zh":150,"source_url":141},21564,"在使用 Google Colab 运行教程时遇到数据集加载或预处理问题怎么办？","如果在 Colab 中运行官方教程脚本时遇到问题，通常与数据集（如 physionet_2012）的高缺失率或预处理逻辑有关。请确保使用的是最新版本的库，并注意日志中关于忽略长度过短时间序列的警告。如果涉及具体的指标计算或掩码理解，请参考 `pypots.utils.metrics` 中的函数以及维护者关于数据结构的解释。",{"id":152,"question_zh":153,"answer_zh":154,"source_url":155},21565,"如何在训练 Raindrop 分类模型时使用多个 GPU？","在训练 Raindrop 模型时，若需使用多 GPU，可以将 `device` 参数设置为设备列表（例如 `device = ['cuda:0', 'cuda:1']`），并配合设置 `num_workers`。如果遇到错误，请检查 PyTorch 版本兼容性（如 torch 2.0.1）以及 PyPOTS 版本是否支持该特性。确保按照文档正确配置多卡环境。","https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fissues\u002F147",[157,162,167,172,177,182,187,192,197,202,207,212,217,222,227,232,237,242,247,252],{"id":158,"version":159,"summary_zh":160,"released_at":161},127550,"v1.3","本次新版本将 TKAN（时序柯尔莫哥洛夫-阿诺德网络的实现）集成到 PyPOTS 中，并修复了一些 bug。详细信息请参阅下方的变更日志。\n\n👍 恭喜我们的新贡献者 @awanawana！\n\n## 变更内容\n* 添加由 @Copilot 实现的 TKAN（时序柯尔莫哥洛夫-阿诺德网络）插补模型，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F809\n* 由 @WenjieDu 更新文档，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F810\n* 由 @WenjieDu 修复代码风格问题并更新文档，详见 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F811\n* 构建依赖：将 docker\u002Fsetup-buildx-action 从 3 升级至 4，由 @dependabot[bot] 完成，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F817\n* 构建依赖：将 docker\u002Fbuild-push-action 从 6 升级至 7，由 @dependabot[bot] 完成，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F816\n* 构建依赖：将 docker\u002Flogin-action 从 3 升级至 4，由 @dependabot[bot] 完成，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F815\n* 构建依赖：将 docker\u002Fsetup-qemu-action 从 3 升级至 4，由 @dependabot[bot] 完成，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F814\n* 更新文档以添加 DeepWiki 链接，由 @WenjieDu 完成，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F818\n* 修复：为 calc_quantile_loss 函数添加 NumPy 支持，由 @awanawana 完成，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F822\n* 由 @WenjieDu 更新文档，详见 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F823\n\n## 新贡献者\n* @awanawana 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F822 中完成了首次贡献\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv1.2...v1.3","2026-03-26T04:22:46",{"id":163,"version":164,"summary_zh":165,"released_at":166},127551,"v1.2","## 变更内容\n* 修复 GPVAE 训练中的 TypeError：在 matern_kernel 中将 length_scale 浮点数转换为张量，由 @Copilot 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F799 中完成\n* 修复 MOMENT 在多 GPU 上的段错误，由 @Copilot 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F800 中完成\n* 添加 MixLinear 作为预测模型，由 @Copilot 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F805 中完成\n* 添加 TimeMixer++ 的预测任务支持，由 @Copilot 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F806 中完成\n* 添加 SeFT 作为分类模型，由 @Copilot 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F801 中完成\n* 更新文档，由 @WenjieDu 在 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F807 中完成\n\n## 新贡献者\n* @Copilot 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F799 中完成了首次贡献\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv1.1...v1.2","2026-03-05T15:07:13",{"id":168,"version":169,"summary_zh":170,"released_at":171},127552,"v1.1","在本次发布中，我们修复了一些已知的 bug（详见变更日志）。\n👍 向我们的新贡献者 Emmanuel @emmanuel-ferdman 和 Arina @arinagoncharova2005 致以热烈的祝贺！\n\n## 变更内容\n* 修复：由 @emmanuel-ferdman 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F761 中更新测试中的线程接口\n* 更新文档：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F762 中完成\n* 更新文档并修复已弃用的 threading.Thread.setDaemon 方法：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F763 中完成\n* 更新文档中的团队页面：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F764 中完成\n* 构建依赖项：将 actions\u002Ffirst-interaction 从 1 升级到 2：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F765 中完成\n* 构建依赖项：将 actions\u002Ffirst-interaction 从 1 升级到 2：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F767 中完成\n* 构建依赖项：将 actions\u002Ffirst-interaction 从 2 升级到 3：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F771 中完成\n* 构建依赖项：将 actions\u002Fcheckout 从 4 升级到 5：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F770 中完成\n* 更新 TEFN 参考文献：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F772 中完成\n* 构建依赖项：将 actions\u002Fsetup-python 从 5 升级到 6：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F776 中完成\n* 构建依赖项：将 actions\u002Fstale 从 9 升级到 10：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F775 中完成\n* 构建依赖项：将 pypa\u002Fgh-action-pypi-publish 从 
1.12.4 升级到 1.13.0：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F774 中完成\n* 更新问候语中的配置：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F779 中完成\n* 使 CI 仅在 Ubuntu 上运行：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F782 中完成\n* 修复 TimeMixer 的通道独立性预测问题：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F780 中完成\n* 更新 issue_manager 工作流中的配置：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F784 中完成\n* 构建依赖项：将 tiangolo\u002Fissue-manager 从 0.5.1 升级到 0.6.0：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F786 中完成\n* 构建依赖项：将 actions\u002Fcheckout 从 5 升级到 6：由 @dependabot[bot] 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F791 中完成\n* 修复 CSAI 问题：由 @LinglongQian 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F788 中完成\n* 修复 timemixer\u002Ftimemixerpp.classification() 中 n_layers 的拼写错误：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F793 中完成\n* 更新文档：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F796 中完成\n* 修复填充函数中的设备不匹配问题：由 @arinagoncharova2005 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F794 中完成\n* 修复 LogisticRegression 中的 multi_class 类型错误以及 torch_pad_nan 中的设备不匹配问题：由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F798 中完成\n\n## 新贡献者\n* @emmanuel-ferdman 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F761 中完成了首次贡献\n* @arinagoncharova2005 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F794 中完成了首次贡献\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002FWenj","2026-01-18T13:20:15",{"id":173,"version":174,"summary_zh":175,"released_at":176},127553,"v1.0","我们使 PatchTST 和 Autoformer 能够用于分类任务。此外，社区报告的一些 bug 也已修复。👏 特别感谢我们的新贡献者 
@zltututu！\n\n鉴于当前阶段的主要功能均已实现，并且我们已经研究出一个稳定的版本，因此本次发布作为 PyPOTS 的首个 major 版本，即 v1.0。这是我们新的里程碑，让我们继续向 v2.0 迈进吧！\n\n## 变更内容\n* @zltututu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F751 中修复了 ModernTCN 在使用多层时的运行时错误。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F756 中修复了 TimesNet 中意外覆盖的问题。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F757 中新增了用于分类任务的 PatchTST。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F758 中新增了用于分类任务的 Autoformer。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F759 中更新了文档。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F760 中发布了 v1.0。\n\n## 新贡献者\n* @zltututu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F751 中完成了首次贡献。\n\n**完整变更日志**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.19...v1.0","2025-07-08T09:04:34",{"id":178,"version":179,"summary_zh":180,"released_at":181},127554,"v0.19","MICN、DLinear 和 FiLM 是用于时间序列预测的实现。\n\n## 变更内容\n* 由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F743 中更新文档\n* 由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F746 中更新文档\n* 由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F747 中添加 MICN 预测模型\n* 由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F748 中添加 DLinear 预测模型\n* 由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F749 中添加 FiLM 预测模型\n* 由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F750 中添加 FiLM、DLinear、MICN 预测模型，并发布 v0.19 版本\n\n\n**完整变更日志**: 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.18...v0.19","2025-05-29T08:48:22",{"id":183,"version":184,"summary_zh":185,"released_at":186},127555,"v0.18","iTransformer、Crossformer、Pyraformer、FEDformer、Informer、Transformer、ETSformer、TimeMixer、Nonstationary Transformer 和 FiLM 已在异常检测任务上实现。\n\n## 变更内容\n* @yyysjz1997 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F738 中新增了 10 种异常检测算法。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F739 中新增了 10 个新模型。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F741 中更新了文档。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F742 中更新了文档并发布了 v0.18 版本。\n\n\n**完整变更日志**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.17...v0.18","2025-05-07T08:44:10",{"id":188,"version":189,"summary_zh":190,"released_at":191},127556,"v0.17","TimeMixer++、SCINet、DLinear、TimesNet 和 Reformer 已在异常检测任务上实现。\n\n👍 向我们的新贡献者 Yiyuan @yyysjz1997 和 Pavel @Durakavalyanie 致以热烈的祝贺！\n\n## 变更内容\n* @yyysjz1997 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F725 中新增了用于异常检测的 TimesNet 模型。\n* @Durakavalyanie 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F730 中修复了 CSDI.predict 中未使用的 n_sampling_times 参数。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F732 中实现了用于异常检测的 Reformer 模型。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F733 中实现了用于异常检测的 SCINet 模型。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F734 中实现了用于异常检测的 DLinear 模型。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F735 中实现了用于异常检测的 TimeMixerPP 模型。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F731 中更新了过时的工作流和 PR 模板。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F736 中更新了文档。\n* 
@WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F737 中发布了 v0.17 版本。\n\n## 新贡献者\n* @yyysjz1997 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F725 中完成了首次贡献。\n* @Durakavalyanie 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F730 中完成了首次贡献。\n\n**完整变更日志**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.16...v0.17","2025-04-18T07:49:49",{"id":193,"version":194,"summary_zh":195,"released_at":196},127557,"v0.16","在本次发布中，ModernTCN、TimesNet 和 SegRNN 已被实现用于预测任务。\n\n## 变更内容\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F705 中添加了基于 TimesNet 的预测功能。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F709 中更新了开发环境的依赖版本。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F713 中更新了一些 CI 配置。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F712 中修复了 TimeLLM 中的 AttributeError: 'NoneType' 对象没有 'endswith' 属性的问题。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F715 中将 `sentencepiece` 添加到开发环境的依赖中。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F716 中将带有“潜在 bug”标签的问题和 PR 从过期规则中排除。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F717 中添加了基于 ModernTCN 的预测功能。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F720 中添加了基于 SegRNN 的预测功能。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F721 中添加了问题管理器，用于自动关闭已完成的问题。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F722 中更新了文档。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F723 中降低了 CI 测试中的异常率，以避免 GPT4TS 输出 NaN 值。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F724 中发布了 v0.16 版本。\n* @WenjieDu 在 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F728 中修复了问题管理器失败的问题。\n* @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F727 中修复了问候工作流失败的问题。\n\n\n**完整变更日志**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.15...v0.16","2025-04-10T08:36:37",{"id":198,"version":199,"summary_zh":200,"released_at":201},127558,"v0.15","在本次发布中，新增了 TimeMixer++、TOTEM 和 TSLANet，并已在数据插补任务上实现。\n\n## 变更内容\n* 添加 TimeMixer++，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F691 中完成  \n* 将 GitHub CI 工作流中 Python 的最低版本提升至 3.9，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F698 中完成  \n* 添加 TOTEM 模块及 IMPU TOTEM，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F694 中完成  \n* 添加 TSLANet 模块及 IMPU TSLANet，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F696 中完成  \n* 发布 v0.15 版本，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F700 中完成  \n* 更新文档，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F702 中完成  \n* 发布至 Docker Hub，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F703 中完成  \n\n\n**完整变更日志**：https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.14...v0.15","2025-04-02T02:51:07",{"id":203,"version":204,"summary_zh":205,"released_at":206},127559,"v0.14","本次新版本实现了用于异常检测的TEFN、ImputeFormer、SAITS、PatchTST、SegRNN和Autoformer。此外，模型现在会输出其潜在表示 #674，这些潜在表示作为字典 `results` 的一部分，在 `pypots.{task_name}.{model_name}.core._{mode_name}.forward()` 中返回。同时修复了一个模型保存相关的 bug (#668)，该 bug 可能导致最佳模型状态无法正确加载或保存。\n\n更多详细信息请参阅下方的变更日志。\n\n## 变更内容\n* 修复模型状态未进行深拷贝的 bug，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F668 中完成  \n* build(deps): 将 actions\u002Fsetup-python 从版本 3 升级至 5，由 @dependabot 在 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F636 中完成  \n* 修复 CLAS TEFN 在 `ROC AUC`\u003C0.5 时失败的问题，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F671 中完成  \n* 新增 ANOD Autoformer，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F672 中完成  \n* 更新文档以添加 ANOD 包，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F673 中完成  \n* 输出模型的潜在表示并重构框架，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F674 中完成  \n* 修复 TimeLLM 在测试时出现的 OOM 问题，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F676 中完成  \n* 使用统一命名规范区分不同阶段的数据，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F678 中完成  \n* 修复在多 GPU 环境下 `calc_criterion()` 不可调用的 bug，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F681 中完成  \n* 实现 SAITS 用于异常检测，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F684 中完成  \n* 修复文档构建失败的问题，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F685 中完成  \n* 实现 TEFN 用于异常检测，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F686 中完成  \n* 实现 ImputeFormer 用于异常检测，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F687 中完成  \n* 实现 PatchTST 用于异常检测，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F688 中完成  \n* 实现 SegRNN 用于异常检测，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F689 中完成  \n* 发布 v0.14 版本，由 @WenjieDu 在 https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F690 中完成  \n\n\n**完整变更日志**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.13...v0.14","2025-03-26T18:39:32",{"id":208,"version":209,"summary_zh":210,"released_at":211},127560,"v0.13","TS2Vec (`pypots.vec.ts2vec`) is included in PyPOTS for 
representation learning and vectorization on POTS data. TEFN, iTransformer, SAITS, TimesNet, and the newly added TS2Vec are implemented for classification. **Note that**, from this version, classification category results are output as key `classification` of the returned dictionary, and classification probabilities are returned as key `classification_proba` instead. Function `predict_proba()` is added to all classification models for users to obtain classification probabilities directly.\r\n\r\nSeveral bugs are fixed in this release. Refer to the changelog below for details.\r\n\r\n## What's Changed\r\n* build(deps): bump actions\u002Fcheckout from 3 to 4 by @dependabot in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F635\r\n* build(deps): bump conda-incubator\u002Fsetup-miniconda from 2 to 3 by @dependabot in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F637\r\n* Fix the bug that data and model are not on the same device by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F631\r\n* Fix failed CRLI, Koopa, and USGAN on multiple GPUs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F633\r\n* Omit not-trained LLM parts when saving models to decrease file sizes by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F640\r\n* Add TS2Vec by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F642\r\n* Add model.eval() at the beginning of predict() to avoid a potential bug by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F643\r\n* Add classification TS2Vec by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F644\r\n* Fix wrong dim when concatenating eval labels in pypots.classification.base by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F646\r\n* Add classification SAITS by @WenjieDu in
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F649\r\n* Update testing configs and issue templates by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F647\r\n* Add classification TimesNet by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F651\r\n* Add classification iTransformer by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F656\r\n* Add classification TEFN by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F657\r\n* Refactoring to make training_loss\u002Fvalidation_metric\u002Foptimizer accept class type by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F660\r\n* Fix map_location err by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F661\r\n* Make `Criterion` take logits by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F659\r\n* Unify `_train_model()` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F662\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F658\r\n* Add `predict_proba()` for classification models by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F664\r\n* Release v0.13 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F665\r\n\r\n## New Contributors\r\n* @dependabot made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F635\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.12...v0.13","2025-03-21T09:19:07",{"id":213,"version":214,"summary_zh":215,"released_at":216},127561,"v0.12","MOMENT, a time-series foundation model, is added in this version and has been implemented on the tasks of forecasting and imputation. 
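The v0.13 notes above define a classification output contract: `predict()` returns a dictionary carrying category results under key `classification` and probabilities under key `classification_proba`, and `predict_proba()` exposes the probabilities directly. A minimal toy sketch of that contract (an illustrative stand-in, not PyPOTS code — real PyPOTS classifiers are trained on POTS datasets first):

```python
import numpy as np

# Toy stand-in illustrating the v0.13 classification output contract.
# ToyClassifier and its fake logits are hypothetical, NOT the PyPOTS API.
class ToyClassifier:
    def __init__(self, n_classes: int = 2):
        self.n_classes = n_classes

    def predict_proba(self, X: np.ndarray) -> np.ndarray:
        # Fake logits -> softmax probabilities, one row per sample.
        rng = np.random.default_rng(0)
        logits = rng.normal(size=(len(X), self.n_classes))
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    def predict(self, X: np.ndarray) -> dict:
        proba = self.predict_proba(X)
        return {
            "classification": proba.argmax(axis=1),   # category results
            "classification_proba": proba,            # per-class probabilities
        }

results = ToyClassifier().predict(np.zeros((4, 10)))
```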
We also fixed a bug where a user-customized training loss was not applied to some models (#610). Moreover, please note that we unified the names of the patch length and patch stride arguments for models utilizing the patch embedding proposed in PatchTST (#628).\r\n\r\n## What's Changed\r\n* Fix bug that given customized loss not applied to some models by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F610\r\n* Apply decorator @torch.no_grad() to simplify predict() by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F612\r\n* Load pickled data file with pd.read_pickle for pandas>=2 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F614\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F615\r\n* Add Dependabot by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F617\r\n* Add attribute `amp_enabled` to switch `autocast` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F620\r\n* Implement MOMENT on imputation and forecasting tasks by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F622\r\n* Fix bug that devices not consistent by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F626\r\n* Unify the name of patch size and stride by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F628\r\n* Update docs and fix linting errors by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F629\r\n* Add the script to run full test by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F624\r\n* Release v0.12 and bump TSDB to v0.7.1 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F630\r\n* Release v0.12 by @WenjieDu in 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F634\r\n\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.11...v0.12","2025-03-13T17:51:19",{"id":218,"version":219,"summary_zh":220,"released_at":221},127562,"v0.11","**We make Time-LLM, TEFN, FITS, TimeMixer, GPT4TS, and Transformer work on the forecasting task (still accept POTS as input) for you in this release of v0.11**\r\n\r\nAdditionally, we conduct some refactorings in this version: \r\n1. AMP (Automatic Mixed Precision) is enabled for LLM-based model training. Users can switch it on by specifying the env var `ENABLE_AMP` #594; \r\n2. pypots tuning is now renamed into pypots hpo #592;\r\n3. pypots environment variables are capitalized #591;\r\n4. all data preprocessing functions are removed from pypots, and users are encouraged to fully use [BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS) instead, which includes processing pipelines for 172 public datasets #585;\r\n\r\n## What's Changed\r\n* Refactor some parts by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F586\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F587\r\n* Remove data preprocessing pipelines and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F588\r\n* Capitalize env vars and rename PyPOTS `tuning` module into `hpo` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F593\r\n* Enable AMP (Automatic Mixed Precision) in PyPOTS by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F594\r\n* Add `pypots.forecasting.Transformer` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F597\r\n* Add `pypots.forecasting.FITS` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F600\r\n* Add 
`pypots.forecasting.TEFN` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F602\r\n* Add `pypots.forecasting.TimeMixer` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F603\r\n* Add `pypots.forecasting.TimeLLM` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F604\r\n* Add `pypots.forecasting.GPT4TS` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F605\r\n* Fix x and x_mark shape not consistent bug in forecasting TimeMixer by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F607\r\n* Release v0.11 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F608\r\n\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.10...v0.11","2025-03-07T04:12:51",{"id":223,"version":224,"summary_zh":225,"released_at":226},127563,"v0.10","We make the following main updates in this new release:\r\n\r\n1. added Time-LLM and GPT4TS;\r\n2. enabled users to customize their training loss and evaluation metric for models;\r\n3. fixed an argument-order error in CRPS loss calculation;\r\n4. 
fixed a bug where data and model were not on the same device when applying a list of CUDA devices;\r\n\r\nKudos to our new contributors @c-lyu and @giacomoguiduzzi 👍!\r\n\r\n## What's Changed\r\n* Fix CRPS loss calculation by @c-lyu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F565\r\n* Fix wrong order of arguments when calling calc_quantile_loss by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F566\r\n* Enable customizing loss and val funcs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F526\r\n* Implement TimeLLM as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F567\r\n* Make pytest ignore LLM-based testing cases by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F569\r\n* Enable customizing training loss and val metric, add Time-LLM by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F570\r\n* Refactor deprecated torch.cuda.amp.autocast by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F521\r\n* Expose more models for tuning, bump dependency PyGrinder version num, and overwrite torch.autocast by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F572\r\n* Refactor CSAI imputation & classification by @LinglongQian in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F552\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F573\r\n* Update CSAI, refactor code and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F574\r\n* Fix a bug that data and model are not on the same device when CUDA device list is applied by @giacomoguiduzzi in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F563\r\n* Update the docs by @WenjieDu in 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F576\r\n* Fix potential bug that data and model not on the same cuda device, update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F577\r\n* Implement GPT4TS for time series imputation by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F579\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F580\r\n* Including GPT4TS and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F581\r\n* Update Docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F582\r\n* Update docs and release v0.10 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F583\r\n\r\n## New Contributors\r\n* @c-lyu made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F565\r\n* @giacomoguiduzzi made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F563\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.9...v0.10","2025-02-28T08:26:16",{"id":228,"version":229,"summary_zh":230,"released_at":231},127564,"v0.9","In this release, PyPOTS brings you new models FITS, SegRNN, CSAI, and TRMF. 
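The v0.10 notes above describe letting users customize the training loss and evaluation metric instead of relying on the library's defaults. A hypothetical NumPy sketch of that idea — a training step that takes a pluggable `(loss, gradient)` criterion; all names here are illustrative, not the PyPOTS API:

```python
import numpy as np

# Pluggable criteria: each returns (loss value, d loss / d error).
# mse/mae/train_linear are hypothetical names, NOT PyPOTS functions.
def mse(err):
    return (err ** 2).mean(), 2 * err / err.size

def mae(err):
    return np.abs(err).mean(), np.sign(err) / err.size

def train_linear(X, y, training_loss=mse, lr=0.1, epochs=200):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        err = X @ w - y
        loss, dloss = training_loss(err)   # user-supplied criterion
        w -= lr * (X.T @ dloss)            # chain rule through X @ w
    return w, loss

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = X @ np.array([1.5, -2.0])
w_mse, _ = train_linear(X, y, training_loss=mse)   # default-style MSE
w_mae, _ = train_linear(X, y, training_loss=mae)   # swapped-in MAE
```

The point is the mechanism, not the toy model: the trainer never hard-codes a criterion, so swapping MSE for MAE (or any callable with the same shape) changes the optimization without touching the loop.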
Kudos👍 to our new contributors Shengsheng @lss-1138 and Joseph @joseph-arulraj!\r\n\r\n## What's Changed\r\n* Add FITS by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F522\r\n* Update docs and configs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F530\r\n* Add FITS imputation model and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F531\r\n* CSAI Pipeline by @joseph-arulraj in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F534\r\n* add csai test cases by @LinglongQian in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F535\r\n* Add SegRNN Implementation by @lss-1138 in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F537\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F538\r\n* Implement SegRNN as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F539\r\n* Fix CSAI cannot accept dataset files for lazy loading by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F545\r\n* Update the stale workflow by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F547\r\n* Add SegRNN testing cases by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F548\r\n* Update docs for CSAI by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F549\r\n* Update the stale workflow and docs, add SegRNN tests by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F550\r\n* Add TRMF imputation method  by @AugustJW in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F556\r\n* Integrate raw TRMF implementation into PyPOTS by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F560\r\n* Update docs 
and dependency versions by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F561\r\n* Release v0.9 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F562\r\n\r\n## New Contributors\r\n* @joseph-arulraj made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F534\r\n* @lss-1138 made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F537\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.8.1...v0.9","2025-02-08T08:17:14",{"id":233,"version":234,"summary_zh":235,"released_at":236},127565,"v0.8.1","We fixed two model-saving issues under some conditions: \r\n1. unintended overwrite of the existing model file when calling func `.save()` even with the arg `overwrite` defaulting to False; \r\n2. `model_saving_strategy=best` does not work and PyPOTS still saves every better model;\r\n\r\n\r\n## What's Changed\r\n* Fix unintended overwrite saving by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F516\r\n* Update conda_dev_env.yml by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F517\r\n* Fix unintended overwrite when saving models and update conda dependencies by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F518\r\n* `model_saving_strategy=best` does not work by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F514\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F523\r\n* Fix model saving issue and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F524\r\n* Fix error link to Reformer on openreview by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F527\r\n* Update docs and release v0.8.1 by @WenjieDu in 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F528\r\n\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.8...v0.8.1","2024-09-26T05:56:47",{"id":238,"version":239,"summary_zh":240,"released_at":241},127566,"v0.8","We bring you new models ModernTCN (`ICLR 2024`), TimeMixer (`ICLR 2024`), and TEFN in this release ;-)\r\n\r\nKudos to our new contributors Eljas (@eroell) and Tianxiang (@ztxtech)!\r\n\r\n\r\n## What's Changed\r\n* Update testing workflows and dependencies, and refactor by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F482\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F483\r\n* Initialize the client of Gungnir and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F484\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F487\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F489\r\n* Refactor Gungnir logging and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F490\r\n* Allow failure when PR gets merged before jobs get finished by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F492\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F493\r\n* Update tsdb.load_dataset to tsdb.load in doc of load_specific_dataset by @eroell in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F494\r\n* Doc update Quickstart Example by @eroell in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F497\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F498\r\n* Implement TimeMixer as an imputation model by @WenjieDu in 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F499\r\n* Update the docs for TimeMixer by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F500\r\n* Add TimeMixer by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F501\r\n* Implement ModernTCN as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F502\r\n* Add ModernTCN docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F503\r\n* Add ModernTCN by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F504\r\n* Add TEFN model by @ztxtech in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F505\r\n* Add TEFN and implement it as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F507\r\n* Apply line-length=120 to black format by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F509\r\n* Import random walk funcs from BenchPOTS and add AI4TS as a dependency by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F510\r\n* Apply line-length=120 to refactor code, update dependencies and pre-commit config by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F512\r\n\r\n## New Contributors\r\n* @eroell made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F494\r\n* @ztxtech made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F505\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.7.1...v0.8","2024-09-13T09:41:16",{"id":243,"version":244,"summary_zh":245,"released_at":246},127567,"v0.7.1","Previously we removed pypots.data.load_specific_datasets packages since the preprocessing functions have been all 
gathered and managed in [BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS). The removal caused some incompatibility (see #474), hence we added it back in this minor version. But it will still be deprecated in the near future, and we encourage users to use [BenchPOTS](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS) for dataset preprocessing, which now supports 170+ public time series datasets. Also, we \r\n1. added a visualization function to plot the map of attention weights. 👍Kudos to Anshu @gugababa for his contribution;\r\n2. deprecated setup.py and added pyproject.toml to configure the project;\r\n\r\n\r\n## What's Changed\r\n* Visualize attention matrix in SAITS by @gugababa in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F302\r\n* Add attention map visualization func by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F475\r\n* Gather requirements in one dir by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F477\r\n* Add toml config and gather dependency files by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F478\r\n* Add pyproject.toml, gather dependency files, and fix flake8 with toml config file by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F480\r\n* Fix missing load_specific_dataset(), update testing_daily workflow, release v0.7.1 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F481\r\n\r\n## New Contributors\r\n* @gugababa made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F302\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.7...v0.7.1","2024-07-27T02:14:53",{"id":248,"version":249,"summary_zh":250,"released_at":251},127568,"v0.7","Update summary for v0.7 release:\r\n\r\n1. 
included **ImputeFormer** [KDD'24], kudos👍 to @tongnie, also the author of ImputeFormer;\r\n2. implemented **Lerp (Linear Interpolation)**,  thanks👍 to @colesussmeier;\r\n3. added **TCN** as an imputation model, with SAITS embedding and training methodology applied;\r\n4. fixed a minor bug in RevIN for POTS data;\r\n5. fixed failed model saving when `model_saving_strategy` is set as `better`;\r\n6. added `pypots.data.utils.inverse_sliding_window` func to help restore time series samples sliced by `sliding_window` func;\r\n\r\n## What's Changed\r\n* Make the number of max steps adjustable in TimesNet by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F438\r\n* Enable to restore from `sliding_window()` by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F441\r\n* Add `inverse_sliding_window()` and enable TimesNet to work with len>5000 samples by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F442\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F443\r\n* Use `inspect` to fetch models arguments and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F444\r\n* Expose new models for tuning, add get_class_full_path(), and test visual funcs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F447\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F448\r\n* Make classification GRUD more robust, and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F449\r\n* Update Imputeformer by @tongnie in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F450\r\n* Update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F452\r\n* Add ImputeFormer, fix RevIN, and update docs by @WenjieDu in 
https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F454\r\n* Implement Linear Interpolation (Lerp) Imputation Method by @colesussmeier in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F459\r\n* Update docs conf by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F461\r\n* Add Lerp as an imputation method and update the docs config by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F462\r\n* Update dependencies in conda env files by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F463\r\n* Update docs and conda env dependencies by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F465\r\n* Add TCN as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F467\r\n* Add TCN and update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F468\r\n* Fix saving failed when the strategy is 'better' by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F469\r\n* Use xeLatex engine to avoid Unicode error by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F472\r\n* Fix failed saving strategy \"better\", update docs, and release v0.7 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F470\r\n\r\n## New Contributors\r\n* @tongnie made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F450\r\n* @colesussmeier made their first contribution in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F459\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.6...v0.7","2024-07-21T17:22:42",{"id":253,"version":254,"summary_zh":255,"released_at":256},127569,"v0.6","In v0.4 and v0.5, PyPOTS brought you new models. 
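The v0.7 notes above add `pypots.data.utils.inverse_sliding_window` to restore series that were sliced by `sliding_window`. A minimal conceptual sketch of that pair, assuming non-overlapping windows (the real PyPOTS helpers handle more cases):

```python
import numpy as np

# Conceptual sketch of the sliding_window / inverse_sliding_window pair:
# slice a long series into fixed-length samples, then stitch them back.
# Simplified stand-ins for the real helpers in pypots.data.utils.
def sliding_window(series: np.ndarray, window_len: int) -> np.ndarray:
    n = len(series) // window_len * window_len    # drop the ragged tail
    return series[:n].reshape(-1, window_len)

def inverse_sliding_window(samples: np.ndarray) -> np.ndarray:
    return samples.reshape(-1)                    # undo the slicing

series = np.arange(10.0)
samples = sliding_window(series, 3)               # 3 samples of length 3
restored = inverse_sliding_window(samples)        # first 9 values recovered
```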
Now, let's fan🪭 the flame🔥 in v0.6!\r\n1. Non-stationary Transformer, Pyraformer, Reformer, SCINet, RevIN, Koopa, MICN, TiDE, StemGNN are included in this new release;\r\n2. another new PyPOTS Ecosystem library [`BenchPOTS`](https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FBenchPOTS) has been released and supports preprocessing pipelines of 170 public time series datasets for benchmarking machine learning on POTS data;\r\n3. added the argument `verbose` to mute all info-level logging;\r\n\r\n👍 Kudos to our new contributor @LinglongQian.\r\n\r\nPlease refer to the changelog below for more details. \r\n\r\n## What's Changed\r\n* Implement Non-stationary Transformer as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F388\r\n* Implement Pyraformer as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F389\r\n* Add Nonstationary Transformer and Pyraformer, update docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F390\r\n* Treat keyboard interruption during training as a warning, and update the docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F391\r\n* Add SCINet modules and implement it as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F406\r\n* Add RevIN modules and implement it as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F407\r\n* Add Koopa modules and implement it as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F403\r\n* Add MICN modules and implement it as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F401\r\n* Update docs and references by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F410\r\n* Add TiDE 
modules and implement it as an imputation model  by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F402\r\n* Add Koopa, SCINet, RevIN, MICN and TiDE, and update the docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F412\r\n* Add StemGNN modules and implement it as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F415\r\n* Add GRU-D as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F417\r\n* Update README and docs by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F420\r\n* Implement StemGNN and GRU-D as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F421\r\n* Update set_random_seed() by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F423\r\n* Enable tuning new added models by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F424\r\n* ETSformer hyperparameters mismatch during NNI tuning by @LinglongQian in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F425\r\n* Fix ETSformer tuning bug, and release v0.6rc1 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F427\r\n* Add arg `verbose` to control logging  by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F428\r\n* Add Reformer as an imputation model by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F433\r\n* Add Reformer, add option `version` to control training log, and add benchpots as a dependency by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F434\r\n* Raise the minimum support python version to v3.8 by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F436\r\n* Fix linting 
error by @WenjieDu in https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fpull\u002F437\r\n\r\n\r\n**Full Changelog**: https:\u002F\u002Fgithub.com\u002FWenjieDu\u002FPyPOTS\u002Fcompare\u002Fv0.5...v0.6","2024-06-18T09:12:27"]