Implement StemGNN and GRU-D as an imputation model #421

Merged: 7 commits, May 23, 2024
11 changes: 9 additions & 2 deletions README.md
@@ -55,6 +55,9 @@
<a href="https://github.com/WenjieDu/PyPOTS/blob/main/README_zh.md">
<img alt="README in Chinese" src="https://pypots.com/figs/pypots_logos/readme/CN.svg">
</a>
<a href="https://github.com/WenjieDu/PyPOTS/blob/main/README.md">
<img alt="README in English" src="https://pypots.com/figs/pypots_logos/readme/US.svg">
</a>
</p>

⦿ `Motivation`: Due to all kinds of reasons like failures of collection sensors, communication errors,
@@ -97,7 +100,7 @@ tune the hyperparameters.
🔥 Note that all models whose names carry `🧑‍🔧` in the table (e.g. Transformer, iTransformer, Informer) were not originally
proposed as algorithms for POTS data in their papers; they cannot directly accept time series with missing values as input,
let alone perform imputation. **To make them applicable to POTS data, we specifically apply the embedding strategy and
training approach (ORT+MIT) the same as we did in [SAITS paper](https://arxiv.org/pdf/2202.08516).**
training approach (ORT+MIT) the same as we did in [the SAITS paper](https://arxiv.org/pdf/2202.08516)[^1].**
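
As a concrete illustration of how such an adapted model is driven through PyPOTS' common imputation interface (construct the model, `fit` on a dict holding the incomplete series under the key `"X"`, then `impute`), here is a minimal sketch using the StemGNN imputer this PR adds. It assumes StemGNN follows the same interface as the other PyPOTS imputers; the model-specific hyperparameter names (`n_layers`, `d_model`) are illustrative assumptions, not the exact signature merged here.

```python
import numpy as np
from pypots.imputation import StemGNN  # exposed by this PR; interface assumed to match other PyPOTS imputers

# Toy POTS data: 100 samples, 48 time steps, 37 features, with ~10% of values missing.
X = np.random.randn(100, 48, 37)
X[np.random.rand(*X.shape) < 0.1] = np.nan

model = StemGNN(
    n_steps=48,     # length of each series (standard PyPOTS argument)
    n_features=37,  # number of variables per step (standard PyPOTS argument)
    n_layers=2,     # assumed name for the number of StemGNN blocks
    d_model=128,    # assumed name for the hidden dimension
    epochs=10,
)

# PyPOTS imputers train on dict-style datasets with the incomplete series under "X";
# the ORT+MIT strategy (artificially masking observed values during training) is handled inside the library's training loop.
model.fit({"X": X})
X_imputed = model.impute({"X": X})  # missing entries replaced by the model's estimates
print(X_imputed.shape)              # (100, 48, 37)
```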

The task types are abbreviated as follows:
**`IMPU`**: Imputation;
@@ -123,7 +126,7 @@ The paper references and links are all listed at the bottom of this file.
| Neural Net | SCINet🧑‍🔧[^30] | ✅ | | | | | `2022 - NeurIPS` |
| Neural Net | Nonstationary Tr.🧑‍🔧[^25] | ✅ | | | | | `2022 - NeurIPS` |
| Neural Net | FiLM🧑‍🔧[^22] | ✅ | | | | | `2022 - NeurIPS` |
| Neural Net | RevIN_SCInet🧑‍🔧[^31] | ✅ | | | | | `2022 - ICLR` |
| Neural Net | RevIN_SCINet🧑‍🔧[^31] | ✅ | | | | | `2022 - ICLR` |
| Neural Net | Pyraformer🧑‍🔧[^26] | ✅ | | | | | `2022 - ICLR` |
| Neural Net | Raindrop[^5] | | | ✅ | | | `2022 - ICLR` |
| Neural Net | FEDformer🧑‍🔧[^20] | ✅ | | | | | `2022 - ICML` |
@@ -133,6 +136,8 @@ The paper references and links are all listed at the bottom of this file.
| Neural Net | US-GAN[^10] | ✅ | | | | | `2021 - AAAI` |
| Neural Net | CRLI[^6] | | | | ✅ | | `2021 - AAAI` |
| Probabilistic | BTTF[^8] | | ✅ | | | | `2021 - TPAMI` |
| Neural Net | StemGNN🧑‍🔧[^33] | ✅ | | | | | `2020 - NeurIPS` |
| Neural Net | Reformer🧑‍🔧[^32] | ✅ | | | | | `2020 - ICLR` |
| Neural Net | GP-VAE[^11] | ✅ | | | | | `2020 - AISTATS` |
| Neural Net | VaDER[^7] | | | | ✅ | | `2019 - GigaSci.` |
| Neural Net | M-RNN[^9] | ✅ | | | | | `2019 - TBME` |
@@ -366,6 +371,8 @@ PyPOTS community is open, transparent, and surely friendly. Let's work together
[^29]: Liu, Y., Li, C., Wang, J., & Long, M. (2023). [Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors](https://proceedings.neurips.cc/paper_files/paper/2023/hash/28b3dc0970fa4624a63278a4268de997-Abstract-Conference.html). *NeurIPS 2023*.
[^30]: Liu, M., Zeng, A., Chen, M., Xu, Z., Lai, Q., Ma, L., & Xu, Q. (2022). [SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction](https://proceedings.neurips.cc/paper_files/paper/2022/hash/266983d0949aed78a16fa4782237dea7-Abstract-Conference.html). *NeurIPS 2022*.
[^31]: Kim, T., Kim, J., Tae, Y., Park, C., Choi, J. H., & Choo, J. (2022). [Reversible Instance Normalization for Accurate Time-Series Forecasting against Distribution Shift](https://openreview.net/forum?id=cGDAkQo1C0p). *ICLR 2022*.
[^32]: Kitaev, N., Kaiser, Ł., & Levskaya, A. (2020). [Reformer: The Efficient Transformer](https://openreview.net/forum?id=0EXmFzUn5I). *ICLR 2020*.
[^33]: Cao, D., Wang, Y., Duan, J., Zhang, C., Zhu, X., Huang, C., Tong, Y., Xu, B., Bai, J., Tong, J., & Zhang, Q. (2020). [Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting](https://proceedings.neurips.cc/paper/2020/hash/cdf6581cb7aca4b7e19ef136c6e601a5-Abstract.html). *NeurIPS 2020*.


<details>
17 changes: 13 additions & 4 deletions README_zh.md
@@ -55,6 +55,9 @@
<a href="https://github.com/WenjieDu/PyPOTS/blob/main/README.md">
<img alt="README in English" src="https://pypots.com/figs/pypots_logos/readme/US.svg">
</a>
<a href="https://github.com/WenjieDu/PyPOTS/blob/main/README_zh.md">
<img alt="README in Chinese" src="https://pypots.com/figs/pypots_logos/readme/CN.svg">
</a>
</p>

⦿ `Motivation`: Due to sensor failures, communication anomalies, and other unforeseeable reasons, time series collected in real-world environments commonly contain missing values,
@@ -91,10 +94,10 @@ PyPOTS currently supports multivariate POTS data imputation, forecasting, classification, clustering, and

🔥 Note: models whose names carry `🧑‍🔧` in the table (e.g. Transformer, iTransformer, Informer) were not proposed in their original papers as algorithms that can handle POTS data,
so their inputs cannot contain missing values, they cannot accept POTS data as input, and they are certainly not imputation algorithms.
**To make the above models applicable to POTS data, we adapt them with the same embedding strategy and training approach (ORT+MIT) used in the [SAITS paper](https://arxiv.org/pdf/2202.08516)**.
**To make the above models applicable to POTS data, we adapt them with the same embedding strategy and training approach (ORT+MIT) used in the [SAITS paper](https://arxiv.org/pdf/2202.08516)[^1]**.
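
GRU-D, the other model named in this PR's title, would be driven the same way. Below is a brief sketch, assuming the PR exposes it as `pypots.imputation.GRUD` and that `rnn_hidden_size` carries over from the existing GRU-D classifier; both names are assumptions, not confirmed by this diff.

```python
import numpy as np
from pypots.imputation import GRUD  # assumed export name added by this PR

# Toy POTS data, same shape convention as the StemGNN sketch above.
X = np.random.randn(100, 48, 37)
X[np.random.rand(*X.shape) < 0.1] = np.nan

grud = GRUD(
    n_steps=48,
    n_features=37,
    rnn_hidden_size=64,  # assumed: GRU hidden size, mirroring the existing GRU-D classifier
    epochs=10,
)
grud.fit({"X": X})
X_hat = grud.impute({"X": X})  # imputed copy of X
```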

| **Type** | **Algorithm** | **Imputation** | **Forecasting** | **Classification** | **Clustering** | **Anomaly Detection** | **Year - Venue** |
|:--------------|:-----------------------------------|:------:|:------:|:------:|:------:|:--------:|:-----------------|
| **Type** | **Algorithm** | **Imputation** | **Forecasting** | **Classification** | **Clustering** | **Anomaly Detection** | **Year - Venue** |
|:--------------|:----------------------------|:------:|:------:|:------:|:------:|:--------:|:-----------------|
| Neural Net | iTransformer🧑‍🔧[^24] | ✅ | | | | | `2024 - ICLR` |
| Neural Net | SAITS[^1] | ✅ | | | | | `2023 - ESWA` |
| Neural Net | FreTS🧑‍🔧[^23] | ✅ | | | | | `2023 - NeurIPS` |
@@ -109,7 +112,7 @@ PyPOTS currently supports multivariate POTS data imputation, forecasting, classification, clustering, and
| Neural Net | SCINet🧑‍🔧[^30] | ✅ | | | | | `2022 - NeurIPS` |
| Neural Net | Nonstationary Tr.🧑‍🔧[^25] | ✅ | | | | | `2022 - NeurIPS` |
| Neural Net | FiLM🧑‍🔧[^22] | ✅ | | | | | `2022 - NeurIPS` |
| Neural Net | RevIN_SCInet🧑‍🔧[^31] | ✅ | | | | | `2022 - ICLR` |
| Neural Net | RevIN_SCINet🧑‍🔧[^31] | ✅ | | | | | `2022 - ICLR` |
| Neural Net | Pyraformer🧑‍🔧[^26] | ✅ | | | | | `2022 - ICLR` |
| Neural Net | Raindrop[^5] | | | ✅ | | | `2022 - ICLR` |
| Neural Net | FEDformer🧑‍🔧[^20] | ✅ | | | | | `2022 - ICML` |
@@ -119,6 +122,8 @@ PyPOTS currently supports multivariate POTS data imputation, forecasting, classification, clustering, and
| Neural Net | US-GAN[^10] | ✅ | | | | | `2021 - AAAI` |
| Neural Net | CRLI[^6] | | | | ✅ | | `2021 - AAAI` |
| Probabilistic | BTTF[^8] | | ✅ | | | | `2021 - TPAMI` |
| Neural Net | StemGNN🧑‍🔧[^33] | ✅ | | | | | `2020 - NeurIPS` |
| Neural Net | Reformer🧑‍🔧[^32] | ✅ | | | | | `2020 - ICLR` |
| Neural Net | GP-VAE[^11] | ✅ | | | | | `2020 - AISTATS` |
| Neural Net | VaDER[^7] | | | | ✅ | | `2019 - GigaSci.` |
| Neural Net | M-RNN[^9] | ✅ | | | | | `2019 - TBME` |
@@ -339,6 +344,10 @@ The PyPOTS community is open, transparent, and friendly. Let's work together
[^27]: Wang, H., Peng, J., Huang, F., Wang, J., Chen, J., & Xiao, Y. (2023). [MICN: Multi-scale Local and Global Context Modeling for Long-term Series Forecasting](https://openreview.net/forum?id=zt53IDUR1U). *ICLR 2023*.
[^28]: Das, A., Kong, W., Leach, A., Mathur, S., Sen, R., & Yu, R. (2023). [Long-term Forecasting with TiDE: Time-series Dense Encoder](https://openreview.net/forum?id=pCbC3aQB5W). *TMLR 2023*.
[^29]: Liu, Y., Li, C., Wang, J., & Long, M. (2023). [Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors](https://proceedings.neurips.cc/paper_files/paper/2023/hash/28b3dc0970fa4624a63278a4268de997-Abstract-Conference.html). *NeurIPS 2023*.
[^30]: Liu, M., Zeng, A., Chen, M., Xu, Z., Lai, Q., Ma, L., & Xu, Q. (2022). [SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction](https://proceedings.neurips.cc/paper_files/paper/2022/hash/266983d0949aed78a16fa4782237dea7-Abstract-Conference.html). *NeurIPS 2022*.
[^31]: Kim, T., Kim, J., Tae, Y., Park, C., Choi, J. H., & Choo, J. (2022). [Reversible Instance Normalization for Accurate Time-Series Forecasting against Distribution Shift](https://openreview.net/forum?id=cGDAkQo1C0p). *ICLR 2022*.
[^32]: Kitaev, N., Kaiser, Ł., & Levskaya, A. (2020). [Reformer: The Efficient Transformer](https://openreview.net/forum?id=0EXmFzUn5I). *ICLR 2020*.
[^33]: Cao, D., Wang, Y., Duan, J., Zhang, C., Zhu, X., Huang, C., Tong, Y., Xu, B., Bai, J., Tong, J., & Zhang, Q. (2020). [Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting](https://proceedings.neurips.cc/paper/2020/hash/cdf6581cb7aca4b7e19ef136c6e601a5-Abstract.html). *NeurIPS 2020*.


<details>