Merge pull request #337 from WenjieDu/dev

Add Informer, speed up CI testing, and make self-attention operator replaceable

WenjieDu authored Apr 2, 2024
2 parents 356277b + 6bc1be0 commit 590d7be
Showing 42 changed files with 980 additions and 138 deletions.
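The headline change is a new `Informer` imputation model exposed under `pypots.imputation`. A minimal usage sketch follows; the hyperparameter names (e.g. `d_ffn`, `factor`) are assumed from sibling PyPOTS imputers such as Autoformer rather than guaranteed by this diff.

```python
import numpy as np
from pypots.imputation import Informer

# Toy data: 48 samples, 24 time steps, 5 features; NaN marks missing values.
X = np.random.randn(48, 24, 5)
X[np.random.rand(48, 24, 5) < 0.1] = np.nan

model = Informer(
    n_steps=24,      # length of each input series
    n_features=5,    # number of variables per step
    n_layers=2,      # encoder depth
    d_model=64,      # hidden dimension
    n_heads=4,       # attention heads
    d_ffn=128,       # feed-forward width (assumed parameter name)
    factor=3,        # ProbSparse attention sampling factor (assumed parameter name)
    dropout=0.1,
    epochs=10,
)
model.fit({"X": X})                  # trains with the SAITS-style MIT objective
imputation = model.impute({"X": X})  # -> (48, 24, 5) array with NaNs filled
```

As with other PyPOTS models, `fit` should also accept a path to an HDF5 dataset file in place of the in-memory dict.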
3 changes: 3 additions & 0 deletions README.md
@@ -203,6 +203,7 @@ This functionality is implemented with the [Microsoft NNI](https://github.com/mi
| Neural Net | DLinear | Are Transformers Effective for Time Series Forecasting? [^17] | 2023 |
| Neural Net | ETSformer | Exponential Smoothing Transformers for Time-series Forecasting [^19] | 2023 |
| Neural Net | FEDformer | Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting [^20] | 2022 |
+| Neural Net | Informer | Beyond Efficient Transformer for Long Sequence Time-Series Forecasting [^21] | 2021 |
| Neural Net | Autoformer | Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [^15] | 2021 |
| Neural Net | CSDI | Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation [^12] | 2021 |
| Neural Net | US-GAN | Unsupervised GAN for Multivariate Time Series Imputation [^10] | 2021 |
@@ -332,6 +333,8 @@ PyPOTS community is open, transparent, and surely friendly. Let's work together
[^18]: Nie, Y., Nguyen, N. H., Sinthong, P., & Kalagnanam, J. (2023). [A time series is worth 64 words: Long-term forecasting with transformers](https://openreview.net/forum?id=Jbdc0vTOcol). *ICLR 2023*
[^19]: Woo, G., Liu, C., Sahoo, D., Kumar, A., & Hoi, S. (2023). [ETSformer: Exponential Smoothing Transformers for Time-series Forecasting](https://openreview.net/forum?id=5m_3whfo483). *ICLR 2023*
[^20]: Zhou, T., Ma, Z., Wen, Q., Wang, X., Sun, L., & Jin, R. (2022). [FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting](https://proceedings.mlr.press/v162/zhou22g.html). *ICML 2022*.
+[^21]: Zhou, H., Zhang, S., Peng, J., Zhang, S., Li, J., Xiong, H., & Zhang, W. (2021). [Informer: Beyond efficient transformer for long sequence time-series forecasting](https://ojs.aaai.org/index.php/AAAI/article/view/17325). *AAAI 2021*.



<details>
1 change: 1 addition & 0 deletions docs/index.rst
@@ -181,6 +181,7 @@ Imputation Neural Net PatchTST (A Time Series is Worth
Imputation Neural Net DLinear (Are transformers effective for time series forecasting?) 2023 :cite:`zeng2023dlinear`
Imputation Neural Net ETSformer (Exponential Smoothing Transformers for Time-series Forecasting) 2023 :cite:`woo2023etsformer`
Imputation Neural Net FEDformer (Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting) 2022 :cite:`zhou2022fedformer`
+Imputation Neural Net Informer (Beyond Efficient Transformer for Long Sequence Time-Series Forecasting) 2021 :cite:`zhou2021informer`
Imputation Neural Net Autoformer (Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting) 2021 :cite:`wu2021autoformer`
Imputation Neural Net US-GAN (Unsupervised GAN for Multivariate Time Series Imputation) 2021 :cite:`miao2021SSGAN`
Imputation Neural Net CSDI (Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation) 2021 :cite:`tashiro2021csdi`
9 changes: 9 additions & 0 deletions docs/pypots.imputation.rst
@@ -73,6 +73,15 @@ pypots.imputation.fedformer
    :show-inheritance:
    :inherited-members:

+pypots.imputation.informer
+------------------------------
+
+.. automodule:: pypots.imputation.informer
+    :members:
+    :undoc-members:
+    :show-inheritance:
+    :inherited-members:
+
pypots.imputation.autoformer
------------------------------

10 changes: 10 additions & 0 deletions docs/references.bib
@@ -544,3 +544,13 @@ @inproceedings{zhou2022fedformer
pdf = {https://proceedings.mlr.press/v162/zhou22g/zhou22g.pdf},
url = {https://proceedings.mlr.press/v162/zhou22g.html},
}
+
+@inproceedings{zhou2021informer,
+title={Informer: Beyond efficient transformer for long sequence time-series forecasting},
+author={Zhou, Haoyi and Zhang, Shanghang and Peng, Jieqi and Zhang, Shuai and Li, Jianxin and Xiong, Hui and Zhang, Wancai},
+booktitle={Proceedings of the AAAI conference on artificial intelligence},
+volume={35},
+number={12},
+pages={11106--11115},
+year={2021}
+}
2 changes: 2 additions & 0 deletions pypots/imputation/__init__.py
@@ -16,6 +16,7 @@
from .etsformer import ETSformer
from .fedformer import FEDformer
from .crossformer import Crossformer
+from .informer import Informer
from .autoformer import Autoformer
from .dlinear import DLinear
from .patchtst import PatchTST
@@ -36,6 +37,7 @@
"TimesNet",
"PatchTST",
"DLinear",
"Informer",
"Autoformer",
"BRITS",
"MRNN",
4 changes: 2 additions & 2 deletions pypots/imputation/autoformer/modules/core.py
@@ -10,10 +10,10 @@
from .submodules import (
    SeasonalLayerNorm,
    AutoformerEncoderLayer,
-    AutoformerEncoder,
    AutoCorrelation,
    AutoCorrelationLayer,
)
+from ...informer.modules.submodules import InformerEncoder
from ....nn.modules.transformer.embedding import DataEmbedding
from ....utils.metrics import calc_mse

@@ -43,7 +43,7 @@ def __init__(
            dropout=dropout,
            with_pos=False,
        )
-        self.encoder = AutoformerEncoder(
+        self.encoder = InformerEncoder(
            [
                AutoformerEncoderLayer(
                    AutoCorrelationLayer(
49 changes: 0 additions & 49 deletions pypots/imputation/autoformer/modules/submodules.py
@@ -285,35 +285,6 @@ def forward(self, x, attn_mask=None):
        return res, attn


-class AutoformerEncoder(nn.Module):
-    def __init__(self, attn_layers, conv_layers=None, norm_layer=None):
-        super().__init__()
-        self.attn_layers = nn.ModuleList(attn_layers)
-        self.conv_layers = (
-            nn.ModuleList(conv_layers) if conv_layers is not None else None
-        )
-        self.norm = norm_layer
-
-    def forward(self, x, attn_mask=None):
-        attns = []
-        if self.conv_layers is not None:
-            for attn_layer, conv_layer in zip(self.attn_layers, self.conv_layers):
-                x, attn = attn_layer(x, attn_mask=attn_mask)
-                x = conv_layer(x)
-                attns.append(attn)
-            x, attn = self.attn_layers[-1](x)
-            attns.append(attn)
-        else:
-            for attn_layer in self.attn_layers:
-                x, attn = attn_layer(x, attn_mask=attn_mask)
-                attns.append(attn)
-
-        if self.norm is not None:
-            x = self.norm(x)
-
-        return x, attns
-
-
class AutoformerDecoderLayer(nn.Module):
    """
    Autoformer decoder layer with the progressive decomposition architecture
@@ -372,23 +343,3 @@ def forward(self, x, cross, x_mask=None, cross_mask=None):
            1, 2
        )
        return x, residual_trend
-
-
-class AutoformerDecoder(nn.Module):
-    def __init__(self, layers, norm_layer=None, projection=None):
-        super().__init__()
-        self.layers = nn.ModuleList(layers)
-        self.norm = norm_layer
-        self.projection = projection
-
-    def forward(self, x, cross, x_mask=None, cross_mask=None, trend=None):
-        for layer in self.layers:
-            x, residual_trend = layer(x, cross, x_mask=x_mask, cross_mask=cross_mask)
-            trend = trend + residual_trend
-
-        if self.norm is not None:
-            x = self.norm(x)
-
-        if self.projection is not None:
-            x = self.projection(x)
-        return x, trend
24 changes: 20 additions & 4 deletions pypots/imputation/crossformer/modules/submodules.py
@@ -9,7 +9,7 @@
import torch.nn as nn
from einops import rearrange, repeat

-from ....nn.modules.transformer import MultiHeadAttention
+from ....nn.modules.transformer import ScaledDotProductAttention, MultiHeadAttention


class TwoStageAttentionLayer(nn.Module):
@@ -33,10 +33,26 @@ def __init__(
        super().__init__()
        d_ff = 4 * d_model if d_ff is None else d_ff
        self.time_attention = MultiHeadAttention(
-            n_heads, d_model, d_k, d_v, attn_dropout
+            n_heads,
+            d_model,
+            d_k,
+            d_v,
+            ScaledDotProductAttention(d_k**0.5, attn_dropout),
        )
-        self.dim_sender = MultiHeadAttention(n_heads, d_model, d_k, d_v, attn_dropout)
-        self.dim_receiver = MultiHeadAttention(n_heads, d_model, d_k, d_v, attn_dropout)
+        self.dim_sender = MultiHeadAttention(
+            n_heads,
+            d_model,
+            d_k,
+            d_v,
+            ScaledDotProductAttention(d_k**0.5, attn_dropout),
+        )
+        self.dim_receiver = MultiHeadAttention(
+            n_heads,
+            d_model,
+            d_k,
+            d_v,
+            ScaledDotProductAttention(d_k**0.5, attn_dropout),
+        )
        self.router = nn.Parameter(torch.randn(seg_num, factor, d_model))

        self.dropout = nn.Dropout(dropout)
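The hunk above is the heart of the "make self-attention operator replaceable" change: `MultiHeadAttention` now receives an attention-operator instance (here `ScaledDotProductAttention` with temperature `d_k**0.5`) instead of a bare dropout rate, so any module honoring the same contract can be swapped in. A minimal sketch of a custom operator follows; the `(q, k, v, attn_mask) -> (output, attn)` contract and tensor layout are assumptions inferred from this diff, and `CosineAttention` is a hypothetical name, not part of the PR.

```python
import torch
import torch.nn as nn


class CosineAttention(nn.Module):
    """Hypothetical drop-in operator: cosine-similarity attention instead of
    scaled dot-product, kept to the same (q, k, v, attn_mask) -> (output, attn)
    contract that ScaledDotProductAttention appears to follow in this diff."""

    def __init__(self, attn_dropout: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(attn_dropout)

    def forward(self, q, k, v, attn_mask=None, **kwargs):
        # q, k, v: (..., seq_len, d); normalizing makes scores cosine similarities.
        q = torch.nn.functional.normalize(q, dim=-1)
        k = torch.nn.functional.normalize(k, dim=-1)
        attn = torch.matmul(q, k.transpose(-2, -1))
        if attn_mask is not None:
            attn = attn.masked_fill(attn_mask == 0, -1e9)
        attn = self.dropout(torch.softmax(attn, dim=-1))
        output = torch.matmul(attn, v)
        return output, attn


# Swapped in exactly where the diff passes ScaledDotProductAttention:
# MultiHeadAttention(n_heads, d_model, d_k, d_v, CosineAttention(attn_dropout))
```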
4 changes: 2 additions & 2 deletions pypots/imputation/fedformer/modules/core.py
@@ -9,11 +9,11 @@

from .submodules import MultiWaveletTransform, FourierBlock
from ...autoformer.modules.submodules import (
-    AutoformerEncoder,
    AutoformerEncoderLayer,
    AutoCorrelationLayer,
    SeasonalLayerNorm,
)
+from ...informer.modules.submodules import InformerEncoder
from ....nn.modules.transformer.embedding import DataEmbedding
from ....utils.metrics import calc_mse

@@ -57,7 +57,7 @@ def __init__(
f"Unsupported version: {version}. Please choose from ['Wavelets', 'Fourier']."
)

self.encoder = AutoformerEncoder(
self.encoder = InformerEncoder(
[
AutoformerEncoderLayer(
AutoCorrelationLayer(
17 changes: 17 additions & 0 deletions pypots/imputation/informer/__init__.py
@@ -0,0 +1,17 @@
"""
The package of the partially-observed time-series imputation model Informer.
Refer to the paper "Wu, H., Xu, J., Wang, J., & Long, M. (2021).
Informer: Decomposition transformers with auto-correlation for long-term series forecasting. NeurIPS 2021.".
"""

# Created by Wenjie Du <wenjay.du@gmail.com>
# License: BSD-3-Clause


from .model import Informer

__all__ = [
"Informer",
]
24 changes: 24 additions & 0 deletions pypots/imputation/informer/data.py
@@ -0,0 +1,24 @@
"""
Dataset class for Informer.
"""

# Created by Wenjie Du <wenjay.du@gmail.com>
# License: BSD-3-Clause

from typing import Union

from ..saits.data import DatasetForSAITS


class DatasetForInformer(DatasetForSAITS):
"""Actually Informer uses the same data strategy as SAITS, needs MIT for training."""

def __init__(
self,
data: Union[dict, str],
return_X_ori: bool,
return_labels: bool,
file_type: str = "h5py",
rate: float = 0.2,
):
super().__init__(data, return_X_ori, return_labels, file_type, rate)
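`DatasetForInformer` simply reuses the SAITS data pipeline: a fraction of the observed values (`rate=0.2` here) is hidden artificially, and reconstructing them supplies the training signal. A rough sketch of that masked-imputation-task idea follows; it is illustrative only, not PyPOTS's actual implementation, and `mit_mask` is a hypothetical helper name.

```python
import numpy as np


def mit_mask(X: np.ndarray, rate: float = 0.2, seed: int = 42):
    """Hide `rate` of the observed entries so a model can be trained to
    reconstruct them (the MIT objective). Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    observed = ~np.isnan(X)
    artificially_missing = observed & (rng.random(X.shape) < rate)
    X_masked = X.copy()
    X_masked[artificially_missing] = np.nan
    # 1 where a value was hidden on purpose: the MIT loss is computed there.
    indicating_mask = artificially_missing.astype(np.float32)
    return X_masked, indicating_mask
```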