DAMO-DI-ML/CIKM2023-GCformer
GCformer

GCformer combines a structured global convolutional branch, which processes the long input sequence, with a local Transformer-based branch that captures short-term, recent signals. Experiments demonstrate that GCformer outperforms state-of-the-art methods, reducing MSE on multivariate time series benchmarks by 4.38% and model parameters by 61.92%. Moreover, the global convolutional branch can serve as a plug-in block for other models, including various recently published Transformer-based models, improving their performance by 31.93% on average.
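The two-branch idea above can be sketched in a few lines. This is not the authors' implementation: the function names are hypothetical, the global branch is a plain FFT-based circular convolution with a learned-length kernel, and the local Transformer branch is replaced by a simple recent-window average purely to illustrate how the two outputs are combined.

```python
import numpy as np

def global_conv_branch(x, kernel):
    """Convolve the full sequence with a global kernel via FFT (O(L log L))."""
    L = x.shape[0]
    k = np.zeros(L)
    k[:len(kernel)] = kernel          # zero-pad the kernel to sequence length
    return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(k), n=L)

def local_branch(x, window=4):
    """Stand-in for the local Transformer branch: average over the
    most recent `window` steps at each position."""
    out = np.empty_like(x)
    for t in range(len(x)):
        lo = max(0, t - window + 1)
        out[t] = x[lo:t + 1].mean()
    return out

def gcformer_sketch(x, kernel, alpha=0.5):
    """Convex combination of the global and local branch outputs."""
    return alpha * global_conv_branch(x, kernel) + (1 - alpha) * local_branch(x)
```

In the actual model the combination weight and the kernel are learned, and the local branch is a full attention block; the sketch only shows the data flow.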

Method

[model_structure image]
Figure 1. GCformer overall framework

[global_kernel image]
Figure 2. Different parameterization methods of the global convolution kernel
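One family of parameterizations for a global kernel, in the spirit of the SGConv repository acknowledged below, builds a long kernel from a few short sub-kernels that cover dyadically growing segments with geometrically decaying magnitude. The sketch below is an assumption-laden illustration of that idea, not the parameterization used in the paper.

```python
import numpy as np

def decayed_multiscale_kernel(sub_kernels, L, decay=0.5):
    """Build a length-L global kernel: sub-kernel i is upsampled to a
    segment of length len(sub_kernel) * 2**i, and segments further in
    the past are scaled down by `decay` at each step."""
    k = np.zeros(L)
    pos = 0
    scale = 1.0
    for i, sk in enumerate(sub_kernels):
        seg_len = min(len(sk) * 2 ** i, L - pos)
        if seg_len <= 0:
            break
        # nearest-neighbour upsample of the sub-kernel to the segment length
        idx = (np.arange(seg_len) * len(sk) // seg_len).clip(0, len(sk) - 1)
        k[pos:pos + seg_len] = scale * sk[idx]
        pos += seg_len
        scale *= decay
    return k
```

Because only the short sub-kernels are parameters, the number of weights grows logarithmically with the receptive field while the kernel still spans the whole input.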

Main Results

[boosting_result image]

[full_benchmark image]

Get Started

  1. Install Python 3.6 and PyTorch 1.11.0.
  2. Download data. You can obtain all six benchmarks from [FEDformer] or [Autoformer].
  3. Train the model. We provide experiment scripts for all benchmarks under the folder ./scripts/GCformer. For instance, you can reproduce the experiment result on the illness dataset by running:

bash ./scripts/GCformer/illness.sh

Citation

Contact

Acknowledgement

We are grateful to the following GitHub repositories for their valuable code bases and datasets:

https://github.com/yuqinie98/PatchTST

https://github.com/MAZiqing/FEDformer

https://github.com/ctlllll/SGConv

https://github.com/thuml/Autoformer
