vijaye12 edited this page Sep 15, 2024 · 11 revisions

Time Series Foundation Models (TSFM)

In this section, we highlight the papers, blogs, pre-trained models, and open-source code from IBM Research's TSFM group.


HuggingFace releases

Pre-trained models

  1. TinyTimeMixer (TTM): https://huggingface.co/ibm-granite/granite-timeseries-ttm-v1
    🚀 Downloads: 1.1 Million+, Likes: 148 (as of 26 Aug 2024) 🚀

Architectures

  1. PatchTSMixer: https://huggingface.co/docs/transformers/en/model_doc/patchtsmixer

  2. PatchTST: https://huggingface.co/docs/transformers/en/model_doc/patchtst
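Both architectures share the same core idea: a multivariate series is split into fixed-length patches, and each patch becomes one input token. The following is a minimal NumPy sketch of that patching step only (the function name and parameter values are illustrative, not taken from either library); the models themselves then embed and mix these patch tokens.

```python
import numpy as np

def make_patches(series: np.ndarray, patch_length: int, stride: int) -> np.ndarray:
    """Split a 1-D time series into (possibly overlapping) patches.

    Each patch plays the role of one input token ("word") in a
    patch-based model such as PatchTST or PatchTSMixer.
    Returns an array of shape (num_patches, patch_length).
    """
    num_patches = (len(series) - patch_length) // stride + 1
    return np.stack(
        [series[i * stride : i * stride + patch_length] for i in range(num_patches)]
    )

# Example: a 512-step context window with patch_length=16 and stride=8
# yields 63 overlapping patches, so the transformer attends over 63
# tokens instead of 512 raw time steps.
context = np.arange(512, dtype=float)
patches = make_patches(context, patch_length=16, stride=8)
print(patches.shape)  # (63, 16)
```

Patching is what makes these models efficient: attention cost scales with the number of patches rather than the number of raw time steps.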

Publications

4 KDD, 1 ICLR, 2 AAAI, 1 ICML.
🚀 Total citations: 1700 (as of 26 Aug 2024). 🚀

  1. TST: Zerveas, G., Jayaraman, S., Patel, D., Bhamidipaty, A., & Eickhoff, C. A Transformer-Based Framework for Multivariate Time Series Representation Learning. In KDD 2021. (citations: 840)

  2. PatchTST: Nie, Y., Nguyen, N. H., Sinthong, P., & Kalagnanam, J. A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. In ICLR 2023. (citations: 656)

  3. PatchTSMixer: Ekambaram, V., Jati, A., Nguyen, N., Sinthong, P., & Kalagnanam, J. TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting. In KDD 2023. (citations: 59)

  4. NPF: Ekambaram, V., Manglik, K., Mukherjee, S., Sajja, S. S. K., Dwivedi, S., & Raykar, V. Attention based multi-modal new product sales time-series forecasting. In KDD 2020. (citations: 68)

  5. TLAE: Nguyen, N., & Quanz, B. Temporal latent auto-encoder: A method for probabilistic multivariate time series forecasting. AAAI 2021. (citations: 67)

  6. HPRO: Jati, A., Ekambaram, V., Pal, S., Quanz, B., Gifford, W.M., Harsha, P., Siegel, S., Mukherjee, S. and Narayanaswami, C. Hierarchical proxy modeling for improved HPO in time series forecasting. In KDD 2023. (citations: 6)

  7. AutoMixer: Palaskar, S., Ekambaram, V., Jati, A., Gantayat, N., Saha, A., Nagar, S., Nguyen, N.H., Dayama, P., Sindhgatta, R., Mohapatra, P. and Kumar, H. AutoMixer for Improved Multivariate Time-Series Forecasting on Business and IT Observability Data. In AAAI 2024. (citations: 3)

  8. ConCerNet: Zhang, W., Weng, T.W., Das, S., Megretski, A., Daniel, L. and Nguyen, L.M. ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction. In ICML 2023. (citations: 1)

Preprints

  1. TTM: Ekambaram, V., Jati, A., Nguyen, N.H., Dayama, P., Reddy, C., Gifford, W.M. and Kalagnanam, J. Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. arXiv preprint, 2024.

  2. Trang H. Tran, Lam M. Nguyen, Kyongmin Yeo, Nam Nguyen, Dzung Phan, Roman Vaculin, Jayant Kalagnanam. An End-to-End Time Series Model for Simultaneous Imputation and Forecast. arXiv preprint 2023.

  3. Anh Duy Nguyen, Trang H. Tran, Hieu H. Pham, Phi Le Nguyen, Lam M. Nguyen. Learning Robust and Consistent Time Series Representations: A Dilated Inception-Based Approach. arXiv preprint 2023.

Workshops/Invited Talks/Tutorials

  1. Arindam Jati, Vijay Ekambaram, Pankaj Dayama, Nam H. Nguyen, Jayant Kalagnanam. Light-Weight Pre-Trained Mixer Models for Effective Transfer Learning in Multivariate Time Series Forecasting. Presented at the 44th International Symposium on Forecasting (ISF), 2024, held in Dijon, France.

  2. Sumanta Mukherjee, Chandramouli Kamanchi, Pankaj Dayama, Vijay Ekambaram, Arindam Jati, Kameshwaran Sampath. Intervention-Aware Forecasting for Process Control with Sparse Data. Presented at the 44th International Symposium on Forecasting (ISF), 2024, held in Dijon, France.

  3. Lam M. Nguyen, Trang H. Tran, Wang Zhang, Subhro Das, Tsui-Wei Weng. When Machine Learning Meets Dynamical Systems: Theory and Applications. Workshop at the 37th AAAI Conference on Artificial Intelligence (AAAI 2023).

Blogs, Media and News Articles

We sincerely thank all the blog authors for dedicating their valuable time to analyzing and exploring our TSFM models. The analysis and conclusions presented are entirely the work of the respective authors.

IBM Research articles

External Blogs/Media/Articles

Tutorial videos

We sincerely thank all the tutors for dedicating their valuable time to exploring our TSFM models; the analysis and conclusions presented in these tutorials are entirely their own.

  1. TinyTimeMixer TTM Model by IBM - Run in Google Colab for Forecasting (July 5, 2024)

  2. PALS Industry Assisted Lecture Series on AI for IOT: Introduction and Hands-On AI for IOT Data (Oct 12, 2023)

Model-wise List of Open-Source Implementations

🚀 Stars: 2500+, Forks: 480+ (as of 26 Aug 2024) 🚀

| Model | Repository | Stars | Forks | Comment |
|---|---|---|---|---|
| TTM | granite-tsfm | 319 | 160 | From the authors |
| TTM | sktime | -- | -- | -- |
| PatchTSMixer | HuggingFace | -- | -- | From the authors |
| PatchTST | HuggingFace | -- | -- | From the authors |
| PatchTST | GluonTS | -- | -- | -- |
| PatchTST | Nixtla | -- | -- | -- |
| PatchTST | yuqinie98/PatchTST | 1.5k | 256 | From the authors |
| TST | tsai | -- | -- | -- |
| TST | mvts_transformer | 734 | 171 | From the authors |

[-- indicates that our model is included in the library, but stars and forks cannot be tracked at the individual-model level.]