⚙️ English | 简体中文
PASSL is a PaddlePaddle-based vision library for state-of-the-art self-supervised learning research. PASSL aims to accelerate the research cycle in self-supervised learning: from designing a new self-supervised task to evaluating the learned representations.
Key features of PASSL:
- Reproducible implementations of SOTA in self-supervision: existing SOTA methods are implemented, including SimCLR, MoCo (v1), MoCo (v2), MoCo-BYOL, BYOL, and BEiT. Supervised classification training is also supported.
- Modular design: it is easy to build new tasks and reuse existing components from other tasks (trainer, models and heads, data transforms, etc.); a minimal composition sketch follows this list.
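As a hedged illustration of this modular composition (plain PaddlePaddle only; `ProjectionNeck` and `SelfSupModel` are made-up names, not PASSL classes), a backbone and a projection neck can be assembled and swapped independently:

```python
# Minimal sketch of the backbone + neck composition idea in plain PaddlePaddle.
# ProjectionNeck and SelfSupModel are illustrative names, not PASSL's API.
import paddle
import paddle.nn as nn
from paddle.vision.models import resnet50


class ProjectionNeck(nn.Layer):
    """Two-layer MLP projection head, as used by e.g. SimCLR / MoCo v2."""
    def __init__(self, in_dim=2048, hidden_dim=2048, out_dim=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.mlp(x)


class SelfSupModel(nn.Layer):
    """Backbone and neck are plain sub-layers, so either can be swapped out."""
    def __init__(self, backbone, neck):
        super().__init__()
        self.backbone = backbone
        self.neck = neck

    def forward(self, x):
        feat = self.backbone(x)                    # [N, 2048, 1, 1] for num_classes=0
        return self.neck(paddle.flatten(feat, 1))  # [N, projection_dim]


# num_classes=0 makes the Paddle ResNet return pooled features instead of logits.
model = SelfSupModel(backbone=resnet50(num_classes=0), neck=ProjectionNeck())
print(model(paddle.randn([4, 3, 224, 224])).shape)  # [4, 128]
```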
🛠️ The ultimate goal of PASSL is to use self-supervised learning to provide more appropriate pre-training weights for downstream tasks while significantly reducing the cost of data annotation.
📣 Recent Update:
- (2022-2-9): Refactoring README
- 🔥 Now:
- Self-Supervised Learning Models
PASSL implements a series of self-supervised learning algorithms; see Document for details on how to use them.
| Method | Epochs | Official results | PASSL results | Backbone | Model | Document |
|---|---|---|---|---|---|---|
| MoCo | 200 | 60.6 | 60.64 | ResNet-50 | download | Train MoCo |
| SimCLR | 100 | 64.5 | 65.3 | ResNet-50 | download | Train SimCLR |
| MoCo v2 | 200 | 67.7 | 67.72 | ResNet-50 | download | Train MoCo |
| MoCo-BYOL | 300 | 71.56 | 72.10 | ResNet-50 | download | Train MoCo-BYOL |
| BYOL | 300 | 72.50 | 71.62 | ResNet-50 | download | Train BYOL |
| PixPro | 100 | 55.1 (fp16) | 57.2 (fp32) | ResNet-50 | download | Train PixPro |
| SimSiam | 100 | 68.3 | 68.4 | ResNet-50 | download | Train SimSiam |
| DenseCL | 200 | 63.62 | 63.37 | ResNet-50 | download | Train DenseCL |
| SwAV | 100 | 72.1 | 72.4 | ResNet-50 | download | Train SwAV |

Benchmark: linear image classification on ImageNet-1K (Top-1 accuracy, %).
Coming soon: more algorithm implementations are already in our plans ...
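Several of the methods above (e.g. SimCLR, MoCo, DenseCL) are contrastive: two augmented views of the same image should produce similar embeddings, while the other images in the batch act as negatives. The sketch below shows such an InfoNCE-style objective in plain PaddlePaddle; it is not PASSL's implementation, and the temperature value is only a common default.

```python
# Minimal InfoNCE (NT-Xent-style) loss sketch in plain PaddlePaddle.
# Not PASSL's implementation; it only illustrates the contrastive objective.
import paddle
import paddle.nn.functional as F


def info_nce(query, key, temperature=0.2):
    """query/key: [N, D] embeddings of two augmented views of the same images."""
    q = F.normalize(query, axis=1)
    k = F.normalize(key, axis=1)
    # Similarity of every query against every key; diagonal entries are the
    # positive pairs, all other entries act as negatives.
    logits = paddle.matmul(q, k, transpose_y=True) / temperature  # [N, N]
    labels = paddle.arange(q.shape[0], dtype='int64')             # positives on the diagonal
    return F.cross_entropy(logits, labels)


loss = info_nce(paddle.randn([8, 128]), paddle.randn([8, 128]))
print(float(loss))
```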
- Classification Models
PASSL implements influential image classification algorithms such as the Vision Transformer and provides the corresponding pre-training weights, with the aim of supporting the construction and research of self-supervised, multimodal, and large-model algorithms. See Classification_Models_Guide.md for more usage details; a weight-loading sketch follows the table below.
| Model | Detail | Tutorial |
|---|---|---|
| ViT | / | PaddleEdu |
| Swin Transformer | / | PaddleEdu |
| CaiT | config | PaddleFleet |
| T2T-ViT | config | PaddleFleet |
| CvT | config | PaddleFleet |
| BEiT | config | unofficial |
| MLP-Mixer | config | PaddleFleet |
| ConvNeXt | config | PaddleFleet |
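Since these models ship with pre-training weights, here is a hedged sketch of loading a downloaded `.pdparams` checkpoint with plain PaddlePaddle (the file path is a placeholder, the possible `state_dict` wrapping is an assumption, and a ResNet-50 stands in for whichever architecture you actually use):

```python
# Hedged sketch: load downloaded pre-training weights into a Paddle model.
# The file path is a placeholder; a ResNet-50 stands in for the real backbone.
import paddle
from paddle.vision.models import resnet50

model = resnet50()
state = paddle.load('path/to/pretrained_model.pdparams')
# Some checkpoints wrap the weights, e.g. {'state_dict': {...}}; unwrap if so.
if isinstance(state, dict) and 'state_dict' in state:
    state = state['state_dict']
model.set_state_dict(state)
model.eval()
```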
🔥 PASSL provides detailed dissections of these algorithms; see the Tutorial for details.
For installation, see INSTALL.md.
For the basic usage of PASSL, please see GETTING_STARTED.md.
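To give a flavor of the linear-evaluation benchmark mentioned above, here is a hedged sketch in plain PaddlePaddle rather than PASSL's config-driven training flow: the backbone is frozen and only a linear classifier is trained. Names and hyperparameters are illustrative; refer to GETTING_STARTED.md for the actual commands.

```python
# Sketch of linear evaluation: freeze a (pre-trained) backbone and train only a
# linear classifier on top. Plain PaddlePaddle, not PASSL's trainer/config flow.
import paddle
import paddle.nn as nn
import paddle.nn.functional as F
from paddle.vision.models import resnet50

backbone = resnet50(num_classes=0)       # returns pooled features, not logits
for p in backbone.parameters():
    p.stop_gradient = True               # freeze the backbone

classifier = nn.Linear(2048, 1000)       # ImageNet-1K linear head
optimizer = paddle.optimizer.Momentum(learning_rate=0.1,
                                      parameters=classifier.parameters())

images = paddle.randn([4, 3, 224, 224])  # stand-in batch; use a real data loader
labels = paddle.randint(0, 1000, [4])

feats = paddle.flatten(backbone(images), 1)        # [4, 2048]
loss = F.cross_entropy(classifier(feats), labels)
loss.backward()
optimizer.step()
optimizer.clear_grad()
print(float(loss))
```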
Self-Supervised Learning (SSL) is a rapidly growing field, and some influential papers are listed here for research use. PASSL seeks to implement self-supervised algorithms with application potential.
- Masked Feature Prediction for Self-Supervised Visual Pre-Training by Chen Wei, Haoqi Fan, Saining Xie, Chao-Yuan Wu, Alan Yuille, Christoph Feichtenhofer.
- Masked Autoencoders Are Scalable Vision Learners by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick.
- Corrupted Image Modeling for Self-Supervised Visual Pre-Training by Yuxin Fang, Li Dong, Hangbo Bao, Xinggang Wang, Furu Wei.
- Are Large-scale Datasets Necessary for Self-Supervised Pre-training? by Alaaeldin El-Nouby, Gautier Izacard, Hugo Touvron, Ivan Laptev, Hervé Jegou, Edouard Grave.
- PeCo: Perceptual Codebook for BERT Pre-training of Vision Transformers by Xiaoyi Dong, Jianmin Bao, Ting Zhang, Dongdong Chen, Weiming Zhang, Lu Yuan, Dong Chen, Fang Wen, Nenghai Yu.
- SimMIM: A Simple Framework for Masked Image Modeling by Zhenda Xie, Zheng Zhang, Yue Cao, Yutong Lin, Jianmin Bao, Zhuliang Yao, Qi Dai, Han Hu.
PASSL is still young and may contain bugs and issues; please report them in our bug tracking system. Contributions are welcome. Besides, if you have any ideas about PASSL, please let us know.
If PASSL is helpful to your research, feel free to cite it:
@misc{passl,
  title = {PASSL: A visual Self-Supervised Learning Library},
  author = {PASSL Contributors},
  howpublished = {\url{https://github.com/PaddlePaddle/PASSL}},
  year = {2022}
}
As shown in the LICENSE.txt file, PASSL is released under the Apache 2.0 license.