Modality-agnostic Self-Supervised Learning with Meta-Learned Masked Auto-Encoder

PyTorch implementation of "Modality-agnostic Self-Supervised Learning with Meta-Learned Masked Auto-Encoder" (accepted at NeurIPS 2023)

TL;DR: We interpret MAE through the lens of meta-learning and apply advanced meta-learning techniques to improve the unsupervised representations learned by MAE on arbitrary modalities.
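
For intuition, here is a minimal, hypothetical sketch of the meta-learning view of masked auto-encoding. It is a rough paraphrase, not the repository's implementation (see pretrain.py for the real thing); encoder, decoder, inner_lr, and reg_weight below are illustrative placeholders, though the --inner-lr and --reg-weight flags used in the commands further down presumably play these roles.

import torch
import torch.nn.functional as F

def meta_mae_step(encoder, decoder, tokens, mask, inner_lr=0.5, reg_weight=1.0):
    """One illustrative Meta-MAE-style training step on a batch of tokenized inputs.

    tokens: (B, N, D) patch/frame tokens of an arbitrary modality
    mask:   (B, N) boolean, True where a token is masked out
    """
    # Encode with the masked tokens zeroed out (standard MAE-style encoding).
    visible = tokens.masked_fill(mask.unsqueeze(-1), 0.0)
    latent = encoder(visible)

    # Inner loop (adaptation): one gradient step on the latent using the
    # reconstruction loss of the visible tokens, i.e. the "support set".
    recon = decoder(latent)
    support_loss = F.mse_loss(recon[~mask], tokens[~mask])
    (grad,) = torch.autograd.grad(support_loss, latent, create_graph=True)
    latent_adapted = latent - inner_lr * grad

    # Outer loop (meta-objective): reconstruct the masked tokens, i.e. the
    # "query set", from the adapted latent, plus a simple regularizer that
    # keeps the adapted latent close to the original one.
    recon_adapted = decoder(latent_adapted)
    query_loss = F.mse_loss(recon_adapted[mask], tokens[mask])
    reg_loss = F.mse_loss(latent_adapted, latent)
    return query_loss + reg_weight * reg_loss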

Install

conda create -n meta-mae python=3.9
conda activate meta-mae
conda install pytorch==1.12.0 torchvision==0.13.0 torchaudio==0.12.0 cudatoolkit=10.2 -c pytorch
pip install numpy==1.21.5
conda install ignite -c pytorch
pip install timm==0.6.12
pip install librosa
pip install pandas
pip install packaging tensorboard scikit-learn
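
Optionally, a quick sanity check that the pinned versions were installed correctly:

python -c "import torch, torchvision, timm; print(torch.__version__, torchvision.__version__, timm.__version__, torch.cuda.is_available())"
# expected output, roughly: 1.12.0 0.13.0 0.6.12 True (False if no CUDA device is available)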

Download datasets

Pretraining MetaMAE

  • E.g., pretraining on the pamap2 dataset:
python pretrain.py --logdir ./logs_final/pamap2/metamae --seed 0 --model metamae \
	--datadir [DATA_ROOT] --dataset pamap2 \
	--inner-lr 0.5 --reg-weight 1 --num-layer-dec 4 --dropout 0.1 --mask-ratio 0.85

Evaluating MetaMAE

python linear_evaluation.py --ckptdir ./logs_final/pamap2/metamae --seed 0 --model metamae \
	--datadir [DATA_ROOT] --dataset pamap2 \
	--inner-lr 0.5 --reg-weight 1 --num-layer-dec 4 --dropout 0.1 --mask-ratio 0.85
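
For reference, linear_evaluation.py follows the standard linear-probing protocol: the pretrained encoder is frozen and only a linear classifier is trained on top of its features. A minimal, self-contained sketch of that protocol (encoder, feat_dim, num_classes, and loader are placeholders, not the script's actual interface):

import torch
import torch.nn as nn

def linear_probe(encoder, feat_dim, num_classes, loader, epochs=100, lr=1e-3):
    # Freeze the pretrained encoder; only the linear head receives gradients.
    encoder.eval()
    for p in encoder.parameters():
        p.requires_grad_(False)

    head = nn.Linear(feat_dim, num_classes)
    opt = torch.optim.Adam(head.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                feats = encoder(x)          # frozen features
            loss = criterion(head(feats), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return head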
