
Examples with Maté 🧉

Install locally

First, install Maté by cloning the yerbamate repository:

git clone https://github.com/ilex-paraguariensis/yerbamate -b v2

Then move into the yerbamate directory and run the installer:

cd yerbamate
python install.py

Then install the requirements:

pip install -r requirements.txt

Then clone this repo:

git clone https://github.com/ilex-paraguariensis/vision -b v2

Then test that everything is working by running:

mate list models

This should output:

	vit
	lightning
	jax
	keras

Running the project

To run the project, you can use Maté to run different configurations. Look at resnet/hyperparameters/vanilla.json and vit/hyperparameters/vanilla.json for example configurations. Any configuration file can be selected for training. To train a model, run:

mate train {model_name} {hyperparameter_file}

where {model_name} is the name of a model, e.g. resnet or vit, and {hyperparameter_file} is the name of the hyperparameter file, which also identifies the experiment.
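The exact configuration schema is defined by Maté, so the snippet below is only a rough, hypothetical sketch: every key name here is illustrative, and vit/hyperparameters/vanilla.json is the authoritative example.

{
  "_note": "illustrative sketch only; see vit/hyperparameters/vanilla.json for the actual schema",
  "model": "vit",
  "dataset": "cifar10",
  "batch_size": 128,
  "optimizer": {"type": "torch.optim.Adam", "lr": 0.0001},
  "max_epochs": 100
}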

Logging

The project uses Weights and Biases by default to log the training process. You can also select any PyTorch Lightning logger, e.g., TensorBoardLogger or CSVLogger. See /vit/hyperparameters/tensorboard.json for an example.
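Since loggers are created from configuration arguments, switching loggers should amount to editing a logger entry in the JSON. A hypothetical sketch, assuming the logger is specified by class path plus constructor arguments (key names are illustrative; /vit/hyperparameters/tensorboard.json shows the real layout):

{
  "_note": "hypothetical sketch; see /vit/hyperparameters/tensorboard.json for the real layout",
  "logger": {
    "type": "pytorch_lightning.loggers.TensorBoardLogger",
    "save_dir": "logs"
  }
}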

Training

You can select any combination of your models with hyperparameters, for example:

mate train vit cifar100 # train ViT on CIFAR-100
mate train resnet fine_tune # fine-tune a ResNet pretrained on ImageNet on CIFAR
mate train vit small_datasets # model from the Vision Transformer for Small-Size Datasets paper
mate train vit vanilla # original ViT paper: An Image is Worth 16x16 Words

You can subsequently restart training with the same configuration by running:

mate restart vit vanilla

Experimenting and trying other models

You can try other models by changing the model in the hyperparameters or by creating a new configuration file. Over 30 ViT variants are available to experiment with. You can also fork the vit models and change the source code as you wish:

mate clone vit awesome_vit

Then, change the models in project/models/awesome_vit and keep on experimenting.
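Assuming the clone keeps the original hyperparameter files (an assumption; check the cloned folder for the actual file names), the fork is then trained like any other model:

mate train awesome_vit vanilla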

Customizing the hyperparameters

You can customize the hyperparameters by changing the hyperparameter file. For example, you can change the model, learning rate, batch size, optimizer, etc. The project is not limited to the CIFAR datasets: by adding a PyTorch Lightning DataModule, you can train on any dataset (a minimal sketch follows). Optimizers, trainers, models, and PyTorch Lightning modules are created directly from the arguments in the configuration file and the corresponding PyTorch packages.
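Below is a minimal LightningDataModule sketch. MNIST is used purely as an illustration and the class name is hypothetical; how the module is referenced from a Maté configuration follows the same pattern as the project's existing data modules.

import pytorch_lightning as pl
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms


class MNISTDataModule(pl.LightningDataModule):
    # Hypothetical example module; swap MNIST for any torchvision or custom dataset.

    def __init__(self, data_dir: str = "./data", batch_size: int = 64):
        super().__init__()
        self.data_dir = data_dir
        self.batch_size = batch_size
        self.transform = transforms.ToTensor()

    def prepare_data(self):
        # Called once per node: download the data to disk.
        datasets.MNIST(self.data_dir, train=True, download=True)
        datasets.MNIST(self.data_dir, train=False, download=True)

    def setup(self, stage=None):
        # Split the 60k training images into train/val; keep the test set separate.
        full = datasets.MNIST(self.data_dir, train=True, transform=self.transform)
        self.train_set, self.val_set = random_split(full, [55000, 5000])
        self.test_set = datasets.MNIST(self.data_dir, train=False, transform=self.transform)

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, shuffle=True)

    def val_dataloader(self):
        return DataLoader(self.val_set, batch_size=self.batch_size)

    def test_dataloader(self):
        return DataLoader(self.test_set, batch_size=self.batch_size)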

Special thanks

Special thanks to the legend lucidrains for the vit-pytorch library. His license applies to the ViT models in this project.
