
Introduction:

This project is a PyTorch implementation of the paper "Using filter banks in Convolutional Neural Networks for texture classification" [arXiv] by V. Andrearczyk and Paul F. Whelan.

In this project, the TCNN3 architecture is implemented and trained end to end from scratch (no pretraining) on the DTD dataset.

Architecture
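The T-CNN family described in the paper replaces the last pooling stage of a standard CNN with an "energy" layer that averages each feature map to a single value before the fully connected classifier. The sketch below illustrates that idea; the channel counts and kernel sizes are assumptions (AlexNet-like), not the exact configuration used in this repository:

```python
import torch
import torch.nn as nn

class TCNN3(nn.Module):
    """Sketch of a T-CNN-style network: three conv blocks, an 'energy'
    layer (global average over each feature map), then a fully
    connected classifier. Layer sizes here are illustrative assumptions."""

    def __init__(self, num_classes=47):  # DTD has 47 texture classes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        # Energy layer: average each feature map down to one value.
        self.energy = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Sequential(
            nn.Linear(384, 4096), nn.ReLU(inplace=True), nn.Dropout(),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x):
        x = self.features(x)
        x = self.energy(x).flatten(1)  # (N, 384)
        return self.classifier(x)
```

Because the energy layer pools each feature map down to 1x1, a network of this shape can accept variable input sizes.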

Dataset:

Download the raw images (train/test/val) from the following link:
https://www.robots.ox.ac.uk/~vgg/data/dtd/

Prepare the train, test, and validation images:

Edit the ROOT_PATH variable in create_texture_train_test_val_file.py so that it points to the directory containing the images downloaded from the DTD link above.

Run the following command:
python3 create_texture_train_test_val_file.py
This will separate out the 10 splits of train, test, and validation files into the ./images folder.
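The DTD release ships its ten official splits as plain text files (e.g. labels/train1.txt, val1.txt, test1.txt), each line a relative image path such as banded/banded_0002.jpg. Parsing one of these split files reduces to something like the sketch below (the file layout assumed here is the official DTD release, not necessarily what the repository's script expects):

```python
def read_split(split_file):
    """Parse one DTD split file into (class_name, relative_path) pairs.
    Each non-empty line looks like 'banded/banded_0002.jpg', where the
    directory name is the texture class."""
    pairs = []
    with open(split_file) as f:
        for line in f:
            rel_path = line.strip()
            if rel_path:
                pairs.append((rel_path.split("/")[0], rel_path))
    return pairs
```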

Create pickle files for the train, test, and validation images:

Run the following command:
python3 create_pickle.py
This will create the train, test and val pickle files in the folder: ./Dataset/Texture/DTD/
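The exact layout of these pickle files is defined by create_pickle.py; a common pattern is to serialize the image arrays together with their labels in one dictionary, which can then be written and read back as sketched below (the key names here are illustrative assumptions):

```python
import pickle

def save_dataset(path, images, labels):
    # Serialize images and labels together. The real key names are
    # whatever create_pickle.py uses; these are assumptions.
    with open(path, "wb") as f:
        pickle.dump({"images": images, "labels": labels}, f)

def load_dataset(path):
    """Read back the (images, labels) pair from a dataset pickle."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    return data["images"], data["labels"]
```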

Software required:

Python (ver. 3.7)
PyTorch (ver. 1.3.1)

Models:

Pretrained PyTorch models of the TCNN3 architecture for the DTD dataset can be downloaded from the following link:
https://uflorida-my.sharepoint.com/:f:/g/personal/shantanughosh_ufl_edu/EsslShM1m61Ji2lxzrtI9gUB-yqIhDIntbkzaVHPlYv1vQ?e=Z0CBah

Training:

python3 train.py
This will create the models and place them in the ./Models folder.

Testing:

python3 test.py
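The exact reporting done by test.py is not shown here, but the top-1 accuracy it would compute over a test split reduces to the fraction of samples whose predicted class matches the label:

```python
def top1_accuracy(predictions, labels):
    """Fraction of samples whose predicted class equals the label."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)
```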

Hyperparameters:

Epochs: 400
Learning rate: 0.0001
Batch size: 32
Weight Decay: 0.0005
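Using the hyperparameters above, the optimizer could be configured as sketched below. The choice of SGD is an assumption for illustration; the repository's train.py may use a different optimizer:

```python
import torch

def make_optimizer(model):
    # Learning rate and weight decay taken from the README's
    # hyperparameter list; the optimizer itself is an assumption.
    return torch.optim.SGD(
        model.parameters(),
        lr=0.0001,
        weight_decay=0.0005,
    )
```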

Accuracy on the DTD dataset:

The replicated accuracy on the DTD dataset is 27.8% when training from scratch (end to end, as specified by the authors of the paper).