Image Classification using a Convolutional Neural Network on the Flowers-102 Dataset

See the Image Classification Report PDF for an in-depth review of the process.

Project Overview

The project trains a deep convolutional neural network on the Flowers-102 dataset. The model architecture consists of several convolutional layers followed by fully connected layers. The training process includes data augmentation techniques such as random horizontal flips and random rotations to improve the generalization of the model.
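As a rough sketch of that pipeline, the snippet below loads Flowers-102 via torchvision and applies the augmentations mentioned above. The input size, rotation range, normalization statistics, and batch size are illustrative assumptions, not necessarily the values used in the notebook.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Augmentation for training: random horizontal flip and random rotation,
# as described above. Sizes and parameters here are placeholder assumptions.
train_transform = transforms.Compose([
    transforms.Resize((224, 224)),          # example input size
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(degrees=15),  # example rotation range
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# No augmentation at test time, only resizing and normalization
test_transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# torchvision ships Flowers-102 directly
train_set = datasets.Flowers102(root="data", split="train",
                                transform=train_transform, download=True)
test_set = datasets.Flowers102(root="data", split="test",
                               transform=test_transform, download=True)

train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
test_loader = DataLoader(test_set, batch_size=64, shuffle=False)
```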

Prerequisites

  • Python 3
  • Google Colab (recommended for training with GPU support)
  • Google Drive (to save your model; see the mounting sketch after this list)
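If you want to save the trained model to Google Drive, mounting Drive from the Colab notebook looks like this; the checkpoint path is only an example.

```python
from google.colab import drive

# Mount Google Drive so checkpoints can be written to it
drive.mount('/content/drive')

# Example save location -- adjust to your own folder structure
SAVE_PATH = '/content/drive/MyDrive/IMLO-project/best_model.pth'
```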

Install Packages

!pip install torch torchvision (these come preinstalled on Google Colab, but running the command will update them if needed)

Running the Code

On Google Colab, click Runtime and then Run all, and the code should execute: it first trains the model, then saves and reloads the best model, and finally runs the evaluation loop, which reports the final test classification accuracy.
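The notebook contains the full training code; the sketch below only illustrates that train / save / load / evaluate flow. The small nn.Sequential model is a stand-in for the project's actual CNN, the learning rate and epoch count are guesses, and it reuses the train_loader, test_loader, and SAVE_PATH names from the earlier sketches.

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Minimal stand-in for the project's CNN: a few convolutional blocks followed
# by fully connected layers. The real architecture in the notebook is deeper.
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 56 * 56, 256), nn.ReLU(),
    nn.Linear(256, 102),                        # 102 flower classes
).to(device)

criterion = nn.CrossEntropyLoss()               # cross-entropy loss, as described
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)  # learning rate is a guess

num_epochs = 30                                 # placeholder; the notebook sets its own value
for epoch in range(num_epochs):
    model.train()
    for images, labels in train_loader:         # loader from the earlier sketch
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Save the trained weights (the notebook keeps the best model seen during training)
torch.save(model.state_dict(), SAVE_PATH)

# Load the saved model back and run the evaluation loop on the test split
model.load_state_dict(torch.load(SAVE_PATH))
model.eval()
correct, total = 0, 0
with torch.no_grad():
    for images, labels in test_loader:
        images, labels = images.to(device), labels.to(device)
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)
print(f"Final test classification accuracy: {correct / total:.2%}")
```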

Tips

When running the code on Google Colab, make sure to change the runtime type to GPU for faster execution.
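To confirm that the GPU runtime is actually being picked up, you can run a quick check in a notebook cell:

```python
import torch

# Should print True and the GPU name once the runtime type is set to GPU
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```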

About

This project uses a deep CNN to classify images from the Flowers-102 dataset. I employed cross-entropy loss for classification and the AdamW optimizer for stable training. Data augmentation techniques such as random flips and rotations are used to improve model generalization.
