This application uses computer vision and deep learning to identify human emotions from images. It relies on OpenCV for image processing and on PyTorch and TensorFlow for model training, and provides an intuitive interface for accurately classifying emotions from facial expressions.
- Data Collection: Use a diverse dataset containing images of faces labeled with corresponding emotions.
- Data Preprocessing: Process images with OpenCV so they are suitable for model input (a preprocessing sketch follows this list).
- Model Development: Implement and compare multiple deep learning models:
  - Convolutional Neural Networks (CNNs) using PyTorch
  - Pre-trained models with TensorFlow (e.g., VGG16, ResNet)
- User Interface: Develop a Python UI for users to upload images and receive emotion predictions.
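As a rough illustration of the preprocessing step, the sketch below loads an image with OpenCV, converts it to grayscale, resizes it, and scales pixel values. The 48x48 target size and the example path are assumptions, and the real pipeline may also include face detection and alignment.

```python
import cv2
import numpy as np

def preprocess_image(path: str, size: int = 48) -> np.ndarray:
    """Load an image, convert it to grayscale, resize it, and scale pixels to [0, 1]."""
    image = cv2.imread(path)
    if image is None:
        raise FileNotFoundError(f"Could not read image: {path}")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    resized = cv2.resize(gray, (size, size), interpolation=cv2.INTER_AREA)
    return resized.astype(np.float32) / 255.0

# Hypothetical usage: returns a (48, 48) float array ready to feed to a model.
# face = preprocess_image("data/happy/0001.jpg")
```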
The dataset used for this project contains a variety of facial expressions, including:
| Emotion  | Description                           |
|----------|---------------------------------------|
| Anger    | Expressing anger or frustration       |
| Disgust  | Showing disgust or disdain            |
| Fear     | Exhibiting fear or anxiety            |
| Happy    | Displaying happiness or joy           |
| Sad      | Reflecting sadness or disappointment  |
| Surprise | Showing surprise or shock             |
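For training, these emotion names are mapped to integer class indices. The ordering below is only an assumed convention for illustration; the repository's dataset loader may use a different one.

```python
# Assumed label ordering; adjust to match the dataset's actual convention.
EMOTIONS = ["Anger", "Disgust", "Fear", "Happy", "Sad", "Surprise"]
LABEL_TO_INDEX = {name: i for i, name in enumerate(EMOTIONS)}
INDEX_TO_LABEL = {i: name for name, i in LABEL_TO_INDEX.items()}

print(LABEL_TO_INDEX["Happy"])   # 3
print(INDEX_TO_LABEL[5])         # Surprise
```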
To set up the project locally, follow these steps:

- Clone the repository:

      git clone https://github.com/yourusername/emotion-detection.git
      cd emotion-detection

- Install the required packages:

      pip install -r requirements.txt
To train the emotion detection models, run:

    python train_model.py
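For reference, here is a minimal sketch of the kind of PyTorch CNN that train_model.py could define, assuming 48x48 grayscale inputs and the six emotion classes listed above; layer sizes and hyperparameters are illustrative, not the project's actual architecture.

```python
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    """Small CNN for 48x48 grayscale face crops (illustrative architecture)."""

    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Sanity check on a fake batch of eight 48x48 grayscale images.
logits = EmotionCNN()(torch.randn(8, 1, 48, 48))
print(logits.shape)  # torch.Size([8, 6])
```

The TensorFlow transfer-learning variant might look roughly like the following, assuming face crops resized to 224x224 RGB for VGG16; the frozen base and small dense head are likewise placeholders rather than the trained configuration.

```python
import tensorflow as tf

# Frozen VGG16 convolutional base with a small classification head on top.
base = tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                   input_shape=(224, 224, 3))
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(6, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```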
To launch the interactive application, execute:

    python app.py
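As one possible shape for the upload-and-predict UI, the sketch below uses Gradio with a stub classifier; the toolkit choice and the stub are assumptions for illustration, not the project's actual app.py.

```python
import gradio as gr
import numpy as np

EMOTIONS = ["Anger", "Disgust", "Fear", "Happy", "Sad", "Surprise"]

def predict(image: np.ndarray) -> dict:
    # Placeholder: replace with preprocessing plus the trained model's probabilities.
    probs = np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))
    return {label: float(p) for label, p in zip(EMOTIONS, probs)}

demo = gr.Interface(fn=predict,
                    inputs=gr.Image(type="numpy"),
                    outputs=gr.Label(num_top_classes=3),
                    title="Emotion Detection")
demo.launch()
```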
The results of the emotion detection models are evaluated and presented in the results/ directory. Key outputs include:

- Model Accuracy:
  - CNN (PyTorch): 92%
  - Pre-trained Model (TensorFlow): 95%
- Confusion Matrix: Visualize the performance of the models (a plotting sketch follows this list).
- Sample Predictions: View examples of emotion predictions made by the model.
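As an example of how the confusion-matrix figure could be produced, the sketch below uses scikit-learn and matplotlib, with dummy predictions standing in for the real evaluation outputs; the output filename is illustrative.

```python
import os

import matplotlib.pyplot as plt
import numpy as np
from sklearn.metrics import ConfusionMatrixDisplay

EMOTIONS = ["Anger", "Disgust", "Fear", "Happy", "Sad", "Surprise"]

# Dummy labels/predictions in place of the real evaluation arrays.
rng = np.random.default_rng(0)
y_true = rng.integers(0, len(EMOTIONS), size=200)
y_pred = rng.integers(0, len(EMOTIONS), size=200)

ConfusionMatrixDisplay.from_predictions(y_true, y_pred, display_labels=EMOTIONS,
                                        xticks_rotation=45)
os.makedirs("results", exist_ok=True)
plt.savefig("results/confusion_matrix.png", bbox_inches="tight")
```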
Future enhancements may include:
- Integrating real-time emotion detection using webcam input (a rough capture loop is sketched after this list).
- Expanding the dataset with more diverse images for improved accuracy.
- Enhancing the UI with more features such as emotion tracking over time.
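A rough sketch of what the real-time webcam enhancement could look like, using OpenCV's bundled Haar cascade for face detection; the emotion classification call itself is left as a comment, since it depends on the trained model.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        # A trained model would classify the face crop gray[y:y+h, x:x+w] here.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Emotion Detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```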
- Libraries Used: OpenCV, PyTorch, TensorFlow