
🤖 🖐️ Classification of live ASL alphabet gestures with Convolutional Neural Networks


DeepASL

*(Demo video: DeepASL in action)*

Overview

DeepASL uses your webcam feed and some Python code to interpret American Sign Language hand gestures in real time! My goal with this project was to learn the basics of what makes up Convolutional Neural Networks and how Computer Vision can make them interactively useful in the real world.

Installation and Usage

  1. Clone this repo: `git clone https://github.com/cesarealmendarez/DeepASL.git`
  2. Navigate to the project: `cd DeepASL`
  3. Install the required packages: `pip3 install opencv-python mediapipe numpy`
  4. Run DeepASL: `python3 app.py`

What's on my Screen?

Once you run DeepASL, two windows will appear. The Analytics window displays the raw video feed along with the extracted data points used to interpret hand landmarks and steadiness, estimate depth, report output confidence, and trigger the snapshot. The Hand Segmentation window shows the frame the network breaks down into a pattern of 1's and 0's before making its best guess at which ASL letter you're showing!
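The "pattern of 1's and 0's" idea can be sketched as a simple threshold over a grayscale frame. This is a minimal NumPy illustration, not code from `app.py`: the toy pixel values and the 128 threshold are assumptions.

```python
import numpy as np

# A toy 4x4 grayscale "frame" (0-255), standing in for the image
# shown in the Hand Segmentation window. Values are made up.
frame = np.array([
    [  0,  30, 200, 255],
    [ 10, 180, 220,  40],
    [  5, 210, 190,  20],
    [  0,  25,  35,  15],
], dtype=np.uint8)

# Threshold the frame: bright pixels (the hand) become 1, dark
# background becomes 0 -- a binary pattern a CNN can take as input.
binary = (frame > 128).astype(np.uint8)

print(binary)
```

In the real app the segmented hand region would come from the webcam via OpenCV rather than a hard-coded array, but the binarization step works the same way.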

Resources Used

  1. MediaPipe: Used to perceive the shape of the hand and create a skeleton-like outline, enabling segmentation of useful classification features.
  2. MNIST Handwritten Digits Classification using a Convolutional Neural Network (CNN)
  3. A Comprehensive Guide to Convolutional Neural Networks — the ELI5 way
  4. Simple Introduction to Convolutional Neural Networks
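As a companion to the CNN guides listed above, here is a minimal sketch of the convolution operation at the heart of a CNN. This is plain NumPy for illustration only; the vertical-edge kernel and toy input are assumptions, not weights or data from this project.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution (strictly cross-correlation,
    as implemented in most CNN frameworks)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Slide the kernel over the image, summing the
            # element-wise products at each position.
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# Toy 5x5 binary "hand mask" with a vertical edge down the middle.
image = np.array([
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
], dtype=float)

# A Sobel-like vertical-edge kernel (assumed for the example).
kernel = np.array([
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
], dtype=float)

response = conv2d(image, kernel)
print(response)  # largest magnitude near the vertical edge
```

A CNN learns many such kernels automatically, stacking their responses through further layers to go from raw edges up to whole-hand shapes like ASL letters.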
