
AlanConstantino/CSUN-AI-Jam-2019


We trained a deep convolutional neural network to recognize A.S.L. hand gestures in real time.

Our original idea was a mobile application that translates A.S.L. hand gestures in real time. So far, we have only trained our model to recognize A.S.L. letters.

If we were to expand on this idea, we would first need more images of each hand gesture to train a more accurate model. Next, we would grow the set of gestures the model can recognize instead of focusing solely on A.S.L. letters. Finally, we would port the model to a mobile environment so users could download it.

Dependencies

  • Python 3 (3.6.7)
  • Cython (you will need to build the Cython module)
  • TensorFlow
  • NumPy
  • OpenCV 3

How to install?
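
The repository does not spell out the install steps. One plausible route, assuming the dependencies come from PyPI and the Cython module ships with a standard setup.py (neither is confirmed here), is:

    pip install tensorflow numpy opencv-python Cython
    python setup.py build_ext --inplace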

How to use?

  • After installing the dependencies, run the real-time.py script to launch the project (see the command below).
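
From the project root:

    python real-time.py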

Video


Notes

  • This was written using Python 3.6.7.
  • You need a camera connected for the script to run properly.
  • Use the included YOLO weights and cfg files when running the real-time.py script; a sketch of what that detection loop might look like follows.
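
Since the script pairs YOLO weights and a cfg file with a live camera feed, the core of real-time.py is presumably a YOLO detection loop. Below is a minimal sketch of such a loop using OpenCV's DNN module; the file names, the 416x416 input size, and the 0.5 confidence threshold are illustrative assumptions, not values taken from this repository:

    import cv2
    import numpy as np

    # Hypothetical file names; substitute the cfg/weights shipped with the repo.
    net = cv2.dnn.readNetFromDarknet("yolo.cfg", "yolo.weights")
    out_layers = net.getUnconnectedOutLayersNames()  # needs a recent OpenCV

    cap = cv2.VideoCapture(0)  # a connected camera is required (see notes)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # YOLO expects a square, 0-1 normalized RGB blob
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                     swapRB=True, crop=False)
        net.setInput(blob)
        outputs = net.forward(out_layers)
        h, w = frame.shape[:2]
        # Each detection row: [cx, cy, bw, bh, objectness, class scores...]
        for output in outputs:
            for det in output:
                scores = det[5:]
                class_id = int(np.argmax(scores))
                if scores[class_id] > 0.5:  # assumed confidence threshold
                    cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                    x, y = int(cx - bw / 2), int(cy - bh / 2)
                    cv2.rectangle(frame, (x, y), (x + int(bw), y + int(bh)),
                                  (0, 255, 0), 2)
        cv2.imshow("A.S.L. detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()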
