We trained a deep convolutional neural network to recognize ASL (American Sign Language) hand gestures in real time.
Our original idea was a mobile application you could download on your phone to translate ASL hand gestures in real time. So far, we have only trained the model to recognize the ASL letters.
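The model described above can be sketched as a small image-classification CNN. This is a hypothetical illustration, not the project's actual architecture: the input size (64x64 grayscale) and layer sizes are assumptions.

```python
# Hypothetical sketch of a small CNN for classifying the 26 static ASL
# letter gestures. Input shape and layer sizes are assumptions.
import tensorflow as tf

NUM_LETTERS = 26  # A-Z


def build_model(input_shape=(64, 64, 1), num_classes=NUM_LETTERS):
    model = tf.keras.Sequential([
        # Two conv/pool stages extract local hand-shape features
        tf.keras.layers.Conv2D(32, 3, activation="relu",
                               input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        # Classifier head maps features to letter probabilities
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


model = build_model()
```

A model like this would be trained on labeled images of each letter gesture and then used for per-frame prediction at inference time.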
If we were to expand on this idea, we would first need more images per hand gesture to train a more accurate model. Next, we would expand the set of gestures the model can recognize beyond the ASL letters. Finally, we would need to port the model to a mobile environment so users could download it.
- Python 3 (3.6.7)
- Cython (you will have to build the Cython module)
- TensorFlow
- NumPy
- OpenCV 3
- After you have installed the dependencies, run the `real-time.py` script to launch the project.