This innovative web app utilizes cutting-edge machine learning techniques to detect isolated sign language gestures. πŸ“²πŸ€–βœ¨ The application combines the power of state-of-the-art deep learning models with a user-friendly interface to provide real-time recognition of sign language signs. πŸ–οΈπŸ”πŸ’‘

dhaneshragu/i-SLR

i-SLR is a machine learning-powered web app that recognizes Indian and American Sign Language signs, helping people who are learning sign language to test their skills.

The app can currently recognize 250 American Sign Language signs and 64 Indian Sign Language signs.

πŸ“— Tech stack

πŸ“½οΈ Video Demo

2023-07-08.22-12-13.mp4

πŸ§ͺ Pipeline

Training and Inference Flowchart

  • Trained a custom Transformer with ASL Dataset containing 90k+ landmark data of 250 American Language signs for 40 Epochs with early stopping. (Best Epoch was 38.)
  • Used transfer learning to fine-tune the custom transformer with INCLUDE dataset containing 64 Indian Sign Language signs by just changing the last logit layer for 100 Epochs.
  • The Top 5 signs with the greatest probability are displayed.
  • πŸ“ˆ Results

    Achieved top 5 accuracies of 97.32% and 95.11%, and cross-entropy losses of 0.534 and 0.766 for training and validation, respectively for the ISL model.

ISL evaluation

🀩 Build your own sign language recognizer using i-SLR

    • First of all git clone this repository and cd to the appropriate folder
    • Go to Dataset-Creation Folder . There are 2 python scripts dataset_creater.py and preprocess.py. Run dataset_creator.py while having the dataset videos in the directory structure as shown :
/Dataset-Creation
β”œβ”€β”€ INCLUDE
β”‚   β”œβ”€β”€ Sign-Category-1
β”‚   └── Sign-Category-2
β”‚       β”œβ”€β”€ 1.Sign-Name-1
β”‚       β”œβ”€β”€ 2.Sign-Name-2
β”‚       β”‚   β”œβ”€β”€ Sign-Video-1.mp4
β”‚       β”‚   └── Sign-Video-2.mp4
β”‚       └── 3.Sign-Name-3
β”œβ”€β”€ dataset_creator.py
└── preprocess.py
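Walking that tree can be sketched as follows (a hypothetical helper, not the actual dataset_creator.py logic): each .mp4 is paired with the sign name taken from its parent folder, stripping the numeric prefix.

```python
from pathlib import Path

def list_videos(root):
    """Yield (sign_name, video_path) pairs for every .mp4 under
    INCLUDE/<Sign-Category>/<N.Sign-Name>/, per the layout above."""
    for video in Path(root, "INCLUDE").rglob("*.mp4"):
        # The parent folder is e.g. "2.Sign-Name-2"; drop the "2." prefix.
        sign = video.parent.name.split(".", 1)[-1]
        yield sign, video

# Example (assumes the tree shown above exists under Dataset-Creation/):
# for sign, path in list_videos("Dataset-Creation"):
#     print(sign, path)
```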
    
    • All the videos will be preprocessed with mediapipe and landmarks will be saved in a csv called train-preprocessed.csv.
    • Go to the fine-tuning section of the iSLR-Notebook.ipynb and replace the train-csv URL with the train-preprocessed.csv path.
    • Run make_json.py to store signs with respect to their labels into a json file.
    • Run the notebook and you can get the model.pth file which can be replaced in flask webapp to generate predictions !!!

πŸ‘€ To use the app:

    • Make sure you are in the cloned repository folder.

    • In terminal , type python app.py and then the flask webapp will start in your browser.

    • Navigate to Indian Sign Language and American Sign Language section, click Start and sign and click Stop when you are done.

    • Viola!! You will get the top-5 predictions of the sign you made.


🦾 Contributors

🌟 Stay connected

Don't forget to ⭐️ star this repository to show your support and stay connected for future updates!
