A Sign-Language interpreter developed for the Microsoft Student's Hackathon 2021.
- Create a virtual environment using Python 3.7.
- Install the required dependencies with `pip3 install -r requirements.txt`.
- Set your hand histogram by running `python3 set_hand_hist.py`.
- You may use `create_gestures.py` to create new custom gestures.
- After this, load the images with `python load_images.py`.
- Train the model with `python cnn_keras.py`.
- All set! Run `server.py` and open http://localhost:5001/ to test the server.
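The hand-histogram step above captures the colour distribution of your hand so it can later be separated from the background. A minimal sketch of that idea with NumPy is shown below; the function names and bin counts are illustrative assumptions, not this project's actual API:

```python
import numpy as np

def build_hand_hist(hand_pixels, bins=16):
    # 2D hue/saturation histogram of sampled hand pixels, normalised to [0, 1].
    # hand_pixels is an (N, 2) array of [hue, saturation] samples.
    hist, _, _ = np.histogram2d(hand_pixels[:, 0], hand_pixels[:, 1],
                                bins=bins, range=[[0, 180], [0, 256]])
    return hist / hist.max()

def back_project(hsv_image, hist, bins=16):
    # Look up each pixel's histogram value: high values mean the pixel's
    # colour matches the sampled hand, low values mean background.
    h_idx = np.clip((hsv_image[..., 0] / 180 * bins).astype(int), 0, bins - 1)
    s_idx = np.clip((hsv_image[..., 1] / 256 * bins).astype(int), 0, bins - 1)
    return hist[h_idx, s_idx]
```

Thresholding the back-projected probability map yields a binary hand mask that can be fed to the gesture classifier.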
A convolutional neural network is used to train the model and obtain predictions on a live video feed. The model is built and trained with Python's Keras library. More details can be found in the model details folder.
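For reference, a small Keras CNN for classifying gesture images could look like the sketch below. The layer sizes, input shape, and class count are illustrative assumptions, not the exact architecture in `cnn_keras.py`:

```python
from tensorflow.keras import layers, models

def build_gesture_cnn(input_shape=(50, 50, 1), num_classes=44):
    # Small convolutional classifier for grayscale gesture crops.
    # Shapes and layer widths are assumptions for illustration only.
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.4),                      # regularise the dense layer
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

At prediction time, each segmented hand frame from the video feed is resized to the model's input shape and passed through the network to get a per-gesture probability distribution.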