Surya-Hariharan/Hand-Gesture-Driven-Speech-Aid-for-Mute-Individuals
An assistive technology system that enables mute individuals to communicate through hand gestures. Flex sensors wired to an Arduino capture finger movements, a k-nearest-neighbors (KNN) classifier performs real-time gesture recognition, and text-to-speech produces the voice output. Supports 11 predefined phrases with an expandable gesture library.
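The classification step described above can be sketched with a minimal pure-Python KNN. This is not the repository's actual code: the sensor values, channel count, and phrase labels below are hypothetical stand-ins for the project's calibrated readings and 11-phrase vocabulary.

```python
import math
from collections import Counter

# Hypothetical 5-channel flex-sensor readings (raw ADC-style units);
# the real project uses its own calibration and phrase set.
TRAIN = [
    ((820, 810, 800, 790, 815), "hello"),
    ((830, 805, 795, 800, 820), "hello"),
    ((400, 420, 810, 805, 790), "i need water"),
    ((410, 415, 820, 800, 795), "i need water"),
    ((390, 400, 410, 395, 405), "thank you"),
    ((395, 405, 400, 390, 410), "thank you"),
]

def classify(sample, k=3):
    """Return the majority label among the k nearest training samples,
    using Euclidean distance over the flex-sensor channels."""
    dists = sorted(
        (math.dist(sample, features), label) for features, label in TRAIN
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

print(classify((825, 808, 798, 795, 818)))  # falls in the "hello" cluster
```

The predicted label would then be handed to a text-to-speech engine to speak the phrase.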
Stars: 1
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Mar 22, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Surya-Hariharan/Hand-Gesture-Driven-Speech-Aid-for-Mute-Individuals"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
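The same endpoint can be called from any HTTP client. A minimal Python sketch using the standard library (the function names are illustrative, and the JSON response schema is not documented here, so the record is returned as-is):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def build_quality_url(owner: str, repo: str) -> str:
    """Construct the per-repository quality-endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the quality record; anonymous access allows 100 requests/day."""
    with urllib.request.urlopen(build_quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(build_quality_url("Surya-Hariharan",
                        "Hand-Gesture-Driven-Speech-Aid-for-Mute-Individuals"))
```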
Higher-rated alternatives
harshbg/Sign-Language-Interpreter-using-Deep-Learning
A sign language interpreter using live video feed from the camera.
AvishakeAdhikary/Realtime-Sign-Language-Detection-Using-LSTM-Model
Realtime Sign Language Detection: Deep learning model for accurate, real-time recognition of...
beingaryan/Sign-To-Speech-Conversion
Sign Language Detection system based on computer vision and deep learning using OpenCV and...
Arshad221b/Sign-Language-Recognition
Indian Sign language Recognition using OpenCV
Tachionstrahl/SignLanguageRecognition
Real-time Recognition of german sign language (DGS) with MediaPipe