Comparing Sign-Language-Interpreter-using-Deep-Learning and Realtime-Sign-Language-Detection-Using-LSTM-Model

Sign-Language-Interpreter-using-Deep-Learning
  Maintenance: 2/25 | Adoption: 10/25 | Maturity: 16/25 | Community: 25/25
  Stars: 740 | Forks: 251 | Downloads: | Commits (30d): 0
  Language: Python | License: MIT
  Stale 6m | No Package | No Dependents

Realtime-Sign-Language-Detection-Using-LSTM-Model
  Maintenance: 6/25 | Adoption: 9/25 | Maturity: 16/25 | Community: 20/25
  Stars: 78 | Forks: 24 | Downloads: | Commits (30d): 0
  Language: Jupyter Notebook | License: MIT
  No Package | No Dependents

About Sign-Language-Interpreter-using-Deep-Learning

harshbg/Sign-Language-Interpreter-using-Deep-Learning

A sign language interpreter using live video feed from the camera.

This project translates American Sign Language (ASL) gestures into text in real time. It takes a live video feed from a camera, identifies the hand signs, and outputs the corresponding letters or words, giving deaf users a personal, always-available translator for daily communication without the need for a human interpreter.

assistive-technology accessibility deaf-community sign-language daily-communication
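The pipeline implied above (camera frame → hand segmentation → deep-learning classifier → letter) can be sketched in miniature. This is a hedged illustration only: the thresholds, the 64x64 frame, the linear-plus-softmax classifier standing in for the repository's actual network, and the `skin_mask`/`classify` helper names are all assumptions for demonstration, not code from the project.

```python
import numpy as np

rng = np.random.default_rng(0)

def skin_mask(frame_hsv, h_lo=0, h_hi=20, s_lo=40, s_hi=255):
    """Crude hue/saturation threshold standing in for hand segmentation.
    The threshold values here are illustrative, not the project's."""
    h, s = frame_hsv[..., 0], frame_hsv[..., 1]
    return ((h >= h_lo) & (h <= h_hi) & (s >= s_lo) & (s <= s_hi)).astype(np.float32)

def classify(mask, weights, bias):
    """Linear classifier + softmax over the flattened mask
    (a stand-in for the project's trained deep network)."""
    logits = mask.reshape(-1) @ weights + bias
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# A synthetic 64x64 HSV frame and random parameters for 26 letter classes.
frame = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float32)
mask = skin_mask(frame)
W = rng.normal(scale=0.01, size=(64 * 64, 26))
b = np.zeros(26)

probs = classify(mask, W, b)                 # per-letter probabilities
letter = chr(ord('A') + int(probs.argmax())) # predicted letter for this frame
```

In a real deployment this loop would run per video frame, with the segmented hand fed to the trained model instead of a random linear layer.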

About Realtime-Sign-Language-Detection-Using-LSTM-Model

AvishakeAdhikary/Realtime-Sign-Language-Detection-Using-LSTM-Model

Realtime Sign Language Detection: Deep learning model for accurate, real-time recognition of sign language gestures using Python and TensorFlow.

This project bridges communication gaps by interpreting sign language gestures as they are performed. Gestures made in front of a camera are translated in real time. It is designed both for individuals with hearing impairments and for those who communicate with them, such as educators or support staff, making interaction more natural.

assistive-technology communication-accessibility sign-language-interpretation deaf-community-support real-time-translation
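The LSTM approach named in the description works on sequences of frames rather than single images. As a hedged sketch of the underlying mechanics, the following implements one LSTM cell step in plain numpy and runs it over a fake gesture sequence; the 63-dimensional keypoint vectors (e.g. 21 hand landmarks x 3 coordinates), the hidden size of 32, and the `lstm_step` helper are all assumptions for illustration, not the repository's TensorFlow model.

```python
import numpy as np

rng = np.random.default_rng(1)

def lstm_step(x, h, c, Wx, Wh, b):
    """One LSTM cell step: input, forget, candidate, and output gates
    are computed jointly from the current input x and previous hidden h."""
    z = x @ Wx + h @ Wh + b                   # all four gates at once
    i, f, g, o = np.split(z, 4)
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    c_new = sig(f) * c + sig(i) * np.tanh(g)  # gated cell-state update
    h_new = sig(o) * np.tanh(c_new)           # new hidden state
    return h_new, c_new

# A fake gesture: 30 frames, each a 63-dim keypoint vector
# (e.g. 21 hand landmarks x 3 coordinates per frame).
seq = rng.normal(size=(30, 63))
hidden = 32
Wx = rng.normal(scale=0.1, size=(63, 4 * hidden))
Wh = rng.normal(scale=0.1, size=(hidden, 4 * hidden))
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for x in seq:                                 # run the cell over the sequence
    h, c = lstm_step(x, h, c, Wx, Wh, b)
# h now summarizes the whole gesture; a dense + softmax layer
# would map it to a sign-language label in the full model.
```

The key design point is that the hidden state carries context across frames, which is what lets an LSTM distinguish gestures that differ only in motion, not in any single pose.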

Scores updated daily from GitHub, PyPI, and npm data.