Tanwar-12/Indian_Sign_Language_Detection-Yolov5

Building an Indian Sign Language detection model with YOLOv5 involves training it to recognize and locate different signs in images or videos.

Quality score: 13 / 100 (Experimental)

This project helps interpret Indian Sign Language (ISL) gestures from images or videos, translating them into recognized letters and numbers. It takes visual input of someone signing and identifies the specific signs being made. This tool is designed for assistive technology developers, educators working with deaf or hard-of-hearing individuals, or researchers in accessibility.

No commits in the last 6 months.

Use this if you need to automatically recognize and translate individual Indian Sign Language letters and numbers from visual data.

Not ideal if you need to interpret full ISL sentences or complex grammatical structures beyond individual signs.
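A detector like this typically returns class indices with bounding boxes, which then need mapping to human-readable signs. The sketch below shows one way to translate raw YOLOv5-style detections into ISL letters and numbers. The class list (digits 1-9 plus letters A-Z) and the detection tuple format are illustrative assumptions, not the repository's actual output schema.

```python
# Hypothetical mapping from YOLOv5 class indices to ISL signs.
# Assumes classes 0-8 are digits 1-9 and classes 9-34 are letters A-Z;
# the real model's class order may differ.
import string

ISL_CLASSES = [str(d) for d in range(1, 10)] + list(string.ascii_uppercase)

def decode_detections(detections, conf_threshold=0.5):
    """Convert (class_id, confidence, bbox) tuples into labeled signs.

    `detections` mimics a per-image YOLOv5 result: each entry is
    (class_id, confidence, (x1, y1, x2, y2)).
    """
    labeled = []
    for class_id, conf, bbox in detections:
        if conf < conf_threshold:
            continue  # drop low-confidence detections
        labeled.append({"sign": ISL_CLASSES[class_id],
                        "confidence": conf,
                        "bbox": bbox})
    return labeled

# Example: two confident detections and one filtered by the threshold.
results = decode_detections([
    (0, 0.91, (10, 10, 80, 80)),   # digit "1"
    (9, 0.88, (90, 10, 160, 80)),  # letter "A"
    (5, 0.30, (0, 0, 5, 5)),       # below threshold, dropped
])
print([r["sign"] for r in results])  # prints ['1', 'A']
```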

Tags: assistive technology, sign language interpretation, accessibility, education for deaf, computer vision applications
Badges: No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 0 / 25


Stars: 13
Forks: —
Language: Jupyter Notebook
License: —
Last pushed: Aug 15, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/Tanwar-12/Indian_Sign_Language_Detection-Yolov5"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
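The same endpoint can be queried from Python with only the standard library. This is a minimal sketch: it builds the per-repository URL from the pattern shown in the curl command above, and the response schema (and thus any field you might read from it) is an assumption, since it isn't documented here.

```python
# Minimal client for the quality API, using only the standard library.
# The JSON response schema is undocumented here, so callers should
# inspect the returned dict rather than assume field names.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode a quality report (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("computer-vision", "Tanwar-12",
                  "Indian_Sign_Language_Detection-Yolov5")
print(url)
```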