Nyx1311/Toxicity-Detector-using-BiLSTM
What we built: An AI-powered Women's Safety & Well-Being Detector, a web app that flags multiple forms of online abuse in real time and offers tools for emotional recovery.
Under the hood:
- BiLSTM + Word2Vec embeddings for deep, context-aware detection
- Trained on 21K+ labeled comments across 7 toxicity categories
- Built with Python, Tensor…
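As a rough illustration of the architecture described above, here is a minimal Keras sketch of a BiLSTM multi-label classifier with a 7-way sigmoid head. The vocabulary size, sequence length, embedding dimension, and LSTM width are all assumptions, not values taken from the repository; in the real project the embedding layer would be initialized from pretrained Word2Vec vectors.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000  # assumed vocabulary size
MAX_LEN = 150       # assumed padded comment length
EMBED_DIM = 100     # assumed Word2Vec embedding dimension
NUM_LABELS = 7      # the 7 toxicity categories from the description

model = models.Sequential([
    # In the actual project this layer would load pretrained Word2Vec weights.
    layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    # Bidirectional LSTM reads the comment left-to-right and right-to-left.
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dense(64, activation="relu"),
    # Sigmoid (not softmax): each toxicity category is scored independently.
    layers.Dense(NUM_LABELS, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# One dummy padded comment -> 7 independent toxicity probabilities.
probs = model.predict(np.zeros((1, MAX_LEN), dtype="int32"), verbose=0)
print(probs.shape)
```

Because the output layer is sigmoid with binary cross-entropy, a single comment can be flagged for several categories at once (e.g. both "insult" and "threat"), which is the standard setup for multi-label toxicity data.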
No commits in the last 6 months.
Stars: —
Forks: —
Language: Jupyter Notebook
License: MIT
Category: —
Last pushed: Sep 01, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/Nyx1311/Toxicity-Detector-using-BiLSTM"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
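The same endpoint can be called from Python using only the standard library. This is a sketch built from the curl example above; the `quality_url` helper is hypothetical, and the shape of the JSON response is not documented here, so the live call is left commented out.

```python
import json
import urllib.request

# Base path taken from the curl example shown above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for an owner/repo pair (hypothetical helper)."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("Nyx1311", "Toxicity-Detector-using-BiLSTM")
print(url)

# Live request (uncomment to run; response field names are undocumented here):
# with urllib.request.urlopen(url, timeout=10) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```

Staying under the unauthenticated limit of 100 requests/day simply means spacing out calls; with a free key the same code works at 1,000/day.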
Higher-rated alternatives
unitaryai/detoxify
Trained models & code to predict toxic comments on all 3 Jigsaw Toxic Comment Challenges. Built...
kensk8er/chicksexer
A Python package for gender classification.
Infinitode/ValX
ValX is an open-source Python package for text cleaning tasks, including profanity detection and...
PavelOstyakov/toxic
Toxic Comment Classification Challenge
minerva-ml/open-solution-toxic-comments
Open solution to the Toxic Comment Classification Challenge