Toxic Comment Detection LLM Tools
One toxic comment detection tool is tracked, and it scores above 70 (verified tier). The highest-rated is glincker/glin-profanity at 73/100, with 44 stars and 39,119 monthly downloads.
Get the project as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=llm-tools&subcategory=toxic-comment-detection&limit=20"
```

The API is open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000 requests/day.
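A minimal Python sketch for consuming the endpoint above, using only the standard library. The field names (`name`, `score`) and the assumption that the endpoint returns a JSON list of tool records are guesses about the response shape, not confirmed by the API:

```python
import json
from urllib.request import urlopen

# Endpoint from the listing above; domain, subcategory, and limit are query parameters.
API_URL = ("https://pt-edge.onrender.com/api/v1/datasets/quality"
           "?domain=llm-tools&subcategory=toxic-comment-detection&limit=20")

def fetch_tools(url=API_URL):
    """Fetch the dataset; assumes the endpoint returns a JSON list of records."""
    with urlopen(url) as resp:
        return json.load(resp)

def verified_tools(tools, threshold=70):
    """Keep records scoring at or above the verified-tier cutoff (70)."""
    return [t for t in tools if t.get("score", 0) >= threshold]
```

Under these assumptions, `verified_tools(fetch_tools())` would keep glincker/glin-profanity (score 73) and drop anything below the verified tier.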
| # | Tool | Score | Tier |
|---|---|---|---|
| 1 | glincker/glin-profanity: Open-source ML-powered profanity filter with TensorFlow.js toxicity... | 73/100 | Verified |