davidsbatista/NER-Evaluation
An implementation of full named-entity evaluation metrics based on SemEval'13 Task 9: evaluation is not at the tag/token level, but considers all the tokens that are part of the named entity.
222 stars. No commits in the last 6 months.
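To make the distinction in the description concrete, here is a minimal illustrative sketch (not this repository's API; entity spans and data are made up) of how full entity-level matching differs from token-level matching:

```python
# Entities are (start_token, end_token, type) spans; values are illustrative.
gold = [(0, 2, "PER"), (5, 6, "LOC")]
pred = [(0, 1, "PER"), (5, 6, "LOC")]  # partial overlap on the first entity

def exact_entity_matches(gold, pred):
    """Strict scheme: an entity counts only if boundaries and type both match."""
    return len(set(gold) & set(pred))

def token_matches(gold, pred):
    """Token-level scheme: count tokens whose (position, type) tag agrees."""
    def to_tokens(ents):
        return {(i, t) for (s, e, t) in ents for i in range(s, e + 1)}
    return len(to_tokens(gold) & to_tokens(pred))

print(exact_entity_matches(gold, pred))  # 1: only the LOC span matches exactly
print(token_matches(gold, pred))         # 4: partial credit for the PER overlap
```

Token-level scoring rewards the partially overlapping PER prediction, while the strict entity-level scheme counts it as a miss; the SemEval'13 schemes cover both views (plus partial and type-only matching).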
Stars: 222
Forks: 48
Language: Python
License: MIT
Category:
Last pushed: Jul 02, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/davidsbatista/NER-Evaluation"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
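The same request can be made programmatically. A minimal Python sketch, assuming the endpoint returns JSON (the response field names are not documented here, so the payload is left as an opaque dict):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository (no key needed up to 100 requests/day)."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload for one repository."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Network call shown commented out so the sketch stays runnable offline:
# data = fetch_quality("nlp", "davidsbatista", "NER-Evaluation")
url = quality_url("nlp", "davidsbatista", "NER-Evaluation")
```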
Higher-rated alternatives
- MantisAI/nervaluate: Full named-entity (i.e., not tag/token) evaluation metrics based on SemEval'13
- dice-group/gerbil: GERBIL - General Entity annotatoR Benchmark
- syuoni/eznlp: Easy Natural Language Processing
- OpenJarbas/simple_NER: simple rule based named entity recognition
- bltlab/seqscore: SeqScore: Scoring for named entity recognition and other sequence labeling tasks