mandarjoshi90/coref: BERT for Coreference Resolution

Quality score: 43 / 100 (Emerging)

Extends the end-to-end coreference architecture with BERT and SpanBERT encoders, leveraging subword tokenization and span representations for improved mention detection and linking. Provides pretrained models on OntoNotes (up to 79.6 F1 with SpanBERT-large) alongside training infrastructure with configurable segment length and dual learning rates for encoder vs. task-specific parameters. Supports batched inference via JSONL input with speaker and genre metadata, compatible with both TensorFlow and PyTorch checkpoint formats.
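The batched JSONL inference input mentioned above can be sketched roughly as follows. The field names ("doc_key", "sentences", "speakers", "clusters") follow the e2e-coref lineage this repo extends, and the genre-as-doc_key-prefix convention comes from OntoNotes; treat all of them as assumptions rather than this repo's exact schema.

```python
import json

# Hypothetical single-document input line. Field names follow the
# e2e-coref convention; the genre is conventionally encoded as the
# doc_key prefix ("nw" = newswire in OntoNotes).
doc = {
    "doc_key": "nw_example_01",
    "sentences": [["John", "said", "he", "was", "tired", "."]],
    "speakers": [["speaker_1"] * 6],  # one speaker label per token
    "clusters": [],  # empty for inference; gold spans for training
}
line = json.dumps(doc)  # one document per line in the JSONL file
```

Each line of the input file is one such JSON document; the speaker and genre metadata feed the model's metadata features.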

455 stars. No commits in the last 6 months.

Flags: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 9 / 25
Community: 24 / 25
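As a quick sanity check, the four 25-point dimensions above sum to the 43/100 overall score:

```python
# Each dimension is scored out of 25; the overall score is their sum.
scores = {"Maintenance": 0, "Adoption": 10, "Maturity": 9, "Community": 24}
total = sum(scores.values())  # 0 + 10 + 9 + 24 = 43
```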


Stars: 455
Forks: 94
Language: Python
License: Apache-2.0
Last pushed: Dec 08, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/mandarjoshi90/coref"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
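The same request can be built programmatically. This is a minimal sketch: the path shape is taken from the curl example above, but the response schema is not documented here, so parsing is left to the caller.

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Mirror the path from the curl example: /quality/<category>/<owner>/<repo>
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("nlp", "mandarjoshi90", "coref")
# Fetch with e.g. urllib.request.urlopen(url); the JSON shape of the
# response is an assumption and should be inspected before relying on it.
```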