facebookresearch/fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

Score: 53 / 100 (Established)

Built on PyTorch, fairseq implements diverse sequence modeling architectures—Transformers, CNNs, LSTMs, and non-autoregressive variants—with modular components for efficient distributed training via fully sharded data parallelism. Beyond text generation, it extends to speech processing (wav2vec, speech-to-speech translation) and multimodal tasks (VideoCLIP), using Hydra for reproducible configuration management and integrating with xFormers for optimized attention mechanisms.

32,190 stars. No commits in the last 6 months.

Stale (6m) · No Package · No Dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 25 / 25


Stars: 32,190
Forks: 6,676
Language: Python
License: MIT
Last pushed: Sep 30, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/facebookresearch/fairseq"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
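The same endpoint can be queried from Python using only the standard library. A minimal sketch, assuming the URL shape shown in the curl example above; the `build_url` helper and its `category`/`owner`/`repo` parameter names are illustrative, not part of the API, and the response schema is not documented here, so the result is printed as-is:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_url(category: str, owner: str, repo: str) -> str:
    # Mirrors the endpoint shown above: /quality/<category>/<owner>/<repo>
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    # Anonymous access is rate-limited to 100 requests/day (see note above);
    # a free API key raises that to 1,000/day.
    with urllib.request.urlopen(build_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("ml-frameworks", "facebookresearch", "fairseq")
    print(json.dumps(data, indent=2))
```

The network call is kept behind the `__main__` guard so the URL construction can be reused or tested without hitting the rate limit.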