zakir-codes/transformer-tokenization-experiments
Controlled experiments exploring how tokenization impacts transformer training efficiency, memory usage, and attention patterns.
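Illustrative sketch (not taken from this repo): the kind of comparison such experiments run is measuring how tokenizer choice changes sequence length, which in turn drives attention cost (quadratic in token count) and activation memory. This assumes the Hugging Face transformers library and two stock checkpoints.

# Compare token counts across tokenizers for the same input text.
# Longer sequences mean more attention FLOPs and more activation memory.
from transformers import AutoTokenizer  # assumes `transformers` is installed

text = "Tokenization choices change sequence length and training cost."

for name in ["bert-base-uncased", "gpt2"]:
    tok = AutoTokenizer.from_pretrained(name)
    ids = tok(text)["input_ids"]
    print(f"{name}: {len(ids)} tokens")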
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Mar 13, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/zakir-codes/transformer-tokenization-experiments"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000/day.
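A minimal Python sketch of the same request using the requests library. The endpoint URL comes from this page; the response schema and the API-key header name ("X-API-Key") are assumptions, not documented here.

import requests

URL = ("https://pt-edge.onrender.com/api/v1/quality/nlp/"
       "zakir-codes/transformer-tokenization-experiments")

# Add headers={"X-API-Key": "..."} if you have a key (header name is an assumption).
resp = requests.get(URL, timeout=10)
resp.raise_for_status()
print(resp.json())  # assumes the endpoint returns JSON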
Higher-rated alternatives
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
xv44586/toolkit4nlp
Transformer implementations (architecture, task examples, serving, and more).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...
graykode/xlnet-Pytorch
A simple XLNet implementation with a PyTorch wrapper.