OpenMachine-ai/transformer-tricks
A collection of tricks and tools to speed up transformer models
197 stars and 145 monthly downloads. Available on PyPI.
Stars: 197
Forks: 12
Language: TeX
License: MIT
Category:
Last pushed: Feb 23, 2026
Monthly downloads: 145
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/OpenMachine-ai/transformer-tricks"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
Related repositories
huggingface/text-generation-inference
Large Language Model Text Generation Inference
poloclub/transformer-explainer
Transformer Explained Visually: Learn How LLM Transformer Models Work with Interactive Visualization
IBM/TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
tensorgi/TPA
[NeurIPS 2025 Spotlight] TPA: Tensor ProducT ATTenTion Transformer (T6)...
lorenzorovida/FHE-BERT-Tiny
Source code for the paper "Transformer-based Language Models and Homomorphic Encryption: an...