TomasrRodrigues/TinyGPT
A research-grade PyTorch implementation of a decoder-only transformer from scratch, designed for experimentation on small-scale GPT models (0.5M–5M parameters). Includes full training loop, modular architecture, and tools for analyzing attention, embeddings, and scaling behavior.
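The core of a decoder-only transformer like TinyGPT is causal self-attention, where each token may attend only to itself and earlier positions. As a rough illustration (not TinyGPT's actual code, which is in PyTorch), a single head can be sketched in NumPy; all names here are hypothetical:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, d) token sequence.

    The upper-triangular mask hides future positions, which is what
    makes the architecture decoder-only.
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                  # (T, T) attention logits
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                         # block attention to the future
    scores -= scores.max(axis=-1, keepdims=True)   # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                             # (T, d) value mixture
```

Because of the mask, position 0 attends only to itself, so its output is exactly its own value projection; the project's analysis tools presumably inspect the intermediate `weights` matrix.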
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Mar 13, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/TomasrRodrigues/TinyGPT"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
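The curl call above can also be made from Python with the standard library. This is a sketch under assumptions: the page shows only the URL, so treating the response as JSON (and the helper names) is a guess:

```python
import json
import urllib.request

# Base endpoint taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"

def build_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL (hypothetical helper)."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality/embeddings record for a repository.

    Assumes the endpoint returns JSON; the response schema is not
    documented on this page.
    """
    with urllib.request.urlopen(build_url(owner, repo), timeout=10) as resp:
        return json.load(resp)
```

For example, `fetch_quality("TomasrRodrigues", "TinyGPT")` requests the same record as the curl command.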
Higher-rated alternatives
langformers/langformers
🚀 Unified NLP Pipelines for Language Models
nlpcloud/nlpcloud-js
NLP Cloud serves high performance pre-trained or custom models for NER, sentiment-analysis,...
Hellisotherpeople/CX_DB8
a contextual, biasable, word-or-sentence-or-paragraph extractive summarizer powered by the...
EQTPartners/TSDE
TSDE is a novel SSL framework for TSRL, the first of its kind, effectively harnessing a...
nlpcloud/nlpcloud-php
NLP Cloud serves high performance pre-trained or custom models for NER, sentiment-analysis,...