vixhal-baraiya/microgpt-c

The most atomic way to train and inference a GPT in pure, dependency-free C

Score: 51 / 100 (Established)

Implements a transformer architecture with multi-head attention and feedforward layers entirely in C, requiring only the standard math library. Supports both training from scratch on raw text and inference sampling, with compilation flags leveraging CPU-native vectorization (AVX) and fast-math optimizations for performance. No external ML frameworks or dependencies: the entire forward/backward pass and tokenization pipeline are self-contained.


No package published; no dependents.

Score breakdown:
- Maintenance: 10 / 25
- Adoption: 10 / 25
- Maturity: 11 / 25
- Community: 20 / 25


Stars: 234
Forks: 44
Language: C
License: MIT
Last pushed: Feb 15, 2026
Commits (30d): 0

Get this data via the API:

    curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/vixhal-baraiya/microgpt-c"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.