microgpt-c and microjpt
About microgpt-c
vixhal-baraiya/microgpt-c
The most atomic way to train and run inference for a GPT in pure, dependency-free C
This project offers a minimal, from-scratch way to train and use a GPT model: it takes text data for training and generates new text samples based on what it has learned. It is aimed at researchers, hobbyists, and educators interested in the absolute basics of how a GPT works from the ground up.
About microjpt
ssrhaso/microjpt
The most atomic way to train and run inference for a GPT in 100 lines of pure, dependency-free Julia.
This project offers a compact way to train and use a simple language model for generating text from scratch: you provide a text dataset, and the model learns its patterns to produce new, similar text. It is aimed at developers interested in the underlying mechanics of language models, especially those working in Julia.