BobMcDear/trap
Autoregressive transformers in APL
Implements the complete GPT-2 architecture in APL, with backpropagation and Adam-optimizer training, and achieves numerical parity with PyTorch. It leverages APL's native multi-dimensional arrays and data parallelism for a self-contained, mathematically transparent implementation that can be compiled for the CPU or GPU via Co-dfns. The repository provides both a full training variant (the `TRANSFORMER` namespace) and an inference-only variant (the `INF` module), supporting character-level autoregressive generation from minimal code.
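For readers unfamiliar with the generation mode the description refers to, here is a minimal character-level autoregressive sampling loop sketched in Python/NumPy. It is not the repository's APL code; the dummy_model stand-in and all names below are illustrative assumptions. The only point is the pattern: each new token is sampled from the model's next-token logits and appended to the context before the next step.

import numpy as np

def sample_next(logits, temperature=1.0, rng=None):
    # Softmax with temperature, then sample one token id.
    rng = rng or np.random.default_rng()
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

def generate(model, prompt_ids, n_new, temperature=1.0):
    # Feed the growing context back into the model one token at a time.
    ids = list(prompt_ids)
    for _ in range(n_new):
        logits = model(np.array(ids))  # logits over the vocabulary for the next token
        ids.append(sample_next(logits, temperature))
    return ids

# Stand-in "model" (hypothetical): random logits over a 16-token toy vocabulary.
rng = np.random.default_rng(0)
dummy_model = lambda ids: rng.normal(size=16)
print(generate(dummy_model, [0, 1, 2], n_new=10))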
107 stars. No commits in the last 6 months.
Stars: 107
Forks: 6
Language: APL
License: MIT
Category:
Last pushed: Sep 03, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BobMcDear/trap"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
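A minimal sketch of the same request from Python, using only the standard library. The endpoint is the one shown above; the response schema isn't documented here, so the script simply pretty-prints whatever JSON the service returns.

import json
import urllib.request

url = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BobMcDear/trap"
# Fetch the quality data for the repository and pretty-print the JSON payload.
with urllib.request.urlopen(url, timeout=10) as resp:
    data = json.load(resp)
print(json.dumps(data, indent=2))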
Higher-rated alternatives
LowinLi/transformers-stream-generator
This is a text generation method which returns a generator, streaming out each token in...
ystemsrx/mini-nanoGPT
One-click training of your own GPT. Training a GPT has never been easier for beginners. /...
jaymody/picoGPT
An unnecessarily tiny implementation of GPT-2 in NumPy.
kamalkraj/minGPT-TF
A minimal TF2 re-implementation of the OpenAI GPT training
Eamon2009/Codeformer-A.I
A character-level GPT transformer built from scratch in PyTorch, trained on Linux kernel C...