BobMcDear/trap

Autoregressive transformers in APL

Score: 36/100 (Emerging)

Implements the complete GPT-2 architecture, including backpropagation and Adam-optimizer training, in APL, achieving numerical parity with PyTorch. It leverages APL's native multi-dimensional array support and data parallelism for a self-contained, mathematically transparent implementation that can be compiled for CPU or GPU via Co-dfns. The project provides both a full training variant (the `TRANSFORMER` namespace) and an inference-only variant (the `INF` module), supporting character-level autoregressive generation from minimal code.
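
For intuition, here is a minimal, illustrative Python sketch of the character-level autoregressive loop the description refers to. It is not trap's APL code: `next_char_logits` is a hypothetical stand-in for a trained model's forward pass, and the loop only demonstrates the technique (stabilized softmax over next-character logits, sample, append, repeat).

```python
import math
import random

VOCAB = "abcdefgh "

def next_char_logits(context: str, vocab: str) -> list[float]:
    # Hypothetical stand-in for a trained model's forward pass: it merely
    # favors repeating the last character so the loop below has a
    # non-uniform distribution to sample from.
    last = context[-1] if context else vocab[0]
    return [2.0 if c == last else 0.0 for c in vocab]

def generate(prompt: str, steps: int, vocab: str = VOCAB) -> str:
    out = prompt
    for _ in range(steps):
        logits = next_char_logits(out, vocab)
        # Numerically stable softmax: subtract the max before exponentiating.
        m = max(logits)
        exps = [math.exp(x - m) for x in logits]
        probs = [e / sum(exps) for e in exps]
        # Sample the next character and append it to the running context.
        out += random.choices(vocab, weights=probs, k=1)[0]
    return out

print(generate("abc", 10))
```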

107 stars; no commits in the last 6 months. The repository is flagged as stale (6 months), is not published as a package, and has no known dependents.

Maintenance: 2/25
Adoption: 9/25
Maturity: 16/25
Community: 9/25

The four component scores, each out of 25, sum to the overall 36/100.

Stars: 107
Forks: 6
Language: APL
License: MIT
Last pushed: Sep 03, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BobMcDear/trap"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
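
For programmatic access, a sketch of the same request in Python, assuming the endpoint returns JSON as the curl example implies; the response schema is not documented here, so the script just pretty-prints whatever comes back.

```python
import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BobMcDear/trap"

# Fetch and decode the JSON payload; no API key is needed at the free tier.
with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))
```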