miguelvanegas-c/Embeddings
This repository contains experiments and explanations based on the book Build a Large Language Model from Scratch. It explores key steps such as tokenization, data sampling with sliding windows, and token embeddings, highlighting their importance for Large Language Models and agentic systems.
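The sliding-window sampling the description mentions can be sketched roughly as follows. This is a hypothetical minimal example (the token IDs and window parameters are made up, not taken from the repository): each input window of `context_size` tokens is paired with the same window shifted one position right, so the model learns next-token prediction.

```python
# Hypothetical sketch of sliding-window input-target sampling,
# in the style described in "Build a Large Language Model from Scratch".
token_ids = [290, 4920, 2241, 287, 257, 4489, 64, 1807]  # example token IDs

context_size = 4  # number of tokens the model sees at once
stride = 1        # step between consecutive windows

inputs, targets = [], []
for i in range(0, len(token_ids) - context_size, stride):
    inputs.append(token_ids[i:i + context_size])
    # targets are the same window shifted one token to the right
    targets.append(token_ids[i + 1:i + context_size + 1])

print(inputs[0])   # first input window
print(targets[0])  # its next-token targets
```

Each resulting ID would then be mapped to a dense vector via an embedding lookup (e.g. `torch.nn.Embedding` in a PyTorch-based implementation) before being fed to the model.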
Language: Jupyter Notebook
Last pushed: Feb 21, 2026
Commits (30d): 0
Higher-rated alternatives
- cosmosgl/graph: GPU-accelerated force graph layout and rendering
- Clay-foundation/model: The Clay Foundation Model, an open source AI model and interface for Earth
- nomic-ai/nomic: Nomic Developer API SDK
- alexshtf/torchcurves: Parametric differentiable curves with PyTorch for continuous embeddings, shape-restricted models, or KANs
- omoindrot/tensorflow-triplet-loss: Implementation of triplet loss in TensorFlow