transformerlab/transformerlab-app
The open source research environment for AI researchers to seamlessly train, evaluate, and scale models from local hardware to GPU clusters.
Supports foundation model inference (Llama, DeepSeek, Mistral) across multiple engines (MLX, vLLM, Ollama), training methods including LoRA/QLoRA and RLHF variants, and diffusion-based image generation. Unifies local single-machine workflows with cluster orchestration via Slurm and SkyPilot, featuring automatic hyperparameter sweeps, LLM-as-a-judge evaluation, and a Python SDK for integrating existing training scripts with automatic logging and artifact tracking.
4,820 stars. Actively maintained with 1,449 commits in the last 30 days.
Stars: 4,820
Forks: 501
Language: Python
License: AGPL-3.0
Category:
Last pushed: Mar 12, 2026
Commits (30d): 1,449
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/transformerlab/transformerlab-app"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related models
naru-project/naru
Neural Relation Understanding: neural cardinality estimators for tabular data
neurocard/neurocard
State-of-the-art neural cardinality estimators for join queries
upb-lea/mag-net-hub
MagNet Toolkit - Certified Models of the MagNet Challenge
danielzuegner/code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from...
salcc/QuantumTransformers
Quantum Transformers for High Energy Physics Analysis at the Large Hadron Collider