transformerlab-app and DifferentialTransformer
These two tools are ecosystem siblings within the "power-transformer-design" category. TransformerLab provides a broad research environment for AI model development that supports many transformer architectures, while DifferentialTransformer implements one specific transformer model and could use an environment like TransformerLab for its training, evaluation, or scaling.
About transformerlab-app
transformerlab/transformerlab-app
The open source research environment for AI researchers to seamlessly train, evaluate, and scale models from local hardware to GPU clusters.
Supports foundation model inference (Llama, DeepSeek, Mistral) across multiple engines (MLX, vLLM, Ollama), training methods including LoRA/QLoRA and RLHF variants, and diffusion-based image generation. Unifies local single-machine workflows with cluster orchestration via Slurm and SkyPilot, featuring automatic hyperparameter sweeps, LLM-as-a-judge evaluation, and a Python SDK for integrating existing training scripts with automatic logging and artifact tracking.
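Among the training methods listed above is LoRA, which fine-tunes a model by learning a low-rank update to each frozen weight matrix. The sketch below illustrates the technique itself, not TransformerLab's SDK; the function name, shapes, and the `alpha` scaling are illustrative assumptions.

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16):
    """LoRA forward pass: frozen weight W plus a low-rank learned update B @ A.

    x: (batch, d_in), W: (d_out, d_in) frozen,
    A: (r, d_in) and B: (d_out, r) are the small trainable matrices.
    (Illustrative sketch, not TransformerLab's actual API.)
    """
    r = A.shape[0]
    # The update B @ A has rank at most r, so only r * (d_in + d_out)
    # parameters are trained instead of d_in * d_out.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)
```

Initializing B to zeros (as LoRA prescribes) makes the adapted model start out identical to the base model, so fine-tuning begins from the pretrained behavior.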
About DifferentialTransformer
kyegomez/DifferentialTransformer
An open source community implementation of the model from the "Differential Transformer" paper by Microsoft.
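The paper's core idea is to compute attention as the difference of two softmax attention maps, which cancels common-mode attention noise. Below is a minimal single-head numpy sketch of that mechanism as described in the paper, not code taken from the kyegomez/DifferentialTransformer repository; `lam` is a fixed scalar here, whereas the paper learns it per layer.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def differential_attention(x, Wq, Wk, Wv, lam=0.8):
    """Single-head differential attention (illustrative sketch).

    Queries and keys are projected to twice the head dimension and split
    into two halves; the second attention map is subtracted from the first.
    """
    d = Wq.shape[1] // 2
    q1, q2 = np.split(x @ Wq, 2, axis=-1)
    k1, k2 = np.split(x @ Wk, 2, axis=-1)
    v = x @ Wv
    a1 = softmax(q1 @ k1.T / np.sqrt(d))
    a2 = softmax(q2 @ k2.T / np.sqrt(d))
    # Subtracting a second, independently projected attention map
    # suppresses attention scores that both maps assign to irrelevant tokens.
    return (a1 - lam * a2) @ v
```

With `lam=0` this reduces to standard scaled dot-product attention over the first projection, which makes the differential term easy to ablate.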