transformerlab-app and DifferentialTransformer

These two tools are ecosystem siblings within the "power-transformer-design" category: TransformerLab provides a broad research environment for training, evaluating, and scaling AI models, including many transformer architectures, while DifferentialTransformer implements one specific architecture and could itself be trained, evaluated, or scaled inside an environment like TransformerLab.

transformerlab-app
Scores: Maintenance 25/25 · Adoption 10/25 · Maturity 16/25 · Community 20/25
Stars: 4,820 · Forks: 501 · Commits (30d): 1,449
Language: Python · License: AGPL-3.0
No package published · No dependents

DifferentialTransformer
Scores: Maintenance 13/25 · Adoption 7/25 · Maturity 16/25 · Community 0/25
Stars: 39 · Commits (30d): 0
Language: Python · License: MIT
No package published · No dependents

About transformerlab-app

transformerlab/transformerlab-app

The open source research environment for AI researchers to seamlessly train, evaluate, and scale models from local hardware to GPU clusters.

Supports foundation model inference (Llama, DeepSeek, Mistral) across multiple engines (MLX, vLLM, Ollama), training methods including LoRA/QLoRA and RLHF variants, and diffusion-based image generation. Unifies local single-machine workflows with cluster orchestration via Slurm and SkyPilot, featuring automatic hyperparameter sweeps, LLM-as-a-judge evaluation, and a Python SDK for integrating existing training scripts with automatic logging and artifact tracking.
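TransformerLab's own SDK calls are not reproduced here, but the LoRA technique listed above can be illustrated with a generic Hugging Face PEFT snippet. This is a minimal sketch of the general approach, not TransformerLab's API; the base model, rank, and target modules below are illustrative assumptions.

```python
# Generic LoRA fine-tuning setup with Hugging Face PEFT; illustrative only,
# not TransformerLab's SDK. Model choice and hyperparameters are assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # small stand-in model

lora_cfg = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # only the small adapter weights are trainable
```

The same pattern (freeze the base model, train low-rank adapters) is what makes LoRA/QLoRA practical on the local hardware that TransformerLab targets, since only a small fraction of parameters needs gradients and optimizer state.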

About DifferentialTransformer

kyegomez/DifferentialTransformer

An open source community implementation of the model from Microsoft's "Differential Transformer" paper.
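The paper's core idea is differential attention: attention weights are computed as the difference of two softmax attention maps, which cancels common-mode "attention noise". Below is a minimal single-head PyTorch sketch of that idea, not the repository's actual code; it assumes a single learnable scalar lambda, whereas the paper reparameterizes lambda from learnable vectors and adds multi-head structure with per-head normalization.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimplifiedDiffAttention(nn.Module):
    """Sketch of differential attention:
    (softmax(Q1 K1^T / sqrt(d)) - lambda * softmax(Q2 K2^T / sqrt(d))) V
    """
    def __init__(self, dim: int, head_dim: int, lambda_init: float = 0.8):
        super().__init__()
        # Two independent query/key projections; one shared value projection.
        self.q_proj = nn.Linear(dim, 2 * head_dim, bias=False)
        self.k_proj = nn.Linear(dim, 2 * head_dim, bias=False)
        self.v_proj = nn.Linear(dim, head_dim, bias=False)
        self.out_proj = nn.Linear(head_dim, dim, bias=False)
        self.head_dim = head_dim
        # Simplification: a single scalar lambda instead of the paper's
        # reparameterized, per-layer lambda.
        self.lmbda = nn.Parameter(torch.tensor(lambda_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q1, q2 = self.q_proj(x).chunk(2, dim=-1)
        k1, k2 = self.k_proj(x).chunk(2, dim=-1)
        v = self.v_proj(x)
        scale = 1.0 / math.sqrt(self.head_dim)
        a1 = F.softmax(q1 @ k1.transpose(-2, -1) * scale, dim=-1)
        a2 = F.softmax(q2 @ k2.transpose(-2, -1) * scale, dim=-1)
        # Subtracting the second map suppresses attention given to
        # irrelevant context shared by both maps.
        return self.out_proj((a1 - self.lmbda * a2) @ v)

x = torch.randn(2, 16, 64)                    # (batch, sequence, model dim)
attn = SimplifiedDiffAttention(dim=64, head_dim=32)
print(attn(x).shape)                          # torch.Size([2, 16, 64])
```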

Scores are updated daily from GitHub, PyPI, and npm data.