lightning-hydra-template and pytorch-lightning-template

These two tools are competitors: both provide project templates for PyTorch Lightning. ashleve/lightning-hydra-template adds Hydra integration for configuration management, while miracleyoo/pytorch-lightning-template focuses on a simpler adaptation of existing PyTorch code.

                 lightning-hydra-template            pytorch-lightning-template
Maintenance      0/25                                0/25
Adoption         10/25                               10/25
Maturity         8/25                                16/25
Community        22/25                               21/25
Stars            5,187                               1,541
Forks            757                                 193
Downloads
Commits (30d)    0                                   0
Language         Python                              Jupyter Notebook
License                                              Apache-2.0
Flags            No License, Stale 6m,               Stale 6m, No Package,
                 No Package, No Dependents           No Dependents

About lightning-hydra-template

ashleve/lightning-hydra-template

PyTorch Lightning + Hydra. A very user-friendly template for ML experimentation. ⚡🔥⚡

Combines Hydra's hierarchical config composition with PyTorch Lightning's training abstractions to enable rapid experimentation through command-line overrides and config-driven instantiation. Includes built-in support for multiple experiment tracking backends (W&B, MLFlow, Neptune, Comet), hyperparameter search via Hydra plugins like Optuna, and automated logging/checkpointing with dynamically-generated folder structures. Provides a structured project layout with pre-commit hooks, CI/CD workflows, and generic test utilities to accelerate ML prototyping on prepared datasets.
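The two mechanisms described above, config-driven instantiation (Hydra's `_target_` convention) and command-line overrides (e.g. `python train.py optimizer.lr=0.01`), can be illustrated with a minimal stand-in sketch. This is not Hydra's actual implementation; `hydra.utils.instantiate` and OmegaConf handle recursion, interpolation, and type conversion that are omitted here, and `collections.Counter` is used only as a placeholder target class:

```python
import importlib

def instantiate(cfg: dict):
    # Simplified stand-in for hydra.utils.instantiate: the "_target_" key
    # names a class as "module.ClassName"; remaining keys become kwargs.
    cfg = dict(cfg)  # copy so the caller's config is untouched
    module_name, _, cls_name = cfg.pop("_target_").rpartition(".")
    cls = getattr(importlib.import_module(module_name), cls_name)
    return cls(**cfg)

def apply_override(cfg: dict, override: str):
    # Simplified stand-in for a Hydra CLI override like "optimizer.steps=5":
    # walk dotted keys and set the leaf value (kept as a string here;
    # real Hydra converts types via the config schema).
    key, _, value = override.partition("=")
    *parents, leaf = key.split(".")
    node = cfg
    for p in parents:
        node = node[p]
    node[leaf] = value

cfg = {"optimizer": {"_target_": "collections.Counter", "steps": 3}}
apply_override(cfg, "optimizer.steps=5")
opt = instantiate(cfg["optimizer"])  # Counter stands in for a real class
```

The appeal of this pattern is that swapping an optimizer, model, or logger becomes a one-line config or CLI change, with no edits to the training script itself.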

About pytorch-lightning-template

miracleyoo/pytorch-lightning-template

An easy/swift-to-adapt PyTorch Lightning template. A wrapper template that is simple to use: with minor changes, your existing PyTorch code can be adapted to Lightning. You can port your previous PyTorch code much more easily using this template, and you keep the freedom to edit all the functions. Big-project-friendly as well. No need to rewrite your config in Hydra.

Decouples models and datasets through interface abstractions (MInterface/DInterface), allowing multiple implementations to coexist without code duplication while maintaining full control over training logic like `training_step` and `configure_optimizers`. Provides specialized templates for classification and super-resolution tasks with pre-configured project structures, reducing boilerplate while supporting extensibility through command-line argument passing to dynamically instantiated components.
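The interface idea can be sketched in a few lines. The `MInterface`/`DInterface` names come from the template, but this implementation and the toy module/class names (`toy_models`, `SimpleNet`) are illustrative assumptions; the real template loads model files from a `model/` package on disk and subclasses `pl.LightningModule`:

```python
import importlib
import sys
import types

# Register a toy model module at runtime so the sketch is self-contained;
# in the template, each model lives in its own file under model/.
toy = types.ModuleType("toy_models")

class SimpleNet:
    def __init__(self, hidden_dim=16):
        self.hidden_dim = hidden_dim

toy.SimpleNet = SimpleNet
sys.modules["toy_models"] = toy

class MInterface:
    """Sketch of the interface pattern: pick a model class by name at
    runtime and forward hyperparameters, so multiple model implementations
    coexist behind one wrapper. In the real template this wrapper is a
    LightningModule and also defines training_step / configure_optimizers."""
    def __init__(self, model_module: str, model_class: str, **hparams):
        cls = getattr(importlib.import_module(model_module), model_class)
        self.model = cls(**hparams)

net = MInterface("toy_models", "SimpleNet", hidden_dim=32).model
```

Because the model class is resolved by name and hyperparameters are passed through as keyword arguments, adding a new model or dataset means adding one file, not duplicating the training loop.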

Scores updated daily from GitHub, PyPI, and npm data.