PyTorch-VAE and pytorch-vae

The first is a comprehensive collection of multiple VAE architectures and variants, while the second is a single VAE implementation serving as a reference example. This makes them competitors for educational purposes, with the former offering broader coverage.

                  PyTorch-VAE                              pytorch-vae
Score             49 (Emerging)                            50 (Established)
Maintenance       0/25                                     0/25
Adoption          10/25                                    10/25
Maturity          16/25                                    16/25
Community         23/25                                    24/25
Stars             7,605                                    432
Forks             1,189                                    107
Downloads
Commits (30d)     0                                        0
Language          Python                                   Python
License           Apache-2.0                               BSD-3-Clause
Flags             Stale 6m, No Package, No Dependents      Stale 6m, No Package, No Dependents

About PyTorch-VAE

AntixK/PyTorch-VAE

A Collection of Variational Autoencoders (VAE) in PyTorch.

Implements 16+ VAE variants (Beta-VAE, VQ-VAE, IWAE, InfoVAE, etc.) with standardized architectures for direct comparison, trained on CelebA for reproducibility. Integrates with PyTorch Lightning for training orchestration and uses YAML configuration files to specify model hyperparameters, dataset paths, and trainer settings. Supports TensorBoard visualization and enables custom kernel choices (RBF, IMQ) for Wasserstein autoencoders.
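Because every experiment is driven by a YAML file, switching models or trainer settings is a config edit rather than a code change. An illustrative sketch of such a config follows; the section and key names here are assumptions based on the description above, so consult the repository's own config files for the exact schema:

```yaml
# Hypothetical experiment config in the style described above.
# Section/key names are illustrative, not the repo's verified schema.
model_params:
  name: "BetaVAE"        # which of the 16+ variants to train
  in_channels: 3
  latent_dim: 128
  beta: 4                # variant-specific hyperparameter

data_params:
  data_path: "Data/celeba/"   # CelebA, per the description
  train_batch_size: 64

trainer_params:
  max_epochs: 100        # forwarded to the PyTorch Lightning Trainer

logging_params:
  save_dir: "logs/"      # TensorBoard output directory
```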

About pytorch-vae

ethanluoyc/pytorch-vae

A Variational Autoencoder (VAE) implemented in PyTorch
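Both projects ultimately optimize the same VAE objective: a reconstruction term plus a KL regularizer pulling the encoder's diagonal-Gaussian posterior toward a standard normal prior. That KL term has a closed form, sketched below in stdlib-only Python (the function name `gaussian_kl` is illustrative, not from either repo, which compute it with torch tensors):

```python
import math

def gaussian_kl(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ).

    Closed form per dimension: -0.5 * (1 + log_var - mu^2 - exp(log_var)).
    This is the regularizer added to the reconstruction loss in a VAE.
    """
    return sum(
        -0.5 * (1.0 + lv - m * m - math.exp(lv))
        for m, lv in zip(mu, log_var)
    )

# A posterior identical to the prior incurs zero KL cost.
print(gaussian_kl([0.0, 0.0], [0.0, 0.0]))  # → 0.0
```

Shifting the posterior mean away from zero, or changing its variance, makes the term strictly positive, which is what keeps the latent space well-behaved for sampling.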

Scores updated daily from GitHub, PyPI, and npm data.