PyTorch-VAE and pytorch-vae
The first is a comprehensive collection of VAE architectures and variants, while the second is a single VAE implementation serving as a reference example. Both suit educational purposes, with the former offering broader coverage.
About PyTorch-VAE
AntixK/PyTorch-VAE
A Collection of Variational Autoencoders (VAE) in PyTorch.
Implements 16+ VAE variants (Beta-VAE, VQ-VAE, IWAE, InfoVAE, etc.) with standardized architectures for direct comparison, trained on CelebA for reproducibility. Integrates with PyTorch Lightning for training orchestration and uses YAML configuration files to specify model hyperparameters, dataset paths, and trainer settings. Supports TensorBoard visualization and enables custom kernel choices (RBF, IMQ) for Wasserstein autoencoders.
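Because each experiment is driven by a YAML file, a model run is specified declaratively. A hypothetical config along these lines illustrates the idea; the key names and values below are illustrative assumptions, not copied from the repository:

```yaml
model_params:
  name: "BetaVAE"       # which VAE variant to instantiate
  in_channels: 3        # RGB CelebA images
  latent_dim: 128
  beta: 4               # KL weight specific to Beta-VAE

data_params:
  data_path: "Data/"    # location of the CelebA dataset
  train_batch_size: 64

trainer_params:
  max_epochs: 100       # forwarded to the PyTorch Lightning Trainer
```

With a layout like this, a training script can load the file and dispatch to the appropriate model class by name, which is what makes side-by-side comparison of variants convenient.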
About pytorch-vae
ethanluoyc/pytorch-vae
A Variational Autoencoder (VAE) implemented in PyTorch
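Both projects optimize the standard VAE objective (the ELBO). A minimal NumPy sketch of its two core ingredients, the reparameterization trick and the closed-form KL divergence against a standard-normal prior, looks like this (function names are mine, for illustration):

```python
import numpy as np

def reparameterize(mu, logvar, rng):
    # Sample z = mu + sigma * eps with eps ~ N(0, I); in an autodiff
    # framework this keeps gradients flowing through mu and logvar.
    std = np.exp(0.5 * logvar)
    eps = rng.standard_normal(mu.shape)
    return mu + std * eps

def kl_divergence(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions.
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)
```

The reconstruction term (the other half of the ELBO) depends on the decoder's likelihood model, which is where the two implementations differ most.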