AntixK/PyTorch-VAE

A Collection of Variational Autoencoders (VAE) in PyTorch.

Quality score: 49 / 100 (Emerging)

Implements 16+ VAE variants (Beta-VAE, VQ-VAE, IWAE, InfoVAE, etc.) with standardized architectures for direct comparison, trained on CelebA for reproducibility. Integrates with PyTorch Lightning for training orchestration and uses YAML configuration files to specify model hyperparameters, dataset paths, and trainer settings. Supports TensorBoard visualization and enables custom kernel choices (RBF, IMQ) for Wasserstein autoencoders.
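The YAML configuration mentioned above follows the repository's model_params / data_params / exp_params / trainer_params layout; the sketch below uses that structure, but every key value shown is an illustrative assumption rather than a verbatim config from the repo.

```yaml
# Illustrative sketch of a PyTorch-VAE config file.
# Section names follow the repository's pattern; the values are assumed examples.
model_params:
  name: "VanillaVAE"
  in_channels: 3
  latent_dim: 128

data_params:
  data_path: "Data/"        # assumed CelebA location
  train_batch_size: 64
  val_batch_size: 64
  patch_size: 64

exp_params:
  LR: 0.005
  manual_seed: 1265

trainer_params:
  max_epochs: 100
```

A config like this would be passed to the training script, which hands the trainer settings to PyTorch Lightning and the model hyperparameters to the selected VAE variant.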

7,605 stars. No commits in the last 6 months.

Flags: Stale (6 months) · No Package · No Dependents

Score breakdown:
  Maintenance:  0 / 25
  Adoption:    10 / 25
  Maturity:    16 / 25
  Community:   23 / 25


Stars:         7,605
Forks:         1,189
Language:      Python
License:       Apache-2.0
Last pushed:   Mar 21, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AntixK/PyTorch-VAE"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
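For programmatic use, the same endpoint can be queried from Python. The response schema is not documented on this page, so the field names below (`score`, `categories`) are assumptions; the sketch parses a locally constructed payload that mirrors the numbers shown above, and a live call would simply substitute the real response body.

```python
import json
from urllib.request import urlopen

# Endpoint taken verbatim from the curl example above.
API_URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AntixK/PyTorch-VAE"


def parse_quality(payload: str) -> dict:
    """Extract overall and per-category scores from an API response body.

    The keys used here ("score", "categories") are assumed; the actual
    response schema may differ.
    """
    data = json.loads(payload)
    return {
        "score": data.get("score"),
        "categories": data.get("categories", {}),
    }


# Sample payload mirroring the card's numbers (assumed shape, not a real response):
sample = json.dumps({
    "score": 49,
    "categories": {"maintenance": 0, "adoption": 10, "maturity": 16, "community": 23},
})
result = parse_quality(sample)
print(result["score"])  # 49

# A live request (requires network access) would look like:
#   body = urlopen(API_URL).read().decode()
#   result = parse_quality(body)
```

Within the free tier, polling once a day per repository stays well under the 100-requests/day limit.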