eloialonso/iris

Transformers are Sample-Efficient World Models. ICLR 2023, notable top 5%.

Score: 38 / 100 (Emerging)

IRIS reformulates world modeling as sequence prediction over learned image tokens rather than raw pixels, built on a two-stage architecture that combines a VQ-VAE discrete autoencoder with an autoregressive Transformer. The world model generates millions of imagined rollouts in which an actor-critic policy is optimized, so only minimal interaction with the real Atari environments is required. The codebase integrates with PyTorch, Hydra for configuration management, and Weights & Biases for experiment tracking, and pretrained checkpoints are available on Hugging Face.
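
To make the two-stage idea concrete, below is a minimal PyTorch sketch of the pattern the description refers to: quantize each frame into a small grid of discrete tokens, then model the token sequence with a causally masked Transformer as next-token prediction. All class names and sizes (ToyTokenizer, ToyWorldModel, the codebook and layer dimensions) are illustrative assumptions, not the repository's actual modules or hyperparameters.

# Hypothetical sketch of the tokenizer + autoregressive Transformer pattern.
import torch
import torch.nn as nn

class ToyTokenizer(nn.Module):
    """Maps a 64x64 frame to 16 discrete tokens via nearest-neighbour quantization."""
    def __init__(self, vocab_size=512, embed_dim=64):
        super().__init__()
        self.codebook = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, embed_dim, kernel_size=8, stride=8),            # 64x64 -> 8x8 grid
            nn.ReLU(),
            nn.Conv2d(embed_dim, embed_dim, kernel_size=2, stride=2),    # 8x8 -> 4x4 = 16 tokens
        )

    def forward(self, frames):                                # frames: (B, 3, 64, 64)
        z = self.encoder(frames)                              # (B, D, 4, 4)
        z = z.flatten(2).transpose(1, 2)                      # (B, 16, D)
        codes = self.codebook.weight.unsqueeze(0).expand(z.size(0), -1, -1)
        dists = torch.cdist(z, codes)                         # distance to every code
        return dists.argmin(dim=-1)                           # (B, 16) token indices

class ToyWorldModel(nn.Module):
    """Causally masked Transformer over frame tokens (next-token prediction)."""
    def __init__(self, vocab_size=512, d_model=128, n_layers=2, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):                                # tokens: (B, T)
        T = tokens.size(1)
        mask = nn.Transformer.generate_square_subsequent_mask(T).to(tokens.device)
        h = self.blocks(self.embed(tokens), mask=mask)
        return self.head(h)                                   # (B, T, vocab) next-token logits

# Tokenize a batch of frames, then predict the next token autoregressively.
tokenizer, world_model = ToyTokenizer(), ToyWorldModel()
frames = torch.rand(2, 3, 64, 64)                             # fake batch of frames
tokens = tokenizer(frames)                                    # (2, 16)
logits = world_model(tokens)                                  # (2, 16, 512)
next_token = logits[:, -1].argmax(dim=-1)                     # greedy next-token choice

The sketch only shows the forward data flow from frames to next-token logits; it omits the tokenizer's reconstruction training and the imagination-based actor-critic optimization described above.
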

870 stars. No commits in the last 6 months.

Flags: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 9 / 25
Community: 19 / 25


Stars: 870
Forks: 92
Language: Python
License: GPL-3.0
Last pushed: Oct 14, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/eloialonso/iris"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
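
If you prefer Python over curl, a minimal sketch of the same request is below. The exact fields in the JSON response are an assumption based on the stats shown on this page, not a documented schema.

# Hypothetical Python equivalent of the curl command above.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/eloialonso/iris"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
data = resp.json()
print(data)  # expected to contain score, stars, forks, last-push date (field names may differ)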