xie-lab-ml/Golden-Noise-for-Diffusion-Models
[ICCV2025] The code of our work "Golden Noise for Diffusion Models: A Learning Framework".
Introduces a learnable noise optimization framework that generates task-specific "golden noise" for diffusion models through paired noise prediction networks (supporting architectures like SVD-UNet+UNet and DiT). Trains on prompt-conditioned noise pairs collected via reward models (PickScore, HPS v2, ImageReward) to improve generation quality across SDXL, SD2.1, and other pipelines. Provides end-to-end utilities for data collection, multi-GPU training with gradient accumulation, and inference with classifier-free guidance integration.
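The core idea (refine a sampled Gaussian noise with a small learned network before handing it to the diffusion sampler) can be sketched as follows. This is a minimal illustration, not the repo's implementation: the `NPNet` class, its weights, and the residual update are stand-ins for the paper's prompt-conditioned SVD-UNet+UNet or DiT predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

class NPNet:
    """Toy stand-in for a noise prediction network (NPNet).

    The real model is a prompt-conditioned SVD-UNet+UNet or DiT;
    here a near-identity linear map illustrates the interface only.
    """

    def __init__(self, dim: int):
        # Hypothetical "learned" weights; in practice these come from
        # training on (source noise, golden noise) pairs ranked by a
        # reward model such as PickScore, HPS v2, or ImageReward.
        self.weight = np.eye(dim) + 0.01 * rng.standard_normal((dim, dim))

    def __call__(self, noise: np.ndarray) -> np.ndarray:
        # Predict a refined ("golden") noise, then re-normalize so the
        # output stays close to the standard normal prior the sampler expects.
        refined = noise @ self.weight
        return refined / refined.std()

def golden_noise(shape: tuple, npnet: NPNet) -> np.ndarray:
    """Sample standard Gaussian noise, then refine it with the NPNet."""
    flat = rng.standard_normal(int(np.prod(shape)))
    return npnet(flat).reshape(shape)

npnet = NPNet(dim=16)
z = golden_noise((4, 2, 2), npnet)
print(z.shape)  # (4, 2, 2)
```

In an actual pipeline, `z` would replace the randomly drawn initial latents (e.g. the `latents` argument of a diffusers SDXL pipeline call) so that classifier-free-guidance sampling starts from the refined noise.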
Stars: 194
Forks: 8
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 17, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/xie-lab-ml/Golden-Noise-for-Diffusion-Models"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
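The curl call above can also be issued from Python. The sketch below builds the endpoint URL and formats a summary from a response; the JSON field names (`stars`, `forks`, `language`) are assumptions about the response schema, not documented guarantees, and the live request is left commented out.

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"

def repo_url(owner: str, name: str) -> str:
    """Build the endpoint URL for a given owner/repository pair."""
    return f"{BASE}/{quote(owner)}/{quote(name)}"

def summarize(payload: dict) -> str:
    """One-line summary from a response payload (field names assumed)."""
    stars = payload.get("stars", "?")
    forks = payload.get("forks", "?")
    lang = payload.get("language", "?")
    return f"{stars} stars, {forks} forks, {lang}"

url = repo_url("xie-lab-ml", "Golden-Noise-for-Diffusion-Models")
print(url)
# Live call (requires network):
#   import json, urllib.request
#   payload = json.load(urllib.request.urlopen(url))
sample = {"stars": 194, "forks": 8, "language": "Python"}
print(summarize(sample))  # 194 stars, 8 forks, Python
```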
Related models
UNIC-Lab/RadioDiff
This is the code for the paper "RadioDiff: An Effective Generative Diffusion Model for...
yulewang97/ERDiff
[NeurIPS 2023 Spotlight] Official Repo for "Extraction and Recovery of Spatio-temporal Structure...
pantheon5100/pid_diffusion
This repository is the official implementation of the paper: Physics Informed Distillation for...
zju-pi/diff-sampler
An open-source toolbox for fast sampling of diffusion models. Official implementations of our...
dome272/Paella
Official Implementation of Paella https://arxiv.org/abs/2211.07292v2