Abstract
We present a simple yet effective diffusion-based method for fine-grained, parametric control over light sources in an image. Existing relighting methods either rely on multiple input views to perform inverse rendering at inference time, or fail to provide explicit control over light changes. Our method fine-tunes a diffusion model on a small set of real raw photograph pairs, supplemented by synthetically rendered images at scale, to elicit its photorealistic prior for the relighting task. We leverage the linearity of light to synthesize image pairs depicting controlled changes of either a target light source or the ambient illumination. Using this data and an appropriate fine-tuning scheme, we train a model for precise illumination changes with explicit control over light intensity and color. Finally, we show that our method achieves compelling light-editing results and outperforms existing methods in user-preference studies.
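The "linearity of light" mentioned above refers to the additivity of light transport in linear (raw) image space: a capture with a light on equals the capture with it off plus that light's contribution alone. The following is a minimal sketch of how controlled pairs could be synthesized from two such captures; the array names, the scalar intensity, and the warm color tint are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for two linear (raw) captures of the same scene:
# one with the target light off, one with it on.
ambient = rng.random((4, 4, 3))                      # light-off capture
with_light = ambient + 0.5 * rng.random((4, 4, 3))   # light-on capture

# By linearity of light transport, the target light's contribution
# is simply the difference of the two captures.
light_only = with_light - ambient

# Re-scale intensity and tint the color to render a controlled change.
intensity = 2.0                          # hypothetical brightness factor
color = np.array([1.0, 0.8, 0.6])        # hypothetical warm tint per channel
relit = ambient + intensity * light_only * color

print(relit.shape)  # same shape as the input captures
```

Because every operation stays in linear space, any non-negative `intensity` and `color` yield a physically plausible image, which is what makes such pairs suitable supervision for parametric light control.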
| Original language | English |
|---|---|
| Title of host publication | Proceedings - SIGGRAPH 2025 Conference Papers |
| Editors | Stephen N. Spencer |
| Publisher | Association for Computing Machinery, Inc |
| ISBN (Electronic) | 9798400715402 |
| DOIs | |
| State | Published - 27 Jul 2025 |
| Event | SIGGRAPH 2025 Conference Papers, Vancouver, Canada. Duration: 10 Aug 2025 → 14 Aug 2025 |
Publication series
| Name | Proceedings - SIGGRAPH 2025 Conference Papers |
|---|
Conference
| Conference | SIGGRAPH 2025 Conference Papers |
|---|---|
| Country/Territory | Canada |
| City | Vancouver |
| Period | 10/08/25 → 14/08/25 |
Bibliographical note
Publisher Copyright: © 2025 Copyright held by the owner/author(s).