Lighting plays a crucial role in visual storytelling. Whether in film or photography, creators spend countless hours, and often significant budgets, crafting the perfect illumination for a shot. But once a photograph or video is captured, the illumination is essentially fixed. Adjusting it afterward, a task called “relighting,” typically demands time-consuming manual work by skilled artists.
While some generative AI tools attempt to tackle this task, they rely on large-scale neural networks and billions of training images to guess how light might interact with a scene. The process is often a black box: users cannot control the lighting directly or understand how the result was generated, leading to unpredictable outputs that can stray from the original content of the scene. Getting the result one envisions typically requires prompt engineering and trial-and-error, hindering the user’s creative vision.
In a new paper to be presented at this year’s SIGGRAPH conference in Vancouver, researchers in the Computational Photography Lab at SFU offer a different approach to relighting. Their work, “Physically Controllable Relighting of Photographs,” brings the explicit control over lights typically available in computer graphics software, such as Blender or Unreal Engine, to image and photo editing.
Given a photograph, the method begins by estimating a 3D version of the scene. This 3D model represents the shape and surface colors of the scene while intentionally leaving out any lighting. Creating this lighting-free 3D representation builds on prior work, including earlier research from the Computational Photography Lab.
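Separating surface color from lighting is the classic intrinsic decomposition problem, which models each pixel of an image as the product of a reflectance (albedo) term and a shading term. As a minimal illustrative sketch, not the authors' code, the function name and inputs below are hypothetical: given an estimated shading layer, the lighting-free albedo follows from a per-pixel division.

```python
import numpy as np

def albedo_from_shading(image, shading, eps=1e-6):
    """Intrinsic decomposition assumes image = albedo * shading,
    per pixel and per channel. Given an estimated shading layer,
    the lighting-free albedo is recovered by division.

    image:   (H, W, 3) observed photograph, values in [0, 1]
    shading: (H, W, 3) estimated illumination layer
    """
    # eps guards against division by zero in fully dark regions
    return image / np.maximum(shading, eps)

# Toy example: a mid-gray surface (albedo 0.5) lit at 40% intensity
# appears as 0.2 in the photo; dividing out the shading recovers 0.5.
image = np.full((2, 2, 3), 0.2)
shading = np.full((2, 2, 3), 0.4)
albedo = albedo_from_shading(image, shading)
```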
“After creating the 3D scene, users can place virtual light sources into it, much like they would in a real photo studio or 3D modeling software,” explains Chris Careaga, a Ph.D. student at SFU and the lead author of the work. “We then interactively simulate the light sources defined by the user with well-established techniques from computer graphics.”
The result is a rough preview of the scene under the new lighting, but it doesn’t quite look realistic on its own, Careaga explains. In this new work, the researchers have developed a neural network that can transform this rough preview into a realistic photograph.
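The “well-established techniques from computer graphics” for simulating a user-placed light can be illustrated with the simplest of them: Lambertian shading with inverse-square falloff for a point light. The sketch below is an assumption-laden stand-in, not the authors' renderer; the function and its inputs are hypothetical.

```python
import numpy as np

def lambertian_preview(albedo, normals, points, light_pos, intensity):
    """Rough relighting preview from a single user-placed point light.

    albedo:    (H, W, 3) lighting-free surface colors
    normals:   (H, W, 3) unit surface normals per pixel
    points:    (H, W, 3) 3D position of each pixel's surface point
    light_pos: (3,) position of the virtual light
    intensity: scalar light intensity
    """
    to_light = light_pos - points                                # (H, W, 3)
    dist2 = np.sum(to_light ** 2, axis=-1, keepdims=True)        # squared distance
    l_dir = to_light / np.sqrt(dist2)                            # unit direction to light
    # Cosine term, clamped so back-facing surfaces receive no light
    n_dot_l = np.clip(np.sum(normals * l_dir, axis=-1, keepdims=True), 0.0, None)
    # Inverse-square falloff, as in physically based point lights
    return albedo * intensity * n_dot_l / dist2

# Toy example: a flat upward-facing patch with a light one unit
# directly above it, so the cosine and falloff terms are both 1.
albedo = np.full((2, 2, 3), 0.5)
normals = np.tile(np.array([0.0, 0.0, 1.0]), (2, 2, 1))
points = np.zeros((2, 2, 3))
preview = lambertian_preview(albedo, normals, points,
                             np.array([0.0, 0.0, 1.0]), 1.0)
```

A full renderer would add shadows, multiple bounces, and area lights, which is why the rough preview still needs the neural refinement step described next.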
“What makes our approach unique is that it gives users the same kind of lighting control you’d expect in 3D tools like Blender or Unreal Engine,” Careaga adds. “By simulating the lights, we ensure our result is a physically accurate rendition of the user’s desired lighting.”
Their approach makes it possible to insert new light sources into images and have them interact realistically with the scene, enabling relit images that were previously impossible to achieve.
The relighting system currently works with static images, but the team is interested in extending it to video in the future, which would make it an invaluable tool for VFX artists and filmmakers.
“As this technology continues to develop, it could save independent filmmakers and content creators a significant amount of time and money,” explains Dr. Yağız Aksoy, who leads the Computational Photography Lab at SFU. “Instead of buying expensive lighting gear or reshooting scenes, they can make realistic lighting changes after the fact, without having to filter their creative vision through a generative AI model.”
This paper is the latest in a series of “illumination-aware” research projects from the Computational Photography Lab. The group’s earlier work on intrinsic decomposition lays the groundwork for their new relighting method, and they break down how it all connects in their explainer video.
You can find out more about the Computational Photography Lab’s research on their web page.
More information:
Chris Careaga et al, Physically Controllable Relighting of Photographs, Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Papers (2025). DOI: 10.1145/3721238.3730666
Citation:
New tool offers direct lighting control for photographs using 3D scene modeling (2025, August 2)
retrieved 2 August 2025
from https://techxplore.com/news/2025-08-tool-3d-scene.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without written permission. The content is provided for information purposes only.