Capturing Light with Robots:
A Novel Workflow for Reproducing Realistic Lens Flares
I’m excited to share my latest research using motion control to reproduce realistic lens flares, which I presented at SIGGRAPH Asia 2024 and which won the Best Short Paper Award at CVMP 2024. This approach aims to enhance the visual quality of both live-action and animated films by accurately replicating the complex phenomenon of lens flares.
Introduction
Lens flares are a common yet captivating optical artifact in photography and filmmaking. They result from reflections and scattering within a camera’s lens system. While often considered imperfections, lens flares can also be used artistically to add flair (pun intended) to visual storytelling.
In recent years, high-profile films like “The Batman” and “Dune” have utilized vintage lenses to achieve unique visual styles, making accurate lens flare reproduction more crucial than ever. Traditional methods, such as 2D sprite animations or ray tracing, fall short in capturing the intricate details of real-world lens flares. This is where my novel workflow comes into play.
Novel Approach
My involvement with the final-year 3D animated film project “The Deep Above” led me to address a specific challenge: recreating the effects of the Hawk V-Lite 55mm, an anamorphic lens with pronounced rainbow flaring and a distinctive blue streak.
To tackle this challenge, I developed a new workflow that combines motion-control systems with advanced image-processing techniques. The method systematically captures flare patterns by varying the position of a light source using a motion-control robot, gathering a comprehensive dataset of lens flares for various lenses that can then be used in production-ready compositing workflows.
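As a rough illustration, here is a minimal Python sketch of such a capture sweep. The `move_to` and `capture_frame` helpers are hypothetical stand-ins for the robot and camera drivers, and the grid density is an arbitrary choice for the example, not the sampling used in the actual research:

```python
import itertools
import numpy as np

def capture_flare_dataset(move_to, capture_frame, steps=32):
    """Sweep the light source over a normalized 2D grid, recording one flare per position."""
    positions = np.linspace(-1.0, 1.0, steps)  # normalized coordinates across the frame
    dataset = []
    for x, y in itertools.product(positions, positions):
        move_to(x, y)  # reposition the light via the motion-control rig (hypothetical driver)
        dataset.append(((x, y), capture_frame()))  # grab one frame at this position
    return dataset
```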

Capture Process
I first set up a testing environment with a high-contrast OLED television and an LED screen, moving a small point of light across the display. Unfortunately, this setup did not achieve sufficient brightness for flare production without long exposure times.
To overcome this, I switched to a focused white LED light source with an adjustable aperture, capturing the data in a nearly black room.
Using an Arri Alexa 35 cinema camera mounted on an industrial motion-control robot, we recorded the lens flares as an image sequence. This high-dynamic-range approach allowed us to capture a rich dataset of flare images, which were then processed and interpolated to create realistic lens flares.
Generating Lens Flares
To generate lens flares from the captured dataset, I explored various techniques, including image-based interpolation and machine learning. By using tools like Nuke’s Kronos node and the RIFE network, I achieved smooth interpolation and reduced artifacts. Additionally, I developed custom machine learning models to infer images from positional data, creating an artist-friendly tool for controlling photorealistic lens flares within compositing applications like Nuke.
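To give a sense of the image-based approach, here is a minimal sketch of the naive baseline: bilinearly blending the four captured flares nearest to a queried light position. This simple cross-dissolve is exactly what produces ghosting on sparse grids; tools like Kronos and RIFE exist to reduce those artifacts. The `grid` layout is an assumption for the example:

```python
import numpy as np

def interpolate_flare(grid, x, y):
    """grid: dict mapping integer (i, j) grid indices to HxWxC float flare images.
    x, y: continuous grid coordinates, e.g. (3.25, 7.5)."""
    i0, j0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - i0, y - j0  # fractional position inside the grid cell
    # Bilinear blend of the four surrounding captures.
    return ((1 - fx) * (1 - fy) * grid[(i0, j0)]
            + fx * (1 - fy) * grid[(i0 + 1, j0)]
            + (1 - fx) * fy * grid[(i0, j0 + 1)]
            + fx * fy * grid[(i0 + 1, j0 + 1)])
```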
Machine Learning
I also developed a custom Convolutional Neural Network (CNN) designed to generate lens flares based on input parameters such as light position and lens aperture. The CNN was trained on the previously captured dataset of lens flare images, interpolating position and aperture in latent space. My CNN architecture uses multiple upsampling convolution layers, batch normalization, and LeakyReLU activation functions. I optimized the loss function to balance L2 loss, gradient loss, and structural similarity index (SSIM), enhancing detail preservation while producing smooth, realistic flares.
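To make the architecture concrete, here is a minimal PyTorch sketch in the same spirit: a small decoder that maps (light x, light y, aperture) to an image through upsampling convolutions with batch normalization and LeakyReLU, trained with a weighted sum of L2, gradient, and SSIM terms. The layer sizes, output resolution, and loss weights are illustrative assumptions, not the trained model’s actual configuration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FlareDecoder(nn.Module):
    def __init__(self, out_channels=3, base=256):
        super().__init__()
        # Embed the 3 input parameters into a small 4x4 feature map.
        self.fc = nn.Linear(3, base * 4 * 4)
        def up(cin, cout):
            return nn.Sequential(
                nn.Upsample(scale_factor=2, mode="nearest"),
                nn.Conv2d(cin, cout, 3, padding=1),
                nn.BatchNorm2d(cout),
                nn.LeakyReLU(0.2),
            )
        # Four upsampling blocks: 4x4 -> 64x64 (illustrative resolution).
        self.blocks = nn.Sequential(up(base, 128), up(128, 64), up(64, 32), up(32, 16))
        self.head = nn.Conv2d(16, out_channels, 3, padding=1)

    def forward(self, params):  # params: (N, 3) = light x, light y, aperture
        x = self.fc(params).view(-1, 256, 4, 4)
        return self.head(self.blocks(x))

def gradient_loss(pred, target):
    """Penalize differences in horizontal and vertical image gradients."""
    dx = lambda t: t[..., :, 1:] - t[..., :, :-1]
    dy = lambda t: t[..., 1:, :] - t[..., :-1, :]
    return F.l1_loss(dx(pred), dx(target)) + F.l1_loss(dy(pred), dy(target))

def combined_loss(pred, target, ssim_fn, w=(1.0, 0.5, 0.5)):
    """L2 + gradient + (1 - SSIM); ssim_fn could come from e.g. torchmetrics.
    The weights here are placeholder values, not the tuned ones."""
    return (w[0] * F.mse_loss(pred, target)
            + w[1] * gradient_loss(pred, target)
            + w[2] * (1.0 - ssim_fn(pred, target)))
```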
Convolution Glow
Creating the characteristic “glow” associated with lens flares can add a layer of realism to visual effects. Traditional methods apply a stack of exponential blurs to simulate light falloff; my technique improves upon this by using actual flare images as convolution filters. We downscale the input image, multiply each pixel by its corresponding flare image, sum the resulting tiles, and upscale the final image to its original size. This method captures the natural diffusion and glow around bright areas.
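Here is a minimal NumPy sketch of that idea, under the assumption that the flare tiles are stored at the reduced resolution. `flare_for(i, j)` is a hypothetical lookup returning the flare captured with the light at low-res pixel (i, j), and the nearest-neighbour upscale stands in for whatever filtered upscale a compositor would actually use:

```python
import numpy as np

def convolution_glow(image, flare_for, scale=16):
    """image: HxWxC float array (H, W divisible by scale for simplicity).
    flare_for(i, j) -> (H//scale)x(W//scale)xC flare tile (hypothetical lookup)."""
    h, w, c = image.shape
    lh, lw = h // scale, w // scale
    # Downscale by averaging each scale x scale block.
    low = image[:lh * scale, :lw * scale].reshape(lh, scale, lw, scale, c).mean(axis=(1, 3))
    acc = np.zeros((lh, lw, c), dtype=image.dtype)
    for i in range(lh):
        for j in range(lw):
            # Each low-res pixel contributes one flare tile, weighted by its value.
            acc += low[i, j] * flare_for(i, j)
    # Upscale the summed result back to the original resolution.
    return np.repeat(np.repeat(acc, scale, axis=0), scale, axis=1)
```

In practice the double loop would be vectorized or run on the GPU; the point is only the structure: every bright region contributes a scaled copy of a measured flare rather than a synthetic blur.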

Conclusion
The innovative capture method and its resulting lens flare tools have demonstrated successful applications in various VFX and CG productions, such as “The Deep Above” and “A Sparrow’s Song”. Looking ahead, I would like to develop an affordable, compact DSLR-based pan-tilt motion-control setup. This setup would capture still images using HDR bracketing instead of video, albeit with longer capture times. Such a kit could facilitate custom dataset creation for lens research and on-set VFX data capture, akin to shooting lens grids, HDRIs, or chrome ball references.
Ultimately, a deeper understanding of lens and camera characteristics, coupled with more accurate data, will undoubtedly enhance the image quality of visual effects and animated films in the future.
You can learn more about the research on its project page.
Paper: https://dl.acm.org/doi/abs/10.1145/3681758.3697995