Abstract

While style transfer has been extensively studied, most existing approaches fail to account for the defocus effects inherent in content images, thereby compromising the photographer's intended focus cues. To overcome this shortcoming, we introduce an optimisation-based post-processing framework that restores defocus characteristics to stylised images, regardless of the style transfer technique used. Our method begins by estimating a blur map with a data-driven model that predicts pixel-level blur magnitudes. This blur map then guides a layer-based defocus rendering framework, which simulates depth-of-field (DoF) effects using a Gaussian filter bank. To map the blur values to appropriate kernel sizes in the filter bank, we introduce a neural network that determines the optimal maximum filter size, ensuring both content integrity and stylistic fidelity. Quantitative and qualitative experiments show that our method significantly improves stylised images by preserving the original depth cues and defocus details.
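
As a concrete illustration of the filter-bank step the abstract describes, the following Python sketch blends progressively blurred copies of a stylised image according to a per-pixel blur map. This is a minimal sketch, not the paper's implementation: the function name, the fixed `max_sigma` (standing in for the learned maximum filter size), and the simple per-pixel blending (which omits the occlusion handling of a full layer-based renderer) are all assumptions made for illustration.

```python
# Hypothetical sketch of filter-bank defocus rendering; not the authors' code.
import numpy as np
from scipy.ndimage import gaussian_filter

def render_defocus(stylised, blur_map, max_sigma=8.0, n_levels=6):
    """Blend pre-blurred copies of `stylised` according to `blur_map`.

    stylised : (H, W, 3) float array, the stylised image.
    blur_map : (H, W) float array in [0, 1], predicted per-pixel blur magnitude.
    max_sigma: largest Gaussian sigma in the bank; stands in for the paper's
               learned maximum filter size (assumed value here).
    """
    sigmas = np.linspace(0.0, max_sigma, n_levels)
    # Pre-blur the image at each discrete sigma (the "filter bank").
    # sigma=(s, s, 0) blurs spatially but not across colour channels.
    bank = [stylised if s == 0 else gaussian_filter(stylised, sigma=(s, s, 0))
            for s in sigmas]

    # Map each pixel's blur value onto the bank and interpolate linearly
    # between the two nearest blur levels.
    level = blur_map * (n_levels - 1)
    lo = np.clip(np.floor(level).astype(int), 0, n_levels - 2)
    frac = (level - lo)[..., None]          # (H, W, 1) blend weight
    out = np.empty_like(stylised)
    for i in range(n_levels - 1):
        mask = (lo == i)
        out[mask] = ((1 - frac[mask]) * bank[i][mask]
                     + frac[mask] * bank[i + 1][mask])
    return out
```

Interpolating between the two nearest blur levels, rather than snapping each pixel to one level, avoids visible banding between the discrete sigmas of the bank.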

Publication Info

Year
2025
Type
article

Cite This

Hongyi Wang, Yuting Wu (2025). Preserving Photographic Defocus in Stylised Image Synthesis. Computer Graphics Forum. https://doi.org/10.1111/cgf.70299

Identifiers

DOI
10.1111/cgf.70299