We present the new Bokeh Effect Transformation Dataset (BETD) and review the solutions proposed for this novel task at the NTIRE 2023 Bokeh Effect Transformation Challenge. Recent advances in mobile photography aim to reach the visual quality of full-frame cameras. A current goal in computational photography is to optimize the Bokeh effect itself, i.e., the aesthetic quality of the blur in the out-of-focus areas of an image. Photographers create this aesthetic effect by exploiting the optical properties of the lens. The aim of this work is to design a neural network capable of converting the Bokeh effect of one lens into the effect of another lens without harming the sharp foreground regions of the image. Given an input image and the target lens type, we render or transform the Bokeh effect according to the target lens properties. We build the BETD using two full-frame Sony cameras and diverse lens setups. To the best of our knowledge, this is the first attempt to solve this novel task, and we provide the first dataset (BETD) and benchmark for it. The challenge had 99 registered participants. The submitted methods gauge the state of the art in Bokeh effect rendering and transformation.
Radu Timofte, Xingyi Yang, Ke Xian, Marcos V. Conde, Chengxin Liu, Ziwei Luo, Wenyi Lian, Fredrik K. Gustafsson, Amirsaeed Yazdani, Baoliang Chen, Zhiguo Cao, Songhua Liu, Zheng Zhao, Tim Seizinger, Manuel Kolmet, Tom E. Bishop, Xianrui Luo, Huiqiang Sun, Liao Shen, Chaowei Liu, Zigeng Chen, Yongcheng Jing, Michael Bi, Mi Wang, Zhihao Yang, Siyuan Lai, Haichuan Zhang, T. Hoang, Jens Sjölund, Thomas B. Schön, Yuxuan Zhao, Yiqing Xu