

Teaser figure: Existing protocols for evaluating single-image HDR reconstruction methods are unreliable due to large tone and color differences between the reference and reconstructed HDR images. We demonstrate that the accuracy of metrics can be much improved if we correct for camera-response-curve inversion errors before computing image quality with existing full-reference metrics.


As the problem of reconstructing high dynamic range (HDR) images from a single exposure has attracted much research effort, it is essential to provide a robust protocol and clear guidelines on how to evaluate and compare new methods. In this work, we compared six recent single image HDR reconstruction (SI-HDR) methods in a subjective image quality experiment on an HDR display. We found that only two methods produced results that are, on average, more preferred than the unprocessed single exposure images (see HTML viewer). When the same methods are evaluated using image quality metrics, as typically done in papers, the metric predictions correlate poorly with subjective quality scores. The main reason is a significant tone and color difference between the reference and reconstructed HDR images. To improve the predictions of image quality metrics, we propose correcting for the inaccuracies of the estimated camera response curve before computing quality values. We further analyze the sources of prediction noise when evaluating SI-HDR methods and demonstrate that existing metrics can reliably predict only large quality differences.
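The proposed correction aligns the reconstructed HDR image with the reference before a full-reference metric is computed, so that errors in the estimated camera response curve do not dominate the quality score. A minimal sketch of the idea is below; the per-channel affine fit in the log domain is an illustrative stand-in, not the paper's exact correction procedure, and all function names are hypothetical.

```python
import numpy as np

def log_affine_align(recon, ref, eps=1e-6):
    """Align a reconstructed HDR image to its reference with a per-channel
    affine fit in the log domain (illustrative stand-in for correcting an
    inaccurate inverse camera response curve)."""
    aligned = np.empty_like(recon)
    for c in range(recon.shape[-1]):
        x = np.log(recon[..., c] + eps).ravel()
        y = np.log(ref[..., c] + eps).ravel()
        a, b = np.polyfit(x, y, 1)  # least-squares slope and offset
        aligned[..., c] = np.exp(a * np.log(recon[..., c] + eps) + b) - eps
    return aligned

def psnr(a, b, peak):
    """Simple full-reference metric computed after alignment."""
    mse = np.mean((a - b) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

For example, if a reconstruction differs from the reference only by a gain and a gamma error (a typical consequence of a misestimated response curve), its PSNR against the reference is low before alignment and rises sharply after `log_affine_align`, so the metric reflects structural quality rather than the tone mismatch.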



@inproceedings{hanji2022_sihdr,
      author    = {Hanji, Param and Mantiuk, Rafa{\l} K. and Eilertsen, Gabriel and Hajisharif, Saghi and Unger, Jonas},
      title     = {Comparison of single image HDR reconstruction methods — the caveats of quality assessment},
      booktitle = {Special Interest Group on Computer Graphics and Interactive Techniques Conference Proceedings (SIGGRAPH '22 Conference Proceedings)},
      year      = {2022},
      doi       = {10.1145/3528233.3530729},
      url       = {},
}


We would like to thank the anonymous reviewers and Aamir Mustafa for their comments. This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 725253 - EyeCode).