Dense Scene Information Estimation Network for Dehazing

Winner of the NTIRE 2019 Image Dehazing Challenge

CVPR 2019 Workshops

ABSTRACT

Image dehazing continues to be one of the most challenging inverse problems. Deep learning methods have emerged to complement traditional model-based methods and have helped define a new state of the art in achievable dehazed image quality. Yet, practical challenges remain in dehazing real-world images where the scene is covered with dense haze, even to the extent that no scene information can be observed visually. Many recent dehazing methods have addressed this challenge by designing deep networks that estimate the physical parameters of the haze model, i.e., the ambient light (A) and the transmission map (t). The inverse of the haze model may then be used to estimate the dehazed image. In this work, we develop two novel network architectures to further this line of investigation. Our first model, denoted `At-DH', uses a shared DenseNet-based encoder and two distinct DenseNet-based decoders to jointly estimate the scene information. As a natural extension of `At-DH', we develop the `AtJ-DH' network, which adds a third DenseNet-based decoder to jointly recreate the haze-free image. Knowledge of the (ground-truth) clean training images is exploited by a custom regularization term that further enhances the estimates of the model parameters in AtJ-DH. Experiments performed on the challenging NTIRE'19 and NTIRE'18 benchmark image datasets demonstrate that `At-DH' and `AtJ-DH' outperform state-of-the-art alternatives, especially when recovering images corrupted by dense haze.
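
The haze model referred to above is the standard atmospheric scattering model, I(x) = J(x) t(x) + A (1 - t(x)), so once A and t are estimated the dehazed image follows from J(x) = (I(x) - A) / t(x) + A. The short Python sketch below illustrates this inversion step only; the function name and the transmission floor t_min are illustrative assumptions and are not taken from the released code.

    # Minimal sketch of inverting the atmospheric scattering model:
    #   I(x) = J(x) * t(x) + A * (1 - t(x))  =>  J(x) = (I(x) - A) / t(x) + A
    # The function name and the clipping floor t_min are illustrative assumptions.
    import numpy as np

    def invert_haze_model(hazy, A, t, t_min=0.05):
        """Recover the haze-free image J from a hazy image (H x W x 3, values in [0, 1]),
        ambient light A (scalar or length-3 array), and transmission map t (H x W)."""
        t = np.clip(t, t_min, 1.0)        # guard against division by near-zero transmission
        if t.ndim == 2:                   # broadcast a single-channel t over the color channels
            t = t[..., None]
        J = (hazy - A) / t + A
        return np.clip(J, 0.0, 1.0)       # keep the estimate in a valid intensity range

In dense haze, t approaches zero over much of the scene, which is why direct inversion alone is unreliable and motivates the joint estimation strategy described above.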

Code

The Python version of the AtJ-DH testing code can be found at [GitHub].

The MATLAB version of the post-processing (IRCNN) code can be found at [GitHub].

Network Structure
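
The released code (linked above) defines the actual architecture; the PyTorch sketch below only illustrates the topology described in the abstract, i.e., a shared DenseNet-based encoder feeding three decoders that estimate A, t, and the haze-free image J. The decoder design, layer sizes, and class names here are assumptions for illustration and do not reproduce AtJ-DH.

    # Illustrative sketch of the shared-encoder / multi-decoder topology of AtJ-DH.
    # Layer choices, decoder design, and names are assumptions, not the released model.
    import torch
    import torch.nn as nn
    from torchvision.models import densenet121

    class ToyDecoder(nn.Module):
        """Toy upsampling decoder that maps encoder features back to input resolution."""
        def __init__(self, in_channels, out_channels):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, 256, 3, padding=1), nn.ReLU(inplace=True),
                nn.Upsample(scale_factor=4, mode='bilinear', align_corners=False),
                nn.Conv2d(256, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Upsample(scale_factor=8, mode='bilinear', align_corners=False),
                nn.Conv2d(64, out_channels, 3, padding=1), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.net(x)

    class AtJSketch(nn.Module):
        """Shared DenseNet encoder with separate decoders for A, t, and J."""
        def __init__(self):
            super().__init__()
            self.encoder = densenet121().features   # downsamples by 32; no pretrained weights loaded
            self.decoder_A = ToyDecoder(1024, 3)    # ambient-light estimate (per-pixel here for simplicity)
            self.decoder_t = ToyDecoder(1024, 1)    # transmission map
            self.decoder_J = ToyDecoder(1024, 3)    # haze-free image (the extra decoder in AtJ-DH)

        def forward(self, hazy):                    # hazy: (N, 3, H, W), H and W divisible by 32
            feat = self.encoder(hazy)
            return self.decoder_A(feat), self.decoder_t(feat), self.decoder_J(feat)

Under this reading, one natural (but here only assumed) form of the custom regularization in AtJ-DH is to plug the estimated A and t back into the inverse haze model sketched above and penalize the deviation of that reconstruction from the ground-truth clean image, alongside a direct loss on the output of the J decoder.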

Dehaze Results

Evaluations

Related Publications

  1. T. Guo, X. Li, V. Cherukuri, and V. Monga, “Dense Scene Information Estimation Network for Dehazing”, in IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2019. [PDF]

  2. Co-author of C. O. Ancuti, C. Ancuti, R. Timofte et al., “NTIRE 2019 Challenge on image dehazing: Methods and results”, in IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2019. [CVF]


Email
ipal.psu@gmail.com

Address
104 Electrical Engineering East,
University Park, PA 16802, USA

Lab Phone:
814-863-7810
814-867-4564