Recently, there has been an increased interest in the vision and graphics
communities in dehazing single images [1–5]. In this paper we introduce an alternative
approach to solving this challenging problem. Our technique is based on
the observation that the distance from the observer to the scene objects is highly correlated
with contrast degradation and the fading of object colors. More
specifically, an extensive study has disclosed an important difference
between hazy and haze-free image regions, found by comparing, per pixel,
the hue values of the original image with those of a 'semi-inversed' image.
This 'semi-inversed' image is obtained by replacing the RGB values of
each pixel, on a per-channel basis, with the maximum of the initial channel value
(r, g, or b) and its inverse (1 − r, 1 − g, or 1 − b), followed by an image-wide
renormalization. This observation has been validated on a large set of images,
and allows hazy image regions to be detected by applying a single,
simple operator. This facilitates the estimation of the constant airlight color,
and enables us to compute a good approximation of the haze-free image using a
layer-based approach.
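The semi-inverse operator and the hue comparison described above can be sketched as follows. This is an illustrative NumPy implementation, not the paper's reference code: the threshold `tau` is a hypothetical value chosen for the example, and the image-wide renormalization is omitted here since a uniform rescaling of the RGB channels leaves the hue unchanged.

```python
import numpy as np

def semi_inverse(img):
    """Per-channel max of each value and its inverse: max(c, 1 - c)."""
    return np.maximum(img, 1.0 - img)

def rgb_to_hue(img):
    """Hue channel in [0, 1) computed from an RGB image in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc = img.max(axis=-1)
    minc = img.min(axis=-1)
    delta = maxc - minc
    hue = np.zeros_like(maxc)
    valid = delta > 0
    # piecewise hue formula, depending on which channel is maximal
    idx = valid & (maxc == r)
    hue[idx] = ((g - b)[idx] / delta[idx]) % 6
    idx = valid & (maxc == g) & (maxc != r)
    hue[idx] = (b - r)[idx] / delta[idx] + 2
    idx = valid & (maxc == b) & (maxc != r) & (maxc != g)
    hue[idx] = (r - g)[idx] / delta[idx] + 4
    return hue / 6.0

def haze_mask(img, tau=0.1):
    """Flag as hazy the pixels whose hue barely changes under semi-inversion.

    Hazy pixels are close to the bright, low-saturation airlight color, so
    taking max(c, 1 - c) leaves them nearly unchanged; vivid haze-free pixels
    shift hue strongly. tau is an illustrative threshold, not a paper value.
    """
    dh = np.abs(rgb_to_hue(img) - rgb_to_hue(semi_inverse(img)))
    dh = np.minimum(dh, 1.0 - dh)  # hue is circular in [0, 1)
    return dh < tau
```

Because every step is an independent per-pixel operation, the mask can be computed in a single pass and parallelized trivially, which is what makes the detection cheap enough to precede airlight estimation.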
Contributions. This paper introduces the following three main contributions:
- first, we introduce a novel single-image algorithm for the automatic detection
of hazy regions.
- second, our approach works on a per-pixel basis, which makes it suitable for
parallelization and allows us to retain sharp detail near edges.
- finally, our layer-based fusion dehazing strategy yields results comparable to,
and often better than, those of existing approaches, while performing faster,
making it suitable for real-time applications.