Abstract

Conventional vision systems are designed to perform in clear weather. However, any outdoor vision system is incomplete without mechanisms that guarantee satisfactory performance under poor weather conditions. It is known that the atmosphere can significantly alter light energy reaching an observer. Therefore, atmospheric scattering models must be used to make vision systems robust in bad weather. In this paper, we develop a geometric framework for analyzing the chromatic effects of atmospheric scattering. First, we study a simple color model for atmospheric scattering and verify it for fog and haze. Then, based on the physics of scattering, we derive several geometric constraints on scene color changes caused by varying atmospheric conditions. Finally, using these constraints we develop algorithms for computing fog or haze color, performing depth segmentation, extracting three-dimensional structure, and recovering "true" scene colors from two or more images taken under different but unknown weather conditions.
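
The abstract's algorithms rest on the paper's dichromatic atmospheric scattering model: the color observed at a scene point is a combination of its direct-transmission color (attenuated with distance) and the color of airlight (the fog or haze itself), so the observations of any one point lie on a plane through the origin of color space, and the airlight direction lies in every such plane. The sketch below is illustrative only, not the authors' implementation: it assumes per-pixel RGB observations of the same scene under two unknown weather conditions and estimates the unit fog/haze color direction by intersecting the per-pixel dichromatic planes with an SVD. The function names and the toy simulation are hypothetical.

    import numpy as np

    def estimate_fog_color(E1, E2):
        """Estimate the unit fog/haze color direction from RGB observations
        E1, E2 (N x 3 arrays) of the same N scene points imaged under two
        different, unknown weather conditions.

        Dichromatic-model assumption: each point's two observations span its
        dichromatic plane (through the color-space origin), and the airlight
        color lies in all of these planes. We therefore take each plane's
        normal (cross product of the two observations) and find the direction
        most nearly orthogonal to every normal via SVD.
        """
        normals = np.cross(E1, E2)                        # N x 3 plane normals
        lengths = np.linalg.norm(normals, axis=1)
        keep = lengths > 1e-8                             # drop degenerate points
        normals = normals[keep] / lengths[keep, None]
        # The right singular vector with the smallest singular value is the
        # least-squares intersection direction of all dichromatic planes.
        _, _, vt = np.linalg.svd(normals, full_matrices=False)
        a_hat = vt[-1]
        if a_hat.sum() < 0:                               # resolve arbitrary sign
            a_hat = -a_hat
        return a_hat

    # Toy check: simulate scene points at different depths under two scattering
    # coefficients and recover the assumed fog color direction.
    rng = np.random.default_rng(0)
    fog = np.array([0.9, 0.9, 1.0])
    fog /= np.linalg.norm(fog)                            # assumed airlight color
    colors = rng.uniform(0.1, 1.0, size=(50, 3))          # "true" scene colors
    depths = rng.uniform(1.0, 10.0, size=(50, 1))

    def observe(beta):
        t = np.exp(-beta * depths)                        # direct transmission
        return t * colors + (1.0 - t) * fog               # plus additive airlight

    print(estimate_fog_color(observe(0.1), observe(0.3)))  # close to `fog`

In this noise-free simulation the plane normals all lie in the subspace orthogonal to the fog direction, so the smallest singular vector recovers it exactly; with real images the SVD gives a least-squares fit over all pixels.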

Keywords

Haze, Computer science, Atmospheric model, Computer vision, Chromatic scale, Observer (physics), Diffuse sky radiation, Artificial intelligence, Scattering, Atmosphere (unit), Segmentation, Atmospheric optics, Meteorology, Remote sensing, Environmental science, Optics, Geography, Physics

Publication Info

Year: 2002
Type: Article
Volume: 1
Pages: 598-605
Citations: 771
Access: Closed

Citation Metrics

OpenAlex: 771
Influential: 30
CrossRef: 523

Cite This

Srinivasa G. Narasimhan, Shree K. Nayar (2002). Chromatic framework for vision in bad weather. Proceedings IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2000 (Cat. No.PR00662), 1, 598-605. https://doi.org/10.1109/cvpr.2000.855874

Identifiers

DOI
10.1109/cvpr.2000.855874

Data Quality

Data completeness: 81%