
A Visibility Restoration Algorithm for Real-World Hazy Scenes

International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Year of Publication: 2017
Authors:
Sanjay Sharma, Padma J. Bonde, Samta Gajbhiye
10.5120/ijca2017914422

Sanjay Sharma, Padma J Bonde and Samta Gajbhiye. A Visibility Restoration Algorithm for Real-World Hazy Scenes. International Journal of Computer Applications 168(5):44-46, June 2017. BibTeX

@article{10.5120/ijca2017914422,
	author = {Sanjay Sharma and Padma J. Bonde and Samta Gajbhiye},
	title = {A Visibility Restoration Algorithm for Real-World Hazy Scenes},
	journal = {International Journal of Computer Applications},
	issue_date = {June 2017},
	volume = {168},
	number = {5},
	month = {Jun},
	year = {2017},
	issn = {0975-8887},
	pages = {44-46},
	numpages = {3},
	url = {http://www.ijcaonline.org/archives/volume168/number5/27875-2017914422},
	doi = {10.5120/ijca2017914422},
	publisher = {Foundation of Computer Science (FCS), NY, USA},
	address = {New York, USA}
}

Abstract

With the increase in industrial production and human activity, the concentration of atmospheric particulate matter (PM) has risen substantially, causing fog and haze to occur more frequently. The limited visibility produced by particles suspended in the air, such as fog and haze, is a major problem for many computer vision applications. Scenes captured by such vision systems suffer from poor visibility, low contrast, dimmed brightness, low luminance, and distorted color, which makes detecting objects within the scene more difficult. Improving the visibility, contrast, and features of images and videos captured in bad weather, a task also known as dehazing, is therefore essential.
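The degradation described above is commonly expressed through the atmospheric scattering model I(x) = J(x)t(x) + A(1 - t(x)), where I is the hazy observation, J the scene radiance, t the transmission, and A the atmospheric light. As a minimal sketch (not this paper's algorithm), the dark channel prior cited in the references can be used to invert that model; the patch size, omega, and transmission floor below are common illustrative defaults, not values taken from the paper:

```python
import numpy as np

def estimate_dark_channel(image, patch=15):
    """Per-pixel minimum over the color channels, then a local
    minimum filter over a patch x patch window (edge-padded)."""
    h, w, _ = image.shape
    min_channel = image.min(axis=2)
    pad = patch // 2
    padded = np.pad(min_channel, pad, mode="edge")
    dark = np.empty_like(min_channel)
    for i in range(h):
        for j in range(w):
            dark[i, j] = padded[i:i + patch, j:j + patch].min()
    return dark

def dehaze(image, omega=0.95, t0=0.1, patch=15):
    """Invert I = J*t + A*(1 - t) for an H x W x 3 float image in [0, 1]."""
    dark = estimate_dark_channel(image, patch)
    # Atmospheric light A: mean color of the brightest 0.1% dark-channel pixels.
    n = max(1, int(dark.size * 0.001))
    idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    A = image[idx].mean(axis=0)
    # Transmission estimate from the dark channel of the normalized image;
    # omega < 1 keeps a little haze for a natural look.
    t = 1.0 - omega * estimate_dark_channel(image / A, patch)
    t = np.clip(t, t0, 1.0)[..., None]  # floor t to avoid amplifying noise
    return np.clip((image - A) / t + A, 0.0, 1.0)
```

In practice the raw transmission map is refined (e.g. with the guided filter of reference 2) before inversion; this sketch omits that step.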

Motion detection is the first essential step in extracting information about moving objects, and it underpins applications such as tracking, classification, and recognition. A novel and accurate approach to motion detection for automated video surveillance systems has been adopted. Complete detection of moving objects is achieved through three proposed modules: a background modeling (BM) module, an alarm trigger (AT) module, and an object extraction (OE) module. Developing intelligent service mechanisms is a crucial and critical issue for human-community applications; given diverse and complicated service demands, perception and navigation are essential topics. First, a new augmented graph-based optimal estimation approach is derived for concurrent mechanism poses and moving-target course approximation. Moreover, the moving-object detection problems of a robot's indoor navigation are divided and conquered via multisensor fusion methodologies.
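The three-module pipeline above (BM, AT, OE) can be sketched as a minimal grayscale implementation, with a running-average background model standing in for the BM unit, a frame-level difference-energy test as the AT unit, and per-pixel thresholding as the OE unit. The parameter names and threshold values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def detect_moving_objects(frames, alpha=0.05, alarm_thresh=5.0, pix_thresh=25.0):
    """Return one boolean foreground mask per frame after the first.

    frames       : list of H x W grayscale arrays (0-255)
    alpha        : background-model learning rate (BM)
    alarm_thresh : mean-difference level that triggers extraction (AT)
    pix_thresh   : per-pixel difference level marking foreground (OE)
    """
    background = frames[0].astype(np.float64)
    masks = []
    for frame in frames[1:]:
        f = frame.astype(np.float64)
        diff = np.abs(f - background)
        if diff.mean() > alarm_thresh:
            # Alarm trigger fired: extract the moving-object mask.
            masks.append(diff > pix_thresh)
        else:
            # Quiet frame: report an empty mask, skip extraction.
            masks.append(np.zeros(f.shape, dtype=bool))
        # Background modeling: running-average update toward the new frame.
        background = (1.0 - alpha) * background + alpha * f
    return masks
```

The alarm-trigger stage lets the (cheap) frame-level test gate the (more expensive) per-pixel extraction, which is the rationale the abstract attributes to the AT unit.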

References

  1. Shih-Chia Huang, Jian-Hui Ye, and Bo-Hao Chen, "An Advanced Single-Image Visibility Restoration Algorithm for Real-World Hazy Scenes," IEEE Trans. Ind. Electron., vol. 62, no. 5, May 2015.
  2. Jiahao Pang, Oscar C. Au, and Zheng Guo, "Improved Single Image Dehazing Using Guided Filter," in Proc. APSIPA ASC, Xi'an, China, 2011.
  3. Pierre Charbonnier, Laure Blanc-Feraud, Gilles Aubert, and Michel Barlaud, "Deterministic Edge-Preserving Regularization in Computed Imaging," IEEE Trans. Image Process., vol. 6, no. 2, Feb. 1997.
  4. Anju Rani and Gagandeep Kaur, "Image Enhancement using Image Fusion Techniques," IJARCSSE, vol. 4, no. 9, pp. 413-416, Sep. 2014.
  5. Jagroop Kaur and Rajiv Mahajan, "Improved Degraded Document Image Binarization Using Guided Image Filter," IJARCSSE, vol. 4, no. 9, Sep. 2014.
  6. R. C. Luo and C. L. Chun, "Multisensor fusion-based concurrent environment mapping and moving object detection for intelligent service robotics," IEEE Trans. Ind. Electron., vol. 61, no. 8, pp. 4043-4051, Aug. 2014.
  7. H. Zhuang, K. S. Low, and W. Y. Yau, "Multichannel pulse-coupled-neural-network-based color image segmentation for object detection," IEEE Trans. Ind. Electron., vol. 59, no. 8, pp. 3299-3308, Aug. 2012.
  8. H. H. Kim, D. J. Kim, and K. H. Park, "Robust elevator button recognition in the presence of partial occlusion and clutter by specular reflections," IEEE Trans. Ind. Electron., vol. 59, no. 3, pp. 1597-1611, Mar. 2012.
  9. H. Rezaee and F. Abdollahi, "A decentralized cooperative control scheme with obstacle avoidance for a team of mobile robots," IEEE Trans. Ind. Electron., vol. 61, no. 1, pp. 347-354, Jan. 2014.
  10. J. S. Hu, J. J. Wang, and D. M. Ho, “Design of sensing system and anticipative behavior for human following of mobile robots,” IEEE Trans. Ind. Electron., vol. 61, no. 4, pp. 1916–1927, Apr. 2014.
  11. S. Hong, Y. Oh, D. Kim, and B. J. You, “Real-time walking pattern generation method for humanoid robots by combining feedback and feedforward controller,” IEEE Trans. Ind. Electron., vol. 61, no. 1, pp. 355–364, Jan. 2014.
  12. S. Lee et al., "A review on dark channel prior based image dehazing algorithms," EURASIP J. Image Video Process., vol. 2016, no. 1, p. 4, 2016.
  13. S. C. Huang, “An advanced motion detection algorithm with video quality analysis for video surveillance systems,” IEEE Trans. Circuits Syst. Video Technol., vol. 21, no. 1, pp. 1–14, Jan. 2011.
  14. X. Zhang, W. Hu, S. Chen, and S. Maybank, “Graph-embedding-based learning for robust object tracking,” IEEE Trans. Ind. Electron., vol. 61, no. 2, pp. 1072–1084, Feb. 2014.
  15. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, “Polarization based vision through haze,” Appl. Opt., vol. 42, no. 3, pp. 511–525, Jan. 2003.
  16. K. Tan and J. P. Oakley, “Enhancement of color images in poor visibility conditions,” in Proc. IEEE ICIP, Sep. 2000, vol. 2, pp. 788–791.
  17. R. Fattal, “Single image dehazing,” in Proc. ACM SIGGRAPH, 2008, pp. 1–7.
  18. S. C. Huang and B. H. Chen, “Highly accurate moving object detection in variable-bit-rate video-based traffic monitoring systems,” IEEE Trans. Neural Netw. Learn. Syst., vol. 24, no. 12, pp. 1920–1931, Dec. 2013.
  19. M. Chacon and S. Gonzalez, “An adaptive neural-fuzzy approach for object detection in dynamic backgrounds for surveillance systems,” IEEE Trans. Ind. Electron., vol. 59, no. 8, pp. 3286–3298, Aug. 2012.

Keywords

Retinex theory, visibility restoration, dehazing, edge-preserving regularization, fusion technique.