Research Article

Sensor Fusion of Laser & Stereo Vision Camera for Depth Estimation and Obstacle Avoidance

by Saurav Kumar, Daya Gupta, Sakshi Yadav
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 1 - Number 26
Year of Publication: 2010
Authors: Saurav Kumar, Daya Gupta, Sakshi Yadav
10.5120/485-795

Saurav Kumar, Daya Gupta, Sakshi Yadav. Sensor Fusion of Laser & Stereo Vision Camera for Depth Estimation and Obstacle Avoidance. International Journal of Computer Applications. 1, 26 (February 2010), 20-25. DOI=10.5120/485-795

@article{ 10.5120/485-795,
author = { Saurav Kumar, Daya Gupta, Sakshi Yadav },
title = { Sensor Fusion of Laser & Stereo Vision Camera for Depth Estimation and Obstacle Avoidance },
journal = { International Journal of Computer Applications },
issue_date = { February 2010 },
volume = { 1 },
number = { 26 },
month = { February },
year = { 2010 },
issn = { 0975-8887 },
pages = { 20-25 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume1/number26/485-795/ },
doi = { 10.5120/485-795 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Saurav Kumar
%A Daya Gupta
%A Sakshi Yadav
%T Sensor Fusion of Laser & Stereo Vision Camera for Depth Estimation and Obstacle Avoidance
%J International Journal of Computer Applications
%@ 0975-8887
%V 1
%N 26
%P 20-25
%D 2010
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Laser Range Finders (LRFs) have been widely used in robotics to generate very accurate 2-D maps of the environment perceived by an Autonomous Mobile Robot. Stereo vision devices, on the other hand, provide a 3-D view of the surroundings with a far greater range than an LRF, but at the cost of accuracy. This paper demonstrates a technique for fusing the information obtained from LRF and stereo vision camera systems so as to combine the accuracy of the former with the range of the latter. The 3-D point cloud obtained from the stereo vision camera is pruned to achieve computational efficiency in a real-time environment, after which the point cloud model is scaled down to a 2-D vision map to further reduce computational cost. The 2-D map of the camera is fused with the 2-D cost map of the LRF to generate a 2-D navigation map of the surroundings, which in turn is passed as an occupancy grid to VFH+ for obstacle avoidance and path planning. This technique has been successfully tested in outdoor environments on ‘Lakshya’, an IGV platform developed at Delhi College of Engineering.
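The pipeline described in the abstract (prune the stereo point cloud, flatten it to a 2-D map, then fuse it with the LRF cost map into an occupancy grid) can be sketched in outline. This is a minimal illustration, not the authors' implementation: the grid size, resolution, height band, 0–100 cost scale, and the element-wise-maximum fusion rule are all assumptions made here for the example.

```python
import numpy as np

def prune_point_cloud(points, max_range=10.0, z_min=0.1, z_max=2.0):
    """Keep only points within sensing range and an assumed obstacle height band.

    points: (N, 3) array of (x, y, z) in metres, robot at the origin.
    """
    dist = np.linalg.norm(points[:, :2], axis=1)
    mask = (dist <= max_range) & (points[:, 2] >= z_min) & (points[:, 2] <= z_max)
    return points[mask]

def project_to_2d_grid(points, resolution=0.1, size=100):
    """Flatten a pruned 3-D point cloud onto a 2-D grid centred on the robot."""
    grid = np.zeros((size, size), dtype=np.uint8)
    ix = (points[:, 0] / resolution + size // 2).astype(int)
    iy = (points[:, 1] / resolution + size // 2).astype(int)
    valid = (ix >= 0) & (ix < size) & (iy >= 0) & (iy < size)
    grid[iy[valid], ix[valid]] = 100  # mark cell as occupied (cost 100)
    return grid

def fuse_maps(lrf_grid, stereo_grid):
    """Conservative fusion: each cell takes the worse cost reported by either sensor."""
    return np.maximum(lrf_grid, stereo_grid)
```

The fused grid could then be handed to a VFH+ planner as its occupancy input; taking the per-cell maximum is a deliberately conservative choice, so that an obstacle seen by only one sensor is never discarded.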

References
  1. S. Thrun, D. Fox, and W. Burgard, “A real-time algorithm for mobile robot mapping with applications to multi-robot and 3D mapping”, in Proceedings of the IEEE Int. Conf. on Robotics and Automation (ICRA ’00), USA, 2000.
  2. Haris Baltzakis, Antonis Argyros, Panos Trahanias, “Fusion of laser and visual data for robot motion planning and collision avoidance”, Machine Vision and Applications (2003) 15: 92–100
  3. Mathias Perrollaz, Raphael Labayrade, Cyril Royere, Nicolas Hautiere, Didier Aubert, “Long Range Obstacle Detection Using Laser Scanner and Stereovision”, in Intelligent Vehicles Symposium 2006, June 13-15, 2006, Tokyo, Japan.
  4. C. Stiller, J. Hipp, C. Rossig, A. Ewald, “Multisensor obstacle detection and tracking”, Image and Vision Computing, Volume 18, Issue 5, April 2000, Pages 389-396.
  5. Romuald Aufrere, Jay Gowdy, Christoph Mertz, Chuck Thorpe, Chieh-Chih Wang, Teruko Yata, ”Perception for collision avoidance and autonomous driving”, Mechatronics, Volume 13, Issue 10, December 2003, Pages 1149-1161
  6. Zhuoyun Zhang, Chunping Hou, Lili Shen, Jiachen Yang, “An Objective Evaluation for Disparity Map based on the Disparity Gradient and Disparity Acceleration”, 2009 International Conference on Information Technology and Computer Science.
  7. K. L. Boyer, D. M. Wuescher, and S. Sarkar, “Dynamic Edge Warping: An Experimental System for Recovering Disparity Maps in Weakly Constrained Systems”, IEEE Transactions on Systems, Man, and Cybernetics, Vol. 21, No. 1, January/February 1991.
  8. R. Labayrade, D. Aubert, and J. P. Tarel, “Real time obstacle detection on non-flat road geometry through V-disparity representation”, in IEEE Intelligent Vehicle Symposium, Versailles, June 2002.
  9. Alper Yilmaz, “Sensor Fusion in Computer Vision”, IEEE Urban Remote Sensing Joint Event, 2007, 11-13 April 2007 Page(s):1 - 5
  10. M. A. Fischler and R. C. Bolles, “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography”, Communications of the ACM, 1981, 24(6), 381-395.
  11. Saurav Kumar, “Binocular Stereo Vision Based Obstacle Avoidance Algorithm for Autonomous Mobile Robots”, IEEE Advance Computing Conference 2009
  12. Iwan Ulrich and Johann Borenstein, “VFH+: Reliable Obstacle Avoidance for Fast Mobile Robots”, in 1998 IEEE International Conference on Robotics and Automation.
Index Terms

Computer Science
Information Sciences

Keywords

Sensor fusion, Stereovision, Laser range finder, Obstacle avoidance, Navigation map, 3D point cloud, Robotics