
Stereo Camera and LIDAR Sensor Fusion-Based Collision Warning System for Autonomous Vehicles

Publication Type : Book Chapter

Publisher : Springer Singapore

Source : Advances in Computational Intelligence Techniques, Springer Singapore, Singapore, pp. 239–252 (2020)

URL : https://doi.org/10.1007/978-981-15-2620-6_17

ISBN : 9789811526206

Campus : Coimbatore

School : School of Engineering

Center : Computational Engineering and Networking

Department : Electronics and Communication

Year : 2020

Abstract : This paper proposes a forward collision warning system for an autonomous vehicle based on a novel point-to-pixel multi-sensor data fusion algorithm that combines 2D LIDAR data with image pixel data from a stereo camera to detect, classify, and track obstacles in front of the vehicle in real time. The LIDAR and stereo camera were synchronized and calibrated, and the distance measurements obtained from both sensors were combined using a Kalman filter, performing multi-sensor data fusion in real time on an embedded platform. A region of interest (ROI) was selected from the camera image, and the fused distance data was overlaid on the corresponding contour. The distance and angle of the target were obtained from the LIDAR, and target classification was performed by applying the MobileNet SSD deep learning algorithm to the camera data. The root mean squared error (RMSE) and mean absolute error (MAE) of the proposed fusion algorithm are 93.802 mm and 83.453 mm, respectively, lower than those of the individual distance measurements from the stereo camera and the 2D LIDAR sensor. The uncertainty and variance of the fused measurements were also reduced.
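To illustrate the kind of Kalman-filter fusion the abstract describes, the sketch below fuses two synchronized range readings (stereo camera and 2D LIDAR) into one distance estimate with a scalar Kalman filter. It is a minimal illustration, not the paper's implementation: the class name, the noise variances, and the constant-distance process model are all assumptions chosen for clarity.

```python
import numpy as np  # kept for consistency with typical embedded-vision pipelines

class DistanceFusionKF:
    """Scalar Kalman filter fusing stereo-camera and LIDAR range measurements (mm)."""

    def __init__(self, init_distance, process_var=1.0,
                 stereo_var=150.0**2, lidar_var=30.0**2):
        self.x = init_distance      # state: distance to the obstacle ahead (mm)
        self.p = 1e4                # state variance (initial uncertainty)
        self.q = process_var        # process noise variance (assumed)
        self.r_stereo = stereo_var  # stereo-camera measurement noise (assumed)
        self.r_lidar = lidar_var    # 2D LIDAR measurement noise (assumed)

    def _update(self, z, r):
        # Standard scalar Kalman update with measurement z and noise variance r.
        k = self.p / (self.p + r)           # Kalman gain
        self.x = self.x + k * (z - self.x)  # corrected distance estimate
        self.p = (1.0 - k) * self.p         # reduced posterior variance
        return self.x

    def step(self, stereo_dist, lidar_dist):
        # Predict with a constant-distance model: variance grows by process noise.
        self.p += self.q
        # Sequentially fuse the two synchronized range measurements.
        self._update(stereo_dist, self.r_stereo)
        return self._update(lidar_dist, self.r_lidar)

if __name__ == "__main__":
    kf = DistanceFusionKF(init_distance=5000.0)  # obstacle roughly 5 m ahead
    fused = kf.step(stereo_dist=5120.0, lidar_dist=4985.0)
    print(f"fused distance: {fused:.1f} mm")
```

Because the posterior variance shrinks after each update, the fused estimate has lower uncertainty than either sensor alone, which is the effect the abstract reports for the fused measurements.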

Cite this Research Publication : A. Dinesh Kumar, Karthika, R., and Dr. Soman K. P., “Stereo Camera and LIDAR Sensor Fusion-Based Collision Warning System for Autonomous Vehicles”, in Advances in Computational Intelligence Techniques, S. Jain, Sood, M., and Paul, S., Eds. Singapore: Springer Singapore, 2020, pp. 239–252.
