
Driving Assistance System Based on Gaze Tracking and Road Scene Event Detection

School: School of Engineering

Project Incharge: Mrs. D. Bharathi
Co-Project Incharge: Sathish P

Driver inattention is a major contributor to highway crashes. The National Highway Traffic Safety Administration estimates that at least 25% of crashes involve some form of driver inattention. Distraction occurs when a driver is delayed in recognizing information needed to safely accomplish the driving task.

The objective is to develop an application that assists the driver in driving safely by giving alerts based on road scene events and gaze direction estimation. A driver assistance system is implemented that is not only responsive to the road environment and the driver’s actions but is also designed to correlate the driver’s eye gaze with road events to determine what the driver has observed. We present a prototype driving assistance system that combines driver monitoring with road scene feature extraction to monitor the driver’s observations and thereby enhance vehicle and road safety.

A new method for head posture and gaze direction estimation is proposed. First, three head-position models are established, and posture is judged from the attributes of the triangle formed by the eyes and mouth. For gaze direction estimation, the pupil is located in the eye region using the Hough transform. Using horizontal and vertical projections together with prior knowledge of eye shape, the outline of the eye in its normal state is fitted. Finally, gaze direction is estimated from the position of the pupil within the normal-state eye outline and the head posture.
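As a rough illustration of the pupil-localization step described above, the sketch below applies OpenCV's circular Hough transform to a cropped grayscale eye patch. The function name `locate_pupil` and all parameter values are assumptions for illustration, not details taken from the project.

```python
import cv2
import numpy as np

def locate_pupil(eye_gray):
    """Return (x, y, r) of the most likely pupil circle in a grayscale eye patch,
    or None if no circle is found. Parameters are illustrative assumptions."""
    blurred = cv2.medianBlur(eye_gray, 5)          # suppress eyelash/reflection noise
    circles = cv2.HoughCircles(
        blurred,
        cv2.HOUGH_GRADIENT,
        dp=1,
        minDist=eye_gray.shape[1],                 # expect a single pupil per eye patch
        param1=60,                                 # Canny high threshold
        param2=20,                                 # accumulator threshold (lower = more candidates)
        minRadius=3,
        maxRadius=eye_gray.shape[0] // 2,
    )
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return x, y, r
```

The returned pupil position, compared against the fitted normal-state eye outline and the estimated head posture, would then feed the gaze direction estimate described above.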

The proposed system monitors road scene events such as pedestrians, sign boards, and lane markings. Pedestrian detection is performed using HOG descriptors and an SVM classifier. The system detects and tracks road lane markers in the video sequence using the Hough transform and a Kalman filter, and notifies the driver when the vehicle drifts across a lane. Traffic sign board detection and recognition is performed using a pattern-matching algorithm. Finally, the system reports the spatial location of each detected object along with its relative position information. The driving assistance system (DAS) correlates the driver’s estimated eye gaze direction with these road scene events to determine what the driver has observed. Illustrative sketches of the pedestrian detection and lane tracking steps follow.
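The pedestrian-detection step (HOG descriptors with an SVM classifier) can be sketched as below. This minimal example assumes OpenCV's pre-trained default people detector in place of the project's own trained SVM, and the helper name `detect_pedestrians` is hypothetical.

```python
import cv2

# HOG feature extractor with OpenCV's built-in linear SVM people detector (assumption:
# the project may train its own SVM; this stands in for that classifier).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame_bgr):
    """Return bounding boxes (x, y, w, h) of detected pedestrians in a video frame."""
    rects, weights = hog.detectMultiScale(
        frame_bgr,
        winStride=(8, 8),   # sliding-window step
        padding=(8, 8),
        scale=1.05,         # image pyramid scale factor
    )
    return [tuple(r) for r in rects]
```

In the full system, each detected box would be correlated with the estimated gaze direction to decide whether the driver has noticed the pedestrian and whether an alert is needed.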
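Similarly, here is a minimal sketch of the lane-tracking step, assuming a Canny edge map, the probabilistic Hough transform for line segments, and a simple Kalman filter that smooths the slope and intercept of the dominant lane marker across frames. The state layout, thresholds, and the helper `detect_lane_line` are assumptions for illustration.

```python
import cv2
import numpy as np

# Kalman filter over (slope, intercept) and their rates of change (assumed state layout).
kf = cv2.KalmanFilter(4, 2)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)

def detect_lane_line(frame_bgr):
    """Detect the dominant lane marker in the lower half of the frame and return a
    Kalman-smoothed (slope, intercept) estimate, or None if nothing is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    h, w = edges.shape
    edges[: h // 2, :] = 0                         # keep only the road region
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)
    kf.predict()
    if lines is None:
        return None
    # Use the longest detected segment as the measurement.
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    if x2 == x1:
        return None
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    smoothed = kf.correct(np.array([[slope], [intercept]], np.float32))
    return float(smoothed[0, 0]), float(smoothed[1, 0])
```

Tracking the smoothed line frame to frame is what lets the system notify the driver when the vehicle drifts across the lane marking.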

Related Projects

Testing of a Water Hydration-Dehydration Unit
Developing a Sustainable Education System for Rural Chhattisgarh
Video Analytics based Identification and Tracking in Smart Spaces
Lane Changing and Overtaking Decision Autonomous Vehicles
Mu-share: Multi-user Realtime Shared Access platform for Remote Experimentation