Project In-charge: 
Mrs. D. Bharathi
Co-Project In-charge: 
Sathish P
Department: 
Computer Science Engineering
School: 
School of Engineering

Driver inattention is a major contributor to highway crashes. The National Highway Traffic Safety Administration (NHTSA) estimates that at least 25% of crashes involve some form of driver inattention. Distraction occurs when a driver is delayed in recognizing the information needed to safely accomplish the driving task.

The objective is to develop an application that assists the driver in driving safely by giving alerts based on road scene events and gaze direction estimation. The driver assistance system implemented is not only responsive to the road environment and the driver’s actions but is also designed to correlate the driver’s eye gaze with road events to determine what the driver has observed. We present a prototype Driver Assistance System (DAS) that combines driver monitoring with road scene feature extraction to monitor the driver’s observations, enhancing vehicle and road safety.

A new method for head posture and gaze direction estimation is proposed. First, three head-position models are established, and posture is judged from the attributes of the triangle formed by the eyes and mouth. For gaze direction estimation, the pupil is located in the eye region using the Hough transform. With horizontal and vertical projection and prior knowledge of eye shape, the normal eye outline is fitted. Finally, gaze direction is estimated from the position of the pupil within the normal-state eye together with the head posture. 
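The final step above can be sketched in a few lines. This is a minimal illustrative example, not the authors' implementation: it assumes the pupil center and an eye bounding box have already been obtained from the Hough transform and the fitted eye outline, and the 0.35/0.65 thresholds are illustrative assumptions.

```python
# Illustrative sketch: classify gaze direction from the pupil's normalized
# position inside the eye bounding box. The thresholds (0.35 / 0.65) and the
# box-based eye model are assumptions, not values from the proposed system.

def estimate_gaze(pupil_x, pupil_y, eye_left, eye_top, eye_right, eye_bottom):
    """Return a (horizontal, vertical) gaze label from the pupil position."""
    # Normalize pupil coordinates to [0, 1] within the eye bounding box.
    nx = (pupil_x - eye_left) / float(eye_right - eye_left)
    ny = (pupil_y - eye_top) / float(eye_bottom - eye_top)

    if nx < 0.35:
        horizontal = "left"
    elif nx > 0.65:
        horizontal = "right"
    else:
        horizontal = "center"

    if ny < 0.35:
        vertical = "up"
    elif ny > 0.65:
        vertical = "down"
    else:
        vertical = "center"
    return horizontal, vertical

# Example: pupil near the left edge of the eye, vertically centered.
print(estimate_gaze(pupil_x=12, pupil_y=20, eye_left=10, eye_top=10,
                    eye_right=50, eye_bottom=30))  # -> ('left', 'center')
```

In a full system this classification would additionally be corrected by the estimated head posture, since the same pupil offset means a different gaze direction when the head is turned.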

The proposed system monitors road scene events such as pedestrians, sign boards, and lane markings. Pedestrian detection uses HOG descriptors with an SVM classifier. The system detects and tracks road lane markers in the video sequence using the Hough transform and a Kalman filter, and notifies the driver when the vehicle drifts across a lane. Traffic sign detection and recognition are performed with a pattern-matching algorithm. Finally, the system reports the spatial location of each detected object along with its relative information. The DAS correlates the driver’s estimated eye-gaze direction with these road scene events to determine the driver’s observations.
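The Kalman-filter stage of lane tracking can be sketched as follows. This is a minimal illustrative example under assumed noise parameters, not the system's actual code: it treats the Hough transform's per-frame lane-marker offset as a noisy scalar measurement and smooths it with a one-dimensional Kalman filter.

```python
# Illustrative sketch of the Kalman-filter step in lane tracking: the Hough
# transform yields a noisy lateral lane-offset measurement each frame, and a
# scalar (constant-position) Kalman filter smooths the track over time.
# process_var and meas_var are assumed noise parameters, chosen for the demo.

def kalman_track(measurements, process_var=1e-3, meas_var=0.5):
    """Smooth a sequence of noisy lane-offset measurements."""
    x = measurements[0]   # state estimate: lateral lane offset
    p = 1.0               # variance of the estimate
    smoothed = [x]
    for z in measurements[1:]:
        # Predict: the offset is assumed constant; uncertainty grows.
        p += process_var
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + meas_var)
        x += k * (z - x)
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed

# Example: a true offset near 2.0 observed through frame-to-frame noise.
noisy = [2.3, 1.8, 2.1, 1.9, 2.2, 2.0, 1.7, 2.4]
print([round(v, 2) for v in kalman_track(noisy)])
```

Because each update is a convex blend of the prediction and the new measurement, the smoothed track jitters far less than the raw Hough output, which is what makes the lane-departure notification stable from frame to frame.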

 
