
Pose invariant method for emotion recognition from 3D images

Publication Type : Conference Proceedings

Publisher : Institute of Electrical and Electronics Engineers Inc.

Source : 12th IEEE International Conference Electronics, Energy, Environment, Communication, Computer, Control: (E3-C3), INDICON 2015, Institute of Electrical and Electronics Engineers Inc. (2015)

Url : https://www.scopus.com/inward/record.uri?eid=2-s2.0-84994336765&partnerID=40&md5=19406820fb87bd2a39558f3ef0d247b8

ISBN : 9781467373999

Keywords : Classification (of information), Emotion recognition, Emotional state, extraction, Face recognition, facial expressions, Feature extraction, Feature points, Geometric feature, Human robot interaction, Human robot Interaction (HRI), intelligent robots, Neural networks, Non-frontal views, Pose invariant, Robots, Speech recognition

Campus : Bengaluru

School : Department of Computer Science and Engineering, School of Engineering

Department : Computer Science, Electronics and Communication

Year : 2015

Abstract : Information about the emotional state of a person can be inferred from facial expressions. Emotion recognition has become an active research area in recent years in fields such as Human Robot Interaction (HRI), medicine, and intelligent vehicles. The challenges in recognizing emotion from images with pose variations motivate researchers to explore further. In this paper, we propose a method based on geometric features, considering images at 7 yaw angles (-45°, -30°, -15°, 0°, +15°, +30°, +45°) from the BU3DFE database. Most reported work has considered only positive yaw angles; in this work, we include both positive and negative yaw angles. In the proposed method, feature extraction is carried out by concatenating distance and angle vectors between the feature points, and classification is performed using a neural network. The results obtained for images with pose variations are encouraging and comparable with the literature on pitch and yaw angles. Using the proposed method, non-frontal views achieve accuracy similar to that of the frontal view, making the method pose invariant. The proposed method may be implemented for pitch and yaw angles in the future.
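The feature-extraction step described in the abstract (concatenating distance and angle vectors between facial feature points) can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's exact pipeline: the landmark coordinates, the all-pairs pairing scheme, and the 2D in-plane angle are placeholders for the paper's BU3DFE feature points.

```python
import numpy as np

def geometric_features(points):
    """Build a feature vector from facial landmark points by
    concatenating pairwise Euclidean distances and pairwise angles.
    The all-pairs scheme and 2D angles are assumptions for illustration.
    """
    n = len(points)
    dists, angles = [], []
    for i in range(n):
        for j in range(i + 1, n):
            d = points[j] - points[i]
            dists.append(np.linalg.norm(d))        # distance between landmarks
            angles.append(np.arctan2(d[1], d[0]))  # in-plane angle of the pair
    # Concatenated distance and angle vectors form the final descriptor
    return np.concatenate([np.array(dists), np.array(angles)])

# Hypothetical example: 4 feature points -> C(4,2) = 6 pairs -> 12-dim vector
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
fv = geometric_features(pts)
print(fv.shape)
```

In the paper, a descriptor of this kind would then be fed to a neural-network classifier trained on the seven yaw-angle views; the classifier itself is omitted here.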

Cite this Research Publication : Dr. Suja P., Krishnasri, D., and Dr. Shikha Tripathi, “Pose invariant method for emotion recognition from 3D images”, 12th IEEE International Conference Electronics, Energy, Environment, Communication, Computer, Control: (E3-C3), INDICON 2015. Institute of Electrical and Electronics Engineers Inc., 2015.
