Current eLearning systems enable streaming of live lectures to distant students, facilitating live instructor-student interaction. However, studies have shown a marked divide between the experience of local students (those present at the teacher's location) and that of distant students. One of the major factors contributing to this rift is the lack of gaze-aligned interaction. In this paper, we present a system architecture that receives gesture triggers as input and dynamically calculates the perspective angle of the speaking participant to be captured for the listener, facilitating eye contact. The gesture triggers are computed using a Microsoft Kinect sensor, which extracts the instructor's skeleton joint information and performs gesture recognition on the acquired joint data in real time. These serve as interaction-initiation triggers for dynamic perspective correction for gaze alignment during a conversation. For evaluation, we constructed a five-classroom test-bed with dynamic perspective correction; user-study results indicate a marked 42% enhancement in experience with gaze correction in place. © 2017 IOS Press and the authors. All rights reserved.
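The core idea described above (a skeleton-joint gesture acting as a trigger, followed by computation of a perspective angle toward the listener) can be illustrated with a minimal sketch. This is not the paper's implementation: the joint names, coordinate convention (x horizontal, y vertical, z depth, as in Kinect skeleton space), and the hand-above-head gesture are all assumptions made for illustration only.

```python
import math

def hand_raised(joints):
    """Hypothetical gesture trigger: fires when the right hand is
    above the head in the vertical (y) axis of the joint frame."""
    return joints["right_hand"][1] > joints["head"][1]

def perspective_angle(speaker_xz, listener_xz):
    """Horizontal angle (degrees) from the speaker toward the listener,
    which a system could use to select the camera perspective to capture."""
    dx = listener_xz[0] - speaker_xz[0]
    dz = listener_xz[1] - speaker_xz[1]
    return math.degrees(math.atan2(dz, dx))

# Example skeleton frame: (x, y, z) positions in meters (made-up values).
joints = {"head": (0.0, 1.7, 2.0), "right_hand": (0.3, 1.9, 2.0)}

if hand_raised(joints):
    # Gesture detected: compute the perspective angle for the listener.
    angle = perspective_angle((0.0, 2.0), (1.0, 3.0))
    print(round(angle, 1))  # 45.0
```

In a real pipeline, the gesture test would run on each skeleton frame streamed from the Kinect SDK, and the resulting angle would drive the dynamic perspective-correction stage.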
cited By 1
N. Ramkumar, P. V. Rangan, U. Gopalakrishnan, and B. Hariharan, "Gesture triggered, dynamic gaze alignment architecture for intelligent eLearning systems," Journal of Intelligent and Fuzzy Systems, vol. 32, pp. 2963-2969, 2017.