Interpreting sign language gestures is challenging for the vast majority of people unaccustomed to interacting with the speech-impaired, which further limits the interaction that speech-impaired people can have with society at large. To address this issue, a machine learning based, wearable, low-cost system for identifying dynamic Indian sign language gestures is presented. The system uses machine learning to identify the sign made by the wearer and then plays aloud the audio clip corresponding to that gesture, so that the listener understands its meaning. A prototype system that can identify and interpret a total of fifty-six dynamic Indian sign language gestures has been designed and tested. The effectiveness of three machine learning approaches, based on Naive Bayes, Decision Trees, and Support Vector Machines respectively, has also been evaluated. Results indicate that the proposed system identifies gestures with an accuracy greater than 93%. © 2016 IEEE.
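The abstract names three classifiers but gives no implementation detail; as a purely illustrative sketch, a Gaussian Naive Bayes classifier (one of the three approaches mentioned) can be applied to sensor feature vectors such as flex-sensor readings. The feature layout, gesture labels, and helper names below are assumptions for illustration, not the authors' actual pipeline.

```python
import math
from collections import defaultdict

def train_gaussian_nb(samples):
    """Fit a Gaussian Naive Bayes model.

    samples: list of (feature_vector, label) pairs, e.g. normalized
    flex-sensor readings per gesture (hypothetical data layout).
    Returns {label: (log_prior, per-feature means, per-feature variances)}.
    """
    by_label = defaultdict(list)
    for x, y in samples:
        by_label[y].append(x)
    stats, total = {}, len(samples)
    for y, xs in by_label.items():
        n = len(xs)
        means = [sum(col) / n for col in zip(*xs)]
        # Small variance floor avoids division by zero on constant features.
        varis = [sum((v - m) ** 2 for v in col) / n + 1e-6
                 for col, m in zip(zip(*xs), means)]
        stats[y] = (math.log(n / total), means, varis)
    return stats

def predict(stats, x):
    """Return the label maximizing the Gaussian log-posterior of x."""
    def log_post(log_prior, means, varis):
        return log_prior + sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, m, v in zip(x, means, varis))
    return max(stats, key=lambda y: log_post(*stats[y]))

# Toy example: two hypothetical gestures with distinct sensor signatures.
training = [
    ((1.0, 0.1), "hello"), ((1.1, 0.0), "hello"), ((0.9, 0.2), "hello"),
    ((0.0, 1.0), "thanks"), ((0.1, 1.1), "thanks"), ((0.2, 0.9), "thanks"),
]
model = train_gaussian_nb(training)
print(predict(model, (1.0, 0.1)))   # classifies a "hello"-like reading
print(predict(model, (0.1, 1.0)))   # classifies a "thanks"-like reading
```

In a real system, per-gesture accuracy would be compared across the three classifier families on held-out recordings, as the paper reports doing.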
Conference: 1st International Conference on Robotics and Automation for Humanitarian Applications (RAHA 2016); Conference Date: 18 December 2016 through 20 December 2016; Conference Code: 128160
A. K. Singh, B. P. John, S. R. Venkata Subramanian, S. A. Kumar, and B. B. Nair, “A low-cost wearable Indian sign language interpretation system”, in International Conference on Robotics and Automation for Humanitarian Applications, RAHA 2016 - Conference Proceedings, 2017.