<p>Deaf and mute people communicate among themselves using sign languages, but they find it difficult to communicate with the outside world. This paper proposes a method to convert Indian Sign Language (ISL) hand gestures into appropriate text messages. The hand gestures corresponding to the ISL English alphabet are captured through a webcam. In the captured frames the hand is segmented, and the state of the fingers is used to recognize the alphabet. Features such as the angles between fingers, the number of fingers that are fully open, fully closed, or semi-closed, and the identification of each finger are used for recognition. Experiments were carried out on single-hand alphabets and the results are summarised. © 2012 IEEE.</p>
R. K. Shangeetha, V. Valliammai, and S. Padmavathi, "Computer vision based approach for Indian sign language character recognition", 2012 International Conference on Machine Vision and Image Processing (MVIP 2012), Coimbatore, Tamil Nadu, pp. 181-184, 2012.