Sign language is a visual gesture language used by speech-impaired people to convey their thoughts and ideas through hand gestures and facial expressions. This paper presents a stroke-based representation of dynamic gestures of Indian Sign Language signs, incorporating both local and global motion information. This compact representation of a gesture is analogous to the phonemic representation of speech signals. To incorporate the local motion of the hand, each stroke also contains features corresponding to the hand shape. The dynamic gesture trajectories are segmented based on Maximum Curvature Points (MCPs), which are selected at points where the direction of the trajectory changes. The frames corresponding to the MCPs of the trajectory are taken as key frames, and the local motion features are derived from the hand shapes of these key frames. Existing methods of sign language recognition have scalability problems, in addition to high complexity and the need for extensive training data. In contrast, our proposed stroke-based representation has a less expensive training phase, since it requires training only the stroke features and the stroke sequence of each word. Our algorithms also address the issue of scalability. We have tested our approach in the context of Indian Sign Language recognition, and we present the results of this study.
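The MCP-based segmentation described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the use of the hand-centroid trajectory, and the angle threshold are all assumptions, since the abstract only states that MCPs are chosen where the trajectory's direction changes.

```python
import numpy as np

def maximum_curvature_points(trajectory, angle_threshold_deg=30.0):
    """Return indices of trajectory points where direction changes sharply.

    trajectory: sequence of (x, y) hand positions, one per frame.
    angle_threshold_deg: hypothetical cutoff for a "sharp" direction
    change; the paper does not specify a value.
    """
    pts = np.asarray(trajectory, dtype=float)
    # Direction of motion between consecutive points.
    deltas = np.diff(pts, axis=0)
    angles = np.arctan2(deltas[:, 1], deltas[:, 0])
    # Direction change at each interior point, wrapped to [-pi, pi].
    turns = np.abs((np.diff(angles) + np.pi) % (2 * np.pi) - np.pi)
    threshold = np.deg2rad(angle_threshold_deg)
    # Interior indices where the turn angle exceeds the threshold
    # are treated as Maximum Curvature Points (key frames).
    return [i + 1 for i, t in enumerate(turns) if t >= threshold]
```

For an L-shaped trajectory such as `[(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]`, the corner at index 2 is detected as the single MCP, and the frame at that index would serve as a key frame for extracting the hand-shape features.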
M. Geetha, P. V. Aswathi, and M. R. Kaimal, "A Stroke Based Representation of Indian Sign Language Signs Incorporating Global and Local Motion Information," in Second International Conference on Advanced Computing, Networking and Security, 2013.