Publication Type:

Conference Paper

Source:

ACM International Conference Proceeding Series, Association for Computing Machinery, pp. 28-32 (2018)

ISBN:

9781450364706

URL:

https://www.scopus.com/inward/record.uri?eid=2-s2.0-85055323518&doi=10.1145%2f3232651.3232670&partnerID=40&md5=ebc0a1d906c2d0dcb4bde106fb45ed87

Keywords:

Computer vision, Diagnosis, Flex sensor, Gaming, Inertial measurement unit, Medical fields, Robotics, Sign language, Smart gloves, Speech communication, State estimation methods

Abstract:

Speechlessness is a colossal barrier to communication between speech-impaired people and others. This paper presents a novel methodology, with a working model, for converting sign language to speech in order to help speech-impaired people. It uses flex sensors and an Inertial Measurement Unit (IMU) to determine the positions of the fingers as well as the position of the hand in 3-dimensional space. These sensors are embedded in the gloves; their readings, once processed, yield the exact gesture made by the user, thereby making the gloves smart. The paper also presents a unique division of the 3-dimensional workspace into a matrix of states. These states are estimated in a generalized manner so that the system can be used by anybody, irrespective of gender or height. The paper also highlights the use of both hands for compound two-hand gestures, with both static and dynamic gesture recognition. The smart gloves can further be used in a variety of applications such as motion-sensing gaming, remote medical diagnosis, and robotics. © 2018 Association for Computing Machinery.
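The abstract's idea of dividing the 3-D workspace into a matrix of states, generalized so it works regardless of the user's height, can be sketched as follows. This is not the authors' implementation; the function name, the per-axis resolution `n`, and the per-user `reach` calibration parameter are illustrative assumptions. The sketch normalizes an IMU-derived hand position by the user's arm reach and bins it into one of n³ grid states:

```python
def position_to_state(pos, reach, n=3):
    """Map a 3-D hand position to a single state index in an
    n x n x n workspace grid (hypothetical sketch, not the paper's code).

    pos   -- (x, y, z) position in metres, relative to the shoulder,
             as estimated from the glove's IMU
    reach -- the user's arm reach in metres (per-user calibration,
             so the same states apply irrespective of height)
    n     -- grid resolution per axis (assumed value)
    """
    indices = []
    for coord in pos:
        # Normalize the coordinate to [0, 1) over [-reach, +reach] ...
        frac = (coord + reach) / (2.0 * reach)
        # ... then bin it, clamping to the valid grid range.
        idx = min(max(int(frac * n), 0), n - 1)
        indices.append(idx)
    x, y, z = indices
    return (x * n + y) * n + z  # flatten to a single state number
```

A full pipeline would combine this workspace state with the flex-sensor readings for each finger to classify the compound gesture; because positions are normalized by `reach`, two users of different heights making the same gesture land in the same state.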

Notes:

Cited by: 0; Conference: 2018 International Conference on Control and Computer Vision (ICCCV 2018); Conference date: 15 June 2018 through 18 June 2018; Conference code: 140160

Cite this Research Publication

H. Bhardawaj, M. Dhaker, K. Sivani, and H. R. Nandi Vardhan, “Smart gloves: A novel 3-D work space generation for compound two hand gestures”, in ACM International Conference Proceeding Series, 2018, pp. 28-32.