Systematic Literature Survey on Sign Language Recognition Systems

Ashok Kumar L. (PSG College of Technology, India), Karthika Renuka D. (PSG College of Technology, India), and Raajkumar G. (PSG College of Technology, India)
DOI: 10.4018/978-1-6684-6001-6.ch012

Abstract

Recently, sign language recognition has received a lot of attention in computer vision. Sign language is a method of conveying messages by using the hand, arm, body, and face to express thoughts and meanings. Sign languages, like spoken languages, arise and develop naturally within deaf communities. Sign language is nevertheless not universal: there is no globally recognised and accepted sign language for all deaf and hard-of-hearing people. Each nation has its own sign language with a significant degree of grammatical variation, just as it does with its spoken language.

Literature Survey

Acquisition Using Wearable Computing

Wearable computing approaches to sign language data acquisition provide a precise means of capturing information about the signer's hand movements and hand shape. Each sensing technology differs in several respects, including accuracy, resolution, range of motion, user comfort, and cost.

Berman (2011) proposed an affordable visual motion data glove with high recognition accuracy. In place of the more widely used motion-sensing fibres or multi-channel recording, the glove employed single-channel video, with an iterative estimation procedure to compensate for the limitations of single-channel recording. The motion of the hand was captured with a monocular camera, after which a vision-based estimation step identified the optical markers and reconstructed the 3D positions of the joints and fingers. In MATLAB, three different cases (left/right clicks, numerals, and the OK sign) were processed and rendered as 3D graphics.
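To make the marker-detection step concrete, the following is a minimal sketch of locating coloured glove markers in a single monocular frame, written in Python with OpenCV rather than MATLAB. The HSV colour range, blob-size threshold, camera index, and OpenCV 4 return convention are illustrative assumptions, not details taken from Berman's system.

import cv2
import numpy as np

# Assumed HSV range for green markers; real markers would need calibration.
MARKER_HSV_LOW = np.array([40, 80, 80])
MARKER_HSV_HIGH = np.array([80, 255, 255])

def detect_markers(frame_bgr):
    """Return 2D centroids of candidate glove markers in one BGR frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, MARKER_HSV_LOW, MARKER_HSV_HIGH)
    # OpenCV 4.x returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 50:  # ignore tiny blobs (noise)
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # monocular camera, index assumed
    ok, frame = cap.read()
    if ok:
        print("marker centroids:", detect_markers(frame))
    cap.release()

Recovering 3D joint positions from these 2D centroids would additionally require a calibrated camera model and a hand kinematics model, which is the role of the estimation step described above.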

Madeo (2013) used the KHU-1 data glove to create a 3D hand motion tracking and gesture recognition framework. A Bluetooth module was used to connect the data glove to a PC. The system was capable of recognising hand movements such as fist clenching, hand spreading, and bending. Over 50 trials, three gestures (scissors, rock, and paper) were tested with 100% accuracy. Although 3D recognition and wireless transmission were significant advancements, they introduced a time lag.
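As a rough illustration of how such glove readings might be turned into gestures, the sketch below classifies rock, paper, and scissors from five per-finger bend values. The 0-1 flex scale, threshold, and finger names are assumptions made for illustration; they are not the KHU-1 glove's actual interface or Madeo's classifier.

from typing import Dict

BENT = 0.6  # assumed threshold: a flex value above this means the finger is curled

def classify_gesture(flex: Dict[str, float]) -> str:
    """Map five flex-sensor readings (0 = straight, 1 = fully bent) to a gesture."""
    fingers = ["thumb", "index", "middle", "ring", "pinky"]
    bent = [flex[f] > BENT for f in fingers]
    if all(bent):
        return "rock"      # fist: every finger curled
    if not any(bent):
        return "paper"     # open hand: every finger extended
    if not bent[1] and not bent[2] and bent[3] and bent[4]:
        return "scissors"  # index and middle extended, the others curled
    return "unknown"

# Example reading, e.g. parsed from a (hypothetical) Bluetooth serial packet
print(classify_gesture(
    {"thumb": 0.9, "index": 0.1, "middle": 0.2, "ring": 0.8, "pinky": 0.85}))

A rule-based mapping like this is only a stand-in; a tracking framework such as the one surveyed would typically operate on time series of joint angles rather than single static readings.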

Witt (2007) devised a method for integrating glove-based devices into various applications using a context system. As demonstrated, the glove synchronised with an electronic device could be used in three different ways: to move, zoom, and select parts of a presentation; to operate a controller shown on a display; and to control a toy robot's left/right and backward/forward movements. One issue was that, while the device could detect movement about the X and Y axes, it could not detect rotation about the Z axis, i.e. yaw. Furthermore, recognition accuracy was sacrificed in order to achieve wearability, light weight, and an appealing appearance.
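The sketch below shows, under assumed inputs, how two-axis tilt readings could be mapped to the kind of discrete left/right and backward/forward commands mentioned above. The normalised axis range, dead zone, and command names are hypothetical; the deliberate absence of a yaw input mirrors the Z-axis limitation noted for the device.

DEAD_ZONE = 0.15  # ignore small tilts so a resting hand does not trigger commands

def tilt_to_command(x: float, y: float) -> str:
    """Translate normalised X/Y tilt readings (-1.0 to 1.0) into a discrete command.
    With only X and Y sensing there is no yaw (rotation about Z) to interpret."""
    if abs(x) < DEAD_ZONE and abs(y) < DEAD_ZONE:
        return "idle"
    if abs(x) >= abs(y):
        return "move_right" if x > 0 else "move_left"
    return "move_forward" if y > 0 else "move_backward"

# Example: a strong tilt to the left with a slight forward component
print(tilt_to_command(-0.7, 0.1))  # -> "move_left"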
