Research Paper | Computer Engineering | Iraq | Volume 9 Issue 5, May 2020
Deep Learning-based Deaf & Mute Gesture Translation System
Azher Atallah Fahad, Hassan Jaleel Hassan, Salma Hameedi Abdullah
Abstract: A Translation System (TS) is a means by which Deaf individuals can interact with hearing people. Since computers are an integral part of modern life, progress in human-computer interaction (HCI) has benefited people with disabilities. The main purpose of the proposed system is to develop an intelligent system that acts as a translator between deaf or mute individuals and hearing people, providing an effective and efficient communication path between people with speech impairments and the rest of the community. The proposed system uses a Convolutional Neural Network (CNN), a deep learning architecture, to extract hand features and classify hand signs of the American Sign Language (ASL). This paper describes a system built to interpret ASL and also provides an overview of deep learning-based methodologies for sign language recognition. The proposed solution was tested on samples from ASL data sets and achieved an overall accuracy of 96.68 %. The system proved appropriate and reliable for Deaf individuals. Furthermore, it offers an efficient and low-cost Hand Gesture Recognition (HGR) pipeline for a real-time video stream from a mobile device camera. A separate individual hand gesture is used for validation in this article. The system is designed to operate with the hand held in front of the camera, and the output is given in the form of text or audio.
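To make the kind of pipeline described above concrete, the following sketch builds a small convolutional classifier for ASL hand-sign images using Keras. It is a minimal illustration only: the input resolution, number of classes, layer sizes, and training configuration are assumptions chosen for the example, not the architecture reported in the paper.

# Minimal sketch (not the authors' published architecture): a small CNN for
# classifying ASL hand-sign images, assuming 64x64 grayscale inputs and
# 26 letter classes. All layer sizes here are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26           # assumed: one class per ASL letter
INPUT_SHAPE = (64, 64, 1)  # assumed: 64x64 grayscale hand images

def build_asl_cnn():
    """Build a small CNN: stacked conv/pool blocks followed by a dense head."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_asl_cnn()
    model.summary()  # inspect the layer stack and parameter counts

In a real-time setting, each video frame from the mobile camera would be segmented and resized to the assumed input shape before being passed to the classifier, and the predicted label would then be rendered as text or synthesized as audio.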
Keywords: Human-Computer Interaction (HCI), Hand Gesture Recognition (HGR), American Sign Language (ASL), Image Segmentation, Convolutional Neural Network (CNN)
Edition: Volume 9 Issue 5, May 2020
Pages: 288 - 292