Please use this identifier to cite or link to this item: https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4535
Title: Sinhala Sign Language to Text Interpreter based on Machine Learning
Authors: Peiris, W. D. T.
Issue Date: 11-Aug-2021
Abstract: Sinhala sign language is the primary mode of communication among hearing-impaired Sri Lankans. Their main difficulty in interacting with the general public is that the majority of the population does not know how to interpret sign language. This gap in communication and comprehension is disadvantageous for the hearing and speech impaired. For sign languages with a large user base, many projects already exist to alleviate these problems, but because the sign language user community in Sri Lanka is small, little effort has gone into projects and research aimed at solving this problem. The primary objective of this study was therefore to design and develop a desktop software application that captures real-time video of a person using Sinhalese fingerspelling sign language, processes and identifies the gestures using machine learning, and interprets the signed hand gestures in the video, outputting the result as text on a screen as words. To achieve this, a dataset of the Sinhala fingerspelling alphabet was created to serve as training images for the machine learning process; more than 27,000 images were collected to train 27 hand gestures. After careful consideration, Inception, a convolutional neural network, was selected and trained to interpret the images. The graphical user interface and the underlying code were written in Python, with TensorFlow as the framework handling the machine learning component. In addition, to achieve the final objective, various methods of image preprocessing, image extraction, skin detection, and background removal were also studied. The project intentionally avoided wearable technology and other third-party appliances to keep the cost to the user as low as possible. Once application development was complete, the system was evaluated with 6 individuals, each test subject performing 120 hand gesture character recognition tests, resulting in an overall accuracy of 95%.
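The abstract describes retraining an Inception convolutional neural network in TensorFlow to classify 27 fingerspelling gestures from a folder of training images. As a rough illustration of that kind of pipeline (a minimal sketch, not the author's actual code), the following Python snippet retrains an Inception-v3 backbone on a directory of gesture images; the directory layout, image size, batch size, optimizer, and epoch count are illustrative assumptions.

# Minimal sketch (illustrative only): transfer-learning an Inception-v3
# backbone for 27 Sinhala fingerspelling classes with TensorFlow/Keras.
# Paths and hyperparameters below are assumptions, not the thesis settings.
import tensorflow as tf

NUM_CLASSES = 27          # one class per fingerspelling gesture
IMG_SIZE = (299, 299)     # InceptionV3's default input resolution

# Hypothetical dataset layout: one sub-folder of images per gesture.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "dataset/train", image_size=IMG_SIZE, batch_size=32)

# Reuse ImageNet features; train only the new classification head.
base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet", pooling="avg")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

In the application described above, such a classifier would additionally be fed frames captured from live video after the preprocessing steps (skin detection and background removal) mentioned in the abstract; those steps are not shown in this sketch.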
URI: http://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4535
Appears in Collections:2020

Files in This Item:
File: 2017 MCS 058.pdf (3.72 MB, Adobe PDF)


Items in UCSC Digital Library are protected by copyright, with all rights reserved, unless otherwise indicated.