Please use this identifier to cite or link to this item: https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/3790
Title: Face Expression Analysis and Recognition System
Authors: RATNAYAKE, D.N.
Issue Date: 22-Sep-2016
Abstract: Automatic Facial Expression Recognition and Analysis, in particular FACS Action Unit (AU) detection and discrete emotion detection, has been an active topic in computer science for over two decades. Standardization and comparability have come some way; for instance, a number of commonly used facial expression databases now exist. However, the lack of a common evaluation protocol and of sufficient detail to reproduce reported results makes it difficult to compare systems with each other, which in turn hinders the progress of the field. A periodic challenge in Facial Expression Recognition and Analysis would allow such comparisons to be made fairly.

Facial expressions convey non-verbal cues, which play an important role in interpersonal relations. Automatic recognition of facial expressions can be an important component of natural human-machine interfaces; it may also be used in behavioral science and in clinical practice. Although humans recognize facial expressions virtually without effort or delay, reliable expression recognition by machine remains a challenge.

This work presents a facial expression recognition system employing a Bezier curve approximation technique. Facial features are extracted using knowledge of face geometry and approximated by 3rd-order Bezier curves, which represent the relationship between the motion of features and changes of expression. For face detection, color segmentation based on the novel idea of fuzzy classification, which handles ambiguity in colors, is employed. To evaluate the performance of the proposed algorithm, the success ratio was assessed against an emotionally expressive facial image database.
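To illustrate the 3rd-order (cubic) Bezier approximation the abstract describes, the sketch below evaluates a cubic Bezier curve from four control points. This is a generic Bezier evaluation, not the thesis's own implementation; the function name and the mouth-contour control points are illustrative assumptions.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a 3rd-order (cubic) Bezier curve at parameter t in [0, 1].

    Each point is an (x, y) tuple; p0 and p3 are the curve endpoints,
    while p1 and p2 are the interior control points that shape the curve.
    """
    u = 1.0 - t
    # Bernstein basis polynomials of degree 3
    b0 = u ** 3
    b1 = 3 * u ** 2 * t
    b2 = 3 * u * t ** 2
    b3 = t ** 3
    x = b0 * p0[0] + b1 * p1[0] + b2 * p2[0] + b3 * p3[0]
    y = b0 * p0[1] + b1 * p1[1] + b2 * p2[1] + b3 * p3[1]
    return (x, y)

# Example: approximate an upper-lip contour between two mouth corners.
# These control points are illustrative, not values from the thesis.
corner_left, corner_right = (0.0, 0.0), (4.0, 0.0)
ctrl_a, ctrl_b = (1.0, 1.5), (3.0, 1.5)
curve = [cubic_bezier(corner_left, ctrl_a, ctrl_b, corner_right, i / 10)
         for i in range(11)]
```

A feature contour fitted this way is fully described by its four control points, so changes of expression can be tracked as motion of a small, fixed set of points rather than of the whole contour.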
Experimental results show an average success rate of 78.8% in analyzing and recognizing facial expressions and emotions.

Recognizing human expressions is important because human beings express emotions in day-to-day interactions; understanding emotions and knowing how to react to people's expressions greatly enriches interaction. Application areas related to the face and its expressions include personal identification and access control, video phones and teleconferencing, forensic applications, human-computer interaction, automated surveillance, cosmetology, and so on.

Methods used to detect human faces in images are knowledge-based methods, feature-based methods, template-based methods, and appearance-based methods. These methods, all aiming to meet different requirements, consist of pre-processing and processing parts. The detection and extraction of face images from the input data, together with the normalization process that aligns the extracted images independently of varying environmental conditions such as illumination and orientation, form the pre-processing part. The processing part, on the other hand, aims to extract specific features from the pre-processed images and to recognize facial action units/facial expressions based on these features. Several methods pursue different optimized algorithms, mainly for the processing part.

The facial analysis and recognition system was developed using the C# and ASP.NET programming languages with a Microsoft Access database. C# is a mature, well-developed language with a syntax similar to Java but with even more functionality. Like Java, it is a true object-oriented language, but it is geared for enterprise-level development. The .NET framework is a mature and robust framework that is actively maintained by Microsoft, even though it is open source; this gives it a lot of flexibility and stability.
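The abstract's face-detection step uses color segmentation with fuzzy classification to handle ambiguity in skin color. The sketch below shows one common way such a fuzzy skin classifier can be built: trapezoidal membership functions over normalized r-g chromaticity, combined with a fuzzy AND. The membership bounds and the 0.5 cut are assumptions for illustration, not values taken from the thesis.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside (a, d), 1 on [b, c],
    linearly rising on (a, b) and falling on (c, d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def skin_membership(rgb):
    """Fuzzy degree (0..1) that an RGB pixel is skin-colored.

    Works in normalized r-g chromaticity space, which is less sensitive
    to illumination than raw RGB. The trapezoid bounds below are
    illustrative assumptions, not values from the thesis.
    """
    r, g, b = rgb
    total = r + g + b
    if total == 0:
        return 0.0
    rn, gn = r / total, g / total
    mu_r = trapezoid(rn, 0.35, 0.40, 0.50, 0.60)
    mu_g = trapezoid(gn, 0.25, 0.28, 0.34, 0.38)
    return min(mu_r, mu_g)   # fuzzy AND via the minimum operator

# A pixel is labeled skin if its membership exceeds a chosen cut, e.g. 0.5.
pixel = (180, 120, 90)   # an illustrative warm skin tone
is_skin = skin_membership(pixel) > 0.5
```

Unlike a hard color threshold, the fuzzy membership degrades gradually near the boundary of the skin-color region, which is what lets the classifier tolerate the color ambiguity the abstract mentions.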
URI: http://hdl.handle.net/123456789/3790
Appears in Collections: Master of Information Technology - 2016

Files in This Item:
File: 13550716_Thesis.pdf (Restricted Access)
Size: 2.6 MB
Format: Adobe PDF


Items in UCSC Digital Library are protected by copyright, with all rights reserved, unless otherwise indicated.