Handle:
http://hdl.handle.net/10547/250949
Title:
Categorizing facial expressions: a comparison of computational models
Authors:
Shenoy, Aruna; Anthony, Sue; Frank, Ray; Davey, Neil
Abstract:
Recognizing expressions is a key part of human social interaction, and processing of facial expression information is largely automatic for humans, but it is a non-trivial task for a computational system. The purpose of this work is to develop computational models capable of differentiating between a range of human facial expressions. Raw face images are examples of high-dimensional data, so here we use two dimensionality reduction techniques: principal component analysis and curvilinear component analysis. We also preprocess the images with a bank of Gabor filters, so that important features in the face images may be identified. Subsequently, the faces are classified using a support vector machine. We show that it is possible to differentiate faces with a prototypical expression from the neutral expression. Moreover, we can achieve this with data that has been massively reduced in size: in the best case the original images are reduced to just 5 components. We also investigate effect size on face images, a concept that has not previously been reported for faces. This enables us to identify those areas of the face that are involved in the production of a facial expression.
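The pipeline described in the abstract (Gabor filtering, dimensionality reduction to a handful of components, SVM classification) can be sketched roughly as follows. This is an illustrative sketch only, not the authors' implementation: the libraries (scikit-image, scikit-learn), the filter frequencies, the reduction to 5 components and the placeholder data are assumptions, and curvilinear component analysis is omitted because it has no standard library implementation here.

# Minimal sketch (not the authors' code) of the pipeline in the abstract:
# Gabor filtering, PCA dimensionality reduction, SVM classification.
# Library choices, filter frequencies and placeholder data are assumptions.
import numpy as np
from skimage.filters import gabor
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def gabor_features(images, frequencies=(0.1, 0.2, 0.3)):
    """Filter each image with a small Gabor bank and flatten the real responses."""
    feats = []
    for img in images:
        responses = [gabor(img, frequency=f)[0] for f in frequencies]
        feats.append(np.concatenate([r.ravel() for r in responses]))
    return np.array(feats)

# Placeholder data standing in for grayscale face images labelled
# "prototypical expression" (1) vs. "neutral" (0).
rng = np.random.default_rng(0)
images = rng.random((20, 32, 32))
labels = np.array([0] * 10 + [1] * 10)

X = gabor_features(images)                        # Gabor-filtered feature vectors
clf = make_pipeline(PCA(n_components=5),          # reduce to just 5 components
                    SVC(kernel="linear"))         # linear support vector machine
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))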
Citation:
Categorizing facial expressions: a comparison of computational models. Neural Computing and Applications, 2011, 20(6): 815-823
Publisher:
Springer Link
Journal:
Neural Computing and Applications
Issue Date:
Sep-2011
URI:
http://hdl.handle.net/10547/250949
DOI:
10.1007/s00521-010-0446-9
Additional Links:
http://www.springerlink.com/index/10.1007/s00521-010-0446-9
Type:
Article
Language:
en
ISSN:
0941-0643; 1433-3058
Appears in Collections:
Centre for Research in Distributed Technologies (CREDIT)

Full metadata record

DC Field | Value | Language
dc.contributor.author | Shenoy, Aruna | en_GB
dc.contributor.author | Anthony, Sue | en_GB
dc.contributor.author | Frank, Ray | en_GB
dc.contributor.author | Davey, Neil | en_GB
dc.date.accessioned | 2012-11-05T11:05:27Z | -
dc.date.available | 2012-11-05T11:05:27Z | -
dc.date.issued | 2011-09 | -
dc.identifier.citation | Categorizing facial expressions: a comparison of computational models 2011, 20 (6):815-823 Neural Computing and Applications | en_GB
dc.identifier.issn | 0941-0643 | -
dc.identifier.issn | 1433-3058 | -
dc.identifier.doi | 10.1007/s00521-010-0446-9 | -
dc.identifier.uri | http://hdl.handle.net/10547/250949 | -
dc.description.abstract | Recognizing expressions is a key part of human social interaction, and processing of facial expression information is largely automatic for humans, but it is a non-trivial task for a computational system. The purpose of this work is to develop computational models capable of differentiating between a range of human facial expressions. Raw face images are examples of high-dimensional data, so here we use two dimensionality reduction techniques: principal component analysis and curvilinear component analysis. We also preprocess the images with a bank of Gabor filters, so that important features in the face images may be identified. Subsequently, the faces are classified using a support vector machine. We show that it is possible to differentiate faces with a prototypical expression from the neutral expression. Moreover, we can achieve this with data that has been massively reduced in size: in the best case the original images are reduced to just 5 components. We also investigate the effect size on face images, a concept which has not been reported previously on faces. This enables us to identify those areas of the face that are involved in the production of a facial expression. | en_GB
dc.language.iso | en | en
dc.publisher | Springer Link | en_GB
dc.relation.url | http://www.springerlink.com/index/10.1007/s00521-010-0446-9 | en_GB
dc.rights | Archived with thanks to Neural Computing and Applications | en_GB
dc.title | Categorizing facial expressions: a comparison of computational models | en
dc.type | Article | en
dc.identifier.journal | Neural Computing and Applications | en_GB