Computational modelling to track human emotion trajectories through time : a thesis presented to Massey University in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Computer Science at School of Engineering and Advanced Technology, Massey University, Palmerston North, New Zealand
Date
2013
Publisher
Massey University
Rights
The Author
Abstract
There has been a great deal of research in the field of affective computing over the past
three decades. In the context of this thesis, affective computing is computing
that relates to emotion recognition, representation, and analysis. Much of the past
work has focused on the basic emotions. However, most human emotions are not pure
examples of one basic emotion, but a mixture of them, known as complex emotions.
Emotions are dynamic: they change continuously over time. This thesis focuses on
computational modelling to recognise, represent, and analyse continuous spontaneous
emotions through time.
Emotions are internal, and hence impossible to see directly. However, there are
some external presentations of emotions enabling computational tools to be used to
identify them. This thesis focuses on the use of facial points as a measure of underlying
emotions. The main focus is the development of computational models to track the
patterns of facial changes in order to analyse the paths followed by emotions over
time.
While there has been much work on shape models that classify facial expressions
into discrete basic emotion categories, these models are generally based on analysis of the
full face. However, research shows that some expressions are better recognised
by muscle activity in the upper half of the face, while others rely primarily on muscles
in the lower half of the face. This thesis introduces a joint face model, built from
separate shape models of the full, upper, and lower parts of the face, that significantly
improves recognition accuracy.
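The idea of fusing the outputs of full-, upper-, and lower-face shape models can be illustrated with a small sketch. The scores, emotion labels, and equal region weighting below are illustrative assumptions, not the thesis's actual fusion rule:

```python
# Hypothetical match scores (0-1) for each basic emotion, as produced by
# three shape models fitted to the full, upper, and lower face separately.
scores = {
    "full":  {"happy": 0.7, "sad": 0.2, "surprise": 0.4},
    "upper": {"happy": 0.5, "sad": 0.3, "surprise": 0.8},
    "lower": {"happy": 0.9, "sad": 0.1, "surprise": 0.3},
}

def joint_score(scores, weights=None):
    """Combine per-region scores into one score per emotion.

    Uses a weighted average over regions; equal weights by default.
    The weighting scheme is an assumption for illustration.
    """
    regions = list(scores)
    weights = weights or {r: 1.0 / len(regions) for r in regions}
    emotions = scores[regions[0]]
    return {e: sum(weights[r] * scores[r][e] for r in regions)
            for e in emotions}

combined = joint_score(scores)
best = max(combined, key=combined.get)
print(best)  # prints "happy" with these illustrative scores
```

A weighted average is the simplest fusion choice; region weights could instead be learned per emotion, since some emotions are better cued by the upper face and others by the lower face.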
The set of shape models gives a degree of match to each basic emotion. Using
this information, this thesis addresses the problem of complex emotion recognition
by developing a mixture model that combines each basic emotion in an appropriate
amount. The proposed model represents emotions in the activation-evaluation space,
which is the most widely-used representation of emotions in psychological studies. It
represents emotions on the basis of their polarity and similarity to each other. This
thesis uses a mixture of von Mises distributions for emotion recognition; the von
Mises distribution is the circular analogue of the normal distribution and is the most
common model for describing directional data. The results show that the proposed
mixture model fits the data well.
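As an illustration of this modelling idea, the sketch below evaluates a mixture of von Mises densities over angles in the activation-evaluation plane, using SciPy's `scipy.stats.vonmises`. The component weights, mean angles, and concentrations are invented for illustration and are not parameters from the thesis:

```python
import numpy as np
from scipy.stats import vonmises

# Hypothetical components: each basic emotion contributes one von Mises
# distribution with a mean angle mu (radians) in the activation-evaluation
# plane and a concentration kappa (larger kappa = more peaked).
components = [
    {"weight": 0.5, "mu": 0.8,  "kappa": 4.0},  # e.g. "happy"
    {"weight": 0.3, "mu": 2.9,  "kappa": 3.0},  # e.g. "sad"
    {"weight": 0.2, "mu": -1.5, "kappa": 2.5},  # e.g. "fear"
]

def mixture_pdf(theta, components):
    """Density of the von Mises mixture at angle theta (radians)."""
    return sum(c["weight"] * vonmises.pdf(theta, c["kappa"], loc=c["mu"])
               for c in components)

# Sanity check: a valid mixture density integrates to 1 over one full circle.
thetas = np.linspace(-np.pi, np.pi, 4001)
dtheta = thetas[1] - thetas[0]
area = float(np.sum(mixture_pdf(thetas, components)) * dtheta)
print(round(area, 3))  # close to 1.0
```

Because the von Mises density is periodic, the mixture is a proper distribution on the circle, which suits angular coordinates such as the direction of an emotion in activation-evaluation space.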
Emotions vary continuously in intensity, duration, persistence over time, and
other attributes. In addition, their appearance on the face varies, and
transitions in facial expressions are driven by both the change in emotion and physiological
constraints. This thesis examines the trajectories between emotions in activation-evaluation
space and shows that these trajectories are smooth and follow 'common'
paths between different emotions. To date, few efforts have been made to analyse
continuous emotion dynamics. The findings presented in this thesis
can be used and extended in several directions to improve emotion recognition as
well as emotion synthesis.
Keywords
Emotion recognition, Affective computing, Human-computer interaction, Face recognition, Facial expression analysis