Identifier

etd-11012016-102229

Degree

Doctor of Philosophy (PhD)

Department

Engineering Science (Interdepartmental Program)

Document Type

Access to Dissertation Restricted to LSU Campus

Abstract

Affect (emotion) recognition has many applications, including human-assistive robotics, human-computer interaction, empathic agents, virtual tutoring, marketing, surveillance, and counseling. Previous research has focused primarily on unimodal or bimodal affect recognition (facial expressions and speech). This research developed multimodal emotion recognition using data from facial expressions, head position, hand movement, body posture, and speech. A novel hybrid event-driven fusion technique combined data from multiple input channels at both the feature level and the decision level. Position and temporal data from tracked feature points were used to train a support vector machine (SVM) based classifier. New rule-based features were created in addition to existing geometric, kinetic, and 3D features. An emotional keyword look-up using speech recognition technology was incorporated into the recognition process. The research produced a real-time affect estimation system that accurately predicts multiple emotions and their intensities, and maintains a context history of recognized emotions.
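The feature-level and decision-level fusion described in the abstract can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the synthetic data, feature dimensions, and the scikit-learn `SVC` classifier are all assumptions introduced here for clarity.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic per-modality feature vectors (stand-ins for tracked
# facial-point and speech features; dimensions are arbitrary).
n = 200
face = rng.normal(size=(n, 6))
speech = rng.normal(size=(n, 4))
labels = (face[:, 0] + speech[:, 0] > 0).astype(int)  # two emotion classes

# Feature-level fusion: concatenate modality features, train one SVM.
fused = np.hstack([face, speech])
early = SVC(probability=True).fit(fused, labels)

# Decision-level fusion: train one SVM per modality,
# then average the per-class probability estimates.
clf_face = SVC(probability=True).fit(face, labels)
clf_speech = SVC(probability=True).fit(speech, labels)
avg_prob = (clf_face.predict_proba(face) + clf_speech.predict_proba(speech)) / 2
late_pred = avg_prob.argmax(axis=1)

print(early.score(fused, labels))    # training accuracy, feature-level fusion
print((late_pred == labels).mean())  # training accuracy, decision-level fusion
```

A hybrid scheme, as the abstract describes, would combine both paths, e.g. by feeding feature-level predictions into the decision-level vote.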

Date

2016

Document Availability at the Time of Submission

Student has submitted appropriate documentation to restrict access to LSU for 365 days after which the document will be released for worldwide access.

Committee Chair

Knapp, Gerald

DOI

10.31390/gradschool_dissertations.4243
