Speech, Head, and Eye-based Cues for Continuous Affect Prediction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Continuous affect prediction involves regressing time-continuous affect dimensions at discrete time steps. Researchers in this domain are currently embracing multimodal model input. This provides motivation for researchers to investigate previously unexplored affective cues. Speech-based cues have traditionally received the most attention for affect prediction; however, non-verbal inputs have significant potential to increase the performance of affective computing systems and to enable affect modelling in the absence of speech. Non-verbal inputs that have received little attention for continuous affect prediction include head and eye-based cues. Both head and eye-based cues are involved in the display and perception of emotion. Additionally, these cues can be estimated non-intrusively from video using computer vision tools. This work addresses this gap by comprehensively investigating head and eye-based features, and their combination with speech, for continuous affect prediction. Hand-crafted, automatically generated, and convolutional neural network (CNN)-learned features from these modalities will be investigated for continuous affect prediction. The highest-performing feature set combinations will show how effective these features are for predicting an individual's affective state.
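To make the feature-combination idea concrete, the following is a minimal, hypothetical sketch, not the paper's actual pipeline: per-frame speech, head, and eye features are assumed to be precomputed (for example by openSMILE and a video-based tracker such as OpenFace), concatenated at the feature level, and regressed against a time-continuous arousal annotation, with the concordance correlation coefficient (CCC) commonly used in this field as the metric. Feature dimensions, the regressor, and all variable names are illustrative assumptions.

    # Hypothetical sketch of feature-level fusion for continuous affect regression.
    # Feature sources, dimensions, and the regressor are illustrative assumptions.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVR

    def ccc(y_true, y_pred):
        # Concordance correlation coefficient, a standard continuous-affect metric.
        mean_t, mean_p = y_true.mean(), y_pred.mean()
        cov = np.mean((y_true - mean_t) * (y_pred - mean_p))
        return 2 * cov / (y_true.var() + y_pred.var() + (mean_t - mean_p) ** 2)

    rng = np.random.default_rng(0)
    n_frames = 1000

    # Placeholder per-frame features; in practice these would come from tools such as
    # openSMILE (speech functionals) and OpenFace (head pose, eye gaze) run over the recordings.
    speech = rng.normal(size=(n_frames, 88))   # e.g. eGeMAPS-style acoustic features
    head = rng.normal(size=(n_frames, 6))      # e.g. head rotation and translation
    eyes = rng.normal(size=(n_frames, 8))      # e.g. gaze angles, blink-related features
    arousal = rng.normal(size=n_frames)        # dummy stand-in for gold-standard labels

    # Early (feature-level) fusion: concatenate the modalities frame by frame.
    X = np.hstack([speech, head, eyes])

    split = int(0.8 * n_frames)
    model = make_pipeline(StandardScaler(), LinearSVR(C=0.1, max_iter=10000))
    model.fit(X[:split], arousal[:split])
    pred = model.predict(X[split:])
    print(f"Held-out CCC: {ccc(arousal[split:], pred):.3f}")

With real data, each row would correspond to an annotated frame of an audiovisual recording, and separate models would typically be trained per affect dimension (e.g. arousal and valence).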

Original language: English
Title of host publication: 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 16-20
Number of pages: 5
ISBN (Electronic): 9781728138916
DOIs
Publication status: Published - Sep 2019
Event: 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2019 - Cambridge, United Kingdom
Duration: 3 Sep 2019 – 6 Sep 2019

Publication series

Name: 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2019

Conference

Conference: 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2019
Country/Territory: United Kingdom
City: Cambridge
Period: 3/09/19 – 6/09/19

Keywords

  • affective computing
  • eyes
  • feature engineering
  • head pose
  • speech
