Abstract
We present a method that automatically evaluates emotional response from spontaneous facial activity. The automatic evaluation of emotional response, or affect, is a fascinating challenge with many applications. Our approach is based on the inferred activity of facial muscles over time, automatically obtained from an RGB-D video recording of spontaneous facial activity. Our contribution is two-fold. First, we constructed a database of publicly available short video clips that elicit a strong emotional response in a consistent manner across different individuals. Each clip was tagged with its characteristic emotional response along four scales: Valence, Arousal, Likability, and Rewatch (the desire to watch again). Second, we developed a two-step, learning-based prediction method, which was trained and tested on this database of tagged video clips. Our method successfully predicted the four-dimensional representation of affect, achieving high correlations (0.87-0.95) between the predicted scores and the affect tags. As part of the prediction algorithm, we identified the period of strongest emotional response in each viewing recording using a method that was blind to the video clip being watched, and found high agreement between independent viewers. Finally, inspecting the relative contribution of different feature types to the prediction revealed that temporal features contributed more to the prediction of individual affect than to the prediction of media tags.
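The abstract outlines, but does not specify, the two-step pipeline. Below is a minimal illustrative sketch, not the authors' implementation: it assumes per-frame facial muscle (action unit) intensities have already been extracted from the RGB-D recording into an array `au_series`, locates the window of strongest facial activity (step one), and regresses the four affect scores from simple temporal statistics of that window (step two). All function names, the ridge regressor, and the 150-frame window length are assumptions made for illustration only.

```python
# A minimal sketch (not the authors' exact pipeline) of the two-step idea:
# (1) locate the window of strongest facial activity, (2) regress affect
# scores from temporal statistics of that window. Assumes `au_series` is a
# (frames x action_units) array of muscle-activation intensities already
# extracted from an RGB-D recording; all names here are hypothetical.
import numpy as np
from sklearn.linear_model import Ridge
from scipy.stats import pearsonr

def strongest_window(au_series, win=150):
    """Step 1: pick the window with the highest mean activation energy.

    This is blind to the clip being watched: it looks only at the viewer's
    facial activity. The window length `win` is an arbitrary choice here.
    """
    energy = np.linalg.norm(au_series, axis=1)            # per-frame activity
    smoothed = np.convolve(energy, np.ones(win) / win, mode="valid")
    start = int(np.argmax(smoothed))
    return au_series[start:start + win]

def temporal_features(window):
    """Summarize each AU channel with simple temporal statistics."""
    return np.concatenate([
        window.mean(axis=0),                              # average intensity
        window.std(axis=0),                               # variability over time
        np.abs(np.diff(window, axis=0)).mean(axis=0),     # mean rate of change
    ])

def train_and_evaluate(recordings, tags):
    """Step 2: one regressor per affect dimension, trained on tagged clips.

    `tags` is a (clips x 4) array of Valence/Arousal/Likability/Rewatch
    scores. Ridge regression stands in for the paper's unspecified learner.
    """
    X = np.stack([temporal_features(strongest_window(r)) for r in recordings])
    for d, name in enumerate(["valence", "arousal", "likability", "rewatch"]):
        model = Ridge(alpha=1.0).fit(X, tags[:, d])
        r, _ = pearsonr(model.predict(X), tags[:, d])
        print(f"{name}: training-set correlation r = {r:.2f}")
```

Note that this sketch reports correlation on its own training data for brevity; the 0.87-0.95 correlations quoted in the abstract come from the paper's own evaluation protocol.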
Original language | English |
---|---|
Title of host publication | Proceedings - 12th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2017 - 1st International Workshop on Adaptive Shot Learning for Gesture Understanding and Production, ASL4GUP 2017, Biometrics in the Wild, Bwild 2017, Heterogeneous Face Recognition, HFR 2017, Joint Challenge on Dominant and Complementary Emotion Recognition Using Micro Emotion Features and Head-Pose Estimation, DCER and HPE 2017 and 3rd Facial Expression Recognition and Analysis Challenge, FERA 2017 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 727-734 |
Number of pages | 8 |
ISBN (Electronic) | 9781509040230 |
DOIs | |
State | Published - 28 Jun 2017 |
Event | 12th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2017 - Washington, United States. Duration: 30 May 2017 → 3 Jun 2017 |
Publication series
Name | Proceedings - 12th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2017 - 1st International Workshop on Adaptive Shot Learning for Gesture Understanding and Production, ASL4GUP 2017, Biometrics in the Wild, Bwild 2017, Heterogeneous Face Recognition, HFR 2017, Joint Challenge on Dominant and Complementary Emotion Recognition Using Micro Emotion Features and Head-Pose Estimation, DCER and HPE 2017 and 3rd Facial Expression Recognition and Analysis Challenge, FERA 2017 |
---|---|
Conference
Conference | 12th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2017 |
---|---|
Country/Territory | United States |
City | Washington |
Period | 30/05/17 → 3/06/17 |
Bibliographical note
Publisher Copyright: © 2017 IEEE.