Description
|
The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables recorded the physiological signals: EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x); upper-body video was captured in parallel. After each film clip, participants completed two types of self-reports: one covering the nine discrete emotions and one covering three emotion dimensions: valence, arousal, and motivation. The data supports various ER approaches, e.g., multimodal ER, comparison of EEG- vs. cardiovascular-based ER, and transitions from discrete to dimensional emotion representations. Technical validation confirmed that watching the film clips elicited the targeted emotions.
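To illustrate how one trial of this dataset might be represented in an ER pipeline, the sketch below models a single film-clip viewing with its signals and both self-reports. This is a minimal illustration, not the dataset's actual schema: the `Trial` class, its field names, and the synthetic ratings are all hypothetical; only the emotion labels, dimension names, and signal modality names come from the description above.

```python
from dataclasses import dataclass
from typing import Dict, List

# Nine discrete emotions and three dimensions named in the dataset description.
DISCRETE_EMOTIONS = ["amusement", "awe", "enthusiasm", "liking", "surprise",
                     "anger", "disgust", "fear", "sadness"]
DIMENSIONS = ["valence", "arousal", "motivation"]

@dataclass
class Trial:
    """One film-clip trial: physiological signals plus both self-reports.
    Hypothetical structure for illustration; not the dataset's file format."""
    participant_id: int
    clip_emotion: str                   # emotion the clip was meant to elicit
    signals: Dict[str, List[float]]     # e.g. {"BVP": [...], "EDA": [...]}
    discrete_report: Dict[str, int]     # ratings for the 9 discrete emotions
    dimensional_report: Dict[str, int]  # valence / arousal / motivation ratings

def dominant_reported_emotion(trial: Trial) -> str:
    """Return the discrete emotion the participant rated highest."""
    return max(trial.discrete_report, key=trial.discrete_report.get)

# Synthetic example: a participant who watched an amusement-targeted clip
# and rated amusement highest in the discrete self-report.
trial = Trial(
    participant_id=1,
    clip_emotion="amusement",
    signals={"BVP": [0.1, 0.2, 0.15], "EDA": [2.3, 2.4, 2.5]},
    discrete_report={e: (5 if e == "amusement" else 1) for e in DISCRETE_EMOTIONS},
    dimensional_report={"valence": 5, "arousal": 4, "motivation": 3},
)
print(dominant_reported_emotion(trial))
```

A structure like this makes the validation step in the description concrete: comparing `dominant_reported_emotion` against `clip_emotion` across trials is one simple way to check that clips elicited their targeted emotions.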
|
Related Publication
| Saganowski, S., Komoszyńska, J., Behnke, M. et al. Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables. Sci Data 9, 158 (2022). https://doi.org/10.1038/s41597-022-01262-0 |
Notes
| The use of the Emognition dataset is limited to academic research purposes only. The data will be made available after completion of the End User License Agreement (EULA) available in the repository. The EULA should be signed and emailed to the Emognition Group at emotions@pwr.edu.pl. The email must be sent from an academic email address associated with the Harvard Dataverse platform account. |