MASR Dataset

This dataset contains 750 videos from 15 subjects. We designed a list of body actions and facial expressions commonly encountered during typical game play, chosen to elicit basic emotions according to Ekman's theory. Based on this list, a bi-modal database was created using Microsoft's Kinect for Xbox One sensor (v2.0), containing feature vectors extracted from the users' facial expressions and body gestures. Each emotion was represented by two different types of motion, and each recording had a duration of 3 seconds. Subjects were shown a short video of the aforementioned movements and were then asked to perform each movement in their personal style, 5 times, in front of the Kinect sensor. The emotion label refers to the requested expression and may not correspond to what the subject actually performed.
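The stated counts imply the recording layout: with 15 subjects, 2 motion types per emotion, and 5 repetitions each, 750 recordings corresponds to 5 emotion classes (750 = 15 × 5 × 2 × 5). A minimal sketch of an index over the recordings, assuming this implied structure (the actual emotion set and file naming are not specified in this section):

```python
from itertools import product

# Counts taken from the dataset description; NUM_EMOTIONS = 5 is derived
# from 750 = 15 subjects x 2 motion types x 5 repetitions x NUM_EMOTIONS.
NUM_SUBJECTS = 15
NUM_EMOTIONS = 5
MOTIONS_PER_EMOTION = 2
REPETITIONS = 5


def recording_index():
    """Enumerate one (subject, emotion, motion, repetition) tuple per recording."""
    return list(
        product(
            range(NUM_SUBJECTS),
            range(NUM_EMOTIONS),
            range(MOTIONS_PER_EMOTION),
            range(REPETITIONS),
        )
    )


print(len(recording_index()))  # 750
```

Such an index can be used to pair each body-feature recording with its face-feature counterpart, since both modalities were captured simultaneously per trial.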

Body dataset: masr_body

Face dataset: masr_face