Kavli Affiliate: Xiang Zhang
| First 5 Authors: Xiaotian Li, Xiang Zhang, Huiyuan Yang, Wenna Duan, Weiying Dai
| Summary:
Emotion is an experience associated with a particular pattern of
physiological activity, accompanied by physiological, behavioral, and
cognitive changes. One behavioral change is facial expression, which has been
studied extensively over the past few decades. Facial behavior varies with a
person’s emotion and also differs with culture, personality, age, context,
and environment. In recent years, physiological activities have
been used to study emotional responses. A typical signal is the
electroencephalogram (EEG), which measures brain activity. Most existing
EEG-based emotion analysis has overlooked the role of facial expression
changes. There exists little research on the relationship between facial
behavior and brain signals, due to the lack of datasets that measure both EEG
and facial action signals simultaneously. To address this problem, we propose to
develop a new database by collecting facial expressions, action units, and EEGs
simultaneously. We recorded the EEGs and face videos of both posed facial
actions and spontaneous expressions from 29 participants with different ages,
genders, and ethnic backgrounds. Differing from existing approaches, we designed a
protocol to capture the EEG signals by evoking participants’ individual action
units explicitly. We also investigated the relation between the EEG signals and
facial action units. As a baseline, the database has been evaluated through
experiments on both posed and spontaneous emotion recognition using images
alone, EEG alone, and EEG fused with images. The database will be
released to the research community to advance the state of the art for
automatic emotion recognition.
| Search Query: ArXiv Query: search_query=au:”Xiang Zhang”&id_list=&start=0&max_results=10
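The summary does not describe how the EEG and image modalities were combined in the fusion baseline. As an illustration only, the sketch below shows one common way such a baseline could be set up: a simple late (score-level) fusion of an image-based classifier and an EEG-based classifier. The feature dimensions, model choices, and fusion weight are assumptions for this sketch, not details taken from the paper.

```python
# Hypothetical sketch of a score-level fusion baseline for emotion recognition.
# The actual fusion strategy is not described in the summary above; feature
# dimensions, model choices, and the fusion weight here are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Placeholder per-trial features: image-derived features (e.g., action-unit
# activations) and EEG-derived features (e.g., band-power values), with one
# emotion label per trial.
n_trials, n_img_feats, n_eeg_feats, n_emotions = 200, 17, 62, 6
X_img = rng.normal(size=(n_trials, n_img_feats))
X_eeg = rng.normal(size=(n_trials, n_eeg_feats))
y = rng.integers(0, n_emotions, size=n_trials)

# Train one classifier per modality.
clf_img = LogisticRegression(max_iter=1000).fit(X_img, y)
clf_eeg = LogisticRegression(max_iter=1000).fit(X_eeg, y)

# Late fusion: weighted average of the two classifiers' class probabilities.
alpha = 0.5  # assumed equal weighting of the two modalities
p_fused = (alpha * clf_img.predict_proba(X_img)
           + (1 - alpha) * clf_eeg.predict_proba(X_eeg))
y_pred = p_fused.argmax(axis=1)
print("fused training accuracy:", (y_pred == y).mean())
```

In practice, the single-modality accuracies (images alone, EEG alone) and the fused accuracy would be compared on held-out data to quantify the benefit of combining facial and EEG signals.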