Decoding facial expressions based on face‐selective and motion‐sensitive areas |
| |
Authors: | Yin Liang, Baolin Liu, Junhai Xu, Gaoyan Zhang, Xianglin Li, Peiyuan Wang, Bin Wang |
| |
Affiliation: | 1. School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, People's Republic of China; 2. State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, Beijing, People's Republic of China; 3. Medical Imaging Research Institute, Binzhou Medical University, Yantai, Shandong, People's Republic of China; 4. Department of Radiology, Yantai Affiliated Hospital of Binzhou Medical University, Yantai, Shandong, People's Republic of China |
| |
Abstract: | Humans can easily recognize others' facial expressions. Among the brain substrates that enable this ability, considerable attention has been paid to face‐selective areas; in contrast, whether motion‐sensitive areas, which clearly exhibit sensitivity to facial movements, are involved in facial expression recognition has remained unclear. The present functional magnetic resonance imaging (fMRI) study used multi‐voxel pattern analysis (MVPA) to explore facial expression decoding in both face‐selective and motion‐sensitive areas. In a block design experiment, participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise) in images, videos, and eyes‐obscured videos. The use of multiple stimulus types also allowed the impacts of facial motion and eye‐related information on facial expression decoding to be examined. It was found that motion‐sensitive areas showed significant responses to emotional expressions and that dynamic expressions could be successfully decoded in both face‐selective and motion‐sensitive areas. Compared with static stimuli, dynamic expressions elicited consistently higher neural responses and decoding performance in all regions. A significant decrease in both activation and decoding accuracy due to the absence of eye‐related information was also observed. Overall, the findings showed that emotional expressions are represented in motion‐sensitive areas in addition to conventional face‐selective areas, suggesting that motion‐sensitive regions may also effectively contribute to facial expression recognition. The results also suggested that facial motion and eye‐related information play important roles by carrying considerable expression information that can facilitate facial expression recognition. Hum Brain Mapp 38:3113–3125, 2017. © 2017 Wiley Periodicals, Inc. |
| |
Keywords: | facial expressions; fMRI; MVPA; face‐selective areas; motion‐sensitive areas; facial motion; eye‐related information |