Chinese Journal of Intelligent Science and Technology ›› 2021, Vol. 3 ›› Issue (1): 76-84. doi: 10.11959/j.issn.2096-6652.202108

• Special topic: emotional brain-computer interface •

Multi-modal physiological signal emotion recognition based on 3D hierarchical convolution fusion

Wenfen LING1,2, Sihan CHEN1,2, Yong PENG1,2, Wanzeng KONG1,2   

  1 College of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
    2 Key Laboratory of Brain Machine Collaborative Intelligence of Zhejiang Province, Hangzhou 310018, China
  • Revised: 2021-02-05 Online: 2021-03-15 Published: 2021-03-01
  • Supported by:
    The National Key Research and Development Program of China (2017YFE0116800); The National Natural Science Foundation of China (U1909202); Science and Technology Program of Zhejiang Province (2018C04012); Key Laboratory of Brain Machine Collaborative Intelligence of Zhejiang Province (20200E10010)

Abstract:

In recent years, physiological signals such as electroencephalography (EEG) have gradually become popular objects of emotion recognition research because they can objectively reflect true emotions. However, single-modal EEG signals represent emotional information incompletely, and multi-modal physiological signals suffer from insufficient interaction of emotional information. Therefore, a 3D hierarchical convolutional fusion model was proposed, which aimed to fully explore multi-modal interaction relationships and describe emotional information more accurately. The method first extracted the primary emotional representations of EEG, electro-oculogram (EOG) and electromyography (EMG) signals with a depthwise separable convolutional network, and then performed a 3D convolution fusion operation on the obtained multi-modal primary representations to realize both the local interactions between pairwise modalities and the global interactions among all modalities, so as to obtain multi-modal fusion representations containing the emotional characteristics of different physiological signals. Experimental results show that the proposed model achieves 98% accuracy on the DEAP dataset for both the two-class valence and arousal tasks and the four-class task.
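To illustrate the overall pipeline described in the abstract, the following is a minimal PyTorch sketch: per-modality depthwise separable convolutions extract primary representations, which are then stacked along a modality axis and fused with a 3D convolution. Channel counts, kernel sizes, input shapes and the classifier head are illustrative assumptions, not taken from the paper, and the paper's pairwise/global interaction hierarchy is only approximated here by a single 3D convolution.

```python
# Hedged sketch of the stack-and-fuse idea; hyperparameters are assumptions.
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise separable 2D convolution: per-channel spatial filtering
    followed by a 1x1 pointwise projection."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

class HierarchicalFusionSketch(nn.Module):
    """One branch per modality (EEG, EOG, EMG), then a 3D convolution over the
    stacked modality feature maps to model cross-modal interactions."""
    def __init__(self, in_channels=(32, 2, 2), feat_ch=16, num_classes=2):
        super().__init__()
        # Per-modality channel counts are assumptions, not the paper's setup.
        self.branches = nn.ModuleList(
            DepthwiseSeparableConv(c, feat_ch) for c in in_channels)
        # 3D convolution fuses along the new "modality" depth dimension.
        self.fusion = nn.Conv3d(feat_ch, feat_ch, kernel_size=3, padding=1)
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool3d(1), nn.Flatten(), nn.Linear(feat_ch, num_classes))

    def forward(self, eeg, eog, emg):
        # Each modality -> (batch, feat_ch, H, W) primary representation.
        feats = [branch(x) for branch, x in zip(self.branches, (eeg, eog, emg))]
        # Stack along a depth axis: (batch, feat_ch, modalities, H, W).
        volume = torch.stack(feats, dim=2)
        fused = self.fusion(volume)
        return self.classifier(fused)

# Example with made-up input shapes (batch=4, 32x32 time-frequency maps):
model = HierarchicalFusionSketch()
eeg = torch.randn(4, 32, 32, 32)   # 32 EEG channels
eog = torch.randn(4, 2, 32, 32)    # 2 EOG channels
emg = torch.randn(4, 2, 32, 32)    # 2 EMG channels
print(model(eeg, eog, emg).shape)  # torch.Size([4, 2])
```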

Key words: physiological signal, emotion recognition, 3D hierarchical convolution, multi-modal interaction

