An Affective EEG Analysis Method Without Feature Engineering


Dynamic graph classification
Self-attention mechanism
Dynamic self-attention network
SEED dataset



Submitted: 2023-12-19
Accepted: 2024-01-03
Published: 2024-01-18


Emotional electroencephalography (EEG) signals are a primary means of recording emotional brain activity. Currently, the most effective methods for analyzing emotional EEG signals combine feature engineering with neural networks. However, neural networks possess a strong capacity for automatic feature extraction. Is it possible to discard feature engineering and employ neural networks directly for end-to-end recognition? Based on the characteristics of EEG signals, this paper proposes an end-to-end feature extraction and classification method built on a dynamic self-attention network (DySAT). The study reveals significant differences in the brain activity patterns associated with different emotions across experimenters and time periods, and the experimental results offer insight into the reasons behind these differences.
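To make the abstract's two-stage idea concrete, the following is a minimal, illustrative numpy sketch of DySAT-style processing: structural self-attention over channels within each graph snapshot, followed by temporal self-attention across snapshots. The snapshot count, feature width, identity projections, and mean pooling are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention (single head; identity Q/K/V
    projections are an illustrative simplification)."""
    d = x.shape[-1]
    scores = x @ x.swapaxes(-1, -2) / np.sqrt(d)       # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ x                                 # attention-weighted mix

# Toy EEG-like input: 5 time snapshots, 62 channels (as in SEED), 8 features.
rng = np.random.default_rng(0)
snapshots = rng.normal(size=(5, 62, 8))

# Stage 1 (structural): channels attend to each other within each snapshot.
structural = self_attention(snapshots)                 # shape (5, 62, 8)

# Stage 2 (temporal): each channel attends across the snapshot sequence.
temporal = self_attention(structural.transpose(1, 0, 2))  # shape (62, 5, 8)

# Pool to a single embedding that a classifier head could consume.
embedding = temporal.mean(axis=(0, 1))                 # shape (8,)
```

In the full DySAT model, learned query/key/value projections, multiple heads, and positional information replace the identity mappings used here; the sketch only shows how the structural and temporal attention stages compose end to end without hand-crafted EEG features.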


Zheng W-L, Lu B-L, 2015, Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks. IEEE Transactions on Autonomous Mental Development, 7(3): 162–175.

Duan R-N, Zhu J-Y, Lu B-L, 2013, 6th International IEEE/EMBS Conference on Neural Engineering (NER), November 6–8, 2013: Differential Entropy Feature for EEG-Based Emotion Classification. IEEE, San Diego, 81–84.

Song T, Zheng W, Song P, et al., 2020, EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks. IEEE Transactions on Affective Computing, 11(3): 532–541.

Liu J, Zhao Y, Wu H, et al., 2021, 13th Asia Pacific Signal and Information Processing Association (APSIPA) Annual Summit and Conference, December 14–17, 2021: Positional-Spectral-Temporal Attention in 3D Convolutional Neural Networks for EEG Emotion Recognition. APSIPA, Tokyo, 305–312.

Kim J, André E, 2008, Emotion Recognition Based on Physiological Changes in Music Listening. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(12): 2067–2083.

Dzieżyc M, Gjoreski M, Kazienko P, et al., 2020, Can We Ditch Feature Engineering? End-to-End Deep Learning for Affect Recognition from Physiological Sensor Data. Sensors (Basel), 20(22): 6535.

Sankar A, Wu Y, Gou L, et al., 2020, The 13th ACM International Conference on Web Search and Data Mining, February 3–7, 2020: DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks. Association for Computing Machinery, Houston, 519–527.

Maas AL, Hannun AY, Ng AY, 2013, Proceedings of the 30th International Conference on Machine Learning, June 17–19, 2013: Rectifier Nonlinearities Improve Neural Network Acoustic Models. JMLR, Atlanta, vol. 28, 3.

He K, Zhang X, Ren S, et al., 2016, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 27–30, 2016: Deep Residual Learning for Image Recognition. IEEE, Las Vegas, 770–778.

Veličković P, Cucurull G, Casanova A, et al., 2018, 6th International Conference on Learning Representations, April 30–May 3, 2018: Graph Attention Networks. ICLR, Vancouver.

Vaswani A, Shazeer N, Parmar N, et al., 2017, 31st Conference on Neural Information Processing Systems (NIPS 2017), December 4–9, 2017: Attention Is All You Need. Neural Information Processing Systems Foundation, Long Beach.

Shen T, Jiang J, Zhou T, et al., 2018, DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1): 5446–5455.

Tan Z, Wang M, Xie J, et al., 2018, Deep Semantic Role Labeling with Self-Attention. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1): 4929–4936.

Hornik K, Stinchcombe M, White H, 1989, Multilayer Feedforward Networks Are Universal Approximators. Neural Networks, 2(5): 359–366.