Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/27970
Title: MGEED: A Multimodal Genuine Emotion and Expression Detection Database
Authors: Wang, Y; Yu, H; Gao, W; Xia, Y; Nduka, C
Keywords: emotion recognition; facial expression analysis; multi-modal emotion database; affective sensing and analysis
Issue Date: 15-Jun-2023
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Wang, Y. et al. (2023) 'MGEED: A Multimodal Genuine Emotion and Expression Detection Database', IEEE Transactions on Affective Computing, 0 (early access), pp. 1 - 13. doi: 10.1109/TAFFC.2023.3286351.
Abstract: Multimodal emotion recognition has attracted increasing interest from academia and industry in recent years, since it enables emotion detection using various modalities, such as facial expression images, speech and physiological signals. Although research in this field has grown rapidly, it remains challenging to create a multimodal database containing facial electrical information, due to the difficulty of capturing natural and subtle facial expression signals, such as optomyography (OMG) signals. To this end, we present the newly developed Multimodal Genuine Emotion and Expression Detection (MGEED) database, which is the first publicly available database containing facial OMG signals. MGEED comprises 17 subjects, with over 150K facial images, 140K depth maps and several modalities of physiological signals, including OMG, electroencephalography (EEG) and electrocardiography (ECG) signals. The emotions of the participants are evoked by video stimuli and the data are collected by a multimodal sensing system. With the collected data, an emotion recognition method is developed based on multimodal signal synchronisation, feature extraction, fusion and emotion prediction. The results show that superior performance can be achieved by fusing the visual, EEG and OMG features. The database can be obtained from https://github.com/YMPort/MGEED.
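The abstract's pipeline (signal synchronisation, feature extraction, feature fusion, emotion prediction) can be illustrated with a minimal sketch. This is not the authors' implementation: the resampling, summary-statistic features, concatenation fusion and nearest-centroid classifier below are all simplified stand-ins chosen for brevity, and the modality names and emotion labels are hypothetical.

```python
# Illustrative sketch only -- NOT the MGEED authors' method. Shows the
# abstract's four stages on toy 1-D streams for three modalities.
from statistics import mean, stdev


def synchronise(signal, target_len):
    """Stage 1: resample a stream to a common length (nearest-index lookup)."""
    step = len(signal) / target_len
    return [signal[int(i * step)] for i in range(target_len)]


def extract_features(signal):
    """Stage 2: simple per-stream summary statistics as a feature vector."""
    return [mean(signal), stdev(signal), max(signal) - min(signal)]


def fuse(*feature_vectors):
    """Stage 3: feature-level fusion by concatenation."""
    return [v for vec in feature_vectors for v in vec]


def predict(fused, centroids):
    """Stage 4: nearest-centroid prediction over hypothetical emotion labels."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(fused, centroids[label]))
```

Usage: synchronise each modality's stream to the same length, extract a feature vector per modality, fuse them, then classify the fused vector; a real system would substitute learned features and a trained classifier at stages 2 and 4.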
URI: https://bura.brunel.ac.uk/handle/2438/27970
DOI: https://doi.org/10.1109/TAFFC.2023.3286351
Other Identifiers: ORCID iD: Hui Yu https://orcid.org/0000-0002-7655-9228
Appears in Collections:Dept of Computer Science Research Papers

Files in This Item:
File: FullText.pdf (14.25 MB, Adobe PDF) — View/Open
Description: Copyright © 2023 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. See: https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.