Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/27970
Full metadata record
DC Field | Value | Language
dc.contributor.author | Wang, Y | -
dc.contributor.author | Yu, H | -
dc.contributor.author | Gao, W | -
dc.contributor.author | Xia, Y | -
dc.contributor.author | Nduka, C | -
dc.date.accessioned | 2024-01-06T11:09:58Z | -
dc.date.available | 2024-01-06T11:09:58Z | -
dc.date.issued | 2023-06-15 | -
dc.identifier | ORCID iD: Hui Yu https://orcid.org/0000-0002-7655-9228 | -
dc.identifier.citation | Wang, Y. et al. (2023) 'MGEED: A Multimodal Genuine Emotion and Expression Detection Database', IEEE Transactions on Affective Computing, 0 (early access), pp. 1 - 13. doi: 10.1109/TAFFC.2023.3286351. | en_US
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/27970 | -
dc.description.abstract | Multimodal emotion recognition has attracted increasing interest from academia and industry in recent years, since it enables emotion detection using various modalities, such as facial expression images, speech and physiological signals. Although research in this field has grown rapidly, it is still challenging to create a multimodal database containing facial electrical information, owing to the difficulty of capturing natural and subtle facial expression signals, such as optomyography (OMG) signals. To this end, we present the newly developed Multimodal Genuine Emotion and Expression Detection (MGEED) database, the first publicly available database containing facial OMG signals. MGEED comprises 17 subjects with over 150K facial images, 140K depth maps and several modalities of physiological signals, including OMG, electroencephalography (EEG) and electrocardiography (ECG) signals. The emotions of the participants are evoked by video stimuli and the data are collected by a multimodal sensing system. With the collected data, an emotion recognition method is developed based on multimodal signal synchronisation, feature extraction, fusion and emotion prediction. The results show that superior performance can be achieved by fusing the visual, EEG and OMG features. The database can be obtained from https://github.com/YMPort/MGEED. | en_US
dc.description.sponsorship | Engineering and Physical Sciences Research Council (EPSRC) through the Project 4D Facial Sensing and Modelling under Grant EP/N025849/1. | en_US
dc.format.extent | 1 - 13 | -
dc.format.medium | Electronic | -
dc.language | English | -
dc.language.iso | en_US | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.rights | Copyright © 2023 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. See: https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/ | -
dc.rights.uri | https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/ | -
dc.subject | emotion recognition | en_US
dc.subject | facial expression analysis | en_US
dc.subject | multi-modal emotion database | en_US
dc.subject | affective sensing and analysis | en_US
dc.title | MGEED: A Multimodal Genuine Emotion and Expression Detection Database | en_US
dc.type | Article | en_US
dc.identifier.doi | https://doi.org/10.1109/TAFFC.2023.3286351 | -
dc.relation.isPartOf | IEEE Transactions on Affective Computing | -
pubs.issue | early access | -
pubs.publication-status | Published | -
pubs.volume | 0 | -
dc.identifier.eissn | 1949-3045 | -
dc.rights.holder | Institute of Electrical and Electronics Engineers (IEEE) | -
Appears in Collections:Dept of Computer Science Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | Copyright © 2023 Institute of Electrical and Electronics Engineers (IEEE); see rights statement above | 14.25 MB | Adobe PDF | View/Open


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.