Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/9730
Full metadata record
DC Field | Value | Language
dc.contributor.author | Jan, A | -
dc.contributor.author | Meng, H | -
dc.contributor.author | Gaus, YFA | -
dc.contributor.author | Zhang, F | -
dc.contributor.author | Turabzadeh, S | -
dc.coverage.spatial | Orlando, FL, USA | -
dc.date.accessioned | 2015-01-14T09:34:52Z | -
dc.date.available | 2014-11-07 | -
dc.date.available | 2015-01-14T09:34:52Z | -
dc.date.issued | 2014 | -
dc.identifier.citation | Proceedings of the 4th International Workshop on Audio/Visual Emotion Challenge (AVEC14): 73 - 80, (2014) | en_US
dc.identifier.isbn | 978-1-4503-3119-7 | -
dc.identifier.uri | http://dl.acm.org/citation.cfm?doid=2661806.2661812 | -
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/9730 | -
dc.description.abstract | Depression is a state of low mood and aversion to activity that can affect a person's thoughts, behaviour, feelings and sense of well-being. In such a low mood, both the facial expression and the voice differ from those in normal states. In this paper, an automatic system is proposed to predict the Beck Depression Inventory scale from the naturalistic facial expressions of patients with depression. Firstly, features are extracted from the corresponding video and audio signals to represent characteristics of facial and vocal expression under depression. Secondly, a dynamic feature generation method is proposed in the extracted video feature space, based on the idea of the Motion History Histogram (MHH) for 2-D video motion extraction. Thirdly, Partial Least Squares (PLS) regression and linear regression are applied to learn the relationship between the dynamic features and depression scales using training data, and then to predict the depression scale for unseen data. Finally, decision-level fusion is performed to combine the predictions from the video and audio modalities. The proposed approach is evaluated on the AVEC2014 dataset and the experimental results demonstrate its effectiveness. | en_US
dc.description.sponsorship | The work by Asim Jan was supported by the School of Engineering & Design / Thomas Gerald Gray PGR Scholarship. The work by Hongying Meng and Saeed Turabzadeh was partially funded by the award of the Brunel Research Initiative and Enterprise Fund (BRIEF). The work by Yona Falinie Binti Abd Gaus was supported by a Majlis Amanah Rakyat (MARA) Scholarship. | en_US
dc.format.extent | 73 - 80 (8) | -
dc.language.iso | en | en_US
dc.publisher | ACM | en_US
dc.source | The 4th International Workshop on Audio/Visual Emotion Challenge (AVEC14) | -
dc.subject | Affective computing | en_US
dc.subject | Depression recognition | en_US
dc.subject | Beck Depression Inventory | en_US
dc.subject | Facial expression | en_US
dc.subject | Challenge | en_US
dc.title | Automatic depression scale prediction using facial expression dynamics and regression | en_US
dc.type | Conference Paper | en_US
dc.identifier.doi | http://dx.doi.org/10.1145/2661806.2661812 | -
dc.relation.isPartOf | Proceedings of the 4th International Workshop on Audio/Visual Emotion Challenge (AVEC14) | -
pubs.finish-date | 2014-11-07 | -
pubs.place-of-publication | ACM New York, NY, USA | -
pubs.publication-status | Published | -
pubs.start-date | 2014-11-07 | -
pubs.organisational-data | /Brunel | -
pubs.organisational-data | /Brunel/Brunel Staff by College/Department/Division | -
pubs.organisational-data | /Brunel/Brunel Staff by College/Department/Division/College of Engineering, Design and Physical Sciences | -
pubs.organisational-data | /Brunel/Brunel Staff by College/Department/Division/College of Engineering, Design and Physical Sciences/Dept of Electronic and Computer Engineering | -
pubs.organisational-data | /Brunel/Brunel Staff by College/Department/Division/College of Engineering, Design and Physical Sciences/Dept of Electronic and Computer Engineering/Electronic and Computer Engineering | -
pubs.organisational-data | /Brunel/Brunel Staff by Institute/Theme | -
pubs.organisational-data | /Brunel/Brunel Staff by Institute/Theme/Institute of Environmental, Health and Societies | -
pubs.organisational-data | /Brunel/Brunel Staff by Institute/Theme/Institute of Environmental, Health and Societies/Biomedical Engineering and Healthcare Technologies | -
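
Note on the method described in the abstract: the dynamic feature generation step is based on the idea of the Motion History Histogram (MHH). As a rough illustration only (not the authors' code), the sketch below shows one plausible way to compute MHH-style duration histograms over a sequence of per-frame feature vectors; the function name, the activity threshold and the number of duration bins M are assumptions made for the example.

# Minimal sketch, assuming per-frame descriptors are already extracted.
# Not the authors' implementation; threshold and bin count are illustrative.
import numpy as np

def mhh_dynamic_features(feature_seq: np.ndarray, M: int = 5, thresh: float = 0.1) -> np.ndarray:
    """feature_seq: (T, D) array, one row of D feature values per video frame.
    Returns a (D, M) histogram: for each feature dimension, the number of runs of
    m consecutive 'active' frame-to-frame changes (runs longer than M fall in bin M)."""
    T, D = feature_seq.shape
    # Binary activity map: 1 where the feature changed by more than the threshold.
    active = np.abs(np.diff(feature_seq, axis=0)) > thresh      # shape (T-1, D)
    hist = np.zeros((D, M), dtype=np.int64)
    for d in range(D):
        run = 0
        for t in range(T - 1):
            if active[t, d]:
                run += 1
            else:
                if run > 0:
                    hist[d, min(run, M) - 1] += 1
                run = 0
        if run > 0:                                             # close a run at the sequence end
            hist[d, min(run, M) - 1] += 1
    return hist

# Example: 200 frames of a 50-dimensional per-frame descriptor (synthetic data).
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 50))
print(mhh_dynamic_features(feats).shape)   # (50, 5); flatten to obtain a 250-D dynamic feature

The (D, M) histogram is typically flattened into a single dynamic feature vector per recording before regression.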
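The abstract also mentions Partial Least Squares (PLS) and linear regression for mapping features to depression scales, followed by decision-level fusion of the video and audio predictions. The following sketch, using scikit-learn and synthetic data, illustrates that general pipeline; the number of PLS components, the assignment of one regressor per modality and the equal fusion weights are illustrative assumptions, not details taken from the paper.

# Minimal sketch of per-modality regression plus decision-level fusion.
# Synthetic data; split sizes, feature dimensions and fusion weights are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_train, n_test = 50, 50
X_video_tr, X_video_te = rng.normal(size=(n_train, 250)), rng.normal(size=(n_test, 250))
X_audio_tr, X_audio_te = rng.normal(size=(n_train, 100)), rng.normal(size=(n_test, 100))
y_tr = rng.integers(0, 45, size=n_train).astype(float)   # Beck Depression Inventory scale (0-63)

# One regressor per modality (PLS for video, linear regression for audio, as an example).
video_model = PLSRegression(n_components=10).fit(X_video_tr, y_tr)
audio_model = LinearRegression().fit(X_audio_tr, y_tr)

pred_video = video_model.predict(X_video_te).ravel()
pred_audio = audio_model.predict(X_audio_te)

# Decision-level fusion: combine the two modality predictions (equal weights assumed here).
pred_fused = 0.5 * pred_video + 0.5 * pred_audio
print(pred_fused[:5])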
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File | Description | Size | Format
Fulltext.pdf | | 707.2 kB | Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.