Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/26927
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ji, X | -
dc.contributor.author | Dong, Z | -
dc.contributor.author | Lai, CS | -
dc.date.accessioned | 2023-08-09T11:46:56Z | -
dc.date.available | 2023-08-09T11:46:56Z | -
dc.date.issued | 2023-04-04 | -
dc.identifier | ORCID iD: Chun Sing Lai https://orcid.org/0000-0002-4169-4438 | -
dc.identifier.citation | Ji, X., Dong, Z. and Lai, C.S. (2023) 'Circuit Design of Multimodal Attention Memristive Network for Affective Video Content Analysis', Proceedings of the 2023 IEEE International Conference on Industrial Technology (ICIT), Orlando, FL, USA, 4-6 April, pp. 1-5. doi: 10.1109/ICIT58465.2023.10143111. | en_US
dc.identifier.isbn | 979-8-3503-3650-4 (ebk) | -
dc.identifier.isbn | 979-8-3503-3651-1 (PoD) | -
dc.identifier.issn | 2641-0184 | -
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/26927 | -
dc.description.abstract | Affective video content analysis aims at automatically identifying the human emotion triggered by a video, which plays an important role in mental health monitoring. This paper proposes a multimodal attention memristive network for affective video content analysis, which offers an energy-efficient approach with low time consumption and high classification accuracy. The proposed multimodal attention memristive network is built around two core modules. Firstly, a unimodal feature representation module with a cascaded configuration is designed to capture the unique characteristics of the multimodal signals. Then, a multimodal local-global fusion module is proposed to simulate the sensing and processing of multimodal information in the human brain. Furthermore, the proposed system is validated by applying it to affective content analysis. The experimental results demonstrate that the multimodal attention memristive network outperforms existing state-of-the-art methods, with high classification accuracy and low time consumption. | en_US
dc.description.sponsorship | 10.13039/501100001809 - National Natural Science Foundation of China (Grant Number: 62001149) | en_US
dc.format.extent | 1-5 | -
dc.format.medium | Print-Electronic | -
dc.language.iso | en_US | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.rights | Copyright © 2023 Institute of Electrical and Electronics Engineers (IEEE). This article has been accepted for publication in a future proceedings of this conference, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/ICIT58465.2023.10143111, 2023 IEEE International Conference on Industrial Technology (ICIT). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works, by sending a request to pubs-permissions@ieee.org. For more information, see https://www.ieee.org/publications/rights/rights-policies.html | -
dc.rights.uri | https://www.ieee.org/publications/rights/rights-policies.html | -
dc.subject | circuit design | en_US
dc.subject | memristive network | en_US
dc.subject | affective video content analysis | en_US
dc.title | Circuit Design of Multimodal Attention Memristive Network for Affective Video Content Analysis | en_US
dc.type | Conference Paper | en_US
dc.identifier.doi | https://doi.org/10.1109/ICIT58465.2023.10143111 | -
dc.relation.isPartOf | Proceedings of the IEEE International Conference on Industrial Technology | -
pubs.publication-status | Published | -
pubs.volume | 2023-April | -
dc.identifier.eissn | 2643-2978 | -
dc.rights.holder | Institute of Electrical and Electronics Engineers (IEEE) | -
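
Note on the abstract above: it describes a two-stage pipeline, in which cascaded unimodal feature representation is applied to each modality and the resulting local features are combined by an attention-based local-global fusion module for emotion classification. As a rough, hedged illustration of that data flow only, and not of the authors' memristive circuit design (whose details are in the paper), the minimal PyTorch-style sketch below shows how such a pipeline could be wired in software. All class names, feature dimensions, the choice of two modalities, and the attention formulation are assumptions introduced here for illustration.

# Hypothetical sketch: unimodal branches -> attention-weighted local-global fusion -> classifier.
# NOT the paper's memristive circuit; dimensions and module names are made up.
import torch
import torch.nn as nn

class UnimodalBranch(nn.Module):
    """Cascaded feature representation for one modality (e.g., audio or visual)."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.cascade = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )

    def forward(self, x):                      # x: (batch, in_dim)
        return self.cascade(x)                 # (batch, hidden_dim)

class LocalGlobalFusion(nn.Module):
    """Attention-weighted fusion of per-modality (local) features into a global representation."""
    def __init__(self, hidden_dim, num_classes):
        super().__init__()
        self.attn = nn.Linear(hidden_dim, 1)           # scores each modality's local feature
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, local_feats):                    # list of (batch, hidden_dim) tensors
        stacked = torch.stack(local_feats, dim=1)      # (batch, M, hidden_dim)
        weights = torch.softmax(self.attn(stacked), dim=1)  # attention over modalities
        global_feat = (weights * stacked).sum(dim=1)   # (batch, hidden_dim)
        return self.classifier(global_feat)            # emotion-class logits

# Usage with made-up dimensions: 128-d audio and 512-d visual features, 4 emotion classes.
audio_branch = UnimodalBranch(128, 64)
visual_branch = UnimodalBranch(512, 64)
fusion = LocalGlobalFusion(64, num_classes=4)
logits = fusion([audio_branch(torch.randn(8, 128)), visual_branch(torch.randn(8, 512))])
print(logits.shape)  # torch.Size([8, 4])

In the paper, this kind of computation is realised with memristive circuitry rather than digital matrix multiplies; the sketch is intended only to make the roles of the two modules named in the abstract concrete.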
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | Copyright © 2023 Institute of Electrical and Electronics Engineers (IEEE); see the rights statement above | 671.4 kB | Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.