Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/25452
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lai, Y | - |
dc.contributor.author | Guan, W | - |
dc.contributor.author | Luo, L | - |
dc.contributor.author | Guo, Y | - |
dc.contributor.author | Song, H | - |
dc.contributor.author | Meng, H | - |
dc.date.accessioned | 2022-11-05T12:57:11Z | - |
dc.date.available | 2022-11-05T12:57:11Z | - |
dc.date.issued | 2022-10-25 | - |
dc.identifier | ORCiD: Yuping Lai https://orcid.org/0000-0002-3797-1228 | - |
dc.identifier | ORCiD: Wenbo Guan https://orcid.org/0000-0002-4645-6121 | - |
dc.identifier | ORCiD: Lijuan Luo https://orcid.org/0000-0002-3702-372X | - |
dc.identifier | ORCiD: Heping Song https://orcid.org/0000-0002-8583-2804 | - |
dc.identifier | ORCiD: Hongying Meng https://orcid.org/0000-0002-8836-1382 | - |
dc.identifier.citation | Lai, Y. et al. (2024) 'Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification', IEEE Transactions on Neural Networks and Learning Systems, 35 (5), pp. 6948 - 6962. doi: 10.1109/tnnls.2022.3213518 | en_US |
dc.identifier.issn | 2162-237X | - |
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/25452 | - |
dc.description.abstract | The finite inverted beta mixture model (IBMM) has been proven to be efficient in modeling positive vectors. Under the traditional variational inference framework, the critical challenge in Bayesian estimation of the IBMM is that the computational cost of performing inference with large datasets is prohibitively expensive, which often limits the use of Bayesian approaches to small datasets. The recently proposed stochastic variational inference (SVI) framework provides an efficient alternative that enables inference on large datasets. Nevertheless, when the SVI framework is applied to non-Gaussian statistical models, the evidence lower bound (ELBO) cannot be explicitly calculated due to intractable moment computations. Therefore, an algorithm under the SVI framework cannot directly use stochastic optimization to optimize the ELBO, and an analytically tractable solution cannot be derived. To address this problem, we propose an extended version of the SVI framework with more flexibility, namely, the extended SVI (ESVI) framework, which can be applied to many non-Gaussian statistical models. First, approximation strategies are applied to further lower the ELBO and thus avoid intractable moment calculations. Then, stochastic optimization with noisy natural gradients is used to optimize this lower bound. The performance and effectiveness of the proposed method are verified on real-world data. | - |
dc.description.sponsorship | 10.13039/501100001809-National Natural Science Foundation of China (Grant Number: 62272051, 62172193 and 72101157); 10.13039/501100012226-Fundamental Research Funds for the Central Universities (Grant Number: 2022RC16); Research and Development Program of Beijing Municipal Education Commission (Grant Number: KM201910009014). | en_US |
dc.format.extent | 6948 - 6962 | - |
dc.format.medium | Print-Electronic | - |
dc.language.iso | en_US | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US |
dc.rights | Copyright © 2022 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. See: https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/ | - |
dc.rights.uri | https://journals.ieeeauthorcenter.ieee.org/become-an-ieee-journal-author/publishing-ethics/guidelines-and-policies/post-publication-policies/ | - |
dc.subject | extended stochastic variational inference | en_US |
dc.subject | mixture models | en_US |
dc.subject | Bayesian estimation | en_US |
dc.subject | text categorization | en_US |
dc.subject | network traffic classification | en_US |
dc.subject | misuse intrusion detection | en_US |
dc.title | Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification | en_US |
dc.type | Article | en_US |
dc.identifier.doi | https://doi.org/10.1109/tnnls.2022.3213518 | - |
dc.relation.isPartOf | IEEE Transactions on Neural Networks and Learning Systems | - |
pubs.issue | 5 | - |
pubs.publication-status | Published | - |
pubs.volume | 35 | - |
dc.identifier.eissn | 2162-2388 | - |
dcterms.dateAccepted | 2022-10-04 | - |
dc.rights.holder | Institute of Electrical and Electronics Engineers (IEEE) | - |
Appears in Collections: | Dept of Electronic and Electrical Engineering Research Papers |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
FullText.pdf | Copyright © 2022 Institute of Electrical and Electronics Engineers (IEEE). Personal use of this material is permitted; all other uses require permission from IEEE (see the rights statement above). | 23.14 MB | Adobe PDF | View/Open |
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.