Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/25962
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shi, Y | - |
dc.contributor.author | Han, L | - |
dc.contributor.author | Huang, W | - |
dc.contributor.author | Chang, S | - |
dc.contributor.author | Dong, Y | - |
dc.contributor.author | Dancey, D | - |
dc.contributor.author | Han, L | - |
dc.date.accessioned | 2023-02-14T13:54:21Z | - |
dc.date.available | 2023-02-14T13:54:21Z | - |
dc.date.issued | 2021-12-06 | - |
dc.identifier.citation | Shi, Y. et al. (2022) 'A Biologically Interpretable Two-Stage Deep Neural Network (BIT-DNN) for Vegetation Recognition From Hyperspectral Imagery', IEEE Transactions on Geoscience and Remote Sensing, 60, pp. 1 - 20. doi: 10.1109/TGRS.2021.3058782. | en_US |
dc.identifier.issn | 0196-2892 | - |
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/25962 | - |
dc.description.abstract | Copyright © The Authors. Spectral-spatial-based deep learning models have recently proven to be effective in hyper-spectral image (HSI) classification for various earth monitoring applications such as land cover classification and agricultural monitoring. However, due to the nature of 'black-box' model representation, how to explain and interpret the learning process and the model decision, especially for vegetation classification, remains an open challenge. This study proposes a novel interpretable deep learning model - a biologically interpretable two-stage deep neural network (BIT-DNN) - by incorporating a prior-knowledge-based spectral-spatial feature transformation (i.e., based on the biophysical and biochemical attributes and their hierarchical structures of target entities) into the proposed framework, capable of achieving both high accuracy and interpretability on HSI-based classification tasks. The proposed model introduces a two-stage feature learning process: in the first stage, an enhanced interpretable feature block extracts the low-level spectral features associated with the biophysical and biochemical attributes of target entities; and in the second stage, an interpretable capsule block extracts and encapsulates the high-level joint spectral-spatial features representing the hierarchical structure of biophysical and biochemical attributes of these target entities, which provides the model an improved performance on classification and intrinsic interpretability with reduced computational complexity. We have tested and evaluated the model using four real HSI data sets for four separate tasks (i.e., plant species classification, land cover classification, urban scene recognition, and crop disease recognition tasks). The proposed model has been compared with five state-of-the-art deep learning models. The results demonstrate that the proposed model has competitive advantages in terms of both classification accuracy and model interpretability, especially for vegetation classification. | en_US |
dc.description.sponsorship | 10.13039/501100000268 - Biotechnology and Biological Sciences Research Council (BBSRC) (Grant Numbers: BB/R019983/1 and BB/S020969/1); 10.13039/100010897 - Newton Fund Institutional Links through the Newton-Ungku Omar Fund partnership, funded by the U.K. Department of Business, Energy, and Industrial Strategy (BEIS) and the Malaysian Industry-Government Group for High Technology, and delivered by the British Council (Grant Number: ID 332438911) | en_US |
dc.format.extent | 1 - 20 | - |
dc.format.medium | Print-Electronic | - |
dc.publisher | Institute of Electrical and Electronics Engineers | en_US |
dc.rights | Copyright © The Authors. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/ | - |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | - |
dc.subject | Classification | en_US |
dc.subject | deep learning | en_US |
dc.subject | hyper-spectral images (HSIs) | en_US |
dc.subject | interpretability | en_US |
dc.title | A Biologically Interpretable Two-Stage Deep Neural Network (BIT-DNN) for Vegetation Recognition from Hyperspectral Imagery | en_US |
dc.type | Article | en_US |
dc.identifier.doi | https://doi.org/10.1109/TGRS.2021.3058782 | - |
dc.relation.isPartOf | IEEE Transactions on Geoscience and Remote Sensing | - |
pubs.publication-status | Published | - |
pubs.volume | 60 | - |
dc.identifier.eissn | 1558-0644 | - |
Appears in Collections: | Dept of Computer Science Research Papers |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
FullText.pdf | Copyright © The Authors. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/ | 27.8 MB | Adobe PDF | View/Open |
This item is licensed under a Creative Commons License