Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/16620
Full metadata record
DC Field | Value | Language
dc.contributor.author | Brunel, SMJ | -
dc.contributor.author | Li, Y | -
dc.contributor.author | Liu, X | -
dc.date.accessioned | 2018-07-20T14:51:59Z | -
dc.date.available | 2017-12-01 | -
dc.date.available | 2018-07-20T14:51:59Z | -
dc.date.issued | 2017 | -
dc.identifier.citation | International Journal of Machine Learning and Computing, 2017, 7 (6), pp. 232 - 237 | en_US
dc.identifier.issn | 2010-3700 | -
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/16620 | -
dc.description.abstract | This paper presents a novel approach to automatic shadow identification and removal from video input. Based on the observation that the length and position of a shadow change linearly over a relatively long period in outdoor environments, due to the relative movement of the sun, we can distinguish a shadow from other dark regions in an input video. We then identify the Reference Shadow as the region with the highest confidence of the aforementioned linear change. This Reference Shadow is used to fit the shadow-free invariant model, with which shadow-free invariant images can be computed for all frames of the input video. Our method does not require camera calibration, and shadows from both stationary and moving objects are detected automatically. | en_US
dc.format.extent | 232 - 237 | -
dc.language.iso | en | en_US
dc.publisher | IJMLC | en_US
dc.subject | Invariant image | en_US
dc.subject | Reference shadow | en_US
dc.subject | Video surveillance | en_US
dc.subject | Shadow-less image | en_US
dc.subject | Shadow detection | en_US
dc.title | Removing shadows from video | en_US
dc.type | Article | en_US
dc.identifier.doi | http://dx.doi.org/10.18178/ijmlc.2017.7.6.652 | -
dc.relation.isPartOf | International Journal of Machine Learning and Computing | -
pubs.issue | 6 | -
pubs.publication-status | Published | -
pubs.volume | 7 | -
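
Note on the method described in the abstract above: the pipeline it outlines is (1) track candidate dark regions over time, (2) score how linearly each region's length and position drift (a true outdoor shadow drifts linearly under the sun's relative movement), (3) take the best-scoring region as the Reference Shadow, and (4) use it to fit a shadow-free invariant model applied to all frames. The sketch below illustrates these steps in NumPy; it is not the authors' implementation. The function names, the min-of-two-R² confidence measure, and the 1-D log-chromaticity form of the invariant (the common reading of "shadow-free invariant model") are all assumptions for illustration.

    import numpy as np

    def linearity_score(t, y):
        """R^2 of a least-squares line fit y ~ a*t + b. Values near 1
        mean the series drifts linearly with time, as an outdoor shadow
        should under the sun's relative movement."""
        A = np.column_stack([t, np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        ss_res = np.sum((y - A @ coef) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0

    def find_reference_shadow(tracks):
        """Pick the dark-region track whose length and x-position change
        most linearly over time; this plays the role of the Reference
        Shadow. tracks: {region_id: (t, length, cx)} per tracked region.
        Taking the min of the two scores is an illustrative choice."""
        def confidence(track):
            t, length, cx = track
            return min(linearity_score(t, length), linearity_score(t, cx))
        return max(tracks, key=lambda rid: confidence(tracks[rid]))

    def invariant_image(rgb, theta):
        """1-D illumination-invariant grey image: project per-pixel
        log-chromaticities onto direction theta (radians).
        rgb: float array, H x W x 3."""
        eps = 1e-6  # avoid log(0) on dark pixels
        log_rg = np.log((rgb[..., 0] + eps) / (rgb[..., 1] + eps))
        log_bg = np.log((rgb[..., 2] + eps) / (rgb[..., 1] + eps))
        return log_rg * np.cos(theta) + log_bg * np.sin(theta)

    def fit_invariant_angle(rgb, shadow_mask, lit_mask):
        """Fit the invariant direction from the Reference Shadow: search
        for the theta at which shadowed and sunlit samples of the same
        surface receive the same invariant value."""
        thetas = np.deg2rad(np.arange(0.0, 180.0, 1.0))
        gaps = [abs(invariant_image(rgb, th)[shadow_mask].mean()
                    - invariant_image(rgb, th)[lit_mask].mean())
                for th in thetas]
        return thetas[int(np.argmin(gaps))]

Once theta has been fitted from the Reference Shadow in one frame, invariant_image(frame, theta) yields a shadow-free invariant image for every subsequent frame, matching the abstract's claim that the model is fitted once and then applied to all frames without camera calibration.
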
Appears in Collections: Dept of Computer Science Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | | 935.73 kB | Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.