Please use this identifier to cite or link to this item: https://idr.l4.nitk.ac.in/jspui/handle/123456789/7199
Full metadata record
DC Field | Value | Language
dc.contributor.author | Asha, C.S. | -
dc.contributor.author | Narasimhadhan, A.V. | -
dc.date.accessioned | 2020-03-30T09:58:37Z | -
dc.date.available | 2020-03-30T09:58:37Z | -
dc.date.issued | 2016
dc.identifier.citation | Procedia Computer Science, 2016, Vol. 89, pp. 614-622 | en_US
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/7199 | -
dc.description.abstract | Visual tracking is a difficult problem in computer vision due to illumination, pose, scale, and appearance variations of the object. Most trackers use either grayscale/color information or gradient information for image description; however, multiple channel features provide more information than a single feature alone. Recently, correlation filter based video tracking has gained popularity due to its efficiency and high frame rate. Existing correlation filters use a fixed learning rate to update the filter template in every frame. In this paper, a method for adapting the learning rate in a correlation filter (CF) is presented, which depends on the position of the target in the present and previous frames (target velocity). The method uses integral channel features in a correlation filter framework with an adaptive learning rate to track the object efficiently. We evaluate this technique on 12 challenging video sequences from the visual object tracking (VOT challenge) datasets. The proposed technique can track an object irrespective of illumination variation, occlusion, and scale change, and outperforms state-of-the-art trackers. © 2016 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license. | en_US
dc.title | Adaptive Learning Rate for Visual Tracking Using Correlation Filters | en_US
dc.type | Book chapter | en_US
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
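
The abstract describes updating the correlation filter template with a learning rate that depends on target velocity, i.e. the displacement of the target center between the previous and current frames. Below is a minimal Python sketch of that general idea; the function names, the linear velocity-to-rate mapping, and the constants eta_min, eta_max, and v_ref are illustrative assumptions and are not taken from the paper.

import numpy as np

def adaptive_learning_rate(pos_curr, pos_prev, eta_min=0.01, eta_max=0.25, v_ref=10.0):
    """Scale the CF template learning rate with target velocity.

    pos_curr, pos_prev: (x, y) target centers in the current and previous frames.
    eta_min, eta_max, v_ref are illustrative constants, not values from the paper.
    """
    velocity = np.linalg.norm(np.asarray(pos_curr, dtype=float) - np.asarray(pos_prev, dtype=float))
    # Faster motion -> larger learning rate so the template adapts quickly;
    # a slow or static target -> smaller rate to resist model drift.
    return eta_min + (eta_max - eta_min) * min(velocity / v_ref, 1.0)

def update_template(template, new_estimate, eta):
    # Linear-interpolation template update common to correlation filter trackers,
    # here driven by the per-frame adaptive rate instead of a fixed one.
    return (1.0 - eta) * template + eta * new_estimate

For example, a tracker loop would call adaptive_learning_rate with the detected target positions of two consecutive frames and pass the result to update_template in place of a fixed learning rate.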