Deep learning has been widely used in visual tracking due to the strong feature extraction ability of convolutional neural networks (CNNs). Many trackers pre-train a CNN on an off-line database and fine-tune it during tracking, which improves representation ability and adapts the model to appearance variations of the object of interest. However, since target information is limited, the network is likely to overfit to a single target state. In this paper, an update strategy composed of two modules is proposed. First, we fine-tune the pre-trained CNN using active learning, which iteratively emphasizes the most discriminative data. Second, artificial convolutional features generated from an empirical distribution are employed to train the fully connected layers, which compensates for the deficiency of training examples. Experiments on the VOT2016 benchmark show that our algorithm outperforms many state-of-the-art trackers.
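The second module above can be illustrated with a minimal sketch: fit a simple empirical distribution (here, a per-dimension Gaussian, which is only an illustrative assumption; the paper does not specify its exact distribution) to the real convolutional features of the target, then draw synthetic feature vectors to enlarge the training pool for the fully connected layers. The function name `augment_features` is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_features(real_feats, n_synthetic):
    """Fit a per-dimension Gaussian to the real convolutional features
    and sample synthetic vectors from it. This stands in for the
    paper's 'empirical distribution' (an assumption, not its method)."""
    mu = real_feats.mean(axis=0)
    sigma = real_feats.std(axis=0) + 1e-8  # avoid zero variance
    return rng.normal(mu, sigma, size=(n_synthetic, real_feats.shape[1]))

# e.g. 20 real positive feature vectors of dimension 512
real = rng.normal(0.5, 0.1, size=(20, 512))
synthetic = augment_features(real, 100)
train_set = np.vstack([real, synthetic])  # enlarged pool for the FC layers
```

The enlarged `train_set` would then be fed, with labels, to the classifier head, easing the shortage of genuine target samples.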
Robust object tracking is a challenging task in computer vision due to disturbances such as deformation, fast motion and, especially, occlusion of the tracked object. When occlusions occur, image data become unreliable and insufficient for the tracker to depict the object of interest; therefore, most trackers are prone to fail under occlusion. In this paper, an occlusion judgement and handling method based on segmentation of the target is proposed. If the target is occluded, its speed and direction must differ from those of the occluding objects, so motion features are emphasized. Considering its efficiency and robustness, the Kernelized Correlation Filter (KCF) tracker is adopted as a pre-tracker to obtain a predicted position of the target. By analyzing long-term motion cues of the objects around this position, the tracked object is labelled, and occlusion can be detected easily. Experimental results suggest that our tracker achieves favorable performance and effectively handles occlusion and drifting problems.
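The motion-cue reasoning above can be sketched as a toy check: compare the target's velocity with those of objects near the predicted position, and flag occlusion when a neighbour moves with a clearly different speed or direction. The thresholds and the function name `is_occluded` are illustrative assumptions, not the paper's actual criteria.

```python
import numpy as np

def is_occluded(target_vel, neighbor_vels, speed_tol=0.5, angle_tol=np.pi / 4):
    """Flag occlusion when any nearby object's motion differs markedly
    from the target's (toy thresholds; the paper's exact criteria differ).
    Velocities are 2-D (vx, vy) vectors in pixels/frame."""
    t_speed = np.linalg.norm(target_vel)
    for v in neighbor_vels:
        n_speed = np.linalg.norm(v)
        if t_speed == 0 or n_speed == 0:
            continue  # a stationary object gives no direction cue
        cos_angle = np.dot(target_vel, v) / (t_speed * n_speed)
        angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
        speed_diff = abs(t_speed - n_speed) / max(t_speed, n_speed)
        if speed_diff > speed_tol or angle > angle_tol:
            return True
    return False

# target moving right; a neighbour moving upward at similar speed
print(is_occluded(np.array([1.0, 0.0]), [np.array([0.0, 1.0])]))  # True
```

In a full pipeline, the velocities would come from long-term motion cues of segmented objects around the KCF-predicted position, and a positive result would trigger the occlusion-handling branch instead of a normal model update.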