Deep Directional Network for Object Tracking

Algorithms ◽  
2018 ◽  
Vol 11 (11) ◽  
pp. 178 ◽  
Author(s):  
Zhaohua Hu ◽  
Xiaoyi Shi

Existing object trackers are mostly based on correlation filtering or neural network frameworks. Correlation filtering is fast but less accurate, while neural networks achieve high precision at the cost of heavy computation and longer tracking times. To address this trade-off, we use a convolutional neural network (CNN) to learn the direction of object motion. We propose a CNN-based target direction classification network that provides a directional shortcut to the tracking target, unlike a particle filter, which searches for the target randomly. The network determines scale variation end to end and is robust on sequences with scale changes. In the pretraining stage, the Visual Object Tracking Challenges (VOT) dataset is used to train the network on positive/negative sample classification and direction classification. In the online tracking stage, a sliding-window operation guided by the predicted direction determines the exact position of the object. Because the network evaluates only a single sample, the computational burden stays low, and positive and negative sample redetection strategies ensure that the target is not lost. One-pass evaluation (OPE) results on the Object Tracking Benchmark (OTB) show that the algorithm is highly robust and faster than several deep trackers.
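The online stage described above, a sliding-window search steered by a predicted direction class, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the eight-way direction encoding, the `score_fn` patch scorer, and all parameter values are assumptions standing in for the paper's CNN.

```python
# Hedged sketch of a direction-guided sliding-window search.
# `score_fn` stands in for the CNN's positive/negative sample classifier;
# the direction class stands in for the network's direction output.

DIRECTIONS = {
    "N": (0, -1), "NE": (1, -1), "E": (1, 0), "SE": (1, 1),
    "S": (0, 1), "SW": (-1, 1), "W": (-1, 0), "NW": (-1, -1),
}

def directional_search(box, direction, score_fn, step=4, max_steps=20, threshold=0.5):
    """Slide the window along `direction` until `score_fn` accepts a patch.

    box: (x, y, w, h); direction: a key of DIRECTIONS;
    score_fn: maps a box to a confidence in [0, 1].
    Returns the best-scoring box visited and its score.
    """
    dx, dy = DIRECTIONS[direction]
    x, y, w, h = box
    best_box, best_score = box, score_fn(box)
    for _ in range(max_steps):
        # Step the window one increment in the predicted direction.
        x, y = x + dx * step, y + dy * step
        s = score_fn((x, y, w, h))
        if s > best_score:
            best_box, best_score = (x, y, w, h), s
        if s >= threshold:
            break  # confident detection: stop early, one sample per step
    return best_box, best_score
```

Because each step scores a single candidate window instead of a cloud of random particles, the per-frame cost stays proportional to the number of steps taken, which is the low-computation property the abstract claims.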

2020 ◽  
Author(s):  
Dominika Przewlocka ◽  
Mateusz Wasala ◽  
Hubert Szolc ◽  
Krzysztof Blachut ◽  
Tomasz Kryjak

In this paper, research on optimising visual object tracking with a Siamese neural network for embedded vision systems is presented. The solution was assumed to operate in real time, preferably on a high-resolution video stream, with the lowest possible energy consumption. To meet these requirements, techniques such as reduced computational precision and pruning were considered. Brevitas, a tool dedicated to the optimisation and quantisation of neural networks for FPGA implementation, was used. A number of training scenarios with varying levels of optimisation were tested, from 16-bit uniform integer quantisation down to ternary and binary networks, and the influence of these optimisations on tracking performance was evaluated. The size of the convolutional filters could be reduced by up to 10 times relative to the original network. The obtained results indicate that quantisation can significantly reduce the memory and computational complexity of the proposed network while still enabling precise tracking, thus allowing its use in embedded vision systems. Moreover, quantisation of the weights positively affects training by decreasing overfitting.
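The quantisation levels the abstract ranges over, from uniform integer quantisation down to ternary weights, can be illustrated with a generic sketch. The functions below are hypothetical stand-ins, not Brevitas API calls, and the threshold heuristic in the ternary case is one common choice, not necessarily the one used in the paper.

```python
# Hedged sketch of two weight-quantisation schemes.
# quantize_uniform: symmetric uniform quantisation to a given bit width.
# quantize_ternary: weights collapse to {-s, 0, +s} with a learned-free
# magnitude threshold (delta_ratio is an illustrative heuristic).

def quantize_uniform(w, bits):
    """Symmetric uniform quantisation of a list of weights to `bits` bits."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 127 for 8 bits
    scale = max(abs(x) for x in w) / qmax or 1.0
    return [round(x / scale) * scale for x in w]

def quantize_ternary(w, delta_ratio=0.7):
    """Ternary quantisation: each weight becomes -s, 0, or +s."""
    mean_abs = sum(abs(x) for x in w) / len(w)
    delta = delta_ratio * mean_abs        # zero out small-magnitude weights
    kept = [abs(x) for x in w if abs(x) > delta]
    s = sum(kept) / len(kept) if kept else 0.0
    return [s if x > delta else -s if x < -delta else 0.0 for x in w]
```

The memory saving is direct: a ternary weight needs about 2 bits instead of 32, and on an FPGA the multiplications degenerate into sign flips and skips, which is why such schemes suit embedded vision systems.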


2019 ◽  
Vol 78 (24) ◽  
pp. 34725-34744 ◽  
Author(s):  
Yang Huang ◽  
Zhiqiang Zhao ◽  
Bin Wu ◽  
Zhuolin Mei ◽  
Zongmin Cui ◽  
...  

2011 ◽  
Vol 22 (2,3) ◽  
pp. 69-81 ◽  
Author(s):  
José Everardo B. Maia ◽  
Guilherme A. Barreto ◽  
André L.V. Coelho
