Tracktor: image-based automated tracking of animal movement and behaviour

2018 ◽  
Author(s):  
Vivek Hari Sridhar ◽  
Dominique G. Roche ◽  
Simon Gingins

Abstract
1. Automated movement tracking is essential for high-throughput quantitative analyses of the behaviour and kinematics of organisms. Automated tracking also improves replicability by avoiding observer biases and allowing reproducible workflows. However, few automated tracking programs exist that are open access, open source, and capable of tracking unmarked organisms in noisy environments.
2. Tracktor is an image-based tracking freeware designed to perform single-object tracking in noisy environments, or multi-object tracking in uniform environments while maintaining individual identities. Tracktor is code-based but requires no coding skills other than the user being able to specify tracking parameters in a designated location, much like in a graphical user interface (GUI). The installation and use of the software are fully detailed in a user manual.
3. Through four examples of common tracking problems, we show that Tracktor is able to track a variety of animals in diverse conditions. The main strengths of Tracktor lie in its ability to track single individuals under noisy conditions (e.g. when the object shape is distorted), its robustness to perturbations (e.g. changes in lighting conditions during the experiment), and its capacity to track multiple individuals while maintaining their identities. Additionally, summary statistics and plots allow measuring and visualizing common metrics used in the analysis of animal movement (e.g. cumulative distance, speed, acceleration, activity, time spent in specific areas, distance to neighbour, etc.).
4. Tracktor is a versatile, reliable, easy-to-use automated tracking software that is compatible with all operating systems and provides many features not available in other existing freeware. Access Tracktor and the complete user manual here: https://github.com/vivekhsridhar/tracktor
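The summary statistics the abstract mentions (cumulative distance, speed, acceleration) follow directly from the tracked centroid positions. Tracktor's own implementation is in the linked repository; the sketch below is only an illustrative NumPy reconstruction of these metrics, and the function name and `fps` parameter are assumptions for illustration.

```python
import numpy as np

def movement_metrics(xy, fps):
    """Cumulative distance, per-frame speed and acceleration
    from a trajectory of (x, y) centroid positions."""
    xy = np.asarray(xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # distance moved per frame
    cumulative = step.sum()
    speed = step * fps              # distance units per second
    accel = np.diff(speed) * fps    # change in speed per second
    return cumulative, speed, accel

# toy trajectory: a straight line covered at constant speed
traj = [(0, 0), (1, 0), (2, 0), (3, 0)]
dist, speed, accel = movement_metrics(traj, fps=30)
# dist → 3.0; speed → [30, 30, 30]; accel → [0, 0]
```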

2019 ◽  
Vol 17 (2) ◽  
pp. 264-271
Author(s):  
Asha Narayana ◽  
Narasimhadhan Venkata

Object tracking is a fundamental task in video surveillance, human-computer interaction and activity analysis. One of the common challenges in visual object tracking is illumination variation. A large number of tracking methods have been proposed in recent years, and the median flow tracker is one that can handle various challenges. The median flow tracker tracks an object using the Lucas-Kanade optical flow method, which is sensitive to illumination variation and hence fails when sudden illumination changes occur between frames. In this paper, we propose an enhanced median flow tracker that is invariant to abruptly varying lighting conditions. In this approach, illumination variation is compensated by modifying the Discrete Cosine Transform (DCT) coefficients of an image in the logarithmic domain. Illumination variations are mainly reflected in the low-frequency coefficients of an image; therefore, a fixed number of low-frequency DCT coefficients are discarded. Moreover, the DC (zero-frequency) coefficient is kept almost constant throughout the video, based on entropy difference, to minimize the impact of sudden lighting variations. In addition, each video frame is enhanced with a pixel transformation technique that improves the contrast of dull images based on the probability distribution of pixels. The proposed scheme effectively handles both gradual and abrupt changes in the illumination of the object. Experiments conducted on videos with fast-changing illumination show that the proposed method improves the median flow tracker and outperforms state-of-the-art trackers in accuracy.
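The compensation step described above (log-domain DCT, discarding low-frequency coefficients while preserving the DC term) can be sketched as follows. This is a minimal reconstruction, not the authors' code: the block size `k` is an illustrative assumption, and the entropy-based DC stabilization and contrast-enhancement steps are omitted.

```python
import numpy as np
from scipy.fft import dctn, idctn

def compensate_illumination(gray, k=3):
    """Suppress slow illumination gradients by zeroing the first k x k
    low-frequency DCT coefficients (except DC) in the log domain."""
    log_img = np.log1p(gray.astype(float))   # log domain: illumination becomes additive
    coeffs = dctn(log_img, norm="ortho")
    dc = coeffs[0, 0]                        # preserve overall brightness
    coeffs[:k, :k] = 0.0                     # discard low-frequency (illumination) terms
    coeffs[0, 0] = dc
    return np.expm1(idctn(coeffs, norm="ortho"))

# a pure horizontal gradient is mostly removed, a flat image is untouched
ramp = np.tile(np.linspace(0.0, 255.0, 64), (64, 1))
flat = compensate_illumination(ramp)
```

Zeroing a fixed low-frequency block is the simplest realization of "a fixed number of DCT coefficients are ignored"; the paper's actual selection rule may differ.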


2019 ◽  
Vol 10 (6) ◽  
pp. 815-820 ◽  
Author(s):  
Vivek Hari Sridhar ◽  
Dominique G. Roche ◽  
Simon Gingins

2021 ◽  
Author(s):  
C.-H. Huck Yang ◽  
Mohit Chhabra ◽  
Y.-C. Liu ◽  
Quan Kong ◽  
Tomoaki Yoshinaga ◽  
...  

2012 ◽  
Vol 246-247 ◽  
pp. 179-183
Author(s):  
Yu Bin Yang ◽  
Jiao Jiao Gu ◽  
Xiao Yu Zhang ◽  
Zhi Liu

Object tracking is an important application in the field of computer vision. In practice, automated tracking systems can rarely meet the required performance. This paper improves an attention-based tracking framework with AdaBoost and gaze selection. The object classifier is implemented using AdaBoost to recognize digits from the MNIST dataset, and gaze selection is performed at the initial object position. The performance of the framework is evaluated using digit videos generated from the MNIST dataset with clutter. In general, the framework is robust to changes in motion routes and degree of clutter.
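The paper's classifier applies AdaBoost to MNIST digits; its exact implementation is not given here, so the following is a from-scratch sketch of the AdaBoost principle itself (decision-stump weak learners, exponential re-weighting of misclassified samples) on a toy binary problem. All names are illustrative.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost with 1-D threshold stumps; labels y in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                  # uniform sample weights
    ensemble = []
    for _ in range(n_rounds):
        best = None                          # exhaustive search for best stump
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-10)                # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)       # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, s in ensemble)
    return np.where(score >= 0, 1, -1)

# toy 1-D problem, separable at x >= 2
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
ens = train_adaboost(X, y, n_rounds=5)
```

A multiclass variant (e.g. SAMME) with image-patch features would be needed for actual MNIST digits; the weighting scheme is the same.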


IEEE Access ◽  
2018 ◽  
Vol 6 ◽  
pp. 29283-29296 ◽  
Author(s):  
Gaocheng Liu ◽  
Shuai Liu ◽  
Khan Muhammad ◽  
Arun Kumar Sangaiah ◽  
Faiyaz Doctor

Animals ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. 1867
Author(s):  
Zsofia Kelemen ◽  
Herwig Grimm ◽  
Claus Vogl ◽  
Mariessa Long ◽  
Jessika M. V. Cavalleri ◽  
...  

Housing and management conditions strongly influence the health, welfare and behaviour of horses. Consequently, objective and quantifiable comparisons between domestic environments and their influence on different equine demographics are needed to establish evidence-based criteria to assess and optimize horse welfare. Therefore, the present study aimed to measure and compare the time budgets (=percentage of time spent on specific activities) of horses with chronic orthopaedic disease and geriatric (≥20 years) horses living in different husbandry systems using an automated tracking device. Horses spent 42% (range 38.3–44.8%) of their day eating, 39% (range 36.87–44.9%) resting, and 19% (range 17–20.4%) in movement, demonstrating that geriatric horses and horses suffering from chronic orthopaedic disease can exhibit behaviour time budgets equivalent to healthy controls. Time budget analysis revealed significant differences between farms, turn-out conditions and time of day, and could identify potential areas for improvement. Horses living in open-air group housing on a paddock had a more uniform temporal distribution of feeding and movement activities with less pronounced peaks compared to horses living in more restricted husbandry systems.
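A time budget as defined above is just the share of logged intervals per activity. A minimal stand-alone sketch (the label names and log format are illustrative, not those of the tracking device used in the study):

```python
from collections import Counter

def time_budget(activity_log):
    """Percentage of observations per activity from a sequence of
    per-interval activity labels emitted by a tracking device."""
    counts = Counter(activity_log)
    total = sum(counts.values())
    return {a: 100.0 * c / total for a, c in counts.items()}

# 100 equal intervals matching the paper's mean percentages
log = ["eating"] * 42 + ["resting"] * 39 + ["movement"] * 19
budget = time_budget(log)
# → {'eating': 42.0, 'resting': 39.0, 'movement': 19.0}
```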


Author(s):  
Jerrold L. Abraham

Inorganic particulate material of diverse types is present in the ambient and occupational environment, and exposure to such materials is a well-recognized cause of some lung diseases. To investigate the interaction of inhaled inorganic particulates with the lung, it is necessary to obtain quantitative information on the particulate burden of lung tissue in a wide variety of situations. The vast majority of diagnostic and experimental tissue samples (biopsies and autopsies) are fixed with formaldehyde solutions, dehydrated with organic solvents, and embedded in paraffin wax. Over the past 16 years, I have attempted to obtain maximal analytical use of such tissue with minimal preparative steps. Unique diagnostic and research data result from both qualitative and quantitative analyses of sections. Most of these data have been related to inhaled inorganic particulates in lungs, but the basic methods are applicable to any tissues. The preparations are primarily designed for SEM use, but they are stable for storage and transport to other laboratories and several other instruments (e.g., for SIMS techniques).


Author(s):  
K. Botterill ◽  
R. Allen ◽  
P. McGeorge

The Multiple-Object Tracking paradigm has most commonly been utilized to investigate how subsets of targets can be tracked from among a set of identical objects. Recently, this research has been extended to examine the role of featural information when the objects being tracked can be individuated. We report on a study whose findings suggest that, while participants can hold featural information for only roughly two targets, this task does not detrimentally affect tracking performance, pointing to a discontinuity between the cognitive processes that subserve spatial location and featural information.

