Characterizing animal movement patterns across different scales and habitats using information theory

2018 ◽  
Author(s):  
Kehinde Owoeye ◽  
Mirco Musolesi ◽  
Stephen Hailes

Abstract Understanding the movement patterns of animals across different spatio-temporal scales, conditions, habitats and contexts is becoming increasingly important for addressing a series of questions in animal behaviour studies, such as mapping migration routes, evaluating resource use, modelling epidemic spreading in a population and developing strategies for animal conservation, as well as understanding several emerging patterns related to feeding, growth and reproduction. In recent times, information theory has been successfully applied in several fields of science, in particular for understanding the dynamics of complex systems and characterizing adaptive social systems, such as the dynamics of entities as individuals and as parts of groups. In this paper, we describe a series of non-parametric information-theoretic measures that can be used to derive new insights about animal behaviour, with a specific focus on movement patterns: Shannon entropy, mutual information, Kullback-Leibler divergence and Kolmogorov complexity. In particular, we believe that the metrics presented in this paper can be used to formulate new hypotheses that can potentially be verified through a set of different observations. We show how these measures can be used to characterize the movement patterns of several animals across different habitats and scales. Specifically, we show the effectiveness of Shannon entropy in characterizing the movement of sheep with Batten disease, of mutual information in measuring association in pigeons, of Kullback-Leibler divergence in studying the flights of turkey vultures, and of Kolmogorov complexity in finding similarities in the movement patterns of animals across different scales and habitats. Finally, we discuss the limitations of these methods and outline the challenges in this research area.
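As a minimal sketch of how these four measures can be computed in practice (not the authors' code; it assumes trajectories have already been discretized into symbols such as heading sectors or grid-cell identifiers, and uses compressed size as a crude stand-in for Kolmogorov complexity, which is uncomputable):

```python
# Minimal sketch of the four measures on symbolized movement data.
import math
import zlib
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a symbol sequence."""
    counts, n = Counter(symbols), len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """Mutual information (bits) between two aligned symbol sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def kl_divergence(p_counts, q_counts):
    """D(P||Q) in bits; assumes the support of P is contained in Q's."""
    np_, nq = sum(p_counts.values()), sum(q_counts.values())
    return sum((c / np_) * math.log2((c / np_) / (q_counts[s] / nq))
               for s, c in p_counts.items())

def complexity_proxy(symbols):
    """Crude Kolmogorov-complexity proxy: compressed size of the sequence."""
    return len(zlib.compress("".join(map(str, symbols)).encode()))

# Hypothetical heading sectors (0-3) for two animals.
a = [0, 0, 1, 1, 2, 2, 3, 3, 0, 0]
b = [0, 1, 1, 2, 2, 3, 3, 0, 0, 1]
print(shannon_entropy(a), mutual_information(a, b))
print(kl_divergence(Counter(a), Counter(b)), complexity_proxy(a))
```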

Author(s):  
Wentao Huang ◽  
Kechen Zhang

Information theory is widely used across disciplines, yet the effective calculation of Shannon mutual information is typically not an easy task for many practical applications, including problems of neural population coding in computational and theoretical neuroscience. Asymptotic formulas based on Fisher information may provide accurate approximations to mutual information, but this approach is restricted to continuous variables because the calculation requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations based on the Kullback-Leibler and Rényi divergences. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding; these asymptotic formulas hold for discrete variables as well, since they require no differentiability. In particular, one of our approximation formulas has consistent performance and good accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between discrete stimuli and the responses of a large neural population. These approximation formulas may bring convenience to the application of information theory to many practical and theoretical problems.
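The paper's own approximation formulas are not reproduced here; the sketch below illustrates the exact identity they refine, I(Θ;R) = E_Θ[D_KL(p(r|θ) ∥ p(r))], evaluated by brute-force enumeration for a hypothetical population of three Bernoulli neurons and two discrete stimuli:

```python
# Exact mutual information for a tiny neural population via the identity
# I(Theta;R) = sum_theta p(theta) * D_KL( p(r|theta) || p(r) ).
# Toy check only: N Bernoulli neurons, so the 2^N response space is enumerable.
import itertools
import math
import numpy as np

rates = {  # hypothetical spike probabilities f_i(theta) for two stimuli
    "theta0": np.array([0.1, 0.2, 0.7]),
    "theta1": np.array([0.6, 0.8, 0.2]),
}
prior = {"theta0": 0.5, "theta1": 0.5}

def p_response(r, f):
    """P(r | theta) for a binary response vector r under spike probs f."""
    return float(np.prod(np.where(np.array(r) == 1, f, 1 - f)))

responses = list(itertools.product([0, 1], repeat=3))
marginal = {r: sum(prior[t] * p_response(r, f) for t, f in rates.items())
            for r in responses}

mi = 0.0
for t, f in rates.items():
    kl = sum(p_response(r, f) * math.log2(p_response(r, f) / marginal[r])
             for r in responses if p_response(r, f) > 0)
    mi += prior[t] * kl
print(f"I(Theta;R) = {mi:.4f} bits")
```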


Oryx ◽  
2021 ◽  
pp. 1-9
Author(s):  
Helen M. K. O'Neill ◽  
Sarah M. Durant ◽  
Stefanie Strebel ◽  
Rosie Woodroffe

Abstract Wildlife fences are often considered an important tool in conservation. Fences are used in attempts to prevent human–wildlife conflict and reduce poaching, despite known negative impacts on landscape connectivity and animal movement patterns. Such impacts are likely to be particularly important for wide-ranging species, such as the African wild dog Lycaon pictus, which requires large areas of continuous habitat to fulfil its resource requirements. Laikipia County in northern Kenya is an important area for wild dogs, but new wildlife fences are increasingly being built in this ecosystem. Using a long-term dataset from the area's free-ranging wild dog population, we evaluated the effect of wildlife fence structure on the ability of wild dogs to cross fences. The extent to which fences impeded wild dog movement differed between fence designs, although individuals crossed fences of all types. Purpose-built fence gaps increased passage through relatively impermeable fences. Nevertheless, low fence permeability can lead to packs, or parts of packs, becoming trapped on the wrong side of a fence, with consequences for population dynamics. Careful evaluation should be given to the necessity of erecting fences; ecological impact assessments should evaluate impacts on animal movement patterns and should be undertaken for all large-scale fencing interventions. Where fencing is unavoidable, projects should use the most permeable fencing structures possible, both in the design of the fence itself and by including as many purpose-built gaps as possible, to minimize impacts on wide-ranging wildlife.
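One simple way to quantify fence permeability from movement data (a hypothetical sketch, not the authors' analysis; the fence designs and encounter records below are invented) is the fraction of fence encounters that end in a successful crossing, tallied per fence design:

```python
# Hedged sketch: per-design fence permeability as crossings / encounters.
from collections import defaultdict

# Hypothetical encounter records: (fence_design, crossed_successfully)
encounters = [
    ("electrified", False), ("electrified", False), ("electrified", True),
    ("wire_with_gaps", True), ("wire_with_gaps", True), ("wire_with_gaps", False),
]

tallies = defaultdict(lambda: [0, 0])  # design -> [crossings, encounters]
for design, crossed in encounters:
    tallies[design][1] += 1
    tallies[design][0] += int(crossed)

for design, (crossings, total) in tallies.items():
    print(f"{design}: permeability = {crossings}/{total} = {crossings/total:.2f}")
```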


2012 ◽  
Vol 27 (28) ◽  
pp. 1250164
Author(s):  
J. MANUEL GARCÍA-ISLAS

In the three-dimensional spin foam model of quantum gravity with a cosmological constant, there exists a set of observables associated with spin network graphs. A set of probabilities is calculated from these observables, and hence the associated Shannon entropy can be defined. We present the Shannon entropy associated with these observables and derive some interesting bounding inequalities. The problem relates measurement, entropy and information theory in a simple way, which we explain.
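For reference, the standard bounds that the Shannon entropy of any such set of probabilities must satisfy (the paper's specific inequalities for spin network observables are not reproduced here):

```latex
H = -\sum_{i=1}^{N} p_i \ln p_i, \qquad 0 \le H \le \ln N,
```

with the lower bound attained by a deterministic distribution and the upper bound by the uniform one.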


2014 ◽  
Vol 11 (99) ◽  
pp. 20140542 ◽  
Author(s):  
Nathan F. Putman ◽  
Erica S. Jenkins ◽  
Catherine G. J. Michielsens ◽  
David L. G. Noakes

Animals navigate using a variety of sensory cues, but how each is weighted during different phases of movement (e.g. dispersal, foraging, homing) is controversial. Here, we examine the geomagnetic and olfactory imprinting hypotheses of natal homing with datasets that recorded variation in the migratory routes of sockeye (Oncorhynchus nerka) and pink (Oncorhynchus gorbuscha) salmon returning from the Pacific Ocean to the Fraser River, British Columbia. Drift of the magnetic field (i.e. geomagnetic imprinting) uniquely accounted for 23.2% and 44.0% of the variation in migration routes for sockeye and pink salmon, respectively. Ocean circulation (i.e. olfactory imprinting) predicted 6.1% and 0.1% of the variation in sockeye and pink migration routes, respectively. Sea surface temperature (a variable influencing salmon distribution but not navigation directly) accounted for 13.0% of the variation in sockeye migration but was unrelated to pink migration. These findings suggest that geomagnetic navigation plays an important role in long-distance homing in salmon and that consideration of navigation mechanisms can aid in the management of migratory fishes by better predicting movement patterns. Finally, given the diversity of animals that use the Earth's magnetic field for navigation, geomagnetic drift may provide a unifying explanation for spatio-temporal variation in the movement patterns of many species.
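The "uniquely accounted for" figures suggest a variance-partitioning analysis. A hedged sketch of the general idea (with simulated stand-in data, not the authors' model or dataset) computes each predictor's unique contribution as the drop in R² when it is removed from the full linear model:

```python
# Hedged sketch of unique variance contributions via R^2 drop on removal.
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 200
# Hypothetical covariates: geomagnetic drift, ocean circulation, SST.
X = rng.normal(size=(n, 3))
y = 0.8 * X[:, 0] + 0.2 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(size=n)

full = r_squared(X, y)
for j, name in enumerate(["geomagnetic drift", "ocean circulation", "SST"]):
    reduced = r_squared(np.delete(X, j, axis=1), y)
    print(f"unique R^2 of {name}: {full - reduced:.3f}")
```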


2008 ◽  
Vol 10 ◽  
pp. 47-60 ◽  
Author(s):  
ELC Shepard ◽  
RP Wilson ◽  
F Quintana ◽  
A Gómez Laich ◽  
N Liebsch ◽  
...  

2012 ◽  
Vol 82 (1) ◽  
pp. 96-106 ◽  
Author(s):  
Tal Avgar ◽  
Anna Mosser ◽  
Glen S. Brown ◽  
John M. Fryxell

1987 ◽  
Vol 19 (3) ◽  
pp. 385-394 ◽  
Author(s):  
J R Roy

In the use of information theory for the development of forecasting models, two alternative approaches can be used, based either on Shannon entropy or on Kullback information gain. In this paper, a new approach is presented which combines the usually superior statistical-inference power of the Kullback procedure with the advantage of the calibrated 'elasticity' parameters available in the Shannon approach. Situations are discussed where the combined approach is preferable to either of the two existing procedures, and the principles are illustrated with the help of a small numerical example.
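As a hedged illustration of the Kullback side of this trade-off (a toy example, not the paper's combined procedure): minimizing the information gain D(p ∥ q) relative to a prior q under a linear constraint yields an exponentially tilted solution p_i ∝ q_i e^{λx_i}, with λ set by the constraint:

```python
# Hedged sketch of Kullback information-gain minimization: find p minimizing
# D(p||q) subject to sum_i p_i x_i = m.  The minimizer is the exponential
# tilt p_i ∝ q_i * exp(lam * x_i); lam is found by root-finding.
import numpy as np
from scipy.optimize import brentq

x = np.array([1.0, 2.0, 3.0, 4.0])   # values of the constrained quantity
q = np.array([0.4, 0.3, 0.2, 0.1])   # prior distribution
m = 2.8                              # target mean, inside (min x, max x)

def tilted(lam):
    w = q * np.exp(lam * x)
    return w / w.sum()

lam = brentq(lambda l: tilted(l) @ x - m, -50, 50)
p = tilted(lam)
print("p =", np.round(p, 4), " mean =", round(float(p @ x), 4))
print("D(p||q) =", round(float(np.sum(p * np.log(p / q))), 4))
```

The Shannon-entropy approach corresponds to the special case of a uniform prior q, for which minimizing D(p ∥ q) reduces to maximizing entropy.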


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
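For reference, the discrete-case definitions developed in the chapter:

```latex
\begin{align}
  H(X)          &= -\sum_{x} p(x)\log p(x), \\
  H(X \mid Y)   &= -\sum_{x,y} p(x,y)\log p(x \mid y), \\
  D(P \,\|\, Q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}.
\end{align}
```

The convexity preliminaries support the key facts that D(P ∥ Q) ≥ 0 with equality if and only if P = Q, and that conditioning cannot increase entropy, H(X | Y) ≤ H(X).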


This chapter presents a higher-order-logic formalization of the main concepts of information theory (Cover & Thomas, 1991), such as Shannon entropy and mutual information, building on formalizations of the foundational theories of measure, Lebesgue integration, and probability. The main results of the chapter include formalizations of the Radon-Nikodym derivative and the Kullback-Leibler (KL) divergence (Coble, 2010); the latter provides a unified framework within which most of the commonly used measures of information can be defined. The chapter provides general definitions that are valid for both the discrete and continuous cases, and then proves the corresponding reduced expressions for measures that are absolutely continuous over finite spaces.
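For context, the measure-theoretic definition of KL divergence that the Radon-Nikodym formalization enables, together with the reduced expression it collapses to over a finite space when P is absolutely continuous with respect to Q:

```latex
\begin{align}
  D(P \,\|\, Q) &= \int \log\!\Big(\frac{dP}{dQ}\Big)\, dP
    && \text{(general measure-theoretic form)} \\
  D(P \,\|\, Q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}
    && \text{(finite discrete reduction)}
\end{align}
```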

