A comparison of two information theory approaches to animal communication systems

2001 ◽  
Vol 109 (5) ◽  
pp. 2429-2429
Author(s):  
Laurance R. Doyle ◽  
Jon M. Jenkins ◽  
Sean F. Hanser ◽  
Brenda McCowan

2004 ◽
Vol 213 ◽  
pp. 514-518 ◽  
Author(s):  
Sean F. Hanser ◽  
Laurance R. Doyle ◽  
Brenda McCowan ◽  
Jon M. Jenkins

Information theory, as first introduced by Claude Shannon (Shannon & Weaver 1949), quantitatively evaluates the organizational complexity of communication systems. At the same time, George Zipf was examining linguistic structure in a way that was mathematically similar to the components of the Shannon first-order entropy (Zipf 1949). Both Shannon's and Zipf's mathematical procedures have been applied to animal communication and have recently provided insightful results. The Zipf plot is a useful tool for a first estimate of a communication system's complexity, which can later be examined for structure at deeper levels using Shannon entropic analysis. In this paper we shall discuss some of the applications and pitfalls of using the Zipf distribution as a preliminary evaluator of the communication complexity of a signaling system.
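For illustration, both statistics are straightforward to compute from a sequence of categorized signals. The Python sketch below, using an invented toy sequence of signal-type labels, estimates the Zipf rank-frequency slope and the Shannon first-order entropy; it is a minimal example of the general approach, not the authors' analysis pipeline.

import math
from collections import Counter

import numpy as np

# Hypothetical sequence of signal-type labels (e.g., coded call types).
signals = list("abacabadabacabaeabacabadabacabaf")

counts = Counter(signals)
freqs = np.array(sorted(counts.values(), reverse=True), dtype=float)
probs = freqs / freqs.sum()

# Zipf plot: log(frequency) against log(rank); the fitted slope is the
# first-pass complexity statistic (close to -1 for natural languages).
ranks = np.arange(1, len(freqs) + 1)
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)

# Shannon first-order entropy in bits per signal: H = -sum p * log2(p).
entropy = -sum(p * math.log2(p) for p in probs)

print(f"Zipf slope: {slope:.2f}")
print(f"First-order entropy: {entropy:.2f} bits")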


2017 ◽  
Vol 284 (1855) ◽  
pp. 20170451 ◽  
Author(s):  
Henrik Brumm ◽  
Sue Anne Zollinger

Sophisticated vocal communication systems of birds and mammals, including human speech, are characterized by a high degree of plasticity in which signals are individually adjusted in response to changes in the environment. Here, we present, to our knowledge, the first evidence for vocal plasticity in a reptile. Like birds and mammals, tokay geckos (Gekko gecko) increased the duration of brief call notes in the presence of broadcast noise compared to quiet conditions, a behaviour that facilitates signal detection by receivers. By contrast, they did not adjust the amplitudes of their call syllables in noise (the Lombard effect), which is in line with the hypothesis that the Lombard effect has evolved independently in birds and mammals. However, the geckos used a different strategy to increase signal-to-noise ratios: instead of increasing the amplitude of a given call type when exposed to noise, the subjects produced more high-amplitude syllable types from their repertoire. Our findings demonstrate that reptile vocalizations are much more flexible than previously thought, including elaborate vocal plasticity that is also important for the complex signalling systems of birds and mammals. We suggest that signal detection constraints are one of the major forces driving the evolution of animal communication systems across different taxa.


PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e10736
Author(s):  
Kaja Wierucka ◽  
Michelle D. Henley ◽  
Hannah S. Mumby

The ability to recognize conspecifics plays a pivotal role in animal communication systems. It is especially important for establishing and maintaining associations among individuals of social, long-lived species, such as elephants. While research on female elephant sociality and communication is prevalent, until recently male elephants have been considered far less social than females, resulting in a dearth of information about their communication and recognition abilities. With new knowledge about the intricacies of male elephant social structure come questions about the communicative basis on which social bonds are established and maintained. By analyzing the acoustic parameters of social rumbles recorded over 1.5 years from wild, mature, male African savanna elephants (Loxodonta africana), we expand current knowledge about the information encoded within these vocalizations and their potential to facilitate individual recognition. We showed that social rumbles are individually distinct and stable over time and therefore provide an acoustic basis for individual recognition. Furthermore, our results revealed that different frequency parameters contribute to individual differences among these vocalizations.
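A common way to test for individual distinctiveness of the kind described above is cross-validated classification of acoustic features by caller identity. The sketch below is a generic, hypothetical illustration (simulated features, scikit-learn's linear discriminant analysis), not the authors' actual method:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows are rumbles, columns are acoustic measurements
# (e.g., fundamental frequency, formants, duration); labels are caller IDs.
rng = np.random.default_rng(0)
n_callers, calls_per_caller, n_features = 6, 20, 5
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(calls_per_caller, n_features))
               for i in range(n_callers)])
y = np.repeat(np.arange(n_callers), calls_per_caller)

# If cross-validated accuracy beats chance (1 / n_callers), the features
# carry enough caller-specific information to support individual recognition.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.2f} (chance = {1 / n_callers:.2f})")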


2015 ◽  
Vol 282 (1816) ◽  
pp. 20151574 ◽  
Author(s):  
Matthew R. Wilkins ◽  
Daizaburo Shizuka ◽  
Maxwell B. Joseph ◽  
Joanna K. Hubbard ◽  
Rebecca J. Safran

Complex signals, involving multiple components within and across modalities, are common in animal communication. However, decomposing complex signals into traits and their interactions remains a fundamental challenge for studies of phenotype evolution. We apply a novel phenotype network approach for studying complex signal evolution in the North American barn swallow (Hirundo rustica erythrogaster). We integrate model testing with correlation-based phenotype networks to infer the contributions of female mate choice and male–male competition to the evolution of barn swallow communication. Overall, the best predictors of mate choice were distinct from those for competition, while moderate functional overlap suggests males and females use some of the same traits to assess potential mates and rivals. We interpret model results in the context of a network of traits, and suggest that this approach gives researchers a more nuanced view of trait clustering patterns, informing new hypotheses about the evolution of communication systems.
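A correlation-based phenotype network of the kind mentioned here treats traits as nodes and links pairs whose correlation exceeds some threshold. The following minimal sketch (invented trait names and simulated data; the 0.3 threshold is arbitrary) shows the construction with networkx:

import numpy as np
import networkx as nx

# Hypothetical trait matrix: rows are individuals, columns are signal traits.
rng = np.random.default_rng(1)
traits = ["ventral_color", "tail_streamer", "song_rate", "song_pitch"]
data = rng.normal(size=(50, len(traits)))
data[:, 1] += 0.8 * data[:, 0]  # induce one correlated trait pair

# Nodes are traits; edges connect pairs whose |r| exceeds the threshold.
corr = np.corrcoef(data, rowvar=False)
G = nx.Graph()
G.add_nodes_from(traits)
threshold = 0.3
for i in range(len(traits)):
    for j in range(i + 1, len(traits)):
        if abs(corr[i, j]) > threshold:
            G.add_edge(traits[i], traits[j], weight=corr[i, j])

print(G.edges(data=True))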


2004 ◽  
Vol 4 (6&7) ◽  
pp. 450-459
Author(s):  
S.M. Barnett

The work of Holevo and other pioneers of quantum information theory has given us limits on the performance of communication systems. Only recently, however, have we been able to perform laboratory demonstrations approaching the ideal quantum limit. This article presents some of the known limits and bounds together with the results of our experiments based on optical polarisation.
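For context, the limit due to Holevo referred to here can be stated concisely: for a source emitting quantum states ρ_i with probabilities p_i, the information accessible to any receiver measurement is bounded by the Holevo quantity χ:

% Holevo bound on the accessible information of an ensemble {p_i, rho_i}
I_{\mathrm{acc}} \le \chi = S(\rho) - \sum_i p_i\, S(\rho_i),
\qquad \rho = \sum_i p_i \rho_i,
\qquad S(\rho) = -\operatorname{Tr}\!\left(\rho \log_2 \rho\right).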


Author(s):  
Kenneth M. Sayre

Information theory was established in 1948 by Claude Shannon as a statistical analysis of factors pertaining to the transmission of messages through communication channels. Among basic concepts defined within the theory are information (the amount of uncertainty removed by the occurrence of an event), entropy (the average amount of information represented by events at the source of a channel), and equivocation (the ‘noise’ that impedes faithful transmission of a message through a channel). Information theory has proved essential to the development of space probes, high-speed computing machinery and modern communication systems. The information studied by Shannon is sharply distinct from information in the sense of knowledge or of propositional content. It is also distinct from most uses of the term in the popular press (‘information retrieval’, ‘information processing’, ‘information highway’, and so on). While Shannon’s work has strongly influenced academic psychology and philosophy, its reception in these disciplines has been largely impressionistic. A major problem for contemporary philosophy is to relate the statistical conceptions of information theory to information in the semantic sense of knowledge and content.
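In Shannon's statistical formulation, the three concepts defined above take the following standard forms (base-2 logarithms give units of bits):

% Information: uncertainty removed by the occurrence of event x
I(x) = -\log_2 p(x)

% Entropy: average information per event at the channel's source
H(X) = -\sum_{x} p(x)\,\log_2 p(x)

% Equivocation: uncertainty about input X remaining after output Y is seen
H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log_2 p(x \mid y)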


1998 ◽  
Vol 21 (2) ◽  
pp. 282-283
Author(s):  
Michael J. Ryan ◽  
Nicole M. Kime ◽  
Gil G. Rosenthal

We consider Sussman et al.'s suggestion that auditory biases for processing low-noise relationships among pairs of acoustic variables are a preadaptation for human speech processing. Data from other animal communication systems, especially those involving sexual selection, also suggest that neural biases in the receiver system can generate strong selection on the form of communication signals.


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 438
Author(s):  
Ibrahim Alabdulmohsin

In this paper, we introduce the notion of “learning capacity” for algorithms that learn from data, which is analogous to the Shannon channel capacity for communication systems. We show how “learning capacity” bridges the gap between statistical learning theory and information theory, and we use it to derive generalization bounds for finite hypothesis spaces, differential privacy, and countable domains, among others. Moreover, we prove that under the Axiom of Choice, the existence of an empirical risk minimization (ERM) rule with vanishing learning capacity is equivalent to the hypothesis space having a finite Vapnik–Chervonenkis (VC) dimension, thus establishing an equivalence between two of the most fundamental concepts in statistical learning theory and information theory. In addition, we show how the learning capacity of an algorithm provides important qualitative results, such as the relation between generalization and algorithmic stability, information leakage, and data processing. Finally, we conclude by listing some open problems and suggesting future directions of research.
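For context on the analogy, Shannon's channel capacity and a standard uniform-convergence bound for finite hypothesis spaces (textbook results, not the paper's own definitions) read:

% Shannon channel capacity: maximal mutual information over input laws
C = \max_{p(x)} I(X; Y)

% Standard finite-hypothesis generalization bound: with probability at
% least 1 - delta over n i.i.d. samples,
\sup_{h \in \mathcal{H}} \bigl| \widehat{R}_n(h) - R(h) \bigr|
  \le \sqrt{\frac{\ln(2\lvert\mathcal{H}\rvert/\delta)}{2n}}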

