An Information-Theoretic Framework for Evaluating Edge Bundling Visualization

Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 625 ◽  
Author(s):  
Jieting Wu ◽  
Feiyu Zhu ◽  
Xin Liu ◽  
Hongfeng Yu

Edge bundling is a promising graph visualization approach that simplifies the visual result of a graph drawing. Many edge bundling methods have been developed to generate diverse graph layouts. However, it is difficult to argue for one edge bundling method and its resulting layout over others, as a clear theoretical evaluation framework is absent from the literature. In this paper, we propose an information-theoretic framework to evaluate the visual results of edge bundling techniques. We first illustrate the advantage of edge bundling visualizations for large graphs and pinpoint the ambiguity that drawing results can introduce. Second, we define and quantify, using information theory, the amount of information that an edge bundling visualization delivers from the underlying network. Third, we propose a new algorithm to evaluate the resulting layouts of edge bundling using the mutual information between a raw network dataset and its edge bundling visualization. Comparison examples between different edge bundling techniques, based on the proposed framework, are presented.
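The mutual-information comparison at the heart of such a framework can be sketched with a simple histogram-based estimator. This is illustrative only, not the authors' algorithm: the coin-flip data below is a placeholder for paired measurements of a raw network and its bundled layout.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X; Y) in bits from paired samples via a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    nz = pxy > 0                          # skip empty cells (0 log 0 = 0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
coin = rng.integers(0, 2, 10_000)
# A fair coin carries about 1 bit about itself, and about 0 bits
# about an independent coin.
print(mutual_information(coin, coin, bins=2))
```

A layout that preserves more of the dataset's structure would score a higher mutual information with it under such an estimator.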

Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 858 ◽  
Author(s):  
Dongshan He ◽  
Qingyu Cai

In this paper, we present a derivation of the black hole area entropy from the relationship between entropy and information. The curved space of a black hole allows objects to be imaged in the same way as a camera lens does. The maximal information that a black hole can gain is limited by both the Compton wavelength of the object and the diameter of the black hole. When an object falls into a black hole, its information disappears due to the no-hair theorem, and the entropy of the black hole increases correspondingly. The area entropy of a black hole can thus be obtained, which indicates that the Bekenstein–Hawking entropy is information entropy rather than thermodynamic entropy. Quantum corrections to the black hole entropy are also obtained from the limit on the Compton wavelength of the captured particles, which makes the mass of a black hole naturally quantized. Our work provides an information-theoretic perspective for understanding the nature of black hole entropy.
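The standard heuristic behind arguments of this kind (a sketch of the usual textbook reasoning, not necessarily the authors' exact steps) runs as follows:

```latex
% The lightest particle a black hole of mass M can capture has a Compton
% wavelength comparable to the Schwarzschild diameter:
\lambda_C = \frac{\hbar}{mc} \sim 2 r_s = \frac{4GM}{c^2}
\quad\Rightarrow\quad
m_{\min} \sim \frac{\hbar c}{4GM}.

% Capturing it erases of order one bit; the horizon area
% A = 16\pi G^2 M^2 / c^4 grows by a fixed quantum:
\Delta A = \frac{dA}{dM}\, m_{\min}
         = \frac{32\pi G^2 M}{c^4}\cdot\frac{\hbar c}{4GM}
         = 8\pi\,\ell_p^2,
\qquad \ell_p^2 = \frac{G\hbar}{c^3}.

% One bit (\Delta S = k_B \ln 2) per area quantum reproduces, up to an
% O(1) factor, the Bekenstein--Hawking entropy
S_{BH} = \frac{k_B A}{4\,\ell_p^2}.
```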


2017 ◽  
Vol 28 (7) ◽  
pp. 954-966 ◽  
Author(s):  
Colin Bannard ◽  
Marla Rosner ◽  
Danielle Matthews

Of all the things a person could say in a given situation, what determines what is worth saying? Greenfield’s principle of informativeness states that right from the onset of language, humans selectively comment on whatever they find unexpected. In this article, we quantify this tendency using information-theoretic measures and report on a study in which we tested the counterintuitive prediction that children will produce words that have a low frequency given the context, because these will be most informative. Using corpora of child-directed speech, we identified adjectives that varied in how informative (i.e., unexpected) they were given the noun they modified. In an initial experiment (N = 31) and in a replication (N = 13), 3-year-olds heard an experimenter use these adjectives to describe pictures. The children’s task was then to describe the pictures to another person. As the information content of the experimenter’s adjective increased, so did children’s tendency to comment on the feature that adjective had encoded. Furthermore, our analyses suggest that children balance informativeness with a competing drive to ease production.
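The informativeness of an adjective given its noun is its surprisal under the corpus distribution, computable directly from co-occurrence counts. The toy counts below are invented for illustration, not drawn from the study's corpora:

```python
import math
from collections import Counter

# Hypothetical adjective-noun counts standing in for a child-directed corpus.
pairs = [("big", "dog")] * 40 + [("purple", "dog")] * 2 + [("big", "box")] * 10

pair_counts = Counter(pairs)
noun_counts = Counter(noun for _, noun in pairs)

def surprisal(adj, noun):
    """Information content -log2 P(adjective | noun), in bits."""
    return -math.log2(pair_counts[(adj, noun)] / noun_counts[noun])

print(surprisal("big", "dog"))     # frequent pairing: low surprisal
print(surprisal("purple", "dog"))  # rare pairing: high surprisal
```

Under the principle of informativeness, "purple dog" is the kind of unexpected pairing a child would be predicted to comment on.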


In previous chapters, the authors provided a comprehensive framework that can be used in the formal probabilistic and information-theoretic analysis of a wide range of systems and protocols. In this chapter, they illustrate the usefulness of conducting this analysis using theorem proving by tackling a number of applications including a data compression application, the formal analysis of an anonymity-based MIX channel, and the properties of the onetime pad encryption system.


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 444 ◽  
Author(s):  
Stephen Fox ◽  
Adrian Kotelba

Amidst certainty, efficiency can improve sustainability by reducing resource consumption. However, flexibility is needed to survive when uncertainty increases. Accordingly, sustainable production cannot persist in the long term without both flexibility and efficiency. Referring to cognitive science to inform the development of production systems is well established. However, recent research in cognitive science encompassing flexibility and efficiency in brain functioning has not been considered previously; in particular, research by others that encompasses information (I), information entropy (H), relative entropy (D), transfer entropy (TE), and brain entropy. In this paper, by contrast, flexibility and efficiency for persistent sustainable production are analyzed in relation to these information-theoretic applications in cognitive science and are quantified in terms of information. The paper is thus consistent with the established practice of referring to cognitive science to inform the development of production systems, but it is novel in addressing the need to combine flexibility and efficiency for persistent sustainability in terms of cognitive functioning as modelled with information theory.
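Two of the quantities named above, Shannon entropy H and relative entropy D, can be sketched on invented production-routing distributions (the numbers are illustrative, not from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p || q) in bits;
    assumes q > 0 wherever p > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A rigid (efficient) plan concentrates probability on one routing;
# a flexible plan spreads it out and so has higher entropy.
rigid    = [0.97, 0.01, 0.01, 0.01]
flexible = [0.25, 0.25, 0.25, 0.25]
print(entropy(rigid), entropy(flexible))
print(relative_entropy(rigid, flexible))  # mismatch between the two, in bits
```

High H corresponds to keeping options open (flexibility); low H to committing resources to one option (efficiency).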


2020 ◽  
Vol 12 (5) ◽  
pp. 880 ◽  
Author(s):  
Ying Zhang ◽  
Jingxiong Zhang ◽  
Wenjing Yang

Quantifying information content in remote-sensing images is fundamental for information-theoretic characterization of remote sensing information processes, with the images being usually information sources. Information-theoretic methods, being complementary to conventional statistical methods, enable images and their derivatives to be described and analyzed in terms of information as defined in information theory rather than data per se. However, accurately quantifying images’ information content is nontrivial, as information redundancy due to spectral and spatial dependence needs to be properly handled. There has been little systematic research on this, hampering wide applications of information theory. This paper seeks to fill this important research niche by proposing a strategy for quantifying information content in multispectral images based on information theory, geostatistics, and image transformations, by which interband spectral dependence, intraband spatial dependence, and additive noise inherent to multispectral images are effectively dealt with. Specifically, to handle spectral dependence, independent component analysis (ICA) is performed to transform a multispectral image into one with statistically independent image bands (not spectral bands of the original image). The ICA-transformed image is further normal-transformed to facilitate computation of information content based on entropy formulas for Gaussian distributions. Normal transform facilitates straightforward incorporation of spatial dependence in entropy computation for the aforementioned double-transformed image bands with inter-pixel spatial correlation modeled via variograms. Experiments were undertaken using Landsat ETM+ and TM image subsets featuring different dominant land cover types (i.e., built-up, agricultural, and hilly). 
The experimental results confirm that the proposed methods provide more objective estimates of information content than when spectral dependence, spatial dependence, or non-normality is not properly accommodated. The difference in information content between the image subsets obtained with ETM+ and TM was found to be about 3.6 bits/pixel, indicating the former's greater information content. The proposed methods can be adapted for information-theoretic analyses of remote sensing information processes.
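The Gaussian entropy formula that the normal transform enables, and the way spatial dependence reduces per-pixel information, can be sketched as follows. The variance and correlation values are invented for illustration, with the correlation standing in for what a fitted variogram would supply:

```python
import math

def gaussian_entropy(var):
    """Differential entropy of a Gaussian, 0.5 * log2(2*pi*e*var), in bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

# Hypothetical band variance (in squared digital numbers) and
# neighbour correlation.
var, rho = 100.0, 0.9
marginal    = gaussian_entropy(var)                 # ignores spatial dependence
conditional = gaussian_entropy(var * (1 - rho**2))  # given one neighbour pixel
print(marginal, conditional)  # conditional is smaller: redundancy removed
```

Ignoring the inter-pixel correlation would overstate the image's information content by the difference between the two figures.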


Author(s):  
Cristian Mariani

In recent years, many scholars (Ladyman & Ross [39]; Floridi [25]; Bynum [9]) have been discussing the possibility of an ‘informational’ realism. The common idea behind these projects is that of taking the notion of ‘information’ as the central concept of both our scientific practice and our ontology. At the same time, many experts in Quantum Information Theory (Lloyd [40]; Vedral [53]; Chiribella, D’Ariano & Perinotti [14]) have developed the idea that it is possible to ground all our physical theories by following an information-theoretic approach. In what follows, I aim to show that it is still not at all clear what it means to be an ‘informational realist’. Consequently, I show why I believe it is misleading to talk about informational realism as something that could actually supersede the most common forms of realism, namely the standard ‘object-oriented’ form and the structural ones. Finally, I suggest that the only plausible way to define informational realism, and thus, more generally, to take a realist attitude towards Quantum Information Theory, is to assume an epistemic and moderate structural position.


2007 ◽  
Vol 15 (2) ◽  
pp. 169-198 ◽  
Author(s):  
Dong-Il Seo ◽  
Byung-Ro Moon

In optimization problems, the contribution of a variable to fitness often depends on the states of other variables. This phenomenon is referred to as epistasis or linkage. In this paper, we show that a new theory of epistasis can be established on the basis of Shannon's information theory. From this, we derive a new epistasis measure called entropic epistasis and some theoretical results. We also provide experimental results verifying the measure and showing how it can be used for designing efficient evolutionary algorithms.
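One Shannon-style way to see epistasis as information (in the spirit of, though not necessarily identical to, the paper's entropic epistasis measure): for XOR-like fitness, each variable alone carries no information about fitness, while the pair carries all of it.

```python
import math
from itertools import product
from collections import Counter

def mutual_info(samples):
    """I(X; Y) in bits from a list of (x, y) pairs."""
    n = len(samples)
    pxy = Counter(samples)
    px = Counter(x for x, _ in samples)
    py = Counter(y for _, y in samples)
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# XOR fitness over two binary variables: maximal linkage.
genotypes = list(product([0, 1], repeat=2))
fitness = {g: g[0] ^ g[1] for g in genotypes}

alone = mutual_info([(g[0], fitness[g]) for g in genotypes])
joint = mutual_info([(g, fitness[g]) for g in genotypes])
print(alone, joint)  # 0.0 bits vs 1.0 bit: the gap signals epistasis
```

The gap between the joint and individual mutual informations is one natural entropy-based signal of linkage that an evolutionary algorithm could exploit when grouping variables.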


2012 ◽  
Vol 433-440 ◽  
pp. 5073-5077 ◽  
Author(s):  
Jing Yao Wang ◽  
Meng Jia Li ◽  
Mei Song ◽  
Ying Hai Zhang

Information theory has made a great impact on the research of communication systems. However, the analysis and design of networks has not benefited much from information theory. Therefore, in this paper, we propose an information-theoretic framework for context-aware networks to explore the relationship between information and network performance. We also analyze the information traffic process in a context-aware network. To illustrate our approach, we analyze the architecture of a context-aware network via the information entropy produced in the network, and discuss ways to improve the performance of context-aware networks from an information-theoretic perspective. The results in this paper may also be used to design other networks and to guide future network design.


2020 ◽  
Vol 34 (04) ◽  
pp. 3825-3833 ◽  
Author(s):  
Sanghamitra Dutta ◽  
Praveen Venkatesh ◽  
Piotr Mardziel ◽  
Anupam Datta ◽  
Pulkit Grover

The needs of a business (e.g., hiring) may require the use of certain features that are critical in a way that any discrimination arising due to them should be exempted. In this work, we propose a novel information-theoretic decomposition of the total discrimination (in a counterfactual sense) into a non-exempt component, which quantifies the part of the discrimination that cannot be accounted for by the critical features, and an exempt component, which quantifies the remaining discrimination. Our decomposition enables selective removal of the non-exempt component if desired. We arrive at this decomposition through examples and counterexamples that enable us to first obtain a set of desirable properties that any measure of non-exempt discrimination should satisfy. We then demonstrate that our proposed quantification of non-exempt discrimination satisfies all of them. This decomposition leverages a body of work from information theory called Partial Information Decomposition (PID). We also obtain an impossibility result showing that no observational measure of non-exempt discrimination can satisfy all of the desired properties, which leads us to relax our goals and examine alternative observational measures that satisfy only some of these properties. We then perform a case study using one observational measure to show how one might train a model allowing for exemption of discrimination due to critical features.


Leonardo ◽  
2020 ◽  
Vol 53 (3) ◽  
pp. 274-280 ◽  
Author(s):  
Alan Marsden

Information Theory provoked the interest of arts researchers from its inception in the mid-twentieth century but failed to produce the expected impact, partly because the data and computing systems required were not available. With the modern availability of data from public collections and sophisticated software, there is renewed interest in Information Theory. Successful application in the analysis of music implies potential success in other art forms also. The author gives an illustrative example, applying the information-theoretic similarity measure normalized compression distance (NCD) with the aim of ranking paintings in a large collection by their conventionality.
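Normalized compression distance is straightforward to approximate with an off-the-shelf compressor. A minimal zlib-based sketch, in which the byte strings stand in for serialized paintings:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance via zlib, an approximation of the
    (uncomputable) Kolmogorov-complexity ideal: values near 0 mean similar."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy stand-ins for image data; real use would compress the paintings' pixels.
a = b"red square blue circle " * 50
b = b"red square blue circle " * 49 + b"green triangle "
c = b"completely different content, nothing shared here! " * 20
print(ncd(a, b), ncd(a, c))  # similar pair scores lower than dissimilar pair
```

Averaging a painting's NCD to the rest of a collection gives one possible conventionality score: conventional works sit close to many others, unconventional ones far from most.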

