Some Inequalities in Information Theory Using Tsallis Entropy

Author(s):  
Litegebe Wondie ◽  
Satish Kumar

We present a relation between Tsallis's entropy and the generalized Kerridge inaccuracy, which is called the generalized Shannon inequality and is a well-known generalization in information theory, and then give its application in coding theory. The objective of the paper is to establish a result on the noiseless coding theorem for the proposed mean code length in terms of the generalized information measure of order ξ.
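
For reference, the standard forms of the quantities named here are as follows (the paper's exact order-ξ parameterization may differ, so this is a sketch using common conventions): the Tsallis entropy of order \(q\) and the Kerridge inaccuracy are

\[
S_q(P) = \frac{1}{q-1}\Bigl(1 - \sum_{i=1}^{n} p_i^{\,q}\Bigr), \qquad K(P;Q) = -\sum_{i=1}^{n} p_i \log q_i,
\]

and the classical Shannon inequality asserts \(K(P;Q) \ge H(P)\), with equality iff \(Q = P\); the generalized version replaces these quantities with their order-\(\xi\) analogues.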


1978 ◽  
Vol 10 (4) ◽  
pp. 788-802 ◽  
Author(s):  
Bruce Ebanks

It is shown that every measure of expected information which has the branching property is of the form where J is a given information measure which is compositive under a regular binary operation and the Ψn are antisymmetric, bi-additive functions. In a probability space, such measures (entropies) take the form
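
For context, the branching property referred to here is commonly defined by requiring that merging two outcomes change the measure by an amount depending only on the merged pair (a sketch of the usual definition, not necessarily the paper's exact statement):

\[
I_n(p_1, p_2, p_3, \dots, p_n) - I_{n-1}(p_1 + p_2, p_3, \dots, p_n) = \varphi(p_1, p_2).
\]

Shannon entropy satisfies this with \(\varphi(p_1, p_2) = (p_1 + p_2)\, H_2\!\bigl(\tfrac{p_1}{p_1+p_2}, \tfrac{p_2}{p_1+p_2}\bigr)\), which is the familiar grouping (recursivity) axiom.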


Author(s):  
Aurelio Fernández Bariviera ◽  
María Belén Guercio ◽  
Lisana B. Martinez ◽  
Osvaldo A. Rosso

This paper analyses Libor interest rates for seven different maturities, referring to operations in British pounds, euros, Swiss francs and Japanese yen, during the period 2001–2015. The analysis is performed by means of two quantifiers derived from information theory: the permutation Shannon entropy and the permutation Fisher information measure. An anomalous behaviour in the Libor is detected in all currencies except the euro during the years 2006–2012. The stochastic switch is more severe in the one-, two- and three-month maturities. Given the special mechanism of Libor setting, we conjecture that this behaviour could have been produced by the manipulation that was uncovered by financial authorities. We argue that our methodology is pertinent as a market-oversight instrument.
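
The permutation Shannon entropy used here is the Bandt–Pompe quantifier: the Shannon entropy of the distribution of ordinal patterns in the series. A minimal sketch in Python, assuming embedding dimension d and unit time delay (the abstract does not state the authors' embedding parameters):

import math
import random
from collections import Counter

def permutation_entropy(series, d=3):
    """Normalized permutation (Bandt-Pompe) Shannon entropy.

    Slides a window of length d over the series, maps each window to
    its ordinal pattern (the permutation that sorts it), and returns
    the Shannon entropy of the pattern frequencies, normalized by
    log(d!) so the result lies in [0, 1]. Assumes len(series) >= d.
    """
    patterns = Counter()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        # Ordinal pattern: the argsort of the window's values.
        patterns[tuple(sorted(range(d), key=lambda k: window[k]))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(d))

# A noisy series scores near 1, a monotone (fully ordered) one scores 0.
print(permutation_entropy([random.random() for _ in range(1000)]))
print(permutation_entropy(list(range(1000))))

The permutation Fisher information measure is built from the same ordinal-pattern distribution, applying a discrete Fisher functional instead of Shannon entropy.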


2020 ◽  
Author(s):  
Anas Al-okaily ◽  
Abdelghani Tbakhi

Data compression is a fundamental problem in the fields of computer science, information theory, and coding theory. Compressing data reduces its size so that its storage and transmission become more efficient. Motivated by the problem of compressing DNA data, we introduce a novel encoding algorithm that works for any textual data, including DNA data. Moreover, the design of this algorithm opens a novel approach that researchers can build upon to better address the compression of DNA or textual data.
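
The abstract does not describe the encoding algorithm itself. For a sense of the baseline that any DNA-specific compressor must beat, here is a minimal sketch (not the authors' method) of the classic 2-bit-per-base packing of the alphabet {A, C, G, T}:

def pack_dna(seq: str) -> bytes:
    """Baseline 2-bit-per-base DNA packing (not the paper's algorithm).

    Maps A->00, C->01, G->10, T->11 and packs four bases per byte.
    Assumes the sequence contains only the characters A, C, G, T.
    """
    codes = {"A": 0, "C": 1, "G": 2, "T": 3}
    out = bytearray()
    byte = 0
    for i, base in enumerate(seq):
        byte = (byte << 2) | codes[base]
        if i % 4 == 3:          # four bases fill one byte
            out.append(byte)
            byte = 0
    if len(seq) % 4:            # left-pad the final partial byte with zeros
        byte <<= 2 * (4 - len(seq) % 4)
        out.append(byte)
    return bytes(out)

# 4:1 ratio versus one byte per character in plain text.
print(len(pack_dna("ACGT" * 25)))  # 100 bases -> 25 bytes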


2020 ◽  
Vol 6 (1) ◽  
pp. 114
Author(s):  
Saeid Maadani ◽  
Gholam Reza Mohtashami Borzadaran ◽  
Abdol Hamid Rezaei Roknabadi

The variance of the Shannon information content of a random variable \(X\), which is called the varentropy, is a measure of how the information content of \(X\) is scattered around its entropy; it has various applications in information theory, computer science, and statistics. In this paper, we introduce a new generalized varentropy based on the Tsallis entropy and obtain some results and bounds for it. We compare the varentropy with the Tsallis varentropy. Moreover, we study the Tsallis varentropy of order statistics, analyse this concept for residual (past) lifetime distributions, and then introduce two new classes of distributions based on them.
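
For reference, the classical varentropy of a discrete random variable \(X\) with mass function \(p\) is

\[
V(X) = \operatorname{Var}\bigl(-\log p(X)\bigr) = \sum_{x} p(x) \log^{2} p(x) - H(X)^{2}, \qquad H(X) = -\sum_{x} p(x) \log p(x).
\]

The Tsallis varentropy introduced in the paper presumably replaces the logarithm with its \(q\)-deformed analogue; the abstract does not give the exact form.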

