Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes

Author(s):  
H. D. Arora ◽  
Anjali Dhiman

In coding theory, we study properties of codes for applications in data compression, cryptography, error correction, and network coding. Codes are studied in information theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We must consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. In this paper, we introduce a mean codeword length of order α and type β for 1:1 codes and analyze the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with the fuzzy information measure is established.
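The generalized order-α, type-β fuzzy measures of the paper are not reproduced in this abstract. As a minimal baseline, the classical noiseless coding theorem for uniquely decodable codes can be checked numerically: Shannon code lengths l_i = ⌈−log₂ p_i⌉ satisfy Kraft's inequality and bound the mean codeword length by H(P) ≤ L < H(P) + 1. All function names in the sketch are illustrative, not from the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def shannon_lengths(p):
    """Codeword lengths l_i = ceil(-log2 p_i); these satisfy Kraft's inequality."""
    return [math.ceil(-math.log2(pi)) for pi in p]

def mean_codeword_length(p, lengths):
    """Mean codeword length L = sum_i p_i * l_i."""
    return sum(pi * li for pi, li in zip(p, lengths))

p = [0.5, 0.25, 0.125, 0.125]          # a dyadic distribution, so the bound is tight
H = shannon_entropy(p)
lengths = shannon_lengths(p)
L = mean_codeword_length(p, lengths)
kraft = sum(2 ** -l for l in lengths)  # Kraft sum must be <= 1

print(f"H = {H:.3f} bits, L = {L:.3f}, Kraft sum = {kraft:.3f}")
# prints: H = 1.750 bits, L = 1.750, Kraft sum = 1.000
assert kraft <= 1 and H <= L < H + 1
```

For non-dyadic distributions the same lengths give H(P) ≤ L < H(P) + 1 with strict inequality on the left.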

2020 ◽  
Author(s):  
Anas Al-okaily ◽  
Abdelghani Tbakhi

Data compression is a fundamental problem in computer science, information theory, and coding theory. Compression reduces the size of data so that its storage and transmission become more efficient. Motivated by the problem of compressing DNA data, we introduce a novel encoding algorithm that works for any textual data, including DNA data. Moreover, the design of this algorithm offers a novel approach that researchers can build upon to further improve the compression of DNA and other textual data.


2013 ◽  
Vol 25 (7) ◽  
pp. 1891-1925 ◽  
Author(s):  
Carina Curto ◽  
Vladimir Itskov ◽  
Katherine Morrison ◽  
Zachary Roth ◽  
Judy L. Walker

Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure must serve not only error correction but also reflect relationships between stimuli.
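The RF codes studied in the paper are not specified in this abstract. The following toy sketch (codewords are hypothetical) illustrates the underlying coding-theory point: redundancy alone does not yield a large minimum Hamming distance d, and it is d that fixes the number of correctable errors, t = ⌊(d − 1) / 2⌋.

```python
from itertools import combinations

def hamming(u, v):
    """Hamming distance between two equal-length binary tuples."""
    return sum(a != b for a, b in zip(u, v))

def min_distance(code):
    """Minimum pairwise Hamming distance of a code; a code with
    minimum distance d corrects t = (d - 1) // 2 errors."""
    return min(hamming(u, v) for u, v in combinations(code, 2))

# Hypothetical "receptive field"-style code: overlapping on-sets give
# redundancy, but codewords of neighboring fields differ in few positions.
rf_like = [(1,1,0,0,0), (0,1,1,0,0), (0,0,1,1,0), (0,0,0,1,1)]
# Repetition code: a classic comparison with large minimum distance.
repetition = [(0,0,0,0,0), (1,1,1,1,1)]

d_rf = min_distance(rf_like)
d_rep = min_distance(repetition)
print(d_rf, (d_rf - 1) // 2)    # prints: 2 0  (corrects no errors)
print(d_rep, (d_rep - 1) // 2)  # prints: 5 2  (corrects up to 2 errors)
```

Note how the overlapping code preserves neighborhood structure (nearby codewords encode nearby fields) at the cost of error correction, mirroring the trade-off the abstract describes.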


Author(s):  
Andrey Varlamov ◽  
Vladimir Rimshin

This paper considers the interaction between man and nature, an interaction fundamental to the existence of modern civilization. It raises the question of possible influence on nature and society with the aim of preserving human civilization, and shows that the study of this question leads toward models of the interaction between nature and man. Information about this interaction is decisive when building such models, so information theory is considered from the viewpoint of the interaction between nature and man. At present, information theory has developed mainly as a mathematical theory; the questions of the interaction of man and nature and of the existence of information in material systems have not been studied. A link between information and energy is indicated, in terms of the control of large flows of energy. To consider the interaction of man and nature, the theory of degradation is proposed. Graphs of information over the history of human development are presented, and charts of population growth are reviewed. As a forecast, a simple model based on the theory of degradation is proposed. Consideration of the behavior of these dependencies leads to the conclusion that a connection exists between energy and information as a feature of the degradation of energy, and justifies the existence of a boundary of life (including humanity) at the point of maximum information. The relationship of energy and time is shown using potential energy.


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 858
Author(s):  
Dongshan He ◽  
Qingyu Cai

In this paper, we present a derivation of the black hole area entropy from the relationship between entropy and information. The curved space of a black hole allows objects to be imaged in the same way as by camera lenses. The maximal information that a black hole can gain is limited by both the Compton wavelength of the object and the diameter of the black hole. When an object falls into a black hole, its information disappears due to the no-hair theorem, and the entropy of the black hole increases correspondingly. The area entropy of a black hole can thus be obtained, which indicates that the Bekenstein–Hawking entropy is information entropy rather than thermodynamic entropy. The quantum corrections of black hole entropy are also obtained according to the limit of the Compton wavelength of the captured particles, which makes the mass of a black hole naturally quantized. Our work provides an information-theoretic perspective for understanding the nature of black hole entropy.
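As a numeric companion to the abstract, the standard Bekenstein–Hawking area entropy S = k_B c³ A / (4 G ħ) can be evaluated for a Schwarzschild black hole (horizon radius r_s = 2GM/c², area A = 4π r_s²). The constants below are rounded SI values, so the results are order-of-magnitude figures, not precise predictions.

```python
import math

# Physical constants, rounded SI values
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s
k_B = 1.381e-23     # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg

def horizon_area(M):
    """Area of a Schwarzschild horizon: A = 4*pi*(2GM/c^2)^2."""
    r_s = 2 * G * M / c**2
    return 4 * math.pi * r_s**2

def bh_entropy(M):
    """Bekenstein-Hawking entropy S = k_B * c^3 * A / (4 * G * hbar), in J/K."""
    return k_B * c**3 * horizon_area(M) / (4 * G * hbar)

def bh_entropy_bits(M):
    """Entropy in bits, S / (k_B ln 2): the information reading of S."""
    return bh_entropy(M) / (k_B * math.log(2))

S = bh_entropy(M_sun)
print(f"S(1 M_sun) ~ {S:.2e} J/K, ~ {bh_entropy_bits(M_sun):.2e} bits")
```

For one solar mass this gives roughly 10^54 J/K, or about 10^77 bits, illustrating the enormous information capacity implied by the area law.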

