Practice and Research on Inquiry Teaching for Information Theory and Coding Theory Courses

2015
Vol 03 (02)
pp. 42-46
Author(s):
刚 刘

2020
Author(s):
Anas Al-okaily
Abdelghani Tbakhi

Data compression is a fundamental problem in the fields of computer science, information theory, and coding theory. Compressing data reduces its size so that storage and transmission become more efficient. Motivated by the problem of compressing DNA data, we introduce a novel encoding algorithm that works for any textual data, including DNA data. Moreover, the design of this algorithm opens a new approach that researchers can build on to further improve the compression of DNA and other textual data.
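The abstract does not describe the encoding scheme itself; as a point of reference only, here is a minimal baseline sketch in Python (an illustration, not the authors' algorithm) that packs the four-letter DNA alphabet into 2 bits per base, the naive bound that any DNA-specific compressor aims to beat.

# A minimal baseline sketch, NOT the authors' algorithm: the DNA alphabet
# {A, C, G, T} has four symbols, so each base fits in 2 bits (4 bases/byte).
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq: str) -> bytes:
    """Pack a DNA string into 2 bits per base."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | CODE[base]
        out.append(byte)
    # Note: a real codec would also record len(seq) so that a final
    # partial chunk can be decoded unambiguously.
    return bytes(out)

if __name__ == "__main__":
    seq = "ACGTACGTAA"
    packed = pack(seq)
    print(len(seq), "bases ->", len(packed), "bytes")  # 10 bases -> 3 bytes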


Author(s):  
Litegebe Wondie ◽  
Satish Kumar

We present a relation between Tsallis's entropy and the generalized Kerridge inaccuracy, called the generalized Shannon inequality, which is a well-known generalization in information theory, and then give its application in coding theory. The objective of the paper is to establish a noiseless coding theorem for the proposed mean code length in terms of a generalized information measure of order ξ.
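For context, the classical quantities behind such results can be stated as follows; the paper's exact order-ξ mean code length is not reproduced here, so only the standard forms of the Kerridge inaccuracy, the Shannon inequality, and the Tsallis entropy are shown (LaTeX):

\[
  H(P;Q) = -\sum_{i=1}^{n} p_i \log q_i, \qquad H(P) \le H(P;Q),
\]
with equality if and only if \(P = Q\) (the Shannon inequality), and
\[
  T_{\xi}(P) = \frac{1}{\xi - 1}\Bigl(1 - \sum_{i=1}^{n} p_i^{\xi}\Bigr),
  \qquad \xi > 0,\ \xi \neq 1,
\]
which recovers Shannon's entropy in the limit \(\xi \to 1\).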


Author(s):  
H. D. Arora
Anjali Dhiman

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is taken up in information theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We have to consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length, subject to a given constraint on the codeword lengths, has to be found. In this paper, we introduce a mean codeword length of order α and type β for 1:1 codes and analyze the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with the fuzzy information measure is established.
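The paper's order-α, type-β length is not reproduced here; as background, the sketch below (Python, standard conventions assumed) computes the optimal average length of a binary 1:1 code, where the k-th most probable symbol receives the k-th shortest nonempty binary string, of length ⌊log₂(k+1)⌋. Because 1:1 codes need not be uniquely decodable, this average can even fall below the Shannon entropy.

import math

# Background illustration, not the paper's order-(alpha, beta) length:
# nonempty binary strings in order of length are "0", "1", "00", "01", ...,
# so the k-th shortest (k = 1, 2, ...) has length floor(log2(k + 1)).

def optimal_one_to_one_length(probs):
    """Minimum average length of a binary 1:1 code (empty word excluded)."""
    ranked = sorted(probs, reverse=True)
    return sum(p * math.floor(math.log2(k + 1))
               for k, p in enumerate(ranked, start=1))

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

if __name__ == "__main__":
    probs = [0.5, 0.25, 0.125, 0.125]
    print("H(P)       =", entropy(probs))                   # 1.75 bits
    print("1:1 length =", optimal_one_to_one_length(probs))  # 1.25 bits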


2013
Vol 25 (7)
pp. 1891-1925
Author(s):
Carina Curto
Vladimir Itskov
Katherine Morrison
Zachary Roth
Judy L. Walker

Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error-correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of RF codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, RF codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure must serve not only error correction but also reflect relationships between stimuli.
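To make the RF-code setting concrete, the following toy sketch (Python; a 1D stimulus space and interval receptive fields are assumed, and this is not the authors' construction) builds codewords from overlapping receptive fields and computes the code's minimum Hamming distance. Because neighboring stimuli typically differ in a single receptive field, the minimum distance comes out as 1, illustrating why redundancy alone does not buy classical error correction.

import itertools

# Toy sketch: each stimulus x is encoded by the binary pattern of which
# receptive fields contain it; min Hamming distance governs error correction.

# Five overlapping interval receptive fields on [0, 1).
FIELDS = [(0.0, 0.3), (0.2, 0.5), (0.4, 0.7), (0.6, 0.9), (0.8, 1.0)]

def codeword(x):
    """Binary response pattern of all receptive fields to stimulus x."""
    return tuple(int(lo <= x < hi) for lo, hi in FIELDS)

def min_hamming_distance(words):
    """Smallest Hamming distance between distinct codewords."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in itertools.combinations(words, 2))

if __name__ == "__main__":
    stimuli = [i / 20 for i in range(20)]
    code = sorted(set(codeword(x) for x in stimuli))
    for w in code:
        print("".join(map(str, w)))
    print("minimum Hamming distance:", min_hamming_distance(code))  # 1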

