Cryptography, Information Theory, and Error‐Correction

2021 ◽  
Author(s):  
Aiden Bruen ◽  
Mario Forcinito ◽  
James McQuillan
Author(s):  
Per Ola Kristensson

In this chapter we explain how methods from statistical language processing serve as a foundation for the design of probabilistic text entry methods and error correction methods. We review concepts from information theory and language modelling and explain how to design a statistical decoder for text entry—a generative probabilistic model based on the token-passing paradigm. We then present five example applications of statistical language processing for text entry: correcting typing mistakes, enabling fast typing on a smartwatch, improving prediction in augmentative and alternative communication, enabling dwell-free eye-typing, and intelligently supporting error correction of probabilistic text entry. We then discuss the limitations of the models presented in this chapter and highlight the importance of establishing solution principles based on engineering science and empirical research in order to guide the design of probabilistic text entry.
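The decoding idea underlying such probabilistic text entry can be sketched with the classic noisy-channel formulation: the decoder picks the intended word w* = argmax_w P(w)·P(observed | w), combining a language model prior with an error model. The following is a minimal toy sketch, not the chapter's decoder; the vocabulary, unigram probabilities, and the per-edit error probability `err` are illustrative assumptions.

```python
from math import log

# Hypothetical unigram language model over a tiny vocabulary (assumption).
LM = {"the": 0.5, "then": 0.2, "than": 0.2, "thee": 0.1}

def edit_distance(a, b):
    """Levenshtein distance, used here as a crude typing-error model."""
    d = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, d[0] = d[0], i
        for j, cb in enumerate(b, 1):
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1,
                                   prev + (ca != cb))
    return d[len(b)]

def channel_logprob(observed, intended, err=0.1):
    """Log P(observed | intended): each edit is penalised by log(err)."""
    return edit_distance(observed, intended) * log(err)

def decode(observed):
    """Most probable intended word: argmax of prior + channel log-probs."""
    return max(LM, key=lambda w: log(LM[w]) + channel_logprob(observed, w))

print(decode("thn"))  # → "the" (the prior breaks the tie among 1-edit candidates)
```

A real decoder would use n-gram or neural language models and a learned keyboard error model, and would search over token lattices rather than a closed vocabulary, but the argmax structure is the same.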


2017 ◽  
Vol 35 (12) ◽  
pp. 1170-1178 ◽  
Author(s):  
Zitian Chen ◽  
Wenxiong Zhou ◽  
Shuo Qiao ◽  
Li Kang ◽  
Haifeng Duan ◽  
...  

Author(s):  
H. D. Arora ◽  
Anjali Dhiman

In coding theory, we study properties of codes for applications in data compression, cryptography, error correction, and network coding. Codes are studied in information theory, electrical engineering, mathematics, and computer science as a means of transmitting data reliably and efficiently. A central question is how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. In this paper, we introduce a mean codeword length of order α and type β for 1:1 codes and analyze the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with the fuzzy information measure is established.
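The classical benchmark behind such results is Shannon's noiseless coding theorem: for a uniquely decodable binary code, the mean codeword length L satisfies H(P) ≤ L < H(P) + 1, where H is the Shannon entropy, and Huffman coding attains the optimum. The sketch below illustrates only this classical bound, not the paper's order-α, type-β fuzzy measures; the probability distribution is an arbitrary example.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary (Huffman) code for probs."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)  # merge the two least-probable groups
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:             # every symbol in a merged group
            lengths[i] += 1           # gains one bit of codeword length
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]          # example distribution (assumption)
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))   # mean codeword length
H = -sum(p * log2(p) for p in probs)             # Shannon entropy
print(lengths, L)                     # → [1, 2, 3, 3] 1.9
assert H <= L < H + 1                 # noiseless coding theorem bound
```

Generalized mean lengths such as the order-α length replace the arithmetic mean above with an exponential mean and are bounded by Rényi-type (or, in this paper, fuzzy) entropies in the same spirit.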


2013 ◽  
Vol 25 (7) ◽  
pp. 1891-1925 ◽  
Author(s):  
Carina Curto ◽  
Vladimir Itskov ◽  
Katherine Morrison ◽  
Zachary Roth ◽  
Judy L. Walker

Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure must serve not only error correction but also reflect relationships between stimuli.
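The key coding-theoretic quantity here is the minimum Hamming distance of the code: a code with minimum distance d can correct ⌊(d−1)/2⌋ errors. The toy sketch below, with made-up one-dimensional receptive fields, shows why overlapping fields yield small minimum distance: codewords of neighbouring stimuli differ in only a few neurons.

```python
from itertools import combinations

def min_hamming(code):
    """Minimum Hamming distance between distinct codewords."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in combinations(code, 2))

# Toy 1-D receptive field code (an illustrative assumption):
# neuron i fires iff the stimulus x lies in the interval fields[i].
fields = [(0, 4), (2, 6), (4, 8), (6, 10)]
stimuli = [1, 3, 5, 7, 9]
rf_code = [tuple(int(lo <= x < hi) for lo, hi in fields) for x in stimuli]

print(rf_code)               # codewords for each stimulus position
print(min_hamming(rf_code))  # → 1: no single-bit error is correctable
```

Because adjacent stimuli share overlapping fields, the minimum distance is 1 despite the code's redundancy, so it corrects zero errors in the strict sense; yet Hamming distance between codewords tracks distance between stimuli, which is exactly the trade-off the abstract describes.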

