New Statistical Randomness Tests Based on Length of Runs

2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
Ali Doğanaksoy ◽  
Fatih Sulak ◽  
Muhiddin Uğuz ◽  
Okan Şeker ◽  
Ziya Akcengiz

Random sequences and random numbers constitute a necessary part of cryptography. Many cryptographic protocols depend on random values. Randomness is measured by statistical tests, and hence the security evaluation of a cryptographic algorithm depends deeply on statistical randomness tests. In this work we focus on the statistical distributions of runs of lengths one, two, and three. Using these distributions we state three new statistical randomness tests. The new tests use the χ2 distribution, and therefore exact probability values are needed. Probabilities associated with runs of lengths one, two, and three are stated. The corresponding probabilities are divided into five subintervals of equal probability. Accordingly, three new statistical tests are defined, and pseudocodes for them are given. The new tests are designed to detect deviations in the number of runs of various lengths from what is expected of a random sequence. Together with some other statistical tests, we analyse our tests' results on outputs of well-known encryption algorithms and on binary expansions of e, π, and √2. Experimental results show the performance and sensitivity of our tests.
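The primitive underlying all three tests is counting the maximal runs of a given length in a binary sequence. A minimal sketch of that counting step (the paper's subinterval boundaries and χ2 bookkeeping are omitted; `count_runs` is a hypothetical helper name, not the paper's pseudocode):

```python
def count_runs(bits, length):
    """Count maximal runs (of either symbol) of exactly `length` in a bit string."""
    run_lengths = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        run_lengths.append(j - i)  # length of the maximal run starting at i
        i = j
    return sum(1 for r in run_lengths if r == length)

# Example: "1100010110" splits into runs 11|000|1|0|11|0
counts = [count_runs("1100010110", k) for k in (1, 2, 3)]
print(counts)  # -> [3, 2, 1]
```

A full test would compute such counts over many subsequences, bin them into the five equiprobable subintervals, and apply a χ2 goodness-of-fit statistic with four degrees of freedom.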

2022 ◽  
Vol 2 (14) ◽  
pp. 55-65
Author(s):  
Hoang Dinh Linh ◽  
Do Dai Chi ◽  
Nguyen Tuan Anh ◽  
Le Thao Uyen

Abstract—Random numbers play a very important role in cryptography. More precisely, almost all cryptographic primitives base their security on random values such as keys, nonces, and salts. Therefore, the assessment of randomness by statistical tests is essential for measuring the security of cryptographic algorithms. In this paper, we focus on the randomness tests based on runs proposed so far in the literature. First, we prove in detail the expected number of gaps (or blocks) of a given length in a random sequence of a given length. Secondly, we evaluate the correlation of some runs-based tests using the Pearson coefficient method [5, 6] and the Fail-Fail ratio [7, 8]. Surprisingly, the Pearson coefficient method does not show any strong linear correlation among these runs-based tests, but the Fail-Fail ratio does. Then, we consider the sensitivity of these runs tests to some basic transformations. Finally, we propose some new runs tests based on the sensitivity results and apply them in evaluations of some random sources.
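The Pearson-coefficient comparison described here amounts to correlating the p-value sequences that two tests produce over the same set of sample sequences: a strong |r| suggests the tests measure overlapping properties. A hedged sketch of that computation (the p-values below are illustrative stand-ins, not data from the paper):

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linearly related p-values correlate to 1.0.
p_test_a = [0.10, 0.40, 0.25, 0.90, 0.55]
p_test_b = [0.20, 0.80, 0.50, 1.80, 1.10]  # exactly 2 * p_test_a, for illustration
print(round(pearson(p_test_a, p_test_b), 6))  # -> 1.0
```

The Fail-Fail ratio, by contrast, compares only the binary pass/fail outcomes of the two tests, which is why the two methods can disagree as the abstract reports.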


2020 ◽  
Vol 8 (2) ◽  
pp. 10-18
Author(s):  
Hoàng Đình Linh

Abstract—Random sequences and random numbers play a very important role in cryptography. In symmetric cryptographic primitives, the secret key is the most important component ensuring security, while cryptographic protocols and digital signature schemes also depend strongly on random values. In addition, one of the criteria for evaluating the security of cryptographic primitives such as block ciphers and hash functions is the randomness of their output. Therefore, the assessment of randomness by statistical tests is really important for measuring the security of cryptographic algorithms. In this paper, we present some research results on the randomness tests based on the length of runs proposed by A. Doğanaksoy et al. in 2015. First, we show that some probability values for the tests based on runs of lengths 1 and 2 are inaccurate, and we suggest corrections. Secondly, we state and prove the general case for runs of any length k. Finally, we built a randomness-testing tool based on run lengths and applied it in evaluations of true random sources.
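The general-case expectation can be sanity-checked numerically. For a uniform random n-bit sequence, counting occurrences of the pattern 1 0^k 1 plus the two boundary cases gives an expected (n − k + 3)/2^(k+2) gaps (runs of zeros) of exact length k. This is the standard derivation, stated here as our own sketch rather than as the paper's proof:

```python
import random

def count_zero_runs(bits, k):
    """Count maximal runs of zeros of exactly length k."""
    count, run = 0, 0
    for b in bits + [1]:          # sentinel 1 flushes a trailing zero-run
        if b == 0:
            run += 1
        else:
            if run == k:
                count += 1
            run = 0
    return count

def expected_gaps(n, k):
    """Standard expectation for gaps of exact length k in an n-bit sequence."""
    return (n - k + 3) / 2 ** (k + 2)

rng = random.Random(2015)
n, k, trials = 1024, 2, 1000
mean = sum(count_zero_runs([rng.randint(0, 1) for _ in range(n)], k)
           for _ in range(trials)) / trials
print(expected_gaps(n, k), round(mean, 2))  # expectation 64.0625; the estimate lands nearby
```

By symmetry, the same expression holds for blocks (runs of ones) of length k.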


Electronics ◽  
2020 ◽  
Vol 10 (1) ◽  
pp. 16
Author(s):  
Sehoon Lee ◽  
Myungseo Park ◽  
Jongsung Kim

With the rapid increase in computer storage capabilities, user data has become increasingly important. Although user data can be maintained by various protection techniques, its safety has been threatened by the advent of ransomware, defined as malware that encrypts user data, such as documents, photographs and videos, and demands money from victims in exchange for data recovery. Ransomware-infected files can be recovered only by obtaining the encryption key used to encrypt them. However, the encryption key is derived using a Pseudo Random Number Generator (PRNG) and is recoverable only by the attacker. For this reason, the encryption keys of malware are known to be difficult to obtain. In this paper, we analyzed Magniber v2, which has had a large impact in the Asian region. We revealed the operation process of Magniber v2, including its PRNG and file encryption algorithms. In our analysis, we found a vulnerability in the PRNG of Magniber v2 developed by the attacker. We exploited this vulnerability to successfully recover the encryption keys, verifying the result by padding verification and statistical randomness tests. To our knowledge, this is the first reported recovery of Magniber v2-infected files.
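A cheap way to vet a candidate key, in the spirit of the padding-verification step mentioned above, is to check whether the decrypted final block ends in valid padding. A generic PKCS#7 check is sketched below; whether Magniber v2 actually uses PKCS#7 padding is our assumption for illustration only:

```python
def pkcs7_ok(plaintext: bytes, block_size: int = 16) -> bool:
    """Return True iff `plaintext` ends in valid PKCS#7 padding."""
    if not plaintext or len(plaintext) % block_size:
        return False
    pad = plaintext[-1]
    if not 1 <= pad <= block_size:
        return False
    return plaintext[-pad:] == bytes([pad]) * pad

# A wrong key yields essentially random bytes, which pass this check with
# probability only about 1/256 - so padding is a quick first filter before
# running full statistical randomness tests on the decrypted data.
print(pkcs7_ok(b"secret data!" + b"\x04" * 4))  # -> True
```

Candidates surviving the padding filter would then be confirmed by checking that the decryption no longer looks statistically random.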


2021 ◽  
Vol 17 (1) ◽  
pp. 260-264
Author(s):  
Alexandru VULPE ◽  
Raluca ANDREI ◽  
Alexandru BRUMARU ◽  
Octavian FRATU

Abstract: With the development of mobile devices and the advent of smartphones, the Internet has become part of everyday life. Any category of information, about weather, flight schedules, etc., is just a click away. This availability of data has led to a continuous increase in connectivity between devices from any corner of the world. Combining device connectivity with systems automation allows the collection of information, its analysis and, implicitly, decision-making based on it. The introduction and continued expansion of devices that communicate in networks (including the Internet) have made security issues very important for devices as well as for users. One of the main methodologies that ensures data confidentiality is encryption, which protects data from unauthorized access, but at the cost of using extensive mathematical models. Due to the nature of IoT devices, the resources allocated to a device can be constrained by certain factors, some related to cost and others to the physical limitations of the device. Ensuring data confidentiality requires encryption algorithms for these interconnected devices that provide protection while maintaining the device's operation. The need for such algorithms has created the conditions for the growth and development of lightweight cryptography, which aims to find encryption systems that can be implemented on these categories of devices with limited hardware and software requirements. The paper proposes a lightweight cryptographic algorithm implemented on a microcontroller system, comparing its performance with that of an already existing (x86-based) system.
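As an illustration of what "lightweight" means in practice (the paper's own algorithm is not specified in the abstract), the Speck64/128 block cipher needs only modular addition, XOR, and rotation, operations that map directly onto microcontroller instructions. A sketch of its round structure, checked here only for an encrypt/decrypt round trip since word-ordering conventions vary between implementations:

```python
MASK = 0xFFFFFFFF  # Speck64 operates on 32-bit words

def rol(x, r): return ((x << r) | (x >> (32 - r))) & MASK
def ror(x, r): return ((x >> r) | (x << (32 - r))) & MASK

def expand_key(l0, l1, l2, k0, rounds=27):
    """Speck64/128 key schedule (27 rounds): reuses the round function on key words."""
    l, k = [l0, l1, l2], [k0]
    for i in range(rounds - 1):
        l.append((((k[i] + ror(l[i], 8)) & MASK) ^ i))
        k.append(rol(k[i], 3) ^ l[i + 3])
    return k

def encrypt(x, y, round_keys):
    for rk in round_keys:
        x = ((ror(x, 8) + y) & MASK) ^ rk
        y = rol(y, 3) ^ x
    return x, y

def decrypt(x, y, round_keys):
    for rk in reversed(round_keys):
        y = ror(y ^ x, 3)
        x = rol(((x ^ rk) - y) & MASK, 8)
    return x, y

rks = expand_key(0x03020100, 0x0B0A0908, 0x13121110, 0x1B1A1918)
ct = encrypt(0x3B726574, 0x7475432D, rks)
assert decrypt(*ct, rks) == (0x3B726574, 0x7475432D)  # round trip
```

The absence of S-box tables keeps the flash and RAM footprint small, which is precisely the trade-off lightweight designs target.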


1993 ◽  
Vol 13 (2) ◽  
pp. 1078-1092 ◽  
Author(s):  
J T Meier ◽  
S M Lewis

Antigen receptor genes acquire junctional inserts upon assembly from their component, germ line-encoded V, D, and J segments. Inserts are generally of random sequence, but a small number of V-D, D-J, or V-J junctions are exceptional. In such junctions, one or two added base pairs inversely repeat the sequence of the abutting germ line DNA. (For example, a gene segment ending AG might acquire an insert beginning with the residues CT upon joining.) It has been proposed that the nonrandom residues, termed "P nucleotides," are a consequence of an obligatory end-modification step in V(D)J recombination. P insertion in normal, unselected V(D)J joining products, however, has not been rigorously established. Here, we use an experimentally manipulable system, isolated from immune selection of any kind, to examine the fine structure of V(D)J junctions formed in wild-type lymphoid cells. Our results, according to statistical tests, show the following. (i) The frequency of P insertion is influenced by the DNA sequence of the joined ends. (ii) P inserts may be longer than two residues in length. (iii) P inserts are associated with coding ends only. Additionally, a systematic survey of published P nucleotide data shows no evidence for variation in P insertion as a function of genetic locus and ontogeny. Together, these analyses establish the generality of the P nucleotide pattern within inserts but do not fully support previous conjectures as to their origin and centrality in the joining reaction.


2021 ◽  
Vol 13 (2) ◽  
pp. 10-18
Author(s):  
Botond L. Márton ◽  
Dóra Istenes ◽  
László Bacsárdi

Random numbers are of vital importance in today's world and are used, for example, in many cryptographic protocols to secure communication over the Internet. The generators producing these numbers are Pseudo Random Number Generators (PRNGs) or True Random Number Generators (TRNGs). A subclass of TRNGs is the Quantum-based Random Number Generators (QRNGs), whose generation processes are based on quantum phenomena. However, the achievable quality of the numbers generated by a practical implementation can differ from what is theoretically possible. To ease this negative effect, post-processing can be used, which involves the use of extractors. They extract as much entropy as possible from the original source and produce a new output with better properties. The quality and the different properties of a given output can be measured with the help of statistical tests. In our work we examined the effect of different extractors on two QRNG outputs and found that with the right extractor we can improve their quality.
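The simplest extractor of the kind described is von Neumann's: read the raw bits in pairs, emit the first bit of each unequal pair, and discard equal pairs. The output is unbiased whenever the source bits are independent, at the cost of throwing most of them away. This is a generic sketch, not one of the extractors evaluated in the paper:

```python
def von_neumann(bits):
    """Debias an independent-but-biased bit stream by pairwise comparison."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)   # pair 10 -> 1, pair 01 -> 0; 00 and 11 are discarded
    return out

raw = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]   # pairs: 11, 10, 01, 00, 10
print(von_neumann(raw))  # -> [1, 0, 1]
```

For a source with bias p, unequal pairs 10 and 01 occur with equal probability p(1 − p), which is what makes each emitted bit exactly uniform.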


Entropy ◽  
2019 ◽  
Vol 21 (1) ◽  
pp. 44 ◽  
Author(s):  
Sameh Askar ◽  
Abdel Karawia ◽  
Abdulrahman Al-Khedhairi ◽  
Fatemah Al-Ammar

In the literature, many image encryption algorithms have been constructed based on different chaotic maps. Although those algorithms perform well in the cryptographic process, some developments are still needed to enhance the security level they provide. This paper introduces a new cryptographic algorithm that depends on a logistic map and a two-dimensional chaotic economic map. The robustness of the introduced algorithm is shown by implementing it on several types of images. The implementation of the algorithm and its security are partially analyzed using statistical analyses such as sensitivity to the key space, pixel correlation, entropy, and contrast analysis. The results given in this paper and the comparisons performed lead us to conclude that the introduced algorithm is characterized by a large key space, sensitivity to the secret key, low correlation coefficients, high contrast, and acceptable entropy. In addition, the experimental results show that the proposed algorithm resists statistical, differential, brute-force, and noise attacks.
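Chaos-based schemes of this kind typically iterate the logistic map x ← r·x·(1 − x) in its chaotic regime and quantize the orbit into a keystream combined with the pixel bytes. The sketch below uses illustrative parameters and a plain XOR combiner; the paper's actual map coupling and key derivation are not reproduced here:

```python
def logistic_keystream(x0, r, n, burn_in=100):
    """Generate n keystream bytes from a logistic-map orbit."""
    x = x0
    for _ in range(burn_in):            # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)  # quantize the orbit point to a byte
    return bytes(out)

def xor_cipher(data: bytes, x0: float, r: float = 3.99) -> bytes:
    ks = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

pixels = bytes(range(16))               # stand-in for image data
enc = xor_cipher(pixels, x0=0.3141592)
assert xor_cipher(enc, x0=0.3141592) == pixels  # decryption = re-encryption
```

Key sensitivity follows from the chaos: a change in the seventh decimal of x0 produces a completely different orbit after the burn-in, so decryption with a near-miss key fails.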


2020 ◽  
Vol 30 (15) ◽  
pp. 2050223
Author(s):  
Yuling Luo ◽  
Shunsheng Zhang ◽  
Junxiu Liu ◽  
Lvchen Cao

The security of a chaotic cryptographic system can be theoretically evaluated using conventional statistical tests and numerical simulations, such as the character frequency test, entropy test, avalanche test and the SP 800-22 tests. However, when the cryptographic algorithm operates on a cryptosystem, leakage information such as power dissipation, electromagnetic emission and timing can be used by attackers to recover the secret keys, namely the Side Channel Analysis (SCA) attack. In this paper, a cryptanalysis method is proposed for evaluating the security of a chaotic block cryptographic system from a hardware perspective by utilizing Template Attacks (TAs). Firstly, a chaotic block cryptographic system is described briefly and implemented on an Atmel XMEGA microcontroller. Then the TA, which uses a multivariate Gaussian model, is introduced. In order to reduce computational complexity and improve the efficiency of the TA, the Hamming weight is used in this work to model the power consumption traces. The proposed TA method has the following advantages: (a) the sum of differences is used to select the points of interest of the traces, (b) a data processing method minimizes the influence of redundant sampling points on the power-information modeling, and (c) all traces are aligned precisely before the templates are established. Experimental results show that the TA can be used to attack chaotic cryptographic systems and is more efficient, i.e. requires about 32% fewer attack traces than correlation power analysis, when the templates are properly built.
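The Hamming-weight modeling step reduces template building from one template per intermediate value (256 for a byte) to one per HW class (9): traces are grouped by the Hamming weight of the predicted intermediate, and a per-class mean (and, in the full attack, a covariance matrix for the multivariate Gaussian) is estimated. A toy sketch of the grouping; the trace values below are synthetic numbers, not measured power:

```python
def hamming_weight(v: int) -> int:
    return bin(v).count("1")

def build_templates(intermediates, traces):
    """Mean leakage per Hamming-weight class (0..8 for a byte intermediate)."""
    groups = {}
    for v, t in zip(intermediates, traces):
        groups.setdefault(hamming_weight(v), []).append(t)
    return {hw: sum(ts) / len(ts) for hw, ts in groups.items()}

# Synthetic leakage roughly equal to HW plus a small offset,
# mimicking the linear Hamming-weight power model.
inter = [0x00, 0x01, 0x03, 0x07, 0x0F, 0xFF]
traces = [0.1, 1.2, 2.1, 2.9, 4.0, 8.1]
tpl = build_templates(inter, traces)
print(tpl[0], tpl[8])  # -> 0.1 8.1
```

During the attack phase, each key hypothesis predicts a HW class for every trace, and the hypothesis whose templates best match the measurements (highest Gaussian likelihood) is retained.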


Author(s):  
L-J Li ◽  
W-K Jiang ◽  
Y-H Ai

The security evaluation of structures shocked by an underwater explosion (UNDEX) frequently plays a key role, and it is necessary to accurately predict the damage condition of a structure in an UNDEX environment. This study investigates the dynamic linear and non-linear responses and shock damage of two kinds of submerged cylindrical shell models exposed to underwater spherical trinitrotoluene (TNT) charge explosions in a circular lake. Two endplates and a middle plate are mounted on the cylindrical shells to provide support and to create two enclosed spaces. The two kinds of cylindrical shell models, with the same geometric characteristics, are unfilled and main-hull sand-filled. Fifteen different tests are carried out by varying the TNT explosive weight (1 and 2 kg), the standoff distance (from 3 m down to 0.3 m), and two explosion positions. The measured experimental results are compared with each other, and some transformed data are obtained. A detailed discussion of the experimental results shows that the dynamic responses and damage modes differ considerably, and that the main-hull sand-filled cylindrical shell is more resistant to shock wave loading than the unfilled model. Edge cracks are mainly observed at the instrument hull of the main-hull sand-filled model, whereas surface tearing and cracks are observed on both the main hull and the instrument hull of the unfilled model.

