Man-In-The-Middle Attack against Certain Authentication Protocols Revisited: Insights into the Approach and Performances Re-Evaluation

Electronics ◽  
2020 ◽  
Vol 9 (8) ◽  
pp. 1296
Author(s):  
Milica Knežević ◽  
Siniša Tomović ◽  
Miodrag J. Mihaljević

We address the class of authentication protocols known as the HB family and the man-in-the-middle (MIM) attack against them reported at the ASIACRYPT conference, called OOV-MIM (Ouafi-Overbeck-Vaudenay MIM). An analysis of the attack and its systematic experimental evaluation are given. It is shown that the main component of OOV-MIM, the algorithm for measuring the Hamming weight of noise vectors, outputs incorrect results as a consequence of the employed approximation of the probability distributions. The analysis reveals that, in practice, the only scenario in which the OOV-MIM attack is effective is the one in which two incorrect estimates produced by the Hamming-weight-measuring algorithm, when coupled, yield the correct result. This paper provides additional insights into OOV-MIM and corrects the claims about its performance and complexity, showing that the performance of the attack has been overestimated, i.e., that its complexity has been underestimated. In particular, the analysis points out the reasons for the incorrect claims and identifies the components of the attack that do not work as expected.
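The role of the noise vector's Hamming weight can be illustrated with a toy HB-style authentication round. This is a simplified sketch under assumed parameters (key length, noise rate, acceptance threshold), not the Random-HB# specification and not the actual OOV-MIM measurement algorithm:

```python
import secrets

def hb_round(x, n, eta_num=1, eta_den=8):
    """One round of a toy HB-style protocol. The verifier sends a
    random n-bit challenge a; the prover answers <a, x> XOR nu,
    where nu is a Bernoulli noise bit with rate eta_num/eta_den."""
    a = [secrets.randbelow(2) for _ in range(n)]
    nu = 1 if secrets.randbelow(eta_den) < eta_num else 0
    z = (sum(ai & xi for ai, xi in zip(a, x)) & 1) ^ nu
    return a, z, nu

# A session of r rounds: the verifier accepts when the number of noisy
# rounds -- the Hamming weight of the noise vector -- stays below a
# threshold. This weight is exactly the quantity the OOV-MIM attack
# tries to estimate from the outside.
n, r = 16, 100
x = [secrets.randbelow(2) for _ in range(n)]   # shared secret key
noise_weight = sum(hb_round(x, n)[2] for _ in range(r))
accept = noise_weight <= r // 4                # assumed threshold
```

Since the verifier only reveals accept/reject, an attacker must infer the weight statistically, which is where the approximation error analyzed in the paper enters.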

Mathematics ◽  
2021 ◽  
Vol 9 (5) ◽  
pp. 573
Author(s):  
Siniša Tomović ◽  
Milica Knežević ◽  
Miodrag J. Mihaljević

This paper reconsiders a powerful man-in-the-middle attack against Random-HB# and HB#, two prominent representatives of the HB family of authentication protocols, which are built on the Learning Parity with Noise (LPN) problem. A recent empirical report pointed out that the attack does not meet the claimed precision and complexity. Through a thorough theoretical and numerical re-evaluation of the attack, we identify the root cause of the detected problem: reasoning based on approximate probability distributions of the central attack events, which cannot provide the required precision owing to the inherent limitations of the Central Limit Theorem in this particular application. We rectify the attack by establishing the exact distributions of these events and employing adequate Bayesian reasoning, thereby overcoming the mentioned limitations. We further confirm experimentally the correctness of the rectified attack and show that, unlike the original attack, it attains the targeted accuracy and efficiency.


Author(s):  
M. JULIA FLORES ◽  
JOSE A. GÁMEZ ◽  
KRISTIAN G. OLESEN

When a Bayesian network (BN) is modified, for example by adding or deleting a node or by changing its probability distributions, a complete recompilation of the model is usually required, even when a partial (re)compilation would seem sufficient. Especially in dynamic models, where variables are added and removed very frequently, these recompilations are quite resource-consuming. Moreover, model building is often an iterative process, and total recompilation at every step implies a clear lack of flexibility. By Incremental Compilation (IC) we refer to the possibility of modifying a network and obtaining the new (and different) join tree (JT) without a complete recompilation. The main focus of this work is JT-based inference in Bayesian networks. Besides addressing the triangulation problem itself, we achieve a substantial improvement in BN compilation. We do not develop a new architecture for BN inference; rather, building on an existing JT-based framework for probability propagation, such as Hugin or Shenoy-Shafer, we design a method that can be successfully applied to obtain better performance, as the experimental evaluation shows.


Author(s):  
Djamalddine Boumezerane

Abstract In this study, we use possibility distributions as a basis for parameter uncertainty quantification in one-dimensional consolidation problems. A possibility distribution is the one-point coverage function of a random set and is viewed as encoding both partial ignorance and uncertainty. The vagueness and scarcity of the information needed to characterize the coefficient of consolidation in clay can be handled using possibility distributions, which can be constructed from existing data or by transformation of probability distributions. An attempt is made to set out a systematic approach for estimating uncertainty propagation during the consolidation process. The measure of uncertainty is based on Klir's definition (1995). We compare our results with those obtained from other approaches (probabilistic…) and discuss the importance of using possibility distributions in this type of problem.
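One standard way to obtain a possibility distribution from a probability distribution is the classical probability-possibility transformation of Dubois and Prade; a minimal sketch (the paper's own construction may differ):

```python
def prob_to_possibility(p):
    """Classical probability-possibility transformation: order the
    outcomes by decreasing probability; the possibility of each
    outcome is the total probability of outcomes ranked at or below
    it. The most probable outcome gets possibility 1, and the
    resulting distribution dominates the probability distribution."""
    order = sorted(range(len(p)), key=lambda i: p[i], reverse=True)
    pi = [0.0] * len(p)
    tail = sum(p)
    for idx in order:
        pi[idx] = tail
        tail -= p[idx]
    return pi

# e.g. probabilities [0.5, 0.3, 0.2] -> possibilities [1.0, 0.5, 0.2]
print(prob_to_possibility([0.5, 0.3, 0.2]))
```

The dominance property (possibility of every event is at least its probability) is what lets the possibility distribution represent partial ignorance on top of randomness.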


<em>Abstract.</em>—Natural resource management requires difficult decisions, broad societal costs, and sacrifices from private landowners and public agencies. With so many financial, ecological and cultural resources at stake, policy-makers, managers, and citizens need scientific predictions that can help resolve conflicts and balance the often competing needs of ecosystems and communities. Modeled information is essential for meeting this need. The words “model uncertainty” are often misinterpreted as describing a lack of knowledge about model output. In fact, they describe knowledge, not only of the one most likely modeled estimate, but also of all the other possible estimates that the model might have provided, and their likelihood. We present six case studies, from salmon habitat recovery planning, illustrating how scientists can provide more useful products by describing distributions of possible outcomes as formal probability distributions, as confidence intervals, or as descriptions of alternative scenarios. In terms of management effectiveness, the communication and use of model uncertainty can be at least as important as the quality of the original model.


1980 ◽  
Vol 102 (3) ◽  
pp. 672-678
Author(s):  
W. Downs ◽  
S. J. Vecci ◽  
J. A. Barsin ◽  
W. C. Rovesti

This paper deals with an experimental evaluation of the combustion properties of solvent-refined coal (SRC-II) fuel oil. The purpose was to identify any problems associated with handling, storing, pumping, and burning SRC fuel oil. Detailed fuel characterizations were performed and compared with petroleum distillate products. Laboratory fuel analyses and combustion tests were performed with SRC fuel oil, No. 2 fuel oil, and No. 5 fuel oil; four B&W atomizers were tested and two B&W oil burners were utilized. The laboratory analyses indicated that in most respects this SRC fuel oil sample behaved similarly to No. 2 fuel oil, and the combustion tests confirmed that expectation. The one identified problem was the relatively high concentration of fuel-bound nitrogen and, consequently, relatively high NOx emissions. It was concluded that SRC fuel oil may require the application of NOx combustion control techniques.


2018 ◽  
Vol 57 (6) ◽  
pp. 1249-1263 ◽  
Author(s):  
Domingo Muñoz-Esparza ◽  
Robert Sharman

Abstract A low-level turbulence (LLT) forecasting algorithm is proposed and implemented within the Graphical Turbulence Guidance (GTG) turbulence forecasting system. The LLT algorithm provides predictions of energy dissipation rate (EDR; turbulence dissipation to the one-third power), which is the standard turbulence metric used by the aviation community. The algorithm is based upon the use of distinct log-Weibull and lognormal probability distributions in a statistical remapping technique to represent accurately the behavior of turbulence in the atmospheric boundary layer for daytime and nighttime conditions, respectively, thus accounting for atmospheric stability. A 1-yr-long GTG LLT calibration was performed using the High-Resolution Rapid Refresh operational model, and optimum GTG ensembles of turbulence indices for clear-air and mountain-wave turbulence that minimize the mean absolute percentage error (MAPE) were determined. Evaluation of the proposed algorithm with in situ EDR data from the Boulder Atmospheric Observatory tower covering a range of altitudes up to 300 m above the surface demonstrates a reduction in the error by a factor of approximately 2.0 (MAPE = 55%) relative to the current operational GTG system (version 3). In addition, the probability of detection of typical small and large EDR values at low levels is increased by approximately 15%–20%. The improved LLT algorithm is expected to benefit several nonconventional turbulence-prediction sectors such as unmanned aerial systems and wind energy.
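The statistical-remapping idea can be sketched as a rank-to-quantile mapping onto an assumed lognormal EDR climatology. This is illustrative only: the function name, the `mu`/`sigma` constants, and the plotting-position formula below are assumptions, not GTG's actual calibration:

```python
import math
from statistics import NormalDist

def remap_to_lognormal(index_values, mu, sigma):
    """Map raw turbulence-index values to EDR by rank matching: each
    value's empirical quantile is pushed through the inverse CDF of
    a lognormal(mu, sigma) climatology. The lognormal case would
    correspond to the paper's nighttime conditions; daytime would use
    a log-Weibull target instead."""
    n = len(index_values)
    # Hazen-style plotting positions for the empirical quantiles
    quantile = {v: (i + 0.5) / n for i, v in enumerate(sorted(index_values))}
    return [math.exp(mu + sigma * NormalDist().inv_cdf(quantile[v]))
            for v in index_values]

# Toy usage with assumed calibration constants
edr = remap_to_lognormal([3.0, 1.0, 2.0, 5.0], mu=-3.0, sigma=0.8)
```

Rank matching preserves the ordering of the raw index while forcing the output to follow the target climatological distribution, which is the essence of the remapping technique.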


Water ◽  
2019 ◽  
Vol 11 (10) ◽  
pp. 2145
Author(s):  
Sokáč ◽  
Velísková ◽  
Gualtieri

Analytical solutions of the one-dimensional (1D) advection–dispersion equations, describing substance transport in streams, are often used because of their simplicity and computational speed. Practical computations, however, clearly show the limits and inaccuracies of this approach. These are especially visible in cases where the stream deforms the concentration distribution of the transported substance due to hydraulic and morphological conditions, e.g., transient storage zones (dead zones), vegetation, and irregularities in the stream hydromorphology. In this paper, a new approach to the simulation of 1D substance transport is presented, adapted, and tested on tracer experiments, both from the published research and from experiments carried out in three small streams in Slovakia with dead zones. Evaluation of the proposed methods, based on different probability distributions, confirmed that they approximate the measured concentrations significantly better than those based upon the commonly used Gaussian distribution. Finally, an example of the application of the proposed methods to an iterative (inverse) task is presented.


2016 ◽  
Vol 53 (2) ◽  
pp. 622-629 ◽  
Author(s):  
Emmanuelle Anceaume ◽  
Yann Busnel ◽  
Ernst Schulte-Geers ◽  
Bruno Sericola

Abstract In this paper we study a generalized coupon collector problem, which consists of analyzing the time needed to collect a given number of distinct coupons that are drawn from a set of coupons with an arbitrary probability distribution. We suppose that a special coupon called the null coupon can be drawn but never belongs to any collection. In this context, we prove that the almost uniform distribution, for which all the nonnull coupons have the same drawing probability, is the distribution which stochastically minimizes the time needed to collect a fixed number of distinct coupons. Moreover, we show that in a given closed subset of probability distributions, the distribution with all its entries, but one, equal to the smallest possible value is the one which stochastically maximizes the time needed to collect a fixed number of distinct coupons.
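The ordering result can be checked by simulation; a small Monte Carlo sketch with toy parameters (the coupon probabilities and collection size below are assumptions for illustration, not the paper's setting):

```python
import random

def time_to_collect(probs, k, rng):
    """Draw coupons i.i.d. from `probs` until k distinct nonnull
    coupons are held; coupon 0 plays the role of the null coupon,
    which can be drawn but never joins the collection."""
    coupons = list(range(len(probs)))
    seen, draws = set(), 0
    while len(seen) < k:
        c = rng.choices(coupons, weights=probs)[0]
        draws += 1
        if c != 0:          # the null coupon is never collected
            seen.add(c)
    return draws

# With the null coupon's mass fixed at 0.1, splitting the remaining
# mass uniformly should need stochastically fewer draws than a
# skewed split, matching the paper's minimization result.
rng = random.Random(1)
uniform = [0.1, 0.3, 0.3, 0.3]
skewed  = [0.1, 0.7, 0.1, 0.1]
avg = lambda p: sum(time_to_collect(p, 3, rng) for _ in range(2000)) / 2000
```

Averaging a few thousand trials, the uniform split completes the collection in roughly six draws on average versus well over ten for the skewed split, consistent with the stochastic ordering.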


2006 ◽  
Vol 61 (3-4) ◽  
pp. 278-284 ◽  
Author(s):  
Raimondas Mozūraitis ◽  
Vidmantas Karalius ◽ 
Vincas Būda ◽  
Anna-Karin Borg-Karlson

Gas chromatography and mass spectrometry analyses of crude sex pheromone gland extracts revealed that virgin Synanthedon tipuliformis (Clerck), currant borer (Lepidoptera: Sesiidae), females produced 6 compounds structurally related to sex pheromone components of clearwing moths. By comparison of the retention times and mass spectra of the natural products with the corresponding properties of synthetic standards, these compounds were identified as (2E,13Z)-octadeca-2,13-dien-1-yl acetate (E2,Z13-18:OAc), (3E,13Z)-octadeca-3,13-dien-1-yl acetate (E3,Z13-18:OAc), (13Z)-octadec-13-en-1-yl acetate (Z13-18:OAc), (2E,13Z)-octadeca-2,13-dien-1-ol (E2,Z13-18:OH), (13Z)-octadec-13-en-1-ol (Z13-18:OH) and octadecan-1-ol (18:OH) in the ratio 100:0.7:2.7:3.2:traces:traces. The first 3 compounds were previously known to occur in the sex pheromone gland extracts of currant borers, while the last 3 chemicals are reported here for the first time. Trapping tests carried out in a black currant field revealed that E2,Z13-18:OAc, when tested separately, attracted S. tipuliformis males, while addition of E3,Z13-18:OAc to the main component increased the effectiveness of E2,Z13-18:OAc more than seven times. The attractiveness of the 6-component lure did not differ significantly from that of the binary mixture, confirming that E2,Z13-18:OAc and E3,Z13-18:OAc in the ratio 100:0.7 are essential sex pheromone components of S. tipuliformis. Trapping tests carried out at the dwelling place of Synanthedon scoliaeformis (Borkhausen) (Lepidoptera: Sesiidae) revealed that, in addition to its intraspecific synergistic effect, E3,Z13-18:OAc increased the specificity of the pheromone signal of S. tipuliformis by acting interspecifically as an attraction antagonist against S. scoliaeformis males. In this way, it ensured the specificity of the sex attraction signal of the currant borer.
Consequently, both compounds E2,Z13-18:OAc and E3,Z13-18:OAc have to be present in pheromone formulations used for monitoring and/or control of S. tipuliformis, to avoid affecting non-target species. The other compounds identified from the sex pheromone gland of S. tipuliformis did not show any significant interspecific activity for males of S. scoliaeformis; however, they provide a basis for achieving specificity of the pheromone signal of S. tipuliformis and could act as attraction antagonists against other clearwing moth species which, like S. tipuliformis, employ E2,Z13-18:OAc as their sex pheromone component.


1989 ◽  
Vol 45 (2) ◽  
pp. 163-165 ◽  
Author(s):  
D. Velmurugan ◽  
H. A. Hauptman ◽  
S. A. Potter

Applications of the conditional probability distributions of the one-phase structure seminvariants in the monoclinic and orthorhombic systems are considered for the case when anomalous scatterers are present. Test results with error-free data show accurate estimates of the seminvariants, the accuracy varying with the complexity of the structure and with the number and strength of the anomalous scatterers.

