Pandora's Box

10.29007/l7kx ◽  
2018 ◽  
Author(s):  
Ronald Middelkoop ◽  
Cornelis Huizing ◽  
Ruurd Kuiper ◽  
Erik J. Luit

An algebraic specification is viewed as a black box that rewrites input to a "most basic" canonical form. We argue that a canonical form should be given for each specific specification, to prevent "cheating" in the implementation. Furthermore, we argue that defining the canonical form may sometimes require semantic rather than syntactic information. Relating an OO implementation to a specification requires opening the black box to some extent; we assess the choices to be made here.
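The idea of rewriting syntactically different terms to one canonical representative can be sketched with a hypothetical example (ours, not the paper's): a multiset specified by a sequence of insert operations, where insertion order is irrelevant to the value denoted.

```python
# Hypothetical illustration (not from the paper): a multiset term is a
# list of inserted values. Syntactically different terms can denote the
# same multiset, so every term is rewritten to a canonical form -- the
# insertions in ascending order -- before the black box compares values.

def canonical(term):
    """Rewrite a multiset term (a list of inserted values) to its
    canonical form: the same insertions sorted ascending."""
    return sorted(term)

def equivalent(t1, t2):
    """Two terms denote the same multiset iff their canonical forms match."""
    return canonical(t1) == canonical(t2)
```

An implementation that compared raw insertion sequences instead of canonical forms could "cheat" by distinguishing terms the specification deems equal, which is exactly what fixing a canonical form per specification rules out.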

2021 ◽  
Vol 2021 (11) ◽  
Author(s):  
Gabriele Dian ◽  
Paul Heslop

Abstract We consider amplituhedron-like geometries which are defined in a similar way to the intrinsic definition of the amplituhedron but with non-maximal winding number. We propose that for the cases with the minimal number of points the canonical form of these geometries corresponds to the product of parity-conjugate amplitudes at tree as well as loop level. The product of amplitudes in superspace lifts to a star product in bosonised superspace, of which we give a precise definition. We give an alternative definition of amplituhedron-like geometries, analogous to the original amplituhedron definition, and also a characterisation as a sum over pairs of on-shell diagrams that we use to prove the conjecture at tree level. The union of all amplituhedron-like geometries has a very simple definition given by only physical inequalities. Although such a union does not give a positive geometry, a natural extension of the standard definition of canonical form, the globally oriented canonical form, acts on this union and gives the square of the amplitude.
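As standard background (not from the paper itself), the canonical form of the simplest positive geometry, a line segment $[a,b]$, illustrates the defining property: logarithmic singularities with unit residue on each boundary component.

```latex
\Omega\big([a,b]\big)
  \;=\; \frac{\mathrm{d}x}{x-a} \;-\; \frac{\mathrm{d}x}{x-b}
  \;=\; \frac{(b-a)\,\mathrm{d}x}{(x-a)(b-x)},
```

with residue $+1$ at $x=a$ and $-1$ at $x=b$. The globally oriented canonical form discussed above extends this notion to regions, such as the union of amplituhedron-like geometries, that are not positive geometries in the standard sense.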


2019 ◽  
Vol 87 (2) ◽  
pp. 27-29
Author(s):  
Meagan Wiederman

Artificial intelligence (AI) is the ability of any device to take an input, such as that of its environment, and work to achieve a desired output. Some advancements in AI have focused on replicating the human brain in machinery. This is being made possible by the Human Connectome Project: an initiative to map all the connections between neurons within the brain. A full replication of the thinking brain would inherently create something that could be argued to be a thinking machine. However, it is more interesting to ask whether a non-biologically-faithful AI could be considered a thinking machine. Under Turing's definition of 'thinking', a machine that can be mistaken for a human when responding in writing from a "black box," where it cannot be viewed, can be said to pass for thinking. Backpropagation, an error-minimizing algorithm used to train AI for feature detection, is prevalent in AI yet has no biological counterpart. The recent success of backpropagation demonstrates that biological faithfulness is not required for deep learning or 'thought' in a machine. Backpropagation has been used in medical imaging compression algorithms and in pharmacological modelling.
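The error-minimizing idea behind backpropagation can be sketched in a few lines (an illustrative toy, not from the article): a single weight is trained by propagating the gradient of a squared-error loss back through the forward computation via the chain rule.

```python
# Minimal backpropagation sketch: one weight, squared-error loss,
# gradient obtained by the chain rule, plain gradient descent.
data = [(x, 2.0 * x) for x in range(1, 5)]   # toy target relation: y = 2x
w, lr = 0.0, 0.01                            # initial weight, learning rate

for _ in range(200):
    for x, y in data:
        y_hat = w * x                    # forward pass
        dloss_dyhat = 2 * (y_hat - y)    # d/dy_hat of (y_hat - y)^2
        dloss_dw = dloss_dyhat * x       # chain rule: gradient w.r.t. w
        w -= lr * dloss_dw               # descend along the gradient
# w converges toward 2.0, recovering the target relation
```

Nothing in this loop mirrors biology; the update is pure calculus, which is the article's point about biological faithfulness being unnecessary.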


Author(s):  
С.И. Мартыненко

A number of requirements are formulated for the numerical algorithms of black-box software intended for mathematical modeling in continuum mechanics. An analysis of the applied properties of classical multigrid methods and of the robust multigrid technique is performed in the framework of the "robustness-efficiency-parallelism" problem. It is shown that close-to-optimal complexity with the least number of problem-dependent components and high parallel efficiency can be achieved with the robust multigrid technique on globally structured grids. The application of unstructured grids requires the accurate definition of two problem-dependent components (intergrid operators) that strongly affect the complexity of the algorithm.
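The multigrid structure and the two problem-dependent intergrid operators named above can be illustrated with a toy two-grid cycle for the 1D Poisson problem -u'' = f on a uniform grid (a generic textbook sketch, not the robust multigrid technique of the article); here `restrict()` and `prolong()` play the role of the intergrid operators.

```python
# Toy two-grid cycle for -u'' = f on [0,1] with zero boundary values.
# smooth()  : weighted-Jacobi relaxation (damps high-frequency error)
# restrict(): full-weighting transfer of the residual to the coarse grid
# prolong() : linear-interpolation transfer back to the fine grid

def smooth(u, f, h, sweeps=3, w=2/3):
    n = len(u)
    for _ in range(sweeps):
        new = u[:]
        for i in range(1, n - 1):
            new[i] = (1 - w) * u[i] + w * 0.5 * (u[i-1] + u[i+1] + h*h*f[i])
        u = new
    return u

def residual(u, f, h):
    n = len(u)
    r = [0.0] * n
    for i in range(1, n - 1):
        r[i] = f[i] - (2*u[i] - u[i-1] - u[i+1]) / (h*h)
    return r

def restrict(r):
    # full weighting onto every other grid point
    return [0.0] + [0.25*r[2*i-1] + 0.5*r[2*i] + 0.25*r[2*i+1]
                    for i in range(1, (len(r) - 1) // 2)] + [0.0]

def prolong(e, n_fine):
    # linear interpolation back to the fine grid
    u = [0.0] * n_fine
    for i in range(1, len(e) - 1):
        u[2*i] = e[i]
    for i in range(1, n_fine - 1, 2):
        u[i] = 0.5 * (u[i-1] + u[i+1])
    return u

def two_grid(u, f, h):
    u = smooth(u, f, h)                    # pre-smoothing
    r2 = restrict(residual(u, f, h))       # restrict residual
    e2 = smooth([0.0] * len(r2), r2, 2*h, sweeps=50)  # coarse correction
    u = [a + b for a, b in zip(u, prolong(e2, len(u)))]
    return smooth(u, f, h)                 # post-smoothing
```

On a structured grid both transfer operators are fixed stencils, which is the "least number of problem-dependent components" point; on unstructured grids they must be constructed per problem.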


2020 ◽  
Vol 223 ◽  
pp. 02012
Author(s):  
Ekaterina Chzhan

The article deals with the problem of modeling stochastic processes under uncertainty. The peculiarity of the processes under consideration is that the researcher has no information about the mathematical structure of the object; the object is represented as a black box. The article proposes a nonparametric modeling algorithm based on a nonparametric estimate of the regression function from observations. To improve modeling accuracy, an algorithm for generating training samples is proposed; it differs from the previous modification in how the essential variables are determined. The results of computational experiments show the effectiveness of the proposed algorithms.
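A minimal sketch of the kind of nonparametric regression estimate such approaches build on is the Nadaraya-Watson estimator with a Gaussian kernel (names and bandwidth are illustrative; the article's specific estimator may differ). Only input-output observations are used, so the object remains a black box.

```python
import math

def kernel_regression(x, xs, ys, bandwidth=0.5):
    """Estimate E[y | x] as a kernel-weighted average of observed outputs."""
    weights = [math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in xs]
    total = sum(weights)
    if total == 0.0:
        return 0.0  # no observations carry weight near x
    return sum(w * yi for w, yi in zip(weights, ys)) / total
```

Because the estimate is a weighted average of nearby observations, the quality of the training sample around the query point directly drives accuracy, which motivates the sample-generation algorithm described above.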


2011 ◽  
Vol 2011 ◽  
pp. 1-15
Author(s):  
Dragomir Ž. Đoković

Base sequences BS(m,n) are quadruples (A;B;C;D) of {±1}-sequences, with A and B of length m and C and D of length n, such that the sum of their nonperiodic autocorrelation functions is a δ-function. Normal sequences NS(n) are base sequences (A;B;C;D)∈BS(n,n) such that A=B. We introduce a definition of equivalence for normal sequences NS(n) and construct a canonical form. By using this canonical form, we have enumerated the equivalence classes of NS(n) for n≤40.
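The defining property can be checked directly in code (an illustrative sketch; the quadruple used below is a simple BS(2,2), not one of the normal sequences enumerated in the article): the nonperiodic autocorrelations of A, B, C, D must sum to zero at every nonzero shift.

```python
def npaf(seq, s):
    """Nonperiodic autocorrelation of a {+1,-1}-sequence at shift s."""
    return sum(seq[i] * seq[i + s] for i in range(len(seq) - s))

def is_base_sequences(A, B, C, D):
    """Check that the four NPAFs sum to a delta-function (zero for s > 0)."""
    for s in range(1, max(len(A), len(C))):
        total = sum(npaf(X, s) for X in (A, B, C, D) if s < len(X))
        if total != 0:
            return False
    return True
```

For example, `is_base_sequences([1, 1], [1, -1], [1, 1], [1, -1])` holds, since the shift-1 autocorrelations are 1, -1, 1, -1 and cancel.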


2019 ◽  
Vol 9 (5) ◽  
pp. 1002 ◽  
Author(s):  
Yuichi Komano ◽  
Shoichi Hirose

The re-keying scheme is a variant of the symmetric encryption scheme in which a sender (respectively, receiver) encrypts (respectively, decrypts) plaintext with a temporal session key derived from a master secret key and publicly shared randomness. It is one of the system-level countermeasures against side channel attacks (SCAs): by updating the randomness (i.e., the session key) frequently, it prevents attackers from collecting enough power consumption traces for their analyses. In 2015, Dobraunig et al. proposed two re-keying schemes. The first, a scheme without beyond-birthday security, fixes the security vulnerability of the earlier re-keying scheme of Medwed et al. The second is an abstract scheme with beyond-birthday security, which, as a black box, consists of two functions: a re-keying function to generate a session key and a tweakable block cipher to encrypt plaintext. They assumed that the tweakable block cipher was ideal (namely, secure against related-key, chosen-plaintext, and chosen-ciphertext attacks) and proved that their scheme is a secure tweakable block cipher. In this paper, we revisit the re-keying scheme. Previous works did not adequately discuss security in the presence of SCAs; they simply took a re-keying scheme to be SCA-resistant whenever the temporal session key is always refreshed with fresh randomness. We point out that such a discussion is insufficient by showing a concrete attack. We then introduce the definition of an SCA-resistant re-keying scheme, which captures security against such an attack. We also give concrete schemes and discuss their security and applications.
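The two-function structure of the abstract scheme can be sketched as follows (primitive choices are ours, for illustration only: HMAC-SHA256 stands in for the re-keying function, and an HMAC-based keystream stands in for the tweakable block cipher; this toy is not the paper's construction and is not a secure cipher).

```python
import hashlib
import hmac
import os

def rekey(master_key: bytes, randomness: bytes) -> bytes:
    """Re-keying function: derive a session key from the master key
    and the publicly shared per-message randomness."""
    return hmac.new(master_key, randomness, hashlib.sha256).digest()

def encrypt(session_key: bytes, data: bytes) -> bytes:
    """Toy stream encryption under the session key (illustration only):
    XOR the data with an HMAC-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hmac.new(session_key, counter.to_bytes(8, "big"),
                           hashlib.sha256).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

master = os.urandom(32)       # long-term master secret key
r = os.urandom(16)            # publicly shared randomness, fresh per message
k = rekey(master, r)          # temporal session key
ct = encrypt(k, b"attack at dawn")
assert encrypt(k, ct) == b"attack at dawn"  # XOR stream is its own inverse
```

The SCA argument attacked in the paper lives entirely in `rekey`: refreshing `r` limits how many traces an attacker collects per session key, but, as the concrete attack shows, frequent refreshing alone is not a sufficient security definition.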


2017 ◽  
Vol 35 (3) ◽  
pp. 157-178 ◽  
Author(s):  
Francisco J. León-Medina

Building mechanisms-based, black box-free explanations is the main goal of analytical sociology. In this article, I offer some reasons to question whether some of the conceptual and methodological developments of the analytical community really serve this goal. Specifically, I argue that grounding our computer modeling practices in the current definition of mechanisms poses a serious risk of defining an ideal-typical research path that neglects the role that the understanding of the generative process must have for a black box-free explanation to be met. I propose some conceptual and methodological alternatives, and I identify some collective challenges that the analytical community should tackle in order not to deviate from its main goal.


2012 ◽  
Vol 5 (2) ◽  
pp. 121-139 ◽  
Author(s):  
Miroslav Beblavy ◽  
Emilia Sicakova-Beblava ◽  
Darina Ondrusova

Summary / Abstract Discussion of politico-administrative relations, as well as the research on agencies, generally treats the "politicisation" of agency management as a single, "black-box" concept, according to which agency managements (and other senior civil servants) are either political or not. Our paper shows that, using a strict but widely applied definition of what constitutes a political appointment, agency heads in Slovakia are overwhelmingly "political", but that the implications of politicisation vary depending on the type of politicisation. In particular, we distinguish personal nominations by the responsible minister and contrast them with party nominations based on coalition agreements. Based on a series of interviews with senior policy-makers on both sides of the politico-administrative divide, we show that the selection mechanism, incentive structure, and robustness of actual accountability mechanisms differ more between these two types of politicisation than between ministerial and formally "non-political" appointments.


2020 ◽  
Vol 309 ◽  
pp. 02008
Author(s):  
Mengqing TanLi ◽  
Yan Jiang ◽  
Xiang Wang ◽  
Rushu Peng

After putting forward a concise software-testing procedure from a software-engineering point of view, this paper proposes a definition of the fat-property of software-testing activity in product-quality-monitoring software. Based on the fat-property, the black-box testing approach is investigated in depth. In unit testing, equivalence partitioning should cover two aspects: the data-inputting type and the function-operating type; a key point of black-box testing is the design of boundary/sub-boundary test cases for the data-inputting type. In integration testing, the Sandwich mode should be applied to improve coverage. In validation testing, keynote and non-keynote functions may be tested separately to accelerate testing while assuring functional coverage. System testing based on the black box, driven by the actual usage of the software product, is very important and determines the quality level of the software product.
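The equivalence-partitioning and boundary-value ideas above can be sketched as a black-box unit test (the function under test and its partitions are hypothetical examples, not from the paper): each input partition is exercised once, plus one value on each side of every partition boundary.

```python
# Hypothetical function under test, exercised purely via its interface.
def grade(score):
    """Map a 0-100 integer score to a pass/fail label."""
    if not isinstance(score, int):
        raise TypeError("score must be an integer")   # input-type partition
    if score < 0 or score > 100:
        raise ValueError("score out of range")        # invalid-value partition
    return "pass" if score >= 60 else "fail"          # two valid partitions

# Boundary/sub-boundary cases for the data-inputting type: one value on
# each side of every partition edge (0, 60, 100).
assert grade(0) == "fail" and grade(59) == "fail"
assert grade(60) == "pass" and grade(100) == "pass"
for bad in (-1, 101):
    try:
        grade(bad)
        raise AssertionError("out-of-range score was accepted")
    except ValueError:
        pass  # expected: invalid partition is rejected
```

The same partition/boundary catalogue then scales up: integration testing combines such units (Sandwich mode), and system testing replays the partitions that actual usage exercises.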

