A Note on the Reality of Incomputable Real Numbers and Its Systemic Significance

Systems ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 44
Author(s):  
Gianfranco Minati

We discuss mathematical and physical arguments contrasting the continuous and the discrete, where limitless discretization amounts to arbitrary granularity. In this regard, we focus on Incomputable Real Numbers (IRNs), i.e., reals for which no algorithm computes them in finite time. We consider how, for measurements, the usual approach to dealing with IRNs is to approximate, avoiding the need for more detailed, unrealistic surveys. In this regard, we contrast effective computation and emergent computation. Furthermore, we consider the alternative option of taking into account the properties of the decimal part of IRNs, such as the occurrence, distribution, combinations, quasi-periodicities, and other contextual properties, e.g., topological ones. For instance, in correspondence with chaotic behaviors, quasi-periodic solutions, quasi-systems, uniqueness, and singularities, non-computability represents and corresponds to theoretically incomplete properties of the processes of complexity, such as emergence and quantum-like properties. We elaborate upon cases of equivalences and symmetries, characterizing complexity and infiniteness as corresponding to the usage of multiple non-equivalent models that are constructively and theoretically incomplete due to the non-exhaustive nature of the multiplicity of complexity. Finally, we detail alternative computational approaches, such as hypercomputation, natural computing, quantum computing, and analog and hybrid computing. The reality of IRNs is considered to represent the theoretical incompleteness of complex phenomena taking place through collapse from equivalences and symmetries. A world of precise finite values, even if approximated, is assumed to have dynamics that are zippable in analytical formulae and to be computable and symbolically representable in the way it functions.
A world of arbitrary precise infinite values with dynamics that are non-zippable in analytical formulae, non-computable, and, for instance, sub-symbolically representable, is assumed to be almost compatible with the coherence of emergence. The real world is assumed to be a continuous combination of the two—functioning and emergent—where the second dominates and is the norm, and the first is the locus of primarily epistemic extracts. Research on IRNs should focus on properties representing and corresponding to those that are detectable in real, even if extreme, phenomena, such as emergence and quantum phenomena.
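The "approximate and study the decimal part" approach can be made concrete with a small sketch (my own illustration, not from the paper). √2 is used here because, unlike a genuinely incomputable real such as Chaitin's Ω, a program can actually approximate it; the point is that digit-level properties (occurrence, distribution) are only ever examined on such finite truncations.

```python
from decimal import Decimal, getcontext
from collections import Counter

# Illustrative sketch: the decimal part of an irrational such as sqrt(2)
# can only be inspected through a finite approximation. For a truly
# incomputable real no program produces the digits at all, so properties
# like digit occurrence and distribution are studied on measured or
# approximated finite prefixes, never on the real in full.
getcontext().prec = 10_000                 # 10,000 significant digits
digits = str(Decimal(2).sqrt()).split(".")[1]

freq = Counter(digits)
for d in sorted(freq):
    print(d, freq[d] / len(digits))        # near-uniform, ~0.1 each
```

The frequencies coming out near 0.1 for each digit is an empirical observation on this prefix, not a theorem; whether √2 is normal remains open.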

2020 ◽  
Vol 2 (3) ◽  
pp. 337-342
Author(s):  
Michael Siomau

Quantum computing allows us to solve some problems much faster than existing classical algorithms. Yet, the quantum computer has been believed to be no more powerful than the most general computing model, the Turing machine. Undecidable problems, such as the halting problem, and unrecognizable inputs, such as the real numbers, are beyond the theoretical limit of the Turing machine. I suggest a model for a quantum computer which is less general than the Turing machine but may solve the halting problem for any task programmable on it. Moreover, inputs unrecognizable by the Turing machine can be recognized by the model, thus breaking the theoretical limit for a computational task. A quantum computer is thus not merely a successful implementation of the Turing machine, as is widely perceived now, but a different, less general yet more powerful model of computing, whose practical realization may require strategies different from those in use today.


Systems ◽  
2019 ◽  
Vol 7 (1) ◽  
pp. 5 ◽  
Author(s):  
Hennie Kruger ◽  
Anné Verhoef ◽  
Rika Preiser

Classical operational research (OR) is mainly concerned with the use of mathematical techniques and models used in decision-making situations. The basic assumptions of OR presuppose that the structure of the world is one of order and predictability. Although this positivistic approach produces significant results when levels of certainty and initial conditions are stable, it is limited when faced with an acknowledgement of the complex nature of the real world. This paper aims to highlight that by drawing on a general understanding of complexity theory, classical OR approaches can be enriched and broadened by adopting an epistemology based on the assumption that the underlying mechanisms governing the world are complex. It is argued that complexity theory (as interpreted by the philosopher Paul Cilliers) acknowledges the complex nature of the real world and helps to identify the characteristics of complex phenomena. By aligning OR epistemologies with the acknowledgment of complexity, new modelling methods could be developed. In addition, the implications for knowledge generating processes through boundary setting, as well as the provisional nature of such knowledge and what (ethical) responsibilities accompany the study of complex phenomena, will be discussed. Examples are presented to highlight the epistemological implications of complexity thinking for OR.


Author(s):  
Andrei Khrennikov

The recent claim of Google to have brought forth a breakthrough in quantum computing represents a major impetus to further analyze the foundations for any claims of superiority regarding quantum algorithms. This note attempts to present a conceptual step in this direction. I start with a critical analysis of what is commonly referred to as entanglement and quantum nonlocality and whether or not these concepts may be the basis of quantum superiority. Bell-type experiments are then interpreted as statistical tests of Bohr’s principle of complementarity (PCOM), which is, thus, given a foothold within the area of quantum informatics and computation. PCOM implies (by its connection to probability) that probabilistic algorithms may proceed without the knowledge of joint probability distributions (jpds). The computation of jpds is exponentially time consuming. Consequently, classical probabilistic algorithms, involving the computation of jpds for n random variables, can be outperformed by quantum algorithms (for large values of n). Quantum probability theory (QPT) modifies the classical formula for the total probability (FTP). Inference based on the quantum version of FTP leads to a constructive interference that increases the probability of some events and reduces that of others. The physical realization of this probabilistic advantage is based on the discreteness of quantum phenomena (as opposed to the continuity of classical phenomena).
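The FTP modification mentioned here can be written out explicitly; the following is the standard interference form found in Khrennikov's earlier work on quantum probability, stated for a dichotomous variable A (not copied from this note):

```latex
% Classical formula of total probability (FTP) for a dichotomous
% variable A with values \alpha_1, \alpha_2:
P(B=\beta) = \sum_{i=1}^{2} P(A=\alpha_i)\,P(B=\beta \mid A=\alpha_i)

% Quantum-modified FTP: an additional interference term appears,
% with phase \theta_\beta determined by complementarity:
P(B=\beta) = \sum_{i=1}^{2} P(A=\alpha_i)\,P(B=\beta \mid A=\alpha_i)
 + 2\cos\theta_\beta
   \sqrt{\prod_{i=1}^{2} P(A=\alpha_i)\,P(B=\beta \mid A=\alpha_i)}
```

When cos θ_β > 0 the interference is constructive and raises P(B = β); when it is negative, the probability is lowered, which is the probabilistic (dis)advantage the abstract describes.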


2020 ◽  
pp. 659-678
Author(s):  
Andrei George Florea ◽  
Cătălin Buiu

In order to use membrane computing models for real-life applications, there is a real need for software that can read a model from some form of input media and afterwards execute it according to the execution rules specified in the definition of the model. Another requirement of this software application is for it to be capable of interfacing the computing model with the real world. This chapter discusses how this problem has been solved over the years by various researchers around the world. After presenting notable examples from the literature, the discussion continues with a detailed presentation of three membrane computing simulators that have been developed by the authors at the Laboratory of Natural Computing and Robotics at the Politehnica University of Bucharest, Romania.


1993 ◽  
Vol 5 (2) ◽  
pp. 129-149 ◽  
Author(s):  
Gene R. Stoner ◽  
Thomas D. Albright

The problem of processing visual motion is underconstrained—many possible real world motions are compatible with any given dynamic retinal image. Recent psychophysical and neurophysiological experiments have shown that the primate visual system's normally veridical interpretation of moving patterns is attained through utilization of image segmentation cues unrelated to motion per se. These findings challenge notions of modularity in which it is assumed that the processing of specific scene properties, such as motion, can be studied in isolation from other visual processes. We discuss the implications of these findings with regard to both experimental and computational approaches to the study of visual motion.


Mathematics ◽  
2021 ◽  
Vol 9 (5) ◽  
pp. 523
Author(s):  
Krzysztof Piasecki ◽  
Anna Łyczkowska-Hanćkowiak

A formal model of an imprecise number can be given as, inter alia, a fuzzy number or an oriented fuzzy number. Are they formally equivalent models? Our main goal is to seek formal differences between fuzzy numbers and oriented fuzzy numbers. For this purpose, we examine algebraic structures composed of numerical spaces equipped with addition, dot multiplication, and subtraction determined in the usual way. We show that these structures are not isomorphic. This proves that oriented fuzzy numbers and fuzzy numbers are not equivalent models of an imprecise number. This is the first original study of the dissimilarity between oriented fuzzy numbers and fuzzy numbers. Therefore, theorems on fuzzy numbers cannot automatically be extended to the case of oriented fuzzy numbers. In the second part of the article, we study the purposefulness of replacing fuzzy numbers with oriented fuzzy numbers. We show that for portfolio analysis, oriented fuzzy numbers are more useful than fuzzy numbers. Therefore, we conclude that oriented fuzzy numbers are an original and useful tool for modelling real-world problems.
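One concrete symptom of the non-isomorphism can be sketched in code (my own toy illustration; the representations and names are assumptions, not the paper's constructions): under Zadeh-extension arithmetic a fuzzy number minus itself is not crisp zero, whereas for oriented (ordered) fuzzy numbers, whose arithmetic acts pointwise on a pair of branch functions, it is.

```python
# Triangular fuzzy number as (a, b, c): support [a, c], peak at b.
def tfn_sub(x, y):
    # Zadeh-extension subtraction: the support widens.
    ax, bx, cx = x
    ay, by, cy = y
    return (ax - cy, bx - by, cx - ay)

A = (1.0, 2.0, 3.0)
print(tfn_sub(A, A))            # (-2.0, 0.0, 2.0): not crisp zero

# Oriented fuzzy number as a pair of branch functions f, g: [0,1] -> R;
# arithmetic acts pointwise on the branches.
def ofn_sub(x, y):
    f1, g1 = x
    f2, g2 = y
    return (lambda s: f1(s) - f2(s), lambda s: g1(s) - g2(s))

B = (lambda s: 1 + s, lambda s: 3 - s)   # oriented counterpart of A
f, g = ofn_sub(B, B)
print(f(0.5), g(0.5))           # 0.0 0.0: x - x is crisp zero
```

The two subtractions obeying different laws (x − x ≠ 0 in one structure, x − x = 0 in the other) is exactly the kind of algebraic difference that rules out an isomorphism.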


Author(s):  
Leandro Nunes de Castro ◽  
Fernando J. Von Zuben

Biologically inspired computing is just one of the branches of natural computing, which also encompasses artificial life, fractal geometry and computing with natural means (molecular, membrane and quantum computing). This chapter provides a brief and general overview of natural computing, focusing on bio-inspired algorithms. Some relevant literature is cited for guidance purposes and the main objective and scope of the book is described.


Author(s):  
Suchada Pongprasert ◽  
Kanyarat Chaengsisai ◽  
Wuttichai Kaewleamthong ◽  
Puttarawadee Sriphrom

Polynomials can be used to represent real-world situations, and their roots have real-world meanings when they are real numbers. The fundamental theorem of algebra tells us that every nonconstant polynomial p with complex coefficients has a complex root. However, no analogous result guarantees that a real root exists for p if we restrict the coefficients to be real. Let n ≥ 1 and let P_n be the vector space of all polynomials of degree n or less with real coefficients. In this article, we give explicit forms of polynomials in P_n such that all of their roots are real. Furthermore, we present explicit forms of linear transformations on P_n which preserve real roots of polynomials in a certain subset of P_n.
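As a hedged illustration (not the paper's explicit forms), the most familiar family of polynomials in P_n with all roots real is the set of products of n real linear factors; the sketch below expands such a product into coefficients and checks the roots.

```python
def mul_linear(coeffs, r):
    # Multiply p(x), given as coefficients [c_0, ..., c_k] (low to high
    # degree), by the real linear factor (x - r).
    out = [0.0] * (len(coeffs) + 1)
    for k, c in enumerate(coeffs):
        out[k + 1] += c        # c * x^(k+1)
        out[k] -= r * c        # -r * c * x^k
    return out

def poly_from_roots(roots):
    # p(x) = (x - r_1)(x - r_2)...(x - r_n): all roots real by design.
    coeffs = [1.0]
    for r in roots:
        coeffs = mul_linear(coeffs, r)
    return coeffs

def evaluate(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

p = poly_from_roots([-2.0, 0.5, 3.0])
print(p)                                   # [3.0, -5.5, -1.5, 1.0]
print([evaluate(p, r) for r in [-2.0, 0.5, 3.0]])   # all 0.0
```

Any linear map that permutes or rescales the roots r_i clearly preserves membership in this family, which is the flavor of root-preserving transformation the abstract refers to.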


2022 ◽  
Vol 2022 ◽  
pp. 1-17
Author(s):  
Tayyabah Hasan ◽  
Fahad Ahmad ◽  
Muhammad Rizwan ◽  
Nasser Alshammari ◽  
Saad Awadh Alanazi ◽  
...  

Fog computing (FC) based sensor networks have emerged as a propitious archetype for next-generation wireless communication technology, with caching, communication, and storage capacity services at the edge. Mobile edge computing (MEC) is a new era of digital communication and has a rising demand for intelligent devices and applications. It faces performance deterioration and quality of service (QoS) degradation problems, especially in Internet of Things (IoT) based scenarios. Therefore, existing caching strategies need to be enhanced to augment the cache hit ratio and manage the limited storage to accelerate content deliveries. Alternatively, quantum computing (QC) appears to hold promise for more or less every typical computing problem. The proposed framework is basically a merger of a deep learning (DL) agent deployed at the network edge with a quantum memory module (QMM). Firstly, the DL agent prioritizes caching contents via the self-organizing maps (SOMs) algorithm, and secondly, the prioritized contents are stored in the QMM using a Two-Level Spin Quantum Phenomenon (TLSQP). After selecting the most appropriate lattice map (32 × 32) in 750,000 iterations using SOMs, the data points below the dark-blue region are mapped onto the data frame to get the videos. These videos are considered high priority for trending according to the input parameters provided in the dataset. Similarly, the light-blue region is also mapped to get medium-prioritized content. After the SOMs algorithm's training, the topographic error (TE) and quantization error (QE) values (QE = 0.0000235) identified the most appropriate map after 750,000 iterations. In addition, the power of QC is due to the inherent quantum parallelism (QP) associated with the superposition and entanglement principles. A quantum computer with n qubits can store and operate on 2^n possible combinations of qubit states simultaneously, reducing the utilization of resources compared to conventional computing.
The analysis indicates that the cache hit ratio can be improved by ranking the content, removing redundant and least-important content, storing the high- and medium-priority content efficiently using QP, and delivering precise results. The experiments for content prioritization are conducted using Google Colab, and IBM's Quantum Experience is used to simulate the quantum phenomena.
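The 2^n claim above can be illustrated with a toy state-vector sketch (my own, stdlib-only; it is not the paper's QMM/TLSQP design): an n-qubit register is described by 2^n complex amplitudes, so a single register holds all 2^n basis combinations in superposition.

```python
def uniform_superposition(n):
    # State vector of an n-qubit register after a Hadamard on every
    # qubit: 2**n equal amplitudes, one per classical bit combination.
    dim = 2 ** n
    amp = 1.0 / dim ** 0.5
    return [amp] * dim

state = uniform_superposition(4)
print(len(state))                             # 16 amplitudes for 4 qubits
print(round(sum(a * a for a in state), 6))    # probabilities sum to 1.0
```

Doubling the register from n to n + 1 qubits doubles the amplitude count, which is the exponential capacity the abstract contrasts with conventional storage.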


2012 ◽  
Vol 2 (2) ◽  
pp. 23-44
Author(s):  
Raymond Chiong ◽  
Ferrante Neri ◽  
R. I. McKay

Nature has always been a source of inspiration. Over the last few decades, it has stimulated many successful techniques, algorithms and computational applications for dealing with large, complex and dynamic real-world problems. In this article, the authors discuss why nature-inspired solutions have become increasingly important and favourable for tackling conventionally hard problems. They also present the concepts and background of some selected examples from the domain of natural computing, and describe their key applications in business, science and engineering. Finally, future trends are highlighted to provide a vision for the potential growth of this field.

