Three universal representations of recursively enumerable sets

1978 ◽  
Vol 43 (2) ◽  
pp. 335-351 ◽  
Author(s):  
James P. Jones

In his celebrated paper of 1931 [7], Kurt Gödel proved the existence of sentences undecidable in the axiomatized theory of numbers. Gödel's proof is constructive, and such a sentence may actually be written out. Of course, if we follow Gödel's original procedure the formula will be of enormous length.

Forty-five years have passed since the appearance of Gödel's pioneering work. During this time enormous progress has been made in mathematical logic and recursive function theory. Many different mathematical problems have been proved recursively unsolvable. Theoretically, each such result is capable of producing an explicit undecidable number-theoretic predicate; we have only to carry out a suitable arithmetization. Until recently, however, techniques were not available for carrying out these arithmetizations with sufficient efficiency.

In this article we construct an explicit undecidable arithmetical formula, F(x, n), in prenex normal form. The formula is explicit in the sense that it is written out in its entirety, with no abbreviations. The formula is undecidable in the recursive sense that there exists no algorithm to decide, for given values of n, whether F(n, n) is true or false. Moreover, F(n, n) is undecidable in the formal (axiomatic) sense of Gödel [7]. Given any of the usual axiomatic theories to which Gödel's Incompleteness Theorem applies, there exists a value of n such that F(n, n) is unprovable and irrefutable. Thus Gödel's Incompleteness Theorem can be "focused" into the formula F(n, n): some substitution instance of F(n, n) is undecidable in Peano arithmetic, ZF set theory, etc.

2016 ◽  
Vol 100 (549) ◽  
pp. 442-449
Author(s):  
A. C. Paseau

Metamathematics is the mathematical study of mathematics itself. Two of its most famous theorems were proved by Kurt Gödel in 1931. In a simplified form, Gödel's first incompleteness theorem states that no reasonable mathematical system can prove all the truths of mathematics. Gödel's second incompleteness theorem (also simplified) in turn states that no reasonable mathematical system can prove its own consistency. Another famous undecidability theorem is that the Continuum Hypothesis is neither provable nor refutable in standard set theory. Many of us logicians were first attracted to the field as students because we had heard something of these results. All research mathematicians know something of them too, and have at least a rough sense of why ‘we can't prove everything we want to prove’.


2016 ◽  
Vol 2 (3) ◽  
Author(s):  
Alexander Kharazishvili

The set of arithmetic truths is neither recursive, nor recursively enumerable. Mathematician Alexander Kharazishvili explores how powerful the celebrated diagonal method is for general and descriptive set theory, recursion theory, and Gödel’s incompleteness theorem.


1967 ◽  
Vol 45 (1) ◽  
pp. 119-126 ◽  
Author(s):  
J. Basinski ◽  
R. Olivier

Hall effect and resistivity measurements have been made in the temperature range 4.2–360 °K on several samples of n-type GaAs grown under oxygen atmosphere and without any other intentional doping. The principal shallow donor in this material is considered to be Si. All samples exhibited impurity-band conduction at low temperature. Electron concentrations in the conduction band were calculated using a two-band model, and then fitted to the usual equation expressing charge neutrality. A value of 2.3 × 10⁻³ eV was obtained for the ionization energy of the donors, for donor concentrations ranging from 5 × 10¹⁵ cm⁻³ to 2 × 10¹⁶ cm⁻³. The conduction in the impurity band was of the hopping type for these concentrations. A value of 3.5 × 10¹⁶ cm⁻³ was obtained for the critical concentration for the transition from impurity-band conduction to the metallic type.
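The fitting step described above can be illustrated with a minimal sketch. This is not the authors' actual analysis: it solves the textbook shallow-donor charge-neutrality equation n(n + N_A)/(N_D − N_A − n) = (N_c/g) exp(−E_D/kT) for the conduction-band electron concentration n by bisection, with an assumed GaAs effective mass and illustrative donor/acceptor concentrations.

```python
# Hedged sketch of a shallow-donor charge-neutrality solve for n-type GaAs.
# All sample values (N_D, N_A, m* = 0.067 m_e) are illustrative assumptions,
# not the concentrations measured in the paper.
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K


def n_c(T, m_eff=0.067):
    """Effective conduction-band density of states (cm^-3),
    assuming the standard GaAs effective mass m* = 0.067 m_e."""
    return 2.51e19 * (m_eff * T / 300.0) ** 1.5


def electron_concentration(T, N_D, N_A, E_D=2.3e-3, g=2):
    """Solve n (n + N_A) = C (N_D - N_A - n) for n by bisection,
    where C = (N_c / g) exp(-E_D / kT).  Units: cm^-3, eV, K."""
    C = n_c(T) / g * math.exp(-E_D / (K_B * T))
    f = lambda n: n * (n + N_A) - C * (N_D - N_A - n)
    lo, hi = 0.0, N_D - N_A  # f(lo) < 0 <= f(hi): root is bracketed
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)


# Near room temperature the donors are almost fully ionized,
# so n approaches N_D - N_A; at 4.2 K the carriers freeze out.
n_room = electron_concentration(300.0, 1e16, 1e15)
n_cold = electron_concentration(4.2, 1e16, 1e15)
```

Fitting the measured n(T) curve to this relation over many temperatures is what yields the donor ionization energy E_D, here taken as the paper's value of 2.3 × 10⁻³ eV.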


2015 ◽  
Vol 3 (1) ◽  
pp. 45-73 ◽  
Author(s):  
Teun Zuiderent-Jerak ◽  
Stans Van Egmond

Valuation studies addresses how values are made in valuation practices. A next (or rather prior) question then arises: what makes valuation practices? Two oppositional replies are starting to dominate how that question can be answered: a more materially oriented focus on devices of valuation and a more sociologically inclined focus on ineffable valuation cultures. The debate between proponents of the two approaches may easily turn into the kind of leapfrog debate that has dominated many previous discussions of whether culture or materiality plays the decisive role in driving history. This paper explores a less repetitive reply. It does so by analyzing the puzzling case of the demise of solidarity as a core value within the recent Dutch health care system of regulated competition. While "solidarity among the insured" was both a strong cultural value within the Dutch welfare-based health system and a value that health economists built into market devices, within a fairly short time "fairness" became of lesser importance than "competition". This leads us to call for a more historical, relational, and dynamic understanding of the role of economists, of market devices, and of culture in valuation studies.


2005 ◽  
Vol 11 (2) ◽  
pp. 207-224 ◽  
Author(s):  
Donald A. Martin

Kurt Gödel is almost as famous—one might say "notorious"—for his extreme platonist views as he is famous for his mathematical theorems. Moreover his platonism is not a myth; it is well documented in his writings. Here are two platonist declarations about set theory, the first from his paper about Bertrand Russell and the second from the revised version of his paper on the Continuum Hypothesis.

"Classes and concepts may, however, also be conceived as real objects, namely classes as 'pluralities of things' or as structures consisting of a plurality of things, and concepts as the properties and relations of things existing independently of our definitions and constructions. It seems to me that the assumption of such objects is quite as legitimate as the assumption of physical bodies and there is quite as much reason to believe in their existence."

"But, despite their remoteness from sense experience, we do have something like a perception also of the objects of set theory, as is seen from the fact that the axioms force themselves upon us as being true. I don't see any reason why we should have less confidence in this kind of perception, i.e., in mathematical intuition, than in sense perception."

The first statement is a platonist declaration of a fairly standard sort concerning set theory. What is unusual in it is the inclusion of concepts among the objects of mathematics. This I will explain below. The second statement expresses what looks like a rather wild thesis.


2011 ◽  
pp. 1338-1349
Author(s):  
Anne-Marie Croteau ◽  
Anne Beaudry ◽  
Justin Holm

As per the Census Bureau of the Department of Commerce, the estimate of U.S. retail e-commerce sales for the first quarter of 2009 was $31.7 billion. For the same period, e-commerce accounted for 3.5 percent of total sales, with a value of $30.2 billion in sales. As electronic business (e-business) has become essential in our economy, organizations have begun to demand a return on their investment in such endeavors (Damanpour and Damanpour, 2001). More recently, research indicates that web-based technologies enhance performance when environmental pressures are high, the technical capabilities within the organization are well integrated, and the management team strongly supports and sees value in e-business initiatives (Sanders, 2007). An extensive and diverse body of literature has been produced regarding e-business. One research angle that has been neglected over the years is the definition and assessment of an e-business strategy (e-strategy). Some efforts were made to evaluate e-strategy through an electronic simulation (Ha and Forgianne, 2006). Another recent study observed that human, technological, and business capabilities and e-business implementation influence business performance at various levels (Coltman, Devinney, and Midgley, 2007). However, neither study developed an e-strategy construct empirically tested with managers.


Author(s):  
H. A. F. Chaves

Characteristic analysis is well known in mineral resources appraisal and has proved useful for petroleum exploration. It also can be used to integrate geological data in sedimentary basin analysis and hydrocarbon assessment, considering geological relationships and uncertainties that result from lack of basic geological knowledge. A generalization of characteristic analysis, using fuzzy-set theory and fuzzy logic, may prove better for quantification of geologic analogues and also for description of reservoir and sedimentary facies. Characteristic analysis is a discrete multivariate procedure for combining and interpreting data; Botbol (1971) originally proposed its application to geology, geochemistry, and geophysics. It has been applied mainly in the search for poorly exposed or concealed mineral deposits by exploring joint occurrences or absences of mineralogical, lithological, and structural attributes (McCammon et al., 1981). It forms part of a systematic approach to resource appraisal and integration of generalized and specific geological knowledge (Chaves, 1988, 1989; Chaves and Lewis, 1989). The technique usually requires some form of discrete sampling to be applicable—generally a spatial discretization of maps into cells or regular grids (Melo, 1988). Characteristic analysis attempts to determine the joint occurrences of various attributes that are favorable for, related to, or indicative of the occurrence of the desired phenomenon or target. In geological applications, the target usually is an economic accumulation of energy or mineral resources.
Applying characteristic analysis requires the following steps: 1) the studied area is sampled using a regular square or rectangular grid of cells; 2) in each cell the favorabilities of the variables are expressed in binary or ternary form; 3) a model is chosen that indicates the cells that include the target (Sinding-Larsen et al., 1979); and 4) a combined favorability map of the area is produced that points out possible new targets. The favorability of individual variables is expressed either in binary form—assigning a value of +1 to favorable and a value of 0 to unfavorable or unevaluated variables—or in ternary form if the two states represented by 0 are distinguishable: the value +1 again means favorable, the value −1 means unfavorable, and the value 0 means unevaluated.
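The cell-coding scheme above can be sketched in a few lines. This is a deliberate simplification, not Botbol's actual combination procedure (which weights attributes via their joint-occurrence matrix): it simply averages the ternary codes of the evaluated attributes in each cell to produce a crude combined favorability score. The grid values and function name are illustrative assumptions.

```python
# Illustrative sketch of ternary favorability coding per grid cell:
# +1 = favorable, -1 = unfavorable, 0 = unevaluated.
# The plain average used here is a simplification for exposition only.
from typing import List


def combined_favorability(cells: List[List[int]]) -> List[float]:
    """For each cell, average the ternary codes of its attributes,
    ignoring unevaluated (0) entries; return 0.0 for a cell with
    no evaluated attributes."""
    scores = []
    for attrs in cells:
        evaluated = [a for a in attrs if a != 0]
        scores.append(sum(evaluated) / len(evaluated) if evaluated else 0.0)
    return scores


# Three cells, each with three coded attributes
# (e.g. mineralogical, lithological, structural):
grid = [
    [+1, +1, 0],   # mostly favorable -> positive score
    [-1, +1, -1],  # mixed, leaning unfavorable -> negative score
    [0, 0, 0],     # entirely unevaluated -> 0.0
]
scores = combined_favorability(grid)
```

Mapping the resulting per-cell scores back onto the sampling grid is what yields the combined favorability map of step 4.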


Author(s):  
Michael Detlefsen

In the first, geometric stage of Hilbert’s formalism, his view was that a system of axioms does not express truths particular to a given subject matter but rather expresses a network of logical relations that can (and, ideally, will) be common to other subject matters. The formalism of Hilbert’s arithmetical period extended this view by emptying even the logical terms of contentual meaning. They were treated purely as ideal elements whose purpose was to secure a simple and perspicuous logic for arithmetical reasoning – specifically, a logic preserving the classical patterns of logical inference. Hilbert believed, however, that the use of ideal elements should not lead to inconsistencies. He thus undertook to prove the consistency of ideal arithmetic with its contentual or finitary counterpart and to do so by purely finitary means. In this, ‘Hilbert’s programme’, Hilbert and his followers were unsuccessful. Work published by Kurt Gödel in 1931 suggested that such failure was perhaps inevitable. In his second incompleteness theorem, Gödel showed that for any consistent formal axiomatic system T strong enough to formalize what was traditionally regarded as finitary reasoning, it is possible to define a sentence that expresses the consistency of T, and is not provable in T. From this it has generally been concluded that the consistency of even the ideal arithmetic of the natural numbers is not finitarily provable and that Hilbert’s programme must therefore fail. Despite problematic elements in this reasoning, post-Gödelian work on Hilbert’s programme has generally accepted it and attempted to minimize its effects by proposing various modifications of Hilbert’s programme. 
These have generally taken one of three forms: attempts to extend Hilbert’s finitism to stronger constructivist bases capable of proving more than is provable by strictly finitary means; attempts to show that for a significant family of ideal systems there are ways of ‘reducing’ their consistency problems to those of theories possessing more elementary (if not altogether finitary) justifications; and attempts by the so-called ‘reverse mathematics’ school to show that the traditionally identified ideal theories do not need to be as strong as they are in order to serve their mathematical purposes. They can therefore be reduced to weaker theories whose consistency problems are more amenable to constructivist (indeed, finitist) treatment.


1971 ◽  
Vol 2 (1) ◽  
pp. 45-46
Author(s):  
R. Van Der Borght

The effective Rayleigh number in the solar convection zone soon reaches a value of the order of 10⁶ and, although considerable progress has been made in the numerical integration of the basic system of differential equations at high Rayleigh number, it is of interest to investigate more fully the application of asymptotic methods to such a problem.
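For reference (this is the standard Boussinesq definition, not a formula taken from the paper itself), the Rayleigh number for a convecting layer of depth d is

```latex
\mathrm{Ra} = \frac{g\,\alpha\,\Delta T\,d^{3}}{\nu\,\kappa},
```

where g is the gravitational acceleration, α the thermal expansion coefficient, ΔT the superadiabatic temperature difference across the layer, ν the kinematic viscosity, and κ the thermal diffusivity. Values of order 10⁶ and above place the system well beyond the onset of convection, which is why asymptotic (large-Ra) methods become attractive.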


2016 ◽  
Vol 42 (1) ◽  
pp. 3-21 ◽  
Author(s):  
Frederick Harry Pitts

This article critiques post-operaist conceptualisations of immaterial labour from the perspective of Marxian value-form theory. Critiquing the idea of the ‘crisis of measurability’ created by immaterial labour and the contention that this makes redundant the law of value, it contests the novelty, immediate abstractness and immeasurable productivity post-operaists attribute to contemporary labour using the New Reading of Marx. The first part explores this theoretical conflict, asserting that post-operaismo refutes Marx’s value theory only insofar as it holds a productivist understanding of value to begin with. The second reflects upon the political implications through a consideration of the post-operaist advocacy of a universal basic income. Appeals to reward, recompense and redistribution rest upon the veracity of the claims made in the post-operaist treatment of labour, value and their immateriality and immeasurability. A value-form analysis exposes flaws in the assumptions about value and labour that support their case for a universal basic income.

