A formulation of the simple theory of types

1940 ◽  
Vol 5 (2) ◽  
pp. 56-68 ◽  
Author(s):  
Alonzo Church

The purpose of the present paper is to give a formulation of the simple theory of types which incorporates certain features of the calculus of λ-conversion. A complete incorporation of the calculus of λ-conversion into the theory of types is impossible if we require that λx and juxtaposition shall retain their respective meanings as an abstraction operator and as denoting the application of function to argument. But the present partial incorporation has certain advantages from the point of view of type theory and is offered as being of interest on this basis (whatever may be thought of the finally satisfactory character of the theory of types as a foundation for logic and mathematics).

For features of the formulation which are not immediately connected with the incorporation of λ-conversion, we are heavily indebted to Whitehead and Russell, Hilbert and Ackermann, Hilbert and Bernays, and to forerunners of these, as the reader familiar with the works in question will recognize.

The class of type symbols is described by the rules that ι and o are each type symbols and that if α and β are type symbols then (αβ) is a type symbol: it is the least class of symbols which contains the symbols ι and o and is closed under the operation of forming the symbol (αβ) from the symbols α and β.

2002 ◽  
Vol 8 (2) ◽  
pp. 185-245 ◽  
Author(s):  
Fairouz Kamareddine ◽  
Twan Laan ◽  
Rob Nederpelt

In this article, we study the prehistory of type theory up to 1910 and its development between Russell and Whitehead's Principia Mathematica ([71], 1910–1912) and Church's simply typed λ-calculus of 1940. We first argue that the concept of types has always been present in mathematics, though before the end of the 19th century nobody incorporated them explicitly as such. Then we describe how the logical paradoxes entered the formal systems of Frege, Cantor and Peano, concentrating on Frege's Grundgesetze der Arithmetik, to which Russell applied his famous paradox; this led him to introduce the first theory of types, the Ramified Type Theory (RTT). We present RTT formally using the modern notation for type theory, and we discuss how Ramsey, Hilbert and Ackermann removed the orders from RTT, leading to the simple theory of types STT. We present STT and Church's own simply typed λ-calculus (λ→C) and finish by comparing RTT, STT and λ→C.


Author(s):  
Lev D. Lamberov ◽  

In recent decades, some epistemological issues have become especially acute in mathematics. These issues are associated with long proofs of various important mathematical results, as well as with a large and constantly increasing number of publications in mathematics. It is assumed that these difficulties can, at least partially, be resolved by referring to computer proofs. However, computer proofs also turn out to be problematic from an epistemological point of view. With regard both to proofs in ordinary (informal) mathematics and to computer proofs, the problem of their surveyability appears to be fundamental. On the traditional concept of proof, a proof must be surveyable; otherwise it will not achieve its main goal, the formation of conviction in the correctness of the mathematical result being proved. About 15 years ago, a new approach to the foundations of mathematics began to develop, combining constructivist and structuralist features with a number of advantages of the classical approach to mathematics. This approach is built on the basis of homotopy type theory and is called the univalent foundations of mathematics. Due to its powerful notion of equality, this approach can significantly reduce the length of formalized proofs, which outlines a way to resolve the epistemological difficulties that have arisen.


2018 ◽  
Vol 14 (1) ◽  
pp. 7540-7559
Author(s):  
Miłosława Sokół

Virtually every biological model utilising a random number generator is a Markov stochastic process. Numerical simulations of such processes are performed using stochastic or intensity matrices or kernels. Biologists, however, define stochastic processes in a slightly different way to how mathematicians typically do. A discrete-time discrete-value stochastic process may be defined by a function p : X0 × X → {f : Υ → [0, 1]}, where X is a set of states, X0 is a bounded subset of X, and Υ is a subset of the integers (here associated with discrete time), where the function p satisfies 0 ≤ p(x, y)(t) ≤ 1 and ∑y∈X p(x, y)(t) = 1. This definition generalizes a stochastic matrix. Although X0 is bounded, X may include every possible state and is often infinite. By interrupting the process whenever the state transitions into the set X − X0, Markov stochastic processes defined this way may have non-square stochastic matrices. A similar principle applies to the intensity matrices and the stochastic and intensity kernels that result from considering many biological models as Markov stochastic processes. The class of such processes has important properties when considered from the point of view of theoretical mathematics. In particular, every process from this class may be simulated (hence they all exist in a physical sense) and has a well-defined probability space associated with it.
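The definition above can be sketched in code. This is a minimal illustration under our own assumptions (a toy two-state system with a hypothetical absorbing state "dead"), not a model from the paper: the transition table has rows indexed by X0 and columns by X, so it is non-square, and the simulation is interrupted as soon as the state leaves X0.

```python
import random

X0 = ["a", "b"]            # bounded set of source states
X = ["a", "b", "dead"]     # full state set; "dead" lies in X − X0

def p(x, y):
    # p(x, y) is a function of time t; time-homogeneous here for simplicity.
    # Each row (fixed x in X0) sums to 1 over y in X.
    table = {
        ("a", "a"): 0.5, ("a", "b"): 0.4, ("a", "dead"): 0.1,
        ("b", "a"): 0.3, ("b", "b"): 0.6, ("b", "dead"): 0.1,
    }
    return lambda t: table[(x, y)]

def simulate(x, steps, rng):
    """Simulate the process, interrupting once the state leaves X0."""
    path = [x]
    for t in range(steps):
        if x not in X0:    # transitioned into X − X0: interrupt
            break
        r = rng.random()
        for y in X:        # sample the next state from row p(x, ·)(t)
            r -= p(x, y)(t)
            if r <= 0:
                x = y
                break
        path.append(x)
    return path

print(simulate("a", 10, random.Random(0)))
```

Because rows range over X0 but columns over X, the "matrix" here is 2 × 3, which is exactly the non-square situation the abstract describes.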


1976 ◽  
Vol 41 (4) ◽  
pp. 747-760 ◽  
Author(s):  
Alonzo Church

In this paper we treat the ramified type theory of Russell [6], afterwards adopted by Whitehead and Russell in Principia mathematica [12], so that we may compare Russell's resolution of the semantical antinomies by ramified type theory with the now widely accepted resolution of them by the method of Tarski in [7], [8], [9].

To avoid impredicativity the essential restriction is that quantification over any domain (type) must not be allowed to add new members to the domain, as it is held that adding new members changes the meaning of quantification over the domain in such a way that a vicious circle results. As Whitehead and Russell point out, there is no one particular form of the doctrine of types that is indispensable to accomplishing this restriction, and they have themselves offered two different versions of the ramified hierarchy in the first edition of Principia (see Preface, p. vii). The version in §§58–59 of the writer's [1], which will be followed in this paper, is still slightly different.

To distinguish Russellian types, or types in the sense of the ramified hierarchy, from types in the sense of the simple theory of types, let us call the former r-types.

There is an r-type i to which the individual variables belong. If β1, β2, …, βm are any given r-types, m ≧ 0, there is an r-type (β1, β2, …, βm)/n to which there belong m-ary functional variables of level n, n ≧ 1. The r-type (α1, α2, …, αm)/k is said to be directly lower than the r-type (β1, β2, …, βm)/n if α1 = β1, α2 = β2, …, αm = βm, k < n.
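The r-type grammar and the "directly lower" relation can be sketched as a small datatype. This is our own illustrative rendering, not notation from the paper; the names `I`, `RType`, and `directly_lower` are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class I:
    """The r-type i of individual variables."""

@dataclass(frozen=True)
class RType:
    """The r-type (β1, …, βm)/n of m-ary functional variables of level n."""
    args: tuple   # (β1, …, βm), m ≥ 0
    level: int    # n ≥ 1

def directly_lower(a, b):
    """(α1, …, αm)/k is directly lower than (β1, …, βm)/n
    iff the argument r-types agree pairwise and k < n."""
    return (isinstance(a, RType) and isinstance(b, RType)
            and a.args == b.args and a.level < b.level)

# A unary propositional function of individuals at levels 1 and 2:
f1 = RType((I(),), 1)   # (i)/1
f2 = RType((I(),), 2)   # (i)/2
print(directly_lower(f1, f2))  # → True
print(directly_lower(f2, f1))  # → False
```

Note that the relation compares only the level once the argument r-types match, which is why (i)/1 is directly lower than (i)/2 but not than (i, i)/2.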


Author(s):  
Petri Mäenpää

This work proposes a new method of deriving programs from their specifications in constructive type theory: the method of analysis-synthesis. It is new as a mathematical method only in the area of programming methodology, as it is modelled upon the most successful and widespread method in the history of exact sciences. The method of analysis-synthesis, also known as the method of analysis, was devised by Ancient Greek mathematicians for solving geometric construction problems with ruler and compass. Its most important subsequent elaboration is Descartes’s algebraic method of analysis, which pervades all exact sciences today. The present work expands this method further into one that aims at systematizing program derivation in a heuristically useful way, analogously to the way Descartes’s method systematized the solution of geometric and arithmetical problems. To illustrate the method, we derive the Boyer-Moore algorithm for finding an element that has a majority of occurrences in a given list. It turns out that solving programming problems need not be too different from solving mathematical problems in general. This point of view has been emphasized in particular by Martin-Löf (1982) and Dijkstra (1986). The idea of a logic of problem solving originates in Kolmogorov (1932). We aim to refine the analogy between programming and mathematical problem solving by investigating the mathematical method of analysis in the context of programming. The central idea of the analytic method, in modern terms, is to analyze the functional dependencies between the constituents of a geometric configuration. The aim is to determine how the sought constituents depend on the given ones. A Greek analysis starts by drawing a diagram with the sought constructions drawn on the given ones, in the relation required by the problem specification. Then the sought constituents of the configuration are determined in terms of the given ones. 
Analysis was the Greeks’ method of discovering solutions to problems. Their method of justification was synthesis, which cast analysis into standard deductive form. First it constructed the sought objects from the given ones, and then demonstrated that they relate as required to the given ones. In his Geometry, Descartes developed Greek geometric analysis-synthesis into the modern algebraic method of analysis.
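The Boyer-Moore majority algorithm mentioned above can be stated compactly; the following is a standard sketch (not the derivation carried out in the paper). One pass maintains a candidate and a counter; a second, verifying pass plays the role the abstract assigns to synthesis, confirming that the candidate found by analysis really has a majority of occurrences.

```python
def majority(xs):
    """Return the element occurring more than len(xs) // 2 times, or None."""
    candidate, count = None, 0
    for x in xs:
        if count == 0:
            candidate, count = x, 1
        elif x == candidate:
            count += 1
        else:
            count -= 1
    # Verification pass: the vote alone does not guarantee a majority.
    if candidate is not None and xs.count(candidate) > len(xs) // 2:
        return candidate
    return None

print(majority([1, 2, 1, 1, 3, 1]))  # → 1 (four of six occurrences)
print(majority([1, 2, 3]))           # → None (no majority element)
```

The first pass uses O(1) extra space because any non-majority element can only cancel, never outvote, a true majority candidate.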


Author(s):  
Wei Shen ◽  
Benjamin Rouben

From the educational point of view, there are many textbooks on reactor physics used at various universities around the world. However, most of these textbooks focus either on application to Light Water Reactors (LWRs) or on theory and mathematics, with a significant number of equations and computational schemes. Or else they were written more than 20, or even more than 60, years ago, and therefore do not reflect the evolution of reactor concepts and engineering requirements since then. All these categories of books are either difficult to follow for non-physicists working in the nuclear industry, or of little value for those who are interested in the special features of CANDU reactor physics.


1994 ◽  
Vol 41 (5) ◽  
pp. 238-239
Author(s):  
Rosalie S. Nichols ◽  
V. Ray Kurtz

Local, regional, state and provincial, and national contests in mathematics certainly generate enthusiasm for mathematics. Thousands of students are experiencing the excitement of comparing their mathematical skills with those of other students their age. Even though the reasons for the current popularity of the contest movement are uncertain, this interest does appear to be contributing to the promotion of mathematics. The movement is even contributing to the growth of local, state, and provincial mathematics education organizations.


1962 ◽  
Vol 55 (6) ◽  
pp. 476-478
Author(s):  
Lillian Marlin

The modern world, with its emphasis on science and mathematics, has given birth to new concepts in these fields. The dire need for trained mathematicians and an informed public resulted in the recognition of the need to incorporate into the study of mathematics new material and to present the old in a new light. The School Mathematics Study Group material is a prime mover in this field; it offers the new topics and language of elementary and high school mathematics, using modern approaches to learning.


1999 ◽  
Vol 8 (2) ◽  
pp. 123-137 ◽  
Author(s):  
Rakesh Popli

In this article, three different descriptions of curricula for scientific literacy (SL) are summarized, compared, and critically reviewed from the point of view of their suitability for all citizens. Science for All Americans, a publication of the American Association for the Advancement of Science, envisages giving every citizen a thorough exposure to the world of science, technology, and mathematics; the report Towards Scientific Literacy, published by the International Institute for Adult Literacy Methods, recommends a phenomenological approach to science designed to make science useful for people in their daily lives; and a similar curriculum, Minimum Science for Everybody, published by a voluntary organization in India, provides a detailed alternative conceptual framework for SL in which community traditions and knowledge systems are interfaced with science. The three reports are seen to differ from one another not only in respect of the contents of the curricula recommended, but also in their approaches, and the world views underlying these different approaches are brought out. It is suggested that SL curricula in both “developed” and “developing” countries be reviewed in the light of the ideas contained in all three reports in accordance with the needs and circumstances of the people. The article argues for the need to review the nature of science from the perspective of the common citizen.

