Hyperfinite type structures

1999 ◽  
Vol 64 (3) ◽  
pp. 1216-1242 ◽  
Author(s):  
Dag Normann ◽  
Erik Palmgren ◽  
Viggo Stoltenberg-Hansen

The notion of a hyperfinite set comes from nonstandard analysis. Such a set has the internal cardinality of a nonstandard natural number. By a transfer principle such sets share many properties of finite sets. Here we apply this notion to give a hyperfinite model of the Kleene-Kreisel continuous functionals. We also extend the method to provide a hyperfinite characterisation of certain transfinite type structures, thus, through the work of Waagbø [14], constructing a hyperfinite model for Martin-Löf type theory.

This kind of application is not new. Normann [6] gave a characterisation of the Kleene-Kreisel continuous functionals using ‘hyperfinitary’ functionals. The novelty here is that we use a constructive version of hyperfinite functionals and also generalise the method to transfinite types. Many of the results of this paper are constructive, though not the characterisation theorems themselves.

Our characterisation of the Kleene-Kreisel continuous functionals is a supplement to a number of previous characterisations of topological and recursion-theoretical nature; see [6] for a brief survey. Altogether these characterisations show that the original concept of Kleene and Kreisel forms the correct mathematical model of the idea of finitely based functions of finite types.

There is, however, no a priori reason to believe that there is a canonical way to extend the continuous functionals to cover transfinite objects of transfinite type used in, e.g., type theory. Our characterisation of Waagbø's model indicates that the model is natural, not only seen from domain theory but from a higher perspective. Normann and Waagbø (unpublished) have subsequently obtained a limit-space characterisation that further supports this view.

1998 ◽  
Vol 4 (3) ◽  
pp. 233-272 ◽  
Author(s):  
Erik Palmgren

Abstract

We develop a constructive version of nonstandard analysis, extending Bishop's constructive analysis with infinitesimal methods. A full transfer principle and a strong idealisation principle are obtained by using a sheaf-theoretic construction due to I. Moerdijk. The construction is, in a precise sense, a reduced power with variable filter structure. We avoid the nonconstructive standard part map by the use of nonstandard hulls. This leads to an infinitesimal analysis which includes nonconstructive theorems such as the Heine–Borel theorem, the Cauchy–Peano existence theorem for ordinary differential equations and the exact intermediate-value theorem, while it at the same time provides constructive results for concrete statements. A nonstandard measure theory which is considerably simpler than that of Bishop and Cheng is developed within this context.


2017 ◽  
Vol 23 (3) ◽  
pp. 420-432 ◽  
Author(s):  
Pavel Krejčí ◽  
Adrien Petrov

The third-body concept is a pragmatic tool used to understand the friction and wear of sliding materials. Wear particles play a crucial role in this approach and constitute the main part of the third body. This paper aims to introduce a mathematical model for the motion of a third-body interface separating two surfaces in contact. This model is written in accordance with the formalism of hysteresis operators as solution operators of the underlying variational inequalities. The existence result for this dynamical problem is obtained by using a priori estimates established for Faedo–Galerkin approximations, together with some more specific techniques such as anisotropic Sobolev embedding theory.


Author(s):  
Diego Liberati

In many fields of research, as well as in everyday life, one often has to face a huge amount of data without an immediate grasp of an underlying simple structure, even though one often exists. A typical example is the growing field of bioinformatics, where new technologies, like the so-called micro-arrays, provide thousands of gene-expression data on a single cell in a simple and fast integrated way. On the other hand, the everyday consumer is involved in a process not so different from a logical point of view, when the data associated with his fidelity badge contribute to the large database of many customers, whose underlying consumption trends are of interest to the distribution market. After collecting so many variables (say gene expressions, or goods) for so many records (say patients, or customers), possibly with the help of wrapping or warehousing approaches in order to mediate among different repositories, the problem arises of reconstructing a synthetic mathematical model capturing the most important relations between variables. To this purpose, two critical problems must be solved:

1. Select the most salient variables, in order to reduce the dimensionality of the problem and thus simplify the understanding of the solution.
2. Extract underlying rules involving conjunctions and/or disjunctions of such variables, in order to get a first idea of their possibly nonlinear relations, as a first step towards designing a representative model whose variables will be the selected ones.

Once the candidate variables are selected, a mathematical model of the dynamics of the underlying generating framework is still to be produced. A first hypothesis of linearity may be investigated, but it is usually only a very rough approximation when the values of the variables are not close to the operating point around which the linear approximation is computed.
On the other hand, building a nonlinear model is far from easy: the structure of the nonlinearity needs to be known a priori, which is usually not the case. A typical approach consists in exploiting a priori knowledge to define a tentative structure, then refining and modifying it on the training subset of the data, and finally retaining the structure that best fits a cross-validation on the testing subset. The problem is even more complex when the collected data exhibit hybrid dynamics, i.e., their evolution in time is a sequence of smooth behaviours and abrupt changes.
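The two-step pipeline described above (rank candidate variables by salience, then read simple rules off the selected ones) can be sketched as follows. This is our illustration only, not the chapter's actual algorithm; the variable names and the correlation-based salience measure are assumptions for the sake of a runnable toy example:

```python
# Toy sketch of the two-step pipeline: (1) rank candidate variables by
# absolute correlation with a binary outcome, (2) keep the top-k as the
# salient subset. Data and method are illustrative, not the chapter's own.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient over two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Records: (variables, binary outcome) -- e.g. gene expressions vs. disease.
records = [
    ({"g1": 0.9, "g2": 0.1, "g3": 0.5}, 1),
    ({"g1": 0.8, "g2": 0.2, "g3": 0.4}, 1),
    ({"g1": 0.2, "g2": 0.9, "g3": 0.5}, 0),
    ({"g1": 0.1, "g2": 0.8, "g3": 0.6}, 0),
]
outcome = [y for _, y in records]

# Step 1: salience of each variable = |correlation with the outcome|.
salience = {
    v: abs(pearson([r[v] for r, _ in records], outcome))
    for v in records[0][0]
}

# Step 2: keep the two most salient variables for rule extraction.
selected = sorted(salience, key=salience.get, reverse=True)[:2]
```

On this toy data, `g1` and `g2` track the outcome strongly while `g3` is nearly uninformative, so the selected subset drops `g3` and the subsequent rule extraction works in two dimensions instead of three.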


Author(s):  
Aurel Gaba ◽  
Vasile Bratu ◽  
Dorian Musat ◽  
Ileana Nicoleta Popescu ◽  
Maria Cristiana Enescu

Abstract This paper presents solutions and equipment for preheating the combustion air of scrap-aluminum melting furnaces through flue-gas heat recovery. For sizing convection pre-heaters, a mathematical model has been developed and transcribed into a computer program in C++. A constructive version of the pre-heater was drawn up, and a recovery heat exchanger was manufactured and mounted on an aluminum melting furnace. Both the values of the functional parameters and the causes of the pre-heater wearing out are shown, as well as the steps taken for sizing and building a new air pre-heater able to withstand the operating conditions of the aluminum melting furnace.


2018 ◽  
Vol 11 (1) ◽  
pp. 160-206
Author(s):  
RICCARDO PINOSIO ◽  
MICHIEL VAN LAMBALGEN

Abstract

In this paper we provide a mathematical model of Kant’s temporal continuum that yields formal correlates for Kant’s informal treatment of this concept in the Critique of Pure Reason and in other works of his critical period. We show that the formal model satisfies Kant’s synthetic a priori principles for time (whose consistency is not obvious) and that it even illuminates what “faculties and functions” must be in place, as “conditions for the possibility of experience”, for time to satisfy such principles. We then present a mathematically precise account of Kant’s transcendental theory of time, the most precise account to date.

Moreover, we show that the Kantian continuum which we obtain has some affinities with the Brouwerian continuum, but that it also has “infinitesimal intervals” consisting of nilpotent infinitesimals; these allow us to capture Kant’s theory of rest and motion in the Metaphysical Foundations of Natural Science.

While our focus is on Kant’s theory of time, the material in this paper is more generally relevant for the problem of developing a rigorous theory of the phenomenological continuum, in the tradition of Whitehead, Russell, and Weyl, among others.
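For orientation, the “nilpotent infinitesimals” mentioned above can be illustrated in the style of smooth infinitesimal analysis. This sketch is ours, not the paper’s construction:

```latex
% Illustration only (smooth-infinitesimal style, not the authors' formalism):
% a nilpotent infinitesimal squares to zero,
\varepsilon^{2} = 0,
% so every trajectory is exactly linear on an infinitesimal interval,
x(t + \varepsilon) = x(t) + \varepsilon \,\dot{x}(t),
% and "rest at t" means the infinitesimal increment vanishes:
\dot{x}(t) = 0 \;\Longrightarrow\; x(t + \varepsilon) = x(t).
```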


2005 ◽  
Vol 49 (4) ◽  
pp. 1483-1494 ◽  
Author(s):  
C. Wiuff ◽  
R. M. Zappala ◽  
R. R. Regoes ◽  
K. N. Garner ◽  
F. Baquero ◽  
...  

ABSTRACT When growing bacteria are exposed to bactericidal concentrations of antibiotics, the sensitivity of the bacteria to the antibiotic commonly decreases with time, and substantial fractions of the bacteria survive. Using Escherichia coli CAB1 and antibiotics of five different classes (ampicillin, ciprofloxacin, rifampin, streptomycin, and tetracycline), we examine the details of this phenomenon and, with the aid of mathematical models, develop and explore the properties and predictions of three hypotheses that can account for it: (i) antibiotic decay, (ii) inherited resistance, and (iii) phenotypic tolerance. Our experiments cause us to reject the first two hypotheses and provide evidence that this phenomenon can be accounted for by the antibiotic-mediated enrichment of subpopulations physiologically tolerant to, but genetically susceptible to, these antibiotics: phenotypic tolerance. We demonstrate that tolerant subpopulations generated by exposure to one concentration of an antibiotic are also tolerant to higher concentrations of the same antibiotic and can be tolerant to antibiotics of the other four types. Using a mathematical model, we explore the effects of phenotypic tolerance on the microbiological outcome of antibiotic treatment and demonstrate, a priori, that it can have a profound effect on the rate of clearance of the bacteria and under some conditions can prevent clearance that would be achieved in the absence of tolerance.
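The enrichment mechanism the abstract describes can be sketched with a minimal two-subpopulation kill curve. All parameter values here are hypothetical placeholders, and the simple exponential-kill ODE is our illustration, not the authors' fitted model:

```python
# Minimal two-subpopulation kill-curve sketch (hypothetical rates, not the
# paper's fitted model): susceptible cells die fast under the antibiotic,
# phenotypically tolerant cells die slowly, so the survivors become
# enriched for tolerant cells even though both start genetically identical.

def simulate(n_s=1e9, n_t=1e5, k_s=2.0, k_t=0.02, hours=6.0, dt=0.01):
    """Forward-Euler integration of dN/dt = -k * N for each subpopulation."""
    t = 0.0
    while t < hours:
        n_s -= k_s * n_s * dt  # susceptible: fast exponential kill
        n_t -= k_t * n_t * dt  # tolerant: slow exponential kill
        t += dt
    return n_s, n_t

n_s, n_t = simulate()
tolerant_fraction = n_t / (n_s + n_t)
```

Starting from a tolerant fraction of about 1 in 10,000, six hours of simulated exposure leaves the tolerant cells dominating the surviving population, which is the qualitative signature of antibiotic-mediated enrichment.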


2019 ◽  
Vol 5 (1) ◽  
pp. 246
Author(s):  
Vladimir Evgenievich Bazanov

Construction organizations lack adaptive infrastructure and need to use SMART technologies of design and construction more intensively. The purposes of this article are: a) a system analysis of the categories "competitiveness", "stability" and "company rating" for construction organizations; b) to construct and investigate an economic-mathematical model of the competitiveness of a construction organization; c) to determine the parameters of self-organization of a construction company; d) to construct an identification algorithm for the model. These new tasks also serve the purposes of modern construction business and the problems of forecasting its development. Using methods of system analysis and modeling, three levels of analysis of construction business are considered: the macrolevel (state level), the mesolevel (regional level) and the microlevel (company level). For example, 10 different classes of competitiveness of construction companies are proposed, which improves the traditionally used classification. A new economic-mathematical model based on production functions of Cobb-Douglas type is constructed, and an algorithm for its identification on the basis of situational scenarios is developed. The algorithm finds parameters which allow the competitiveness of a construction company to be determined a priori. The proposed research admits further development; for example, it can be used to forecast the adaptation of an enterprise.
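For reference, the Cobb-Douglas production function mentioned above has the following standard two-factor form; the paper's exact specification is not given in the abstract, so this is the textbook version only:

```latex
% Standard two-factor Cobb-Douglas form (illustrative; the paper's
% exact specification is not stated in the abstract):
Y = A \, K^{\alpha} L^{\beta},
\qquad A > 0, \quad \alpha, \beta \ge 0,
% Y: output, K: capital, L: labour, A: total factor productivity;
% alpha + beta = 1 gives constant returns to scale.
```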


1999 ◽  
Vol 9 (2) ◽  
pp. 177-223 ◽  
Author(s):  
BERNHARD REUS ◽  
THOMAS STREICHER

Synthetic domain theory (SDT) is a version of Domain Theory where ‘all functions are continuous’. Following the original suggestion of Dana Scott, several approaches to SDT have been developed that are logical or categorical, axiomatic or model-oriented in character, and that are either specialised towards Scott domains or aim at providing a general theory axiomatising the structure common to the various notions of domains studied so far.

In Reus and Streicher (1993), Reus (1995) and Reus (1998), we have developed a logical and axiomatic version of SDT, which is special in the sense that it captures the essence of Domain Theory à la Scott but rules out, for example, Stable Domain Theory, as it requires the order on function spaces to be pointwise. In this article we will give a logical and axiomatic account of a general SDT with the aim of grasping the structure common to all notions of domains.

As in loc. cit., the underlying logic is a sufficiently expressive version of constructive type theory. We start with a few basic axioms giving rise to a core theory on top of which we study various notions of predomains (such as, for example, complete and well-complete S-spaces (Longley and Simpson 1997)), define the appropriate notion of domain and verify the usual induction principles of domain theory.

Although each domain carries a logically definable ‘specialization order’, we avoid order-theoretic notions as much as possible in the formulation of axioms and theorems. The reason is that the order on function spaces cannot be required to be pointwise, as this would rule out the model of stable domains à la Berry.

The consequent use of logical language – understood as the internal language of some categorical model of type theory – avoids the irritating coexistence of the internal and the external view pervading purely categorical approaches. The paper is therefore aimed at providing an elementary introduction to synthetic domain theory, albeit requiring some knowledge of basic type theory.

