Nonstandard set theory

1989 ◽  
Vol 54 (3) ◽  
pp. 1000-1008 ◽  
Author(s):  
Peter Fletcher

Abstract
Nonstandard set theory is an attempt to generalise nonstandard analysis to cover the whole of classical mathematics. Existing versions (Nelson, Hrbáček, Kawai) are unsatisfactory in that the unlimited idealisation principle conflicts with the wish to have a full theory of external sets. I re-analyse the underlying requirements of nonstandard set theory and give a new formal system, stratified nonstandard set theory, which seems to meet them better than the other versions.

2002 ◽  
Vol 67 (1) ◽  
pp. 315-325 ◽  
Author(s):  
Mauro Di Nasso

Abstract
A nonstandard set theory *ZFC is proposed that axiomatizes the nonstandard embedding *. Besides the usual principles of nonstandard analysis, all axioms of ZFC except regularity are assumed. A strong form of saturation is also postulated. *ZFC is a conservative extension of ZFC.
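For readers who want the shape of these "usual principles", the transfer principle for an embedding * and κ-saturation are commonly stated as follows (textbook formulations given for orientation, not quoted from Di Nasso's axiomatization):

```latex
% Transfer: for standard objects a_1,...,a_n and a first-order
% formula phi, truth is preserved under the *-embedding.
\[
\varphi(a_1,\dots,a_n)
\;\Longleftrightarrow\;
{}^{*}\varphi({}^{*}a_1,\dots,{}^{*}a_n)
\]
% kappa-saturation: a family A of fewer than kappa internal sets
% with the finite intersection property has nonempty intersection.
\[
|\mathcal{A}| < \kappa
\;\wedge\;
\forall F \in [\mathcal{A}]^{<\omega}\,
\Bigl(\textstyle\bigcap F \neq \emptyset\Bigr)
\;\Longrightarrow\;
\textstyle\bigcap \mathcal{A} \neq \emptyset
\]
```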


2001 ◽  
Vol 66 (3) ◽  
pp. 1321-1341 ◽  
Author(s):  
P. V. Andreev ◽  
E. I. Gordon

Abstract
We present an axiomatic framework for nonstandard analysis—the Nonstandard Class Theory (NCT)—which extends von Neumann–Gödel–Bernays Set Theory (NBG) by adding a unary predicate symbol St to the language of NBG (St(X) means that the class X is standard) and related axioms: analogs of Nelson's idealization, standardization and transfer principles. Those principles are formulated as axioms, rather than axiom schemes, so that NCT is finitely axiomatizable. NCT can be considered as a theory of definable classes of the Bounded Set Theory of V. Kanovei and M. Reeken. In many respects NCT resembles the Alternative Set Theory of P. Vopěnka. For example, there exist semisets (proper subclasses of sets) in NCT, and it can be proved that a set has a standard finite cardinality iff it does not contain any proper subsemiset. Semisets can be considered as external classes in NCT. Thus the saturation principle can be formalized in NCT.
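Nelson's three principles, which NCT recasts as single class axioms, are classically given as the following schemes of IST (standard formulations included for reference; φ ranges over internal formulas in (T) and (I), and over arbitrary formulas in (S)):

```latex
% (T) Transfer, for internal phi with standard parameters t_1..t_k:
\[
\forall^{\mathrm{st}} t_1 \dots \forall^{\mathrm{st}} t_k\,
\bigl(\forall^{\mathrm{st}} x\,\varphi \rightarrow \forall x\,\varphi\bigr)
\]
% (I) Idealization, for internal phi; z ranges over standard
% finite sets:
\[
\forall^{\mathrm{st\,fin}} z\, \exists x\, \forall y \in z\, \varphi(x,y)
\;\leftrightarrow\;
\exists x\, \forall^{\mathrm{st}} y\, \varphi(x,y)
\]
% (S) Standardization, for arbitrary phi:
\[
\forall^{\mathrm{st}} x\, \exists^{\mathrm{st}} y\, \forall^{\mathrm{st}} z\,
\bigl(z \in y \leftrightarrow (z \in x \wedge \varphi(z))\bigr)
\]
```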


1992 ◽  
Vol 57 (2) ◽  
pp. 741-748 ◽  
Author(s):  
David Ballard ◽  
Karel Hrbacek

In the thirty years since its invention by Abraham Robinson, nonstandard analysis has become a useful tool for research in many areas of mathematics. It seems fair to say, however, that the search for practically satisfactory foundations for the subject is not yet completed. New proposals, intended to remedy various shortcomings of older approaches, continue to be put forward. The objective of this paper is to show that nonstandard concepts have a natural place in the usual (more or less “standard”) set theory, and to argue that this approach improves upon various aspects of hitherto considered systems, while retaining most of their attractive features. We do this by working in Zermelo-Fraenkel set theory with non-well-founded sets. It has always been clear that the axiom of regularity may fail for external sets. The previous approaches either avoid non-well-foundedness by considering only that fragment of nonstandard set theory that is well-founded (over individuals; enlargements of Robinson and Zakon [17]) or reluctantly live with it (various axiomatic nonstandard set theories). Ballard and Davidon [2] were the first to propose constructive use for non-well-foundedness in the foundations of nonstandard analysis. In the present paper we adopt a very strong anti-foundation axiom. In the resulting more or less “usual” set theory, the (to the “standard” mathematician) unfamiliar concepts of standard, external and internal sets can be defined and their requisite properties proved (rather than postulated, as is the case in axiomatic nonstandard set theories).
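As orientation only: the best-known anti-foundation axiom is Aczel's AFA, stated below in its usual form. The paper adopts a different, very strong axiom, not AFA itself; AFA is shown here merely to convey what axioms of this kind assert.

```latex
% AFA: every graph (G, E) has a unique decoration, i.e. a unique
% map d from nodes to sets satisfying
\[
d(n) \;=\; \{\, d(m) : (n,m) \in E \,\}
\qquad \text{for all } n \in G.
\]
```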


1986 ◽  
Vol 51 (2) ◽  
pp. 377-386 ◽  
Author(s):  
C. Ward Henson ◽  
H. Jerome Keisler

It is often asserted in the literature that any theorem which can be proved using nonstandard analysis can also be proved without it. The purpose of this paper is to show that this assertion is wrong, and in fact there are theorems which can be proved with nonstandard analysis but cannot be proved without it. There is currently a great deal of confusion among mathematicians because the above assertion can be interpreted in two different ways. First, there is the following correct statement: any theorem which can be proved using nonstandard analysis can be proved in Zermelo-Fraenkel set theory with choice, ZFC, and thus is acceptable by contemporary standards as a theorem in mathematics. Second, there is the erroneous conclusion drawn by skeptics: any theorem which can be proved using nonstandard analysis can be proved without it, and thus there is no need for nonstandard analysis.

The reason for this confusion is that the set of principles which are accepted by current mathematics, namely ZFC, is much stronger than the set of principles which are actually used in mathematical practice. It has been observed (see [F] and [S]) that almost all results in classical mathematics use methods available in second order arithmetic with appropriate comprehension and choice axiom schemes.
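For reference, the comprehension scheme of second-order arithmetic reads as follows in its full form; the fragments at issue restrict which formulas φ may be used:

```latex
% Comprehension: for each formula phi(n) in which X does not
% occur free, the set of n satisfying phi exists.
\[
\exists X\, \forall n\, \bigl(n \in X \leftrightarrow \varphi(n)\bigr)
\]
```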


Author(s):  
A. V. Crewe

We have become accustomed to differentiating between the scanning microscope and the conventional transmission microscope according to the resolving power which the two instruments offer. The conventional microscope is capable of a point resolution of a few angstroms and line resolutions of periodic objects of about 1Å. On the other hand, the scanning microscope, in its normal form, is not ordinarily capable of a point resolution better than 100Å. Upon examining reasons for the 100Å limitation, it becomes clear that this is based more on tradition than reason, and in particular, it is a condition imposed upon the microscope by adherence to thermal sources of electrons.


Author(s):  
Maxim B. Demchenko

The sphere of the unknown, supernatural and miraculous is one of the most popular subjects of everyday discussion in Ayodhya – the last of the provinces of the Mughal Empire, which entered the British Raj in 1859, and in the distant past the space of many legendary and mythological events. Mostly these discussions concern encounters with inhabitants of the “other world” – spirits, ghosts, jinns – as well as miraculous healings following magic rituals or meetings with so-called saints of different religions (Hindu sadhus, Sufi dervishes), and incomprehensible and frightening natural phenomena. According to the author’s observations, ideas of the unknown are codified and structured better in Avadh than in other parts of India. Local people can clearly say whether they have witnessed a bhut or a jinn, and whether a disease is caused by witchcraft or by other causes. Perhaps this is due to the presence in the holy town of a persistent tradition of katha, the public presentation of plots from the Ramayana epic in narrative, poetic and performative forms. But are the events and phenomena in question a miracle for the Avadhvasis, residents of Ayodhya and its environs, or are they so commonplace that they do not surprise or fascinate? That is precisely the subject of this essay, written on the basis of materials collected by the author in Ayodhya during 2010–2019. The author would like to express his appreciation to Mr. Alok Sharma (Faizabad) for his advice and cooperation.


HortScience ◽  
1998 ◽  
Vol 33 (3) ◽  
pp. 452c-452 ◽  
Author(s):  
Schuyler D. Seeley ◽  
Raymundo Rojas-Martinez ◽  
James Frisby

Mature peach trees in pots were treated with nighttime temperatures of –3, 6, 12, and 18 °C for 16 h and a daytime temperature of 20 °C for 8 h until the leaves abscised in the colder treatments. The trees were then chilled at 6 °C for 40 to 70 days. Trees were removed from chilling at 40, 50, 60, and 70 days and placed in a 20 °C greenhouse under increasing-daylength, spring conditions. Anthesis was faster and shoot length increased with longer chilling treatments. Trees exposed to the –3 °C pretreatment flowered and grew best after 40 days of chilling; with longer chilling times, however, they did not flower faster or grow better than trees from the other pretreatments. There was no difference in flowering or growth between the 6 and 12 °C pretreatments. The 18 °C pretreatment resulted in slower flowering and very little growth after 40 and 50 days of chilling, but growth was comparable to the other treatments after 70 days of chilling.


2020 ◽  
Vol 27 (3) ◽  
pp. 178-186 ◽  
Author(s):  
Ganesan Pugalenthi ◽  
Varadharaju Nithya ◽  
Kuo-Chen Chou ◽  
Govindaraju Archunan

Background: N-Glycosylation is one of the most important post-translational mechanisms in eukaryotes. N-glycosylation predominantly occurs in the N-X-[S/T] sequon, where X is any amino acid other than proline. However, not all N-X-[S/T] sequons in proteins are glycosylated. Therefore, accurate prediction of N-glycosylation sites is essential to understand the N-glycosylation mechanism.
Objective: Our motivation in this article is to develop a computational method to predict N-glycosylation sites in eukaryotic protein sequences.
Methods: We report a random forest method, Nglyc, to predict N-glycosylation sites from protein sequence, using 315 sequence features. The method was trained using a dataset of 600 N-glycosylation sites and 600 non-glycosylation sites, and tested on a dataset containing 295 N-glycosylation sites and 253 non-glycosylation sites. Nglyc predictions were compared with the NetNGlyc, EnsembleGly and GPP methods. Further, the performance of Nglyc was evaluated using human and mouse N-glycosylation sites.
Results: The Nglyc method achieved an overall training accuracy of 0.8033 with all 315 features. Performance comparison with the NetNGlyc, EnsembleGly and GPP methods shows that Nglyc performs better than the other methods, with high sensitivity and specificity rates.
Conclusion: Our method achieved an overall accuracy of 0.8248, with 0.8305 sensitivity and 0.8182 specificity. The comparison study shows that our method performs better than the other methods. The applicability and success of our method were further evaluated using human and mouse N-glycosylation sites. The Nglyc method is freely available at https://github.com/bioinformaticsML/Ngly.
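As a rough illustration of the kind of pipeline the abstract describes (not the authors' actual code: the feature function, window size, and hyperparameters below are invented for the sketch, and scikit-learn is merely one convenient library choice), a candidate-sequon scan plus a random-forest classifier might look like:

```python
import re

from sklearn.ensemble import RandomForestClassifier

# Candidate N-glycosylation sequons: N-X-[S/T] with X != proline.
# The lookahead lets overlapping sequons all be reported.
SEQON = re.compile(r"(?=(N[^P][ST]))")

def find_sequons(seq: str) -> list[int]:
    """Return 0-based positions of the N in each N-X-[S/T] sequon."""
    return [m.start() for m in SEQON.finditer(seq)]

def toy_features(seq: str, pos: int) -> list[int]:
    """Stand-in for the paper's 315 sequence features: amino acid
    counts in a +/-7 residue window around the candidate site."""
    window = seq[max(0, pos - 7):pos + 8]
    return [window.count(aa) for aa in "ACDEFGHIKLMNPQRSTVWY"]

# With X_train, y_train built from labelled sites (600 positive,
# 600 negative in the paper), training would look like:
clf = RandomForestClassifier(n_estimators=500, random_state=0)
# clf.fit(X_train, y_train)
# proba = clf.predict_proba(X_test)[:, 1]
```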


2019 ◽  
Vol 15 (5) ◽  
pp. 472-485 ◽  
Author(s):  
Kuo-Chen Chou ◽  
Xiang Cheng ◽  
Xuan Xiao

Background/Objective: Information on protein subcellular localization is crucially important for both basic research and drug development. With the explosive growth of protein sequences discovered in the post-genomic age, there is strong demand for powerful bioinformatics tools that can timely and effectively identify subcellular localization based on sequence information alone. Recently, a predictor called “pLoc-mEuk” was developed for identifying the subcellular localization of eukaryotic proteins. Its performance is overwhelmingly better than that of the other predictors for the same purpose, particularly in dealing with multi-label systems where many proteins, called “multiplex proteins”, may simultaneously occur in two or more subcellular locations. Although it is indeed a very powerful predictor, more effort is definitely needed to further improve it. This is because pLoc-mEuk was trained on an extremely skewed dataset in which one subset was about 200 times the size of the other subsets. Accordingly, it cannot avoid the bias caused by such an uneven training dataset.
Methods: To alleviate this bias, we have developed a new predictor called pLoc_bal-mEuk by quasi-balancing the training dataset. Cross-validation tests on exactly the same experiment-confirmed dataset have indicated that the proposed new predictor is remarkably superior to pLoc-mEuk, the existing state-of-the-art predictor, in identifying the subcellular localization of eukaryotic proteins. It has not escaped our notice that the quasi-balancing treatment can also be used to deal with many other biological systems.
Results: To maximize convenience for most experimental scientists, a user-friendly web-server for the new predictor has been established at http://www.jci-bioinfo.cn/pLoc_bal-mEuk/.
Conclusion: It is anticipated that the pLoc_bal-mEuk predictor holds very high potential to become a useful high-throughput tool in identifying the subcellular localization of eukaryotic proteins, particularly for finding multi-target drugs, currently a very hot trend in drug development.
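The abstract describes quasi-balancing only at a high level. One plausible reading (a sketch under our own assumptions, not the authors' algorithm; the function name `quasi_balance` and the `cap_ratio` parameter are invented here) is to downsample any dominant class toward a bounded multiple of the smallest class, rather than forcing exact equality:

```python
import random
from collections import defaultdict

def quasi_balance(samples, labels, cap_ratio=3, seed=0):
    """Downsample any class larger than cap_ratio times the
    smallest class; smaller classes are kept in full."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for s, y in zip(samples, labels):
        by_label[y].append(s)
    cap = cap_ratio * min(len(v) for v in by_label.values())
    out = []
    for y, group in by_label.items():
        keep = rng.sample(group, cap) if len(group) > cap else group
        out.extend((s, y) for s in keep)
    rng.shuffle(out)  # avoid label-ordered batches downstream
    return out
```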


Author(s):  
B. Elavarasan ◽  
G. Muhiuddin ◽  
K. Porselvi ◽  
Y. B. Jun

Abstract
Human endeavours span a wide spectrum of activities, including solving fascinating problems in the realms of engineering, the arts, the sciences, medicine, the social sciences, economics and the environment. Classical mathematical methods are insufficient for these problems, because real-world problems involve many uncertainties that are difficult to capture by classical means. Researchers the world over have established new mathematical theories, such as fuzzy set theory and rough set theory, in order to model the uncertainties that appear in the various fields mentioned above. More recently, soft set theory has been developed; it offers a novel way of solving real-world issues, since the problem of choosing a membership function does not arise. This comes in handy for solving numerous problems, and many advancements are now being made. Jun introduced hybrid structures utilizing the ideas of a fuzzy set and a soft set; hybrid structures are thus a generalization of both soft sets and fuzzy sets. In the present work, the notion of hybrid ideals of a near-ring is introduced, and significant work has been carried out to investigate some of their important properties. These notions are characterized, and relations between them are established. For a hybrid left (resp., right) ideal, different left (resp., right) ideal structures of near-rings are constructed. Efforts have been undertaken to display the relations between the hybrid product and the hybrid intersection. Finally, results on the homomorphic hybrid preimage of hybrid left (resp., right) ideals are proved.
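For context, a hybrid structure (in the formulation usually attributed to Jun and coauthors; stated here from memory, so the paper should be consulted for the exact definition) pairs a soft-set component with a fuzzy-set component:

```latex
% A hybrid structure in a set X over an initial universe U:
\[
\tilde{f}_{\lambda} := (\tilde{f}, \lambda) : X \longrightarrow
\mathcal{P}(U) \times [0,1],
\qquad
x \longmapsto \bigl(\tilde{f}(x), \lambda(x)\bigr)
\]
% \tilde{f} : X -> P(U) is the soft-set part,
% \lambda : X -> [0,1] is the fuzzy-set part.
```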

