Understanding palladium complexes structures and reactivities: beyond classical point of view

2013 ◽  
Vol 3 (6) ◽  
pp. 529-541 ◽  
Author(s):  
Etienne Derat ◽  
Giovanni Maestri
2012 ◽  
Vol 2012 ◽  
pp. 1-19 ◽  
Author(s):  
Guido Sciavicco

The role of time in artificial intelligence is extremely important. Interval-based temporal reasoning can be seen as a generalization of classical point-based reasoning, and the first results in this field date back to Hamblin (1972) and van Benthem (1991) from the philosophical point of view, to Allen (1983) from the algebraic and first-order one, and to Halpern and Shoham (1991) from the modal logic one. Without purporting to provide a comprehensive survey of the field, we take the reader on a journey through the main developments in modal and first-order interval temporal reasoning over the past ten years and outline some landmark results on the expressiveness and (un)decidability of the satisfiability problem for the family of modal interval logics.
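The algebraic tradition the abstract traces to Allen (1983) rests on thirteen basic relations that can hold between two intervals. A minimal sketch, assuming intervals are represented as (start, end) pairs with start < end (the representation and function name are illustrative, not from the paper):

```python
# Minimal sketch of Allen's thirteen basic interval relations (Allen, 1983).
# Intervals are (start, end) pairs with start < end.

def allen_relation(a, b):
    """Return the unique basic Allen relation holding between intervals a and b."""
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:  return "before"
    if a2 == b1: return "meets"
    if b2 < a1:  return "after"
    if b2 == a1: return "met-by"
    if a1 == b1 and a2 == b2: return "equals"
    if a1 == b1: return "starts" if a2 < b2 else "started-by"
    if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2: return "during"
    if a1 < b1 and b2 < a2: return "contains"
    if a1 < b1 < a2 < b2: return "overlaps"
    return "overlapped-by"  # only remaining case: b1 < a1 < b2 < a2

print(allen_relation((1, 3), (3, 5)))  # meets
print(allen_relation((1, 4), (2, 6)))  # overlaps
print(allen_relation((2, 3), (1, 5)))  # during
```

Exactly one relation holds for any pair of well-formed intervals, which is what makes the thirteen relations a jointly exhaustive and pairwise disjoint basis for the algebra.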


Author(s):  
Željko Ivezić ◽  
Andrew J. Connolly ◽  
Jacob T. VanderPlas ◽  
Alexander Gray ◽  
...  

This chapter introduces the main concepts of statistical inference, or drawing conclusions from data. There are three main types of inference: point estimation, confidence estimation, and hypothesis testing. Two major statistical paradigms address these questions: the classical, or frequentist, paradigm and the Bayesian paradigm. While most of statistics and machine learning is based on the classical paradigm, Bayesian techniques are being embraced by the statistical and scientific communities at an ever-increasing pace. The chapter begins with a short comparison of the classical and Bayesian paradigms, and then discusses the three main types of statistical inference from the classical point of view.
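The contrast between the two paradigms already shows up in the simplest point-estimation problem. A minimal sketch for a binomial proportion, assuming illustrative data (the numbers below are not from the chapter):

```python
# Contrasting classical and Bayesian point estimation for a binomial
# proportion; the data and the uniform prior are illustrative choices.

k, n = 7, 10  # observed successes, total trials

# Classical (frequentist): the maximum-likelihood estimate.
p_mle = k / n

# Bayesian: with a uniform Beta(1, 1) prior, the posterior is
# Beta(k + 1, n - k + 1); its mean serves as a point estimate.
p_bayes = (k + 1) / (n + 2)

print(p_mle)              # 0.7
print(round(p_bayes, 3))  # 0.667
```

The Bayesian estimate is pulled slightly toward the prior mean of 0.5; as n grows, the two estimates converge, which is one way the paradigms agree in the large-sample limit.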


1995 ◽  
Vol 60 (1) ◽  
pp. 325-337 ◽  
Author(s):  
Thierry Coquand

If it is difficult to give the exact significance of consistency proofs from a classical point of view, in particular the proofs of Gentzen [2, 6], and Novikoff [14], the motivations of these proofs are quite clear intuitionistically. Their significance is then less to give a mere consistency proof than to present an intuitionistic explanation of the notion of classical truth. Gentzen for instance summarizes his proof as follows [6]: “Thus propositions of actualist mathematics seem to have a certain utility, but no sense. The major part of my consistency proof, however, consists precisely in ascribing a finitist sense to actualist propositions.” From this point of view, the main part of both Gentzen's and Novikoff's arguments can be stated as establishing that modus ponens is valid w.r.t. this interpretation ascribing a “finitist sense” to classical propositions.

In this paper, we reformulate Gentzen's and Novikoff's “finitist sense” of an arithmetic proposition as a winning strategy for a game associated to it. (To see a proof as a winning strategy has been considered by Lorenzen [10] for intuitionistic logic.) In the light of concurrency theory [7], it is tempting to consider a strategy as an interactive program (which represents thus the “finitist sense” of an arithmetic proposition). We shall show that the validity of modus ponens then gets a quite natural formulation, showing that “internal chatters” between two programs end eventually.

We first present Novikoff's notion of regular formulae, that can be seen as an intuitionistic truth definition for classical infinitary propositional calculus. We use this in order to motivate the second part, which presents a game-theoretic interpretation of the notion of regular formulae, and a proof of the admissibility of modus ponens which is based on this interpretation.


Mathematics ◽  
2022 ◽  
Vol 10 (2) ◽  
pp. 189
Author(s):  
Vicente Moret-Bonillo ◽  
Samuel Magaz-Romero ◽  
Eduardo Mosqueira-Rey

In this paper, we illustrate that inaccurate knowledge can be efficiently implemented in a quantum environment. For this purpose, we analyse the correlation between certainty factors and quantum probability. We first explore the certainty factors approach for inexact reasoning from a classical point of view. Next, we introduce some basic aspects of quantum computing, and we pay special attention to quantum rule-based systems. In this context, a specific use case was built: an inferential network for testing the behaviour of the certainty factors approach in a quantum environment. After the design and execution of the experiments, the corresponding analysis of the obtained results was performed in three different scenarios: (1) inaccuracy in declarative knowledge, or imprecision, (2) inaccuracy in procedural knowledge, or uncertainty, and (3) inaccuracy in both declarative and procedural knowledge. This paper, as stated in the conclusions, is intended to pave the way for future quantum implementations of well-established methods for handling inaccurate knowledge.


Those who first applied genetics to the study of natural populations—and it was, we must remember, fifty years ago—applied it from what we may now call the classical point of view. This is the point of view which assumes that the properties of heredity and also of variation can be deduced from breeding experiments using the methods of Mendel, Bateson and Morgan. It is the point of view expressed by Morgan in 1926 under the title of the Theory of the Gene. The fact that Morgan believed in the chromosomes while Bateson did not, failed to produce the cleavage in this classical view that might have been expected. It failed to do so because, for Morgan and also for those who followed him, his theory did not raise questions: it answered them. The chromosomes did not make the law: they obeyed it. It is thus not the chromosome theory but the Mendelian situation which is crucial for classical genetics. The inbred lines of close relationship, the regulated succession of selfing or sibbing and crossing, the chosen and standard environment, the individual as the unit of observation and selection: these were necessary ingredients and premises for the first phase of getting to know heredity. Generalizations were reached in this way which proved to be valid. They did so because they rested on the properties of cell structures, nuclei and chromosomes, at mitosis and meiosis, which are found to be universal.


2012 ◽  
Vol 3 ◽  
pp. 40-43 ◽  
Author(s):  
N. Shrestha ◽  
J. J. Nakarmi ◽  
L. N. Jha

We have discussed the problems of nonlinear interaction between electromagnetic radiation and atoms from a semi-classical point of view. The time-dependent Schrödinger equation for a single-electron system is solved using a perturbative technique to obtain the transition probability. Higher-order perturbation theory is also discussed, which applies to multiphoton processes in which two or more quanta are emitted instead of a single photon. The theory is based on the assumption that the perturbation is small. From this transition probability, the ionization rate and absorption cross-section of the hydrogen atom are calculated. Their variation with photon energy and field strength is analysed and agrees very well with experimental observations.

The Himalayan Physics, Vol. 3, No. 3 (2012), pp. 40-43.
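The first-order result the abstract refers to is standard time-dependent perturbation theory; a sketch of the textbook formulas, with the usual symbols rather than the paper's notation:

```latex
% First-order amplitude for a transition |i> -> |f> under a perturbation V(t):
c_f^{(1)}(t) = -\frac{i}{\hbar} \int_0^t \langle f \,|\, V(t') \,|\, i \rangle \, e^{i \omega_{fi} t'} \, dt',
\qquad \omega_{fi} = \frac{E_f - E_i}{\hbar},
% and the corresponding transition probability is
P_{i \to f}(t) = \bigl| c_f^{(1)}(t) \bigr|^2 .
```

The smallness of the perturbation enters through truncating the Dyson series at this first term; the multiphoton processes mentioned above come from keeping higher-order terms of the same expansion.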


1919 ◽  
Vol 28 (1) ◽  
pp. 98
Author(s):  
Katherine E. Gilbert ◽  
Emile Boutroux
