The Worse than Nothing Account of Harm and the Preemption Problem

2021 ◽  
pp. 1-24
Author(s):  
Daniel Immerman

Abstract Because harm is an important notion in ethics, it’s worth investigating what it amounts to. The counterfactual comparative account of harm, commonly thought to be the most promising account of harm, analyzes harm by comparing what actually happened with what would have happened in some counterfactual situation. But it faces the preemption problem, a problem so serious that it has driven some to suggest we abandon the counterfactual comparative account and maybe even abandon the notion of harm altogether. This paper defends a version of the counterfactual comparative account that solves the preemption problem, a version called the “worse than nothing account.” It says that you harm someone just in case you leave them worse off than if you’d done nothing at all.

Author(s):  
Joseph F. Boudreau ◽  
Eric S. Swanson

Built-in datatypes and C++ classes are introduced in this chapter, and discussed in relation to the important notion of encapsulation, which refers to the separation between the internal representation of the datatype and the operations to which it responds. Encapsulation later becomes an important consideration in the design of custom C++ classes that programmers develop themselves. It is illustrated with built-in floating-point datatypes float and double and with the complex class from the C++ standard library. While a sophisticated programmer is aware of the internal representation of data and its resulting limitations, encapsulation allows one to consider these as details and frees one to think at a higher level of program design. Some simple numerical examples are discussed in the text and in the exercises.


1978 ◽  
Vol 1 (2) ◽  
pp. 149-167 ◽  
Author(s):  
Gunnel Källgren

This article is an attempt at combining the sentence level and the text level by similar means of analysis, based on a DEEP CASE MODEL. Tentative lists of DEEP CASES and TEXTUAL RELATIONS are given and discussed. The theoretical apparatus has been developed through empirical analyses of texts, and the important notion of LINK FAMILY is introduced. The ultimate goal of this type of analysis is to find the INFORMATION STRUCTURE of texts, but its connections with e.g. “pure” TEXTUAL COHESION are also treated.


2019 ◽  
Vol 17 (1) ◽  
pp. 653-667
Author(s):  
Zhongming Teng ◽  
Hong-Xiu Zhong

Abstract In the linear response eigenvalue problem arising in computational quantum chemistry and physics, one needs to compute a few of the smallest positive eigenvalues together with the corresponding eigenvectors. For such a task, most efficient algorithms are based on an important notion, the so-called pair of deflating subspaces. If a pair of deflating subspaces is at hand, the computed approximate eigenvalues are partial eigenvalues of the linear response eigenvalue problem. When an exact pair of deflating subspaces is not available, only an approximate one, a recent paper [SIAM J. Matrix Anal. Appl., 35(2), pp. 765-782, 2014] by Zhang, Xue and Li obtained relationships between the accuracy of the eigenvalue approximations and the distances from the exact deflating subspaces to their approximations. In this paper, we establish majorization-type results for these relationships. From our majorization results, various bounds are readily available to estimate how accurate the approximate eigenvalues are, based on the approximation accuracy of a pair of approximate deflating subspaces. These results provide theoretical foundations for assessing the relative performance of certain iterative methods for the linear response eigenvalue problem.
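For orientation, the problem has the following block structure in a common formulation (a sketch following the convention used in this literature; the symbols below are assumptions, not necessarily this paper's notation):

```latex
H z = \lambda z, \qquad
H = \begin{bmatrix} 0 & K \\ M & 0 \end{bmatrix},
\qquad K = K^{\mathsf T} \succeq 0, \quad M = M^{\mathsf T} \succeq 0,
```

where a pair of deflating subspaces $\{\mathcal U, \mathcal V\}$ is one satisfying $K\,\mathcal U \subseteq \mathcal V$ and $M\,\mathcal V \subseteq \mathcal U$; projecting $H$ onto such an exact pair reproduces partial eigenvalues exactly, which is why the distance from approximate to exact deflating subspaces governs eigenvalue accuracy.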


2006 ◽  
Vol 129 (4) ◽  
pp. 445-448 ◽  
Author(s):  
Davide Paganelli

Singularities form surfaces in the joint space of a serial manipulator. Pai and Leu (Pai and Leu, 1992, IEEE Trans. Rob. Autom., 8, pp. 545–559) introduced the important notion of a generic manipulator, whose singularity surfaces are smooth and do not intersect each other. Burdick (Burdick, 1995, J. Mech. Mach. Theor., 30, pp. 71–89) proposed a homotopy-based classification method for generic 3R manipulators. Through this classification method, it was stated in (Wenger, 1998, J. Mech. Des., 120, pp. 327–332) that there exist exactly eight classes of generic 3R manipulators. A counterexample to this classification is provided: a generic 3R manipulator belonging to none of the eight classes identified in (Wenger, 1998, J. Mech. Des., 120, pp. 327–332) is presented. The weak point of the proof given in (J. Mech. Des., 120, pp. 327–332) is highlighted. The counterexample proves the existence of at least nine homotopy classes of generic 3R manipulators. The paper points out two peculiar properties of the manipulator proposed as a counterexample, which are not featured by any manipulator belonging to the eight homotopy classes discovered so far. Finally, it is proven that at most four branches of the singularity curve can coexist in the joint space of a generic 3R manipulator, and therefore at most eleven homotopy classes are possible.


Author(s):  
Berta Guerrero Almagro

In this article we carry out a revision of the concept of rhythm in the poetry of José Antonio Ramos Sucre (Cumaná, 1890-Geneva, 1930). The mode used by the poet is the prose poem and, as we will demonstrate, it is built without setting aside a notion important to poetic elaboration: rhythm. We then analyze two poems from El cielo de esmalte (1929): "El asno" and "El jugador".


Author(s):  
Miguel D. Ramirez

This paper analyzes the very important notion of capital from a Marxian perspective as opposed to a neoclassical one. It is argued that when capital is viewed as a historically determined social process (relation), rather than as a thing or a collection of things, it tends to assume certain specific forms more often than others depending on the particular stage of economic history. Capital thus refers simultaneously to social relations and to things. Given this frame of reference, notions such as money and property capital are more easily accommodated and consequently are not written off as financial or fictitious capital - not real capital because they produce nothing. The paper also focuses on Marx's important analysis of the time of production and the turnover of capital in terms of the production of surplus-value (profit). It then examines Marx's equally important and prescient analysis of how the turnover speed of capital is affected by the time of circulation of commodities (the realization of surplus-value) and the growing use of credit (in its various forms) in the capitalist system. Finally, the paper turns its attention to the economic role of time as it relates to interest-bearing capital - one whose clear comprehension rests on viewing capital as a social construct.


Author(s):  
Shaun M. Fallat ◽  
Charles R. Johnson

This chapter contains a detailed account of the distribution of rank-deficient submatrices within a TN matrix, including a discussion of the important notion of row and column inclusion. The distribution of ranks among submatrices of a TN matrix is much less free than in a general matrix. Rank deficiency of submatrices in certain positions requires rank deficiency elsewhere. Whereas, in the case of general matrices, rank deficiency of a large submatrix can imply rank deficiency of smaller, included submatrices, in the TN case rank deficiency of small submatrices can imply that of much larger ones. The chapter discusses this and other related phenomena.


Author(s):  
Timothy Williamson

The chapter responds to Dorothy Edgington’s article ‘Possible Knowledge of Unknown Truth’, which defends her seminal diagnosis of the Church–Fitch refutation of verificationist knowability principles. Using counterfactual conditionals, she reformulates those principles to block that objection. The chapter argues that, to avoid trivialization, Edgington must supply a more general constraint on how the knower specifies a counterfactual situation for purposes of her reformulated principles; it is unclear how to do so. The philosophical motivation for her strategy is also questioned, with special reference to her treatment of Putnam’s epistemic account of truth. In passing, it is questioned how dangerous Church–Fitch arguments are for verificationist principles with non-factive evidential attitudes in place of knowledge. Finally, a doubt is raised about the compatibility of Edgington’s reformulation strategy with her view that counterfactual conditionals lack truth-conditions.


2020 ◽  
pp. 289-318
Author(s):  
Giuseppe Mussardo

Chapter 8 introduces the key ideas of the renormalization group, including how they provide a theoretical scheme and a proper language to face critical phenomena. It covers the scaling transformations of a system and their implementations in the space of the coupling constants and reducing the degrees of freedom. From this analysis, the reader is led to the important notion of relevant, irrelevant and marginal operators and then to the universality of the critical phenomena. Furthermore, the chapter also covers (as regards the RG) transformation laws, effective Hamiltonians, the Gaussian model, the Ising model, operators of quantum field theory, universal ratios, critical exponents and β‎-functions.
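The relevant/irrelevant/marginal trichotomy can be summarized by the standard linearized RG flow near a fixed point (textbook material, stated here in generic notation rather than the chapter's own):

```latex
g_i \;\longrightarrow\; g_i' = b^{\,y_i}\, g_i + O(g^2),
\qquad
\begin{cases}
y_i > 0, & \text{relevant: } g_i \text{ grows under coarse-graining},\\
y_i < 0, & \text{irrelevant: } g_i \text{ shrinks toward the fixed point},\\
y_i = 0, & \text{marginal: the fate of } g_i \text{ is decided at higher order,}
\end{cases}
```

where $b > 1$ is the rescaling factor of the RG transformation. Universality follows because long-distance behavior depends only on the few relevant couplings, not on the many irrelevant ones.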


Author(s):  
Jean Zinn-Justin

Chapter 11 is the first of four chapters that discuss various issues connected with the Standard Model of fundamental interactions at the microscopic scale. It discusses the important notion of gauge invariance, first Abelian and then non–Abelian, the basic geometric structure that generates interactions. It relates it to the concept of parallel transport. Due to gauge invariance, not all components of the gauge field are dynamical and gauge fixing is required (with the problem of Gribov copies in non–Abelian theories). The quantization of non–Abelian gauge theories is briefly discussed, with the introduction of Faddeev–Popov ghost fields and the appearance of BRST symmetry.
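For reference, gauge invariance in its Abelian and non-Abelian forms can be sketched as follows (one common sign convention, not necessarily the normalization used in the chapter):

```latex
\text{Abelian:}\quad
\psi \to e^{i\theta(x)}\psi, \qquad
A_\mu \to A_\mu - \tfrac{1}{e}\,\partial_\mu\theta(x), \qquad
D_\mu = \partial_\mu + i e A_\mu;
\qquad
\text{non-Abelian:}\quad
A_\mu \to g\,A_\mu\,g^{-1} + \tfrac{i}{e}\,(\partial_\mu g)\,g^{-1}, \qquad
F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu + i e\,[A_\mu, A_\nu].
```

The covariant derivative $D_\mu$ realizes the parallel transport mentioned in the abstract, and the commutator term in $F_{\mu\nu}$ is what makes the non-Abelian gauge field self-interacting.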

