Theoretical and computational models for organic chemistry. Edited by S. J. Formosinho, I. G. Csizmadia, and L. G. Arnaut, NATO ASI Series C, vol. 339, Kluwer, Dordrecht, 1991

1993 ◽  
Vol 45 (4) ◽  
pp. 403-403
Author(s):  
Boris Gordeychik ◽  
Tatiana Churikova ◽  
Thomas Shea ◽  
Andreas Kronz ◽  
Alexander Simakin ◽  
...  

Abstract
Nickel is a strongly compatible element in olivine, and thus fractional crystallization of olivine typically results in a concave-up trend on a Fo–Ni diagram. ‘Ni-enriched’ olivine compositions are those that fall above such a crystallization trend. To explain Ni-enriched olivine crystals, we develop a set of theoretical and computational models describing how primitive olivine phenocrysts from a parent (high-Mg, high-Ni) basalt re-equilibrate with an evolved (low-Mg, low-Ni) melt through diffusion. These models describe the progressive loss of Fo and Ni in olivine cores during protracted diffusion for various crystal shapes and different relative diffusivities of Ni and Fe–Mg. When the diffusivity of Ni is lower than that of Fe–Mg interdiffusion, olivine phenocrysts affected by protracted diffusion form a concave-down trend that contrasts with the concave-up crystallization trend. Models for different simple geometries show that the concavity of the diffusion trend does not depend on crystal size and only weakly depends on crystal shape. We also find that the effect of diffusion anisotropy on trend concavity is of the same magnitude as the effect of crystal shape. Thus, neither diffusion anisotropy nor crystal shape significantly changes the concave-down diffusion trend. Three-dimensional numerical diffusion models using a range of more complex, realistic olivine morphologies with anisotropy corroborate this conclusion. The curvature of the concave-down diffusion trend is therefore mainly determined by the ratio of the Ni and Fe–Mg diffusion coefficients. The initial and final points of the diffusion trend are in turn determined by the compositional contrast between the mafic and more evolved melts that mixed to cause disequilibrium between olivine cores and the surrounding melt.
We present several examples of measurements on olivine from arc basalts from Kamchatka, along with published olivine datasets from mafic magmas in non-subduction settings (lamproites and kimberlites), that are consistent with diffusion-controlled Fo–Ni behaviour. In each case, the inferred ratio of the Ni to Fe–Mg diffusion coefficients is less than 1. These examples show that crystallization and diffusion can be distinguished by concave-up and concave-down trends on Fo–Ni diagrams.
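The contrast between the concave-up crystallization trend and the concave-down diffusion trend described above can be reproduced in a minimal sketch. It assumes a spherical crystal whose rim is held at a fixed evolved-melt composition and uses the classical series solution for the centre of a sphere with constant surface concentration (Crank, The Mathematics of Diffusion); all compositions, radii, and diffusivities below are schematic placeholders, not values from the study:

```python
import numpy as np

# Exact series solution for the centre of a sphere of radius a whose surface is
# held at a fixed composition: the fraction of the initial core-rim contrast
# that remains after time t.
def core_fraction_remaining(D, a, t, nterms=200):
    if t == 0:
        return 1.0
    n = np.arange(1, nterms + 1)
    return float(2 * np.sum((-1.0) ** (n + 1) * np.exp(-(n * np.pi / a) ** 2 * D * t)))

# Illustrative (hypothetical) compositions: a primitive olivine core in contact
# with an evolved melt. All units and values are schematic, not measured data.
Fo0, Fo_rim = 90.0, 80.0        # forsterite content, mol%
Ni0, Ni_rim = 3000.0, 1000.0    # Ni content, ppm
a = 1.0                         # crystal radius (nondimensional)
D_FeMg = 1.0                    # Fe-Mg interdiffusion coefficient
D_Ni = 0.25 * D_FeMg            # slower Ni diffusion, i.e. D_Ni / D_FeMg < 1

times = [0.0, 0.1, 0.2, 0.4, 1.0, 2.0]
trend = []
for t in times:
    fo = Fo_rim + (Fo0 - Fo_rim) * core_fraction_remaining(D_FeMg, a, t)
    ni = Ni_rim + (Ni0 - Ni_rim) * core_fraction_remaining(D_Ni, a, t)
    trend.append((fo, ni))

# Because Ni lags Fe-Mg, the core loses Fo before it loses Ni: every
# intermediate point plots above the chord joining the end points, which is
# the concave-down diffusion trend.
(fo_a, ni_a), (fo_b, ni_b) = trend[0], trend[-1]
for fo, ni in trend[1:-1]:
    chord = ni_b + (fo - fo_b) / (fo_a - fo_b) * (ni_a - ni_b)
    print(f"Fo={fo:5.2f}  Ni={ni:7.1f}  chord={chord:7.1f}  above={ni > chord}")
```

Setting D_Ni equal to D_FeMg instead collapses the path onto the chord, which makes clear that the concavity is controlled by the diffusivity ratio rather than by crystal size.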


Author(s):  
Fred Lacy

Electrical conductivity is a basic property of materials that determines how well a material conducts electricity. However, models are needed to explain how conductors behave as their size and temperature change. This research demonstrates and explains the importance of atomic motion in understanding the electrical conductivity of conductors (and thus the ability of metals to function as temperature sensors). A derivation is performed at the atomic level that provides a theoretical relationship between electrical resistivity, temperature, and material thickness. Computational models are then used to determine the optimal parameters for the theoretical models, as well as the conditions under which they are accurate. Comparisons with experimental data show that the models are valid and accurate.
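The kind of resistivity–temperature–thickness relationship described above can be illustrated with a short sketch. This is not the author's derivation: it combines two standard textbook relations, the linear temperature dependence of metallic resistivity (the basis of RTD temperature sensors) and the approximate Fuchs size-effect correction for a film much thicker than the electron mean free path; the material constants are approximate handbook values for copper:

```python
# Schematic illustration only, not the paper's atomic-level derivation.
RHO_0 = 1.68e-8   # bulk resistivity of copper at T0, ohm*m (handbook value)
ALPHA = 4.0e-3    # temperature coefficient of resistance, 1/K (approximate)
T0 = 293.0        # reference temperature, K
MFP = 39e-9       # electron mean free path in copper, m (approximate)

def resistivity(T, thickness):
    """Linear bulk rho(T) with an approximate thin-film size correction."""
    rho_bulk = RHO_0 * (1.0 + ALPHA * (T - T0))
    # Fuchs limit for thickness >> mean free path, diffuse surface scattering:
    # surface collisions shorten the effective mean free path in thin samples.
    return rho_bulk * (1.0 + 0.375 * MFP / thickness)

for d in (1e-6, 100e-9):
    r20, r100 = resistivity(293.0, d), resistivity(373.0, d)
    print(f"d = {d * 1e9:6.0f} nm  rho(20 C) = {r20:.3e}  rho(100 C) = {r100:.3e}")
```

The sketch reproduces the two qualitative behaviours the abstract refers to: resistivity rises with temperature, and thinner conductors are more resistive than bulk material at the same temperature.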


2013 ◽  
Vol 29 (3) ◽  
pp. 1021-1041 ◽  
Author(s):  
Jason Wu ◽  
Leonardo Dueñas-Osorio

Barring a few exceptions, most theoretical and computational models of lifeline system fragility and interdependent response to extreme events still lack calibration and validation against real events. This paper expands on this area by evaluating and calibrating a recently proposed Interdependence Fragility Algorithm (IFA) against field data observed after the 2010 Mw 8.8 offshore Maule, Chile, earthquake. The evaluation incorporates available and simulated properties of the Concepción and Talcahuano water and power networks to replicate their topology and seismic response, considering both direct damage and interdependent effects. The calibrated IFA predicts that the probabilities of exceeding the observed high connectivity losses of 0.70 (power) and 0.82 (water), if taken as limit states, are 97% and 72%, respectively. These predictions capture complex interdependent lifeline system responses reasonably well and reveal influential factors for IFA model accuracy and uncertainty reduction, enabling reliable planning, design, expansion, and maintenance of infrastructure systems in practice.
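The exceedance probabilities quoted above can be illustrated with a generic fragility-style calculation. The IFA itself is not specified here, so this sketch assumes a lognormal model for connectivity loss; the median and dispersion values are hypothetical placeholders, not calibrated parameters from the study:

```python
import math

def exceedance_probability(loss_limit, median, beta):
    """P(connectivity loss >= loss_limit) under a lognormal loss model.

    loss_limit: limit-state loss (fraction of connectivity lost)
    median:     hypothetical median of the predicted loss distribution
    beta:       hypothetical lognormal dispersion (log standard deviation)
    """
    z = (math.log(loss_limit) - math.log(median)) / beta
    # Complementary CDF of the standard normal via erfc
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Observed limit states from the Maule reconnaissance (0.70 power, 0.82 water),
# combined with placeholder distribution parameters for illustration only.
for name, limit, median, beta in [("power", 0.70, 0.85, 0.4),
                                  ("water", 0.82, 0.88, 0.3)]:
    p = exceedance_probability(limit, median, beta)
    print(f"{name}: P(loss >= {limit}) = {p:.2f}")
```

The point of the sketch is only the mechanics: a calibrated loss distribution plus an observed limit state yields an exceedance probability of the kind the paper reports.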


Author(s):  
Poornima Madhavan ◽  
Douglas A. Wiegmann

Studies have demonstrated that humans appear to apply norms of human-human interaction to their interaction with machines. Yet there are subtle differences in people's perceptions of automated aids compared to humans. We examined factors differentiating human-human and human-automation interaction, wherein participants (n = 180) performed a luggage-screening task with the assistance of human or automated advisers that differed in pedigree (expert vs. novice) and reliability (high vs. low). Dependence on advice was assessed. Participants agreed more with an automated 'novice' than with a human 'novice', suggesting a bias toward automation. This automation bias broke down when automated aids portrayed as 'experts' generated errors, leading to a drop in compliance and reliance on automation relative to humans. The results have implications for the development of theoretical and computational models of optimal user dependence on decision aids.


Author(s):  
Elias Sundström ◽  
Bertrand Kerres ◽  
Sergio Sanz ◽  
Mihai Mihăescu

1D performance prediction modeling and steady-state CFD are applied to assess a high-performance centrifugal compressor. The computed total pressure ratio is compared with experimental data obtained from a gas stand. The focus of the paper is to assess the validity range of the methodologies used. Another aim is to quantify the relative differences between experimental and predicted data and to distinguish differences in the conjectured loss budget. Overall, the RANS data show a higher degree of accuracy than the 1D model when compared with experiments. The 1D model shows comparable accuracy at the design condition but larger discrepancies at higher speedlines towards surge and choke. Component-wise parametric losses are correlated to pinpoint flow regimes with larger differences between the 1D and RANS data. The results expose significant disparities in the impeller, vaneless diffuser, and volute models, especially at off-design conditions. Improving these features in the 1D modeling would potentially improve the accuracy of the performance prediction.
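A 1D mean-line prediction of the kind compared against CFD above can be sketched in a few lines. This is a generic illustration, not the paper's model: it assumes the Euler work equation with a slip factor and a single lumped isentropic efficiency standing in for the component-wise impeller, diffuser, and volute loss models, and all numbers are illustrative rather than taken from the gas-stand tests:

```python
# Minimal 1D mean-line sketch of a centrifugal-compressor total pressure ratio.
GAMMA = 1.4     # ratio of specific heats, air
CP = 1005.0     # specific heat at constant pressure, J/(kg*K), air

def pressure_ratio(U2, slip, eta, T01):
    """Total pressure ratio from Euler work, assuming no inlet swirl.

    U2:   impeller tip speed, m/s
    slip: slip factor (tangential velocity ratio at impeller exit)
    eta:  lumped isentropic efficiency (stands in for component loss models)
    T01:  inlet total temperature, K
    """
    dh0 = slip * U2 ** 2               # Euler work per unit mass, J/kg
    dT0_ideal = eta * dh0 / CP         # equivalent isentropic temperature rise
    return (1.0 + dT0_ideal / T01) ** (GAMMA / (GAMMA - 1.0))

# Illustrative operating point for a high-performance centrifugal stage.
pr = pressure_ratio(U2=450.0, slip=0.85, eta=0.78, T01=293.0)
print(f"predicted total pressure ratio: {pr:.2f}")
```

In a real 1D code the lumped efficiency is replaced by the individual loss correlations whose off-design weaknesses the paper diagnoses; the sketch only shows where those losses enter the pressure-ratio prediction.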


Author(s):  
Grant Fisher

Computational modeling in organic chemistry employs multiple methods of approximation and idealization. Coordinating and integrating methods can be challenging because even if a common theoretical basis is assumed, the computational result can depend on the choice of method. This can result in epistemic dissent as practitioners draw incompatible inferences about the mechanisms of organic reactions. These problems arose in the latter part of the twentieth century as quantum chemists attempted to extend their models and methods to the study of pericyclic reactions. The Woodward-Hoffmann rules were introduced in the mid-1960s to rationalize and predict the energetic requirements of a number of reactions of considerable synthetic significance. Soon after, quantitative quantum chemical approaches developed apace. But alternative methods of approximation yielded divergent quantitative predictions of transition state geometries and energies. This chapter explores the difficulties facing quantum chemists in the late twentieth century as they attempted to construct computational models of pericyclic reactions. Divergent model predictions resulted in the methods used to construct computational models becoming the focus of epistemic scrutiny and dissent. The failure to achieve robust quantitative results across quantitative methods prompted practitioners to scrutinize the consequences of pragmatic tradeoffs between computational manageability and predictive accuracy. I call the strategies employed to probe pragmatic tradeoffs diagnostics. Diagnostics provides the means to probe tradeoffs between manageability and accuracy for sources of predictive divergence and to determine the reliability and applicability of approximation procedures, idealizations, and even techniques of parametrization.
Furthermore, although computing power continues to increase, and there is now a general consensus on the accuracy of high-level ab initio and density functional methods applied to pericyclic reactions, diagnostics imposes non-contingent pragmatic constraints on computational modelling. What counts as a “manageable” model is characterized by two dimensions: computational tractability and cognitive accessibility. While the former is a contingent feature of technological development, the latter is not, because cognitive skills are an ineliminable feature of computational modelling in organic chemistry.

