Modeling Process Variability in Scaled MOSFETs

2018 ◽  
pp. 285-312
Author(s):  
Samar K. Saha


1996 ◽  
Vol 59 (13) ◽  
pp. 6-9 ◽  
Author(s):  
Morris E. Potter

ABSTRACT Risk assessment is the characterization of potential adverse effects of exposures to hazards, including estimates of the magnitude of the risk, the severity of outcome, and an indication of the uncertainties involved. Because risk assessments are based on statistical and other treatments of scientific data, the quality of such assessments is only as good as the data that go into their calculation. Sources of uncertainty include scanty and/or unrepresentative data, imprecise measuring devices, systematic flaws in the data collection process, variability in host response, and difficulties in the modeling process. Sources of uncertainty tend to be different for infectious and noninfectious hazards, which has led to the use of different risk assessment approaches. The ultimate goal in using risk assessment is to provide some objective estimate of risk that can be used by the food industry and regulatory agencies to assure that foods are acceptably safe. Public confidence in the risk-assessment technique will be won by its successful application and communication.


TAPPI Journal ◽  
2018 ◽  
Vol 17 (05) ◽  
pp. 295-305
Author(s):  
Wesley Gilbert ◽  
Ivan Trush ◽  
Bruce Allison ◽  
Randy Reimer ◽  
Howard Mason

Normal practice in continuous digester operation is to set the production rate through the chip meter speed. This speed is seldom, if ever, adjusted except to change production, and most of the other digester inputs are ratioed to it. The inherent assumption is that constant chip meter speed equates to constant dry mass flow of chips. This is seldom, if ever, true. As a result, the actual production rate and the effective alkali (EA)-to-wood and liquor-to-wood ratios may vary substantially from their assumed values. This increases process variability and decreases profits. In this report, a new continuous digester production rate control strategy is developed that addresses this shortcoming. A new noncontacting near infrared–based chip moisture sensor is combined with the existing weightometer signal to estimate the actual dry chip mass feedrate entering the digester. The estimated feedrate is then used to implement a novel feedback control strategy that adjusts the chip meter speed to maintain the dry chip feedrate at the target value. The report details the results of applying the new measurements and control strategy to a dual-vessel continuous digester.
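The measurement-and-control idea in the abstract can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function names, the moisture correction, and the integral-only trim with its gain are all assumptions.

```python
# Sketch: estimate dry chip mass feedrate from a weightometer (wet mass flow)
# and a moisture sensor, then trim the chip meter speed toward a dry-mass
# target. All names and gains are hypothetical.

def dry_feedrate(wet_mass_kg_per_min: float, moisture_fraction: float) -> float:
    """Dry chip mass flow = wet mass flow x (1 - moisture fraction)."""
    return wet_mass_kg_per_min * (1.0 - moisture_fraction)

class ChipMeterTrim:
    """Integral-only feedback: nudge chip meter speed toward the dry-mass target."""

    def __init__(self, target_dry_kg_per_min: float, gain: float = 0.01):
        self.target = target_dry_kg_per_min
        self.gain = gain

    def update(self, speed_rpm: float, measured_dry_kg_per_min: float) -> float:
        # Positive error (feedrate below target) raises the chip meter speed.
        error = self.target - measured_dry_kg_per_min
        return speed_rpm + self.gain * error

# Example: 1000 kg/min of wet chips at 45% moisture is only 550 kg/min dry,
# so a controller holding meter speed constant would overestimate production.
rate = dry_feedrate(1000.0, 0.45)
trim = ChipMeterTrim(target_dry_kg_per_min=550.0)
new_speed = trim.update(speed_rpm=10.0, measured_dry_kg_per_min=500.0)
```

The point of the sketch is the abstract's core observation: constant meter speed does not imply constant dry mass flow, so the feedback must act on the estimated dry feedrate rather than on speed alone.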


2013 ◽  
Vol 58 (3) ◽  
pp. 871-875
Author(s):  
A. Herberg

Abstract This article outlines a methodology for modeling the self-induced vibrations that occur during the machining of metal objects, e.g. when shaping casting patterns on CNC machining centers. The modeling process presented here is based on an algorithm that makes use of local-model fuzzy-neural networks. The algorithm draws on the advantages of fuzzy systems with Takagi-Sugeno-Kang (TSK) consequents and of neural networks, with auxiliary modules that help optimize and shorten the time needed to identify the best possible network structure. Modeling the self-induced vibrations makes it possible to analyze how the vibrations arise. This in turn enables the development of effective ways of eliminating these vibrations and, ultimately, the design of a practical control system that would dispose of the vibrations altogether.
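For readers unfamiliar with TSK consequents, a minimal sketch of a zero-delay Takagi-Sugeno-Kang inference step may help. This is a generic textbook form, not the article's local-model network: the Gaussian memberships, the rule parameters, and the single scalar input are all illustrative assumptions.

```python
import numpy as np

# Minimal TSK fuzzy model: each rule has a Gaussian membership function
# (center c, width s) and a first-order linear consequent y_i = a*x + b.
# The output is the membership-weighted average of the rule consequents.

def gaussian(x: float, c: float, s: float) -> float:
    return float(np.exp(-0.5 * ((x - c) / s) ** 2))

def tsk_predict(x: float, rules) -> float:
    """rules: list of (center, sigma, a, b) tuples."""
    w = np.array([gaussian(x, c, s) for c, s, _, _ in rules])
    y = np.array([a * x + b for _, _, a, b in rules])
    return float(np.sum(w * y) / np.sum(w))

# Two illustrative rules with different local linear behavior.
rules = [(-1.0, 1.0, 0.5, 0.0), (1.0, 1.0, -0.5, 1.0)]
out = tsk_predict(-1.0, rules)
```

In a local-model network such as the one described here, each rule plays the role of a locally valid linear model of the vibration dynamics, and the memberships blend the models smoothly across the operating range.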


2019 ◽  
Vol 952 (10) ◽  
pp. 2-9
Author(s):  
Yu.M. Neiman ◽  
L.S. Sugaipova ◽  
V.V. Popadyev

Spherical harmonic functions are traditionally used in geodesy for modeling the gravitational field of the Earth. But the gravitational field is not stationary in space or in time (the latter is beyond the scope of this article) and can change quite strongly in various directions. By their nature, spherical functions do not fully capture the local features of the field. With this in mind, it is advisable to use spatially localized basis functions, and it is convenient to divide the region under consideration into segments with a nearly stationary field. The complexity of the field in each segment can be characterized by an anisotropic matrix resulting from a covariance analysis of the field. Modeling in this way, however, can lead to poor coherence between local models at segment borders. To solve this problem, this article proposes new basis functions that use the Mahalanobis metric instead of the usual Euclidean distance. The Mahalanobis metric, and the quadratic form generalizing it, makes it possible to take the structure of the field into account when determining the distance between points and to make the modeling process continuous.
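The substitution of the Mahalanobis metric for the Euclidean distance can be shown concretely. The sketch below is a generic illustration, not the authors' basis functions: the radial kernel and the covariance matrix are assumptions chosen to make the anisotropy visible.

```python
import numpy as np

# Sketch: a radial basis function evaluated with a Mahalanobis metric, so the
# kernel stretches along directions in which the field varies slowly.
# The covariance matrix below is illustrative, not taken from the article.

def mahalanobis(x: np.ndarray, y: np.ndarray, cov: np.ndarray) -> float:
    d = x - y
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def anisotropic_rbf(x: np.ndarray, center: np.ndarray, cov: np.ndarray) -> float:
    return float(np.exp(-0.5 * mahalanobis(x, center, cov) ** 2))

cov = np.array([[4.0, 0.0],
                [0.0, 1.0]])  # field varies more slowly along the first axis
x = np.array([2.0, 0.0])
c = np.array([0.0, 0.0])

d_m = mahalanobis(x, c, cov)   # sqrt(2^2 / 4) = 1.0
d_e = float(np.linalg.norm(x - c))  # Euclidean distance = 2.0
```

Two points two units apart along the slowly varying axis are "closer" in the Mahalanobis sense (distance 1) than in the Euclidean sense (distance 2), which is exactly the field-aware behavior the article exploits at segment borders.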


Data & Policy ◽  
2021 ◽  
Vol 3 ◽  
Author(s):  
Harrison Wilde ◽  
Lucia L. Chen ◽  
Austin Nguyen ◽  
Zoe Kimpel ◽  
Joshua Sidgwick ◽  
...  

Abstract Rough sleeping is a chronic experience faced by some of the most disadvantaged people in modern society. This paper describes work carried out in partnership with Homeless Link (HL), a UK-based charity, in developing a data-driven approach to better connect people sleeping rough on the streets with outreach service providers. HL's platform has grown exponentially in recent years, leading to thousands of alerts per day during extreme weather events; this overwhelms the volunteer-based system they currently rely upon for the processing of alerts. To solve this problem, we propose a human-centered machine learning system to augment the volunteers' efforts by prioritizing alerts based on the likelihood of making a successful connection with a rough sleeper. This addresses capacity and resource limitations whilst allowing HL to quickly, effectively, and equitably process all of the alerts that they receive. Initial evaluation using historical data shows that our approach increases the rate at which rough sleepers are found following a referral by at least 15% based on labeled data, implying a greater overall increase when the alerts with unknown outcomes are considered, and suggesting the benefit of a longer trial to assess the models in practice. The discussion and modeling process are conducted with careful consideration of ethics, transparency, and explainability due to the sensitive nature of the data involved and the vulnerability of the people who are affected.
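The prioritization step described in the abstract reduces to ranking alerts by a model's predicted probability of a successful connection. The sketch below is a generic priority-queue illustration, not HL's system: the alert fields, the scoring callable, and the processing order are all hypothetical.

```python
import heapq

# Sketch: process alerts highest-predicted-probability first. `predict_proba`
# stands in for any trained model returning a score in [0, 1].

def prioritize(alerts, predict_proba):
    """Yield (alert, score) pairs in descending score order."""
    # Negate scores because heapq is a min-heap; the index breaks ties
    # without ever comparing the alert dicts themselves.
    scored = [(-predict_proba(a), i, a) for i, a in enumerate(alerts)]
    heapq.heapify(scored)
    while scored:
        neg_p, _, alert = heapq.heappop(scored)
        yield alert, -neg_p

# Hypothetical alerts with a precomputed score field "f".
alerts = [{"id": 1, "f": 0.2}, {"id": 2, "f": 0.9}, {"id": 3, "f": 0.5}]
ranked = list(prioritize(alerts, lambda a: a["f"]))
```

Under capacity limits, volunteers work down this ranking, so the alerts most likely to result in a successful connection are handled first while lower-scored alerts still remain in the queue rather than being dropped.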

