Modeling Process Variability in Scaled CMOS Technology

2010 ◽ Vol 27 (2) ◽ pp. 8-16 ◽ Author(s): Samar K. Saha

1996 ◽ Vol 59 (13) ◽ pp. 6-9 ◽ Author(s): Morris E. Potter

Abstract. Risk assessment is the characterization of potential adverse effects of exposures to hazards, including estimates of the magnitude of the risk, the severity of outcome, and an indication of the uncertainties involved. Because risk assessments are based on statistical and other treatments of scientific data, the quality of such assessments is only as good as the data that go into their calculation. Sources of uncertainty include scanty and/or unrepresentative data, imprecise measuring devices, systematic flaws in the data collection process, variability in host response, and difficulties in the modeling process. Sources of uncertainty tend to be different for infectious and noninfectious hazards, which has led to the use of different risk assessment approaches. The ultimate goal in using risk assessment is to provide some objective estimate of risk that can be used by the food industry and regulatory agencies to assure that foods are acceptably safe. Public confidence in the risk-assessment technique will be won by its successful application and communication.
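As a loose illustration of the kind of statistical treatment this abstract alludes to (not taken from the article itself), the following Python sketch propagates uncertainty in two hypothetical inputs through a simple exponential dose-response model; every distribution, parameter value, and variable name is assumed for the example.

```python
# Illustrative sketch only (not from the article): propagating input
# uncertainty through a simple exponential dose-response model.
# All distributions and parameter values below are hypothetical.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain inputs: ingested dose (organisms per serving) and the
# dose-response parameter r, each represented by an assumed distribution.
dose = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=n)
r = rng.uniform(1e-4, 1e-2, size=n)

# Exponential dose-response model: P(illness | dose) = 1 - exp(-r * dose)
p_illness = 1.0 - np.exp(-r * dose)

# The spread of the resulting distribution expresses the uncertainty in
# the risk estimate, not just a single point value.
print(f"median risk per serving : {np.median(p_illness):.2e}")
print(f"5th-95th percentile     : "
      f"{np.percentile(p_illness, 5):.2e} - {np.percentile(p_illness, 95):.2e}")
```

Reporting a percentile range rather than a single number is one way to make the stated uncertainties visible rather than hiding them in a point estimate.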


1986 ◽ Vol 33 (11) ◽ pp. 1659-1666 ◽ Author(s): Nobuhiro Endo, N. Kasai, A. Ishitani, H. Kitajima, Y. Kurogi

Author(s): Xiaolei Zhu, Yanfei Chen, Masaya Kibune, Yasumoto Tomita, Takayuki Hamada, ...

1999 ◽ pp. 307-321 ◽ Author(s): R. Castello, I. Bietti, F. Svelto

2011 ◽ Vol 9 ◽ pp. 269-272 ◽ Author(s): J. Schleifer, T. Coenen, A. Elkammar, T. G. Noll

Abstract. Device scaling, the driving force of CMOS technology, has led to a continuous decrease in the energy levels that represent logic states. The resulting small noise margins, combined with growing problems of supply-voltage stability and process variability, create a design conflict between efficiency and reliability, and this conflict is expected to intensify in future technologies. Current research on fault-tolerant architectures and circuit-level countermeasures unfortunately incurs a significant area and energy penalty without guaranteeing the absence of errors. To overcome this problem, it seems attractive to tolerate bit errors at the circuit level and to employ error-handling methods at higher system levels. Doing so requires an estimate of the bit error rate (BER) at the circuit level, but, due to the size of the circuits, Monte Carlo simulation suffers from impractical runtimes. Therefore, a suitable modeling scheme is proposed. The model allows a probabilistic estimation of error rates at the circuit level within reasonable runtimes, taking into account statistical effects ranging from supply noise and electromagnetic coupling to process variability.
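To make the contrast between the two estimation strategies concrete (this is not the authors' model), the sketch below computes a bit error rate in two ways under the simplifying assumption that all disturbances on a node, supply noise, coupling, and process variability combined, can be lumped into a single Gaussian random variable: once by brute-force Monte Carlo sampling and once as a closed-form tail probability. All numeric values and names are illustrative.

```python
# Minimal sketch (not the authors' model): estimating a circuit-level bit
# error rate (BER) two ways, assuming the combined disturbance on a node
# is a single zero-mean Gaussian. Parameter values are illustrative only.

import numpy as np
from scipy.stats import norm

noise_margin = 0.18   # assumed static noise margin in volts
sigma_noise = 0.04    # assumed std. dev. of the lumped disturbance in volts

# 1) Brute-force Monte Carlo: draw disturbances, count margin violations.
#    Resolving a very low BER requires a huge number of samples, which is
#    why the abstract calls plain Monte Carlo impractical for large circuits.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, sigma_noise, size=10_000_000)
ber_mc = np.mean(np.abs(samples) > noise_margin)

# 2) Closed-form probabilistic estimate under the same Gaussian assumption:
#    P(|disturbance| > margin) = 2 * (1 - Phi(margin / sigma)).
ber_analytic = 2.0 * norm.sf(noise_margin / sigma_noise)

print(f"Monte Carlo BER estimate : {ber_mc:.3e}")
print(f"Analytic BER estimate    : {ber_analytic:.3e}")
```

The closed-form estimate evaluates in constant time, whereas the Monte Carlo estimate needs roughly 10/BER samples just to observe a handful of error events, which is the runtime problem the abstract refers to.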

