TY - BOOK
ID - 135532771
TI - Coping with Time-dependent Variability by a Combined Design and Technology Co-optimization
AU - Catthoor, Francky.
AU - Groeseneken, Guido.
AU - KU Leuven. Departement Elektrotechniek (ESAT)
PY - 2016
PB - Leuven : KU Leuven. Faculteit Ingenieurswetenschappen
DB - UniCat
KW - Theses
UR - https://www.unicat.be/uniCat?func=search&query=sysid:135532771
AB - Defects, both as-fabricated and generated during operation, are an inevitable reality of real-world CMOS devices. Intermittent charging of these defects during operation is responsible for many reliability degradation mechanisms, including Bias Temperature Instability (BTI), Time-Dependent Dielectric Breakdown (TDDB), Stress-Induced Leakage Current (SILC) and Random Telegraph Noise (RTN). The decreasing absolute number of defects in downscaled devices, combined with the stochastic nature of charge capture and emission, results in a drastic increase in time-dependent variability among devices of the same technology, which adds to the initial time-zero variability. This study focuses on the characterization and simulation methodology for time- and workload-dependent BTI variability in advanced CMOS technologies. Accurately assessing the implications of BTI-induced time-dependent threshold voltage distributions for the performance and yield estimation of digital circuits relies on a combined methodology comprising (I) thorough statistical characterization, (II) relevant modeling methodologies and (III) appropriate compact models and simulation tools. We show that nFET and pFET time-dependent variability, in addition to the standard time-zero variability, can be fully characterized and projected using a series of measurements on a large test element group fabricated in an advanced technology. The statistical distributions encompassing both time-zero and time-dependent variability, and their correlations, are discussed. Furthermore, a generalized compound Poisson-Exponential distribution is derived to fully describe both (unimodal) NBTI and (bimodal) PBTI distributions with great accuracy in the extreme tail regions. The added time dimension in the variability analysis, however, proves to be a considerable design challenge. The assumption of normally distributed threshold voltages, imposed by state-of-the-art (SotA) design approaches, is shown to induce inaccuracies that are readily resolved by adopting our Poisson-Exponential statistical approach. However, the non-normally distributed threshold voltage shifts create compatibility issues with current SotA statistical assessment techniques for evaluating the high-sigma yield of, e.g., SRAM cells. We therefore present a novel non-Monte-Carlo numerical simulation methodology capable of evaluating circuit performance under workload-dependent BTI degradation. In addition, we develop a practical circuit-level reliability compact model that enables fully coupled, statistically varying degradation in transient SPICE simulations. Finally, we show that assuming normally distributed BTI threshold voltage shifts, as imposed by SotA design approaches, rather than the Poisson-Exponential distribution, can significantly overestimate yield and performance after degradation for both memory and logic applications. Incorporating the appropriate statistics is crucial for accurately predicting the necessary guard bands.
Combining deterministic workloads with statistical assessment techniques will be imperative to reduce circuit margins and thereby extend technology scaling. The conclusions reported here strongly indicate that Design and Technology Co-Optimization (DTCO) will offer the solution to the reliability problems foreseen for ultra-scaled and future technologies.
ER - 