Probabilistic Design for Reliability in Electronics and Photonics: Role, Significance, Attributes, Challenges

Published In

2015 IEEE International Reliability Physics Symposium (IRPS)

Document Type

Citation

Publication Date

2015

Subjects

Electronic circuit design -- Data processing, Reliability (Engineering), Photonics, Electronics -- Statistical methods

Abstract

The recently suggested probabilistic design for reliability (PDfR) concept for electronics and photonics (EP) products is based on 1) highly focused and highly cost-effective failure-oriented accelerated testing (FOAT), aimed at understanding the physics of the anticipated failures and at quantifying, on a probabilistic basis, the outcome of FOAT conducted for the most vulnerable element(s) of the product of interest, for the most likely applications, and for the most likely and meaningful combination of possible stressors (stimuli); 2) simple and physically meaningful predictive modeling (PM), both analytical and computer-aided, aimed at bridging the gap between the obtained FOAT data and the most likely actual operation conditions; and 3) subsequent FOAT-and-PM-based sensitivity analysis (SA) using the methodologies and algorithms developed as important by-products of the two previous steps. The PDfR concept proceeds from the recognition that nothing is perfect, and that the difference between a highly reliable and an insufficiently reliable product is "merely" in the level of the probability of its field failure. If this probability (evaluated for the anticipated loading conditions and the given time in operation) is not acceptable, then an SA can be effectively employed to determine what could or should be changed to improve the situation. PDfR analysis also enables one to check whether the product is "over-engineered", i.e., superfluously robust; if it is, it might be too costly. The operational reliability cannot be low, but it does not have to be higher than necessary either; it has to be adequate for the given product and application. When reliability and cost-effectiveness are imperative, the ability to optimize reliability is a must, and no optimization is possible if reliability is not quantified. We show that optimization of the total cost associated with creating a product of adequate (high enough) reliability and acceptable (low enough) cost can be interpreted in terms of an adequate level of the availability criterion. The major PDfR concepts are illustrated by practical examples. Although some advanced PDfR predictive modeling techniques have recently been developed, mostly for aerospace applications, the practical examples addressed in this talk employ more or less elementary analytical models. In this connection, we elaborate on the roles and interaction of analytical (mathematical) and computer-aided (simulation) modeling. We also show how the recently suggested powerful and flexible Boltzmann-Arrhenius-Zhurkov (BAZ) model, and particularly its multi-parametric extension, could be successfully employed to predict, quantify, and assure operational reliability. The model can be effectively used to analyze and design EP products with a predicted, quantified, assured, and, if appropriate and cost-effective, even maintained and specified probability of operational failure. It is concluded that these concepts and methodologies can be accepted as an effective means for the evaluation of the operational reliability of EP materials and products, and that the next generation of qualification testing (QT) specifications and practices for such products could be viewed and conducted as a quasi-FOAT, an early stage of FOAT that adequately replicates the initial non-destructive segment of the previously conducted comprehensive "full-scale" FOAT.
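
For readers unfamiliar with the BAZ formalism mentioned above, the following is a minimal illustrative sketch in Python of how a probability of non-failure might be evaluated from a commonly cited form of the model, P = exp(-gamma_C * C * t * exp(-(U0 - gamma_sigma * sigma) / (k * T))). The function name, parameter names, and all numerical values are hypothetical and are not taken from the paper; they only show the structure of such a calculation.

    import math

    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

    def baz_probability_of_non_failure(t_hours, temp_kelvin, stress, u0_ev,
                                       gamma_stress, gamma_c, c_signal):
        # Probability of non-failure per a commonly cited BAZ form:
        # P = exp(-gamma_c * C * t * exp(-(U0 - gamma_stress * sigma) / (k * T))).
        # All inputs are hypothetical, for illustration only.
        effective_barrier_ev = u0_ev - gamma_stress * stress  # applied stress lowers the activation barrier
        failure_rate = math.exp(-effective_barrier_ev / (K_BOLTZMANN_EV * temp_kelvin))
        return math.exp(-gamma_c * c_signal * t_hours * failure_rate)

    # Hypothetical example: 10,000 hours at 85 C (358 K) under a 100 MPa stress.
    p = baz_probability_of_non_failure(
        t_hours=1.0e4, temp_kelvin=358.0, stress=100.0,
        u0_ev=0.8, gamma_stress=2.0e-3,   # activation energy (eV) and stress sensitivity (eV/MPa), assumed
        gamma_c=1.0e3, c_signal=1.0)      # sensitivity factor and monitored degradation signal, assumed
    print(f"Predicted probability of non-failure: {p:.4f}")

In an actual PDfR workflow, the activation energy and sensitivity factors would be extracted from FOAT data rather than assumed, and the resulting probability would feed the sensitivity analysis described in the abstract.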

Rights

© Copyright 2015 IEEE

DOI

10.1109/IRPS.2015.7112749

Persistent Identifier

http://archives.pdx.edu/ds/psu/17015
