AD-A009 282: A Comparative Analysis of Reliability Prediction Techniques



AD-A009 282
A COMPARATIVE ANALYSIS OF RELIABILITY PREDICTION TECHNIQUES
Douglas Joseph McGowen
Army Materiel Command
Texarkana, Texas
April 1975

DISTRIBUTED BY:

National Technical Information Service
U.S. Department of Commerce


REPORT DOCUMENTATION PAGE

1.  REPORT NUMBER: USAMC-ITC-02-8-75-103
2.  GOVT ACCESSION NO.
3.  RECIPIENT'S CATALOG NUMBER: AD-A009 282
4.  TITLE (and Subtitle): A COMPARATIVE ANALYSIS OF RELIABILITY PREDICTION TECHNIQUES
5.  TYPE OF REPORT & PERIOD COVERED: FINAL
6.  PERFORMING ORG. REPORT NUMBER
7.  AUTHOR(s): Douglas J. McGowen
8.  CONTRACT OR GRANT NUMBER(s)
9.  PERFORMING ORGANIZATION NAME AND ADDRESS: Department of Maintenance Effectiveness Engineering, USAMC Intern Training Center, Red River Army Depot, Texarkana, Texas 75501
10. PROGRAM ELEMENT, PROJECT, TASK AREA & WORK UNIT NUMBERS
11. CONTROLLING OFFICE NAME AND ADDRESS
12. REPORT DATE: April 1975
13. NUMBER OF PAGES
14. MONITORING AGENCY NAME & ADDRESS (if different from Controlling Office)
15. SECURITY CLASS. (of this report)
15a. DECLASSIFICATION/DOWNGRADING SCHEDULE
16. DISTRIBUTION STATEMENT (of this Report): Approved for Public Release: Distribution Unlimited
17. DISTRIBUTION STATEMENT (of the abstract entered in Block 20, if different from Report)
18. SUPPLEMENTARY NOTES: Research performed by Douglas J. McGowen under the supervision of Dr. S. Bart Childs, Professor, Industrial Engineering Department, Texas A&M University.
19. KEY WORDS (Continue on reverse side if necessary and identify by block number): Reliability, Prediction
20. ABSTRACT (Continue on reverse side if necessary and identify by block number): This report describes various reliability prediction techniques utilized in the development of an equipment. The techniques described include prediction by similar system, similar complexity, function, generic part count, stress analysis, and degradation techniques. These are described and compared with respect to their characteristics, basic assumptions, typical applications, and relative accuracy.

DD FORM 1473

Reproduced by NATIONAL TECHNICAL INFORMATION SERVICE, U.S. Department of Commerce, Springfield, VA 22151. PRICES SUBJECT TO CHANGE.


    FOREWORD

The research discussed in this report was accomplished as part of the Maintenance Effectiveness Engineering Graduate Program conducted jointly by the USAMC Intern Training Center and Texas A&M University. The ideas, concepts, and results herein presented are those of the author and do not necessarily reflect approval or acceptance by the Department of the Army.

This report has been reviewed and is approved for release. For further information on this project contact Ron Higgins, Intern Training Center, Red River Army Depot, Texarkana, Texas 75501.

    Approved:

RONALD C. HIGGINS
Department of Maintenance Effectiveness Engineering
For the Commander

JAMES L. ARNETT, Director, ITC


ABSTRACT

Research Performed by Douglas J. McGowen
Under the Supervision of Dr. S. Bart Childs

This report describes various reliability prediction techniques utilized in the development of an equipment. The techniques described include prediction by similar systems, similar complexity, function, generic part count, stress analysis, and degradation techniques. They are described and compared with respect to the characteristics, basic assumptions, typical applications, and relative accuracy.


    ACKNOWLEDGEMENTS

For serving as the chairman of my committee and offering many helpful suggestions, I extend my appreciation to Dr. S. Bart Childs. For serving as members of my committee I wish to thank Dr. R. McNichols and Dr. J. Foster. I would especially like to thank my sister, LaVerne, for her support as well as her invaluable assistance in the composition, editing, and typing of this paper.

During the course of this work, the author was employed by the U.S. Army as a career intern in the AMC Maintenance Engineering Graduate Program. He is grateful to the U.S. Army for the opportunity to participate in this program.

The ideas, concepts, and results herein presented are those of the author and do not necessarily reflect approval or acceptance by the Department of the Army.


CONTENTS

Chapter                                                      Page

I    INTRODUCTION ..........................................    1
       Scope ...............................................
       Chapter Description .................................    8

II   LITERATURE SURVEY .....................................   10

III  EARLY PREDICTION METHODS ..............................   16
       Similar Systems .....................................   16
       Similar Complexity ..................................   18
       Function ............................................   21
       Table 3.1 ...........................................   37

IV   LATER PREDICTION METHODS ..............................   45
       Generic Part Count ..................................   45
       Stress Analysis .....................................   50
       Degradation Analysis ................................   54

V    SUMMARY AND RECOMMENDATIONS ...........................   58

APPENDIX ...................................................   64
LIST OF REFERENCES .........................................   84


    CHAPTER I

    INTRODUCTION AND MECHANICS

INTRODUCTION

Reliability is defined "as the probability that a system will perform satisfactorily for at least a given period of time when used under stated conditions."(2)* This report discusses the area of reliability prediction, which is defined as "the process of forecasting, from available failure rate information, the realistically achievable reliability of a part, component, subsystem, or system and the probability of its meeting performance and reliability requirements for a specified application."(1) Reliability predictions provide a source of quantitatively assessing system reliability before actual hardware models are constructed and tested. In the field of reliability analysis one is often faced with the problem of choosing a reliability prediction technique suitable for his specific problem.

*A parenthesized number refers to the source listed in the references at the end of this paper.


The real value of any numeric expression lies not in the number itself, but in the information it conveys and the use made of that information. To some, reliability prediction is a numbers game that can produce any number required. To others it is a frustrating experience and a nightmare. To those qualified in reliability engineering, predictions are no mystery, and they can be accurate and meaningful if existing knowledge and techniques are properly applied. Reliability predictions do not, in themselves, contribute significantly to the reliability of a system. Rather, they constitute criteria for selecting courses of action that affect reliability. ()

According to Feduccia(7), many times in the preparation and evaluation of a proposal, the main goal is the determination of a reliability number for the equipment. In the hurry to obtain this number, the limitations of the techniques used are often and very easily overlooked. Feduccia states this has a two-fold handicap: (1) reliability prediction errors because of technique misapplication, and (2) lack of feedback on possible technique improvements because their limitations are either overlooked or otherwise tolerated as part of the


game. Tiger(27) states that the "use of reliability prediction has been handicapped by overemphasis on simplicity of technique. Reliability prediction consists of serial block diagrams and addition of part failure rates only for elements which are individually required for mission success; otherwise more complex probabilistic analysis is necessary." He further comments that some people still think of reliability prediction in terms of simple serial block diagrams and addition of part failure rates; however, the use of multi-functional systems, redundancy, special maintenance policies, new electronic devices, non-electronic hardware, and complex mission requirements in which different functions are required in the various mission phases emphasizes the need for greater depth in reliability modeling. Keene(10) states that "too often there is overemphasis on obtaining predictions of a system's reliability and refining them to account for small changes or clarifications of the hardware. The truth is that the use of failure rates, the development of models of a system's reliability and the mechanics of generating reliability predictions are helpful only as they affect decisions about design, and their value can be judged only


by that criterion." The optimum utilization of reliability prediction requires a complete understanding of the types of prediction possible and the implications for their use at various stages of hardware development. During early planning phases little is known about the system or its contents, so prediction at this stage must be based on preliminary information with many assumptions. As the development progresses and the design becomes firm, considerably better information becomes available and many assumptions can be replaced with known facts or with better educated guesses based on specific details of the case.

Predictions are valuable when deciding among alternative designs. Several designs for a particular system or subsystem may be under consideration, and one of the factors which influences the choice is their relative reliabilities. This is equally true when various types of redundancy are being compared and when different components are being considered. Predictions also indicate those components and assemblies which contribute most toward total system failure probability. By defining potential problem areas, necessary corrective action can be taken to


reduce the probability of failure and improve the system. A concentrated design effort can be made where it will be most beneficial to the system. Other corrective actions such as application of redundancy, change in components, incorporation of periodic maintenance procedures, and strict control of manufacturing processes and inspection can be carried out. Also, when the prediction indicates that, with normal development effort, the reliability objective will be met easily, unnecessary development costs can be saved and funds can be transferred to other areas which may require more concentrated effort.(1)

Several reliability prediction techniques, varying in level of complexity and detail of application, are available to the reliability engineer. In general, the techniques in current use provide means for predicting total equipment or system reliability as a function of its defined design or functional characteristics. Also, to the extent possible considering the maturity of available data, most prediction techniques consider the statistical distribution of failures to permit reliability evaluation in a quantitative manner. During the life cycle time span,


data describing the system evolves from a qualitative description of system functions to detailed specifications and drawings suitable for hardware production. Therefore, reliability prediction techniques have been developed to accommodate the different reliability study and analysis requirements as the system design progresses.

SCOPE

The intention of this paper is to present and compare different prediction techniques that are utilized through the system development of the hardware. The techniques were chosen on the basis of current usage. It is essential that proper techniques be used in each design phase. The various techniques are discussed by the author because they are currently being used by industry and government. System development tasks involving significant prediction activity can be classified in seven general task categories: (1) feasibility study, (2) allocation study, (3) design comparison, (4) proposal evaluation, (5) trade-off study, (6) design review, and (7) design analysis. These task categories differ in principal objective, and are performed at different stages of the development program. However,


    the associated reliability predictions are performed using

essentially the same basic procedure for any task category; the major difference in procedure is dictated by the particular phase of the system life cycle rather than by the objective of the task.

For purposes of this paper, the following six categories of prediction techniques are discussed:

(a). similar equipment techniques--The equipment under consideration is compared with similar equipment of known reliability in estimating the probable level of achievable reliability.

(b). similar complexity techniques--The reliability of a new design is estimated as a function of the relative complexity of the subject item with respect to a "typical" item of similar type.

(c). prediction by function techniques--Previously demonstrated correlations between operational function and reliability are considered in obtaining reliability predictions for a new design.

(d). part count techniques--Equipment reliability is estimated as a function of the number of parts, in each of several part classes, to be included in the equipment.


(e). stress analysis techniques--The equipment failure rate is determined as an additive function of all individual part failure rates, considering the part type, operational stress level, and derating characteristics of each part.

(f). degradation analysis techniques--Circuit tolerances, parameter drift characteristics, part variation, and other factors are considered together with stress levels in predicting the probability of circuit malfunction due to wearout or other types of degradation.

Reliability prediction techniques in each of these categories are described and compared in the subsequent chapters.

In the discussion of each technique the following format is followed:

(a). the characteristics
(b). basic assumptions
(c). typical applications
(d). relative accuracy.

CHAPTER DESCRIPTION

In Chapter II, the author has reported on any work


already done in the comparison and analysis of prediction techniques. Any missing references on his part are an oversight and not intentional.

Chapter III is primarily allotted to the techniques used during the feasibility studies, allocation studies, and proposal evaluation of the hardware. This breakout by the author resulted primarily from his review of references (17), (13), (21), and (7).

A discussion of the techniques employed during the design comparison, proposal evaluation, trade-off studies, design review, and design analysis of the hardware is covered in Chapter IV. Chapter V is delegated to a general discussion of the techniques and a critical look at reliability prediction in general. An effort is made to suggest ideas and problems to be investigated in the area of reliability prediction.


CHAPTER II

LITERATURE SURVEY

Much of the work relating to reliability has been done for the Department of Defense and various industries. Although an abundance of reliability information is available, no cumulative report has been prepared contrasting and analyzing the prediction techniques. In his report (7), Feduccia reviews two classes of prediction techniques. He refers to these as the feasibility (or ballpark) prediction procedures and the design (or stress-analysis) prediction method.

The design prediction methods were initially reported by the Rome Air Development Center/Radio Corporation of America, ARINC, and VITRO. These techniques consist essentially of compilations of part failure rate information which is based on the premise that after the period of infant mortality and prior to wearout, the failure rate is constant. The part failure rates are usually expressed as a function of electrical and thermal stress, so before they can be assigned, the stresses under which each part in the system is operating must be determined. After rates are


calculated, they are combined in some appropriate model to yield a failure rate for the system. He stresses that the stress-analysis prediction method should begin in the design stage and continue through prototype development. A serious shortcoming of this technique lies in the quality and nature of the data base. Another shortcoming lies in the time required to make the prediction. According to Feduccia, feasibility (or ballpark) prediction techniques provide a quick estimate within the "ballpark" of final reliability figures. These methods require only that the number of tubes or total parts in the system be known. Although a feasibility prediction cannot replace the detailed stress-analysis procedure, it can assist the design engineer in estimating the reliability of a proposed design and provides management with a valuable tool for use in preliminary planning. In 1963, the ARINC Research Corporation and the Federal Electric Corporation completed a RADC-sponsored effort aimed at developing a series of techniques for reliability prediction by function. These studies were addressed to defining equipment functions (such as power output, frequency, signal-to-noise ratio) and to determining correlations between the functions and


    "equipment MTBF. With this series of techniques, thedesigner can obtain a MTBF estimate fo r his circuit or sub-system as soon as he knows the appropriate functions,usually in the very early planning stages of design. Thesystems studied were all ground-based and includedcommunications and radar receivers, transmitters andassociated equipment. These efforts were followed, in 1965,by a revision of the original equations, and the develop-ment of a prediction by function technique for dataprocessing equipment. (17)

This series of techniques has some limitations. They are restricted by the data available from field experience. An effort has been made in this area by Bird Engineering-Research Associates, Inc. for the Aerospace Corporation. This effort resulted in the publication of a handbook containing "A Series of Techniques for the Prediction of Reliability Throughout the Life Cycle of Ballistic and Space Systems from Design to Development."(9) This technique, developed for use in the conceptual phase of system development, includes methods for predicting the reliability of structural, propulsion, mechanical, and electronic subsystems. Parameters found to correlate with


system reliability are vehicle weight (for structural subsystems), missile thrust (for propulsion subsystems), and complexity (for mechanical and electrical subsystems). Each of these parameters is known, or can be reasonably estimated, in the conceptual phase of system development; therefore, the techniques are useful as planning and management tools for forecasting early the probability of failure in new designs.

In his article, "The Status of Reliability Prediction," Clifford M. Ryerson (21) discusses the accuracy of twelve types of reliability prediction techniques. His breakout is given in Table 2.1. He gives a brief description of each technique but does not give a comparison and contrast of the various techniques. Some of his techniques are true prediction techniques; others are techniques related to prediction for project control or forecasting.

Military-Standard-Handbook 217A (14) discusses several prediction methods that are now being used. There is a discussion of how to perform prediction of reliability during early stages of equipment design following certain guidelines. There is also a section dealing with reliability data and stresses that are associated with parts.


The Rome Air Development Center (17) takes a general look at reliability and reliability prediction. U.A. Walter, in his paper, "Reliability as a Total Concept," (29) makes a brief mention of various prediction methods and in most cases gives several steps to use when utilizing the techniques he discusses. The techniques used in the feasibility studies, allocation studies, and proposal evaluation of the hardware are discussed in the next chapter. These phases of the hardware are critical areas as far as reliability is concerned. Here the proper techniques must be implemented in order to guide the project in the right direction and avoid costly refit or redesign of the system. It can also help to select the right alternative if there is more than one design consideration.


TABLE 2.1. RYERSON, "The Status of Reliability Prediction," 1969.

Prediction Technique                            Optimum Project Phase
1.  Comparison of Similar Systems               During Project Planning
2.  Standardized Typical System Reliability     During Project Planning
3.  Comparison of Similar Circuits
    (and Apportionment)                         During System Design
4.  Active Element Group Count                  During Equipment Design
5.  Generic Type Part Count                     During Equipment Design
6.  Detail Part Estimated Stress Analysis       During Prototype Perfection
7.  Detail Stress Analysis                      During Prototype Perfection
8.  Deficiency Technique                        Production Unit Testing
9.  Simulated Operation (Part Screening)        Production Unit Testing
10. Environmental Testing                       System Testing
11. AGREE Testing                               System Testing
12. Field Confirmation                          Specified Use


    CHAPTER III

EARLY PREDICTION METHODS

In the early, pre-design stage of system development, the designer needs some method of estimating the system reliability. This chapter is concerned with the various techniques utilized during the pre-design stage to predict system reliability. Its main concern is with techniques used in the feasibility studies, allocation studies, and proposal evaluation of the hardware. In the initial stages of development, very little is known about the system configuration.

SIMILAR SYSTEMS

One of the most fundamental techniques employed is the similar system method. Often a preliminary estimation is required prior to the total definition of a system. At this stage little specific reliability information is available and the prediction must be based on the state-of-the-art in similar systems plus appropriate specific assumptions. The primary characteristic of this method is that it compares new equipment with similar equipment


already in use. It is usually expressed in terms of mean time between failures (MTBF), failure rate, or similar parameters. The basic underlying assumption in this method of prediction is an orderly evolution of equipment and that similar equipment exhibits similar reliability.(17) Among the factors that are taken into consideration are system design, performance, manufacture, physical comparison, and project similarities. Collecting data from existing equipment under similar environmental and operating conditions should also be taken into consideration.

In drawing conclusions about reliability utilizing this technique, care should be exercised in using the results. At best, only a "ballpark" estimate is given. Calabro (5) refers to it as a guess. An example of the use of this technique is shown by Joseph Fragola in his paper, "An Illustration of Bayesian Analysis of a Weibull Process." (18) Another example of the use of this technique is the NAVSHIP 93820 (Method A) technique. (14) This method requires finding the nearest equivalent system and noting its failure rate. Appendix A shows four steps to follow when utilizing this technique.
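As a minimal sketch of this method (not drawn from Appendix A), a similar-system prediction amounts to a table lookup against fielded equipment of known reliability; the reference systems and failure rates below are hypothetical.

```python
# Hypothetical reference data for the similar-system method: fielded systems
# and their observed failure rates (assumed units: failures per 10**6 hours).
reference_failure_rates = {
    "UHF receiver, shipboard": 120.0,
    "VHF transmitter, shipboard": 310.0,
    "search radar, ground": 890.0,
}

def predict_by_similar_system(nearest_equivalent: str) -> float:
    """Assign the new equipment the failure rate of its nearest equivalent
    system already in use, as in NAVSHIP 93820 (Method A)."""
    return reference_failure_rates[nearest_equivalent]

# A new shipboard UHF receiver is assigned its predecessor's failure rate.
print(predict_by_similar_system("UHF receiver, shipboard"))   # 120.0
```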


SIMILAR COMPLEXITY

Another method employed during this period of development is the similar complexity prediction method. This prediction method has been developed as a result of analyses which tend to show a correlation between equipment complexity and reliability.(14) The most common technique involves the use of graphical procedures relating failure rate to active element group (AEG) count. The primary characteristic of this method is that equipment reliability is estimated as a function of the relative complexity of the subject item with respect to a "typical" item of a similar type. The primary assumption is that a correlation exists between reliability and complexity. (17) Adequate detail of the equipment design must be available in order to estimate the number of active elements in the equipment. (29) AEG's are defined as tubes, transistors, or other active element items and their associated circuitry. Because of the different environments in which the equipment will be operated, provisions are made for compensation.

A typical application of this method is given by Military-Standard 756A. (13) This is a graphical


application of this technique. Application of the technique involves determining the number of active element groups in each functional block of the equipment. The graph published in MIL-STD 756A is for determining reliability in terms of mean life between failures. The graph includes two bands indicating the probable range of achievable reliability for equipment to be operated in airborne and shipboard environments (Appendix B). The higher MTBF values for a given number of series active elements represent the level of reliability that can be achieved with good reliability engineering and design effort. The procedure to determine functional complexity, from MIL-STD 756A, is: "For each functional block, estimate the number of active elements necessary to perform the function. An active element shall be considered to be an electron tube or transistor, except that ten computer diodes and associated circuitry shall be considered equivalent to one active element in digital computers. To date, insufficient data have been collected to permit a further breakdown between equipment containing tubes, transistors, micro modules, and integrated circuits. Determine the corresponding failure rate of each block for the number of


active elements by using chart 1."

Another technique in this class is the NAVSHIP 93820 (Method B) technique. This provides failure rates per active element per equipment type. (14) This implies that an active element in a particular piece of equipment would have a different failure rate from an active element in another equipment because of different usage and a varying number of types of associated parts. Another method in this class is a refinement of NAVSHIP 93820 (Method A). (14) The procedure is to multiply the ratio of the number of active elements in the new equipment to the number of active elements in the old equipment by the old equipment failure rate. This yields the predicted failure rate for the new equipment. Another approach is to estimate the relative complexity of the new to the old equipment by the number of modules or circuits. From this an estimate might be made that the new equipment will be one and one-half times more complex than the old. Therefore, the failure rate of the new equipment would be one and one-half times the failure rate of the old equipment. (14) Since recent studies of state-of-the-art equipment (integrated circuits) show the old methods do not work, a


weighting factor is incorporated. Bird Engineering-Research Associates (9) have developed two methods similar to MIL-STD 756A. The only difference is that a procedure for determining AEG's is given.
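A minimal sketch of the ratio refinement of NAVSHIP 93820 (Method A) just described; the AEG counts, the old-equipment failure rate, and the weighting factor are illustrative assumptions, not values from the cited sources.

```python
def predict_by_relative_complexity(old_failure_rate: float,
                                   old_aeg_count: int,
                                   new_aeg_count: int,
                                   weighting_factor: float = 1.0) -> float:
    """Scale the old equipment's failure rate by the ratio of active element
    group (AEG) counts; the optional weighting factor stands in for the
    compensation mentioned in the text for newer part technologies or
    different operating environments."""
    return old_failure_rate * (new_aeg_count / old_aeg_count) * weighting_factor

# Illustrative numbers only: an old design with 200 AEGs and a failure rate of
# 500 failures per 10**6 hours, and a new design estimated at 300 AEGs.
print(predict_by_relative_complexity(500.0, 200, 300))   # 750.0
```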

FUNCTION

The final prediction technique for this stage of development is called prediction by function. The basic assumption is that there exists a correlation between reliability and function as well as between reliability and complexity. Prediction by equipment characteristics refers to the estimation of the reliability of an equipment using equations formulated on the basis of equipment class and similarity of functional performance. (12) The application of this prediction method is dependent on the availability of sufficient data which relate the reliability achievements of similar systems and their functional performance characteristics. The characteristics of interest are those which affect reliability significantly and can be quantified early in the design phase. (29) The data should be applicable to the latest design standards when operated under comparable environmental conditions.

The Federal Electric Corporation initiated a study to


determine the reliability function equations in 1963. (24) In 1965 they expanded and refined their initial reliability-by-function techniques. The equipment under study were radar systems, communication systems, and electronic data processing systems. (25) Specific systems under study were selected on the basis of available field operational data, on the general range of the individual systems' performance characteristics, on consideration of current and standard design practices, and on technical development in the state-of-the-art. The scope of the effort was restricted to a selected number of systems which met the criteria for selection. As a result the correlation analysis could not be developed across the complete range of system types within each category. (25) The resultant methods should be used carefully, especially in applications involving systems of a type not included in the data base used to perform the correlation analysis.

Each system studied was divided into functional parts that represented a distinct operation necessary for the system to perform its required mission. For example, radar was divided into a transmitter, a receiver, and an indicator, representing the principal radar functions of


transmit, receive, and indicate. This method gave a greater degree of freedom than simply studying the whole system. It also provided a logical basis for study of the correlations between certain characteristics peculiar to the function under investigation and the reliability of the functional hardware. Data was gathered from files on systems for which the Federal Electric Corporation has or had operational and maintenance responsibility. In analyzing the data, a failure was defined as a detected cessation of ability to perform a specific function within previously established limits. Development of correlations using actual field reliability figures in terms of mean time between failures was the basic approach followed, except in the case of radar receivers and transmitters. Here failure rates were used and were normalized to minimize the effect of complexity. A functional characteristic was defined as an "equipment characteristic which might be expected to have a significant relationship to reliability." (25) A regression analysis involves determining which of an equation's coefficients best fit the form of the equation to the observed data. To accomplish this, it is necessary first to develop the form of the equation and the measures


of the independent variables.

Three types of correlation analysis were used. These included Rank Order Correlation, Linear Correlation of Two Variables, and Multiple Correlation with Two Independent Variables.

The Rank Order Correlation was used to make initial comparisons between each functional characteristic and functional reliability. It was used as a screening procedure. Linear Correlation of Two Variables was used to fit a least squares line to the graphical plot of an independent variable and a dependent variable. Once it has been determined that a linear relation exists, this statistical procedure is employed to determine the "best line" through the points. The value of this statistical procedure is that it gives a mathematical check on the validity of assuming a straight line. Having derived a least squares line, it is possible to determine confidence bands around it. The Multiple Correlation allows for the simultaneous calculation of the linear effects of two independent variables on one dependent variable.
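The following short sketch illustrates the two-variable least-squares step with numpy; the characteristic and MTBF values are invented for illustration and do not come from the Federal Electric Corporation data.

```python
import numpy as np

# Hypothetical observations: one functional characteristic (x) versus the
# observed MTBF (y) for several fielded systems.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([1500.0, 1350.0, 1100.0, 800.0, 450.0])

# Linear Correlation of Two Variables: fit the "best line" y = a*x + b
# through the points by least squares.
a, b = np.polyfit(x, y, deg=1)

# The correlation coefficient provides the mathematical check on the
# validity of assuming a straight line.
r = np.corrcoef(x, y)[0, 1]
print(f"slope={a:.1f}, intercept={b:.1f}, r={r:.3f}")
```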

The verification of this technique was to take the actual field reliability (MTBF or failure rate) of an


equipment not used in the original correlation and compare it with the value predicted using the resultant prediction equation. (25)

Seven pulsed radar types were used in the development of the equations. Correlation analyses were performed on radar using the data and characteristics given in Appendix C. These are primarily for example and will not be shown for other equipment. Correlation was found between the characteristic of peak power and the normalized failure rate of radar receivers. Normalized failure rate is arrived at by dividing field failure rate by the number of active element groups existing within a function. Again, an active element group is defined as an electron tube and its associated parts. The equation for this is given by:

    Normalized failure rate = 4.17 (Pp)^0.32                    (3.1)

where Pp is peak power. The same approach was taken with radar transmitters. Again a correlation was shown between normalized failure rate and peak power, given by:

    Normalized failure rate = 9.06 (Pp)^0.36                    (3.2)

The above results led to a study of the relationship of the combined functions of receiver and transmitter. Again


the relationship was given by determining a correlation between normalized failure rate and peak power and was given by:

    Normalized failure rate = 6.3 (Pp)^0.30                     (3.3)

The resultant value is then multiplied by the number of active element groups anticipated in the design of the complete function. The verification of this was done using the actual field MTBF of one type of radar. The predicted MTBF was 231 hours. The actual field MTBF was 237 hours. This fell within the confidence limit bounds.

The correlation analysis indicated a relationship between maximum pulse width and MTBF for radar displays. The relationship is given by the line:

    MTBF = 1483 + 3314 log10(T)                                 (3.4)

where T is the maximum pulse width. In verification, the field MTBF was 1533 hours while the predicted value was 1483 hours. Again it fell within the confidence interval.
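A minimal sketch of applying equations (3.3) and (3.4) as reconstructed above; the peak power, AEG count, and pulse width are invented inputs, and the conversion from a summed failure rate to MTBF assumes rates in failures per 10^6 hours, which the excerpt does not state explicitly.

```python
import math

def radar_rcvr_xmtr_mtbf(peak_power: float, aeg_count: int) -> float:
    """Equation (3.3): normalized failure rate = 6.3 * Pp**0.30, multiplied by
    the number of AEGs anticipated in the combined receiver/transmitter
    function.  Rates assumed to be in failures per 10**6 hours."""
    normalized_rate = 6.3 * peak_power ** 0.30
    total_rate = normalized_rate * aeg_count
    return 1.0e6 / total_rate                       # MTBF in hours

def radar_display_mtbf(max_pulse_width: float) -> float:
    """Equation (3.4): MTBF = 1483 + 3314 * log10(T)."""
    return 1483.0 + 3314.0 * math.log10(max_pulse_width)

# Illustrative inputs only -- not values taken from the original study.
print(radar_rcvr_xmtr_mtbf(peak_power=1000.0, aeg_count=100))   # about 200 hours
print(radar_display_mtbf(max_pulse_width=2.0))                  # about 2481 hours
```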

As in the case of radar, the communication system was classified into two functions, receiver and transmitter. The only firm correlation that was developed existed between MTBF and maximum noise figure. The noise figure of a receiver is a measure of the extent to which the noise


appearing in the receiver output, in the absence of a signal, is greater than the noise that would be present if the receiver were a perfect receiver from the point of view of generating the minimum possible noise. The resultant expression is given by:

    MTBF = 2889 e^(-0.136 Nf)                                   (3.5)

where Nf is the maximum noise figure in decibels. In verification, the calculated value of MTBF was 1743 to 2210. The predicted value of MTBF was 1920. This fell within acceptable confidence limits. Correlation analysis of the reliability of communications transmitters and the applicable characteristics indicated a relationship between power gain and actual MTBF. The resultant equation is given by:

    MTBF = 6769 G^(-0.624)                                      (3.6)

where G is power gain. In verification, field MTBF was 903 hours and predicted was 867 hours, which fell inside the confidence limits.

Again the correlation method was used in analyzing multiplex systems, giving a relationship between MTBF and the number of channels being used. The following equation is:

    MTBF = 783 e^(-0.0178 C)                                    (3.7)


where C is the number of channels. Again, selecting a set to use in verification showed an actual MTBF of 764 hours with a predicted MTBF of 696 hours.
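As a hedged illustration of equation (3.7), using an assumed channel count rather than one from the study: a 12-channel multiplex set would be predicted at MTBF = 783 e^(-0.0178 x 12) = 783 e^(-0.214), or approximately 632 hours.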

The FEC's approach to electronic data processing systems was basically the same as that utilized for radar and ground communication systems. That is, several representative EDP systems were investigated in order to determine logical functional breakdowns and associated characteristics which appear suitable for correlation with reliability. (25) For their study the FEC chose to divide EDP systems into two general areas: (1) Central Processor and (2) Peripheral Equipment.

The Central Processor is composed of functional units which control the operation of the input, output, and buffer equipment; mathematically operate on the data it receives; extract required data from memory; store resultant data; and ultimately control transmission to the output equipment. In other words, it consists of the arithmetic, control, and storage, or main memory, functions. The peripheral equipment was defined as the input, output, buffer, auxiliary storage, and other equipment not under the direct control of the central processor. Again


validity was checked by comparing it to a system in use. From a correlation analysis, a relationship was found between the central processor MTBF and the ratio of word size in bits to the add time in microseconds. The equation is given by:

    MTBF = 524 e^(-0.135 W/A)                                   (3.8)

where A is the add time, or time for one addition, and W is the word size. The predicted MTBF was 405 hours while the field MTBF was 328 hours. The predicted value fell within the confidence bands. No discussion of results was given on the peripheral equipment.

One other application of prediction by function is the Hughes Study. (15) Here, equipment was divided into six groups: radar, cathode ray tube, display, radio transmitter, radio receiver, and buffering. It represented a total of fifty-two equipments, all ground or shipboard. The data was made up of ARINC's as well as Hughes's data. Thirty equipment characteristics were studied, and although some were the same as ARINC's, the majority were different. Regression analyses were done with the aid of a computer, and three to four equations resulted for each equipment type. As an example, the following equations were developed for pulse radar:


    (a). θ = 159.4 - 20.5 X1                                    (3.9)
    (b). θ = 214 - 6.9 X1 - 24 X2                               (3.10)
    (c). θ = 50 - 10 X1 + 1.5 X2 - 1.5 X3 + 1.24 X4             (3.11)

where:
    θ  = predicted MTBF (hrs)
    X1 = peak power output (KW)
    X2 = average output (KW)
    X3 = prime power input (KW)
    X4 = HDBK-217 prediction (hrs)

The equations are arranged in the order of their use: since peak power output would be the first characteristic known, equation (a) would be used first; equation (b) would be used second, since the next characteristic normally determined is average output; and similarly for the remaining equations. Similar sets of equations were determined for the other five equipment categories.
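A minimal sketch of that order of use, with the coefficients as reconstructed above (the garbled scan leaves some of them uncertain); the function simply selects the most detailed of equations (3.9) through (3.11) that the known characteristics allow.

```python
def hughes_pulse_radar_mtbf(peak_kw, avg_kw=None, prime_kw=None, hdbk_217_hrs=None):
    """Predicted MTBF (hours) for pulse radar from the Hughes study equations,
    chosen in their stated order of use as characteristics become known."""
    if avg_kw is None:
        return 159.4 - 20.5 * peak_kw                          # equation (3.9)
    if prime_kw is None or hdbk_217_hrs is None:
        return 214.0 - 6.9 * peak_kw - 24.0 * avg_kw           # equation (3.10)
    return (50.0 - 10.0 * peak_kw + 1.5 * avg_kw
            - 1.5 * prime_kw + 1.24 * hdbk_217_hrs)            # equation (3.11)

# Illustrative inputs only.
print(hughes_pulse_radar_mtbf(peak_kw=2.0))                    # early estimate
print(hughes_pulse_radar_mtbf(peak_kw=2.0, avg_kw=0.5))        # refined estimate
```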

    In his paper, "A Methodology fo r Software ReliabilityPrediction and Quality Control," (22) Schneidewindpresents a procedure to follow in order to arrive at aprediction function for software reliability. His approachis basically to determine a reliability function based onshape of frequency function of empirical data, estimatethe parametcrs, identification of reliability function by


using goodness of fit tests, estimate parameter confidence limits, estimate function confidence limits, predict reliability at the various intended operating times, and finally compare the required reliability with the predicted reliability. (22) He notes that the implementation is complicated by the fact that the time between troubles per fixed time interval is not a stationary process with respect to test time. As a result of a reduction in trouble rate as testing continues, the form of the distribution may remain the same over time but parameter values may change, or the actual form of the distribution may change. This means that a reliability function which is based on the total number of data points collected over the entire test period may not be an accurate predictor, because the data set is non-representative of the current state of the error occurrence process. (22) He suggests that a smoothing technique could be used on the most current data points in order to obtain parameter estimates that would apply the next time. In other words, the parameters would be updated as testing continues. If the form of the distribution changes, he notes that it would further complicate the problem in that the most appropriate form of


distribution would be needed at each test period. He gives an example utilizing his technique, assuming an exponential distribution form, in reference 18. A method similar to the Federal Electric Corporation's was used by the Naval Air Development Center for non-avionics. (11)
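A minimal sketch of the flavor of this procedure, assuming (as in Schneidewind's own example) an exponential form; the interarrival times are invented, and using only the most recent observations is a crude stand-in for the smoothing he suggests.

```python
import numpy as np

# Invented times between detected software troubles, in test hours.
times_between_troubles = np.array([3.0, 5.0, 2.0, 8.0, 6.0, 11.0, 9.0, 15.0])

# Parameter estimation for the exponential form: the maximum-likelihood
# estimate of the trouble rate is the reciprocal of the sample mean.
# Restricting the fit to recent points reflects the non-stationarity noted
# in the text (the trouble rate falls as testing continues).
recent = times_between_troubles[-5:]
rate = 1.0 / recent.mean()

# Predicted reliability over an intended operating time t, to be compared
# with the required reliability.
t = 10.0
predicted_reliability = np.exp(-rate * t)
print(f"estimated rate={rate:.3f} per hour, R({t:.0f} h)={predicted_reliability:.3f}")
```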

The ARINC Research Corporation (24) initiated a study to develop a technique for predicting the reliability of ground electronic systems prior to the actual design of the system. Their approach also was based on determining the relationships between system reliability and some of the more general characteristics of the system. One major requirement was that the resulting prediction be representative of the state-of-the-art. Therefore, observed field data was the foundation of the investigation. The relationships between reliability and system parameters were determined by a multiple regression analysis. A linear model of the form (24)

    Y = B0 + B1X1 + B2X2 + ... + BnXn                           (3.12)

was used, where X1, X2, ..., Xn represent system characteristics, B1, B2, ..., Bn are the true regression coefficients relating these characteristics to reliability, and Y is a measure of reliability. In this investigation ln θ was used


so that the basic regression equation became

    ln θ = b0 + b1X1 + b2X2 + ... + bnXn                        (3.13)

where bi is a statistical estimate of Bi, and θ is the predicted mean time between failures of a system. ln θ and X1 ... Xn are known for each system, and solving the equations, one for each system, gives the best coefficients b0, b1 ... bn. In a regression analysis with N parameters (in the optimal case N independent parameters) some of the parameters may not be significant, so a portion of the task was to determine which parameters are related to reliability. Because of limited capacity, runs were made with subsets of system parameters. Volume II (25) of this report presents a step-by-step procedure for using this technique. The final supplement to this report is given by reference 4. One assumption made is that the mean time to failure is exponentially distributed. Four equations were arrived at, given below:

    I.   ln θ = 7.173 - 0.801X1 - 0.136X2 - 0.596X3 + 0.513X5               (3.14)

    II.  ln θ = 7.910 - 0.121X8 - 0.740X1 - 0.090X2 - 0.359X3               (3.15)

    III. ln θ = 2.731 - 0.484X7 - 1.457X6 + 2.206X5 - 0.929X9 - 0.139X2     (3.16)


    IV.  ln θ = 2.172 - 0.277X2 + 2.327X5 - 1.075X3 - 1.770X6 + 0.192X4     (3.17)

where θ is the predicted MTBF and the Xi are equipment parameters:

    X1 = number of active elements by type
    X2 = number of cathode ray tubes
    X3 = number of transmitters
    X4 = highest frequency
    X5 = analog or digital
    X6 = steerable antenna
    X7 = maximum dc volts
    X8 = rated power
    X9 = ratio of tuning range to highest frequency

The equation chosen is based on the information available. Twenty-seven systems were observed in this experiment. The assumption was made that the design of new equipment will comprise elements and principles similar to those of the designs considered in this analysis.
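A minimal sketch of fitting the form of equation (3.13) by least squares; the system characteristics and MTBFs below are invented (the ARINC study itself used twenty-seven observed systems and the parameters X1 through X9 listed above).

```python
import numpy as np

# Invented data: each row holds one system's characteristics, and mtbf holds
# the observed mean time between failures for that system.
X = np.array([[12.0, 2.0, 1.0],
              [30.0, 4.0, 2.0],
              [ 8.0, 1.0, 1.0],
              [45.0, 6.0, 3.0],
              [20.0, 3.0, 1.0]])
mtbf = np.array([900.0, 310.0, 1400.0, 150.0, 600.0])

# Fit ln(theta) = b0 + b1*X1 + ... + bn*Xn by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])          # prepend the intercept column
coeffs, *_ = np.linalg.lstsq(A, np.log(mtbf), rcond=None)

# Predicted MTBF for a new system with characteristics [25, 3, 2].
x_new = np.array([1.0, 25.0, 3.0, 2.0])
print(np.exp(x_new @ coeffs))
```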

ARINC also did a study directed toward developing a reliability prediction technique for monolithic integrated circuits. (18) A model that expresses reliability as a function of device screening, sampling, system burn-in time, and field operating time was developed. The equation was


derived through multilinear-regression analysis of field data on sixteen system types. The data was supplied by users or the manufacturers of the systems. The equation fits the Weibull distribution, with the shaping parameter being a constant while the scaling parameter K is a function of device screening and sampling tests plus system burn-in time. After determining the characteristics to be utilized, the following prediction equation was derived:

    Rw(t2) = e^(-t2^(2/3)/K)                                    (3.18)

where

    K  = 76,877 e^(0.025 Sc + 0.00095 Sa + 0.0064 t1)
    Sc = screening score
    Sa = sampling score
    t1 = system burn-in time
    t2 = field operating time

With the exception of field operating time, all variables in the equation can be controlled within wide limits during production. Another study was conducted much in the same way as before but on avionic equipment. (3) Table 3.1 shows the resultant prediction equations.
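A minimal sketch of evaluating equation (3.18) as reconstructed above; the coefficient values inside K, and their signs, are uncertain because of the garbled scan, and the screening score, sampling score, burn-in time, and operating time below are invented.

```python
import math

def scale_term(screening_score: float, sampling_score: float,
               burn_in_hours: float) -> float:
    """K = 76,877 * exp(0.025*Sc + 0.00095*Sa + 0.0064*t1), as reconstructed;
    coefficient values and signs should be checked against the original report."""
    return 76877.0 * math.exp(0.025 * screening_score
                              + 0.00095 * sampling_score
                              + 0.0064 * burn_in_hours)

def ic_reliability(t2_hours: float, k: float) -> float:
    """Equation (3.18): R_W(t2) = exp(-t2**(2/3) / K), a Weibull reliability
    with shape parameter 2/3 and scale term K."""
    return math.exp(-(t2_hours ** (2.0 / 3.0)) / k)

# Illustrative values only.
k = scale_term(screening_score=40.0, sampling_score=100.0, burn_in_hours=168.0)
print(ic_reliability(t2_hours=10000.0, k=k))
```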

Appendix A of this paper presents the basic steps in


utilizing the prediction by function technique. After initial reliability numbers are realized and further work is done and more is known about the system, other techniques can be used to refine the prediction numbers. Chapter IV will be a discussion of these techniques.


ARINC, Avionics Reliability and Maintainability Prediction by Function, 1966.

TABLE 3.1
EQUIPMENT PREDICTION EQUATIONS

Equipment Classification / Equation Type / Equation / Parameter Range

Navigational and Radio Receiving Sets
  I:      ln θ = 5.796 - 0.561(X1) - 0.336(X2) + 0.232(ln X3)      0.1 ≤ X1 ≤ 4.5; 0 ≤ X2 ≤ 4; 2 ≤ X3 ≤ 500
  II:     ln θ = 3.529 - 0.612(X1) - 0.345(X2) + 0.150(X4)         0.1 ≤ X1 ≤ 4.5; 0 ≤ X2 ≤ 4; 5 ≤ X4 ≤ 22
Electromechanical Analog Navigational Computers
  I & II: ln θ = 1.806 - 0.120(X5) + 0.298(X6)                     3 ≤ X5 ≤ 20; 9 ≤ X6 ≤ 13
Indicators
  I & II: ln θ = 5.515 - 0.163(X7)                                 2 ≤ X7 ≤ 8
Signal Processing/Generating Equipment
  I & II: ln θ = 6.283 - 0.0176(X13)                               20 ≤ X13 ≤ 168
Radio Command Communications
  I & II: ln θ = 8.779 - 0.708(ln X14) - 0.354(ln X15)             9 ≤ X14 ≤ 400; 2 ≤ X15 ≤ 400
High Power Radar Sets
  I:      ln θ = 3.317 - 0.267(X16) - 0.136(X17) + 0.291(X18)      1 ≤ X16 ≤ 4; 2 ≤ X17 ≤ 9; 2 ≤ X18 ≤ 5
  II:     ln θ = 4.164 - 0.325(X16) - 0.270(ln X15)                1 ≤ X16 ≤ 4; 5 ≤ X15 ≤ 185
Low Power Navigation & IFF Transmitting & Receiving Sets
  I & II: ln θ = 4.349 - 0.445(X2) + 0.350(X8)                     0 ≤ X2 ≤ 5; 0 ≤ X8 ≤ 3
Intercommunication Sets
  I:      ln θ = 6.973 - 1.215(X1) - 0.155(X9)                     0.1 ≤ X1 ≤ 2.0; 4 ≤ X9 ≤ 17
  II:     ln θ = 7.108 - 0.0202(X10) - 0.507(X1)                   20 ≤ X10 ≤ 140; 0.1 ≤ X1 ≤ 2.0
All Equipment
  I:      ln θ = 2.986 + 0.242(X11) - 0.507(X1)                    2 ≤ X11 ≤ 10; 0.1 ≤ X1 ≤ 48.7; 0 ≤ X2 ≤ 11
  II:     ln θ = 4.707 - 0.141(ln X10) + 0.183(X11) - 0.443(ln X12) + 0.0625(X4)
                                                                   14 ≤ X10 ≤ 1,000; 1 ≤ X11 ≤ 10; 46 ≤ X12 ≤ 579; 8 ≤ X4 ≤ 22


TABLE 3.1 continued
EQUIPMENT PARAMETERS DEFINITIONS AND QUANTIFICATION

Parameter Symbol / Parameter / Quantification

X1   Volume -- Equipment volume in cubic feet. Note: for Radio and Navigational Receiving Sets, the volume of any antenna which may be a part of the set is not included in the calculation.

X2   Number of Interfacing Equipment -- The number of other equipment, excluding indicators, that feed signals to or receive signals from this equipment.

X3   Sensitivity -- Measured in volts for a 10 db (S+N)/N.

X4   Packaging Characteristic Rating -- The sum of the ratings given to each characteristic for the equipment under study.

     Characteristic                                  Rating
     Type of Enclosure
       Some cabinets pressurized                        0
       No cabinets pressurized                          4
     Vibration Isolation
       Some cabinets shock mounted                      4
       No cabinets shock mounted                        0
     Equipment Packaging
       Equipment in single package                      4


       Equipment in 2-4 packages                        3
       Equipment in 5-6 packages                        2
       Equipment in 9 or more packages                  1
     Type of Cooling
       Forced air, refrigerated (at all times)          4
       Forced air, inside ambient at all times          3
       Convection                                       2
       Refrigerated air on deck, outside ambient
         at altitude                                    1
     Component Packaging
       Modularized                                      0
       Conventional construction                        4
     Type of Wiring
       Printed circuits                                 0
       Conventional wiring                              4

X5   Number of Signal Inputs Acceptable -- The number of signals from other equipment which the equipment under study requires or can accommodate in its operation.

X6   Equipment Feature Rating -- The sum of ratings for the applicable individual design features.

     Feature                                         Rating
     Power Supply
       Power supply external to equipment               5
       Solid state                                      4
       Combination solid state and tubes                3
       Tube                                             2
       Rotating machinery                               1


     Tuning (operational)
       None required                                    4
       Manual                                           3
       Semi-automatic (auto-tune)                       2
       Fully automatic                                  1
     Type of Indicators
       None                                             4
       Meters                                           3
       Electro-mechanical                               2
       Cathode ray tube                                 1

X7   Number of LRU's -- The sum of the total quantity of each LRU included in the equipment complement. LRU Functions (MIL-STD-196A symbols): Air Conditioning (HD), Amplifier (AM), Antenna, complex (AS), Antenna, simple (AT), Compensator (CN), Computer (CP), Control (C), Converter (CV), Coupler (CU), Indicator, cathode ray tube (IP), Indicator, non-C.R.T. (ID), Junction Box (J), Keyer (KY), Power Supply (PP), Receiver (R), Receiver/Transmitter (RT), Recorder (RD), Relay Box (RB), Switch (S), Transmitter (T).

X8   Equipment Subfunction Rating for Low Power Navigation and IFF Transmitting and Receiving Sets -- The rating of the subfunction.


     Subfunction                                     Rating
       Doppler, TACAN, Radio Altimeters                 1
       Beacons                                          2
       IFF Sets                                         3

X9   Number of Channels -- The number of channels of operation for intercommunication sets.

X10  Power Consumption -- The steady state power in watts consumed by the equipment in its most power-consuming mode of operation. Note: Considers the "radiate" rather than the "standby" status of radar sets. "Steady state" implies that starting power requirements are not to be considered.

X11  Equipment Function Classification Rating -- The rating given to the equipment function.

     Function                                        Rating
     Navigational Receiving Sets
       Loran                                            6
       Radio Receiving Sets                             3
       Radio Navigation Sets                            5
       Direction Finder Equipment                       5
     Electromechanical Analog Navigation Computers      1
     Indicator Group
       Indicators                                       3


     Signal Processing/Generating Equipment
       Signal Converter                                 2
       Signal Analyzers                                 2
       Coder-Decoder                                   10
     Radio Command Communications
       Radio Command Communications                     2
     High-Power Radar Sets
       Intercept                                        1
       Tracking                                         1
       Side-Looking                                     1
       Search                                           1
       Fire Control                                     1
       Bombing/Navigation                               1
       Acquisition                                      1
     Low Power Navigational & IFF
       Transmitting & Receiving Sets
       IFF                                              5
       Doppler                                          2
       Beacons                                          3
       TACAN                                            2
       Altimeters                                       2
     Intercommunications
       Intercom Sets                                    4

X12  LRU Rating -- The sum of the products of the quantity of each of the LRU types shown below times the rating for that LRU type (see X7 for definitions of symbols).

     LRU   Rating        LRU   Rating
     HD       2          ID       2
     AM       3          J        3
     AS       4          KY       4
     AT       1          PP       4
     CN       2          R        4
     CP       4          RT       4
     C        3          RD       4


     CV       3          RB       4
     CU       1          S        2
     IP       4

X13  Weight -- The weight in pounds of the equipment. Note: For Radio Navigational and Receiving Sets, the weight of any antenna that may be a part of the set is excluded from the calculation.

X14  Frequency -- The highest frequency of operation in megacycles per second.

X15  Power Output -- The power in watts delivered to the antenna.

X16  Number of Operational Functions or Capabilities -- The total number of the Radar Set Functions contained by the radar set. Radar Set Functions: Airborne Early Warning and Control, Anti-Intrusion, Acquire on Jammer, Acquisition, Bombing, Calibrating (Test Equipment), CW Illumination, ECM Training, Ground Controlled Approach, Gun Fire Control, Home on Jammer, Height Finding Radar, IFF/SIF, Intercept, Navigation, Noise Jamming, Projectile Intercept, Radar Decoy, Radar Beacon, Radar Trainer,


     Ranging, Reconnaissance, Search, Track, and any functions not otherwise listed of the same order.

X17  Number of Features -- The sum of the number of the features contained by the radar set. Features: MTI, Multiple Range, ECCM, Self Contained Computer, Self Contained Display, Variable Pulse Width, Beam Shaping, Variable Scan Characteristics, Variable Range Bug, Self Contained Gyro, Contains Beacon Receiver.

X18  Type of Active Element Group -- Use the average rating if more than one type dominates.

     Type of AEG                                     Rating
       Transistor                                       4
       Tube, standard or miniature                      3
       Tube, subminiature                               2
       Electromechanical devices                        1


    CHAPTER IV

LATER PREDICTION METHODS

This chapter concerns itself with reliability prediction techniques used during the design comparison, proposal evaluation, trade-off studies, design review, and design analysis of the hardware. The following three methods were chosen for discussion: the part count technique, the stress analysis technique, and degradation analysis techniques.

GENERIC PART COUNT

The basic assumption in the part count technique is that the average stress levels in a new design approximate the average levels of stress in previous designs. Also, nominal levels of environmental stress are assumed. The math model is exponential. (17) In other words, this prediction method makes use of average failure rates which are applicable to parts operating in similar environments of use. Detailed information on the operating parts is not needed, as this method requires only a measure of the system complexity in terms of part numbers. The data is acquired from field experience on a large variety of


equipment. A part type includes those items which perform the same function, such as transistors, resistors, motors, etc. More precise definitions of generic part types are employed in order to gain better accuracy in the prediction method. These definitions take into account the types of construction and the rated stress levels to which parts can be subjected. For example, resistors may be divided into carbon film, metal film, and wirewound types, while transistors may be divided into signal and power types. The degree to which the generic part can be defined may be restricted during the design phase.

A weighting factor K is usually incorporated into the model to compensate for different environmental conditions. Usually the part failure rate data are found under laboratory conditions of stress and temperature. In actual field applications these parts are subjected to other stresses of various natures not specifically accounted for in the data. In order to provide a range of failure rates keyed to the additional usage stresses found in actual applications, such as humidity, vibration, shock, handling, turn-on and turn-off, etc., the K factors have been assembled from data based on field equipment failure rates. Thus, part


failure rates can be modified for the intended end use by multiplying the base failure rate by the appropriate K factor. All the techniques used are essentially applied in the same manner, with the source of data being the major difference. Equipment failure rates are determined from the part count (actual or estimated) by class or type. Application of the technique involves counting the number of parts of each class or type, multiplying this number by the generic failure rate for that part class or type, and adding all the products to get the failure rate for the equipment. Two sources of data are Volume II of the RADC Reliability Notebook (17) and MIL-STD-HDB 217A.

To apply this technique, the equipment under consideration is broken down according to the classes and types of parts. The total number of parts in each class and type is determined by actual count if possible. If an actual count is not possible, a good estimate of the part count is required. (17) It is essentially a process of counting the number of parts of each class or type and multiplying this number by the generic failure rate for that particular class or type. These products are then summed accordingly.


For part class, (14)

    λ_t = λ_r M_r + λ_c M_c + ...                            (4.1)

where
    λ_r = average failure rate for all resistors,
    λ_c = average failure rate for all capacitors,
    M_r = number of resistors,
    M_c = number of capacitors,
    λ_t = equipment failure rate;

or for part type, (17)

    λ_t = λ_r1 M_r1 + λ_r2 M_r2 + ...                        (4.2)

where
    λ_r1 = failure rate for carbon composite resistors,
    λ_r2 = failure rate for metal film resistors,
    M_r1 = number of carbon composite resistors,
    M_r2 = number of metal film resistors,
    λ_t  = equipment failure rate;

or, in summary, (17)

    λ_E = Σ N_i λ_i ,   i = 1, ..., n                        (4.3)

where
    λ_E = equipment failure rate,
    N_i = the number of parts of type or class i included in the equipment,
    λ_i = average failure rate of a part of class or type i,
    n   = the number of different types or classes of parts included in the equipment.
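As a minimal illustration of equation (4.3), the following sketch sums the products of part quantities and generic failure rates to obtain an equipment failure rate; the part types and rate values are hypothetical placeholders, not figures from the handbook data:

    # Hypothetical generic failure rates in failures per 10^6 hours (illustrative only).
    generic_failure_rates = {
        "carbon_film_resistor": 0.005,
        "metal_film_resistor": 0.010,
        "ceramic_capacitor": 0.020,
        "signal_transistor": 0.080,
    }

    # Part counts N_i taken from the parts list of the equipment under study.
    part_counts = {
        "carbon_film_resistor": 120,
        "metal_film_resistor": 40,
        "ceramic_capacitor": 75,
        "signal_transistor": 30,
    }

    # Equation (4.3): lambda_E is the sum over part types of N_i * lambda_i.
    lambda_E = sum(part_counts[p] * generic_failure_rates[p] for p in part_counts)
    print(f"Equipment failure rate: {lambda_E:.3f} failures per million hours")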

Again it must be noted that the generic failure rate is the inherent number of failures per unit of time or cycles that will occur under laboratory conditions. Generic failure rates of components are obtained from failure data of previous tests. These rates are the basic failure rates to be adjusted for appropriate use conditions and other applicable considerations. (17) For example, let

    λ_E = K_E λ_0                                            (4.4)

where
    λ_E = mean failure rate under a specific use environment,
    λ_0 = mean generic failure rate under an ambient use environment,
    K_E = environmental weighting factor.

It is noted that some parts fail less often in application than in vendor and laboratory tests; their K-factors are usually less than one. On the other hand, some fail more and thus


have a weighting factor greater than one. Because different parts are affected to different degrees by each environment, it is necessary that K-factors be developed for each general set of environments. More accuracy would be attained by developing failure rates around each environmental factor (humidity, vibration, etc.) and, to a degree, around the specific level of each environmental factor. (17) An example of K-factors is shown in Appendix D. Appendix A gives a step-by-step procedure for utilizing this technique.
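A brief sketch of the K-factor adjustment of equation (4.4) follows; the environment names and factor values are hypothetical stand-ins, not values from the handbook tables:

    # Hypothetical environmental weighting factors K_E (illustrative only).
    k_factors = {
        "laboratory": 1.0,
        "ground_fixed": 2.5,
        "airborne": 6.0,
    }

    def adjusted_failure_rate(lambda_0, environment):
        """Equation (4.4): lambda_E = K_E * lambda_0."""
        return k_factors[environment] * lambda_0

    # Example: a generic rate of 0.5 failures per million hours used in an airborne environment.
    print(adjusted_failure_rate(0.5, "airborne"))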

STRESS ANALYSIS

Another technique employed during this phase of the life cycle is the stress analysis technique. A more precise definition of the parts within the system is required so that reference can be made to the part failure data, which is expressed as a function of electrical, thermal, and mechanical stress. This permits a detailed part-by-part analysis of the system design to the extent that the effects of degrading stresses are considered in determining the failure rates of individual parts. (14) These techniques are similar in use, the difference being


the source of failure rate data, and the corresponding differences in the procedures used in extracting data from the data source and translating these data for application to a specific system. (14) Actual operating conditions of the parts are first verified to be within their specified rated capabilities. This action not only provides data for determining the part failure rates, but also ensures that any overstressed part is identified and subsequently eliminated from the system, either by the use of a different part or a new design configuration. Since details of system design are required in determining stress ratios, temperatures, and other application and environmental data, this prediction technique is only applicable during the later stages of design. (29) Once the failure rate data are determined, the reliability prediction is completed by combining the failure rates for each part in the system according to a pre-established mathematical model. In general, this will involve substituting the failure rates in the model to obtain the predicted reliability or MTBF of an element of the system, and combining system element reliabilities as appropriate to obtain a prediction for the overall system. (17) Because of the high level of


complexity of modern systems, application of the procedure is time consuming and should be used only when such a detailed part-by-part analysis is warranted. Computer processing is "workable," but the initial stress analysis usually involves considerable effort at the engineering level. This prediction technique considers the part type, the operational and environmental stresses, and the derating characteristics associated with each individual part in predicting MTBF. One major assumption is that similar parts operated under similar conditions will exhibit comparable reliability. (17)

This detailed part-by-part analysis of a system design permits the degrading effects of stresses to be taken into account. When utilizing this technique, operational and environmental stresses as well as other characteristics are determined for each part in the equipment or system. The characteristics and parameters must be defined and evaluated for each part class. A base failure rate is determined for each part by finding the stresses on it, usually expressed in terms of electrical, mechanical, and thermal stress. These stresses are usually given as decimal fractions indicating the ratio of


actual stress to rated stress. The actual stress conditions can be determined by estimates, circuit analysis, actual measurements, planned derating criteria, or some other source. The rated stress is the stated rating for the part parameter without derating. Again, consideration must be given to the K-factor adjustment. Failure rates are noted for each module or unit. The combined module or unit failure rates are then used to obtain the subsystem failure rate or survival probability, and subsystem rates are in turn combined into the system reliability model. MIL-STD-HDB 217A lists several assumptions that should be noted when using this technique: most data do not include the failure rate for connecting devices such as soldering, wirewraps, etc.; all part failure rates are based on part test data derived from vendor tests and equipment manufacturers' tests; K-factors are usually used to convert the failure rates shown to the operational environment; and the definition of a failure for a part under controlled test conditions usually differs from the definition of a failure for the part when used in an equipment, which the application K-factors take into account. Appendix A shows a step-by-step procedure to use


    when employing this technique.
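To make the stress analysis procedure concrete, the sketch below computes part failure rates from a base rate, an applied-to-rated stress ratio, and an environmental K-factor, then sums them for a module. The base rates, the linear scaling with stress ratio, and the K value are hypothetical stand-ins, not the actual MIL-STD-HDB 217A models:

    # Hypothetical base failure rates (failures per 10^6 hours) at a reference stress ratio.
    base_rates = {"resistor": 0.004, "capacitor": 0.015, "transistor": 0.060}

    def part_failure_rate(part_type, stress_ratio, k_env):
        """Adjust a base rate for the applied-to-rated stress ratio and the use environment.
        The linear scaling with stress ratio is an illustrative assumption only."""
        return base_rates[part_type] * stress_ratio * k_env

    # Parts list: (type, actual/rated stress ratio) for one module; airborne use assumed (K = 6.0).
    parts = [("resistor", 0.5), ("resistor", 0.7), ("capacitor", 0.4), ("transistor", 0.6)]
    module_rate = sum(part_failure_rate(p, s, k_env=6.0) for p, s in parts)
    print(f"Module failure rate: {module_rate:.4f} failures per million hours")

Module rates computed this way would then be combined into subsystem and system rates as described above.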

DEGRADATION ANALYSIS

The final technique is degradation analysis. It is a general term usually applied to a variety of analysis techniques which permit an evaluation of the probable susceptibility of an equipment to variation or drift in circuit parameters. (17) Its basic characteristic is that it is usually implemented on a computer, where the circuit is analyzed, evaluated, and optimized. It requires a mathematical model in terms of circuit equations.

One example of this technique is the Monte Carlo method. (6) It involves the computer simulation of an empirical approach whereby a number of copies of the network under study are constructed by sampling component parts at random from a representative population of those parts. Input data for a Monte Carlo method must include the complete frequency distribution of each parameter. For example, where a certain part value is required, a set of values distributed in a manner representative of the actual part values would be provided. Random selection of a value from this set would then simulate the random


selection of a part. Selecting all input parameters this way and solving the circuit equations using the selected values yields one set of output values. Repeating the process over and over then gives a distribution of values for each output variable. The Monte Carlo method could be used in the analysis of a new complex system for which no past information is available. This type of analysis enables one to gain information concerning which subsystems and components of the system are most critical.
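The following minimal sketch illustrates the Monte Carlo idea for a trivial circuit (a resistive voltage divider, chosen here only as an example; the 2 percent tolerances and the 5,000-trial count are arbitrary assumptions):

    import random

    # Nominal part values and assumed normal tolerances (illustrative only).
    R1_NOMINAL, R2_NOMINAL, SIGMA = 1000.0, 2000.0, 0.02  # ohms, 2% standard deviation
    V_IN = 10.0  # volts

    def sample_output():
        """Draw one 'copy' of the divider by sampling each part value, then solve the circuit."""
        r1 = random.gauss(R1_NOMINAL, SIGMA * R1_NOMINAL)
        r2 = random.gauss(R2_NOMINAL, SIGMA * R2_NOMINAL)
        return V_IN * r2 / (r1 + r2)

    outputs = [sample_output() for _ in range(5000)]
    mean = sum(outputs) / len(outputs)
    print(f"Mean output: {mean:.3f} V, observed range: {min(outputs):.3f} to {max(outputs):.3f} V")

The histogram of the collected outputs is the predicted distribution of circuit performance under part-value variation.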

Another technique in use under the degradation heading is the worst case method. These methods can be used to determine the worst case output for any variable. Still another technique is the parameter variation method. (17) This technique is used to provide information concerning part parameter drift stability, circuit performance/part parameter relationships, and certain circuit-generated stresses. In the general case, a parameter variation analysis would consist of solving the circuit equations in an iterative manner until a solution is obtained for all probable combinations of input parameter values. In general, input data consist of nominal or mean values and estimated or assumed circuit parameter drift characteristics. A typical parameter variation analysis would be solved in steps by varying one or two parameters at a time while holding the rest at a nominal value; a brief sketch of this stepping procedure closes the chapter. Solution of the circuit equations for each step provides limiting values for the parameters under investigation. This procedure is repeated for all parameters in the circuits. (17)

Two other methods are mentioned in reference (17): worst case methods and moment methods. Only brief mention of them will be presented here. Worst case methods of circuit analysis can be used to determine the worst case conditions for any output variable of the circuit. The worst case methods are extensions of the parameter variation methods in that output variables are evaluated with reference to varying values of the input parameters. Moment methods utilize the mean and variance of the distributions of the input parameters to obtain the mean and variance of the output variables. Various degradation prediction techniques listed in MIL-STD-HDBK 217A are given in Appendix E; these will not be discussed in this paper but are included for the benefit of the reader. The following chapter is a discussion of prediction in general, with the advantages and disadvantages of the prediction techniques.
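As a closing illustration, here is a minimal sketch of the parameter variation stepping procedure described above; the RC low-pass circuit, the assumed ±10 percent drift range, and the five-step sweep are arbitrary choices made only to show the mechanics:

    import math

    # Output of an assumed RC low-pass filter at a fixed frequency:
    # |V_out/V_in| = 1 / sqrt(1 + (2*pi*f*R*C)^2)
    def gain(r, c, f=1000.0):
        return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * f * r * c) ** 2)

    R_NOM, C_NOM = 1.0e3, 0.1e-6   # nominal values (ohms, farads)
    DRIFT = 0.10                   # assumed +/-10% end-of-life drift
    STEPS = 5                      # number of values examined per parameter

    # Vary one parameter at a time across its drift range while holding the other at nominal.
    for name, nominal, other in (("R", R_NOM, ("C", C_NOM)), ("C", C_NOM, ("R", R_NOM))):
        values = [nominal * (1.0 - DRIFT + 2.0 * DRIFT * i / (STEPS - 1)) for i in range(STEPS)]
        gains = [gain(v, other[1]) if name == "R" else gain(other[1], v) for v in values]
        print(f"{name} varied: gain ranges from {min(gains):.4f} to {max(gains):.4f}")

Worst case values of the output would be read off as the extremes of the tabulated gains.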


    CHAPTER V

SUMMARY AND RECOMMENDATIONS

Reliability prediction should play a vital role in any development of a system. This check gives management and design people additional information for decisions. It serves to guide management throughout the life cycle of the equipment. Reliability prediction shows various design flaws and allows for their correction before the final design is completed, and thus eliminates costly repair or refit. A reliability "number" should be taken in the context of its derivation. Reliability models do not predict reliability of an item in the sense of a guess which requires no input information; models can only predict based on previous results with the same or similar items. The accuracy of an estimate depends on two factors:

(1) The validity of the model (equation) used.
(2) The validity of the numerical parameters (data) used in the equation:
    (a) Were the parameters accurately derived (statistical techniques, sample size)?
    (b) Were the operating conditions the same as for the item being analyzed or, if different, were these differences accounted for?

It is therefore seen that a reliability model can only estimate reliability from known data describing the reliability of similar equipment, or from known physical laws and measurements.

Let us again summarize some of the problems associated with reliability prediction.

In predicting reliability by function or regression analysis, it is impossible to model every possible contingency which can affect performance in the real world, and these prediction numbers are likely to be only a crude approximation of the real world. Most of the studies done in this area were restricted by the equipment used and by the data available from field experience. Determination of similarity can be made by engineering judgment only; no hard and fast rule is yet available to do this in a quantitative fashion.

In prediction by function, the range of parameters may lie outside those under consideration. Equations are valid for systems designed and fabricated according to the engineering practices of the time associated with the


systems of this study program. A correlation exists between the year, complexity, and hardware improvements. Reliability studies applied throughout a system's life cycle enforce checks on reliability itself.

When utilizing any technique where data are required, a fundamental limitation is the ability to accumulate data of known validity for each new application. Differences such as environment, uses, operators, maintenance practices, measurement, and even the definition of failure can affect the data gathered.

When comparing similar systems, care must be exercised in the utilization of the reliability number. Differences in manufacturers, usage, and other factors could greatly influence the reliability. The stress analysis technique is one of the most accurate prediction methods available, but it, too, has severe shortcomings. The most glaring shortcoming is, again, the data base. Much of the data now used was collected on limited sample sizes under conditions which lacked adequate controls, or was derived by interpolation or even extrapolation. Variability in reliability for a part made by two different manufacturers, or at different times by the same manufacturer, is usually unknown.


Environmental factors must be treated with gross K-factors. This method of prediction is time consuming, especially for complex systems, and it does not account for or predict the occurrence of drift failures.

Data is the prime disadvantage when utilizing the generic part count technique. Degradation analysis techniques are usually done on a computer, and the computer cost for this method may become restrictive; this cost is related to the number of simulation runs and the complexity of the system. Another disadvantage is the man-hours required to perform the modeling. Engineering and design changes will, in general, prohibit any type of quick analysis of certain design changes.

When utilizing prediction techniques, it must be remembered how and why the number was arrived at in the life cycle. The results are guidelines used in the various stages of the life cycle and are not "exact" figures that can be used to show the reliability of the equipment when in use or in the field. The methods covered in the text contribute greatly to reliability prediction. The reliability engineer should choose his technique so that it is not too bulky or hard to handle and yet not so simple as to have


little meaning. If he allows himself to get involved with a large model that is not capable of giving a realistic picture of reliability, he has not accomplished his mission. On the other hand, if he chooses a simple model, unsuited to his needs, he may present a false feeling of confidence. The reliability engineer needs to evaluate his alternatives and choose the reliability prediction technique that most appropriately fits his system or equipment. He is then able to produce reliability numbers that have importance and can be used throughout the life cycle of the equipment.

The author suggests the following topics for further study. The topic that invariably was mentioned as a key problem was the data base. Methods of collecting data, both from field usage and from test, need to be examined. Problem areas include unreported failures, data bases built from replaced parts rather than failed parts, environmental conditions, the people using or repairing the equipment, and even the definition of failure. One prediction technique not covered in the text is prediction by deficiency; this technique needs further consideration. The author suggests combining some of the other techniques to get reliability


numbers and comparing them to actual field data. Finally, reliability in the future should be studied, and its place in management as well as in design decisions, and its ability to help guide the project through its life cycle, should be defined.


    APPENDIX A


Steps to follow when utilizing prediction techniques.

I. By similar system

A. Define the new equipment relative to type and operational environment of use. Other characteristics of the equipment, such as size and output requirements, may assist in the definition.

B. Identify an existing equipment which is the nearest equivalent to the one for which the prediction is desired, making careful note of any obvious differences which exist.

C. Record the reliability data available on the nearest equivalent, and make adjustments for each difference noted in step B.

D. Draw conclusions concerning the level of reliability that will be obtained from the new equipment. Such conclusions assume that similar equipment will exhibit similar reliability. The accuracy of the estimate depends on the quality of the recorded failure data which is available, and on the ability of the analyst to make the necessary adjustments in order to reflect the reliability potential of the new equipment.


II. By function

A. Determine, from data relative to similar equipment, the functional characteristics of the design which give the best correlation between failures per active element and functional value.

B. Estimate the functional value required for the new design for which the prediction is required, and determine the failure rate per active element group from the data analyzed in step A.

C. Estimate the number of active element groups contained in the new design and multiply this by the failure rate per active element determined in step B. (A brief numerical sketch of this procedure follows.)
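The sketch below runs through procedure II with made-up numbers; the failure rate per active element group and the AEG count are purely illustrative assumptions, not values from the study data:

    # Steps A/B (assumed result): failure rate per active element group, in failures per 10^6 hours,
    # read from the correlation of failures per AEG versus functional value for similar equipment.
    failure_rate_per_aeg = 0.35

    # Step C: multiply by the estimated number of active element groups in the new design.
    estimated_aeg_count = 420
    equipment_failure_rate = failure_rate_per_aeg * estimated_aeg_count
    mtbf_hours = 1.0e6 / equipment_failure_rate
    print(f"Predicted failure rate: {equipment_failure_rate:.1f} per million hours (MTBF about {mtbf_hours:.0f} h)")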

III. Generic part count method

A. Determine the quantities of each generic part type used in the system for which the prediction is required.

B. Calculate the failure rate contribution of each generic part type by multiplying the quantity used by the average failure rate of a single generic part of that type.

C. Estimate the overall system failure rate by summing the contributions calculated for each generic part type in step B. (Corrections may be applied to either the generic part failure rates or the overall system failure rate to reflect different environmental conditions of operation.)

IV. Stress analysis

A. Prepare a list of the parts in the equipment.

B. Determine the stresses and the environment in which each part must operate.

C. Obtain the proper K-factor and failure rate for each part.

D. Combine the module or unit failure rates; combine these to get the subsystem failure rates, and combine those to get the system failure rate. (A short sketch of this roll-up follows.)
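Assuming a series reliability model with constant failure rates (every part must work for the system to work), the roll-up in step IV.D reduces to summing failure rates at each level; the subsystem and module names and rates below are hypothetical:

    # Failure rates in failures per 10^6 hours for each module, grouped by subsystem (illustrative).
    subsystems = {
        "receiver": {"rf_module": 12.0, "if_module": 8.5},
        "display":  {"indicator": 5.0, "power_supply": 9.0},
    }

    subsystem_rates = {name: sum(modules.values()) for name, modules in subsystems.items()}
    system_rate = sum(subsystem_rates.values())   # series model: failure rates add
    system_mtbf = 1.0e6 / system_rate
    print(subsystem_rates, f"system MTBF about {system_mtbf:.0f} hours")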


    APPENDIX B


Reproduced from MIL-STD-756A.

The reliability estimate obtained from this chart represents a band of possible outcomes. The smaller failure rate values of the band are obtainable with good reliability engineering and design effort.

[Chart: failure rate versus number of series active elements (functional complexity) for electronic equipment.]


    APPENDIX D


[Table: example K-factors for various environmental conditions.]


    APPENDIX E


SUMMARIES OF TYPICAL ANALYTICAL TECHNIQUES

MANDEX Worst-Case Method
Type of Analysis: Steady state AC and DC worst-case
Mathematical Model Necessary: Circuit's simultaneous equations or matrix equation
Parts' Data Necessary: Nominal value and end-of-life limits
Output Information Received: Worst-case value of output variable compared with allowable value
Type of Circuits Suitable: Class A amplifiers, power supplies, all biasing (dc) circuits, logic circuits, etc.

MOMENT METHOD

Type of Analysis: Statistical
Mathematical Model Necessary: Circuit's simultaneous equation or matrix equation
Parts' Data Necessary: Mean (or nominal) value and standard deviation or variance of each input parameter, and correlation coefficients when they exist
Output Information Received: The mean and variance of the distribution of each output parameter
Type of Circuits Suitable: Any circuit for which a mathematical model can be derived

MONTE CARLO METHOD
Type of Analysis: Statistical; predicts output variable distribution at any time; steady state ac or dc (transient may be performed if formula is available)
Mathematical Model Necessary: Circuit's simultaneous equation, matrix equation, transfer function (any mathematical representation including input parameters)
Parts' Data Necessary: Complete distribution of each input parameter at a time
Output Information Received: 20-cell histogram for each output variable
Type of Circuits Suitable: Any circuit for which a mathematical model can be derived

VINIL METHOD

Type of Analysis: VINIL Method
Mathematical Model Necessary: Piece-wise linear equivalent circuits
Parts' Data Necessary: Application curves over operating and environmental ranges along with drift data
Output Information Received: Input characteristics (maximum & minimum); transfer characteristics (maximum & minimum); output characteristics (maximum & minimum)
Type of Circuits Suitable: Digital; linear analog

SCAN AC Method

Type of Analysis: Linear sinusoidal dynamic analysis
Mathematical Model Necessary: Simultaneous complex variable equations with the real and the imaginary parts of the equations separated
Parts' Data Necessary: Nominal (mean); minimum (−3σ); maximum (+3σ)
Output Information Received: Families of frequency response curves; statistical variation of unknowns at any selected frequency; +3σ, −3σ, and mean of unknowns versus frequency (assumed)
Type of Circuits Suitable: Any linear circuit that contains frequency dependent devices and which is driven or is significantly analyzed with sinusoidal functions

SCAN Transient Method
Type of Analysis: Linear and non-linear transient analysis; differential equation solution
Mathematical Model Necessary: Simultaneous differential equations
Parts' Data Necessary: Nominal parts data; alternate sets of parts' data; parts' data for the switched states
Output Information Received: Time response of linear or non-linear systems
Type of Circuits Suitable: All circuits for which the transient determining effects can be modeled.

Parameter Variation Method
Type of Analysis: General; determines allowable parameter variation before the design fails to function. Considers both