Measurement Systems Analysis v1.1

Posted on 21-Jan-2015








1. Measurement Systems Analysis (MSA)
© 2001 Six Sigma Academy

2. Why Measure?
- To understand a decision:
  - Meet standards & specifications
  - Detection/reaction oriented
  - Short-term results
- To stimulate continuous improvement:
  - Where to improve? How much to improve? Is improvement cost effective?
  - Prevention oriented
  - Long-term strategy
- "If you cannot measure, you cannot improve!" - Taguchi

3. Measurement System As A Process
[Fishbone diagram: Material, Method, Machine, Environment, and People branches all feeding Measurement Error. Factors include cleanliness, temperature, dimension, weight, hardness, conductivity, density (material); sequence, timing, design, precision, resolution, stability, preparation, procedure compliance, calculation error (method); positioning, corrosion, calibration, location, set-up, wear, vibration, speed (machine); atmospheric pressure, lighting, temperature, humidity, cleanliness (environment); fatigue, attention, interpretation, coordination, knowledge of instrument, dexterity, vision (people).]

4. What Is An MSA?
- A scientific and objective method of analyzing the validity of a measurement system
- A tool which quantifies:
  1. Equipment Variation
  2. Appraiser (Operator) Variation
  3. The Total Variation of a Measurement System
- MSA is NOT just Calibration
- MSA is NOT just Gage Repeatability & Reproducibility (R&R)
- Measurement System Analysis is often a project within a project

5. MSA Relationship To DMAIC
Define > Measure > Analyze > Improve > Control
- Measurement Systems Analysis: quantitative evaluation of the tools and processes used in making discrete or variable observations
- Measurement Systems Control: established, documented, and continuously carried out; ensures the measurement system maintains an acceptable status; often referred to as the Long Term Gage Plan
6. MSA - A Starting Point
Before you:
- Make adjustments
- Implement solutions
- Run an experiment
- Perform a complex statistical analysis
You should:
- Validate your measurement systems
- Validate data and data collection systems
MSA quantifies a major source of process variation.

7. Measurement Systems Examples
- Precision gage
- Data collection form
- Survey
- School entrance exam
- Customer satisfaction
- On-time delivery report
What is your system?

8. Types of Measurement System Analysis
- Operational Definitions
- Walking the Process
- Gage R&R
  - Variable data
  - Attribute data

9. MSA - Operational Definitions
The measurement system can be validated using Operational Definitions constructed by the project team to ensure that all measurement takers completely understand what is expected during the data collection phase.

10. Developing Operational Definitions
- Operational definitions are descriptions written in a way that ensures consistent interpretation by different people
- The operational definition method of description is used throughout the DMAIC process

11. Operational Definition
Defining an item, process, or characteristic with an Operational Definition is an effective way to communicate between team members and other people involved in the project. Because Operational Definitions are so effective, the technique is used in a number of places within the DMAIC process. Remember: to be effective, an Operational Definition must be written in a way that ensures consistent interpretation by different people.

12. General Example - Operational Definitions
Examples of Operational Definitions for data collection:
- Record the date that the lease company's written notification arrives in the dealership, using an MM/DD/YY format.
- List any cosmetic preparation, in excess of the standard pre-delivery process, required to render the vehicle acceptable for retail consumer sale.
- Record the weight of each package of coffee in ounces by pouring the coffee into the filter and placing the filter and coffee on the scale tray.
- Record the length of time that coffee remains in the urn by recording the actual time of day each time the Brew button is pressed to recharge the urn. Use a 24-hour clock and round to the nearest minute.

13. MSA - Walking the Process
Walking the Process is a method of conducting MSA when it is not possible to perform a Gage R&R.

14. How to Walk the Process
- Develop Operational Definitions for each of the measures to be collected
- Train data collectors prior to beginning the data collection activity
- Follow the process from beginning to end and monitor the data collection activities to determine if data is being collected properly
- Continue walking the process until the data compiled accurately reflects the existing process

15. Components Of Measurement Error

16. Components Of Measurement Error
- Resolution/Discrimination
- Accuracy (bias effects)
- Linearity
- Stability (consistency)
- Repeatability - test-retest (precision)
- Reproducibility (precision)
Each component of measurement error can contribute to variation, causing wrong decisions to be made.

17. Categories Of Measurement Error Which Affect Location
- Accuracy/Bias
- Linearity
- Stability

18. Categories Of Measurement Error Which Affect Spread
- Precision: Repeatability and Reproducibility

19. Resolution/Discrimination
Resolution? Can change be detected?
Checklist: Resolution? OK > Accuracy/Bias? OK > Linearity? OK > Stability? OK > Precision (R&R)?
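An operational definition like the first date example above can be enforced directly in a data-collection script, so every collector records the field the same way. A minimal sketch in Python; the function name and messages are illustrative, not from the deck:

```python
from datetime import datetime

def record_notification_date(raw: str) -> str:
    """Accept a date only if it matches the operational definition MM/DD/YY.

    Hypothetical helper: rejecting any other format forces every data
    collector to record the field identically.
    """
    datetime.strptime(raw, "%m/%d/%y")  # raises ValueError if non-conforming
    return raw

print(record_notification_date("01/04/01"))   # conforms, recorded as-is
try:
    record_notification_date("4 Jan 2001")    # free text: rejected
except ValueError:
    print("rejected: does not match MM/DD/YY")
```

Pushing the check to the point of entry is what makes the definition operational: disagreement between collectors surfaces immediately instead of during analysis.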
20. Resolution
- The simplest measurement system problem
- Poor resolution is a common issue
- Its impact is rarely recognized and/or addressed
- Easily detected: no special studies are necessary, and no known standards are needed

21. Definitions: Resolution/Discrimination
- The capability to detect the smallest tolerable changes
- Inadequate measurement units: measurement units too large to detect the variation present
- Guideline: the 10 Bucket Rule - increments in the measurement system should be one-tenth of the product specification or process variation

22. Resolution/Discrimination
[Figure: the same process output measured on a coarse scale (poor discrimination) and on a finer scale (better discrimination).]

23. Resolution Actions
- Measure to as many decimal places as possible
- Use a device that can measure smaller units
- Live with it, but document that the problem exists; a larger sample size may overcome the problem
- Priorities may need to involve other considerations: engineering tolerance, process capability, and the cost and difficulty of replacing the device

24. Accuracy/Bias
Measurements are shifted from the true value.
Checklist: Resolution? OK > Accuracy/Bias? > Linearity? > Stability? > Precision (R&R)?

25. Accuracy/Bias
- The difference between the observed average value of measurements and the master value
- The master value is an accepted, traceable reference standard

26. Accuracy/Bias
[Figure: two clusters of measurements against the master value - one centered on it (more accurate), one shifted away from it (less accurate).]

27. Accuracy/Bias Actions
- Calibrate when needed/scheduled
- Use operating instructions
- Review specifications
- Review software logic
- Create Operational Definitions

28. Linearity
Measurement is not true and/or consistent across the range of the gage.
Checklist: Resolution? OK > Accuracy/Bias? OK > Linearity? > Stability? > Precision (R&R)?
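The 10 Bucket Rule above reduces to a one-line check. A minimal sketch with a hypothetical helper name and made-up tolerances:

```python
def resolution_ok(increment: float, tolerance_width: float) -> bool:
    """10 Bucket Rule: the gage's smallest increment should be at most
    one-tenth of the product tolerance (or process variation) it judges.
    Hypothetical helper, not from the deck."""
    return increment <= tolerance_width / 10

# A part toleranced at +/-0.05 mm (total width 0.10 mm):
print(resolution_ok(0.01, 0.10))  # True  - 0.01 mm divisions give 10 buckets
print(resolution_ok(0.05, 0.10))  # False - only 2 buckets: inadequate
```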
29. Linearity
[Figure: observed average value vs. reference value across the full range of the gage - a no-bias line and a biased line that diverges across the range.]

30. Linearity Actions
- Use only in a restricted range
- Rebuild
- Use with a correction factor/table/curve
- A sophisticated study is required and will not be discussed in this course

31. Stability
Measurement drifts over time.
Checklist: Resolution? OK > Accuracy/Bias? OK > Linearity? OK > Stability? > Precision (R&R)?

32. Stability
- Measurements remain constant and predictable over time, for both the mean and the standard deviation
- No drifts, sudden shifts, cycles, etc.
- Evaluated using control charts (the master value measured at Time 1, Time 2, ...)

33. Stability Actions
- Change/adjust components
- Establish a life timeframe
- Use control charts
- Use/update the current SOP

34. Precision
Repeatability and Reproducibility.
Checklist: Resolution? OK > Accuracy/Bias? OK > Linearity? OK > Stability? OK > Precision (R&R)?

35. Precision
σ²_total = σ²_product/process + σ²_repeatability + σ²_reproducibility
[Figure: good precision clusters tightly around the master value; poor precision scatters widely.]
Also known as Gage R&R.

36. Repeatability (A Component Of Precision)
Variation that occurs when repeated measurements are made of the same item under absolutely identical conditions:
- Same operator
- Same set-up
- Same units
- Same environmental conditions
- Short-term

37. Reproducibility (A Component Of Precision)
The variation that results when different conditions are used to make the measurements:
- Different operators
- Different set-ups
- Different test units
- Different environmental conditions
- Different locations
- Different companies
- Long-term

38. R&R Actions
- Repeatability: repair, replace, or adjust equipment; SOP
- Reproducibility: training; SOP

39. Attribute Measurement Studies
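The precision decomposition σ²_total = σ²_product/process + σ²_repeatability + σ²_reproducibility can be sketched numerically. This is a deliberately simplified illustration using one master part and made-up readings; a real Gage R&R study uses 5-10 parts and the ANOVA or average-and-range method:

```python
from statistics import mean, pvariance

# Made-up readings of ONE master part: three operators, three trials each.
readings = {
    "Op1": [10.02, 10.01, 10.03],
    "Op2": [10.06, 10.05, 10.07],
    "Op3": [10.00, 10.02, 10.01],
}

# Repeatability: pooled within-operator variance (same part, same conditions).
repeatability = mean(pvariance(r) for r in readings.values())

# Reproducibility: variance of the operator averages (operator-to-operator shift).
reproducibility = pvariance([mean(r) for r in readings.values()])

# Gage R&R variance is their sum; the product/process term would come from
# measuring many different parts, which this one-part sketch omits.
gage_rr = repeatability + reproducibility
print(f"repeatability   = {repeatability:.6f}")
print(f"reproducibility = {reproducibility:.6f}")
print(f"gage R&R        = {gage_rr:.6f}")
```

In this made-up data the operator-to-operator term dominates, which per slide 38 would point toward training and SOPs rather than equipment repair.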
40. Purpose Of Attribute MSA
- Assess standards against customers' requirements
- Determine whether all appraisers use the same criteria
- Quantify the repeatability and reproducibility of operators
- Identify how well the measurement system conforms to a known master
- Discover areas where training is needed, procedures are lacking, or standards are not defined

41. Attribute MSA - Excel Method
- Allows for R&R analysis within and between appraisers
- Tests for effectiveness against a standard
- Limited to nominal data at two levels

42. Attribute MSA Example
Open file MSA-Attribute.xls. Attribute legend (used in computations): 1 = Pass, 2 = Fail.
[Excel worksheet: 30 samples with a known Pass/Fail attribute, each assessed twice (Try #1, Try #2) by three operators. Date: 1/4/2001; Name: Acme Employee; Product: Widgets; Business: Earth Products.]

43. Scoring Example
% APPRAISER SCORE:                        100.00%   78.57%   100.00%
% SCORE VS. ATTRIBUTE:                     78.57%   64.29%    71.43%
SCREEN % EFFECTIVE SCORE:                  57.14%
SCREEN % EFFECTIVE SCORE vs. ATTRIBUTE:    42.86%
100% is the target for all scores.

49. Stat > Quality Tools > Attribute Gage R&R Study

50. Attribute Study - MINITAB Analysis (Continued)
1. Select Single Column if the data is stacked; select Multiple Columns if the data is un-stacked
2. Enter the number of appraisers and trials
3. Enter the name of the column with the Known standard
4. Select OK

51. Attribute MSA - MINITAB Graphical Output
Date of study: 1/03/2001; Reported by: Jose; Name of product: XYZ
[Assessment Agreement plots with 95.0% CIs for appraisers Bob, Sue, and Tom: Within Appraiser (lower vs. higher variation within each appraiser) and Appraiser vs. Standard (lower vs. higher variation against the standard). The second panel is not included if no Known standard is given.]

52. Attribute MSA - MINITAB Session Window Results
Each Appraiser vs. Standard (individual vs. standard):

  Appraiser  # Inspected  # Matched  Percent (%)  95.0% CI
  Bob        30           28         93.3         (77.9, 99.2)
  Sue        30           29         96.7         (82.8, 99.9)
  Tom        30           24         80.0         (61.4, 92.3)

# Matched: the appraiser's assessment across trials agrees with the standard.

Assessment Disagreement (repeatability):

  Appraiser  # Pass/Fail  Percent (%)  # Fail/Pass  Percent (%)  # Mixed  Percent (%)
  Bob        1            3.3          1            3.3          0        0.0
  Sue        1            3.3          0            0.0          0        0.0
  Tom        1            3.3          0            0.0          5        16.7

# Pass/Fail: assessments across trials = Pass / standard = Fail.
# Fail/Pass: assessments across trials = Fail / standard = Pass.
# Mixed: assessments across trials are not identical.

Between Appraisers (reproducibility):

  # Inspected  # Matched  Percent (%)  95.0% CI
  30           24         80.0         (61.4, 92.3)

# Matched: all appraisers' assessments agree with each other.

All Appraisers vs. Standard (total agreement against the known):

  # Inspected  # Matched  Percent (%)  95.0% CI
  30           23         76.7         (57.7, 90.1)

# Matched: all appraisers' assessments agree with the standard.

53. MINITAB Method - Ordinal Data Entry
- Open Ordinal MSA.mtw
- Survey data rated on a 1 to 5 scale
- Arranged in multiple columns in the Minitab worksheet

54. Attribute Study - Ordinal
- Select "categories of the attribute data are ordered"
- The analysis is the same as for 2-level data

55. Industrial Attribute MSA Exercise
- Evaluate the samples supplied by the instructor (attributecircles.MPJ)
- Determine the screen and appraiser scores
- Interpret the results
- Recommend actions

56. Variables Measurement Studies

57. Six Step Variables MSA
1. Conduct initial gage calibration (or verification)
2. Perform trials and data collection
3. Obtain statistics via MINITAB
4. Analyze and interpret the results
5. Check for inadequate measurement units
6. On-going evaluation
What would be your long-term gage plan?

58. Trials And Data Collection
- Generally two to three operators
- Generally 5-10 process outputs to measure
- Each process output is measured 2-3 times (replicated) by each operator
[Figure: run layout - Operator 1 measures parts P1-P5 three times each, then Operator 2, then Operator 3.]
Randomization is critical.

59. Randomization, Repeats, Replicates
- Randomization: runs are made in an arbitrary vs. patterned order, to average out the effects of noise or unkno...
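The Excel-method scores from the Scoring Example above (appraiser score, score vs. attribute, and the screen effective scores) can be computed from raw trial data. A minimal sketch with a small made-up dataset of seven samples and two appraisers; names and values are illustrative:

```python
# Hypothetical attribute data: a known standard and two trials per appraiser.
known = ["Pass", "Pass", "Fail", "Pass", "Fail", "Pass", "Pass"]
trials = {
    "A": (["Pass", "Pass", "Fail", "Pass", "Fail", "Pass", "Pass"],
          ["Pass", "Pass", "Fail", "Pass", "Fail", "Pass", "Pass"]),
    "B": (["Pass", "Fail", "Fail", "Pass", "Fail", "Pass", "Pass"],
          ["Pass", "Pass", "Fail", "Pass", "Pass", "Pass", "Pass"]),
}
n = len(known)

for name, (t1, t2) in trials.items():
    within = sum(a == b for a, b in zip(t1, t2))                 # agrees with self
    vs_std = sum(a == b == k for a, b, k in zip(t1, t2, known))  # and with known
    print(f"{name}: appraiser score {within / n:.0%}, vs. attribute {vs_std / n:.0%}")

# Screen scores: a sample counts only if EVERY assessment of it agrees
# (and, for the vs.-attribute version, also matches the known standard).
all_trials = [t for pair in trials.values() for t in pair]
screen = sum(len({t[i] for t in all_trials}) == 1 for i in range(n))
screen_vs = sum({t[i] for t in all_trials} == {known[i]} for i in range(n))
print(f"screen effective {screen / n:.0%}, vs. attribute {screen_vs / n:.0%}")
```

The screen scores can never exceed any individual appraiser score, which is why the 57.14% and 42.86% figures in the Scoring Example sit below the per-appraiser results.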

