09 Design Scorecard (BP)

DFSS Business Process Scorecard (82 slides)
Source: http://slidepdf.com/reader/full/09-design-scorecardbp (retrieved 8/22/2019)
Uploaded by sirdba; posted 08-Aug-2018

TRANSCRIPT

Page 1: 09 Design Scorecard(BP)

DFSS Business Process Scorecard

Page 2: 09 Design Scorecard(BP)


© 2001 ConceptFlow 2

At the end of this module, participants will be able to:

• Develop performance, process and support system scorecards

• Integrate them into a top-level scorecard

• Interpret scorecard results using a statistical approach

• Evaluate the design using scorecard sigma scores

• Identify opportunities for design improvements

• Clarify the underlying assumptions of the scorecard

Page 3: 09 Design Scorecard(BP)


Module Contents

• What is a process design scorecard?

• Why and how is it used in DFSS?

• What are the contents of a scorecard?

• How to compute and combine scoring values?

• Process example

• Performance scorecard

• Process scorecard

• Support system scorecard

• Top level scorecard

• Modify scorecard for design changes

Page 4: 09 Design Scorecard(BP)


Why Use a DFSS Scorecard?

• Show all critical elements of a design process and their performance

• Enable communication among all stakeholders

• Record design progress and store learning process

• Predict final quality, recognize gaps in order to improve it

• Assess current design quantitatively

• Locate areas of improvement

• Optimize design

• Evaluate service performance against client requirements

• Illustrate VOP - mean, variance, defects

• Illustrate VOC - specifications, targets

• Statistically measure risk in how the DFSS design is fulfilling design intent and client needs

• Infrastructure and sub-assembly quality levels

Page 5: 09 Design Scorecard(BP)


What is a DFSS Scorecard?

• A unified approach to visualize design fulfillment:

• Service performance

• Process performance

• Support Systems performance

• Series of linked worksheets to bridge client requirements with service and process performance at all stages of the design process

• A living document, revised regularly

• A tool to gauge the health of the design

Page 6: 09 Design Scorecard(BP)


SIPOC and DFSS Scorecards

Supplier → Input → Process → Output → Client

• Performance Scores: Client Perspective

• Process Scores: Company Perspective

• Support Systems Scores: Infrastructure Perspective

• Top Level Scores: Overall Multi-Functional Design Perspective

Page 7: 09 Design Scorecard(BP)


Design Process - Recap from Intro

• DFSS is an iterative process

• When flowing down from I to V, the process is called CTQ flow-down:

• Identify client needs

• Define specifications

• Predict performance

• Use successive HOQs

• When the design is verified against CTQs, it is called capability flow-up:

• Evaluate design performance

• Locate capability gaps

• Use process capability analysis

• Display on scorecard

[Diagram: Identify → Design → Optimize → Validate; requirement flow-down runs forward through the phases, capability flow-up runs back]

Page 8: 09 Design Scorecard(BP)


QFD and Business Process Scorecard

[Diagram: four linked Houses of Quality and the scorecards they feed]

• HOQ 1: Client Needs (Whats) × CTS Measures (Hows) - Establish CTS

• HOQ 2: CTS Measures × Functions - Select Functions

• HOQ 3: Critical Functions × High Level Design Processes - Create High Level Design

• HOQ 4: Critical Processes × Detailed Design Infrastructure / Support Systems - Build Detail Design

• The Performance, Process and Support Systems Scorecards attach along this chain

• Requirement flow-down runs HOQ 1 → HOQ 4; capability flow-up (Response against Specs) runs HOQ 4 → HOQ 1

Page 9: 09 Design Scorecard(BP)


General Structure of Scorecards

• Establish scorecards parallel to design steps (client first)

• Determine specifications at respective DFSS steps

• Collect product/service response from lower levels (pilot, simulation)

• Compute proposed design adequacy, performance levels and risks

• Review to locate performance gaps and take necessary actions

Design Process Scorecard Progression

• Establish CTS → Select Functions → Create High Level Design → Build Detail Design

• The Performance, Process and Support Systems Scorecards attach at successive steps

• At each step the design Response is compared against the Specs (LSL, USL)

• Requirement flow-down runs forward through the steps; capability flow-up runs back

Page 10: 09 Design Scorecard(BP)


DFSS Business Process Scorecard Contents

• Performance Scorecard

• Contains critical service performance CTSs

• Assures design meets client requirements

• Process Scorecard

• Measures top level processes and sub-processes

• Computes process capability

• Support Systems Scorecard

• Scores infrastructure and support activities used in the process

• Identifies high quality suppliers

• Top Level Design Summary Scorecard

• Combines all the above scores

• Used to locate problem areas and improvement opportunities

Page 11: 09 Design Scorecard(BP)


Scorecard Stakeholders

• Stakeholders can be internal or external

• Internal clients

• Clients who buy services

• Strategic partners

• Design reviewers

• Tollgate participants

Page 12: 09 Design Scorecard(BP)


When to develop scorecards?

[Timeline: Client Needs → Conceptual Design → Preliminary Design → Detail Design → Pilot/Prototype → Pre-Launch → Launch, with Tollgates 1 through 7 separating the phases; the phases span Identify, Design, Optimize and Validate]

• Define structure of Performance Scorecard

• Define structure of Process Scorecard

• Define structure of Support System Scorecard

• Fill in Performance Scorecard

• Fill in Process Scorecard

• Fill in Support System Scorecard

Page 13: 09 Design Scorecard(BP)


When to review Design Scorecards?

• Regular reviews are key for successful projects and should be included in the project schedule

• There are several levels of review:

• Tollgate reviews

• Weekly reviews

• Daily reviews

• In addition, design projects have three unique reviews:

• Concept review

• High level design review

• Pre-pilot review

Scorecard should be updated regularly and available for ad-hoc review 

Page 14: 09 Design Scorecard(BP)


Scorecard Questions to Consider 

The scorecard addresses and facilitates resolution of the following questions throughout the design process:

• What are the client needs and expectations?

• What are the current design capabilities with respect to

• Service performance

• Process capability and

• Support system quality

• What is the current design performance?

• Have we included all stakeholders?

• What are the opportunities for a robust design?

• Are there any gaps between reality and prediction?

• Are there any unintended consequences?

Page 15: 09 Design Scorecard(BP)


Sigma Services Case Study

• The manufacturer of the successful SigmaScoot has a vibrant service delivery side called Sigma Services

• Sigma Services handles orders, billing and client service for a whole

range of Sigma products

• Consequently, Sigma Services clients vary in their age groups, lifestyles and needs

• Sigma Services can take orders by mail, by phone and by e-mail

• In this example we will consider scorecards for an existing design

Page 16: 09 Design Scorecard(BP)


Sigma Services Design Example

CTS Performance Level: Ease of Placement, Order Status Inquiry, On-time Delivery

Functional Level: Place Orders, Manage Orders, Fulfill Orders

High Level Process / Detailed Process Level:

• Place Orders: Enter Order → Check Errors → Confirm Order → Transmit Order

• Manage Orders: Receive Order → Handle Conflicts → Queue Delivery → Track Status → Communicate Status

• Fulfill Orders: Schedule Shipment → Fill Orders → Deliver Shipment

Support System Level: Client Data, Order Data, Supplies Data, Shipment Data

Page 17: 09 Design Scorecard(BP)


DFSS Scorecards and SIPOC

Supplier → Input → Process → Output → Client

• Performance Scores: Client Perspective

• Process Scores: Company Perspective

• Support System Scores: Infrastructure Perspective

• Top Level Scores: Overall Multi-Functional Design Perspective

[Figure: normal distribution with LSL and USL marked, showing Z LSL and Z USL]

Page 18: 09 Design Scorecard(BP)


Performance Scorecard

• The DFSS Process Performance Scorecard contains the following data:

• CTS Description - what is important to the client

• VOC - Voice of the Client, in terms of targets and spec limits

• VOP - Voice of the Design Process (μ, σ, or DPU)

• Performance - how VOP is meeting VOC, in Z and DPM units

• At a glance it reveals the design weaknesses (low RTY values)

• e.g. Which CTS is performing the best?

• It also provides an overall design performance level

Performance Scorecard (CTS | VOC | Design Output | Performance)

Parameter                  Metric Unit  Data Type  Target  USL  LSL  LT/ST  Defect Rate  Mean  Std. Dev  Z USL  Z LSL  Yield RTY
Delivery Cycle Time        Day         Cont          -      3    1   LT     4.55E-02     2.00  0.50      2.00   2.00   0.954
Order Defects              -           Disc          0      -    -   LT     2.40E-01     -     -         -      -      0.760
Order Efficiency           -           Disc        100%     -    -   LT     2.21E-01     -     -         -      -      0.779
Processing Cost per Order  $           Cont         12     13    0   LT     1.99E-01     7.30  5.00      1.14   1.46   0.801
Overall                                                                     5.48E-01                                   0.452

Page 19: 09 Design Scorecard(BP)


Purpose of Performance Scorecard

• Evaluate the extent to which the current design fulfills critical client requirements - CTSs

• Estimate how design output meets client needs (VOC) under usual design and operational variations

• Combine various CTS scores under a single design score

• Discrete (DPU or DPMO) data

• Continuous data (with mean and standard deviation)

• Highlight areas for improvement

• Evaluate alternatives to optimize or improve designs

Page 20: 09 Design Scorecard(BP)


Sources of Data for Performance Scorecard

• The VOC data comes from client analysis

• Client segmentation

• Client complaints / surveys / warranty issues etc

• QFD (1st HOQ)

• Benchmarking

• Kano surveys

• The VOP data comes from design model

• Test designs

• Pilot study - a study of similar services

• QFD (2nd HOQ), benchmark data

• Design of Experiments (DOE)

• Mathematical and simulation models

Page 21: 09 Design Scorecard(BP)


CTS Requirements and VOC

• Review all client CTSs and select the key ones

• Identify and characterize all critical parameters for these CTSs

• Make sure the CTSs have robust metrics (with MSA)

• Obtain target and specification limits for continuous data

• Obtain reliable defect definitions for discrete data

Performance Scorecard (CTS | VOC)

Parameter                  Metric Unit  Data Type  Target  USL  LSL
Delivery Cycle Time        Day         Cont          -      3    1
Order Defects              -           Disc          0      -    -
Order Efficiency           -           Disc        100%     -    -
Processing Cost per Order  $           Cont         12     13    0

Page 22: 09 Design Scorecard(BP)


Design Output

• Obtain data on how the CTS parameters behave under design conditions

• Usually obtained from existing prototypes

• In some cases the design data is derived from mathematical simulations

• Benchmarking is another source of surrogate data

• From specially designed experiments

• Continuous data (with μ and σ) requires a smaller sample size

• Design output values for Sigma Services for the selected CTSs:

Performance Scorecard (CTS | VOC | Design Output)

Parameter                  Metric Unit  Data Type  Target  USL  LSL  LT/ST  Defect Rate  Mean  Std. Dev
Delivery Cycle Time        Day         Cont          -      3    1   LT     4.55E-02     2.00  0.50
Order Defects              -           Disc          0      -    -   LT     2.40E-01
Order Efficiency           -           Disc        100%     -    -   LT     2.21E-01
Processing Cost per Order  $           Cont         12     13    0   LT     1.99E-01     7.30  5.00

Page 23: 09 Design Scorecard(BP)


How to Gather Design Output Data?

• Recall that the Performance Scorecard needs both VOC and design output data

• How did we get the design output data - e.g. the delivery cycle time?

• Usually it can be obtained from simulations, transfer functions, pilot studies, past histories or benchmark studies

• In another module we will explore how to obtain it analytically using simulations, transfer functions and DOEs

Page 24: 09 Design Scorecard(BP)


Performance Statistics

• These statistics primarily indicate to what extent the design intent is fulfilled, in probabilistic terms

• The defect rate - the area of the distribution falling outside the spec limits - is the probability of failure

• Z values are calculated from this probabilistic estimation

• Z values enable comparison of continuous and discrete data in probabilistic terms

[Figure: normal distribution with LSL and USL marked, showing Z LSL and Z USL]

Page 25: 09 Design Scorecard(BP)


Process Performance - Z Scores and DPMs

• Z USL = (USL - μ)/σ    Z LSL = (μ - LSL)/σ

• USL and LSL are the Upper and Lower Spec Limits from VOC

• μ = mean and σ = standard deviation from the existing/new design

• Usually the long-term σ is used for an existing design in the market; the short-term σ is appropriate when experiments are done in a controlled fashion under design conditions

• Use LT/ST consistently throughout

Example - CTS: Delivery (days), with USL = 3, LSL = 1, μ = 2, σ = 0.5:

Z USL = (3 - 2)/0.5 = 2  →  22,750 DPM
Z LSL = (2 - 1)/0.5 = 2  →  22,750 DPM
Total DPM = 45,500
RTY = 1 - defect rate = 95.45%
Z Bench = 1.69
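The delivery example above can be reproduced with a short Python sketch using the standard library's `statistics.NormalDist` (variable names here are illustrative, not from the slides):

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal distribution

# VOC spec limits and VOP (design) statistics for Delivery (days)
usl, lsl, mu, sigma = 3.0, 1.0, 2.0, 0.5

z_usl = (usl - mu) / sigma          # (3 - 2)/0.5 = 2.0
z_lsl = (mu - lsl) / sigma          # (2 - 1)/0.5 = 2.0

# Tail areas outside each spec limit, scaled to defects per million
dpm_usl = (1 - nd.cdf(z_usl)) * 1e6   # ~22,750
dpm_lsl = (1 - nd.cdf(z_lsl)) * 1e6   # ~22,750
total_dpm = dpm_usl + dpm_lsl          # ~45,500

rty = 1 - total_dpm / 1e6              # ~0.9545
z_bench = nd.inv_cdf(rty)              # ~1.69

print(z_usl, z_lsl, round(total_dpm), round(rty, 4), round(z_bench, 2))
```

Z Bench folds both tails into a single equivalent one-sided Z, which is what lets continuous and discrete CTSs be compared on one scale.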

Page 26: 09 Design Scorecard(BP)


Performance Calculations: Exercise

• Calculate performance metrics for Processing Cost

• Assume normal distribution

• Calculate Z USL and Z LSL

• Compute Z Bench and RTY

• Compare your answers with Scorecard results

Performance Scorecard (CTS | VOC | Design Output | Performance)

Parameter                  Metric Unit  Data Type  Target  USL  LSL  LT/ST  Defect Rate  Mean  Std. Dev  Z USL  Z LSL  Yield RTY
Delivery Cycle Time        Day         Cont          -      3    1   LT     4.55E-02     2.00  0.50      2.00   2.00   0.954
Order Defects              -           Disc          0      -    -   LT     2.40E-01     -     -         -      -      0.760
Order Efficiency           -           Disc        100%     -    -   LT     2.21E-01     -     -         -      -      0.779
Processing Cost per Order  $           Cont         12     13    0   LT     1.99E-01     7.30  5.00      1.14   1.46   0.801
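One way to check your exercise answers for the Processing Cost row (assuming normality, with USL 13, LSL 0, mean 7.30 and standard deviation 5.00 taken from the scorecard) is a short sketch:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal distribution

# Processing Cost per Order: VOC limits and design output from the scorecard
usl, lsl, mu, sigma = 13.0, 0.0, 7.30, 5.00

z_usl = (usl - mu) / sigma   # (13 - 7.3)/5 = 1.14
z_lsl = (mu - lsl) / sigma   # (7.3 - 0)/5  = 1.46

# Yield = probability of landing inside [LSL, USL]
yield_ = nd.cdf(z_usl) - nd.cdf(-z_lsl)   # ~0.801
defect_rate = 1 - yield_                   # ~1.99E-01
z_bench = nd.inv_cdf(yield_)               # equivalent one-sided Z

print(round(z_usl, 2), round(z_lsl, 2), round(yield_, 3), round(defect_rate, 3))
```

The yield and defect rate should match the 0.801 and 1.99E-01 shown in the scorecard row.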

Page 27: 09 Design Scorecard(BP)


Probabilistic Combination of Defects

Quick probability quiz:

• If the probability of failure for the Sigma Service delivery schedule is 2% and the probability of order defects is 1%, what is the probability of Sigma Service failing to meet client needs? (Hint: RTY)

• 0.0004%

• 0.02%

• 2.98%

• 3.00%

• 5.87%

• 6.00%

• 8.88%

• None of the above

Page 28: 09 Design Scorecard(BP)


Probabilistic Combination of Defects

If the probability of failure for the Sigma Service delivery schedule is 2% and the probability of creating a defective order is 1%, what is the risk of Sigma Service failing to meet client needs?

• P(no delay) = 1 - 0.02 = 0.98

• P(no defects) = 1 - 0.01 = 0.99

• P(no defectives) = RTY = 0.98 × 0.99 = 0.9702

• P(risk of failure) = 1 - 0.9702 = 0.0298 ≈ 3%

• This probabilistic principle will be used for combining defects from multiple sources
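The same arithmetic generalizes to any number of independent defect sources; a minimal sketch (the two failure probabilities are the slide's quiz values):

```python
import math

# Per-source failure probabilities: delivery delay, defective order
p_fail = [0.02, 0.01]

# Rolled throughput yield: every source must succeed (independence assumed)
rty = math.prod(1 - p for p in p_fail)   # 0.98 * 0.99 = 0.9702
risk = 1 - rty                            # ~0.0298, i.e. about 3%

print(round(rty, 4), round(risk, 4))
```

Note the independence assumption: if delays and order defects were correlated, the simple product would over- or under-state the true risk.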

Page 29: 09 Design Scorecard(BP)


Performance Calculations

Performance Scorecard (CTS | VOC | Design Output | Performance)

Parameter                  Metric Unit  Data Type  Target  USL  LSL  LT/ST  Defect Rate  Mean  Std. Dev  Z USL  Z LSL  Yield RTY
Delivery Cycle Time        Day         Cont          -      3    1   LT     4.55E-02     2.00  0.50      2.00   2.00   0.954
Order Defects              -           Disc          0      -    -   LT     2.40E-01     -     -         -      -      0.760
Order Efficiency           -           Disc        100%     -    -   LT     2.21E-01     -     -         -      -      0.779
Processing Cost per Order  $           Cont         12     13    0   LT     1.99E-01     7.30  5.00      1.14   1.46   0.801
Overall                                                                     5.48E-01                                   0.452

Overall RTY= 0.954 x 0.760 x 0.779 x 0.801 = 0.452
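The roll-up above is just the product of the per-CTS yields; a quick sketch to verify the scorecard's overall row:

```python
import math

# Per-CTS yields from the Performance Scorecard
yields = {
    "Delivery Cycle Time": 0.954,
    "Order Defects": 0.760,
    "Order Efficiency": 0.779,
    "Processing Cost per Order": 0.801,
}

overall_rty = math.prod(yields.values())   # ~0.452
overall_defect_rate = 1 - overall_rty      # ~5.48E-01

print(round(overall_rty, 3), round(overall_defect_rate, 3))
```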

Page 30: 09 Design Scorecard(BP)


Interpreting Performance Scorecard

•  A good scorecard facilitates dialog to improve design

• Enables a disciplined design iteration

• Following questions will help interpret the scorecard:

• Are the individual and overall RTY values competitive with respect to benchmarks?

• What are the main drivers for the current level of performance?

• Which CTSs perform best? Why? Which worst? Why?

• How can we reach the design entitlement level?

• What are the design tradeoffs to improve overall performance?

• Have the CTSs been validated?

Page 31: 09 Design Scorecard(BP)


Inquiring Performance Scorecard - Exercise

Take 5 minutes to answer the following Performance Scorecard questions:

• Are the individual and overall RTY values competitive?

• What are the main drivers for the current level of performance?

• Which CTQs perform best? Why? Which worst? Why?

• What are the design tradeoffs to improve overall design performance?

•  Are these CTS really critical to our clients and stakeholders?

Page 32: 09 Design Scorecard(BP)


Performance Scorecard Checklist

• Start from QFD to list client needs first

• Identify, qualify and prioritize client requirements

• Determine metrics and performance levels for the CTSs

• Perform measurement system analysis on key metrics

• Establish common terminology and operational definitions

• Examine assumptions

• normality, units, data types, distributions, etc.

• continuous data is not always normal

• Collect design data from pilots or models

• Compute performance statistics

• Interpret the scorecard as a team (in real life, ALL client requirements must be listed and prioritized)

• Design is an iterative process so revisit scorecard as needed

Page 33: 09 Design Scorecard(BP)


Performance Scorecard - Project Exercise

Using the principles learned in this module, for your project:

• Select some key client requirements (CTSs)

• Determine metrics and performance levels for the CTSs

• Use some design data from pilots or models

• If the data is unavailable, for now use some representative data - in other words, guesstimate

• Compute design performance statistics using the templates

• Interpret scorecard results

Page 34: 09 Design Scorecard(BP)


SIPOC and DFSS Scorecards

Supplier → Input → Process → Output → Client

• Performance Scores: Client Perspective

• Process Scores: Company Perspective

• Support Systems Scores: Infrastructure Perspective

• Top Level Scores: Overall Multi-Functional Design Perspective

Page 35: 09 Design Scorecard(BP)


Sigma Service Processes

Functional Level: Place Orders, Manage Orders, Fulfill Orders

High Level Process / Detailed Processes:

• Place Orders: Enter Order → Check Errors → Confirm Order → Transmit Order

• Manage Orders: Receive Order → Handle Conflicts → Queue Delivery → Track Status → Communicate Status

• Fulfill Orders: Schedule Shipment → Fill Orders → Deliver Shipment

Page 36: 09 Design Scorecard(BP)


DFSS Process Scorecard Hierarchy

• The DFSS Process Scorecard hierarchy reflects the actual processes and steps performed to produce the designed services

• Summary scorecard for all the processes is at the top

• Scorecards for sub-process steps stem from the top level

• Process sigma scores are computed at all levels

Order Placement Process - Scorecard Parameters: Placement Time, Rework, Placement Abandon Rate, Placement Cost/Order, Order Status Accessibility, Daily Facility Availability

Order Management Process - Scorecard Parameters: Scheduling Time, Rework, Scheduling Abandon Rate, Scheduling Cost/Order, Inquiries resolved correctly RFT, Exception Error Rate

Order Fulfillment Process - Scorecard Parameters: Delivery Time, Rework, Abandoned Deliveries, Delivery Cost/Order, Orders delivered correctly RFT

Process Summary Scorecard

Major Process      # times applied  # of Params  Defect Rate  Yield RTY
Order Placement          1               6          0.745       0.255
Order Management         1               6          0.212       0.788
Order Fulfillment        1               5          0.532       0.468
Total                    3              17          0.906       0.094

Page 37: 09 Design Scorecard(BP)


DFSS Process Summary Scorecard

• The DFSS Process Scorecard shows the performance of key processes and sub-processes used for delivering the designed services. It contains:

• Top level performance of critical processes and details

• Specification and characterization of key process steps

• Cumulative performance scores of key processes

• Quickly reveals weak processes and improvement opportunities

Process Summary Scorecard

Major Process      # times applied  # of Params  Defect Rate  Yield RTY
Order Placement          1               6          0.745       0.255
Order Management         1               6          0.212       0.788
Order Fulfillment        1               5          0.532       0.468
Total                    3              17          0.906       0.094
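The summary row follows the same rolled-throughput rule used for the Performance Scorecard: multiply the three process RTYs. A quick check:

```python
import math

# RTY of each major process, taken from its own process scorecard
process_rty = {
    "Order Placement": 0.255,
    "Order Management": 0.788,
    "Order Fulfillment": 0.468,
}

summary_rty = math.prod(process_rty.values())   # ~0.094
summary_defect_rate = 1 - summary_rty           # ~0.906

print(round(summary_rty, 3), round(summary_defect_rate, 3))
```

The low overall RTY (about 9.4%) is what makes the weak processes jump out for improvement.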

Page 38: 09 Design Scorecard(BP)


Uses of Process Sigma Scorecard

• A data-driven communication medium to evaluate, prioritize and optimize design processes

• Evaluate existing or proposed processes that are critical

• Compute the defect levels of top-level and intermediate process operations

• Facilitate the concurrent enterprise practice

• Align process analysis with service design

• Pinpoint potential process problem areas

• Determine how proposed designs impact overall performance

Page 39: 09 Design Scorecard(BP)


Source Data for Process Scorecard

• Comprehensive process maps for existing and proposed processes

• Standard operating procedures (SOP)

• QFD HOQs - especially second house

• Policies and Procedures

• Process cause and effect matrices (C&E)

• Process Failure Mode Effects Analyses (PFMEA)

• Process data from the existing measurement systems

• Past and current data for the processes

• Specs, current defect levels, service volume

• Labor and material requirements, costing information

• COPQ estimates for the processes

Page 40: 09 Design Scorecard(BP)


Process Parameter Selection

• Develop critical process maps and steps

• Develop top level and detailed process maps

• Use QFD, C&E Matrix, FMEA to prioritize processes

• Select critical processes and steps

• Identify one or more critical parameters for selected process steps

• These process parameters are proxies to process behavior 

• Establish metrics, units and specifications (Target, USL and LSL)

Order Placement Process - Process Scorecard (Process Specs)

Parameter                    Metric Unit  Data Type  Target  USL  LSL
Placement Time               Hours       Cont          -      4    0
Rework                       %           Disc          0      -    -
Placement Abandon Rate       DPU         Disc          0      -    -
Placement Cost/Order         $           Cont         0.75    1    -
Order Status Accessibility   %           Disc         75      -   60
Daily Facility Availability  Hours       Cont         20     24   16

Page 41: 09 Design Scorecard(BP)


Caveats on Selecting Process Metrics

• Seek physical variables instead of yield or defect attributes

• Identify specification carefully

• Align them with service and client requirements

• If not possible, consider the next process as a client

• Be sure to perform measurement system analysis on key metrics

• Beware of tacit assumptions, e.g. normality and stable conditions, long-term vs. short-term processes, etc.

• When using defect opportunities, include only active opportunities; do not inflate opportunities lest they mask real problems

Page 42: 09 Design Scorecard(BP)


Process Scorecard

• For each parameter determine metrics, specs (target, USL, LSL)

• Process parameter specifications are obtained from pilots, DOEs, process data, past records, and also by simulation

• Usually represented in terms of mean and standard deviation

• Make necessary adjustments with respect to LT or ST data

• From parameter specifications and design output, calculate performance metrics (Z, RTY etc.) for each process step

• The calculations are similar to Performance Scorecard calculations

Order Placement Process - Process Scorecard (Process Specs | Design Output | Performance)

Parameter                    Metric Unit  Data Type  Target  USL  LSL  LT/ST  Defect Rate  Mean  Std. Dev  Z USL  Z LSL  Yield RTY
Placement Time               Hours       Cont          -      4    0   LT     1.34E-01     3     0.9       1.11   3.33   0.866
Rework                       %           Disc          0      -    -   LT     1.00E-01                                   0.900
Placement Abandon Rate       DPU         Disc          0      -    -   LT     0.00E+00                                   1.000
Placement Cost/Order         $           Cont         0.75    1    -   LT     3.70E-01     0.83  0.51      0.33          0.630
Order Status Accessibility   %           Disc         75      -   60   LT     1.00E-01                                   0.900
Daily Facility Availability  Hours       Cont         20     24   16   LT     4.24E-01     20    5         0.80   0.80   0.576
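The continuous rows can be checked just like the Performance Scorecard rows; one wrinkle here is one-sided specs (e.g. Placement Cost/Order has only a USL). A minimal sketch, assuming normality - `cont_yield` is a hypothetical helper, and a missing limit simply contributes no tail defects:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal distribution

def cont_yield(mu, sigma, usl=None, lsl=None):
    """Yield for a normally distributed parameter with optional one-sided limits."""
    y = 1.0
    if usl is not None:
        y -= 1 - nd.cdf((usl - mu) / sigma)   # subtract upper-tail defects
    if lsl is not None:
        y -= 1 - nd.cdf((mu - lsl) / sigma)   # subtract lower-tail defects
    return y

# Placement Time: USL 4, LSL 0, mean 3, sd 0.9       -> ~0.866
# Placement Cost/Order: USL 1 only, mean 0.83, sd 0.51 -> ~0.63
print(round(cont_yield(3, 0.9, usl=4, lsl=0), 3),
      round(cont_yield(0.83, 0.51, usl=1), 2))
```

Small differences in the last digit against the scorecard can occur, since the slide values appear to be rounded.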

Page 43: 09 Design Scorecard(BP)


Process Scorecard - Exercise

• Select main processes (2 to 4) for your project

• Identify critical process parameters for each of the main processes

• Determine targets and spec limits for the parameters

Page 44: 09 Design Scorecard(BP)


Process Scorecards for Key Processes

Place Orders (Functional Level)

High Level Process / Detailed Processes: Enter Order → Check Errors → Confirm Order → Transmit Order

Order Placement Process - Process Scorecard: rows as on the previous page, rolling up to an overall Defect Rate of 7.45E-01 and RTY of 0.255

Page 45: 09 Design Scorecard(BP)


Process Scorecards for Key Processes

Manage Orders (Functional Level)

High Level Process / Detailed Processes: Receive Order → Handle Conflicts → Queue Delivery → Track Status → Communicate Status

Order Management Process - Process Scorecard (Process Specs | Design Output | Performance)

Parameter                         Metric Unit  Data Type  Target  USL   LSL  LT/ST  Total DPU  Mean  Std. Dev  Z USL  Z LSL  Yield RTY
Scheduling Time                   Hours       Cont          -      8     0   LT     7.66E-03   4     1.5       2.67   2.67   0.992
Rework                            %           Disc          0      -     -   LT     8.00E-02                                 0.920
Scheduling Abandon Rate           DPU         Disc          0      -     -   LT     0.00E+00                                 1.000
Scheduling Cost/Order             $           Cont          -     0.15   -   LT     4.01E-02   0.08  0.04      1.75          0.960
Inquiries resolved correctly RFT  %           Disc        100      -     -   LT     1.00E-01                                 0.900
Exception Error Rate              DPU         Disc          0      -     -   LT     1.00E-03                                 0.999
Overall                                                                      2.12E-01                                        0.788

Page 46: 09 Design Scorecard(BP)


Process Scorecards for Key Processes

Fulfill Orders (Functional Level)

High Level Process / Detailed Processes: Schedule Shipment → Fill Orders → Deliver Shipment

Order Fulfillment Process - Process Scorecard (Process Specs | Design Output | Performance)

Parameter                       Metric Unit  Data Type  Target  USL  LSL  LT/ST  Total DPU  Mean  Std. Dev  Z USL  Z LSL  Yield RTY
Delivery Time                   Hours       Cont          -     12    0   LT     2.09E-01   9     3.6       0.83   2.50   0.791
Rework                          %           Disc          0      -    -   LT     5.00E-02                                 0.950
Abandoned Deliveries            DPU         Disc          0      -    -   LT     0.00E+00                                 1.000
Delivery Cost/Order             $           Cont          -      7    -   LT     3.09E-01   6.00  2.00      0.50          0.691
Orders delivered correctly RFT  %           Disc        100      -    -   LT     1.00E-01                                 0.900
Overall                                                                   5.32E-01                                        0.468

Page 47: 09 Design Scorecard(BP)


Process Scorecard Exercise

• Divide into three teams - each team selects a process

• For your scorecard verify RTY values and calculate Z-Bench

• Verify overall RTY

• List what you learned from this exercise

•  

•  

•  

Page 48: 09 Design Scorecard(BP)


Interpreting Process Scorecard - Exercise

• Have we included all key processes and sub-processes?

• What is the impact of process performance on design intent?

• How does the process scorecard align with the performance scorecard?

• Is overall process performance sufficient to meet design intent?

• How do the key processes differ?

• Which processes are performing the best? Worst?

• How to improve design process performance?

• What are the possible tradeoffs to improve?


Alignment of Process Parameters and Performance CTSs

Design Performance CTSs (matrix columns): Delivery Cycle Time, Order Defects, Order Efficiency, Processing Cost per Order

Key Processes and Parameters (an X marks the CTS each parameter drives):

Order Placement Process

Placement Time X

Rework X

Placement Abandon Rate X

Placement Cost/Order  X

Order Status Accessibility

Daily Facility Availability

Order Management Process

Scheduling Time X

Rework X

Scheduling Abandon Rate X

Scheduling Cost/Order  X

Inquiries resolved correctly RFT

Exception Error Rate X

Order Fulfillment Process

Delivery Time X

Rework X

 Abandoned Deliveries X

Delivery Cost/Order  X

Orders delivered correctly RFT X


Process Scorecard - Project Exercise

Develop a scorecard outline for the critical processes in your project

• From QFD or FMEA map critical processes and steps

• Identify critical proxy parameters (CTPs) for selected process steps

• Choose metrics, units and specifications (Target, USL and LSL)

• For each CTP determine voice of the process

• From process parameter specifications and VOP, calculate

performance metrics (Z, RTY, etc.) for each process step

• Repeat for each major process

• Outline a process summary scorecard


SIPOC and DFSS Scorecards

Supplier Input Process Output Client

Performance Scores: Client Perspective

Process Scores: Company Perspective

Support Systems Scores: Infrastructure Perspective

Top Level Scores: Overall Multi-Functional Design Perspective


Detailed Design Phase

• To fulfill the functions, processes employ several infrastructure support systems - they could be internal or external:

• Help desk Call Center Systems

• Communication Systems

• Human Resources Systems

• Information Systems

• Facilities Management systems

• Transportation Systems

• Purchasing / Sourcing Systems

• Client Service Systems


Sigma Service - Support Systems

Some support systems used in Sigma Service are shown here: Call Center Systems, Information Systems, and Transportation Systems

Can you identify some other infrastructure support systems?

[Diagram: Sigma Service process map - Place Orders (Enter Order, Check Errors, Confirm Order, Transmit Order), Manage Orders (Schedule Shipment, Queue Delivery, Handle Conflicts, Track Status, Communicate Status), and Fulfill Orders (Receive Order, Fill Orders, Deliver Shipment), linked by the Client Data, Order Data, Supplies Data, and Shipment Data stores]


Order Placement Process - Support Systems

Order Placement = Order Acceptance + Order Entry & Processing, followed by Order Transmission to the Fulfillment System

Order Acceptance

• Automated Orders: Web Ordering, Telephone Ordering

• Manual Orders: Agent Ordering by Telephone, Mail Ordering, Fax Ordering, E-Mail Ordering

Order Entry & Processing

• Automated Entry & Processing: Web Entry, Telephone Entry

• Manual Entry & Processing: Agent Entry and Processing for all, Mail Ordering, Fax Ordering, E-Mail Ordering


DFSS Support System Scorecard

ORDER PLACEMENT SYSTEM

# Orders Defects DPU RTY Zlt Zst

Automatic Systems 400 19 4.8% 95.4% 1.681 3.181

Manual Systems 400 26 6.5% 93.7% 1.531 3.031

Totals 800 45 5.6% 94.5% 1.601 3.101

ORDER PLACEMENT Order Acceptance System

# Orders Defects DPU RTY Zlt Zst

 Automatic Acceptance 400 16 4.0% 96.1% 1.760 3.260

Manual Acceptance 400 18 4.5% 95.6% 1.706 3.206

Totals 800 34 4.3% 95.8% 1.732 3.232

ORDER PLACEMENT Automatic Acceptance System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 6 3.0% 97.0% 1.887 3.387

Telephone Ordering 200 10 5.0% 95.1% 1.657 3.157

Totals 400 16 4.0% 96.1% 1.760 3.260

ORDER PLACEMENT Manual Acceptance System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 10 10.0% 90.5% 1.310 2.810

Mail Orders 30 1 3.3% 96.7% 1.841 3.341

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 5 3.3% 96.7% 1.841 3.341

Totals 400 18 4.5% 95.6% 1.706 3.206

ORDER PLACEMENT Order Entry System

# Orders Defects DPU RTY Zlt Zst

 Automatic Entry 400 3 0.8% 99.3% 2.434 3.934

Manual Entry 400 8 2.0% 98.0% 2.058 3.558

Totals 800 11 1.4% 98.6% 2.207 3.707

ORDER PLACEMENT Automatic Entry System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 1 0.5% 99.5% 2.577 4.077

Telephone Ordering 200 2 1.0% 99.0% 2.328 3.828

Totals 400 3 0.8% 99.3% 2.434 3.934

ORDER PLACEMENT Manual Entry System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 2 2.0% 98.0% 2.058 3.558

Mail Orders 30 2 6.7% 93.6% 1.518 3.018

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 2 1.3% 98.7% 2.219 3.719

Totals 400 8 2.0% 98.0% 2.058 3.558


Purpose of Support Systems Scorecard

• Evaluate the extent to which support systems fulfill design intent and consequently client CTSs

• Summarize systems level quality

• Estimate how support system performance fulfills design needs

under operational variations

• Combine various support systems under a single design score

• Discrete (DPU or DPMO) data

• Continuous data (with mean and standard deviation)

• Statistically highlight the support systems needing improvement

• Test alternative support systems to optimize or improve designs

• Facilitate dialog between designers and support system suppliers


Support Systems List

• A comprehensive support system list is the prerequisite for the

scorecard

• Support system list contains information on support systems and

subsystems:

• Support system Name, Number and Source (Supplier)

• Needed number of support systems or sub-assemblies or quantity

• The defect levels of support systems

• Total defects = defects/system X number of systems

• DPMO at the sub-assembly level will be carried up to the higher levels


Sources of Data for Systems Scorecard

• Data on incoming parts and support systems

• Inspection data and past experiences

• Experience and data from similar support systems

• Purchasing department and supplier data

• Data from external agencies 

• Complaints / surveys / warranty issues, etc.

• Benchmarking and simulations

• The specifications for the systems come from

• Design team: system descriptions and specifications

• Pilot study - a study of a similar support system

• Test designs and Design of Experiments (DOE)

• Mathematical and simulation models


Support System Scorecard - Checklist

• Include support systems and services that impact CTSs

• Start with top level functional descriptions

• Identify support systems and sub systems

• Collect or determine sub-system level DPUs

• Identify critical system quality metrics

• These could be given a priori as DPU levels

• May have to be computed from key system characteristics

• Perform measurement system analysis on these key measures

• Compute DPMO contribution for each subsystem

• Carry the DPMO as the sub-assembly DPU to next higher level


Top Level Support System Scorecard

• Sigma Service Order Placement System Scorecard

• Note there are two parallel channels to support this system

• For the automatic system there are 19 defects for 400 orders

• Number of orders and defects came from lower levels

• Therefore DPU = 19/400 = 0.048, RTY = e^(-0.048) = 95.4%, Zlt = 1.681

• Similar statistics can be prepared for each sub-system and summarized as shown

• Let us drill down into this

ORDER PLACEMENT SYSTEM

# Orders Defects DPU RTY Zlt Zst

 Automatic Systems 400 19 4.8% 95.4% 1.681 3.181

Manual Systems 400 26 6.5% 93.7% 1.531 3.031

Totals 800 45 5.6% 94.5% 1.601 3.101
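The drill-down arithmetic above can be sketched in a few lines, assuming RTY = e^(-DPU) and the conventional 1.5-sigma long-term/short-term shift:

```python
from math import exp
from statistics import NormalDist

orders, defects = 400, 19          # Automatic Systems row
dpu = defects / orders             # defects per unit, ~0.048
rty = exp(-dpu)                    # rolled-throughput yield, ~95.4%
z_lt = NormalDist().inv_cdf(rty)   # long-term Z, ~1.681
z_st = z_lt + 1.5                  # short-term Z with the 1.5-sigma shift

print(f"DPU={dpu:.3f}  RTY={rty:.1%}  Zlt={z_lt:.3f}  Zst={z_st:.3f}")
```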


Support System Scorecard - 2nd Level

ORDER PLACEMENT SYSTEM

# Orders Defects DPU RTY Zlt Zst

Automatic Systems 400 19 4.8% 95.4% 1.681 3.181

Manual Systems 400 26 6.5% 93.7% 1.531 3.031

Totals 800 45 5.6% 94.5% 1.601 3.101

ORDER PLACEMENT Order Acceptance System

# Orders Defects DPU RTY Zlt Zst

 Automatic Acceptance 400 16 4.0% 96.1% 1.760 3.260

Manual Acceptance 400 18 4.5% 95.6% 1.706 3.206

Totals 800 34 4.3% 95.8% 1.732 3.232

ORDER PLACEMENT Automatic Acceptance System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 6 3.0% 97.0% 1.887 3.387

Telephone Ordering 200 10 5.0% 95.1% 1.657 3.157

Totals 400 16 4.0% 96.1% 1.760 3.260

ORDER PLACEMENT Manual Acceptance System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 10 10.0% 90.5% 1.310 2.810

Mail Orders 30 1 3.3% 96.7% 1.841 3.341

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 5 3.3% 96.7% 1.841 3.341

Totals 400 18 4.5% 95.6% 1.706 3.206

ORDER PLACEMENT Order Entry System

# Orders Defects DPU RTY Zlt Zst

 Automatic Entry 400 3 0.8% 99.3% 2.434 3.934

Manual Entry 400 8 2.0% 98.0% 2.058 3.558

Totals 800 11 1.4% 98.6% 2.207 3.707

ORDER PLACEMENT Automatic Entry System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 1 0.5% 99.5% 2.577 4.077

Telephone Ordering 200 2 1.0% 99.0% 2.328 3.828

Totals 400 3 0.8% 99.3% 2.434 3.934

ORDER PLACEMENT Manual Entry System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 2 2.0% 98.0% 2.058 3.558

Mail Orders 30 2 6.7% 93.6% 1.518 3.018

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 2 1.3% 98.7% 2.219 3.719

Totals 400 8 2.0% 98.0% 2.058 3.558

 

• For order placement system, for 400 orders, 16 defects came from Order 

 Acceptance and 3 from Order Entry to a total of 19. DPU=19/400=4.8%

• For manual systems the corresponding DPU = 26/400 = 6.5%

• Note that for both manual and automatic systems, order acceptance is incurring a higher number of defects than the order entry system

• Now the order acceptance and order entry systems can be further drilled down
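The roll-up logic can be sketched as a small aggregation: defects from acceptance and entry are added over the same 400 orders per channel. The dictionary layout and helper function are illustrative, not part of the original workbook:

```python
from math import exp
from statistics import NormalDist

# Lowest-level counts per channel: (orders, defects)
acceptance = {"automatic": (400, 16), "manual": (400, 18)}
entry      = {"automatic": (400, 3),  "manual": (400, 8)}

def rollup(channel):
    # Same orders pass through both stages, so defects add
    orders = acceptance[channel][0]
    defects = acceptance[channel][1] + entry[channel][1]
    dpu = defects / orders
    rty = exp(-dpu)
    z_lt = NormalDist().inv_cdf(rty)
    return defects, dpu, rty, z_lt

print(rollup("automatic"))   # 19 defects, DPU 4.8%, RTY ~95.4%
print(rollup("manual"))      # 26 defects, DPU 6.5%, RTY ~93.7%
```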


Order Acceptance System Scorecard

ORDER PLACEMENT Order Acceptance System

# Orders Defects DPU RTY Zlt Zst

 Automatic Acceptance 400 16 4.0% 96.1% 1.760 3.260

Manual Acceptance 400 18 4.5% 95.6% 1.706 3.206

Totals 800 34 4.3% 95.8% 1.732 3.232

ORDER PLACEMENT Automatic Acceptance System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 6 3.0% 97.0% 1.887 3.387

Telephone Ordering 200 10 5.0% 95.1% 1.657 3.157

Totals 400 16 4.0% 96.1% 1.760 3.260

ORDER PLACEMENT Manual Acceptance System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 10 10.0% 90.5% 1.310 2.810

Mail Orders 30 1 3.3% 96.7% 1.841 3.341

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 5 3.3% 96.7% 1.841 3.341

Totals 400 18 4.5% 95.6% 1.706 3.206

• Order acceptance system has two channels:

• Automatic acceptance

• Manual acceptance

• Automatic order placement is

further divided into web and

telephone ordering with their 

own defect levels

• Manual systems have their 

corresponding sub systems

and defect levels as shown

• Automatic and manual totals

are carried over to order 

acceptance system level.


Order Entry System Scorecard - Exercise

ORDER PLACEMENT Order Entry System

# Orders Defects DPU RTY Zlt Zst

 Automatic Entry 400 3 0.8% 99.3% 2.434 3.934

Manual Entry 400 8 2.0% 98.0% 2.058 3.558

Totals 800 11 1.4% 98.6% 2.207 3.707

ORDER PLACEMENT Automatic Entry System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 1 0.5% 99.5% 2.577 4.077

Telephone Ordering 200 2 1.0% 99.0% 2.328 3.828

Totals 400 3 0.8% 99.3% 2.434 3.934

ORDER PLACEMENT Manual Entry System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 2 2.0% 98.0% 2.058 3.558

Mail Orders 30 2 6.7% 93.6% 1.518 3.018

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 2 1.3% 98.7% 2.219 3.719

Totals 400 8 2.0% 98.0% 2.058 3.558

• Order entry system also has two

channels:

•  Automatic entry

• Manual entry

• Verify the number of orders and

defect levels for each

•  Are RTY and Z values correct?

• Which channel is more effective?

• How do you design the scorecard?

• Top down or Bottom up

• How do you build this scorecard?

• Top down or Bottom up


Full Order Placement System Scorecard

ORDER PLACEMENT SYSTEM

# Orders Defects DPU RTY Zlt Zst

Automatic Systems 400 19 4.8% 95.4% 1.681 3.181

Manual Systems 400 26 6.5% 93.7% 1.531 3.031

Totals 800 45 5.6% 94.5% 1.601 3.101

ORDER PLACEMENT Order Acceptance System

# Orders Defects DPU RTY Zlt Zst

 Automatic Acceptance 400 16 4.0% 96.1% 1.760 3.260

Manual Acceptance 400 18 4.5% 95.6% 1.706 3.206

Totals 800 34 4.3% 95.8% 1.732 3.232

ORDER PLACEMENT Automatic Acceptance System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 6 3.0% 97.0% 1.887 3.387

Telephone Ordering 200 10 5.0% 95.1% 1.657 3.157

Totals 400 16 4.0% 96.1% 1.760 3.260

ORDER PLACEMENT Manual Acceptance System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 10 10.0% 90.5% 1.310 2.810

Mail Orders 30 1 3.3% 96.7% 1.841 3.341

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 5 3.3% 96.7% 1.841 3.341

Totals 400 18 4.5% 95.6% 1.706 3.206

ORDER PLACEMENT Order Entry System

# Orders Defects DPU RTY Zlt Zst

 Automatic Entry 400 3 0.8% 99.3% 2.434 3.934

Manual Entry 400 8 2.0% 98.0% 2.058 3.558

Totals 800 11 1.4% 98.6% 2.207 3.707

ORDER PLACEMENT Automatic Entry System

# Orders Defects DPU RTY Zlt Zst

Web Ordering 200 1 0.5% 99.5% 2.577 4.077

Telephone Ordering 200 2 1.0% 99.0% 2.328 3.828

Totals 400 3 0.8% 99.3% 2.434 3.934

ORDER PLACEMENT Manual Entry System

# Orders Defects DPU RTY Zlt Zst

Tel (Agent) Orders 100 2 2.0% 98.0% 2.058 3.558

Mail Orders 30 2 6.7% 93.6% 1.518 3.018

Fax Orders 120 2 1.7% 98.3% 2.131 3.631

E-mail Orders 150 2 1.3% 98.7% 2.219 3.719

Totals 400 8 2.0% 98.0% 2.058 3.558


Interpreting Support System Scorecard - Exercise

Take a few minutes to interpret the scorecard:

• Is current quality level of support systems adequate?

• Have we included all key support systems and subsystems to the

sufficient detail?

• What are the critical support systems that determine the present

score?

• Since the number of support systems increases complexity and the opportunities for defects, can we reduce the number of support systems? In other words, can we outsource?

• Who delivers best quality services? Who needs improvement?

• What are the design tradeoffs to improve the overall scores?

• Has an MSA been performed on key system metrics?

• What are the tacit assumptions here? Are they valid?


Support System Scorecard Drivers

• A Pareto chart of the DPUs for the major support systems is shown below

• Which support system seems to be a major driver?

•  Are all support systems behaving more or less the same?

• Create a chart showing effectiveness of manual and automatic systems

[Pareto chart: Manual-approach DPUs by channel, Order Acceptance vs. Order Entry vs. Average - Tel (Agent) Orders 10.0% / 2.0%, Mail Orders 3.3% / 6.7%, E-mail Orders 3.3% / 1.3%, Fax Orders 1.7% / 1.7%]


Support System Scorecard - Project Exercise

• Identify support systems for your project

• Develop a top level system scorecard for one of the support systems.

• Identify its subsystems and its components


SIPOC and DFSS Scorecards

Supplier Input Process Output Client

Performance Scores: Client Perspective

Process Scores: Company Perspective

Support System Scores: Systems Perspective

Top Level Scores: Overall Multi-Functional Design Perspective


Top Level Design Summary Scorecard

• Summarizes performance scores from all available scorecards:

Design performance, process, and support system scorecards

Top Level Business Process Scorecard

                                  Performance       Process            Support Systems
Sigma Service Processes           DPM      RTY      DPM       RTY      DPU       RTY
Delivery Cycle Time               45500    95.4%
Order Defects                     240035   76.0%
Order Efficiency                  221061   77.9%
Processing Cost per Order         199280   80.1%
Order Placement                                     745123    25.5%    0.05625   94.5%
Order Management                                    212047    78.8%
Order Fulfillment                                   532087    46.8%
Order Placement:
  Automatic Acceptance                                                 0.040     96.1%
  Manual Acceptance                                                    0.045     95.6%
  Automatic Order Entry                                                0.008     99.3%
  Manual Order Entry                                                   0.020     98.0%
Order Management                                                       TBD       TBD
Order Fulfillment                                                      TBD       TBD
Other Support Systems                                                  TBD       TBD
Totals                            0.5476   45.2%    906028    9.4%     0.0563    94.5%
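The Totals row can be reproduced by assuming each column total is one minus the product of that column's RTYs (with DPM expressed per million). A sketch under that assumption:

```python
from math import prod

# Column RTYs from the top-level scorecard
performance_rty = [0.954, 0.760, 0.779, 0.801]   # the four performance CTSs
process_rty     = [0.255, 0.788, 0.468]          # placement, management, fulfillment

perf_total = prod(performance_rty)   # overall performance RTY, ~45.2%
perf_dpu   = 1 - perf_total          # ~0.5476

proc_total = prod(process_rty)       # overall process RTY, ~9.4%
proc_dpm   = (1 - proc_total) * 1e6  # ~906,000 DPM

print(f"Performance: RTY={perf_total:.1%}  DPU={perf_dpu:.4f}")
print(f"Process:     RTY={proc_total:.1%}  DPM={proc_dpm:,.0f}")
```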


Why Use Top Level Design Scorecard?

• Provides a high level view of all critical aspects of design and delivery

• Summarizes DPM and RTY scores from all scorecards

• Pinpoints the problems and improvement areas in a nutshell

• Provides cumulative scores that indicate the existing performance level

• Highlights the inter-relationships between performance, production processes and supporting systems

• Facilitates data-driven communication among the multi-functional

DFSS team to evaluate, prioritize and choose design options

• Enables what-if scenarios before making major changes

• The purpose of the scorecard is not to grade DFSS team members


How to Use Top Level Scorecard

• Scorecards are just models -- not reality -- use with caution

• Include all critical parameters, support systems and processes

• Maintain integrity by minimizing redundant calculations

• Review scorecards early and often

• Scorecards, not scapegoats

• Use scorecards to arrive at consensus optimal design solutions

• Do not just depend on the scores - they are just indicators

• Supplement scores with sound design, production and business judgement

• Consider cost (COPQ) in all design decisions


Interpreting Top Level Scorecard - Exercise

• What did you learn from the top level scorecard and its scores?

• Is the scorecard complete? Explain your answer.

• If you want to improve design, what steps will you take?

• What discrepancies did you find in the example scorecard?

• What actions will you take to address these discrepancies?


Interpreting Top Level Scorecard - Possible Answers

• What do you learn from the top level scorecard and its scores?

• The top level scorecard is merely a summary of all scorecards

• It shows that process performance is the lowest

• Is the scorecard complete? Explain your answer.

• No. We need to include additional support systems scores

• We may also need additional processes and CTSs

• If you want to improve design, what steps will you take?

• I will start with process design and include more system scores

• What discrepancies did you find in the example scorecard?

• Why are the order placement process scores so low while its support service scores are high? Where is the disconnect?

• Why don’t low process scores affect performance scores? 

• What actions will you take to address these discrepancies?

• Check assumptions and see if we are using the right parameters.


Top Level Scorecard Project Exercise

Develop a top level scorecard for your project

• If you did the other exercises, it is a piece of cake!

• Summarize DPM and RTY scores from all other scorecards

• Are there any support system and process alignments? If so,

calculate cumulative and average scores when appropriate

• How will you incorporate cost figures in your scorecard?


End of module checklist

• Develop performance, process and support system scorecards

• Integrate them into top level scorecard

• Interpret scorecard results using statistical approach

• Evaluate design using scorecard sigma scores

• Identify opportunities for design improvements

• Clarify underlying assumptions of scorecard


 Appendix


Long Term Versus Short Term Sigma

[Figure: standard normal curve against the USL, shifted by 1.5 sigma]

• ZLT = ZST - 1.5

• ZLT estimates PPM or DPMO over the long term

• ZST is used to rate performance based on benchmarking


PPM Conversion Chart

[Chart: PPM (log scale, 0.01 to 1,000,000) versus the Z or "Sigma" scale (1.5 to 6.0) - curve A: long term (shifted 1.5), curve B: short term (centered)]

• 6 Sigma generates only 3.4 defects per million opportunities over long-term examination

• ZLT + 1.5 = ZST

• Sigma level is determined by defects


Long Term Versus Short Term Sigma

[Figure: standard normal curve with the probability of a defect over the long term in the tail beyond the USL; example: DPO = 0.1003]

• ZLT = ZST - 1.5

Sigma - DPMO Conversion Chart


Columns (repeated in four groups below): Benchmark Sigma Value | Long Term Sigma | DPMO

0.00 -1.5 933,192.8 1.55 0.05 480,061.1 3.05 1.55 60,570.8 4.55 3.05 1,144.3 

0.05 -1.45 926,470.7 1.60 0.1 460,172.1 3.10 1.6 54,799.3 4.60 3.1 967.7 

0.10 -1.4 919,243.3 1.65 0.15 440,382.3 3.15 1.65 49,471.5 4.65 3.15 816.4 

0.15 -1.35 911,491.9 1.70 0.2 420,740.3 3.20 1.7 44,565.4 4.70 3.2 687.2 

0.20 -1.3 903,199.5 1.75 0.25 401,293.7 3.25 1.75 40,059.1 4.75 3.25 577.1 

0.25 -1.25 894,350.2 1.80 0.3 382,088.6 3.30 1.8 35,930.3 4.80 3.3 483.5 

0.30 -1.2 884,930.3 1.85 0.35 363,169.4 3.35 1.85 32,156.7 4.85 3.35 404.1 

0.35 -1.15 874,928.0 1.90 0.4 344,578.3 3.40 1.9 28,716.5 4.90 3.4 337.0 

0.40 -1.1 864,333.9 1.95 0.45 326,355.2 3.45 1.95 25,588.0 4.95 3.45 280.3 

0.45 -1.05 853,140.9 2.00 0.5 308,537.5 3.50 2 22,750.1 5.00 3.5 232.7 

0.50 -1 841,344.7 2.05 0.55 291,159.7 3.55 2.05 20,182.1 5.05 3.55 192.7 

0.55 -0.95 828,943.9 2.10 0.6 274,253.1 3.60 2.1 17,864.4 5.10 3.6 159.1 

0.60 -0.9 815,939.9 2.15 0.65 257,846.0 3.65 2.15 15,777.6 5.15 3.65 131.2 

0.65 -0.85 802,337.5 2.20 0.7 241,963.6 3.70 2.2 13,903.4 5.20 3.7 107.8 

0.70 -0.8 788,144.7 2.25 0.75 226,627.3 3.75 2.25 12,224.4 5.25 3.75 88.4 

0.75 -0.75 773,372.7 2.30 0.8 211,855.3 3.80 2.3 10,724.1 5.30 3.8 72.4 

0.80 -0.7 758,036.4 2.35 0.85 197,662.5 3.85 2.35 9,386.7 5.35 3.85 59.1 

0.85 -0.65 742,154.0 2.40 0.9 184,060.1 3.90 2.4 8,197.5 5.40 3.9 48.1 

0.90 -0.6 725,746.9 2.45 0.95 171,056.1 3.95 2.45 7,142.8 5.45 3.95 39.1 

0.95 -0.55 708,840.3 2.50 1 158,655.3 4.00 2.5 6,209.7 5.50 4 31.7 

1.00 -0.5 691,462.5 2.55 1.05 146,859.1 4.05 2.55 5,386.2 5.55 4.05 25.6 

1.05 -0.45 673,644.8 2.60 1.1 135,666.1 4.10 2.6 4,661.2 5.60 4.1 20.7 

1.10 -0.4 655,421.7 2.65 1.15 125,072.0 4.15 2.65 4,024.6 5.65 4.15 16.6 

1.15 -0.35 636,830.6 2.70 1.2 115,069.7 4.20 2.7 3,467.0 5.70 4.2 13.4 

1.20 -0.3 617,911.4 2.75 1.25 105,649.8 4.25 2.75 2,979.8 5.75 4.25 10.7 

1.25 -0.25 598,706.3 2.80 1.3 96,800.5 4.30 2.8 2,555.2 5.80 4.3 8.5 

1.30 -0.2 579,259.7 2.85 1.35 88,508.1 4.35 2.85 2,186.0 5.85 4.35 6.8 

1.35 -0.15 559,617.7 2.90 1.4 80,756.7 4.40 2.9 1,865.9 5.90 4.4 5.4 

1.40 -0.1 539,827.9 2.95 1.45 73,529.3 4.45 2.95 1,588.9 5.95 4.45 4.3 

1.45 -0.05 519,938.9 3.00 1.5 66,807.2 4.50 3 1,350.0 6.00 4.5 3.4 

1.50 0 500,000.0 

Sigma Value = short-term benchmark sigma; Long Term Sigma = -NORMSINV(DPMO/1000000), e.g. -NORMSINV(500000/1000000) = 0

Long Term Sigma = Short Term - 1.5 (the 1.5-sigma shift); Sigma Value = Long Term + 1.5, e.g. 0 + 1.5 = 1.5

DPMO = (1 - NORMSDIST(ZLT)) * 1000000, e.g. (1 - NORMSDIST(0)) * 1000000 = 500,000.0

Example: Sigma = -NORMSINV(559,617.7/1000000) + 1.5 = -0.15 + 1.5 = 1.35

Zst = Zlt + 1.5
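The spreadsheet formulas above map directly onto Python's statistics.NormalDist; a sketch of the conversions (the function names are illustrative):

```python
from statistics import NormalDist

nd = NormalDist()
SHIFT = 1.5  # conventional long-term/short-term sigma shift

def long_term_sigma(dpmo):
    # Excel equivalent: -NORMSINV(DPMO/1000000)
    return -nd.inv_cdf(dpmo / 1_000_000)

def benchmark_sigma(dpmo):
    # Short-term "sigma value" = long-term Z plus the 1.5 shift
    return long_term_sigma(dpmo) + SHIFT

def dpmo_from_zlt(z_lt):
    # Excel equivalent: (1 - NORMSDIST(Zlt)) * 1000000
    return (1 - nd.cdf(z_lt)) * 1_000_000

print(benchmark_sigma(500_000))    # 1.5
print(benchmark_sigma(559_617.7))  # ~1.35
print(dpmo_from_zlt(4.5))          # ~3.4, the "6 sigma" long-term defect rate
```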

Standard Normal Table


Standard Normal Probabilities: the table gives the area P under the standard normal probability curve below the respective z-statistic

z 0 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09

-4.0 0.00003 0.00003 0.00003 0.00003 0.00003 0.00003 0.00002 0.00002 0.00002 0.00002

-3.9 0.00005 0.00005 0.00004 0.00004 0.00004 0.00004 0.00004 0.00004 0.00003 0.00003

-3.8 0.00007 0.00007 0.00007 0.00006 0.00006 0.00006 0.00006 0.00005 0.00005 0.00005

-3.7 0.00011 0.00010 0.00010 0.00010 0.00009 0.00009 0.00008 0.00008 0.00008 0.00008

-3.6 0.00016 0.00015 0.00015 0.00014 0.00014 0.00013 0.00013 0.00012 0.00012 0.00011

-3.5 0.00023 0.00022 0.00022 0.00021 0.00020 0.00019 0.00019 0.00018 0.00017 0.00017

-3.4 0.00034 0.00032 0.00031 0.00030 0.00029 0.00028 0.00027 0.00026 0.00025 0.00024

-3.3 0.00048 0.00047 0.00045 0.00043 0.00042 0.00040 0.00039 0.00038 0.00036 0.00035

-3.2 0.00069 0.00066 0.00064 0.00062 0.00060 0.00058 0.00056 0.00054 0.00052 0.00050

-3.1 0.00097 0.00094 0.00090 0.00087 0.00084 0.00082 0.00079 0.00076 0.00074 0.00071

-3.0 0.00135 0.00131 0.00126 0.00122 0.00118 0.00114 0.00111 0.00107 0.00103 0.00100

-2.9 0.00187 0.00181 0.00175 0.00169 0.00164 0.00159 0.00154 0.00149 0.00144 0.00139

-2.8 0.00256 0.00248 0.00240 0.00233 0.00226 0.00219 0.00212 0.00205 0.00199 0.00193

-2.7 0.00347 0.00336 0.00326 0.00317 0.00307 0.00298 0.00289 0.00280 0.00272 0.00264

-2.6 0.00466 0.00453 0.00440 0.00427 0.00415 0.00402 0.00391 0.00379 0.00368 0.00357

-2.5 0.00621 0.00604 0.00587 0.00570 0.00554 0.00539 0.00523 0.00508 0.00494 0.00480-2.4 0.00820 0.00798 0.00776 0.00755 0.00734 0.00714 0.00695 0.00676 0.00657 0.00639

-2.3 0.01072 0.01044 0.01017 0.00990 0.00964 0.00939 0.00914 0.00889 0.00866 0.00842

-2.2 0.01390 0.01355 0.01321 0.01287 0.01255 0.01222 0.01191 0.01160 0.01130 0.01101

-2.1 0.01786 0.01743 0.01700 0.01659 0.01618 0.01578 0.01539 0.01500 0.01463 0.01426

-2.0 0.02275 0.02222 0.02169 0.02118 0.02067 0.02018 0.01970 0.01923 0.01876 0.01831

-1.9 0.02872 0.02807 0.02743 0.02680 0.02619 0.02559 0.02500 0.02442 0.02385 0.02330

-1.8 0.03593 0.03515 0.03438 0.03362 0.03288 0.03216 0.03144 0.03074 0.03005 0.02938

-1.7 0.04456 0.04363 0.04272 0.04181 0.04093 0.04006 0.03920 0.03836 0.03754 0.03673

-1.6 0.05480 0.05370 0.05262 0.05155 0.05050 0.04947 0.04846 0.04746 0.04648 0.04551

-1.5 0.06681 0.06552 0.06425 0.06301 0.06178 0.06057 0.05938 0.05821 0.05705 0.05592

-1.4 0.08076 0.07927 0.07780 0.07636 0.07493 0.07353 0.07214 0.07078 0.06944 0.06811

-1.3 0.09680 0.09510 0.09342 0.09176 0.09012 0.08851 0.08691 0.08534 0.08379 0.08226

-1.2 0.11507 0.11314 0.11123 0.10935 0.10749 0.10565 0.10383 0.10204 0.10027 0.09852-1.1 0.13566 0.13350 0.13136 0.12924 0.12714 0.12507 0.12302 0.12100 0.11900 0.11702

-1.0 0.15865 0.15625 0.15386 0.15150 0.14917 0.14686 0.14457 0.14231 0.14007 0.13786

-0.9 0.18406 0.18141 0.17878 0.17618 0.17361 0.17105 0.16853 0.16602 0.16354 0.16109

-0.8 0.21185 0.20897 0.20611 0.20327 0.20045 0.19766 0.19489 0.19215 0.18943 0.18673

-0.7 0.24196 0.23885 0.23576 0.23269 0.22965 0.22663 0.22363 0.22065 0.21769 0.21476

-0.6 0.27425 0.27093 0.26763 0.26434 0.26108 0.25784 0.25462 0.25143 0.24825 0.24509

-0.5 0.30853 0.30502 0.30153 0.29805 0.29460 0.29116 0.28774 0.28434 0.28095 0.27759

-0.4 0.34457 0.34090 0.33724 0.33359 0.32997 0.32635 0.32276 0.31917 0.31561 0.31206

-0.3 0.38209 0.37828 0.37448 0.37070 0.36692 0.36317 0.35942 0.35569 0.35197 0.34826

-0.2 0.42074 0.41683 0.41293 0.40904 0.40516 0.40129 0.39743 0.39358 0.38974 0.38590

-0.1 0.46017 0.45620 0.45224 0.44828 0.44433 0.44038 0.43644 0.43250 0.42857 0.42465

0.0 0.50000 0.49601 0.49202 0.48803 0.48404 0.48006 0.47607 0.47209 0.46811 0.46414

Trademarks and Service Marks


Six Sigma is a federally registered trademark of Motorola, Inc.

Breakthrough Strategy is a federally registered trademark of Six Sigma Academy.

VISION. FOR A MORE PERFECT WORLD is a federally registered trademark of Six Sigma Academy.

ESSENTEQ is a trademark of Six Sigma Academy.

FASTART is a trademark of Six Sigma Academy.

Breakthrough Design is a trademark of Six Sigma Academy.

Breakthrough Lean is a trademark of Six Sigma Academy.

Design with the Power of Six Sigma is a trademark of Six Sigma Academy.

Legal Lean is a trademark of Six Sigma Academy.

SSA Navigator is a trademark of Six Sigma Academy.

SigmaCALC is a trademark of Six Sigma Academy.

SigmaFlow is a trademark of Compass Partners, Inc.

SigmaTRAC is a trademark of DuPont.

MINITAB is a trademark of Minitab, Inc.