
Unit 4
Statistics, Detection Limits and Uncertainty
Experts Teaching from Practical Experience

© Kinectrics Inc. 2012

Unit 4 – Topics
Statistical Analysis
Detection Limits
Decision thresholds & detection levels
Instrument Detection Limits vs. Method Detection Limits

Unit 4 – Topics

Uncertainty

Requirements

Guide to Uncertainty in Measurement (GUM)

Uncertainty budgets

Sampling uncertainty


Unit 4 – Learning Objectives
Describe the responsibilities of the user of the standards with regard to Detection Limits
Identify the guidance on analysis of samples and reporting of results
Describe the changes in the guidance on estimating and reporting uncertainties

Unit 4a

Statistics

Experts Teaching from Practical Experience


Interpretation of Survey Results
MARSSIM (Chapter 8) recommends the following sequence of steps:
1. Review the Data Quality Objectives (DQOs) and Sampling Design
2. Conduct a Preliminary Data Review
3. Select the Tests
4. Verify the Assumptions of the Tests
5. Draw Conclusions from the Data
More detail is given in EPA QA/G-9S

Review the DQOs and Sampling Design
Review the objectives of the study
Translate the objectives into statistical hypotheses
Translate the objectives into limits on Type I & Type II errors
Review the sampling design & note any special features or potential problems

Conduct a Preliminary Data Review
Review Quality Assurance reports
• Look for problems or anomalies
Calculate basic statistical quantities
• Calculate percentiles, measures of central tendency, dispersion and, if the data involve two variables, the correlation coefficient
Graph the data
• Select graphical representations that illuminate the structure of the data

Select the Statistical Methods
Select the statistical method
• Follow the “Decision Tree” from EPA QA/G-9S
Identify assumptions underlying the test
• List the key underlying assumptions, such as distributional form, dispersion, independence, etc.
• Note any sensitive assumptions where relatively small deviations could jeopardize the validity of the test

Verify the Assumptions
Determine approach for verifying assumptions
• Review (or develop) a statistical model for the data
• Select the methods for verifying the assumptions
Perform tests of assumptions
• Adjust for distributional assumptions (if warranted)
• Perform the calculations required for the tests
Determine corrective actions

Verify the Assumptions
Determine corrective actions (if required)
• Determine if a data transformation will correct the problem
• If data are missing, explore collecting more data or using theoretical justification
• Consider robust procedures or nonparametric hypothesis tests
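As one concrete example of a nonparametric alternative, the two-sided sign test needs no distributional assumption beyond independence of the paired differences. A minimal sketch (the function name is ours, not from the standards):

```python
from math import comb

def sign_test_p(diffs):
    """Two-sided exact sign test for H0: the median difference is zero.

    Zero differences are discarded; the p-value comes from the exact
    Binomial(n, 0.5) distribution, so no normality assumption is needed.
    """
    nonzero = [d for d in diffs if d != 0]
    n = len(nonzero)
    k = min(sum(d > 0 for d in nonzero), sum(d < 0 for d in nonzero))
    # P(X <= k) under Binomial(n, 0.5), doubled for a two-sided test
    p_one_sided = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * p_one_sided)

# Eight paired differences, all positive: strong evidence against H0
print(sign_test_p([0.4, 1.1, 0.2, 0.9, 0.3, 0.6, 1.4, 0.8]))  # → 0.0078125
```

The trade-off is the usual one: the sign test sacrifices power relative to a parametric test when the distributional assumption actually holds.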

Draw Conclusions from the Data
Perform the statistical procedure
• Perform & document the statistical tests
• Identify outliers and recalculate if necessary
Draw study conclusions
• If the null hypothesis is rejected, draw conclusions & document
• If the null hypothesis is not rejected, verify limits on decision errors, then draw conclusions & document
• Interpret the results

Draw Conclusions from the Data
Evaluate performance of the sampling design
• Evaluate the statistical power of the design over the full range of parameter values

Unit 4b

Detection Limits

Experts Teaching from Practical Experience


Definitions
Both N288.4-10 & N288.5-11 caution that the terminology and definitions used for these concepts (non-detect level & detection limit) are not always consistent. It is the responsibility of the user to understand, document, and justify the detection limit reported by any laboratory engaged to perform analyses.

Definitions

Organization or Discipline | LC | LD
CSA N288.4/5 | Non-detect Level (LC) | Detection Limit (LD)
Health Physics (ANSI/HPS N13.30-1996) | Critical Level (CL) | Lower Limit of Detection (LLD)
Occupational Hygiene (AIHA, IOHA) | Limit of Detection (LOD) | Limit of Quantification (LOQ)
Environmental Analytical Chemistry Cmte. of ACS | Limit of Detection (LOD) | Limit of Quantification (LOQ)
IUPAC | Detection Decision (LC) | Minimum Detectable Value or Detection Limit (LD)
ISO 11929:2010 | Decision Threshold (y*) | Detection Limit (y#)

Definitions
Non-detect level: the level below which quantitative results are not obtained from the measurement system or analysis method selected.
• The non-detect level is the smallest value of the measurand for which the probability of a wrong conclusion that the measurand is present when it actually is not present (an ‘error of the first kind’ or ‘false positive error’) does not exceed a specified probability, α.
Excerpt from Clause 3.1 of CSA N288.4-10 and N288.5-11

Definitions
Detection limit: the level (relative to background) above which an effect can confidently be measured.
• The detection limit is the smallest value of the measurand for which the probability of a wrong conclusion that the measurand is not present when it actually is present (an ‘error of the second kind’ or ‘false negative error’) does not exceed a specified probability, β.
Excerpt from Clause 3.1 of CSA N288.4-10 and N288.5-11

Definitions
Non-detect Level (LC) for Type I Error (α)
Detection Limit (LD) for Type II Error (β)
From IUPAC Recommendations 1995

Definitions
CSA N288.4-10, Annex D, Table D.1: formulae for the non-detect (critical) level and detection limit
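Table D.1 itself is not reproduced here, but the widely cited Currie formulas, on which such tables are typically based, illustrate the idea for a paired blank with α = β = 0.05. Treat the exact expressions in N288.4 Annex D as authoritative; this is a sketch:

```python
import math

def currie_limits(blank_counts, k=1.645):
    """Currie (1968) critical level and detection limit, in counts,
    for a blank counted over the same time as the sample
    (paired-blank case: sigma0 = sqrt(2*B)).
    k = 1.645 corresponds to alpha = beta = 0.05."""
    lc = k * math.sqrt(2 * blank_counts)  # non-detect (critical) level
    ld = k ** 2 + 2 * lc                  # detection limit ≈ 2.71 + 4.65*sqrt(B)
    return lc, ld

lc, ld = currie_limits(100)
print(lc, ld)  # ≈ 23.3 counts and ≈ 49.2 counts
```

A net result above lc triggers a "detected" decision (Type I error control); ld is the true signal level that such a decision rule will detect with probability 1 − β (Type II error control).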

CSA N288.5-11
The level below which quantitative results are not obtained from the measurement system or analysis method selected is called the non-detect level.
• This level shall be defined and its derivation should be documented.
• Measurements below this level are often reported as being “less than” some value.
Paraphrased from Clause 8.1.10 of CSA N288.5-11

CSA N288.4-10
The treatment of results that are less than the non-detect level for the measurement shall be defined and documented.
The results of measurements that are below the non-detect level may be reported as “not detected” or as being below a “less than” value.
The values of the non-detect level should be documented and reported.
Paraphrased from Clause 8.3 of CSA N288.4-10

CSA N288.5-11
Quantitative numerical values should be reported rather than “less than” some value or non-detect level.
• The requirement for numerical values needs to be considered within the context of laboratory capabilities and the proximity of the result to any applicable benchmark value.
Paraphrased from Clause 8.1.10 of CSA N288.5-11

Low Count Rates
The user is cautioned that these formulae might not be appropriate in all situations, particularly in low-level counting (see Strom & MacLellan, Health Physics, 81(1), 2001).

Instrument vs. Method Detection Limits
A note following Clause 8.1.9 of N288.5-11 warns that:
• …some laboratories might report an instrument detection level that is often much lower than the detection level of the method (which includes any required sample preparation). It is the responsibility of the user to understand, document, and justify the detection limit reported by any laboratory engaged to perform analyses.
Both standards generally assume (but do not always explicitly state) that the detection level is the Method Detection Level.

Unit 4b – Summary and Review
Detection Limits
Non-detect level & detection limit
Instrument Detection Limits vs. Method Detection Limits
Learning Objective
Describe the responsibilities of the user of the standards with regard to Detection Limits

Unit 4c

Uncertainty

Experts Teaching from Practical Experience


Uncertainty
Uncertainty: a quantitative expression of error that results from incomplete knowledge or information about a parameter or value.
• Statistical uncertainty: the component of uncertainty that arises from imprecision.
• Systematic uncertainty: the component of uncertainty that arises from biases.
Excerpt from Clause 3.1 of CSA N288.4-10
Measurement uncertainty (from JCGM 200:2012, the International Vocabulary of Metrology): a non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used.
• Type A uncertainty: determined by repeated measurement
• Type B uncertainty: determined in any other manner

Requirements
N288.4: The uncertainty associated with each measured or calculated value should be estimated. (Excerpt from Clause 9.3.3.1 of CSA N288.4-10)
N288.5: The uncertainty associated with results of effluent monitoring measurements and any dose estimates derived from them should be discussed in the report. (Excerpt from Clause 9.3.3.2.1 of CSA N288.5-11)
Both: The uncertainty should take into account both sampling and measurement errors. Sampling errors cannot always be quantified, but they shall be kept to a minimum by design of the monitoring program.

Introduction to Uncertainty
Two introductory texts on uncertainty are shown on the slide (book covers not reproduced here).

Guide to Uncertainty in Measurement
The Guide to Uncertainty in Measurement (GUM) was prepared by the Joint Committee for Guides in Metrology (JCGM)
• Originally published in 1995 (JCGM 100:1995)
• Minor revision published in 2008 (JCGM 100:2008)
  – Available for download from the BIPM website at http://www.bipm.org/en/publications/guides/gum.html
The 1995 version of the GUM was adopted by ISO & IEC as ISO/IEC Guide 98-3:2008
• The 2008 revision has not yet been adopted by ISO/IEC

Evaluation of Measurement Data
The JCGM Working Group on Uncertainty in Measurement is preparing additional guidance on the “Evaluation of Measurement Data”:
• Guide to the expression of uncertainty in measurement (JCGM 100:2008)
• Propagation of distributions using a Monte Carlo method – Supplement 1 to the GUM (JCGM 101:2008)
• Extension to any number of output quantities – Supplement 2 to the GUM (JCGM 102:2011)
• Introduction to the “Guide to the expression of uncertainty in measurement” and related documents (JCGM 104:2009)
• Other documents & supplements are in preparation
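Supplement 1's Monte Carlo approach can be sketched in a few lines: sample each input from its assigned distribution, push the samples through the measurement model, and take the mean and standard deviation of the outputs. The model and values below are illustrative, not from the GUM:

```python
import random
import statistics

def mc_propagate(model, input_draws, n=200_000, seed=12345):
    """Propagate input distributions through `model` in the spirit of
    JCGM 101.  `input_draws` is a list of functions rng -> sampled value."""
    rng = random.Random(seed)
    outputs = [model(*(draw(rng) for draw in input_draws)) for _ in range(n)]
    return statistics.fmean(outputs), statistics.stdev(outputs)

# Toy model: y = a / b with a ~ N(10, 0.1) and b ~ N(2, 0.02)
mean_y, u_y = mc_propagate(
    lambda a, b: a / b,
    [lambda r: r.gauss(10, 0.1), lambda r: r.gauss(2, 0.02)],
)
print(mean_y, u_y)  # mean ≈ 5.0, standard uncertainty ≈ 0.071
```

For this nearly linear case the result matches the quadrature rule (1% and 1% relative uncertainties combine to about 1.4%); the Monte Carlo method earns its keep when the model is strongly non-linear or the inputs are non-Gaussian.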

Further Guidance
UKAS M3003 (Ed. 2, Jan. 2007), “The Expression of Uncertainty and Confidence in Measurement”
• Provides an introduction to the subject, with examples of the application of the guidance given in the GUM
• http://www.ukas.com/library/Technical-Information/Pubs-Technical-Articles/Pubs-List/M3003.pdf
The Eurachem/CITAC Guide (Ramsey & Ellison, 2007), “Measurement of uncertainty arising from sampling: A guide to methods and approaches”
• Provides additional guidance on the estimation of uncertainty due to sampling and sample preparation
• http://www.eurachem.org/guides/pdf/UfS_2007.pdf

Uncertainty Budgets
The American Society for Quality (ASQ) defines an ‘uncertainty budget’ as:
a statement of measurement uncertainty, of the components of that measurement uncertainty, and of their calculation and combination
A National Research Council template for Uncertainty Budgets (an Excel spreadsheet) is available at:
www.nrc-nrc.gc.ca/obj/inms-ienm/doc/clas-clas/uncertainty_budget_template.xls

Uncertainty Budgets
Example: calculation of activity by liquid scintillation counting

A = [(C - B) × exp(-λt) × R × S] / (V × T × ε × Pγ)

where A = Activity; C = Gross Counts; B = Background Counts; exp(-λt) = Decay Factor; R = Random Summing Correction; S = Self-absorption Correction; V = Volume; T = Time; ε = Efficiency; Pγ = Emission Probability.

Uncertainty Budgets

Component | Value | Uncertainty | Distribution | Divisor | Relative Uncertainty
Gross Counts | 414 | 20.3 | Normal (1σ) | 1 | 4.91%
Background Counts | 22 | 4.7 | Normal (1σ) | 1 | 21.32%
Decay | 1 | 0.10% | Normal (1σ) | 1 | 0.10%
R | 1 | 0.30% | Normal (1σ) | 1 | 0.30%
S | 1 | 0.30% | Normal (1σ) | 1 | 0.30%
Volume (l) | 0.001 | 0.00002 | Rectangular (half-range) | √3 | 1.15%
Live Time (s) | 600 | 0.001 | Rectangular (half-range) | √3 | 0.00%
Efficiency | 0.34 | 3.50% | Normal (2σ) | 2 | 1.75%
Emission Probability | 1 | 0.50% | Normal (2σ) | 2 | 0.25%
ACTIVITY (Bq/l) | 1922 | 221 | Normal (2σ) | | 11.49%
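The budget can be reproduced numerically. Note that gross and background counts enter the model only through the net count (C − B), so their absolute uncertainties are combined first; the remaining relative standard uncertainties are then summed in quadrature. This is a sketch of the GUM quadrature rule applied to the example above:

```python
import math

# Inputs from the budget table
C, u_C = 414, 20.3             # gross counts, 1-sigma
B, u_B = 22, 4.7               # background counts, 1-sigma
V, T, eff, Pg = 0.001, 600, 0.34, 1.0
decay = R = S = 1.0

net = C - B
activity = net * decay * R * S / (V * T * eff * Pg)   # Bq/l

# Relative standard (1-sigma) uncertainty of each factor
rel = [
    math.hypot(u_C, u_B) / net,    # net counts: absolute uncertainties combine first
    0.0010,                        # decay
    0.0030,                        # R
    0.0030,                        # S
    (0.00002 / V) / math.sqrt(3),  # volume: rectangular half-range / sqrt(3)
    (0.001 / T) / math.sqrt(3),    # live time
    0.035 / 2,                     # efficiency, quoted at 2-sigma
    0.005 / 2,                     # emission probability, quoted at 2-sigma
]
u_rel = math.sqrt(sum(r * r for r in rel))   # combined relative, 1-sigma
U = 2 * u_rel * activity                     # expanded uncertainty (k = 2)

print(round(activity), round(U), f"{2 * u_rel:.2%}")
```

This yields about 1922 Bq/l ± 220 Bq/l (11.47% at 2σ); the small differences from the table's 221 Bq/l and 11.49% come from rounding in the tabulated relative uncertainties.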

Rectangular Distribution
P(x) = 1/(2a) for (b - a) < x < (b + a); P(x) = 0 otherwise
E(x) = mean = b
E(x²) = b² + a²/3
σ² = E(x²) - [E(x)]² = a²/3
σ = a/√3
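The σ = a/√3 result is easy to confirm by simulation: draw from a uniform distribution on (b − a, b + a) and compare the sample standard deviation with a/√3. A quick numerical check (values are illustrative):

```python
import math
import random
import statistics

def rect_sigma_check(b=5.0, a=0.3, n=200_000, seed=7):
    """Sample X ~ Uniform(b - a, b + a) and compare stdev(X) with a/sqrt(3)."""
    rng = random.Random(seed)
    xs = [rng.uniform(b - a, b + a) for _ in range(n)]
    return statistics.stdev(xs), a / math.sqrt(3)

sample_sd, theory_sd = rect_sigma_check()
print(sample_sd, theory_sd)  # both ≈ 0.173
```

This is why the budget table divides a rectangular half-range by √3 to obtain a standard (1σ-equivalent) uncertainty.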

Sampling Uncertainty
The standards do not address sampling uncertainty in detail.
Several reviews have concluded that, in general:
• Uncertainty due to sampling is greater than uncertainty due to analysis (often by an order of magnitude or more),
• Heterogeneity is the leading contributor to random sampling uncertainty, and
• Non-representative sampling is the leading contributor to systematic sampling uncertainty.


Sampling Uncertainty
Sampling uncertainty should be included, but it may be difficult (or impossible) to quantify.
Possible approaches include:
• Empirical methods
  – Duplicate samples
  – Multiple protocols
  – Collaborative trial
  – Sampling proficiency test
  – Reference sampling method/target
• Modeling
  – Cause & effect modeling
  – Gy’s sampling theory

Sampling Uncertainty

Method | Sampling Precision | Sampling Bias | Analytical Precision | Analytical Bias
Duplicate Samples (single sampler using a single sampling protocol) | Yes | No | Yes | No*
Multiple Protocols (single sampler using multiple sampling protocols) | Yes | Between protocols | Yes | No*
Collaborative Trial (multiple samplers using the same sampling protocol) | Yes | Between samplers | Yes | Yes
Sampling Proficiency Test (multiple samplers, each using a different sampling protocol) | Yes | Between samplers & protocols | Yes | Yes

* May be estimated by including Certified Reference Materials.

Model Uncertainty
“Model uncertainty” is an assessment of the degree of confidence that a mathematical model is a “correct” representation of the system.
Model uncertainty includes:
• An estimate of the uncertainty due to the structure of the model (structure uncertainty); and
• An estimate of the uncertainty in each of the parameters used in the risk assessment equations (parameter uncertainty).

Model Uncertainty
Estimates of the uncertainties in the N288 models are generally not available:
• Excerpt from CSA N288.1-08, Clause 4.2.9: “Conservatism is introduced into the current model by selecting conservative values for food, water, soil, and air intake rates for the representative person, typically at the 95th percentile level.”

Model Uncertainty
Estimates of the uncertainties in the N288 models are generally not available:
• IAEA TRS 472 (Handbook of Parameter Values for the Prediction of Radionuclide Transfer in Terrestrial and Freshwater Environments): “Estimations … of uncertainty about each such value were carried out by applying statistical analysis, where possible ... In some cases, the values were given without a statement of uncertainty or a range, because of the limited data available.”

Model Uncertainty
Estimates of the uncertainties in the N288 models are generally not available:
• NCRP Report 164 (Uncertainties in Internal Dose Assessment): The dose limits that are recommended by the International Commission on Radiological Protection (ICRP) for regulatory purposes are based on the use of values of dose per unit intake that are to be applied without any consideration of uncertainty.

Reporting Uncertainty
The number of significant figures quoted in a result shall not imply a degree of precision greater than that warranted by the sources of uncertainty.
• Uncertainty should be rounded to one significant figure.
• The least significant figure in the result should correspond to the significant figure in the uncertainty (e.g., 1900 ± 200 Bq/l).
• More significant figures should be carried through intermediate calculation steps than are reported in the final result.
Paraphrased from Clause 9.3 of CSA N288.4-10 and CSA N288.5-11
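The rounding rule above can be expressed as a small helper (the function name is ours): round the uncertainty to one significant figure, then round the result to the same decimal place.

```python
import math

def report_result(value, uncertainty):
    """Round the uncertainty to one significant figure and align the
    value to the same decimal place, per the reporting rule above."""
    place = math.floor(math.log10(abs(uncertainty)))  # decimal place of leading digit
    return round(value, -place), round(uncertainty, -place)

print(report_result(1922, 221))        # → (1900, 200), i.e. 1900 ± 200 Bq/l
print(report_result(0.04567, 0.0082))  # → (0.046, 0.008)
```

One caveat: Python's round uses banker's rounding at exact ties, whereas some metrology practice prefers always rounding the uncertainty up; adjust to local convention if that matters.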

Unit 4c – Summary and Review
Uncertainty
ISO Guide to Uncertainty in Measurement (GUM)
Uncertainty budgets
Sampling uncertainty
Reporting uncertainty
Learning Objectives
Identify the guidance on analysis of samples and reporting of results
Describe the changes in the guidance on estimating and reporting uncertainties