
What role should probabilistic sensitivity analysis play in SMC decision making?

Andrew Briggs, DPhil, University of Oxford

What probabilistic modelling offers

• Generates the appropriate (expected) estimate of cost-effectiveness

• Reflects the combined implications of parameter uncertainty for the outcome(s) of interest (cost-effectiveness)

• Can make probability statements about cost-effectiveness results – error probability under decision maker’s control

• Offers a means of calculating the value of collecting additional information

Role of probabilistic sensitivity analysis: Overview

• Data sources for parameter fitting
• Distributions for common model parameters
• Correlating parameters
• Presenting simulation results
• Using PSA for decision making
• Continuing role of traditional sensitivity analysis
• Microsimulation models

Data sources for parameter estimation

• Primary data
– Can 'fit' parameters using standard statistical methods
– Provides standard estimates of variance and correlation

• Secondary data
– With appropriate information reported, can still fit parameters
– Meta-analysis may be possible

• Expert opinion
– Usefulness of Delphi is limited (focus on consensus!)
– Variability across estimates
– Individual estimates of dispersion

Distributions for common parameters: Probability parameters

• Probabilities are constrained to the interval zero to one
• Probabilities must sum to one
• Probabilities are often estimated from proportions
– Data informing the estimation are binomially distributed
– Use the Beta distribution

• May estimate probabilities from rates
– e.g. from hazard rates in survival analysis
– Use the (multivariate) normal distribution on the log scale
– Must transform from rates to probabilities
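A minimal sketch of these two routes in Python, assuming hypothetical inputs (r events observed in n patients for the Beta route, and a log-normally distributed annual rate for the rate-to-probability route):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: r events observed in n patients (binomial data)
r, n = 12, 150

# Beta(r, n - r) is the standard choice for a probability estimated from a proportion
p_draws = rng.beta(r, n - r, size=10_000)

# If instead we have an annual event rate (hypothetical log-normal uncertainty),
# convert each sampled rate to a one-cycle transition probability: p = 1 - exp(-rate * t)
rate_draws = rng.lognormal(mean=np.log(0.08), sigma=0.2, size=10_000)
p_from_rate = 1 - np.exp(-rate_draws * 1.0)

print(p_draws.mean(), p_from_rate.mean())
```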

Distributions for common parameters: Cost parameters

• Costs are a mixture of resource counts and unit costs
• Could model counts individually as Poisson with a Gamma-distributed mean (rate parameter)
• Costs are constrained to be zero or positive
• Can use the Gamma distribution if we cannot rely on the Central Limit Theorem (i.e. if costs are skewed)
• A popular alternative is the log-normal, particularly when using regression models on log cost
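A sketch of a method-of-moments Gamma fit for a skewed cost parameter, using made-up summary statistics (mean and standard error are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical summary statistics for a skewed cost parameter
mean_cost, se_cost = 2300.0, 400.0

# Method of moments for the Gamma: mean = shape * scale, variance = shape * scale^2
shape = (mean_cost / se_cost) ** 2
scale = se_cost ** 2 / mean_cost

cost_draws = rng.gamma(shape, scale, size=10_000)
print(cost_draws.mean(), cost_draws.std())   # close to 2300 and 400
```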

Distributions for common parameters: Utility parameters

• Utilities are somewhat unusual, with one representing perfect health and zero representing death
• States worse than death are possible, so the constraints are negative infinity up to one
• If the estimate is far from zero, a pragmatic approach is to fit a Beta distribution
• If it is important to represent negative utilities, consider the transformation X = 1 - U (the utility decrement) and fit a Gamma or log-normal distribution to X
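A sketch of the decrement transformation, assuming a hypothetical utility estimate with a wide standard error (so that a Beta fit on U itself would be questionable):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical utility estimate: mean 0.73, SE 0.15
mean_u, se_u = 0.73, 0.15

# Work with the decrement X = 1 - U, which is constrained to be >= 0
mean_x, se_x = 1 - mean_u, se_u
shape = (mean_x / se_x) ** 2
scale = se_x ** 2 / mean_x

u_draws = 1 - rng.gamma(shape, scale, size=10_000)   # utilities; draws can fall below 0
print(u_draws.mean(), (u_draws < 0).mean())
```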

Distributions for common parameters: Relative risk parameters

• Relative risks are ratios!
• Can log-transform to make them additive
• Variances and confidence intervals are estimated on the log scale and then exponentiated
• This suggests the log-normal distribution

Example: Relative risk from a published meta-analysis

• Suppose a published meta-analysis quotes a relative risk of 0.86 with 95% CI (0.71 to 1.05)

• Log-transform these to give -0.15 (-0.35 to 0.05) on the log scale

• Calculate the SE on the log scale: (0.05 - (-0.35)) / (2 × 1.96) ≈ 0.10

• Generate a normally distributed random variable with mean –0.15 and SE 0.10

• Exponentiate the resulting variable
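These steps translate directly into code. A minimal sketch using the quoted figures:

```python
import numpy as np

rng = np.random.default_rng(2016)

rr, lo, hi = 0.86, 0.71, 1.05                        # published RR and 95% CI

log_rr = np.log(rr)                                  # approx -0.15
se_log_rr = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # approx 0.10

# Sample on the log scale, then exponentiate back to the RR scale
rr_draws = np.exp(rng.normal(log_rr, se_log_rr, size=10_000))

print(rr_draws.mean())   # slightly above 0.86: the log-normal mean exceeds its median
```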

Correlating parameters

• PSA has sometimes been criticised for treating parameters as independent

• In principle can correlate parameters if we have information on the covariance structure
– e.g. the covariance matrix from a regression

• Cholesky decomposition used for correlated normal distributions

• Correlations among other distributional forms not straightforward
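A sketch of the Cholesky approach for two normally distributed parameters, assuming an illustrative mean vector and covariance matrix (in practice these would come from, say, a regression output):

```python
import numpy as np

rng = np.random.default_rng(3)

mean = np.array([0.05, 1200.0])                # hypothetical parameter means
cov = np.array([[0.0004,     1.5],             # hypothetical covariance matrix
                [1.5,    40000.0]])

L = np.linalg.cholesky(cov)                    # lower-triangular factor, L @ L.T == cov

z = rng.standard_normal((10_000, 2))           # independent standard normal draws
draws = mean + z @ L.T                         # correlated parameter draws

print(np.corrcoef(draws.T))                    # correlation of about 0.375 by construction
```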

Variability and nonlinearity

Even if we are interested only in expected values we need to consider uncertainty when nonlinearities are involved:

E[g(x)] ≠ g(E[x])

• Uncertainty needed to calculate expectation of nonlinear parameters

• Uncertainty needed to calculate expectation of nonlinear models
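A small Monte Carlo illustration using the relative-risk parameter from the earlier example, where g is the (nonlinear) exponential back-transformation:

```python
import numpy as np

rng = np.random.default_rng(4)

log_rr, se = -0.15, 0.10
draws = rng.normal(log_rr, se, size=200_000)

g_of_mean = np.exp(log_rr)         # g(E[x]): about 0.861, the 'standard' point estimate
mean_of_g = np.exp(draws).mean()   # E[g(x)]: about 0.865, the correct expectation

print(g_of_mean, mean_of_g)
```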

Point estimates and variability

[Figure: distribution of the relative risk (x-axis: relative risk, 0.6 to 1.2), RR 0.86 (95% CI: 0.71 to 1.05), marking the standard point estimate and the expected value]

A model of Total Hip Replacement: interpreting simulation results

[Figure: Markov model diagram with states Primary THR, Successful Primary, Revision THR, Successful Revision, and Death; transitions governed by the operative mortality rates omrPTHR and omrRTHR, the age-specific mortality rate mr[age], and the revision risk RR[age, sex, time]; costs cPrimary, cRevision, cSuccess and utilities uSuccessP, uRevision, uSuccessR attached to the states]
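The state diagram maps onto a transition matrix. Below is a minimal cohort-model sketch with purely illustrative probabilities, costs, and utilities (it is not the published THR model); under PSA, every one of these numbers would be replaced by a draw from its fitted distribution on each simulation run.

```python
import numpy as np

# States: 0 = successful primary, 1 = revision, 2 = successful revision, 3 = dead
# Illustrative one-cycle transition probabilities (each row sums to 1)
P = np.array([
    [0.96, 0.02, 0.00, 0.02],
    [0.00, 0.00, 0.95, 0.05],
    [0.00, 0.02, 0.96, 0.02],
    [0.00, 0.00, 0.00, 1.00],
])
state_cost = np.array([0.0, 5000.0, 0.0, 0.0])    # illustrative cost of the revision state
state_util = np.array([0.85, 0.30, 0.75, 0.0])    # illustrative uSuccessP, uRevision, uSuccessR

cohort = np.array([1.0, 0.0, 0.0, 0.0])           # everyone starts after a primary THR
total_cost = total_qaly = 0.0
for cycle in range(40):                           # 40 annual cycles
    cohort = cohort @ P
    total_cost += cohort @ state_cost
    total_qaly += cohort @ state_util

print(total_cost, total_qaly)
```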

Example on the CE plane: Spectron versus Charnley hip prosthesis

[Figure: cost-effectiveness plane; x-axis QALYs gained (-0.05 to 0.40), y-axis additional cost (-£1,000 to £400), showing the simulated incremental cost and QALY pairs]

Corresponding CEAC: Spectron versus Charnley hip prosthesis

[Figure: cost-effectiveness acceptability curve; x-axis value of the ceiling ratio (£0 to £20,000), y-axis probability cost-effective (0 to 1)]
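A curve like this is computed directly from the PSA output. A minimal sketch, assuming hypothetical arrays of simulated incremental costs and QALYs (the real values would come from the probabilistic THR model):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical PSA output: incremental cost and incremental QALYs per simulation
d_cost = rng.normal(-200, 300, size=5_000)
d_qaly = rng.normal(0.12, 0.08, size=5_000)

ceilings = np.arange(0, 20_001, 500)          # lambda: value of the ceiling ratio
ceac = [(lam * d_qaly - d_cost > 0).mean() for lam in ceilings]

for lam, p in zip(ceilings[::8], ceac[::8]):
    print(f"lambda = {lam:>6}: P(cost-effective) = {p:.2f}")
```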

Multiple acceptability curves: why and how?

• Two reasons for employing multiple acceptability curves:
– Heterogeneity between patient groups
– Multiple treatment options

• These correspond to two situations in CEA:
– Independent programmes
– Mutually exclusive options

• Lead to two very different representations!

Multiple CEACs: handling heterogeneity, Spectron versus Charnley (Males)

[Figure: acceptability curves by age; x-axis value of the ceiling ratio (£0 to £20,000), y-axis probability cost-effective (0 to 1); separate curves for ages 40 & 50, 60, 70, 80, and 90]

Multiple CEACs: handling heterogeneity, Spectron versus Charnley (Females)

[Figure: acceptability curves by age; x-axis value of the ceiling ratio (£0 to £20,000), y-axis probability cost-effective (0 to 1); separate curves for ages 40, 50, 60, 70, 80, and 90]

Example: GERD management, baseline results

[Figure: strategy cost (y-axis, $600 to $1,200) against weeks free of GERD (x-axis, 38 to 48) for six strategies: A: Intermittent PPI, B: Maintenance PPI, C: Maintenance H2RA, D: Step-down maintenance PA, E: Step-down maintenance H2RA, F: Step-down maintenance PPI; efficiency-frontier segments labelled $10/GFW, $36/GFW, and $264/GFW]

Example: GERD management, uncertainty on the CE plane

[Figure: the same axes (strategy cost, $600 to $1,200, against weeks free of GERD, 38 to 48) with simulation clouds around each of strategies A to F]

Example: GERD management, multiple CEACs

[Figure: one acceptability curve per strategy A to F; x-axis ceiling ratio Rc (1 to 1,000, log scale), y-axis probability cost-effective (0 to 1)]
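With mutually exclusive options such as the six GERD strategies, there is one curve per strategy: at each ceiling ratio, the curve shows the proportion of simulations in which that strategy has the highest net benefit. A sketch with invented PSA output for three strategies (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
n_sim = 5_000

# Hypothetical PSA output: cost and effect per simulation for three strategies
costs   = np.column_stack([rng.normal(m, 60, n_sim) for m in (700, 900, 1100)])
effects = np.column_stack([rng.normal(m, 1.0, n_sim) for m in (42.0, 44.0, 45.0)])

for lam in (10, 36, 264):                  # ceiling ratios ($ per GERD-free week)
    nb = lam * effects - costs             # net monetary benefit per strategy
    best = nb.argmax(axis=1)               # strategy with the highest NB in each run
    probs = np.bincount(best, minlength=3) / n_sim
    print(lam, probs)
```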

Using probabilistic analysis for making decisions?

Link with standard statistical methods

1. Use standard inference (link with frequentist methods)
2. Use cost-effectiveness acceptability curves to allow the decision maker to select their own 'threshold' error probability (more Bayesian)
3. Use PSA to establish the value of collecting additional information to inform the decision (a fully Bayesian decision-theoretic approach)

Cost of uncertainty (value of information)

[Figure: net benefit probability density over net monetary benefit (-£30,000 to £30,000), with the implementation and non-implementation loss functions overlaid; opportunity loss shown on a secondary axis (£0 to £3,500,000,000)]
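The cost of uncertainty pictured above is quantified by the expected value of perfect information. A minimal sketch of the per-decision EVPI calculation, assuming a hypothetical matrix of simulated net monetary benefits for two options:

```python
import numpy as np

rng = np.random.default_rng(8)
n_sim = 10_000

# Hypothetical simulated net monetary benefit for two options (current vs new)
nb = np.column_stack([
    rng.normal(20_000, 6_000, n_sim),
    rng.normal(21_000, 6_000, n_sim),
])

# EVPI = E[max over options] - max over options of E[net benefit]
evpi_per_decision = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(evpi_per_decision)

# Population EVPI scales this by the (discounted) number of decisions affected,
# here an illustrative 100,000
print(evpi_per_decision * 100_000)
```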

Micro-simulation models and PSA

• Microsimulation is an ‘individual’ (rather than ‘cohort’) method of model evaluation

• Typically used to capture patient histories
• Calculation requires a large number of individual simulations
• PSA would require a second 'layer' of simulations (which increases computational time)
• Think carefully about whether a microsimulation is necessary
• If it is, buy a fast machine, or use an approximate solution
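A toy sketch of why the two layers are expensive: the outer loop draws parameters (the PSA layer), the inner loop simulates individual patients, so run time scales with the product of the two sample sizes. All numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(9)

N_OUTER, N_INNER, CYCLES = 100, 500, 20       # PSA draws x patients x cycles

results = []
for _ in range(N_OUTER):                      # outer layer: one PSA parameter draw
    p_event = rng.beta(12, 138)               # e.g. an annual event probability
    qalys = 0.0
    for _ in range(N_INNER):                  # inner layer: one simulated patient history
        for _ in range(CYCLES):
            if rng.random() < p_event:        # event ends this patient's history
                break
            qalys += 0.8                      # illustrative utility per event-free cycle
    results.append(qalys / N_INNER)           # mean QALYs for this parameter draw

print(np.mean(results), np.std(results))
```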

Is there any role for standard sensitivity analysis?

• Probabilistic sensitivity analysis is important for capturing parameter uncertainty

• Other forms of uncertainty relate to:
– Methodology
– Structural uncertainty
– Data sources
– Heterogeneity

• Standard sensitivity analysis retains an important role (in conjunction with PSA)

Critiquing a probabilistic CE model

• Are all parameters included in the PSA?
• Were standard distributions specified?
– No triangular/uniform distributions
• Was the appropriate expected value calculated?
• Was standard sensitivity analysis employed to handle non-sampling uncertainty?
• Was heterogeneity handled separately?
• Was the effect of individual parameters explored?

Summary: the role of PSA

PSA has an important role to play:
• Calculating the correct expected value
• Calculating the combined effect of uncertainty in all parameters
• Opening the debate about the appropriate error probability
• Required to calculate the value of information
• Continuing role for standard sensitivity analysis