
Page 1

Network meta-analysis in SAS
Danish Society of Biopharmaceutical Statistics, Elsinore, May 27, 2014

David A. Scott MA MSc
Senior Director, ICON Health Economics

Visiting Fellow, SHTAC, University of Southampton

Page 2

Network Meta-Analysis: Software

• WinBUGS / OpenBUGS / JAGS (DSU series)
• R, e.g. rmeta, netmeta, mvmeta packages
• Stata mvmeta
• SAS, e.g. proc glimmix, proc mcmc

Page 3

A brief history of NMA in SAS

• A range of procedures has been used to implement NMA in SAS
  – proc mixed, proc nlmixed, proc genmod, proc glimmix1-3
  – Frequentist techniques
  – Difficult to fit complex hierarchical models2

• MCMC techniques
  – proc genmod (using Easy Bayes) -> proc mcmc
  – SAS 9.2 (level 2M3), SAS 9.3 (SAS/STAT 12)

1 Glenny AM et al, Health Technology Assessment 2005; 9(26)
2 Jones B et al, Pharmaceutical Statistics 2011; 10:523-31
3 Piepho HP et al, Biometrics 2012; 68:1269-77

Pages 4-5: [no transcribed text]

Page 6

Potential barriers

• DSU series is WinBUGS-focused
• SAS not yet used in UK reimbursement submissions
• ERGs have limited experience of SAS
• Limited published code/articles
• Validation exercise

Page 7

Illustrative example 1 - binary data

Pages 8-9: [no transcribed text]

Page 10

Syntax: load data

data smoking;
input Study Trt R N narm;
datalines;
1 2 11 78 3   #Mothersill 1988
1 3 12 85 3   #Mothersill 1988
1 4 29 170 3  #Mothersill 1988
2 1 75 731 2  #Reid 1974
…
;
run;
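The fixed- and random-effects calls on the next slides use subject=Treatment with zero="No contact", so the data set also needs a character treatment label alongside the numeric Trt code. A minimal sketch, assuming the usual smoking-cessation coding (1 = no contact, 2 = self help, 3 = individual counselling, 4 = group counselling); adjust the mapping to the actual data:

data smoking;
  set smoking;
  length Treatment $22;
  /* assumed mapping from numeric code to the labels used on the slides */
  select (Trt);
    when (1) Treatment = "No contact";
    when (2) Treatment = "Self help";
    when (3) Treatment = "Individual counselling";
    when (4) Treatment = "Group counselling";
    otherwise;
  end;
run;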

Page 11

Syntax: fixed effects

proc mcmc data=smoking nmc=20000 seed=246810;
  random Studyeffect ~ general(0) subject=Study init=(0);
  random Treat ~ general(0) subject=Treatment init=(0) zero="No contact" monitor=(Treat);
  mu = Studyeffect + Treat;
  P = 1 - (1/(1+exp(mu)));
  model R ~ binomial(n=N, p=P);
run;
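The monitored Treat effects are log-odds ratios versus the zero= reference ("No contact"). A minimal post-processing sketch, assuming outpost=fe_post is added to the PROC MCMC statement above to save the posterior draws; the column name Treat_Self_help is an assumption, so list the actual variable names with PROC CONTENTS first:

/* list the variables PROC MCMC wrote to the posterior data set */
proc contents data=fe_post varnum;
run;

/* exponentiate each draw to get an odds ratio vs "No contact" (assumed column name) */
data fe_or;
  set fe_post;
  or_self_help = exp(Treat_Self_help);
run;

proc means data=fe_or mean std p5 p95;
  var or_self_help;
run;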

Page 12

Syntax: random effects

proc mcmc data=smoking nbi=20000 nmc=200000 thin=10 seed=246810 monitor=(mysd) dic;
  random Studyeffect ~ normal(0, var=10000) subject=Study init=(0);
  random Treat ~ normal(0, var=10000) subject=Treatment init=(0) zero="No contact" monitor=(Treat);
  parms mysd 0.2;
  prior mysd ~ uniform(0,1);
  random RE ~ normal(0, sd=mysd/sqrt(2)) subject=_OBS_ init=(0);
  mu = Studyeffect + Treat + RE;
  P = 1 - (1/(1+exp(mu)));
  model R ~ binomial(n=N, p=P);
run;
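The sd=mysd/sqrt(2) scaling gives each arm an independent deviation with variance mysd²/2, so the contrast between any two arms of a trial has between-trial standard deviation mysd; the monitored mysd is therefore the heterogeneity SD on the treatment-effect scale.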

Page 13

Diagnostics

• Trace
• Density
• Autocorrelation
  – thin= option
• DIC (relative model fit)
  – dic option
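As a sketch of how these plots can be requested directly, the fixed-effects call above can be rerun with ODS Graphics enabled and the PROC MCMC plots= option (trace, autocorrelation and density panels):

ods graphics on;
proc mcmc data=smoking nmc=20000 seed=246810 plots=(trace autocorr density);
  random Studyeffect ~ general(0) subject=Study init=(0);
  random Treat ~ general(0) subject=Treatment init=(0) zero="No contact" monitor=(Treat);
  mu = Studyeffect + Treat;
  P = 1 - (1/(1+exp(mu)));
  model R ~ binomial(n=N, p=P);
run;
ods graphics off;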

Page 14

Diagnostics in SAS

Page 15: [no transcribed text]

Page 16

Practical exercise 1

• Run the code as is
• Compare results for each model
• Amend the code to generate fewer MCMC samples: how many are sufficient? How much burn-in is needed? Is thinning necessary in the RE model?
• Which model is the better fit, fixed or random effects?
• Change the baseline from “no contact” to “self help”. Are the results consistent?
• Try changing the priors to other vague priors1; does this affect results? (A starting point is sketched below.)

1 Lambert PC et al, Statistics in Medicine, 2005; 24:2401-28
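A minimal sketch of one such change, assuming a half-normal prior on the between-trial SD is an acceptable alternative vague prior (this particular choice is an assumption, not one taken from the slides); it replaces the corresponding lines in the random-effects syntax:

/* alternative vague priors for mysd; use one, keep the other commented out */
parms mysd 0.2;
prior mysd ~ normal(0, sd=1, lower=0);   /* half-normal(0,1) */
/* prior mysd ~ uniform(0,5); */         /* wider uniform    */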

Page 17

Results from WinBUGS

Fixed effects             mean    sd
Self help                 0.25    0.13
Individual counselling    0.75    0.06
Group counselling         1.02    0.21
DIC                       485.0

Random effects            mean    sd
Self help                 0.46    0.40
Individual counselling    0.78    0.23
Group counselling         1.09    0.51
reSD                      0.79    0.18
DIC                       298.6

Page 18

Illustrative example 2 - continuous data

Pages 19-20: [no transcribed text]

Page 21

Syntax: load data

data scott;
input study trt baseline y SE;
datalines;
1 2 8.5 -1.08 0.12
1 3 8.5 -1.13 0.12
1 1 8.5 0.23 0.2
2 2 8.4 -1 0.1
…
;
run;
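Here y appears to be the arm-level change from baseline in HbA1c and SE its standard error. If a source trial reports only the arm standard deviation and sample size, the standard error can be derived before loading; a minimal pre-processing sketch (the input data set raw_arms and its variables are hypothetical):

data scott;
  set raw_arms;        /* hypothetical data set with study, trt, baseline, y, sd, n */
  SE = sd / sqrt(n);   /* standard error of the arm mean change */
run;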

Page 22

Syntax: fixed effects

proc mcmc data=scott nmc=200000 nthin=20 seed=246810;
  random Studyeffect ~ general(0) subject=Study init=(0);
  random Treat ~ general(0) subject=Treatment init=(0) zero="Placebo" monitor=(Treat);
  Mu = Studyeffect + Treat;
  model Y ~ normal(mean=Mu, var=SE*SE);
run;
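Note that var=SE*SE fixes the within-arm variance at the square of the reported standard error; it is equivalent to the sd=SE form used in the random-effects syntax on the next slide.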

Page 23

Syntax: random effects

proc mcmc data=scott nmc=200000 nthin=20 seed=246810 monitor=(mysd) outpost=outp7 dic;
  random Studyeffect ~ normal(0, var=10000) subject=Study init=(0);
  random Treat ~ normal(0, var=10000) subject=Treatment init=(0) zero="Placebo" monitor=(Treat);
  parms mysd 0.2;
  prior mysd ~ uniform(0,1);
  random RE ~ normal(0, sd=mysd/sqrt(2)) subject=_OBS_ init=(0);
  Mu = Studyeffect + Treat + RE;
  model Y ~ normal(mean=Mu, sd=SE);
run;

Page 24

Syntax: fixed effects meta-regression

proc mcmc data=scott nmc=200000 nthin=20 seed=246810;
  random Studyeffect ~ general(0) subject=Study init=(0);
  random Treat ~ general(0) subject=Treatment init=(0) zero="Placebo" monitor=(Treat);
  parms hba1c 0;
  prior hba1c ~ normal(0, var=10000);
  Mu = Studyeffect + Treat + baseline*hba1c;
  model Y ~ normal(mean=Mu, var=SE*SE);
run;
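A common refinement, not shown on the slide, is to centre the covariate so that the treatment effects are interpreted at a reference baseline HbA1c and the sampler mixes better; only the regression line changes (the centring value 8.4 is illustrative):

Mu = Studyeffect + Treat + (baseline - 8.4)*hba1c;   /* centred covariate; 8.4 is an assumed reference value */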

Page 25

Practical exercise 2

• Run the code as is
• Compare results for each model
• Which model is the better fit: fixed effects, random effects, or meta-regression?
• Amend the code to generate fewer MCMC samples: how many are sufficient? How much burn-in is needed? Is thinning necessary in the RE model?
• Change the baseline from “Placebo” to “Insulin Glargine”. Are the results consistent?
• Compare results to WinBUGS output
• Try changing the priors to other vague priors1; does this affect results?

1 Lambert PC et al, Statistics in Medicine, 2005; 24:2401-28

Page 26

Results from WinBUGS

Fixed effects                        mean    sd
Liraglutide 1.2mg                   -1.04   0.07
Liraglutide 1.8mg                   -1.21   0.06
Insulin glargine                    -0.82   0.06
Exenatide BID                       -0.79   0.05
Exenatide QW                        -1.12   0.06

Fixed effects adjusting for baseline HbA1c
Liraglutide 1.2mg                   -1.02   0.07
Liraglutide 1.8mg                   -1.20   0.06
Insulin glargine                    -0.83   0.06
Exenatide BID                       -0.82   0.05
Exenatide QW                        -1.13   0.06
delta                               -0.41   0.14

Page 27

Random effects from paper