TRANSCRIPT
TOOLS YOU CAN USE April 8, 2014
Academy of Managed Care Pharmacy
International Society for Pharmacoeconomics and Outcomes Research
National Pharmaceutical Council
Speakers
Bernadette Eichelberger, PharmD Director, Pharmacy Affairs, AMCP
Marc Berger, MD Vice President, Real World Data and Analytics, Pfizer
Brian Sweet, MBA Executive Director, U.S. Payer & Real World Evidence, AstraZeneca (Moderator)
Jennifer Graff, PharmD Director, Comparative Effectiveness Research, NPC
Dan Allen, PharmD Clinical Pharmacist Consultant, OmedaRx
How to Ask a Question
• Submit questions and comments via the Questions section in the Control Panel
• Message @npcnow using #npcwebinar
• We also encourage you to tweet during the webinar using #npcwebinar
Marc Berger, MD
Vice President, Real World Data and Analytics, Pfizer
AMCP/ISPOR/NPC CER Collaborative
Part 1: Evaluate Quality of Individual Studies
•Prospective
•Retrospective
•Modeling
•Indirect Methods
Part 2: Synthesizing the Evidence Across Multiple Study Types
• RCT, Observational studies
Part 3: Assessing the Evidence by Decision Makers: A Toolkit
• Tools
• Educational Materials and Training
The Issues
• Pharmacy decision-makers (“users”) are uncertain how to interpret CER evidence not derived from RCTs
– Inherent Causal Inference Challenge Due to Observational Nature
– Less Rigorous Regulatory Oversight
– Complex Statistical Modeling
– Heterogeneous Perspectives and Training
• The issues are growing more challenging
– More studies and more complex analytic approaches
Design Objectives
• Appraisal tools:
– Easy, fast, accurate, minimal skill required
– Help end users assess the quality, credibility, and relevance of studies
• Promote the use of non-RCT evidence in decision making
• Educational tools
• Prospective and Retrospective Observational Studies
• Indirect Treatment Comparison Studies
• Modeling Studies
Evolution of the Tools
• Agreement on 2 main Categories
– Relevance (Are the results applicable?) • Generalizability/PICO
– Credibility (Can I trust the results?) • Validity
• Agreement not to provide a formal score
– Concerns regarding “false precision”
– Concerns regarding “fatal flaws”
– Decision to ask rater to provide global assessment of each domain as either “sufficient” or “insufficient”
– Formal scoring would undermine the “educational” value of the tools
Examples of Questions

Retrospective/Prospective (33 items)
• Design
• Data
• Analysis
• Reporting
• Interpretation
• Conflicts of Interest

Modeling (26 items)
• Validation (external, verification, face)
• Design
• Data
• Analysis
• Reporting
• Interpretation
• Conflicts of Interest

Indirect Treatment Comparisons (15 items)
• Evidence Base
• Analysis
• Reporting
• Interpretation
• Conflict of Interest
Fig. 1
Source: Value in Health 2014; 17:143-156 (DOI: 10.1016/j.jval.2013.12.011)
Step by Step Instructions
[Screenshot of www.cercollaborative.org showing step-by-step instructions, evaluation history, and explanations/definitions.]
More Information: Check Out the March 2014 Value in Health
• Editorial: Tools for Health Care Decision Making: Observational Studies, Modeling Studies, and Network Meta-Analyses
• A Questionnaire to Assess the Relevance and Credibility of Observational Studies to Inform Health Care Decision Making: An ISPOR-AMCP-NPC Good Practice Task Force Report
• Indirect Treatment Comparison/Network Meta-Analysis Study Questionnaire to Assess Relevance and Credibility to Inform Health Care Decision Making: An ISPOR-AMCP-NPC Good Practice Task Force Report
• Questionnaire to Assess Relevance and Credibility of Modeling Studies for Informing Health Care Decision Making: An ISPOR-AMCP-NPC Good Practice Task Force Report
Dan Allen, PharmD
Clinical Pharmacist Consultant, OmedaRx
Who is OmedaRx?
• Stand-alone PBM wholly owned by Cambia Health Solutions
– Formerly RegenceRx
– Cambia Health Solutions: BlueCross/Blue Shield franchise in Oregon, Washington, Idaho, and Utah
– Full PBM services to the Regence family of health plans
• Provide formulary guidance and utilization management strategies to “Blues” and non-Blues plans nationwide
– Medication Reviews
– Medication Policies
– P&T Support
Why Implement ICER?
• Challenge: How do you systematically summarize separate clinical trials and available safety information?
• Institute for Clinical and Economic Review (ICER)
– Rating system evolved from earlier AHIP workgroup guided by insurers to meet needs for coverage decision-making
• Freely available, nationally vetted, meshes with current evaluation methods
Implementation
• Extensive staff training
• In-service to clients’ P&T committees over multiple meetings
• Ongoing training
• Continuous quality improvement
• Decision tracking
• Developing custom formulary frameworks for each client
The ICER Rating Matrix
Rows: certainty in the evidence. Columns: net health benefit (Negative · Comparable · Incremental · Substantial).
• High Certainty: D (Inferior) · C (Comparable) · B (Small/Modest Benefit) · A (Moderate/Large Benefit)
• Moderate Certainty: I (Insufficient to determine) or P/I (Promising but Inconclusive)
• Low Certainty: I (Insufficient to determine)
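The matrix is essentially a two-key lookup: certainty level and estimated net health benefit together determine the letter rating. A minimal sketch of that lookup follows; the function and variable names are hypothetical, and the mapping of moderate-certainty benefit columns to P/I versus I is an assumption, since the slide does not spell out which columns receive which label.

```python
# Illustrative encoding of the ICER rating matrix as a lookup table.
# Names are hypothetical; this is not ICER's own implementation.

RATINGS_HIGH_CERTAINTY = {
    "negative": "D (Inferior)",
    "comparable": "C (Comparable)",
    "incremental": "B (Small/Modest Benefit)",
    "substantial": "A (Moderate/Large Benefit)",
}

def icer_rating(certainty: str, benefit: str) -> str:
    """Look up the rating for a (certainty, net health benefit) point estimate."""
    if certainty == "high":
        return RATINGS_HIGH_CERTAINTY[benefit]
    if certainty == "moderate":
        # Assumption: at moderate certainty a positive benefit estimate is at
        # best "promising but inconclusive"; anything else is insufficient.
        if benefit in ("incremental", "substantial"):
            return "P/I (Promising but Inconclusive)"
        return "I (Insufficient to determine)"
    # Low certainty never supports a graded rating.
    return "I (Insufficient to determine)"
```

The point of the encoding is that only high-certainty evidence reaches the graded A-D ratings; lowering certainty collapses the same benefit estimate toward P/I or I.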
The ICER Rating Matrix – Effect of Certainty
[The rating matrix above, repeated with “Treatment Effect (TE)” markers illustrating how the certainty level changes the rating assigned to a given treatment effect.]
OmedaRx ICER Process
1. Estimate certainty in the overall clinical data
2. Estimate the magnitude of net health benefit
3. Identify the “point estimate” on the ICER matrix
4. Make a safety modification to the point estimate
Consideration for Using the ICER Matrix
• Most reviews involve two comparisons – against placebo and against existing therapies
• New therapies often lack direct comparative trials
– Instead, compare syntheses of each therapy’s placebo-controlled data
• This process is always subject to peer review
– We challenge each other on our assessment of the evidence, our assessment of the standard of care, and our assessment of the safety profile
Finding Certainty of Benefit

• High Certainty
– ICER definition of evidence: allows estimation of the relative potential chances/magnitude of net health benefit
– OmedaRx quantification of studies: ≥1 high confidence study with consistent results, OR ≥2 fair confidence studies with consistent results and a possibly clinically meaningful endpoint
• Moderate Certainty
– ICER definition of evidence: difficult to estimate the net health benefit with precision
– OmedaRx quantification of studies: ≥1 high confidence study with consistent results, OR ≥1 fair confidence study with consistent results and a possibly clinically meaningful endpoint, OR ≥2 low confidence studies with consistent results and a possibly clinically meaningful endpoint
• Low Certainty
– ICER definition of evidence: insufficient to allow assessment of the net health benefit
– OmedaRx quantification of studies: low confidence studies not meeting the threshold for moderate certainty (defined above), OR ≥2 fair confidence studies with inconsistent results
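These quantification thresholds amount to a rules-based classifier over a set of graded studies. A hedged sketch, under the assumption that tiers are checked strongest-first so the highest applicable tier wins; the class, field, and function names are hypothetical, not OmedaRx's actual tooling.

```python
# Hedged sketch of the OmedaRx quantification rules as a classifier.
# A study is summarized by its confidence grade, result consistency,
# and whether its endpoint may be clinically meaningful.
from dataclasses import dataclass
from typing import List

@dataclass
class Study:
    confidence: str            # "high", "fair", or "low"
    consistent: bool           # results consistent across the evidence base
    meaningful_endpoint: bool  # possibly clinically meaningful endpoint

def overall_certainty(studies: List[Study]) -> str:
    """Map a set of graded studies to an overall certainty tier."""
    # Count studies at a given confidence grade that are consistent AND
    # have a possibly clinically meaningful endpoint.
    def qualifying(level: str) -> int:
        return sum(1 for s in studies
                   if s.confidence == level
                   and s.consistent and s.meaningful_endpoint)

    any_high_consistent = any(
        s.confidence == "high" and s.consistent for s in studies)

    if any_high_consistent or qualifying("fair") >= 2:
        return "high"
    if qualifying("fair") >= 1 or qualifying("low") >= 2:
        return "moderate"
    # Anything else (e.g. a lone low-confidence study, or fair-confidence
    # studies with inconsistent results) falls to low certainty.
    return "low"
```

For example, a single high-confidence study with consistent results yields "high", a single qualifying fair-confidence study yields "moderate", and a lone low-confidence study yields "low".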
Determining Magnitude of Health Benefit
[The ICER rating matrix above, repeated; the point estimate is placed in one of the benefit columns: Negative, Comparable, Incremental, or Substantial Health Benefit.]
Make Safety Modification to Point Estimate
[Slide table: example safety conclusions and their effect on the estimate of certainty and the estimate of benefit.]
• Track record with proven advantages (over active comparator)
• Track record with no new safety concerns
• Insufficient track record
2013 New Medications: Overall Evidence and Clinical Trials Quality
• Individual clinical trials: high confidence 1 (1%) · fair confidence 11 (13%) · low confidence 76 (86%)
• Medications’ ICER evidence synthesis: high certainty 1 (2%) · moderate certainty 17 (40%) · low certainty 17 (40%) · review in progress 8 (18%)
Lessons Learned
• Successful clinical evidence synthesis is dependent on rigorous clinical evidence evaluation
• Agreed-upon, uniform grading and synthesis guidelines are important for maintaining quality across multiple reviewers
• Never allow the quantification guidelines to override professional judgement
Bernadette Eichelberger, PharmD
Director, Pharmacy Affairs, AMCP
AMCP/ISPOR/NPC CER Collaborative
Part 1: Evaluate Quality of Individual Studies
•Prospective
•Retrospective
•Modeling
•Indirect Methods
Part 2: Synthesizing the Evidence Across Multiple Study Types
• RCT, Observational studies
Part 3: Assessing the Evidence by Decision Makers: A Toolkit
• Tools
• Educational Materials and Training
Additional Tools and Resources at www.cercollaborative.org
Summary Reports Can Be Downloaded in Word or Excel
Register Today for the CER Certificate Program http://www.pharmacists4knowledge.org/cips/cer
CER Certificate Program Demo and Modules Available Today
Jennifer Graff, PharmD
Director, Comparative Effectiveness Research, NPC
CER Collaborative Tool Improves the Evidence Assessment Dialogue
• Clarity
– Agreed-upon elements of “good practice” for conducting studies
– When evidence is considered = ROI for investing in further research
• Consistency
– Evidence is evaluated in a similar fashion at different times and by different individuals or organizations
• Transparency
– The need for evidence is considered by those generating and using evidence
CER Collaborative Improves Evidence Generation Certainty
• Began with good practices for research; goal is to ensure there is consistency in evidence evaluation
• Tools can be incorporated in study design, management, conduct and reporting processes
• Starting point for dialogue
– Feasibility of inclusion in a journal vs. on-line appendix vs. study report
CER Collaborative Continues the Dialogue to Improve Evidence Communication
• Framework for dialogue with managed market accounts and field-based research groups
• Increase decision-maker awareness of all types of evidence
• May provide additional support for determining competent and reliable scientific evidence
Resources You Can Use: https://members.npcnow.org/resources
• Protocol and Publication Questionnaires
• www.cercollaborative.org
• NPC Champion Slides
Implications for Industry
• Use familiar language of the decision-maker
• Ensure research meets good practice principles through eyes of reviewers
– Consider when designing research
– Review when finalizing publications
– Consider when communicating evidence to decision-makers
• Training
• Dialogue, dialogue, dialogue
Special Thanks to Work Group Members
(Prospective · Retrospective · Indirect Treatment Comparisons · Modeling · Synthesizing a Body of Evidence)
• Dan Allen, RegenceRx
• Winnie Yang, Blue Shield of California
• Sherry Andes, Catamaran
• Cheryl Kaltz, U. of Michigan
• Lisa Cashman, MedImpact
• Karen Worley, Humana
• Eric Cannon, SelectHealth
• Jessica Daw, UPMC Health Plan
• Bimal Patel, MedImpact
• Jon Clouse, OptumInsight
• Scott Devine, Merck
• Kristijan Kahler, Novartis
• Joseph C. Cappelleri, Pfizer
• John Penrod, Bristol-Myers Squibb
• Jeff White, WellPoint NextRX
• John Graham, Bristol-Myers Squibb
• Nicole C. Quon, Optimer
• Vijay Joish, Bayer
• Hong Kan, GlaxoSmithKline
• Rahul Ganguly, GlaxoSmithKline
• Don Husereau, U. of Ottawa
• William Crown, OptumInsight Life Sciences
• Thomas Trikalinos, Tufts Medical Center
• David Eddy, Archimedes Inc.
• Bryan Luce, PCORI
• Dan Mullins, U. of Maryland
• Michael Johnson, U. of Houston
• Georgia Salanti, U. of Ioannina
• Andy Briggs, U. of Glasgow
• Richard Willke, Pfizer
• Marc Berger (Chair), Pfizer
• Bradley Martin (Chair), U. of Arkansas
• Jeroen Jansen (Chair), Redwood Consulting
• J. Jaime Caro (Chair), United BioSource Corp.
• Brian Sweet (Chair), AstraZeneca
• Helen Sherman (Chair), Select Benefit Guidance
Questions?
Q&A Panel
Bernadette Eichelberger, PharmD Director, Pharmacy Affairs, AMCP
Marc Berger, MD Vice President, Real World Data and Analytics, Pfizer
Brian Sweet, MBA Executive Director, U.S. Payer & Real World Evidence, AstraZeneca (Moderator)
Jennifer Graff, PharmD Director, Comparative Effectiveness Research, NPC
Dan Allen, PharmD Clinical Pharmacist Consultant, OmedaRx
Thank You
• The webinar will be archived and posted on NPC’s website in coming days.
• For further information, contact:
– Brian Sweet [email protected]
– Marc Berger [email protected]
– Dan Allen [email protected]
– Bernadette Eichelberger [email protected]
– Jennifer Graff [email protected]
cercollaborative.org