
Session 3 – Managing uncertainty in the post-launch phase

Piotr Szymanski, Associate Professor of Cardiology, European Society of Cardiology

Managing uncertainty in the post-launch phase – key points

1. Evidence generation timeline
2. Uncertainty of evidence
3. Managing uncertainty

Launch of technology

[Timeline: CE mark → available, not reimbursed → HTA → available, reimbursed]

Evidence generation

[Timeline: CE mark → available, not reimbursed → HTA → available, reimbursed; evidence is generated in each phase]

Evidence generation timeline vs source

[Timeline: CE mark → available, not reimbursed → HTA → available, reimbursed; the evidence generated along this timeline comes from both non-commercial research and industry-sponsored research]

Source of evidence

Non-commercial : commercial = 1:3 – about 1/3 of clinical trials are non-commercial

Managing uncertainty in the post-launch phase – key points

1. Evidence generation
2. Uncertainty of evidence
3. Managing uncertainty

Evidence pyramid

"Benchmark" – the randomized controlled trial

Quality of Evidence (JAMA. 2014;311(4):368-77)

• Randomized trials in 9/10 cases (in cancer, 5/10 cases)
• Surrogate outcomes in ½ of trials
• Populations studied and duration of treatment are disproportionately small compared to real-life numbers

Quality of Evidence – Medical Devices (JAMA. 2017;318(7):619-625)

Less than half of clinical studies submitted for approval of high-risk medical devices were randomized

Evidence available at technology approval

The randomized controlled trial is the "benchmark"

Quality of Evidence (PLoS Med. 2005;2:e124)

The randomized controlled trial is a "benchmark" – but its findings may be false

Even with a good-quality RCT, due to issues related to power, bias and pre-test probability, the proportion of true to false results is 1:1
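To make the 1:1 figure concrete (the numbers below are an illustrative assumption, not taken from the slide), the framework of PLoS Med. 2005;2:e124 gives the post-study probability that a positive finding is true as

PPV = (1 − β)R / [(1 − β)R + α]

where R is the pre-study odds that the tested relationship is real, 1 − β is the power and α the type I error. With α = 0.05, power = 0.80 and assumed pre-study odds of 1:16 (R = 0.0625), PPV = 0.05 / (0.05 + 0.05) = 0.5, i.e. true and false positive findings occur in roughly a 1:1 proportion.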

In the case of an underpowered RCT, the risk rises 5-fold
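Continuing the same illustrative calculation (again with assumed numbers, not the speaker's): dropping the power from 0.80 to 0.20 while keeping α = 0.05 and R = 0.0625 gives PPV = 0.0125 / (0.0125 + 0.05) = 0.2, so the ratio of false to true positive findings moves from 1:1 to 4:1; this is the same mechanism that underlies the 5-fold rise in risk quoted on the slide.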

Quality of Evidence (JAMA. 2005;294:218-28)

1/4 of 49 highly cited clinical studies remained largely unchallenged by subsequent studies

Quality of evidence and postmarket safety (JAMA. 2017;317:1854-1863)

The risk of postmarket safety events is over two-fold higher with accelerated approval

Quality of evidence and postmarket safety – medical devices (BMJ. 2016;353:i3323)

The risk of postmarket safety events was two-fold higher in the EU vs the US

Missing evidence in the postapproval phase (BMJ 2017;357:j1680)

No postapproval studies were performed for 43 of the 123 (35%) indications approved on the basis of limited evidence (a single pivotal trial or surrogate endpoints)

Missing evidence for HTA

[Timeline: CE mark → available, not reimbursed → HTA → available, reimbursed; part of the evidence needed for HTA is missing]

Managing uncertainty in the post-launch phase – key points

1. Evidence generation
2. Uncertainty of evidence
3. Managing uncertainty

Clinical registries – technology comparisons (N Engl J Med 2017;376:526)

An integrated clinical-data surveillance system was used to conduct a prospective, propensity-matched analysis of the safety of the Mynx vascular-closure device, as compared with alternative approved vascular-closure devices, with data from the CathPCI Registry of the National Cardiovascular Data Registry (a schematic propensity-matching sketch follows below).

Vascular complications were more frequent with Mynx than with comparators
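As a rough illustration of the method mentioned above (a toy sketch on synthetic data, not the CathPCI Registry analysis; the variable names, covariates and greedy 1:1 matching strategy are assumptions made for this example):

# Illustrative sketch only: schematic 1:1 propensity-score matching on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
covariates = rng.normal(size=(n, 3))                            # baseline patient characteristics
device = rng.binomial(1, 1 / (1 + np.exp(-covariates[:, 0])))   # 1 = study device, 0 = comparator
complication = rng.binomial(1, 0.04 + 0.02 * device)            # synthetic safety outcome

# 1) Propensity score: estimated probability of receiving the study device
ps = LogisticRegression().fit(covariates, device).predict_proba(covariates)[:, 1]

# 2) Greedy 1:1 nearest-neighbour matching on the propensity score
treated = np.where(device == 1)[0]
controls = set(np.where(device == 0)[0])
pairs = []
for i in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[c] - ps[i]))          # closest remaining comparator
    pairs.append((i, j))
    controls.remove(j)

# 3) Compare event rates within the matched cohort
rate_device = np.mean([complication[i] for i, _ in pairs])
rate_comparator = np.mean([complication[j] for _, j in pairs])
print(f"matched complication rate: device {rate_device:.3f} vs comparator {rate_comparator:.3f}")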

Postmarket surveillance (J Cardiovasc Electrophysiol. 2018;29:5)

The risk of malfunction was several dozen times higher post- as compared to pre-FDA approval

Variations in annual CRT-D implant rates per million (ESC Atlas of Cardiology database, 2017)

Variations in uptake may result in variable learning curves and differences in complication rates, cost-effectiveness, etc.

Quality of Evidence (PLoS Med. 2005;2:e124)

In terms of the proportion of true results (PPV), observational studies rank below randomized trials: observational < randomized
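One way to see why (the bias values below are illustrative assumptions, not from the slide): in the same PLoS Med. 2005;2:e124 framework, if a fraction u of analyses is biased towards reporting a positive result, the post-study probability becomes

PPV = [(1 − β)R + uβR] / [R + α − βR + u − uα + uβR]

Observational designs are typically more exposed to confounding and selection bias (larger u) than randomized trials, so their PPV is lower for the same question; for example, with α = 0.05, power = 0.80 and R = 1 (1:1 pre-study odds), PPV ≈ 0.85 at u = 0.10 but ≈ 0.67 at u = 0.40.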

Registry-based randomized trial (Courtesy: Jonas Oldgren, UCR)

A change in guidelines: from Class IIa (use) to Class III (do not use) (Courtesy: Jonas Oldgren, UCR)

A change in clinical practice (Courtesy: Jonas Oldgren, UCR)

Conclusions

• The benefits of introducing a new technology should be weighed against the inevitable uncertainty of the evidence

• Approval of new technologies on the basis of limited evidence should result in a risk management plan, including systematic collection of real-world data and pragmatic clinical trials, to reduce uncertainty

Conclusions

• Adequate-quality real-world data are fundamental to judging the clinical benefits of new health technologies compared with existing ones

• Joint clinical assessments should be coordinated, but not aligned with CE marking, to allow for the collection of real-world data

• A coordinated effort should be undertaken to improve the quality of data available for HTA

Piotr Szymanski pszymanski@ikard.pl

7 June 2018
