
Institute for Healthcare Improvement

Best Practices: Data for Learning and Improvement

Kevin Little, Ph.D., Improvement Advisor

Kris White, IHI Faculty

These presenters have nothing to disclose.

DRAFT

November 2014

Session Objectives

At the conclusion of this session, participants will be able to:

• Develop changes to improve analysis of survey data
• Evaluate current state in use of qualitative and quantitative patient experience data
• Distinguish slow feedback from fast feedback


Agenda

Overview

Recognize many data sources

Survey Data Top Five Advice

Qualitative Data: Start with stories

Slow vs Fast Feedback

Patient Experience Data Overview

No single perfect measure of patient experience exists

“Good enough” data, multiply sourced, drives improvement

Focus on things that matter

Understand organizational opportunities AND team-specific opportunities


STOP DATA CRAZINESS

Stages of Dealing with Patient Experience Data

Deny – “We have Patient Experience data????”

Ignore – “Just don’t make eye contact, don’t open the email and if subject comes up… change it and talk fast!”

Shoot the messenger – “The survey tool is biased, my patients are crabbier than anybody else’s… cannot possibly reflect what is going on in my unit!!”

Accept – “OK, help me learn how to use this to drive change and understand our impact on patients and families.”

Use – “Identify high-leverage improvement to create the best care outcomes and best environment in which to work.”


Symptoms of trouble

“We pretty much just look at our performance internally and overall we feel pretty good about it.”

“We look only at organizational numbers rolled up, that’s what matters at the end of the day.”

“We regularly review our data and form teams around the lowest scores.”

“Every month we review our scores and if we drop down, we form a team to fix it, and if it’s up- we get a pizza party.”

“It’s all so overwhelming- it’s just so hard to know where to start.”

“CAHPS has really changed our focus- it’s really the only thing we are focusing on now in my organization.”


It’s all about the “clues”

Work

Look

Feel

These clues are then translated into evaluation of the care and quality of the providers/organization.

We must constantly be asking ourselves: “What are the data really telling us?”


Sources of Patient Experience Data

A holistic perspective is critical!

CAHPS: respecting its influence, understanding its limitations

Press Ganey, NRC Picker, Gallup, Avatar, etc.

Focus groups

Patient Relations

Patient/Family advisors

Billing

Physicians

Safety culture surveys

Staff and provider engagement surveys

“Hot” comments- a gold mine!


A Table To Organize Patient Experience Data

Data Source | Data Type | Direct or Indirect
CAHPS surveys (national government-sponsored patient experience surveys in U.S.) | Survey data | Direct
3rd-party formal surveys, linked to a common set of questions across multiple organizations | Survey data | Direct
In-house comment cards / open-ended questions of patients | Survey data | Direct
Staff vitality surveys, safety culture surveys | Survey data | Indirect
Patient/Family Advisors | Focus groups, conversations | Direct
Patients and Families | Focus groups, conversations | Direct
Staff | Focus groups, conversations | Indirect
Physicians | Focus groups, conversations | Indirect
Front-line process/service performance data | Workplace (“Gemba”) data | Indirect/Direct
Rounding observations | Workplace (“Gemba”) data | Indirect
Patient Relations data (grievances, complaints and positive letters) | Admin/Operations data | Direct
Billing complaints and issues (U.S.) | Admin/Operations data | Direct
Dashboard metrics: LWBS, errors, safety performance, etc. | Admin/Operations data | Indirect

Why are multiple data sources important?


Patient experience is multi-faceted

Each data source gives you one view of patient experience.

Multiple sources provide different views and details, enabling you to build a coherent picture.

Activity: Data self-assessment


Activity Instructions

Individually fill out the chart (3 minutes)

Table Bingo (5 minutes):
– Find out who has at least one row with YES filled in across all four columns.
– What's the row that is most rarely filled in across the columns?

Group Debrief (5 minutes)

CAHPS and other formal survey data



What about CAHPS?

Why we care
• Common across all U.S. hospitals (and now clinical groups, too)
• Public access
• Ballpark right stuff
• Suitable for dashboards, on run charts
• CMS has your attention

Limitations in our work
• Time lag—too delayed for improvement work
• Global numbers may not reflect targeted unit work
• Low response rates
• “Silo” focus, not team focus for care
• The “n” problem: to double precision, you need to quadruple sample size (see the sketch below)
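A minimal Python sketch (my illustration, not part of the original deck) of the “n” problem, using the binomial standard error sqrt(p(1-p)/n): quadrupling the number of surveys roughly halves the sampling uncertainty of a top-box percent.

# Illustrative sketch of the "n" problem: to double precision, quadruple n.
# Assumes the top-box percent behaves like a binomial proportion.
from math import sqrt

def standard_error(p, n):
    """Approximate sampling standard error of a proportion p based on n surveys."""
    return sqrt(p * (1 - p) / n)

p = 0.80  # suppose the "true" top-box rate is 80%
for n in (50, 200, 800):
    print(f"n={n:4d}  SE = {standard_error(p, n):.3f}")
# Each 4x increase in n cuts the standard error in half:
# n=  50  SE = 0.057
# n= 200  SE = 0.028
# n= 800  SE = 0.014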

CAHPS* data: Five items to know

1. Use Top Box
2. Understand Percentiles
3. Interpret Correlations
4. Remember how the “n” matters
5. Plot your data in time order

*This advice also applies to every other formal patient experience survey we know.


Top Box

Why does it matter? What are the implications?

For survey data, “Top Box” refers to the most positive choice on an ordered scale.* In the CAHPS data, “Always” is often the Top Box.

*There are a few exceptions; for example, on HCAHPS survey question 21, the top box refers to evaluating the hospital as 9 or 10 on a 10-point scale, with 10 the best hospital possible.
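To make the definition concrete, here is a small, hypothetical Python sketch of computing a top-box percent from raw responses; the responses are invented, not drawn from any CAHPS file:

# Hypothetical example: compute the top-box percent for one survey item,
# treating "Always" as the top box per the CAHPS convention described above.
responses = ["Always", "Usually", "Always", "Sometimes", "Always",
             "Never", "Always", "Usually", "Always", "Always"]

top_box = "Always"
top_box_pct = 100 * sum(r == top_box for r in responses) / len(responses)
print(f"Top-box percent: {top_box_pct:.1f}%")  # 60.0% for this invented sample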

CAHPS Clinician & Group Surveys, 12-Month Survey 2.0; Population: Adult; Language: English; last updated September 1, 2011.


Three Levels of Caring (Fred Lee)
Correlation of Patient Care and Evaluation

Staff Motivation | Staff Performance | Patient Evaluation
Inspired | Compassion | 5 (Very Satisfied)
Required | Courtesy | 4 (Satisfied)
Hired | Competence | 3 (Neutral)
Fired | | 1-2 (Dissatisfied)

Patients’ Perception of Overall Quality of Care

Source: HCAHPS Survey & PRC Loyalty Survey: Why measure with both? PRC Inc., January 2008

Patients who rate Quality of Care as “Excellent” are four times more likely to recommend you than those who rate Quality of Care as “Very Good.”

Chart: Likelihood to Recommend (percent “Excellent”) by rating of Overall Quality of Care: Excellent 87.0%, Very Good 22.8%, Good 7.9%, with Fair and Poor at 1.1% and 2.8%.


Reasons to Focus on Top Box

Common way for CMS and 3rd-party vendors to report information

Top-box responses are linked to strong customer loyalty

Why do you want to focus on less than best service and performance?

Partial accounting for "positive bias" in survey responses by patients

Percentiles

Why does it matter? What are the implications?



Chart: Trend in Inpatient Experience. Source: 2011 Press Ganey Hospital Pulse Report

Chart: Four-Year Trend in CG CAHPS Visit Survey Data, 2009-2014: top-box Overall Rating of Provider, overall and at the 90th, 75th, 50th, and 25th percentiles. Source: CAHPS database, 2010-2013.


https://www.cahpsdatabase.ahrq.gov/CAHPSIDB/Public/CG/CG_Topscores.aspx (accessed 4 October 2014)

Example: Communications Composite, based on six questions

Select the best answers based on the Percentile Top Box Scores table from 2013 CG CAHPS patient responses:

1. For the top 10% of providers nationally, more than 97 out of 100 patients assessed the provider as "always listened carefully." T / F
2. More than 50% of providers nationally had at least 9 out of 10 of their patients say the provider always spent enough time with them. T / F
3. For the composite, a change of 3% in the top-box score translates into a change in the percentile score of 25 or more. T / F


Chart: Overall Inpatient Experience Score by Hospital Size (Number of Beds). Source: 2010 Press Ganey Hospital Pulse Report

Understanding Data--Not for excuses

By hospital size?
By specialty?
By insurance status?


Chart: Patient Experience by Type of Insurance. Source: 2010 Press Ganey Hospital Pulse Report

Why percentiles matter

You get a sense of the market and performance by competitors—can be a shock!

Overall organization performance masks service line performance (stratify!)

The same “top box” percent for different service lines has a different interpretation
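A short illustration (invented reference data, not the CAHPS database) of why a compressed percentile scale matters: the percentile rank is just the share of comparison organizations at or below your score, so when scores bunch together a small top-box change moves the percentile a long way.

# Illustration with invented data: percentile rank of a top-box score
# within a tightly clustered reference distribution.
import numpy as np

rng = np.random.default_rng(0)
# Pretend the comparison distribution of top-box scores clusters near 90%.
reference = rng.normal(loc=90, scale=3, size=1000).clip(0, 100)

def percentile_rank(score, ref):
    return 100 * np.mean(ref <= score)

for score in (88, 91, 94):
    print(f"top box {score}% -> about the {percentile_rank(score, reference):.0f}th percentile")
# A 3-point change in the top-box score can shift the percentile by 25 points or more.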


Correlations

Why do they matter? What are the implications?


All survey questions are not created equal!!!

Presenter note (Kevin Little, 9/30/2013): We have to make sure we don't miss a key point. Low correlations may arise from lack of scatter in the skill of nurses or physicians in the analysis. Absence of correlation does not mean no relationship.


What is correlation?

Correlation, based on either scores or ranks, measures strength of association and ranges from 1 (perfect positive linear or rank-order relationship) through 0 (no linear or rank relationship) to -1 (perfect negative linear or reverse rank-order relationship).

Here’s a picture that shows some invented data, with the correlation coefficient ranging from 0.96 to 0.55.
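A minimal sketch (invented data, not the Press Ganey database) of computing both a score-based (Pearson) and a rank-based (Spearman) correlation between one survey item and the overall rating:

# Invented example: correlation between an item score and the overall rating.
import numpy as np
from scipy import stats

# Scores for 12 hypothetical respondents on a 1-5 scale.
item_score = np.array([5, 4, 5, 3, 4, 5, 2, 4, 5, 3, 4, 5])
overall    = np.array([5, 4, 5, 3, 5, 5, 2, 4, 4, 3, 4, 5])

pearson = np.corrcoef(item_score, overall)[0, 1]      # linear (score) correlation
spearman = stats.spearmanr(item_score, overall)[0]    # rank correlation

print(f"Pearson r  = {pearson:.2f}")
print(f"Spearman r = {spearman:.2f}")
# Remember the caution above: a low correlation can also reflect little
# scatter in the data; absence of correlation does not mean no relationship.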

Adult Inpatient

How well your pain was controlled

Staff addressed emotional needs

Room Cleanliness

Staff include decision re: treatment

Noise level in and around room

Staff attitude toward visitors

Staff sensitivity to inconvenience

Skill of physician

Teach/instruct self-care, med, treatment

Nurses kept you informed

Press Ganey National database – through June 30, 2012


Adult Inpatient: Overall Rating correlations

Press Ganey National database – through June 30, 2012

Staff addressed emotional needs .79

Staff sensitivity to inconvenience .78

Teach/instruct self-care, med, treatment .78

Staff attitude toward visitors .74

Nurses kept you informed .73

How well your pain was controlled .69

Skill of physician .67

Room cleanliness .62

Noise level in and around room .52

Ambulatory Surgery

Nurse’s courtesy toward family

Degree staff worked together

Convenience of parking

Information given your family

Our concern for privacy

Information day of surgery

Press Ganey National database – through June 30, 2011


Ambulatory Surgery

Press Ganey National database – April 1, 2012-June 30, 2012

Degree staff worked together .79

Our concern for privacy .76

Information day of surgery .75

Information given your family .74

Nurses courtesy toward family .69

Convenience of parking .53

Why correlations matter

Correlations are a first step to making sense of relations among multiple survey questions

The lowest score on a panel of questions may not be strongly associated with overall evaluation

Tackling the lowest score may not be a good use of organization resources

See the Data Tools Self Assessment for more details about correlation


How n matters


“n” should affect interpretation

Suppose your top-box performance “really” is 80%. Variation from sampling effects gives this table as the number n of surveys increases:*

*Using the binomial model and determining the range that you get 95% of the time on repeated sampling. CMS tables use the 95% threshold.

True top-box rate: 80.0% ("no surprise" range)
n | Low | High
10 | 50.0% | 100.0%
20 | 60.0% | 95.0%
30 | 63.3% | 90.0%
50 | 70.0% | 90.0%
100 | 73.0% | 87.0%
200 | 74.0% | 85.0%
400 | 75.8% | 83.5%


“n” should affect interpretation

Suppose your top-box performance “really” is 90%. Variation from sampling effects gives this table:

True top-box rate: 90.0% ("no surprise" range)
n | Low | High
10 | 70.0% | 100.0%
20 | 75.0% | 100.0%
30 | 76.7% | 100.0%
50 | 80.0% | 96.0%
100 | 84.0% | 95.0%
200 | 84.0% | 95.0%
400 | 87.0% | 92.8%

Connection to control charts

Given n and the percent of top-box responses, p, control chart calculations give a range of plausible percents.

Example: If you observe p = 80%, here are the lower and upper control limits (LCL and UCL) for several n values:

n | LCL | UCL
10 | 42.1 | 100.0
25 | 56.0 | 100.0
50 | 63.0 | 97.0
100 | 68.0 | 92.0
200 | 71.5 | 88.5
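The LCL/UCL values above are consistent with the standard p-chart formula, p plus or minus 3 times sqrt(p(1-p)/n), truncated to the 0-100% range. A short Python sketch that reproduces the table:

# P-chart limits for a top-box percent: p +/- 3*sqrt(p*(1-p)/n), capped at 0-100%.
# Reproduces the LCL/UCL table above for an observed p of 80%.
from math import sqrt

def p_chart_limits(p, n):
    sigma = sqrt(p * (1 - p) / n)
    lcl = max(0.0, p - 3 * sigma)
    ucl = min(1.0, p + 3 * sigma)
    return 100 * lcl, 100 * ucl

for n in (10, 25, 50, 100, 200):
    lcl, ucl = p_chart_limits(0.80, n)
    print(f"n={n:3d}  LCL={lcl:5.1f}  UCL={ucl:5.1f}")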


Overall Rating by quarter

Quarter | n | Top Box %
2012 Q1 | 12 | 58.33
2012 Q2 | 16 | 75
2012 Q3 | 10 | 90
2012 Q4 | 10 | 80
2013 Q1 | 3 | 100
2013 Q2 | 10 | 80
2013 Q3 | 7 | 42.86
2013 Q4 | 10 | 90
2014 Q1 | 6 | 50
2014 Q2 | 12 | 50

Any evidence that the Unit's Overall Rating in 2014 is worse than the 2012-2013 baseline?

In review of survey data, an executive and unit leader were concerned that the 2014 Top Box scores were lower than 75%, the 2012-2013 average. What actions are called for based on these data?
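One way to examine the question (my illustration, not the presenters' answer key) is to pool the 2014 quarters and compare the pooled percent with the p-chart limits around the 75% baseline, using the formula from the previous slide:

# Hypothetical worked check using the p-chart limits from the previous slide.
from math import sqrt

baseline_p = 0.75                    # 2012-2013 average top-box proportion
n_2014 = 6 + 12                      # surveys in 2014 Q1 + Q2
topbox_2014 = 0.50 * 6 + 0.50 * 12   # top-box responses implied by 50% each quarter
p_2014 = topbox_2014 / n_2014

sigma = sqrt(baseline_p * (1 - baseline_p) / n_2014)
lcl = max(0.0, baseline_p - 3 * sigma)
ucl = min(1.0, baseline_p + 3 * sigma)

print(f"2014 pooled: {100 * p_2014:.1f}% on n={n_2014}")
print(f"Limits around the 75% baseline for n={n_2014}: {100 * lcl:.1f}% to {100 * ucl:.1f}%")
# With n this small, 50% still falls inside the range expected around 75%,
# which argues for caution rather than a reflexive fix-it team.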

Knowledge of the “n effect” should:

• Dampen or eliminate management cycles of despair or celebration based on a single reported percent
• Cause you to interpret unit-level results with great caution (unit-level n may be 5 or fewer)
• Help make the case for plotting survey results in time order
• Inspire you to learn and use control charts--see Provost and Murray (2011)


Plot survey data in time order

Start with run charts, move on to control charts

Run chart: baseline January 2011 to August 2012, median = 60.5

Is there any evidence that a management intervention begun September 2012 (dashed vertical line) had any impact on HCAHPS Overall Recommendation? Do you have any questions about the baseline period?


In 2011, 9 consecutive values below the baseline median is highly unusual; there appears to be a shift to better performance before the intervention starts. In 2013, 8 consecutive values above the reference median signal further (incremental) improvement. For more details on run chart rules, see Perla et al. (2011).

Take Home Exercise: What interpretations should leaders of this hospital draw as they assess the apparent impact of the intervention begun in September 2013? (n ~ 30 each month)


Recap: Why plot data in time order?

Single survey numbers provide no useful guidance for improvement

You need time order to make “before and after” comparisons to assess progress

If “n” is about the same for each survey number in the series, you can look for striking patterns over time to signal improvement or decay

Again: See Perla et al. (2011) for run chart “rules”
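As a companion to the run chart "rules" cited above, here is a minimal sketch (invented monthly data) of the shift signal: a run of consecutive points on one side of the baseline median.

# Invented monthly top-box percents; find the longest run of points
# strictly above or below a baseline median (a run chart "shift" signal).
baseline_median = 60.5
monthly = [58, 62, 59, 61, 60, 63, 66, 65, 67, 64, 68, 66, 69, 70]

def longest_run_one_side(values, median):
    best = current = 0
    side = None
    for v in values:
        if v == median:          # points on the median neither add to nor break a run
            continue
        s = "above" if v > median else "below"
        current = current + 1 if s == side else 1
        side = s
        best = max(best, current)
    return best

run = longest_run_one_side(monthly, baseline_median)
print(f"Longest run on one side of the median: {run} points")
# Perla et al. (2011) treat six or more consecutive points on one side of the
# median as a shift; the 8-9 point runs described above are well past that.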

A quick look at qualitative data



Accessible Qualitative Sources

Patient stories and letters

Formal leader patient rounding (alone, in pairs, with physicians, etc.)

Informal rounding

Conversations/discussion in staff meetings

Patient stories need interpretation

RULE: Always have a formal or informal leader prepared to interpret the meaning of a patient story

“This is what the story means to me.”

“This is what the story means to our organization.”


FEEDBACK IS FUNDAMENTAL

Fast versus Slow Feedback

Feedback is vital to Sex

2 July 2003, Science 299 (5615): 2054


FAST AND SLOW FEEDBACK: PHYSICIAN COMMUNICATIONS



3.1 Fast Feedback, Paper Method

The feedback form
– Behaviors
– Patient perceptions

How the form is administered (steps)
– Feedback to physicians, how handled

Summary feedback table and graph
– Interpretation of the data summary
– Use in engaging physicians

Fast Feedback

1. Feedback form on paper
2. First four questions: yes or no on behaviors in the bundle
3. Next four questions: patient perception of the conversation

Note: Slow feedback means the monthly or quarterly numbers from formal HCAHPS or 3rd-party vendors.
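A hypothetical sketch (question names and data invented, not the actual form) of how the paper forms might roll up into a simple summary of the yes/no behavior bundle for a review period:

# Hypothetical tally of fast-feedback forms: percent "yes" per bundle behavior.
from collections import Counter

# Each completed form: yes/no answers to four (invented) behavior questions.
forms = [
    {"sat_down": True,  "asked_worries": True,  "explained_plan": True,  "invited_questions": False},
    {"sat_down": True,  "asked_worries": False, "explained_plan": True,  "invited_questions": True},
    {"sat_down": False, "asked_worries": True,  "explained_plan": True,  "invited_questions": True},
    {"sat_down": True,  "asked_worries": True,  "explained_plan": False, "invited_questions": True},
]

yes_counts = Counter()
for form in forms:
    yes_counts.update(behavior for behavior, answer in form.items() if answer)

for behavior in forms[0]:
    pct = 100 * yes_counts[behavior] / len(forms)
    print(f"{behavior:18s} {pct:5.1f}% yes on n={len(forms)} forms")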


Individual Events History

Example Logistics of Feedback Form (visit -> feedback -> spreadsheet summary)

REQUEST | RANDOM
1. Phys asks UC/Charge | 1 day or over 1 wk, 3 pts selected
2. Visit occurs or already done | FB within 1 hr
3. FB w/in 1 hr (usually min) | Check boxes & send to Q
4. Catch phys & share FB | Enter & update
5. Check boxes & send to Q | Review as group
6. Enter & update q 2 wks |


3.2 Link Fast Feedback to PDSA


Summary


Patient Experience Data: Summary

• There is no single perfect measure for patient experience and the experience of care
• There are multiple sources of “good enough” data that can drive improvement
• Formal Survey Data (Quantitative) advice:
  1. Use Top Box
  2. Understand Percentiles
  3. Interpret Correlations
  4. Remember the “n” does matter
  5. Plot your data in time order
• Formal survey data are too slow to guide specific PDSA tests
• Patient Stories (Qualitative) never stand by themselves


OK- back to our symptoms of trouble

“We pretty much just look at our performance internally and overall we feel pretty good about it.” (Survey Percentiles: benchmark)

“We look only at organizational numbers rolled up, that’s what matters at the end of the day.” (Stratify by service lines)

“We regularly review our data and form teams around the lowest scores.” (Correlations)

“Every month we review our scores and if we drop down, we form a team to fix it, and if it’s up- we get a pizza party.” (“n” and plot data in time order)

“It’s all so overwhelming- it’s just so hard to know where to start.” (Start with proper analysis of formal surveys and direct observations.)

“CAHPS has really changed our focus- it’s really the only thing we are focusing on now in my organization.” (Remember: multiple sources are vital; CAHPS by itself is not enough)

Reflection Exercise: Best Practice Data Ideas

Best Practice Data Idea | Can you use this idea? | Other colleagues or groups use this idea?
1. Opportunities from Data Self-Assessment | |
2. Surveys: Focus on Top Box | |
3. Surveys: Understand Percentiles | |
4. Surveys: Interpret correlations | |
5. Surveys: Remember how "n" matters | |
6. Plot and interpret measure values in time order | |
7. Deploy fast(er) feedback | |
8. Assure Patient Stories connect to meaning | |

PICK Chart: map the ideas to the Impact by Difficulty Grid. Ideas with relatively high impact and low difficulty are immediate candidates for testing.


Reflections and Discussion

Set a Goal

Don’t be afraid to be bold and courageous IF the organization is ready.

The conversation is important; you have to know where you are heading.


So what?

“I attribute my success to this: I never took or gave an excuse.”

Don’t be afraid of what’s “real.”

Don’t focus on why we’re special- each has a unique set of challenges.

To learn more

Lee, F. (2004). If Disney Ran Your Hospital: 9 ½ Things You Would Do Differently. Second River Healthcare Press.

Perla, R., Provost, L., and Murray, S. (2011). “The run chart: a simple analytical tool for learning from variation in healthcare processes.” BMJ Quality & Safety, 20(1):46-51.

Provost, L. and Murray, S. (2011). The Health Care Data Guide: Learning from Data for Improvement. Jossey-Bass, especially Chapter 12.

Kevin Little screencast, explanation of variation (effect of n): http://www.screencast.com/t/cSUvvlbFR

Data Tools Self-Assessment with Answers


Appendix 1: Informative Displays build on multiple run charts

Small multiples: looking across units

Chart: small multiples of run charts across units, with annotations “Collab start” and “Baseline median.”


A compressed percentile scale is good news/bad news. Do you know which is which?

Appendix 2: More on Fast Feedback

Physician Communications Example


Lessons and Key Points

• CMS Compliance
• Patients can observe presence/absence of physician behaviors
• Should patient responses be anonymous?
• Cognitively impaired patients--options
• Accept and work around positive bias

CMS Compliance

“…activities and encounters that are intended to provide or assess clinical care or promote patient/family well-being are permissible. However, activities and encounters that are primarily intended to influence how patients, or which patients, respond to HCAHPS survey items must be avoided.”

Avoid wording of questions that closely resembles HCAHPS questions. In particular, you may not use the HCAHPS categories (e.g. “Always,” “Usually,” “Sometimes,” “Never”). Yes/No questions such as “Did the physician ask ‘what are you most worried about?’” do not violate HCAHPS protocols (see additional examples in the HCAHPS Quality Assurance Guidelines).

Here is the link: http://www.hcahpsonline.org/Files/HCAHPS%20QAG%20V9%200%20MARCH%202014.pdf
Examples and language guidance are on p. 22 of the CMS document.


Patients as observers

Based on 2012-2013 experience, most patients can score YES/NO on behaviors with good reliability
– Have a short time between encounter and form use (same hour)
– Patients with cognitive impairment are a challenge; family members or observers can score
– Initial practice cycles for physicians should focus on patients without cognitive issues

Teams in 2012-2013 tested both anonymous and identified responses. If anonymous, it is difficult to engage a physician specifically, but responses can be used for group monitoring and analysis.

What about positive bias?

• Our belief: Patients are vulnerable and loathe to criticize the care team.
• Positive bias likely
• Focus on top-box responses only