TRANSCRIPT
-
Quality Institute #2: How to Share Performance Data to Spur Improvement – Session 3
Clemens Steinbock
Wednesday, August 25; 3:30–5pm
Maryland B
RWA-0417
-
Disclosures
• I have no financial interest or relationships to disclose
• HRSA Education Committee Disclosures: HRSA Education Committee staff have no financial interest or relationships to disclose
• CME Staff Disclosures: Professional Education Services Group staff have no financial interest or relationships to disclose
2
-
Learning Objectives
• Understand the importance of sharing performance data effectively with your target audience to generate momentum for quality improvement
• Learn strategies to prepare effective data reports and share data successfully
• Learn how peer grantees innovatively share data with their staff, providers, consumers, subcontractors, advisory bodies, etc.
3
-
Agenda
• Introduction to data reporting
• Examples of grantee performance data reports and feedback by audience
• Development of recommendations/small group work
• QI resource overview
4
-
4 Data Steps
• Data Gathering – Where are the data?
• Data Analysis – What are the data telling us?
• Data Sharing – How can I best share the results with stakeholders?
• Data Follow-up – What should I do in response to the results?
5
-
Find a Balance between Measurement and Improvement
6
-
Options for Follow-up Activities
• ‘Do nothing!’ – if scores are within expected ranges and goals, frequently repeat measurement
• ‘Take Immediate Individual Action’ – follow up on individual pts (missed appointments, pts not on PCP prophylaxis, etc.) and/or providers
• ‘Quick PDSA’ – develop a quick pilot test
• ‘Launch QI Project!’ – set up a cross-functional team to address identified aspects of HIV care
7
-
Why Measuring?
• Strangers are asked to estimate the IQ of TV weathermen they have never met before
• Question: Who can better estimate the IQ - you or strangers?
• Result: Strangers are 66% more accurate when predicting someone’s IQ
• Conclusion: We are poor self-evaluators because of the positive illusion effect
[Peter Borkenau, Journal of Personality and Social Psychology, 65, 546-553]
8
-
Why Measuring? We Are Unrealistically Optimistic
• 90% of all drivers think they are above average behind the wheel
• 94% of college professors report doing above-average work
• Smokers are aware of the statistical risks, but most believe that they are less likely to be diagnosed with lung cancer and heart disease than non-smokers
• Gay men underestimate their chance to contract HIV, even though they know about HIV/AIDS in general
[Cass Sunstein, Journal of Legal Studies, 27, 1998, 799-823]
9
-
What’s Wrong with this Picture?
10
-
Barriers to Putting Data into Action
• Don’t even know where to get data/info
• Paralysis by analysis
• No one is interested in it
• Defensiveness
• Too complex to understand
• Incorrect interpretation of data
11
-
12
-
Quality Management should NOT look like:
13
-
‘Death by Slides’ – Edward Tufte
• Average data points/numbers per graph: 120 in the New York Times; 53 in the New England Journal of Medicine; 12 in a PowerPoint graph
• 100–160 spoken words per minute vs. 15 words per slide
• To show content, PowerPoint templates use only 30%–40% of the space available on a slide
14
-
Lessons Learned about Data Reports
• Tell a story – ‘designer formats will not salvage weak content’
  – Summarize major points you want to make
  – Use color to highlight key findings
  – Avoid technical jargon/define unfamiliar terms
• Know your audiences and their data needs
  – Plan data display with key stakeholders
  – Use different graphs for different audiences
  – Post graphic displays in hallways and waiting rooms for staff/patients
15
-
Lessons Learned about Data Reports
• Be aware – we all have a different data literacy
  – Define each indicator
  – Label charts and tables clearly (show 0% to 100%)
  – Identify data source(s) and dates
  – Stratify data by demographics/other characteristics
  – Note limitations
• Find balance: simple messages vs. complex data
  – Begin analyses with questions/hypotheses
  – Limit the display to the points you need to make
  – Provide handouts with more data points
  – Provide comparisons over time, benchmarks, established targets
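The charting lessons above (define each indicator, label the full 0%–100% scale, identify the data source and dates) can be sketched as a small text report. This is a minimal illustration, not part of the deck; the function name, definition wording, and source label are hypothetical, while the 92% score echoes the Lincoln CHC example later in the session.

```python
# A minimal sketch of a text-based indicator report that follows the
# lessons above: define the indicator, label the full 0%-100% scale,
# and identify the data source and reporting period.
# Function name, definition text, and source label are hypothetical.

def report_line(indicator, definition, score, source, period):
    """Render one indicator as a labeled 0-100% text bar."""
    filled = round(score / 100 * 20)          # 20-char bar spans 0%..100%
    bar = "#" * filled + "-" * (20 - filled)
    return (f"{indicator}: {score:.0f}% [{bar}] 0%..100%\n"
            f"  Definition: {definition}\n"
            f"  Source: {source}, {period}")

print(report_line(
    "PCP Prophylaxis",
    "Eligible patients (CD4 < 200) with PCP prophylaxis prescribed",
    92,                   # score echoing the Lincoln CHC example
    "EIC chart review",   # hypothetical source label
    "7/08 to 12/08",
))
```

Always drawing the full 0%–100% bar (rather than letting the axis float) keeps a 92% score from looking more dramatic than it is.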
16
-
Examples from the Field…
17
-
Request to Audience
At the end of the presentation:
• Share one chart/graph that you like the most – remember the slide number!
• Share one improvement idea for your next data chart/graph that you have learned today
‘Focus on how data are presented vs what the actual data points are telling you!’
18
-
Lincoln Community Health Center Early Intervention Clinic – PCP Prophylaxis, 5/1/08 through 12/31/09

[Line chart, y-axis 50–100: goal vs. actual PCP prophylaxis scores (89, 92, 93, 96) across four periods: 5/9/2008, 7/08 to 12/08, 1/09 to 6/09, 1/09 to 12/09]
19
-
20
-
New Jersey Cycle 8 CPC Data

[Bar chart, y-axis in percent (0–100): scores for five indicators – CD4, HAART, Visits, Prophylaxis, Syphilis – across Cycles 1 through 8]
21
-
Viral Load Every 6 Months

Indicator Definition: Percentage of eligible patients who had a VL during each 6-month interval (n = 11,131 eligible NYS patients in 2007)

Key Findings: Consistently high; no improvement since 2003. Over 50% of NYS sites scored above 90%.

[Chart: Scores Over Time, 2003–2007 – NYS mean (89%) vs. national mean (87%)]
[Chart: Frequency Distribution of Scores, 2003 vs. 2007 – % of programs by score]
[Chart: Range of 2007 program scores, 0% to 100%, with 10th, 25th, median (M), 75th, and 90th percentiles; Note: M = Median]

22
-
23
-
TOT Participants by Zip Code (n=299)
24
-
25
-
Top Scoring Performance in 4 Categories (Viral Load, Adherence, TB, Pelvic)

In 2007, how many of the 174 NYS HIVQUAL facilities were in the top 25% statewide on all four indicators (Viral Load Every 6 Months, Tx Adherence, TB Screening, Pelvic Exam)? n = 174

[Bar chart: Top 25% 0x: 48% (n = 82); Top 25% 1x: 30% (n = 52); Top 25% 2x: 11% (n = 19); Top 25% 3x: 7% (n = 12); Top 25% 4x: 5% (n = 8)]

26
Sample size > 40: Nassau Health Care, Westbury; APICHA; Wyckoff Heights Medical Center
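The cross-tabulation behind this chart can be sketched as follows. The indicator names mirror the slide, but the scores are synthetic; the real analysis used the 2007 NYS HIVQUAL data, which is not reproduced here.

```python
# A sketch (synthetic data) of counting, per facility, on how many of
# the four indicators it falls in the statewide top 25%.

import random
import statistics
from collections import Counter

random.seed(0)
N = 174  # facilities, as on the slide
indicators = ["viral_load", "tx_adherence", "tb_screening", "pelvic_exam"]

# One synthetic score per facility per indicator
scores = {ind: [random.uniform(40, 100) for _ in range(N)]
          for ind in indicators}

# 75th-percentile cutoff per indicator (quantiles(n=4) returns Q1, Q2, Q3)
cutoffs = {ind: statistics.quantiles(s, n=4)[2] for ind, s in scores.items()}

# Tally facilities by how many indicators place them in the top 25%
top_counts = Counter(
    sum(scores[ind][i] >= cutoffs[ind] for ind in indicators)
    for i in range(N)
)
for k in range(5):
    print(f"Top 25% {k}x: n = {top_counts.get(k, 0)}")
```

With truly independent indicators, landing in the top quartile on all four by chance alone would be rare (about 0.25^4, under 1%), so the 5% of facilities that did suggests consistently strong performers.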
-
[Timeline, Sept 2005 through 2006: Initial Meeting with HAB in DC; Initiation of IHI/NQC website; First Regional Workshops; First Onsite TA Consultation; First Consumer Advisory Meeting; First Steering Meeting; Initiation of National TA Calls; First Exhibit]
27
-
74% of all Ryan White grantees participated in TA Calls
28
-
29
-
[Table: Quality management elements in place (+) or absent (–) in April/May 05 vs. June 09 for San Juan Part A, Ponce Part A, Caguas Part A, and Puerto Rico Part B. Rows: Written Quality Management Plan; QI Committee; Consumer on Committee; Quality Indicators; QM Required in Subcontracts; Organizational Assessments Conducted; Participation in QM Workshops; QM Trainings for subcontractors; Participation in NQC TOT. Nearly every element was absent in 2005 and in place by June 2009.]

30
-
Data, the other way… (2007)

• Out of 11,131 pts with 2 or more annual medical visits, 614 pts did NOT have a documented VL during the last 6 months of the year (5.5%)
• Based on a sample of 2,209 pts with a CD4 count less than 200, 246 pts were NOT on PCP prophylaxis (11.1%)
• 1,313 out of 4,269 female patients did NOT receive a GYN exam last year (30.8%)
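The inverted ‘gap’ framing above is simple arithmetic: divide the number of patients who missed the service by the eligible denominator. A quick check of the slide’s three percentages:

```python
# Verify the failure-rate percentages quoted on the slide:
# missed / eligible, expressed as a percentage.

gaps = [
    ("No documented VL in last 6 months", 614, 11_131),
    ("Not on PCP prophylaxis (CD4 < 200)", 246, 2_209),
    ("No GYN exam last year", 1_313, 4_269),
]
for label, missed, eligible in gaps:
    print(f"{label}: {missed}/{eligible} = {100 * missed / eligible:.1f}%")
```

Framing the same data as counts of patients missed, rather than percentages achieved, often motivates staff more directly.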
31
-
Spidercharts

[Radar charts comparing CHFP 2005 and CHFP 2006 scores across six indicators: HEP A Vaccination, HAART Adherence, Undetct VL (pts on HAART), OHPPD Screening, PAP, Social Assessment]
32
-
33
-
Sharing of Data with Consumers
34
-
35
-
36
-
Key Lessons Learned
• Allow audience to absorb data and graphs
• Watch out for defensiveness
• Watch out for paralysis by analysis
• Rotate the functions of data reporting among staff
• Share reports at QM committees and at staff, provider and consumer meetings
• Share detailed data report, if needed
37
-
Key Lessons Learned
• Stratify statewide data by race/ethnicity, region, etc.
• Develop individual provider reports to share data and compare with aggregate statewide data
• Show not only mean/median, but top 25%, bottom 25%, etc.
• Use maps and other pictorial strategies
• Consider blinded vs. unblinded data reports
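Reporting quartile cut points alongside the mean and median, as suggested above, takes only the standard library. The program scores below are hypothetical stand-ins:

```python
# A minimal sketch of a summary that shows not only mean/median but the
# top-25% and bottom-25% cut points (hypothetical program scores).

import statistics

scores = [58, 61, 67, 71, 73, 74, 76, 79, 82, 85, 89]

q1, median, q3 = statistics.quantiles(scores, n=4)  # quartile cut points
print(f"Mean:   {statistics.mean(scores):.1f}")
print(f"Median: {median:.1f}")
print(f"Bottom 25% scored at or below {q1:.1f}")
print(f"Top 25% scored at or above {q3:.1f}")
```

Showing the quartiles lets each provider see not just where the middle is, but how far they sit from the best and worst performers.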
38
-
Request to Audience
• Which chart/graph did you like the most?
• Share one improvement idea for your next data chart/graph that you have learned today
39
-
Quality Improvement Resources
40
-
Quality Academy
41
-
NQC Technical Assistance Calls
42
-
Development of Recommendations: Small Group Discussions
- Select one of the following 4 topic areas based on your personal interest
- Move towards the assigned meeting area
- Select a group facilitator(s) and a reporter
- Discuss your topic and report back to the larger group
43
-
Topic Areas
- How can we best share agency-wide performance data with consumers? How can consumers ‘understand’ your data? How can you overcome the resistance by your staff to openly share ‘bad’ data?
- What are the steps necessary to openly share ‘unblinded’ performance data across your agency, network or region? How can we link high performers with ‘poor’ performers?
- How can you best prioritize your performance data and take action based on the most important indicator? What are the selection criteria? Who should be involved?
- How can you effectively report your quality performance data to your agency-wide senior leaders? What reporting format is most effective?
44
-
Aha Moment and Action Planning
• What have you learned from this workshop?• What will you do differently in response to this
workshop?
• Complete the Action Planning Form on your chair
45
-
NQC Activities at the AGM 2010 – Join Us!
Monday, August 23, 2010
• 11am: Improve Your Care and Services with Consumer Input (Quality Institute 1) - Delaware A
• 2:30pm: Creating a Culture for Quality Improvement (Quality Institute 1) - Delaware A
Tuesday, August 24, 2010
• 8:30am: Quality in Hard Times (Quality Institute 1) - Delaware A
Wednesday, August 25, 2010
• 8:30am: Quality Improvement 101/HAB Quality Expectations (Quality Institute 2) - Maryland B
• 11am: An Introduction to Performance Measurement (Quality Institute 2) - Maryland B
• 3:30pm: How to Share Performance Data to Spur Improvement (Quality Institute 2) - Maryland B
Thursday, August 26, 2010
• 8am: Strategies to Measure and Improve Patient Retention Rates - Washington 2
• 10am: Aligning Quality Initiatives: Lessons Learned from Cross-Part Collaborative - Washington 4
• 10am: Quality Management for Non-Clinical Care - Washington 1
Visit our NQC/HIVQUAL Exhibit Booth in the Exhibit Area
• Pick up hard copies of QI Publications and meet our staff and consultants
46
-
Clemens Steinbock
National Quality Center (NQC)
212-417-4730
NationalQualityCenter.org
[email protected]