Evaluation of Phase I Demonstrations of the Pharmacy Quality Alliance
AHRQ Annual Conference, September 16, 2009
Laura Pizzi, PharmD, MPH
Associate Professor and Director, Doris N. Grandon Center for Health Economics and Outcomes Research
Jefferson School of Population Health
Philadelphia, Pennsylvania USA
Telephone: (215) 955-1159
Email: [email protected]
Note: The timelines presented herein are dependent on OMB approval.
Introduction
Evaluation of PQA Phase I Demonstrations:
– Provides independent external evaluation
– Facilitates within- and across-site comparisons
Core evaluation team members:
– Joyce McMahon, PhD – Project Director (CNA)
– Elizabeth Schaefer, MA – Associate Project Director (CNA)
– Laura Pizzi, PharmD, MPH – Associate Project Director (Jefferson)
– Erin Thomson, MPH – Research Analyst (Jefferson)
AHRQ Project Officer: Linda Greenberg, PhD
Key Evaluation Questions
What is the most effective way to collect and aggregate data on the 15 pharmacy performance measures and from the consumer questionnaire about pharmacy services?
What challenges, issues, and technical problems were encountered in creating and populating the template reports? How were they resolved?
How could the demonstration sites have strengthened their measurement efforts?
How well were the reports understood by users (i.e., staff pharmacists, pharmacy managers)?
How can the PQA report template and the reporting process be improved (e.g., user-friendliness, comprehension, ability to act)?
What are the operational costs and non-financial burdens encountered in collecting the data, generating reports, and using the performance data?
Early Focus
Establish lines of communication (AHRQ, PQA, demonstration sites)
• Monthly teleconferences between AHRQ, CNA, Jefferson, and PQA
• Monthly teleconferences with project leaders at each of the 5 sites to monitor progress
Primary data requirements
• Instruments: Interview Guide and Pharmacist Survey
• Both require OMB approval
Evaluation Components
Verification of findings via secondary data analysis (data already collected during the demonstration itself):
1. Performance reports (claims data provided by the sites)
2. Pharmacy consumer survey (survey data provided by Avatar)
New information gathered via primary data:
1. Monthly teleconferences
2. Qualitative on-site interviews of demonstration project staff
3. Quantitative paper survey of field pharmacists
Interviews of Demonstration Staff
With the help of site Project Leaders, the CNA team has identified six (6) interviewees per site, representing specific job functions:
1. Demonstration project leadership
2. Pharmacy operations / management
3. Analytics management responsible for oversight of performance report analyses
4. Analytics staff assigned to complete the performance reports
5. Information technology (IT) staff responsible for developing and/or coordinating Inter- or Intranet components of the project
6. Senior management (executive leadership)
Interviews of Demonstration Staff, continued
Conducted in person by a two-person interview team from Jefferson
Duration of each interview will be approximately 1 hour
Total duration of each site visit will be 2 business days
Approximate timeline:
• Mid-February to mid-May 2009: CNA team worked with Project Leaders to identify interviewees
• September 2009: Site visits scheduled
• October 2009: Site visits conducted
Survey of Field Pharmacists
Pharmacist sample currently being obtained with the help of site Project Leaders
Sampling parameters
• Inclusion criterion: pharmacists are required to have participated in the PQA Phase I Demonstration project as recipients of the performance report
• Sample size: 100 field pharmacists per site, except for sites where fewer than 100 pharmacists participated
Survey will require approximately 30 minutes to complete
Survey of Field Pharmacists: Timeline
Mid-July to October 2009: CNA team works with demonstration sites to identify the sample and sends sampled pharmacists a formal letter* to:
1. Inform them that they have been selected as participants in the AHRQ-funded PQA evaluation
2. Explain the purpose of the survey and the anticipated time required for completion
3. Assure them of confidentiality and encourage participation
Mid-October 2009: CNA team sends the survey by mail to sampled pharmacists. The mailing will include:
1. A cover letter* reminding pharmacists of the purpose of the survey and anticipated time requirements, and assuring confidentiality
2. The survey instrument, for completion
3. A postage-paid envelope for returning the completed survey
Late October 2009: CNA team sends a reminder letter* prompting pharmacists to complete and return the survey
*Letters will be sent on AHRQ letterhead and signed by the Project Officer, with a reinforcement message sent by email to the pharmacists by their Demonstration Project Leader(s)
Evaluation Constructs: Primary Data
The instrument(s) covering each construct are noted in parentheses.
1. Respondent Characteristics (Interview Guide and Pharmacist Survey)
• Name
• Position
• Length of time in current position
2. Organizational Background (Interview Guide)
• Perceived importance of quality measurement within the organization
• Existing quality measurement initiatives within the organization/agency
• Quality measurement personnel (training, credentials, full-time equivalents) in the organization
• Decision-making process surrounding quality measurement (which measures, which disease states, which accreditation organizations)
3. Organizational Resources (Interview Guide)
• Personnel qualifications and time required for data collection, aggregation, and analysis of the 15 pharmacy performance measures and the consumer questionnaire
• Additional resources required (e.g., training, software, equipment, or tools)
• Role of organizational leadership in supporting measures
Evaluation Constructs: Primary Data (continued)
4. Measurement Methodology (Interview Guide)
• What data sources were used and why
• Method employed to collect data on the 15 pharmacy performance measures
• Method employed to select the sample
• Method employed to disseminate the consumer questionnaire (mail, telephone, or mail with telephone follow-up)
• How and by whom data were analyzed
• Time required for data collection (both in real time and in man-hours)
5. Performance Measure Evaluation (Interview Guide and Pharmacist Survey)
• Participants’ perceptions regarding each of the 15 PQA quality measures as defined using NQF Measure Evaluation Criteria: importance, scientific acceptability, usability, feasibility
• Perceptions and response rates pertaining to the pharmacy consumer survey
6. Dissemination Process (Interview Guide and Pharmacist Survey)
• Whether and how field pharmacists and other personnel were made aware of measures
• Whether there were implementation partners or other collaborators
Evaluation Constructs: Primary Data (continued)
7. Incentives / Penalties (Interview Guide and Pharmacist Survey)
• Incentives to promote pharmacy staff participation
• Negative consequences for pharmacy staff non-participation or non-completion
8. Usability of Performance Reports (Interview Guide and Pharmacist Survey)
• Overall usability of performance reports
• Factors that facilitated or enabled the usability of the performance reports
• Data or measures that do not exist but would be useful
• Participants’ overall reaction to the performance report
• Key learnings: about medication quality measurement, about quality measurement in general, and about consumers’ experiences and assessments of pharmacy plans and services
Evaluation Constructs: Primary Data (continued)
9. Perceptions Regarding Pharmacy Quality Measurement (Pharmacist Survey)
• Pharmacists’ perceived barriers and beliefs about pharmacy quality measurement
• Readiness to engage in pharmacy quality measurement, in general
• Perceived self-efficacy (extent to which pharmacists feel they can improve pharmacy quality)
10. Future Directions (Interview Guide)
• Recommendations to improve the report template and reporting process
• Next steps/thoughts regarding the Phase II Demonstration:
– Whether the organization will continue to promote existing measures, modify them, and/or introduce new ones
– Whether the organization plans to continue use of the pharmacy consumer survey
– Whether the organization will change its dissemination approach
Pilot Test of Evaluation Tools
Goal of pilot testing was to obtain input for refining the interview guide and pharmacist survey in terms of both content and process
Pilot test was completed February 2009 (prior to OMB submission)
• Conducted with pharmacy staff from the Jefferson Health System (JHS)
JHS delivers both inpatient and outpatient pharmacy services through approximately 80 full-time pharmacists, with care delivered at two hospitals (totaling approximately 800 beds) on the main campus in Center City, Philadelphia
JHS operates 3 outpatient pharmacies, which are open to the public and are operationally similar to typical retail pharmacies
Pilot Test of Evaluation Tools, continued
Participants from the pilot site were provided with the training materials developed by one of the demonstration sites, plus a mock performance report based on an actual de-identified pilot site report
There were 2 pilot test cohorts:
• Cohort 1 (5 individuals) participated in testing the demonstration staff interview guide:
1. Analytics/IT management and staff
2. Pharmacy department management
3. Clinical pharmacy staff
4. Quality improvement
5. Senior management (executive leadership)
• Cohort 2 (5 practicing pharmacists) participated in testing the pharmacist survey
Tasks Following Data Collection
Analyze all information
• Primary data collected: interviews and pharmacist survey
• Secondary data acquired: consumer survey and claims data
• Information collected via teleconferences with PQA and sites
Prepare case studies
• Concentrate on both within-case and cross-case studies
• Cross-case studies will focus on generalizability of findings
Share findings with AHRQ, PQA, and demonstration sites
Publication
• Primary manuscript focused on evaluation findings (pooled data)
• Submitted to a major health policy or quality journal