Improving the Quality of Pain Management Through Measurement and Action

NATIONAL PHARMACEUTICAL COUNCIL, INC

This monograph was developed by JCAHO as part of a collaborative project with NPC.

March 2003


DESCRIPTION

This monograph addresses the application of continuous quality improvement techniques to pain management and the implementation of performance measurement processes. The use of a multidisciplinary systems point of view is described and the Cycle for Improving Performance, a structured approach to improvement activities, is outlined. Factors that influence an organization's ability to implement change and improve performance are also discussed. In addition, the real-world experience of four organizations in improving pain management is described. The examples provided encompass a diversity of settings and experience and suggest strategies for overcoming obstacles to pain management improvement initiatives.




Joint Commission Mission Statement: To continuously improve the safety and quality of care provided to the public through the provision of health care accreditation and related services that support performance improvement in health care organizations.

All rights reserved. Reprinting of the material is prohibited in any form or by any means without written permission from the publisher.

Requests for permission to reprint or make copies of any part of this work should be mailed to: Permissions Editor, Department of Publications, Joint Commission Resources, Inc., One Renaissance Boulevard, Oakbrook Terrace, IL 60181; www.jcrinc.com

© 2003 by the Joint Commission on Accreditation of Healthcare Organizations

For information about the Joint Commission on Accreditation of Healthcare Organizations, please visit www.jcaho.org. For information about Joint Commission Resources, please visit www.jcrinc.com.

DISCLAIMER: This monograph was developed by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) for which it is solely responsible. Another monograph titled "Pain: Current Understanding of Assessment, Management, and Treatments" was developed by the National Pharmaceutical Council (NPC) for which it is solely responsible. The two monographs were produced under a collaborative project between JCAHO and NPC and are jointly distributed. The goal of the collaborative project is to improve the quality of pain management in health care organizations.

This monograph is designed for informational purposes only and is not intended as a substitute for medical or professional advice. Readers are urged to consult a qualified health care professional before making decisions on any specific matter, particularly if it involves clinical practice. The inclusion of any reference in this monograph should not be construed as an endorsement of any of the treatments, programs or other information discussed therein. JCAHO has worked to ensure that this monograph contains useful information, but this monograph is not intended as a comprehensive source of all relevant information. In addition, because the information contained herein is derived from many sources, JCAHO cannot guarantee that the information is completely accurate or error free. JCAHO is not responsible for any claims or losses arising from the use of, or from any errors or omissions in, this monograph.


Editorial Advisory Board

Patricia Berry, PhD, APRN, BC, CHPN; Assistant Professor, College of Nursing, University of Utah; Salt Lake City, UT

June L. Dahl, PhD; Professor of Pharmacology, University of Wisconsin Medical School; Madison, WI

Marilee Ivers Donovan, PhD, RN; Clinical Nurse Specialist/Manager, Kaiser Permanente Northwest Chronic Pain Management Clinic; Portland, OR

Perry G. Fine, MD; Associate Medical Director, Pain Management Center; Professor of Anesthesiology, University of Utah; Salt Lake City, UT

Christine Miaskowski, RN, PhD, FAAN; Professor and Chair, Department of Physiological Nursing, University of California; San Francisco, CA

Mark Stillman, MD; Section Head, Section of Headache and Pain, Department of Neurology, Cleveland Clinic Foundation; Cleveland, OH

Karen L. Syrjala, PhD; Associate Professor, Psychiatry and Behavioral Sciences, University of Washington School of Medicine; Associate Member, Fred Hutchinson Cancer Research Center; Seattle, WA

Joint Commission Project Team

Vice President, Division of Research: Jerod M. Loeb, PhD
Project Director: Barbara I. Braun, PhD
Project Coordinator: Annette Riehle, MSN
Research Assistant: Kristine M. Donofrio
Editorial Consultant: Lindsay Edmunds, PhD


Table of Contents

Section I: Introduction
Section II: Getting Started on Measurement to Improve Pain Management
  A. Measurement Is Key to Improvement
    1. Internal Uses for Pain Management Performance Measurement
    2. External Uses for Pain Management Performance Measurement
    3. Addressing Concerns Related to Measuring Performance
  B. Using Criteria for Effective Measurement
    1. Overview of Criteria
    2. Criteria Similarities and Differences
    3. Mandated Criteria
Section III: Understanding Organizational Improvement in Pain Management
  A. Viewing the Organization as a System
  B. Understanding Pain Management
  C. Institutionalizing Pain Management
  D. Understanding Current Pain Management Practices in Your Organization
  E. Understanding Quality Improvement Principles
    1. The Cycle for Improving Performance
    2. Repeating the Cycle at Different Phases of the Project
    3. Integrating With Internal Quality Improvement Activities
Section IV: Designing Your Pain Management Improvement Initiative
  A. Establishing the Project Team
  B. Selecting Objectives and Target Goals
  C. Establishing the Scope of Measurement
  D. Identifying Resources and Project Timeline
Section V: Measuring What You Need to Know
  A. Consider Your Measurement Approach
    1. Data Quality
    2. Data Quantity
  B. Collecting Data With Established Assessment Instruments
    1. Pain-Specific Instruments
    2. Choosing Pain Instruments
  C. Selecting a Measurement Approach
    1. Conduct an Organizational Self-assessment
    2. Review Medical Records
    3. Test Knowledge and Attitudes
    4. Directly Observe the Care
    5. Conduct a Point Prevalence Study
    6. Assess Patient Status and Outcome Over Time
    7. Collect Indicator Data
    8. Utilize an Externally Developed Performance Measurement System
Section VI: Assessing and Analyzing Your Processes and Results
  A. Using Quality Improvement Tools
  B. Analysis of Data
  C. Interpretation of the Results of Data Analysis
  D. Display of Data and Results
  E. Dissemination of Information
Section VII: Improving Your Performance
  A. Designing a Pain Improvement Intervention
  B. Testing and Implementing a Pain Improvement Intervention
  C. Monitoring for Success
  D. Sustaining Change
Section VIII: Understanding Factors That Affect Organizational Improvement
  A. Factors That Influence Change
    1. Patient Factors
    2. Clinician Factors
    3. Organization Factors
    4. External Factors
  B. Improving Pain Management Through the Principles of Total Quality Management
Section IX: Examples of Organizations Implementing Pain Management–Related Measurement and Improvement Initiatives
  A. Overview
  B. Educational Effectiveness in a Home Health Agency: Visiting Nurse Association & Hospice of Western New England, Inc.
    1. Special Challenges
    2. Choosing a Methodology
    3. Assembling a Team
    4. Identifying the Problem
    5. Implementing Interventions
    6. Analyzing Processes and Results Leads to Redesign
    7. Sharing the Success and Sustaining Change
  C. Creative Improvement in a Rural Hospital: Memorial Hospital of Sheridan County
    1. Doing Some Homework
    2. Establishing a Team
    3. Implementing Improvements
    4. Collaborating for Improvement
    5. Integrating Nonpharmacologic Approaches
    6. Sustaining Change
  D. Integrating Pain Management Improvement Activities in an Academic Medical Center: University of Iowa Hospitals and Clinics
    1. History of Pain Management Activities at UIHC
    2. Forming a Standing Committee
    3. Establishing an Action Plan
    4. Fulfilling the Charge: The Assessment and Documentation Work Group
    5. Complementing Centralized Pain Improvement Activities: A Division-level Initiative to Manage Circumcision Pain
  E. Improving Care Across Hundreds of Facilities: The Veterans Health Administration's National Pain Management Strategy
    1. Organizing the National Initiative
    2. National Goals and Objectives
    3. Implementing Multiple Interventions
    4. Evaluating Effectiveness Through System-wide Measurement
    5. VHA-IHI Collaborative
    6. Implementing the IHI Initiative in Two New England Hospitals
    7. Operational Strategies
    8. Sharing Success and Next Steps
    9. Summary: Key Lessons
Appendix A: A Selection of Resources Related to Pain Management and Improvement
Appendix B: A Plan-Do-Check-Act Worksheet
References

List of Tables, Figures, Text Boxes, and Appendices

TABLES
Table 1. Pain Management Guidelines
Table 2. Consensus and Position Statements
Table 3. Examples of Tasks Associated with Each Stage of the Improvement Cycle
Table 4. Examples of Roles and Behaviors of Pain Project Team Members
Table 5. Scientific Review Criteria for Health Status Instruments
Table 6. Examples of Unidimensional and Multidimensional Pain-Specific Instruments
Table 7. Examples of Indicator Development for Pain Management
Table 8. Desirable Attributes of Measures
Table 9. Comparing Process and Outcome Indicators
Table 10. Quality Improvement Tools: Definition and Use
Table 11. Visiting Nurse Association & Hospice of Western New England Pain Performance Improvement Time Line
Table 12. Memorial Hospital of Sheridan County Pain Management Improvement Initiative
Table 13. Examples of Departmental Pain Improvement Activities: University of Iowa
Table 14. Examples of Pain Management Interventions for Circumcision: University of Iowa Hospitals and Clinics (UIHC)
Table 15. Attributes of a Successful Pain Management Program: VHA-IHI Collaborative Planning Group

FIGURES
Figure 1. Dynamics of Systems Thinking in Relation to the Health Care Organization
Figure 2. The PDSA Cycle
Figure 3. Cycle for Improving Performance
Figure 4. Selection Grid: Improvement Tools
Figure 5. [cartoon]
Figure 6. Using Appropriate Scales on Graphs
Figure 7. Examples of Factors Affecting Pain Management Improvement Initiatives
Figure 8. Second Pain Management Newsletter Distributed by the Visiting Nurse Association & Hospice of Western New England
Figure 9. Mean Score on Staff Pain Management Knowledge Survey: Visiting Nurse Association & Hospice of Western New England, Inc.
Figure 10. Baseline and Post (6 Months) Indicator Data: Memorial Hospital of Sheridan County Clinical Indicators
Figure 11. Quality and Performance Improvement Program Framework Illustrating the Reporting Lines for Communication: University of Iowa Hospitals and Clinics
Figure 12. Mean Number of Times Pain Intensity Is Documented Every 24 Hours: Department of Nursing Services and Patient Care Intensive Care Units, University of Iowa Hospitals and Clinics
Figure 13. Percentage of Records With Documented Pain Scores Within the Last Year: Veterans Health Administration, National External Peer Review Program
Figure 14. Performance on Two Indicators Within 6 Weeks of Initiation of the Veterans Health Administration/Institute for Healthcare Improvement Pain Management Collaborative at the Togus, Maine, and Providence, Rhode Island, Veterans Affairs Medical Centers

TEXT BOXES
Overview of Joint Commission on Accreditation of Healthcare Organizations Standards Related to Pain Management
American Pain Society Quality Improvement Guidelines for the Treatment of Acute and Cancer Pain
Comparing Criteria: Guidelines, Standards, Consensus Statements
Eight Steps to Develop an Institutional Pain Management Program
Getting Physicians Involved in the Quality Improvement Process
Common Characteristics of Successful Quality Improvement Teams
Examples of Approaches to Assessing Data Quality
Examples of Approaches for Selecting Samples
Organizational Example: Conduct Organizational Self-assessment
Organizational Example: Review Medical Records
Organizational Example: Test Knowledge and Attitudes
Organizational Example: Directly Observe the Care
Organizational Example: Conduct a Point Prevalence Study
Organizational Example: Assess Patient Status and Outcome Over Time
Organizational Example: Collect Indicator Data
Organizational Example: Utilize an Externally Developed Performance Measurement System
Ten Key Lessons From the National Demonstration Project on Quality Improvement
The 14 Points of Total Quality Management Applied to Improving Pain Management
No Pain, Big Gain Kick-off Activities in February 2001

APPENDICES
Appendix A. A Selection of Resources Related to Pain Management and Improvement
Appendix B. A Plan-Do-Check-Act Worksheet

Acknowledgements

The Joint Commission project team gratefully acknowledges the important contribution of the staff at the health care organizations who shared their valuable time and experience in improving pain management with this project. In particular, we would like to thank Micki Bonnette, RN.C, CHPN, Memorial Hospital of Sheridan County, Sheridan, WY; Janet I. Moss, MS, RN, CPAN, Advocate Christ Medical Center, Oak Lawn, IL; Lyn Sullivan Lee, RN, BSN, Swedish Medical Center, Seattle, WA; Lynnette Kenne, RN, MSN, Mercy Medical Center-North Iowa, Mason City, IA; Marguerite Jackson, RN, PhD, FAAN, and Wendy Hansbrough, RN, BSN, UCSD Medical Center, San Diego, CA; Marilee Ivers Donovan, PhD, RN, Kaiser Permanente Northwest, Portland, OR; Terri Savino, RN C, BSN, Middlesex Hospital, Middletown, CT; Tim Ashley, Crouse Hospital, Syracuse, NY; Marita Titler, PhD, RN, and Janet Geyer, RN, MSN, University of Iowa Hospitals and Clinics, Iowa City, IA; Carol Rodrigues, RN, MS, CHPN, VNA and Hospice of Western New England, Inc., Springfield, MA; and Robert Kerns, PhD, VA Connecticut Healthcare System, West Haven, CT. We also appreciate the assistance of others too numerous to mention at these organizations who facilitated and supplemented interviews or in-person visits with members of the project team.

We wish to thank the members of the Editorial Advisory Panel who provided outstanding advice and feedback throughout the project, as did others who reviewed materials and advised us on content. Last, but certainly not least, we are grateful to our colleagues at NPC, who made this project possible through the support of an unrestricted educational grant. They were a pleasure to work with and helped advance the monograph from conception to completion.

SECTION I: Introduction

Despite ongoing, significant advances in treatment options, studies indicate that pain continues to be poorly managed and undertreated.1-6 The increase in clinical information related to pain management, as well as recent high-profile press coverage of individual cases of undertreatment,7 has resulted in heightened awareness among health care professionals and the public that this critical issue must be addressed.

At the most fundamental level, improving pain management is simply the right thing to do. As a tangible expression of compassion, it is a cornerstone of health care's humanitarian mission. Yet it is just as important from a clinical standpoint, because unrelieved pain has been associated with undesirable outcomes such as delays in postoperative recovery and development of chronic pain conditions.8,9(p14, Table 5) In addition, effective treatment of pain is necessary to respond to patients' increasing expectations for health care and new standards or requirements such as those set by accrediting bodies, insurers, government regulatory bodies, and other constituent groups. In fact, Congress declared the decade beginning on January 1, 2001, as the Decade of Pain Control and Research.10

The key to judging the success of improvement efforts in an organization is measurement, because accurate data underpin all aspects of the change and improvement process. This monograph is designed to help health care organizations implement the performance measurement processes they need to achieve their goal of improving pain management. This monograph takes a practical approach to provide:
■ An overview of methods for measuring performance and principles of organizational improvement applied to pain management.
■ Examples of organizational use of performance measurement methods and implemented improvement initiatives.
■ References and other sources for more detailed information on selected topics.

This monograph is written for clinicians, pain management teams, quality improvement professionals, researchers, and others involved in pain management performance assessment, improvement, education, and policy making. It is intended for use in conjunction with the companion monograph Pain: Current Understanding of Assessment, Management, and Treatments,9 which focuses on the etiology, physiology, and clinical treatment of pain.

Because of the broad intended audience, some readers may find aspects of the monograph too technical while others may find it overly simplistic. Similarly, some sections (e.g., organization examples or measurement approaches) may be of greater interest to some readers than others, depending on one's care setting, role, and experience in pain management improvement activities. Nevertheless, we hope that readers will find much of the material relevant and helpful in their efforts to measure and improve pain management processes and patient outcomes.

Given the primary focus of this monograph on measuring and improving performance, certain aspects of pain management are not addressed.
■ This monograph does not review or critique recommendations for treatments, though reference material on clinical practice guidelines is provided.
■ This monograph covers subjects that are integral to evaluating pain management programs. Readers interested in material that focuses specifically on establishing and implementing new pain management programs should consult the references cited in Section III.C (Institutionalizing Pain Management).
■ Individuals and organizations seeking information on compliance with requirements for certification and accreditation programs mentioned in this monograph should request specifications and educational materials from the appropriate sponsoring agency (see references listed in Appendix A).


SECTION II: Getting Started on Measurement to Improve Pain Management

A. Measurement Is Key to Improvement

Evaluating improvement in pain management performance depends on measurement. By its nature, measurement is comparative and used to establish relationships based on common units of analysis.11 For example, during the start-up phase of an improvement initiative, measurement allows staff to establish the baseline performance of a process or activity, to assess current practice, and to identify opportunities for improvement. Then, over time, continued measurement enables them to compare current performance against baseline data to evaluate the success of their interventions.
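To make the baseline-versus-follow-up comparison concrete, here is a minimal sketch (not drawn from the monograph) of computing a simple chart-audit indicator at two points in time; the audit counts and field names are hypothetical.

```python
# Hypothetical chart-audit counts: baseline before the intervention,
# follow-up after it. The indicator is the proportion of reviewed
# records with a documented pain score.
baseline = {"records_reviewed": 60, "pain_score_documented": 27}
follow_up = {"records_reviewed": 60, "pain_score_documented": 45}

def rate(audit):
    """Proportion of reviewed records meeting the indicator."""
    return audit["pain_score_documented"] / audit["records_reviewed"]

change = rate(follow_up) - rate(baseline)
print(f"baseline {rate(baseline):.0%}, follow-up {rate(follow_up):.0%}, "
      f"absolute change {change:+.0%}")
```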

Two important reasons to measure performance in health care organizations are to assess change for quality improvement purposes within an organization (internal) and to compare quality of care between different entities (external).12

1. Internal Uses for Pain Management Performance Measurement

Examples of internal uses for performance measurement in pain management include:
■ Measurement of direct and indirect changes in pain management processes and patient outcomes in response to quality improvement interventions.
■ Assessment of compliance with selected guidelines and evidence-based practices in order to eliminate non-evidence–based approaches to pain management grounded in habit, opinion, and biases.
■ Identification of knowledge, attitudes, and competencies of clinicians.
■ Identification of educational needs, individual preferences, beliefs, and expectations about pain management among patients and their family caregivers.
■ Dispelling false beliefs about pain and its treatment, such as fear of addiction.
■ Prioritization when multiple improvement opportunities exist.
■ Provision of objective data in order to gain support from organizational leadership and buy-in from clinicians for improvement activities.

2. External Uses for Pain Management Performance Measurement

Examples of external uses for measuring performance include:
■ Comparison of performance on pain management processes and outcomes with those of other organizations.
■ Compliance with performance data demands from external sources such as payers, accreditors, and government regulatory bodies.
■ Establishment of national, regional, or other benchmarks.
■ Validation and refinement of criteria such as guidelines, standards, and care recommendations. Evaluation of the effect of these criteria on areas such as patient outcomes, costs, and provider behavior.
■ Collection of research data to validate treatment efficacy, evaluate assessment instruments, or establish evidence-based practice.

3. Addressing Concerns Related to Measuring Performance

Although most health care professionals would agree that performance measurement is valuable and necessary for effective quality improvement, it is often met with resistance and apprehension. Lack of familiarity with data analysis methods and tools (e.g., statistical process control) can lead to concerns about the ability to understand and communicate findings. Prior negative experiences can give rise to perceptions that performance measurement is not valuable or valid.11 These include judgments based on insufficient sample size, poorly tested measures (e.g., measures that do not have an established link to patient outcomes or measures not adequately adjusted for differences in severity of illness among different patient populations), and disclosure of misleading data.
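As one concrete illustration of the statistical process control tools mentioned above, the sketch below (not taken from the monograph) computes conventional three-sigma p-chart limits for a monthly documentation indicator; the monthly audit counts are invented for illustration.

```python
import math

# Hypothetical monthly audits: (records reviewed, records with a documented pain assessment).
monthly_audits = [(50, 31), (50, 29), (48, 33), (52, 30), (50, 34), (49, 28)]

total_reviewed = sum(n for n, _ in monthly_audits)
total_met = sum(met for _, met in monthly_audits)
p_bar = total_met / total_reviewed  # center line: overall proportion meeting the indicator

for n, met in monthly_audits:
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)  # binomial standard error at this month's sample size
    ucl = min(1.0, p_bar + 3 * sigma)           # upper control limit
    lcl = max(0.0, p_bar - 3 * sigma)           # lower control limit
    p = met / n
    signal = "investigate (special cause?)" if p > ucl or p < lcl else "common-cause variation"
    print(f"p={p:.2f}  center={p_bar:.2f}  limits=({lcl:.2f}, {ucl:.2f})  {signal}")
```

Points outside the limits suggest variation beyond routine month-to-month noise and are worth examining before conclusions are drawn from the data.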

At the organizational level, performance measurement may be perceived as requiring too much dedicated staff time and resources relative to the expected benefit.

Overcoming these challenges requires early communication and ongoing education throughout the organization to ensure everyone understands the purpose of data collection, the measurement methods used, and the ultimate project goals. In general, it is best to prepare special communications for management, department heads, and other influential groups to address specific concerns and solicit buy-in.

It is important to choose measurement objectives carefully and consider the impact on related processes of care. It has been observed that measurement may have the unintended result of allocating resources, effort, and attention to targeted areas, possibly to the detriment of other functions.12,13 Therefore, careful planning and attention to institutional priorities are important to ensure resources are wisely distributed.

Adopting a systems-level approach to performance measurement and improvement can help overcome the concern that data will be used to single out individuals for criticism. Those charged with implementing such activities must assess data quality and reliability and use sound analysis and interpretation techniques to ensure data are translated into information that is fully substantiated and can be readily understood. More information about this issue is provided in Section VI (Assessing and Analyzing Your Processes and Results).

Whether launching new pain management improvement activities or maintaining established improvements, organizations can enhance effectiveness by:
■ Viewing measurement as part of an ongoing process rather than as an end point.
■ Exercising care in choosing what to measure.
■ Keeping in mind that measurement is most valuable when it is conducted with rigorous attention to quality, analyzed with accuracy, and applied in a timely manner.

As Donald Berwick, MD, a widely recognized leader in the quality improvement field, has said, "measurement without change is waste, while change without measurement is foolhardy."14 Keeping this in mind, organizations can avoid the pitfall of "measurement for measurement's sake" as well as the negative effects on morale and costs that can result when data are collected but not used.11

By recognizing and attending to common concerns surrounding performance measurement and use of data, organizations can overcome resistance.

B. Using Criteria for Effective Measurement

1. Overview of Criteria

When evaluating pain management activities, it is important to refer to credible, evidence-based sources for identifying measurement objectives and establishing performance expectations. Examples of possible sources include clinical practice guidelines, standards, consensus statements, and position papers. For the purposes of this monograph, all of these items will be referred to as criteria, which are defined as a means for judging or a standard, a rule, or a principle against which something may be measured.16

a. Guidelines

Guidelines are systematically developed statements to assist practitioner and patient decisions about appropriate health care for specific clinical circumstances.15 Guidelines are often a mechanism by which treatment recommendations are communicated. Pain management–related guidelines have been developed for different patient populations (pediatric, adult, geriatric), types of pain (chronic or acute), and conditions or procedures (e.g., low back pain, postoperative pain, cancer pain). One source of information on current guidelines related to pain management is the National Guideline Clearinghouse Web site (www.guideline.gov). Examples of pain management guidelines are included in Table 1.

b. Standard

A standard is 1) a criterion established by authority or general consent as a rule for the measure of quality, value, or extent; or 2) for purposes of accreditation, a statement that defines the performance expectations, structures, or processes that must be substantially in place in an organization to enhance the quality of care.16 (Note: this definition is distinct from the use of "standard" in a "standard of care," which may be used in a legal context, or a "standard of practice" established by an organization.) Standards typically are used by accrediting bodies such as the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) and The Rehabilitation Accreditation Commission (CARF) to evaluate health care organizations and programs. An example of standards related to pain management from JCAHO is included in the box below.

c. Consensus statements and position papers

Consensus statements and position papers are expressions of opinion or positions on health care issues generally prepared by professional societies, academies, and organizations and generated through a structured process involving expert consensus, available scientific evidence, and prevailing opinion. Table 2 provides a sample of consensus statements and position papers that may provide additional resources for examining pain management practice in an organization, developing improvement interventions, and creating an organization-wide program. One consensus statement of special relevance for pain management improvement is the American Pain Society Quality of Care Committee's Quality Improvement Guidelines for the Treatment of Acute and Cancer Pain (see box below).18


Overview of Joint Commission on Accreditation of Healthcare Organizations Standards Related to Pain Management

■ Recognize the right of patients to appropriate assessment and management of pain.
■ Screen for the presence and assess the nature and intensity of pain in all patients.
■ Record the results of the assessment in a way that facilitates regular reassessment and follow-up.
■ Determine and ensure staff competency in pain assessment and management (i.e., provide education), and address pain assessment and management in the orientation of all new clinical staff.
■ Establish policies and procedures that support the appropriate prescribing or ordering of pain medications.
■ Ensure that pain does not interfere with a patient's participation in rehabilitation.
■ Educate patients and their families about the importance of effective pain management.
■ Address patient needs for symptom management in the discharge planning process.
■ Incorporate pain management into performance activities (i.e., establish a means of collecting data to monitor the appropriateness and effectiveness of pain management).

Adapted from reference 17.

Table 1. Pain Management Guidelines

Guideline: Acute Pain Management: Operative or Medical Procedures and Trauma. AHCPR Publication No. 92-0032
Release date: 1992
Developer: U.S. Department of Health and Human Services, Agency for Health Care Policy and Research (AHCPR), now known as the Agency for Healthcare Research and Quality (AHRQ)
Focus/Application: Acute pain associated with operations, medical procedures, or trauma; includes all age groups from neonates to the elderly
Contact: AHRQ Clearinghouse; 800-368-9295; www.ahrq.gov; Agency for Healthcare Research and Quality; 2101 East Jefferson Street, Suite 501; Rockville, MD 20852

Guideline: Acute Pain Management in Adults: Operative Procedures. Quick Reference Guide for Clinicians. AHCPR Publication No. 92-0019
Release date: 1993
Developer: U.S. Department of Health and Human Services, Agency for Health Care Policy and Research (AHCPR), now known as the Agency for Healthcare Research and Quality (AHRQ)
Focus/Application: The Quick Reference Guide includes excerpts from the guideline Acute Pain Management: Operative or Medical Procedures and Trauma; the excerpts are designed to provide clinicians with some suggestions for practical and flexible approaches to acute pain assessment and management
Contact: AHRQ Clearinghouse; 800-368-9295; www.ahrq.gov; Agency for Healthcare Research and Quality; 2101 East Jefferson Street, Suite 501; Rockville, MD 20852

Guideline: Acute Pain Management in Infants, Children, and Adolescents: Operative and Medical Procedures. Quick Reference Guide for Clinicians. AHCPR Publication No. 92-0020. In: J Pain Symptom Manage. 1992;7:229-242
Release date: 1992
Developer: U.S. Department of Health and Human Services, Agency for Health Care Policy and Research (AHCPR), now known as the Agency for Healthcare Research and Quality (AHRQ)
Focus/Application: The Quick Reference Guide includes excerpts from the guideline Acute Pain Management: Operative or Medical Procedures and Trauma
Contact: AHRQ Clearinghouse; 800-368-9295; www.ahrq.gov; Agency for Healthcare Research and Quality; 2101 East Jefferson Street, Suite 501; Rockville, MD 20852

Guideline: Practice Guidelines for Acute Pain Management in the Perioperative Setting
Release date: April 1995
Developer: American Society of Anesthesiologists
Focus/Application: Perioperative pain management
Contact: American Society of Anesthesiologists; 520 North Northwest Highway; Park Ridge, IL 60068-2573; E-mail: [email protected]

Guideline: Guidelines for the pediatric perioperative anesthesia environment
Release date: February 1999
Developer: American Academy of Pediatrics
Focus/Application: Care of the pediatric patient in the perioperative anesthesia environment
Contact: American Academy of Pediatrics; P.O. Box 747; Elk Grove, IL 60009-0747; www.aap.org

Guideline: Clinical Practice Guideline: Management of Cancer Pain. AHCPR Publication No. 95-0592
Release date: March 1994; revision in press (The American Pain Society), expected Spring 2003
Developer: U.S. Department of Health and Human Services, Agency for Health Care Policy and Research (AHCPR), now known as the Agency for Healthcare Research and Quality (AHRQ)
Contact: AHRQ Clearinghouse; 800-368-9295; www.ahrq.gov; Agency for Healthcare Research and Quality; 2101 East Jefferson Street, Suite 501; Rockville, MD 20852

Guideline: NCCN Practice Guidelines for Cancer Pain
Revision date: June 2000
Developer: National Comprehensive Cancer Network (NCCN)
Focus/Application: Recommendations for standardized assessment and treatment of cancer pain
Contact: National Comprehensive Cancer Network; 50 Huntington Pike, Suite 200; Rockledge, PA 19046; 215-728-4788; www.nccn.org

Guideline: Cancer Pain Treatment Guidelines for Patients
Release date: January 2001
Developer: American Cancer Society (ACS) and National Comprehensive Cancer Network (NCCN)
Focus/Application: A source of cancer pain treatment information for patients based on the NCCN treatment guidelines
Contact: American Cancer Society; 800-ACS-2345; www.cancer.org; or National Comprehensive Cancer Network; 800-909-NCCN; www.nccn.org

Guideline: Cancer Pain Relief and Palliative Care in Children
Release date: December 1998
Developer: World Health Organization in collaboration with the International Association for the Study of Pain
Focus/Application: Consensus guidelines on the management of pain in children with cancer
Contact: WHO Distribution and Sales Office; 1211 Geneva, Switzerland; Phone: 41-22-791-24-76; Fax: 41-22-791-48-57; E-mail: [email protected]

Guideline: Practice Guidelines for Cancer Pain Management
Release date: 1996
Developer: American Society of Anesthesiologists (ASA)
Focus/Application: Evaluation and assessment of the patient with cancer pain
Contact: American Society of Anesthesiologists; 520 North Northwest Hwy; Park Ridge, IL 60068-2573; E-mail: [email protected]

Guideline: Cancer Pain Relief
Release date: 1986
Developer: World Health Organization
Focus/Application: Management of cancer pain
Contact: WHO Distribution and Sales Office; 1211 Geneva, Switzerland; Phone: 41-22-791-24-76; Fax: 41-22-791-48-57; E-mail: [email protected]

Guideline: Migraine Headache Treatment Guidelines
Release date: 2001
Developer: U.S. Headache Consortium
Focus/Application: Diagnosis, treatment, and prevention of migraine headache
Contact: American Academy of Neurology; 1080 Montreal Avenue; St. Paul, MN 55116; 651-695-1940; www.aan.com

Guideline: Practice Guidelines for Chronic Pain Management
Release date: 1997
Developer: American Society of Anesthesiologists
Focus/Application: Management of chronic pain and pain-related problems
Contact: American Society of Anesthesiologists; 520 North Northwest Hwy; Park Ridge, IL 60068-2573; E-mail: [email protected]

Guideline: Guideline for the Management of Acute and Chronic Pain in Sickle-Cell Disease
Release date: August 1999
Developer: American Pain Society
Focus/Application: Guideline to aid physicians, nurses, pharmacists, and other health care professionals in managing acute and chronic pain associated with sickle-cell disease
Contact: American Pain Society; 4700 W. Lake Ave; Glenview, IL 60025; Phone: 847-375-4715; Fax: 847-375-4777; www.ampainsoc.org

Guideline: Acute Pain Management (in the elderly)
Release date: 1997; Revision date: April 1999
Developer: University of Iowa Gerontological Nursing Interventions Research Center
Focus/Application: Acute pain management in the elderly patient
Contact: University of Iowa Gerontological Nursing Interventions Research Center, Research Dissemination Core; 4118 Westlawn; Iowa City, IA 52242-1100; www.nursing.uiowa.edu/gnirc

Guideline: Model Guidelines for the Use of Controlled Substances for the Treatment of Pain
Release date: May 1998
Developer: The Federation of State Medical Boards of the United States, Inc.
Focus/Application: Evaluating the use of controlled substances for pain control
Contact: Federation of State Medical Boards of the United States, Inc.; Federation Place; 400 Fuller Wiser Road, Suite 300; Euless, TX 76039-3855; Phone: 817-868-4000; Fax: 817-868-4099; www.fsmb.org

Guideline: Clinical Practice Guideline: Chronic Pain Management in the Long-Term Care Setting
Release date: 1999
Developer: American Medical Directors Association
Focus/Application: Addressing chronic pain in the long-term care setting
Contact: American Medical Directors Association; 10480 Little Patuxent Parkway, Suite 760; Columbia, MD 21044; Phone: 800-876-2632; Fax: 410-740-4572; www.amda.com

Guideline: Principles of Analgesic Use in the Treatment of Acute Pain and Cancer Pain
Release date: 1999
Developer: The American Pain Society
Focus/Application: Principles and recommendations related to analgesic therapy
Contact: American Pain Society; 4700 W. Lake Ave; Glenview, IL 60025; Phone: 847-375-4715; Fax: 847-375-4777; www.ampainsoc.org

Guideline: Treatment of Nonmalignant Chronic Pain
Release date: 2000
Developer: The American Academy of Family Physicians
Focus/Application: Management of nonmalignant chronic pain
Contact: American Academy of Family Physicians; 11400 Tomahawk Creek Parkway; Leawood, KS 66211-2672; 913-906-6000; www.aafp.org

Guideline: Guideline for the Management of Pain in Osteoarthritis, Rheumatoid Arthritis and Juvenile Chronic Arthritis
Release date: 2002
Developer: The American Pain Society
Focus/Application: Pain management for juveniles with osteoarthritis, rheumatoid arthritis, and juvenile chronic arthritis
Contact: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025; Phone: 847-375-4715; Fax: 847-375-4777; www.ampainsoc.org

Guideline: Management of Low Back Pain or Sciatica in the Primary Care Setting
Release date: May 1999
Developer: Veterans Health Administration and Department of Defense
Focus/Application: Management of low back pain in the ambulatory care setting for patients older than 17 years of age
Contact: Department of Defense; Department of Veterans Affairs, Veterans Health Administration (VHA), Office of Quality and Performance (1OQ); 810 Vermont Ave, NW; Washington, DC 20420

Guideline: Acute Low Back Problems in Adults. AHCPR Publication No. 95-0642
Release date: 1994
Developer: U.S. Department of Health and Human Services, Agency for Health Care Policy and Research (AHCPR), now known as the Agency for Healthcare Research and Quality (AHRQ)
Focus/Application: Management of low back pain (adults)
Contact: AHRQ Clearinghouse; 800-368-9295; www.ahrq.gov; Agency for Healthcare Research and Quality; 2101 East Jefferson Street, Suite 501; Rockville, MD 20852

Guideline: The Management of Persistent Pain in Older Persons
Release date: 2002
Developer: American Geriatrics Society (AGS)
Focus/Application: Management of persistent pain in older adults
Contact: American Geriatrics Society; The Empire State Building; 350 Fifth Avenue, Suite 801; New York, NY 10118; 212-308-1414; www.americangeriatrics.org

2. Criteria Similarities and Differences

Often, these terms (standards, guidelines, and consensus statements) are used interchangeably throughout the literature and across the health care field by entities ranging from government agencies to accrediting organizations to professional societies to legal experts. Although similarities exist among them, there also are some distinct differences that should be kept in mind to prevent confusion over terminology (see the box "Comparing Criteria: Guidelines, Standards and Consensus Statements" below).

Although criteria can be highly beneficial for assessing and improving organizational performance and quality of care, health care organizations must ensure that the sources are credible, evidence-based where applicable, scientifically sound, and accepted by clinicians. As options for managing pain continue to evolve, it is important to consider the extent to which information in the criteria reflects the current state of knowledge. Some criteria are subject to established cycles of review and revision, while others are not. For example, a recent study designed to assess the current validity of 17 guidelines developed by the Agency for Healthcare Research and Quality (AHRQ) (including the landmark 1992 postoperative pain guidelines) and to use this information to estimate how quickly guidelines become obsolete led to the recommendation that, as a general rule, guidelines should be reassessed for validity every 3 years.19

3. Mandated Criteria

Another important set of criteria consists of those mandated by federal, state, and local statute or regulation. All health care professionals and organizations need to be aware of the statutes and regulations applicable to their practice. Though a detailed discussion is beyond the scope of this monograph, due in part to the complexity and changing nature of such criteria, an introduction with references is provided.

a. Statute

A statute is a law created by a legislative body at the federal, state, county, or city level. Commonly called a law or an act, a single statute may consist of just one legislative act or a collection of acts.20 Examples include the Pain Patient's Bill of Rights (California, 1997) and the Intractable Pain Act (West Virginia, 1998).

b. Regulation

A regulation is an official rule or order issued by agencies of the executive branch of government. Regulations have the force of law and are intended to implement a specific statute, often to direct the conduct of those regulated by a particular agency.20 An example is the 1999 pain management regulation provided by the Washington Medical Quality Assurance Commission.

Detailed information on state statutes and regulations addressing multiple aspects of pain management is available from the Web site of the Pain and Policy Studies Group, University of Wisconsin Comprehensive Cancer Center, which is a World Health Organization collaborating center for policy and communication in cancer care (www.medsch.wisc.edu/painpolicy/). The Pain and Policy Studies Group publishes an annual review of state pain policies on its Web site; this review summarizes and comments on each new or amended state statute, regulation, and medical board policy affecting pain management. The complete citation, full text, and Internet citation also are provided for each pain-related policy.

American Pain Society Quality Improvement Guidelines for the Treatment of Acute and Cancer Pain

■ Recognize and treat pain promptly.
■ Chart and display patient's self-report of pain.
■ Commit to continuous improvement of one or several outcome variables.
■ Document outcomes based on data and provide prompt feedback.
■ Make information about analgesics readily available.
■ Promise patients attentive analgesic care.
■ Define explicit policies for use of advanced analgesic technologies.
■ Examine the process and outcomes of pain management with the goal of continuous improvement.

Source: reference 18 (JAMA, 1995;274:1874-1880). Used with permission.

Page 19: Improving the Quality of Pain Management Through Measurement and Action

Comparing Criteria: Guidelines, Standards and Consensus Statements

Similarities■ They are useful for establishing

expected or desired levels of per-formance that are credible, evi-dence-based, and recognized byprofessionals.

■ Scientific evidence and/or expertconsensus are used in develop-ment.

■ They can be multidisciplinary inscope and focus.

■ Usually compliance is consideredvoluntary rather than mandated bylaw.

■ They are useful for improving con-sistency and reducing variation incare processes and for evaluatinglinks between processes of care andoutcome quality management.

■ They can serve to educate clini-cians, organizations, and others inthe health care community regard-ing advances in the field and sub-sequent care recommendations.

Differences■ The intended audience can vary

widely. For example, guidelines areoften written for clinicians whodeliver care, while standards oftenare written for a broader audiencesuch as a department or organiza-tion.

■ The intended purpose can vary.For example, standards of all typesare frequently seen as authoritativestatements, expectations, orrequirements and are used at thelevel of assessment for organiza-tions or programs. Guidelines, onthe other hand, generally areviewed as strategies for clinicaldecision-making, and hence aremore flexible. They also are usedto develop protocols allowing forclinically justifiable deviations intreating individual patients or foralleviating specific conditions orsymptoms.

■ The degree of clinical certaintycan vary across and even withineach criterion category based onthe developmental approach usedand the availability of documentedscientific evidence.


Table 2. Consensus and Position Statements

Title: Quality Improvement Guidelines for the Treatment of Acute Pain and Cancer Pain
Date of Release: 1995
Developer: American Pain Society Quality of Care Committee
Focus/Application: Improving treatment for patients with acute pain and cancer pain
Contact Information: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.ampainsoc.org

Title: The Use of Opioids for the Treatment of Chronic Pain: A Consensus Statement From American Academy of Pain Medicine and American Pain Society
Date of Release: 1996
Developer: American Academy of Pain Medicine and American Pain Society
Focus/Application: The management of chronic pain
Contact Information: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.ampainsoc.org; or American Academy of Pain Medicine; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.painmed.org

Title: Treatment of Pain at the End of Life: A Position Statement from the American Pain Society
Date of Release: 1997
Developer: American Pain Society
Focus/Application: The management of pain at the end of life
Contact Information: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.ampainsoc.org

Title: Pediatric Chronic Pain: Position Statement
Date of Release: 2001
Developer: American Pain Society
Focus/Application: The special needs in pediatric chronic pain assessment, treatment, research, and education
Contact Information: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.ampainsoc.org

Title: Pain Assessment and Treatment in the Managed Care Environment: A Position Statement from the American Pain Society
Date of Release: 2000
Developer: American Pain Society
Focus/Application: Assist managed care organizations (MCOs) in advancing the quality of pain management services and the patient satisfaction and clinical outcomes for people receiving care through these organizations
Contact Information: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.ampainsoc.org

Title: Oncology Nursing Society Position on Cancer Pain Management
Date of Release: 2000
Developer: Oncology Nursing Society (ONS)
Focus/Application: Pain management in patients with cancer
Contact Information: Oncology Nursing Society; 501 Holiday Drive; Pittsburgh, PA 15220; www.ons.org

Title: The Assessment and Management of Acute Pain in Infants, Children, and Adolescents
Date of Release: 2001
Developer: American Pain Society and American Academy of Pediatrics
Focus/Application: Pediatric acute pain
Contact Information: American Pain Society; 4700 W. Lake Ave.; Glenview, IL 60025-1485; www.ampainsoc.org; or American Academy of Pediatrics; 141 Northwest Point Blvd.; P.O. Box 927; Elk Grove Village, IL 60009-0927; www.aap.org

Title: Circumcision Policy Statement
Date of Release: March 1999
Developer: American Academy of Pediatrics
Focus/Application: A policy statement on neonatal circumcision of the male infant
Contact Information: American Academy of Pediatrics; 141 Northwest Point Blvd.; P.O. Box 927; Elk Grove Village, IL 60009-0927; www.aap.org

Title: Prevention and Management of Pain and Stress in the Neonate
Date of Release: February 2000
Developer: American Academy of Pediatrics and Canadian Pediatric Society
Focus/Application: A policy statement regarding the recognition and management of pain in neonates (preterm to 1 month of age)
Contact Information: American Academy of Pediatrics; 141 Northwest Point Blvd.; P.O. Box 927; Elk Grove Village, IL 60009-0927; www.aap.org

Title: Symptom Management: Pain, Depression, and Fatigue (draft statement)
Date of Release: July 2002
Developer: State of the Science Conference Statement, Expert Panel
Focus/Application: Assessment by expert panel of medical knowledge at the time the statement was written regarding symptom management in cancer
Contact Information: State of the Science Statements; NIH Consensus Development Program; www.consensus.nih.gov

Title: Definitions Related to the Use of Opioids for the Treatment of Pain, A Consensus Document
Date of Release: 2001
Developer: The Liaison Committee on Pain and Addiction, a collaborative effort of the American Academy of Pain Medicine, American Pain Society, and American Society of Addiction Medicine
Focus/Application: Definitions of addiction, tolerance, and physical dependence
Contact Information: American Academy of Pain Medicine; 4700 W. Lake Avenue; Glenview, IL 60025-1485; www.painmed.org; or American Pain Society; 4700 W. Lake Avenue; Glenview, IL 60025-1485; www.ampainsoc.org; or American Society of Addiction Medicine; 4601 North Park Avenue; Arcade 101; Chevy Chase, MD 20615; www.asam.org

Title: Clinical Policy: Critical Issues for the Initial Evaluation and Management of Patients Presenting With a Chief Complaint of Nontraumatic Acute Abdominal Pain
Date of Release: 2000
Developer: American College of Emergency Physicians
Focus/Application: Evaluation and management of patients presenting with a chief complaint of nontraumatic acute abdominal pain
Contact Information: American College of Emergency Physicians; Customer Service Dept.; P.O. Box 619911; Dallas, TX 75261-9911; www.acep.org; 800-798-1822; Ann Emerg Med. October 2000;36:406-415


SECTION III:

Understanding Organizational Improvement In Pain Management

A. Viewing the Organization as a System

Recent trends in health care performance assessment indicate organizations are moving toward systems techniques and approaches that originated in industrial improvement models. A system is a group of interacting, interrelated, or interdependent elements or processes that form a collective entity and share a common goal.21 Taking a systems approach means emphasizing the organization as a whole, focusing on the interconnectivity of processes and underlying structures, recognizing relationships and interactions across the organization, and identifying root causes of problems. Figure 1 illustrates how an organization's culture and leadership are influenced by patients (customers), staff (people), strategies, and processes.

Similarly, improving pain management performance requires "systems thinking," much like that used for identifying opportunities to improve the medication use process.23,24 The use of a systems approach when designing or redesigning processes requires one to consider ways to overcome potential process or task design failures in such areas as equipment, organizational and environmental factors, and psychological precursors, as well as team building and training.25

Experts consistently emphasize the need for a system-wide, collaborative, and interdisciplinary approach to pain management.17,26-28

Strategies for improving pain management can be viewed in relation to a system composed of:

■ Inputs—including patients, clinicians, technology, equipment, and pharmacologic and nonpharmacologic therapies.
■ Throughputs—care processes.
■ Outputs—including patient satisfaction, pain experience, outcomes (e.g., morbidity, length of stay), changes in clinician behavior, and healthcare utilization (e.g., emergency room visits, rehospitalization).

Improvement efforts will produce the greatest effect when inputs and throughputs are simultaneously addressed by altering how these elements interact to change outputs.29

As Donald Berwick, MD, has said, "Every system is perfectly designed to achieve exactly the results it achieves" (cited in reference 29, p. 6). Poorly designed systems often lead to inefficiency and inadequate quality of care. Understanding and simplifying the steps of a process can yield substantial improvements in performance.

B. Understanding Pain Management

As with any clinical performance evaluation, it is essential to obtain and review the latest information available before proceeding. The rapidly expanding body of evidence-based findings related to pain management makes it imperative to conduct such a review before implementing quality improvement activities. Resources include criteria (guidelines, consensus statements, and standards), journal articles, texts, Web sites, and other publications. McCaffery and Pasero27 suggest that pain management reference materials, including books, journals, and videos, be used to create an institutional library. Sources useful in identifying reference materials include the City of Hope Pain/Palliative Care Resource Center Web site (www.cityofhope.org/prc) and the Department of Pain Medicine and Palliative Care at Beth Israel Medical Center Web site (www.stoppain.org). In addition, conferences, professional society meetings, research symposia, and educational seminars are often great sources of new or "cutting edge" information. They also provide the opportunity for staff to gain new skills, network with other organizations, and meet recognized experts in the field.

Participating in research as a test site can be an excellent opportunity for organizations. Benefits may include access to newly developed technology, data collection strategies, education, and support services, as well as aggregate comparative information. To learn about research opportunities, check with academic institutions, professional societies, and government agencies (e.g., the Agency for Healthcare Research and Quality [formerly known as the Agency for Health Care Policy and Research]). It is important that healthcare institutions support research through funding, active participation, and dissemination of research findings at professional and scientific meetings. These activities are critical to creating and maintaining exemplary practice in clinical services across all types of settings.

C. Institutionalizing Pain Management

The process of integrating good pain management practices into an organization's everyday life requires a comprehensive approach that includes—and goes beyond—performance improvement to overcome barriers and to achieve fundamental system changes. Researchers and pain management experts have identified a core set of activities characterized by the use of an interdisciplinary approach to facilitate these system changes.18,26,27


Figure 1. Dynamics of Systems Thinking in Relation to the Health Care Organization

[Figure not reproduced. The diagram links five elements (strategy, people, process, customer, and culture/leadership), each with associated activities such as creating a shared strategic vision, identifying core processes and competencies, aligning reward and recognition systems with organizational goals, gathering "actionable" customer data, and measuring and monitoring results.]

(22) This figure illustrates how an organization's strategies, patients, staff, and processes affect its culture and leaders. Reprinted with permission from Labovitz G, Rosansky V. The Power of Alignment: How Great Companies Stay Centered and Accomplish Extraordinary Things. Copyright © 1997 John Wiley & Sons, Inc. This material is used by permission of John Wiley & Sons, Inc.


Collectively, this interdisciplinary approach and the related activities have been referred to as "institutionalizing pain management." These strategies provide a foundation for system change and effective performance improvement activities. An example of the steps associated with institutionalizing pain management is provided in the text box "Eight Steps to Develop an Institutional Pain Management Program."

Three books that provide extensive information on institutionalizing pain management include Building an Institutional Commitment to Pain Management: The Wisconsin Resource Manual,26 Pain Clinical Manual,27 and Pain Assessment and Management: An Organizational Approach.30

Organizational experiences and successful efforts to institutionalize pain management are also reported in journal articles.28,32-34,117

Among the suggested steps to institutionalize pain management are activities common to modern quality improvement methods. These include forming a multidisciplinary committee of key stakeholders, analyzing current pain management practice performance, and improving through continuous evaluation of performance. Although many steps are common to a quality improvement initiative, they may differ in scope. For example, the multidisciplinary committee charged with integrating quality pain management throughout the organization will often be addressing mission and policy statements, standards of care, accountability, and overall practice issues. By contrast, a quality improvement work group generally has responsibility for a focused improvement activity (see Section IV.A [Establishing the Project Team]). The multidisciplinary committee may be established as a formal body first, or in some organizations may evolve from a quality improvement project team.35 Depending on the size of the organization and available resources, the committee may fulfill both roles.

D. Understanding Current Pain Management Practices in Your Organization

One of the first priorities is to complete a comprehensive evaluation of the organization's structures, processes, and people, referencing key criteria for quality pain management practices. This organizational assessment may involve multiple data collection activities including review of organizational resources, documents, and medical records; completion of a self-assessment tool; and assessment of staff knowledge and attitudes. These measurement approaches, including references for organizational self-assessment instruments, are further described in Section V (Measuring What You Need to Know).

Understanding patient (or client) factors is essential to completing the picture of current management practices. No pain initiative can succeed if patient needs are not carefully evaluated and addressed. A common starting point is an examination of the answers given by patients and families to the questions on satisfaction surveys.


Eight Steps to Develop an Institutional Pain Management Program

■ Develop an interdisciplinary work group.
■ Analyze current pain management in your care setting.
■ Articulate and implement a standard of practice.
■ Establish accountability for pain management.
■ Provide information about pharmacologic and nonpharmacologic interventions to clinicians to facilitate order writing and interpretation and implementation of orders.
■ Promise individuals a quick response to their reports of pain.
■ Provide education for staff.
■ Continually evaluate and work to improve the quality of pain management.

(26) Gordon DB, Dahl JL, Stevenson KK, eds. Building an Institutional Commitment to Pain Management: The Wisconsin Resource Manual. 2nd ed. Madison, WI: The Resource Center of the American Alliance of Cancer Pain Initiatives; 2000. Used with permission.


It is important to note that researchers find that satisfaction survey results, though important, often indicate a high level of satisfaction even though pain intensity ratings show that individuals may be experiencing significant degrees of pain.36-42 Satisfaction surveys also fail to capture significant patient-specific issues critical to successful pain management in that they generally do not assess patient knowledge and beliefs. It is essential that patients know how and to whom to report pain as well as understand their treatment regimen. Measurement of patients' knowledge, attitudes, and beliefs through the use of assessment instruments is discussed further in Section V.C.3 (Test Knowledge and Attitudes). This information is important to developing needs-based education and recognizing patient barriers (see Section VIII.A [Factors That Influence Change]). Other patient-related considerations that may figure significantly in designing a pain management improvement activity include:

■ Specific special-risk groups such as the elderly, children, and the cognitively impaired.
■ Special pain relief needs related to cancer or chronic/acute illness.
■ Unique demographic characteristics (e.g., ethnic/cultural, language, educational level).
■ Information obtained about current pain intensity levels derived from patient-reported ratings.

Obtaining objective data related to these patient and institutional factors will result in a better evaluation of organizational performance. A thorough understanding of your organization's current practices will provide a solid foundation for making recommendations, gaining needed support, and designing successful improvement interventions.

E. Understanding Quality Improvement Principles

The past decade has been marked by an evolution in health care toward the use of modern quality improvement methodologies.29 Though several different frameworks exist, virtually all include defined steps that function as an operational model for implementing improvement processes. Figure 2 presents the widely used PDSA (Plan, Do, Study, Act) framework developed by Langley et al.21 A worksheet of key questions associated with this cycle is provided in Appendix B.

1. The Cycle for Improving Performance

This monograph has adopted JCAHO's Framework for Improving Performance, which includes a cycle similar to the PDSA cycle.44 The JCAHO Cycle for Improving Performance describes critical activities common to many improvement approaches and provides for systematic, scientifically oriented action (Figure 3). The cycle is one component of the framework, which also recognizes factors in the external and internal environment that influence organizational performance.

As shown in Figure 3, the cycle is a continuous process with four major activities: design, measure, assess, and improve. All four activities are important and must be addressed to achieve a balanced, effective approach. In the design phase, a function or process is created; in the measure phase, data are gathered about the process to create an internal database for evaluating performance; in the assess phase, data are analyzed to identify areas for improvement or to assess changes; and in the improve phase, changes required to improve upon the original function or process are implemented. This highly flexible cycle allows health care organizations to begin at any phase and apply it to a single activity or an organization-wide function.

2. Repeating the Cycle at Different Phases of the Project

Whether one selects the PDSA cycle, the Framework for Improving Performance, or a different approach, most improvement initiatives will require repeating the cycle for different phases of the project. For example, if the quality management staff has identified infrequent patient reassessment as a problem, they may take steps to organize a team to determine an overall goal for improving reassessment practices (design/plan); use self-assessment tools and medical record audits to measure the extent of the problem across the organization (measure/do); collect and analyze data on reassessment in representative patient care units (assess/study); and prioritize potential improvement options and implement interventions (improve/act).

After agreement among quality management staff members that a multifaceted educational intervention for nurses is needed, the cycle would repeat. The education would be planned across shifts (design/plan), the education would be implemented as a pilot test on a single unit by having staff complete a pretest examination before the education (measure/do), the education would be conducted (improve/act), and a posttest examination would be completed and the change in scores would be analyzed (assess/study). At that point, the education would be modified as needed and incorporated into orientation and other required education activities.

Table 3 provides examples of some different activities associated with the cycle for each stage of the improvement initiative.


Figure 2. The PDSA Cycle

From: Langley GJ, Nolan KM, Nolan TW. The foundation of improvement. Quality Progress. June 1994:81-86. Reprinted in Joint Commission on Accreditation of Healthcare Organizations. Using Performance Improvement Tools in a Health Care Setting. Rev ed. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1996.

Plan
■ Objective
■ Questions and predictions
■ Plan to carry out the cycle (who, what, where, when)

Do
■ Carry out the plan
■ Document problems and unexpected observations
■ Begin analysis of the data

Study
■ Complete the analysis of the data
■ Compare data to predictions
■ Summarize what was learned

Act
■ What changes are to be made?
■ Next cycle?


Figure 3. Cycle for Improving Performance

From: Joint Commission on Accreditation of Healthcare Organizations. Framework for Improving Performance: From Principle to Practice. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1994.

[Figure not reproduced. The cycle links objectives, the function or process, an internal database, comparative information, improvement priorities, and improvement/innovation through four activities: design (and redesign), measure, assess, and improve.]

Table 3. Examples of Tasks Associated with Each Stage of the Improvement Cycle

Understanding the problem
• Design/Plan: Organize team; set overall project goals; integrate with organizational priorities
• Measure/Do: Collect baseline data
• Assess/Study: Compare to established norms; identify opportunities for improvement
• Improve/Act: Identify potential actions; prioritize actions

Implementing the improvement/intervention
• Design/Plan: Set target goals and desired levels of performance; plan intervention schedule; obtain resources and approvals
• Measure/Do: Pilot-test intervention; collect pilot data; collect data to evaluate effectiveness of intervention
• Assess/Study: Assess effectiveness of pilot (redesign if necessary); analyze data on intervention effectiveness; compare results to goals
• Improve/Act: Implement full intervention

Continuous monitoring
• Design/Plan: Determine who, when, and how monitoring will be done
• Measure/Do: Remeasure at regular intervals
• Assess/Study: Reanalyze periodically
• Improve/Act: Disseminate findings; modify intervention if needed; identify new opportunities for improvement



3. Integrating With Internal Quality Improvement Activities

Ideally, pain initiatives should be integrated into existing improvement functions. Taking advantage of established institutional improvement structures will help ensure support of key leadership and involvement of staff with quality improvement expertise and dedicated resources. Look for ways to emphasize pain management improvement within the context of overall organizational improvement goals.

It is useful to examine previous successful performance improvement initiatives at your organization to identify the factors that led to success. Studying successful pain improvement initiatives completed by other organizations also can be valuable. This can be done by:

■ Networking through professional societies
■ Using Web-based communications such as list-servs
■ Attending educational conferences
■ Participating in learning networks and organizational performance improvement workshops such as those offered by the Institute for Healthcare Improvement (www.ihi.org) or Joint Commission Resources, Inc. (www.jcrinc.com)

To learn from organizations that have received special recognition for quality improvement practices, examine the initiatives of winners of national quality awards such as the Malcolm Baldrige Award (administered by the U.S. Department of Commerce) or the Ernest A. Codman Award (Ernest A. Codman Award Program, JCAHO, www.jcaho.org).


SECTION IV:

Designing Your Pain Management Improvement Initiative

A. Establishing the Project Team

The composition of the project team is extremely important to the success of the improvement project. As with any health care issue, it will be important to include key stakeholders, including people with the necessary knowledge and skill in care processes, "change champions" (motivated individuals who reinforce effective pain management behaviors and influence peers), and those individuals with the authority to support change (administrators). Improving pain management will require the participation of multiple groups within an organization, and the team should reflect the interdisciplinary nature of the process. As major decision makers, physicians should be engaged early in the planning stages, or efforts to change care are likely to fail.45 Some suggestions for engaging physicians in quality improvement activities are presented in the box "Getting Physicians Involved in the Quality Improvement Process." Because the pain management process could potentially involve many individuals, it may be practical to develop a core group and enlist additional members who participate as necessary on specific issues.35 Be sure to keep all members informed, especially if they do not attend all meetings, through communications proven to be most effective in your organization such as minutes, e-mail, phone, or newsletters.

Once the team is identified, define and assign roles and responsibilities. For example, in one hospital's initiative to increase evidence-based pain management practice, roles were specifically defined for pain team members during the 3-year project (see Table 4).46

Another approach used by some organizations is a "contract," or written statement of responsibilities and commitment, for team members.

Several additional factors important to successful implementation of quality improvement teams are shown in the box "Common Characteristics of Successful Quality Improvement Teams."

B. Selecting Objectives and Target Goals

Once the team is in place, the next step is to define objectives for the organization's measurement efforts. As previously noted, criteria such as those described in Section II (Getting Started on Measurement to Improve Pain Management) provide valuable evidence-based principles and processes for evaluating pain management. In particular, the American Pain Society quality improvement guidelines often are used to establish improvement goals and objectives.18 Improvement opportunities may be identified by comparing your organization's current practice (using objective assessment data) with criteria.

If multiple opportunities are identified, objectives need to be prioritized, with one or two eventually selected. Objectives should be manageable and measurable, and should reflect expectations that are meaningful to patients as well as to health care professionals. The following statements are examples of objectives:

■ A pain assessment will be completed upon admission.
■ A patient-identified acceptable pain level will be noted on the chart.
■ Staff who manage specialized pain therapy techniques will have successfully completed competency testing.
■ Patient-reported pain intensity scores will be recorded in the chart.

Some organizations may further define the objective by adding a target performance goal. For example, the first objective could have an assigned target goal of 90%. The target goal describes a desired level of performance and serves as a "ruler" for measuring improvement. At the same time, setting incremental goals may provide positive incentives as success is achieved in a defined way. One example of using target goals to develop an internal "report card" for selected measures of pain management performance is described by Starck et al.48 A modified American Pain Society Patient Outcome Questionnaire and a second instrument to assess consistency of the hospital's processes with the AHCPR guideline were used to collect data and to develop a pain management report card. The report card established desired target goals against which actual performance was compared.
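As a simple illustration of how a target goal can serve as a "ruler" for measuring improvement, the sketch below compares observed indicator rates with their targets. The indicator names, counts, and 90% targets are hypothetical examples, not data from the Starck et al. report card.

```python
# Minimal sketch: compare observed performance with target goals ("report card" style).
# All indicator names, counts, and targets below are hypothetical examples.

indicators = [
    # (indicator, numerator, denominator, target expressed as a proportion)
    ("Pain assessment completed on admission", 172, 200, 0.90),
    ("Acceptable pain level documented",        141, 200, 0.90),
]

for name, numerator, denominator, target in indicators:
    rate = numerator / denominator
    gap = rate - target
    status = "meets target" if rate >= target else "below target"
    print(f"{name}: {rate:.1%} observed vs {target:.0%} target ({status}, gap {gap:+.1%})")
```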

C. Establishing the Scope of Measurement

Once the team determines what to measure (setting the objectives), it needs to decide the scope of measurement. Measurement options can be defined in a number of ways:

■ Discipline-specific measurement focuses on a particular discipline (e.g., nursing, medicine, pharmacy) and probably is most applicable when the intervention is exclusive to the measured group and discipline-specific knowledge and skills can be assessed.
■ Service- or unit-specific measurement may be useful when the improvement activity is limited to a specific area such as a postoperative recovery or hospice unit.


Getting Physicians Involved in the Quality Improvement Process

Eight questions to ask:

■ Do physicians in your organization feel excluded from the team at the outset?
■ Are incentives poorly defined and identified?
■ Is the team too rigid in laying out its tasks?
■ Are goals and objectives vague or unrealistic?
■ Are there too few data, or is there misuse of data?
■ Is there a solid connection between the problem and its relevance to patient care?
■ Does the approach emphasize scientific methods?
■ Does the team have the needed expertise in continuous quality improvement techniques, skill measurement, and data analysis (i.e., a data analyst or statistician)?

Eight solutions to consider:

■ Choose an important project with clear goals.
■ Include physicians on the ground floor. Ask what quality improvement issues they would like to see measured and monitored.
■ Focus on data that are readily available, timely, and valid.
■ Adjust outcomes data appropriately to avoid misuse.
■ Recognize physician bias. Physicians care more about clinical outcomes and less about quality improvement processes. Clearly identify benefits of quality improvement to physicians and their patients.
■ Use physician time wisely: avoid meetings during office hours, and use fax, phone, teleconferencing, and e-mail as alternatives to face-to-face meetings.
■ Avoid engaging physicians in activities that require long training sessions and excessive use of quality improvement terminology.
■ Strive to involve a few physicians who are opinion leaders.

Adapted from reference 45.


■ Population-specific measurement based on diagnoses or symptoms targets a defined patient group—possibly across more than one unit—such as sickle cell or chronic pain patients.
■ Organization-wide measurement deals with all patients and service areas and is useful when implementing process changes such as use of a specific pain assessment tool.
■ Health system–wide measurement looks at organizations within the same administrative management system and is used when assessing activities across multiple sites. For example, the Veterans Health Administration adopted a national monitoring strategy for pain management, which is described in Section IX.

Finally, the scope of measurement should be aligned with stated measurement objectives and target goals. For example, it would be insufficient to measure a single unit for an organization-wide objective.


Table 4. Examples of Roles and Behaviors for Pain Project Team Members

Source: Miller EH, Belgrade MJ, Cook M, et al. Institutionwide pain management improvement through the use of evidence-based content, strategies, resources, and outcomes. Quality Management in Health Care. 1999;7(2):33. Used with permission.

Clinical nurse specialist: Lead and coordinate implementation strategies in each patient care population; pain program faculty member for pain assessment, complementary therapy, and case study sections

Clinical pharmacist: Lead and coordinate implementation strategies that involve pharmacologic strategies; especially helpful in physician order changes; pain program faculty member for opioid, nonsteroidal anti-inflammatory drug, and case study sections

Physician: Develop, advise, and deliver physician education; lead opioid order revision; pain program faculty member for physiology

Researcher: Develop project plan and proposal; obtain funding; coordinate implementation, data collection, analysis, and outcome dissemination; provide overall leadership for project; pain program faculty member for project processes and outcomes

Program director: Develop educational material, obtain funding, assimilate project literature and materials, and garner institutional support

Common Characteristics of Successful Quality Improvement Teams

■ Clear goals and a written charter
■ Clarity of each member's role
■ A standard process for team meetings and the team's work
■ Trained and oriented team members
■ External support and recognition
■ Effective leadership and facilitation
■ Collaborative problem solving and decision making
■ Presence of leadership with the resources to implement proposed solutions
■ Time for team meetings and assigned team work

Adapted from reference 47.


D. Identifying Resources and Project Timeline

The final steps of the planning process include determining necessary resources and a timeline. Resources will be needed to support the quality improvement team's work as well as to implement improvement interventions. Improvement interventions may involve patient-care-related equipment (e.g., patient-controlled analgesia pumps), written materials (e.g., educational booklets, surveys, instructions, assessment instruments), and media (e.g., teaching videos). Project-related resources may include equipment (e.g., computers, printers), supplies (e.g., paper, flip charts, poster boards), and staff time (e.g., for data collection, analysis, meetings, attendance at educational programs).

Whenever possible, adapt and adopt from existing reference materials to save precious staff time and benefit from previous research. Fortunately, there is a rich foundation of materials developed and published by pain researchers to address organizational, clinician, and patient assessment as well as institutional policy statements and standards. Examples of some of these materials are referenced in Section V (Measuring What You Need to Know). Additional publications include:

■ Pain Clinical Manual27
■ Building an Institutional Commitment to Pain Management: The Wisconsin Resource Manual26
■ Pain Assessment and Management: An Organizational Approach30
■ Examples of Compliance: Pain Assessment and Management49

Finally, the team must develop a timeline based on realistic estimates for each step of the project; underestimating the amount of time required is a common pitfall. Important project steps include the baseline performance assessment, pilot testing, redesign (if indicated), retesting, and full-scale implementation. The timeline can be as short as a few weeks for simpler improvement efforts or as long as several years, which may be required to complete comprehensive quality improvement projects.32


SECTION V:

Measuring What You Need to Know

In the design/planning phase, specific objectives and the scope of measurement for the improvement activity are identified. Careful planning to select project objectives that are measurable will help your organization prepare for the next step: data collection. Data collection is used throughout the improvement project, beginning with the assessment of current performance and continuing during and after interventions to document change. Deciding how to collect the data (i.e., the measurement approach) is an important decision. Several considerations when selecting an approach are discussed in this section, and several measurement approaches are presented.

A. Consider Your Measurement Approach

Measurement involves the collection of specified values or facts. The data selected for collection should support analysis of the goals and objectives identified for the project. The method used in data collection, together with the identified data source(s), constitutes the measurement approach. In some instances the approach includes well-defined data elements and collection processes, such as those required for the Minimum Data Set (MDS) by the Centers for Medicare & Medicaid Services (formerly the Health Care Financing Administration) or for participation in a performance measurement system. In other approaches, the organization-specific data elements, sources, and collection process will have to be identified.

Many options exist for measuring performance. In Section V.C (Selecting a Measurement Approach), the following measurement approaches are described and an organizational example specific to pain management is provided for each:

■ Conduct organizational self-assessment
■ Review medical records
■ Test knowledge and attitudes
■ Directly observe the care
■ Conduct a point prevalence study
■ Assess patient status and outcome over time
■ Collect indicator data
■ Utilize an externally developed performance measurement system.

Regardless of the method or measurement approach selected, there are some common data collection issues to consider. Obtaining quality data that can support performance assessment is critical to a successful improvement initiative. The following provides an overview of the issues of data quality, data quantity, and use of data collection instruments.

1. Data Quality

Ensuring data quality is an essential component of the measurement initiative. High-quality data are needed to establish the validity of the measurement and improvement initiative (e.g., the extent to which the concept being assessed is accurately measured and that the improvement is a true improvement).50

There are several issues to consider relative to data quality. Some issues apply at the data element level, while others apply after data elements have been aggregated (e.g., indicator rates). Data quality issues can be described by using the concepts of accuracy, completeness, and consistency.

a. Accuracy

The accuracy of information at the data element level is a function of the definitions (e.g., clarity and thoroughness), categorizations (e.g., whether categories are appropriate, comprehensive, and mutually exclusive), and the overall clarity and readability of the instructions and documentation (e.g., inclusion and exclusion criteria). Often, it is necessary to describe a preferred source of data, because even simple items like patient age can be recorded differently and in more than one place (e.g., on the chart versus in the admission database). When aggregating multiple data elements to arrive at calculated values (e.g., rates), the number and complexity of steps can affect the accuracy of the information.

b. Completeness

Completeness refers to the extent of missing data. The more data elements needed, the greater the opportunity for missing data.

c. Consistency

Consistency is often referred to as reliability. Reliability is the extent to which the measure or data collection tool, when repeatedly applied to the same population, yields the same results a high proportion of the time. Three categories of reliability assessment include internal consistency (used for scores created from multiple items), interrater (used for medical record abstraction), and test-retest (used for survey measures).50

To enhance consistency, one may want to consider these questions: Are different departments using different pain scales or assessment tools? Are all patients being instructed consistently in how to use the tool? Will data collection be completed by a few individuals, or many staff members? How will consistency between data collectors (interrater reliability) be ensured?

No data source or data collection process is perfect; there are always trade-offs to consider. For example, data from automated sources may be more reliable than data that are manually abstracted, but the information in automated sources (e.g., administrative) may be less clinically robust. Using scannable forms can decrease errors associated with data entry, but may limit the types of responses that can be entered. In general, as more people become engaged in data collection, the reliability of the process decreases. However, limiting responsibility for data collection to one or two people risks the possibility that staff turnover, illness, or reassignments can substantially derail your project. This potential problem can be mitigated somewhat by cross-training people to perform different functions.

Data quality can be greatly enhanced by formal training and testing of data collection procedures. These steps may initially require extra time and resources, but will result in enhanced credibility of results and buy-in from stakeholders in the long run.


Examples of Approaches to Assessing Data Quality

Automated edit checks. The use of data collection software allows for built-in edits to promote data integrity. This is a benefit of using an automated approach to collect data because some errors can be found and corrected at the point of data entry. Also, missing data can be minimized by software prompts alerting the user of needed responses.

Periodic reabstraction by a person other than the usual data collector for a sample group of patient records. This approach is commonly used for manually abstracted data on a monthly or quarterly basis. A certain number or percentage of patient records are pulled at random, and someone reabstracts the same data by using the same data collection tool as the original data collector to determine an error rate. This method is referred to as "interrater or interobserver reliability." Frequently occurring errors are investigated to determine possible causes, and actions are taken to prevent the errors from recurring.

Duplicate data entry. Companies that specialize in survey research often require all or a subset of data to be entered twice (by two people or one) to assess and ensure accuracy in data entry.

Run aggregate frequencies and means on individual data elements and rates. The first step in data analysis is most useful to identify data quality problems. Typically, one uses database software (such as Microsoft Excel or Microsoft Access [Microsoft Corporation, Redmond, WA]) to identify cases with missing values as well as outlier values (e.g., ages over 200 years, excessive lengths of stay). One then needs to follow up with original data sources to confirm or correct inaccurate information.

Adapted from reference 47.
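As an illustration of two of the approaches described in the box above (running aggregate frequencies to flag missing and outlier values, and checking a reabstracted sample for interrater agreement), a minimal sketch is shown below. It assumes the abstracted data are in CSV files; the file names and column names are hypothetical, not part of any published tool.

```python
# Minimal sketch of two data quality checks, assuming abstracted records are stored
# in CSV files. File names and column names below are hypothetical examples.
import pandas as pd

# 1) Aggregate frequencies, missing values, and simple outlier screens.
records = pd.read_csv("pain_audit_abstraction.csv")       # hypothetical file
print(records[["age", "pain_score"]].describe())           # counts, means, ranges
print(records.isna().sum())                                 # missing values per column
outliers = records[(records["age"] > 120) | (records["pain_score"] > 10)]
print(f"{len(outliers)} records flagged for follow-up with the original data source")

# 2) Reabstraction check: percent agreement between original and second abstractor.
original = pd.read_csv("original_sample.csv", index_col="record_id")      # hypothetical
reabstracted = pd.read_csv("reabstracted_sample.csv", index_col="record_id")
shared = original.index.intersection(reabstracted.index)
agreement = (original.loc[shared, "pain_score"] == reabstracted.loc[shared, "pain_score"]).mean()
print(f"Interrater agreement on pain_score: {agreement:.1%}")
```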


Early in the process, it is important to determine how you will monitor the quality of your data. Remember that monitoring data quality needs to be done at regular intervals, determined by the length of the project. Examples of different approaches are provided in the box "Examples of Approaches to Assessing Data Quality."

2. Data Quantity

A second consideration is the quantity (amount) of data you will need. Quantity is affected by both the total amount of data needed and the frequency with which data are collected. The amount of data needed depends on the type of analysis you plan to do and the level of confidence you want to have in your results. If you plan to use sophisticated statistical analyses and/or intend to generalize your findings from a sample to the population, you may want to use a power analysis formula to calculate the necessary sample size. See Scheaffer et al.51 and Fowler52 for more information on sample size issues. Statisticians frequently recommend a minimum of 30 cases in each group for analysis. It is always wise to consult with the experts on sample size issues.
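For readers who want a sense of what such a calculation looks like, the sketch below uses one common textbook formula (sample size for estimating a proportion with a chosen margin of error, with a finite population correction). It is offered only as an illustration; the 95% confidence level, assumed 50% proportion, 5% margin of error, and population of 300 are hypothetical choices, and project teams should still consult a statistician as the text advises.

```python
# Minimal sketch of one common sample-size calculation (estimating a proportion with a
# chosen margin of error). All inputs below are hypothetical illustration values.
import math

def sample_size_for_proportion(population, margin=0.05, confidence=0.95, p=0.5):
    """Approximate sample size for a proportion, with finite population correction."""
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]   # common z values
    n0 = (z ** 2) * p * (1 - p) / margin ** 2                  # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))         # adjust for a finite population

print(sample_size_for_proportion(population=300))  # roughly 169 records for 300 discharges
```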

If you cannot collect data on all patients in the population of interest, you can use sampling techniques to reduce the data collection effort. One example of a simple method for selecting a systematic random sample is provided in Managing Performance Measurement Data in Health Care.47 The steps are as follows: 1) obtain a list of patients in the order in which they were treated, 2) count the number of patients on the list, and 3) divide by the number needed for the sample size. The resulting number will be the interval between one patient on the list and the next patient on the list who is to be selected for the sample. In other words, if the list has 300 patients and the needed sample size is 50 cases, every sixth (300/50 = 6) patient record would be selected for data collection. To make sure each patient has an equal chance of being selected, it is helpful to pick the starting point randomly (using a random numbers table or a die). Some commercial software programs such as Microsoft Excel include a random sampling procedure. Selected approaches to sampling are shown in the box "Examples of Approaches for Selecting Samples."
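The same three steps can be expressed in a few lines of code. The sketch below is a minimal illustration of the systematic random sampling procedure just described; the patient list and identifiers are hypothetical.

```python
# Minimal sketch of systematic random sampling: list patients in treatment order,
# compute the interval, pick a random starting point, then take every k-th record.
# The patient list and identifiers below are hypothetical examples.
import random

def systematic_sample(patients, sample_size):
    interval = len(patients) // sample_size          # e.g., 300 / 50 = every 6th record
    start = random.randrange(interval)               # random start (die or random-number table)
    return patients[start::interval][:sample_size]

patients = [f"MRN-{i:04d}" for i in range(1, 301)]   # 300 patients in order of treatment
sample = systematic_sample(patients, 50)
print(len(sample), sample[:3])
```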


Examples of Approaches for Selecting Samples

Simple random sampling is a process in which a predetermined number of cases from a population as a whole are selected for review. It is predicated on the idea that each case in the population has an equal probability of being included in the sample.

Systematic random sampling is a process in which one case is selected randomly, and the next cases are selected according to a fixed interval (for example, every fifth patient who undergoes a certain procedure).

Stratified sampling is a two-step process. First, the population is stratified into groups (e.g., male/female); second, a simple random sample is taken from each group.

Cluster sampling is a process in which the population is divided into groups; then some of the groups are selected to be sampled.

Judgment sampling is a process in which experts in the subject matter select certain cases to be sampled. Unlike the previously mentioned "probability" sampling techniques, this form of sampling is considered a "nonprobability sample." It is likely that the sample group will not represent the population's characteristics. However, the experts selecting the cases may be trying to change a particular process.

Adapted from reference 47.


B. Collecting Data With Established Assessment Instruments

Organizations should use structured data collection tools, which can be paper forms and/or electronic data entry screens. They may choose to develop these forms or programs themselves, or adopt or adapt one developed by others. The result is a wide array of materials ranging from a simple chart abstraction form to a sophisticated clinical assessment product. The following definitions are provided to differentiate between various types of tools.

Data collection tool: A user-friendly composite of indicators, trigger questions, or statements aimed at eliciting performance data about specific issues of concern.53 Two examples of this type of tool are the Medical Record Audit Form54 and the Pain Audit Tool.31

Quality improvement tools (performance improvement tools): A collection of tested activities and nonstatistical and statistical methods designed to facilitate the process of improvement. An example of one of these tools is the cause-and-effect diagram, also called a fishbone or Ishikawa diagram. Quality improvement tools are further described in Section VI (Assessing and Analyzing Your Processes and Results).

Patient assessment instruments: Tested data collection tools designed to obtain structured information about patient health status and levels of functioning. Examples of clinical assessment instruments include the McGill Pain Questionnaire,55 the Brief Pain Inventory,56,57 and the SF-36 Health Survey.57a The widely used SF-36, which can be self-administered by persons over age 14 years or administered by trained interviewers, measures the following eight health concepts:

■ Limitations in physical activities because of health problems
■ Limitations in usual role activities because of physical health problems
■ Bodily pain
■ General health perceptions
■ Vitality (energy and fatigue)
■ Limitations in social activities because of physical or emotional problems
■ Limitations in usual role activities because of emotional problems
■ Mental health (psychological distress and well-being)

Permission to use the SF-36 Health Survey can be obtained from the Medical Outcomes Trust at www.outcomes-trust.org.

It is useful to consider several issues when selecting clinical assessment instruments developed by others. The Medical Outcomes Trust has identified eight attributes for evaluating existing health status instruments (Table 5).58 These attributes could be useful when considering the selection and use of pain assessment instruments developed by others.

One should determine whether and how the instrument has been evaluated for these attributes. Do the results indicate the instrument can be applied successfully to your organization's patient population? Learning about the development and testing of an instrument will help ensure that it will support collection of data that are meaningful and meet your performance improvement goals.

1. Pain-Specific Instruments

Specific instruments have been developed by researchers to assess for the presence of pain, the intensity of pain being experienced, and the impact of pain on the individual's activities of daily living. In general, these instruments may be grouped into those that measure a single dimension (e.g., pain intensity) and those that measure multiple dimensions (e.g., pain intensity, duration, frequency, history, and impact on activities of daily living). Some examples of common unidimensional and multidimensional instruments are presented in Table 6. Additional information also is available in the companion monograph, Pain: Current Understanding of Assessment, Management, and Treatments, Section II, Assessment of Pain,9 and later in this section within specific measurement approach descriptions.

2. Choosing Pain Instruments

Considerations when selecting instruments for measuring pain include:

■ The type of pain to be evaluated.


■ The measurement goals. Are you assessing multidimensional aspects of pain?
■ Characteristics of the patient population (e.g., language, cognition, age, cultural factors).
■ Resources required/ease of use. Consider the comfort of staff members with using the tool, the ability of patients to use the tool, the time needed to administer the tool, appropriateness, and interpretability.27,47
■ Reliability. A pain rating scale should be reliable, meaning that it consistently measures pain intensity from one time to the next.27
■ Validity. There are a number of different types of validity (e.g., construct and criterion). The validity of assessment instruments has been defined as the degree to which the instrument measures what it purports to measure.58 A valid pain rating scale has been described by McCaffery and Pasero as "one that accurately measures pain intensity."27
■ Sensitivity to small fluctuations in what is being scaled.64


Table 5. Scientific Review Criteria for Health Status Instruments

Source: Lohr KN, Aaronson NK, Alonso J, et al. Evaluating quality-of-life and health status instruments: development of scientific review criteria. Clinical Therapeutics 1996;18:979-986. Used with permission.

Conceptual and measurement model: A conceptual model is a rationale for and description of the concept(s) that the measure is intended to assess and the relationship between those concepts. A measurement model is defined as an instrument's scale and subscale structure and the procedures followed to create scale and subscale scores.

Reliability: The principal definition of test reliability is the degree to which an instrument is free from random error. A second definition of reliability is reproducibility or stability of an instrument over time (test-retest) and interrater agreement at one point in time.

Validity: The validity of an instrument is defined as the degree to which the instrument measures what it purports to measure. There are three ways of accumulating evidence for the validity of an instrument: 1) content-related: evidence that the content domain of an instrument is appropriate relative to its intended use; 2) construct-related: evidence that supports a proposed interpretation of scores on the instrument based on theoretical implications associated with the constructs; and 3) criterion-related: evidence that shows the extent to which scores of the instrument are related to a criterion measure.

Responsiveness: Sometimes referred to as sensitivity to change, responsiveness is viewed as an important part of the construct validation process. Responsiveness refers to an instrument's ability to detect change, often defined as the minimal change considered to be important by the persons with the health condition, their significant others, or their providers.

Interpretability: Interpretability is defined as the degree to which one can assign qualitative meaning to an instrument's quantitative scores. Interpretability of a measure is facilitated by information that translates a quantitative score or change in scores to a qualitative category that has clinical or commonly understood meaning.

Respondent and administrative burden: Respondent burden is defined as the time, energy, and other demands placed on those to whom the instrument is administered. Administrative burden is defined as the demands placed on those who administer the instrument.

Alternative forms: Alternative forms of an instrument include all modes of administration other than the original source instrument. Depending on the nature of the original source instrument, alternative forms can include self-administered self-report, interviewer-administered self-report, trained observer rating, computer-assisted self-report, computer-assisted interviewer-administered report, and performance-based measures.

Cultural and language adaptations: The cross-cultural adaptation of an instrument involves two primary steps: 1) assessment of conceptual and linguistic equivalence and 2) evaluation of measurement properties.


Each instrument has advantages and disadvantages that should be understood and considered. Tables 17 and 18 in the companion monograph, Pain: Current Understanding of Assessment, Management, and Treatments, highlight some of the advantages and disadvantages associated with each tool.9

Psychometric properties for selected instru-ments have been described in the literature.For example, Serlin et al.65 explored the rela-tionship between numerical ratings of pain

severity and ratings of the pain’s interferencewith functional activities (activity, mood,and sleep) for patients with cancer pain.Using a 0 to 10 scale, they found three dis-tinct levels of pain severity. Specifically, theydetermined that pain ratings of 1-4 corre-sponded with mild pain, 5-6 indicated mod-erate pain, and 7-10 signaled severe pain inthe cancer patients surveyed.
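Where such cut points are adopted locally, summarizing a set of 0-10 ratings into these bands is straightforward to script. The following Python sketch is illustrative only: it simply applies the Serlin et al. cut points, and the treatment of a rating of 0 as "none" and the function name are assumptions made for the example, not part of the cited study.

```python
def severity_category(rating: int) -> str:
    """Map a 0-10 numeric pain rating to the severity bands reported by
    Serlin et al. (1-4 mild, 5-6 moderate, 7-10 severe); 0 is labeled "none"."""
    if not 0 <= rating <= 10:
        raise ValueError("rating must be between 0 and 10")
    if rating == 0:
        return "none"
    if rating <= 4:
        return "mild"
    if rating <= 6:
        return "moderate"
    return "severe"

# Example: summarize a small set of ratings collected on one unit
ratings = [2, 7, 5, 0, 9, 4, 6]
for r in ratings:
    print(r, severity_category(r))
```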

One organization empirically evaluated three pain scales in an effort to select a scale for use in their improvement initiative.66 Three research questions were posed: 1) Is one of the scales easier for most patients to use? 2) Is the choice of scales influenced by nursing unit, age, education, race, socioeconomic status, diagnosis, or type of pain experienced? 3) Do patients perceive that a rating scale helps them describe their pain more effectively? Three scales (a visual analog scale, a numeric rating scale, and a faces scale) were presented to patients in the study, who then completed survey questions about using the scales. The results revealed that the faces scale was selected most often by patients, followed by the numeric rating scale and then the visual analog scale.

Table 6. Examples of Unidimensional and Multidimensional Pain-Specific Instruments

Sources: Adapted from references 27, 55, 56, and 59-63.

Numeric rating scale (unidimensional): A scaling procedure in which subjects use numbers from 0 (no pain) to 10 (worst possible pain) to describe their pain; numbers may be presented on a vertical or horizontal line, may include words and numbers, or may be administered verbally.27

Visual analog scale (unidimensional): A scaling procedure used to measure a variety of clinical symptoms (e.g., pain, fatigue) by having subjects indicate on a straight line (usually 10 cm in length) the intensity of the attribute being measured.59

Faces scales (unidimensional): Scales depicting a series of faces with expressions representing increasing levels of discomfort; examples include the Oucher60 and the Wong-Baker Faces Scale.61,62

Brief Pain Inventory (BPI) (multidimensional): A series of questions covering the intensity and location of pain plus the history, impact on activities of daily living, and treatments.56

Initial Pain Assessment Tool (multidimensional): A tool for the clinician to use in conducting a pain assessment; areas covered include location, intensity, quality of pain, onset, duration, variations and rhythms, manner of expressing pain, what causes and relieves pain, the effects of pain, and other comments.27

McGill Pain Questionnaire (MPQ) (multidimensional): An assessment of pain from three dimensions: sensory, affective, and evaluative.55



C. Selecting a Measurement Approach

When considering which measurement approach to use, there are certain questions to keep in mind. These include:

■ Does the method match/support the measurement objective? For example, use of a point prevalence study to measure events or processes undergoing rapid change and with great variability may not yield sufficient or meaningful data.

■ Which approach will provide the most reliable and highest quality data and the least bias and subjectivity; which will minimize missing data?

■ What are the available data sources? One should assess for data availability, but do not let data alone determine your measurement goals.67 In many organizations, especially large ones, data may be collected but kept within individual departments and therefore cannot be routinely shared with others, or data may be collected in more than one place. For example, data regarding pain medication administration may be documented in an automated pharmacy information system as well as in the individual medical record.

■ What is the most economical option that fulfills measurement requirements? Consider the data collection burden (e.g., time to collect the data, the staff needed, access to/use of automated information systems) against the benefits derived. For example, could data retrieved from an automated administrative database (e.g., pharmacy ordering) be used to collect indicator data, or is medical record review required?

■ Will the measurement strategies selected be sensitive enough to capture changes that occur as a result of the intervention, and support assessment of the impact of the intervention?44

The following sections outline common measurement approaches used in health care. However, these approaches are not the only options available. A description of the method, suggested applications, and resource considerations is provided for each method discussed.

1. Conduct an Organizational Self-assessment

a. Definition/description of method
This method is used to determine an organization's status in relation to targeted areas of performance. It involves the collection of organizational data using a comprehensive structured instrument that captures key structural and process elements associated with performance across the organization. These include information about the structures that support quality pain management such as written mission and policy statements, staff qualifications/credentials, staffing patterns, forms for documentation, capital resources, information management systems, and organizational management style.

b. Applications
Organizational self-assessment is critical in establishing a baseline of the organization's performance in pain management and identifying potential opportunities for improvement based on objective data. It also can be used over time to conduct ongoing evaluations of the organization's performance.

c. Resource considerations
Conducting an organizational assessment will require staff time and use of a developed tool. It is beneficial to involve multiple people in the process to obtain input from several perspectives.

Two examples of organizational assessment tools specific to pain management are the Institutional Needs Assessment Tool68 and a checklist to assess institutional structures that will support efforts to improve pain relief.31 There are also organizational self-assessment tools and worksheets designed to help focus quality improvement activities.


Examples of such tools in JCAHO's A Guide to Performance Measurement for Hospitals53 include Assessment of Stakeholders Outcomes, Data/Information Needs, Assessing Organization Commitment to Performance Measurement and Improvement, and Determining Areas for Organization Improvement.

2. Review Medical Records

a. Definition/description of method
Medical record review (also known as an audit) is the process of ensuring that medical records properly document patient care. The process can take place either while care is being provided (referred to as open or concurrent review) or after the patient has been discharged from the health care organization (referred to as closed or retrospective review).69

Medical records are reviewed with a structured review tool for completeness and timeliness of information, and the presence of specific data is authenticated. In relation to pain management, a record audit can capture critical pieces of information about pain care such as:

■ The documentation of pain assessments and pain intensity ratings.
■ The timing and types of interventions provided.
■ Patient reassessment and response to interventions.
■ Patient education for symptom management.
■ Evidence of discharge planning, which includes pain management instructions.

This review often is completed on a representative sample of records (as discussed in Section V.A.2 [Data Quantity]).69
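Selecting the sample itself can be automated with a few lines of code. The Python sketch below is a minimal illustration, assuming only that a list of eligible record identifiers is available; the identifiers, sample size, and function name are placeholders, not part of any cited audit tool.

```python
import random

def sample_records(record_ids, sample_size, seed=None):
    """Return a simple random sample of record identifiers for chart audit."""
    if sample_size > len(record_ids):
        raise ValueError("sample size exceeds number of eligible records")
    rng = random.Random(seed)
    return rng.sample(record_ids, sample_size)

# Example: pick 30 charts from all eligible discharges in the review period
eligible = [f"MR-{n:05d}" for n in range(1, 501)]
audit_sample = sample_records(eligible, 30, seed=42)
print(audit_sample[:5])
```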

b. Applications
The medical record review can be useful at several stages of the improvement initiative, including determination of current practices at your organization, identification of areas for focused review and improvement opportunities, measurement of baseline performance before implementing an intervention, and measurement of performance change against the baseline.

c. Resource considerations
Medical record review generally involves staff time and requires the use of a defined data collection tool. Hard copy records will require manual review by a staff member. When the medical record is computerized, this can be an automated process.

A few examples of medical record audit tools specific to pain management are the Medical Record Pain Management Audit,27 the Pain Management Audit Tool,70 the Patient Record Pain Assessment Tool,71 and the Patient Record Pain Management Assessment Tool.72

Organizational Example

Conduct Organizational Self-assessment

Completion of an organizational self-assessment was an important part of the pain improvement initiative at Memorial Hospital of Sheridan County in Sheridan, Wyoming. Members of a team assembled to evaluate pain management practice and identify opportunities for improvement utilized the Institutional Needs Assessment data collection tool68 to complete this important assessment. Input from many people was included to obtain a complete and accurate picture based on information and responses from across the organization. The assessment provided a frame of reference for initiating organizational pain management improvement efforts, and over time it has served as a blueprint for identifying and selecting opportunities for improvement. For additional information on the improvement activities in this hospital, see Section IX, Examples of Organizations Implementing Pain Management-Related Measurement and Improvement Initiatives.


3. Test Knowledge and Attitudes

a. Definition/description of method
This method uses a standardized instrument to assess knowledge, attitudes, behaviors, or perceptions of a targeted group. An instrument is administered before an intervention to determine learning needs and after an intervention to assess change in knowledge, attitudes, or behaviors (pretesting/posttesting).
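When pretest and posttest surveys are scored as the percentage of respondents answering each item correctly, the change per question can be tabulated simply. The Python sketch below is a hypothetical illustration; the question identifiers and values are invented for the example and do not come from any specific instrument.

```python
def compare_pre_post(pretest: dict, posttest: dict) -> dict:
    """Return the change in percent-correct for each question that
    appears in both the pretest and posttest summaries."""
    return {
        q: round(posttest[q] - pretest[q], 1)
        for q in pretest
        if q in posttest
    }

# Example: aggregate percent-correct by question (illustrative values only)
pretest = {"Q1": 62.0, "Q2": 48.5, "Q3": 71.0}
posttest = {"Q1": 81.0, "Q2": 55.0, "Q3": 70.0}
for question, change in compare_pre_post(pretest, posttest).items():
    print(question, f"{change:+.1f} percentage points")
```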

b. Applications
This method is very useful as part of initial and ongoing assessment of organizational performance in pain management. It is important to assess knowledge, attitudes, and perceptions of patients as well as staff.

c. Resource considerations
The primary resource required for this method is time for administration of the assessment or testing instrument. A few examples of knowledge and attitude assessment instruments specific to pain are RN Needs Assessment: Pain Management,73 the Pain Knowledge and Attitude Survey,27 the Pain Management Patient Questionnaire,27 and the Patient Outcome Questionnaire.18

4. Directly Observe the Care

a. Definition/description of method
This method involves watching and noting the behavior of and/or interactions between caregivers and patients/customers.29


Organizational Example

Review Medical Records

The review of medical records has been a staple in the assessment of pain management practice at Advocate Christ Medical Center in Oak Lawn, Illinois. Multiple aspects of pain-related care processes and outcomes are tracked through chart review for general and specialty populations (e.g., sickle cell patients, orthopedic patients). Early in 2002, a clinical nurse specialist for pain management and the hospital-wide Pain Management Committee pondered the issue of how well physicians were documenting pain information on admission assessments. They decided to obtain baseline data and enact measures to heighten physician awareness and improve documentation practices.

Developing a tool for data abstraction was the first step. Initially, the tool was designed to capture only the existence of pain assessment documentation by the physician on admission or within the first 24 hours. However, after pilot-testing, it was revised to determine the documentation of pain characteristics (e.g., intensity, duration, location) and use of a numeric pain scale. A third and final change to the tool was the addition of the numeric pain rating recorded in the nurse's admission pain assessment.

Chart review for baseline data began in March 2002. Data from 25 to 30 charts in three selected units are abstracted monthly by the clinical nurse specialist for pain management and two other clinical nurse specialists. Currently, charts are being selected from the medical, surgical, and orthopedic units. Over time, the review will be rotated to other units.

In June 2002, a new intervention was added to the ongoing medical staff education activities. A "business card" on bright yellow paper was inserted in a plastic holder on the front of each patient chart. It displays the "no pain" sign and poses the following questions:
■ Do you know the score (0-10)?
■ Did you document it?

To assess the impact of the card reminder system, data from charts reviewed after implementation will be compared with data from charts reviewed at baseline. Through continued record review, the committee hopes to see a trend toward increased documentation of initial pain assessment information and use of pain scales by physicians.


Typically, a process of care performed by a clinician is observed by a specially trained third party, who objectively notes each activity using structured methods of observation and documentation.

b. Applications
Observational measurement can be useful for circumstances where the subject is unaware of, unable to report, or hesitant to report the events and activities of interest. For example, it can be used for observing professionals in the course of performing routine care processes such as pain assessment and medication administration. It is best used when the observer is unobtrusive and the events being observed are routine and familiar. It is not considered a good choice for unpredictable events/behaviors or events that are of long duration.75

c. Resource considerations
This method will require observers with appropriate training and qualifications. It is time intensive and therefore costly. Also, the subject of the observation may alter his or her behavior due to the presence of the observer.76 However, this method offers unique advantages over other approaches in that it allows researchers to assess the quality of interactions and qualitative factors that cannot be captured in documentation.75


Organizational Example

Test Knowledge and Attitudes

When Swedish Medical Center in Seattle, Washington, began to develop organization-wide interventions to improve pain management, testing staff knowledge and attitudes was a key activity. They selected an established instrument, developed by McCaffery and Ferrell,74 and obtained permission for use. It was distributed to all nursing staff, and 497 completed surveys were returned (pretest). Survey responses were anonymous, but department identification was collected. The results were reviewed carefully, and seven questions were selected as focus areas for improvement activities. Over the next year and a half, multiple interventions were conducted in these focus areas. These interventions ranged from patient and staff education to the development of a care protocol and documentation tool (see Examples of Compliance: Pain Assessment and Management49). As part of the organization's educational plan, a 1-hour staff in-service entitled the "Hour of Pain" was developed. It took about 1 year to schedule all staff to attend.

The knowledge and attitude survey was administered a second time after all staff had completed the educational program. A total of 430 surveys were returned (posttest). To measure change, aggregate results were compared with those from the first survey. Answers revealed improvement in six of the seven focus areas targeted in the educational program, with significant improvement for the question "Who is the most accurate judge of the patient's pain?"

Testing for knowledge and attitudes also has been done by individual units to assess special needs and customize interventions. Most units have approximately 80 nurses, and a return rate of at least 38% (30 out of 80 surveys) is targeted. The surveys are used to determine the top five questions answered incorrectly, which become the focus of further scrutiny and possibly improvement interventions.

The manager of the Swedish Pain Center noted that measuring knowledge and attitudes has been an effective tool for changing behavior. Following the guiding principle of "keeping it as simple as possible" has led to results that are understandable and have effectively supported improvements in pain management practice.


Examples of applications of the observation approach related to pain management include observation of clinicians conducting pain assessments and observation of the delivery of nonpharmacologic therapies.

5. Conduct a Point Prevalence Study

a. Definition/description of method
Point prevalence is a static measure of a rate for an event of interest at a designated point in time, or a census-type measure.78,79 A point prevalence study involves use of a scientific approach to collect data for calculating a point prevalence rate. Data are collected at designated intervals (e.g., quarterly or annually) on the occurrence of a specific event or outcome of interest for a defined period of time such as 1 week or 1 month.
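As a concrete illustration of the arithmetic (not a prescribed method), a point prevalence rate is simply the number of cases observed at the designated point in time divided by the population assessed at that same point. The numbers in the Python sketch below are hypothetical.

```python
def point_prevalence(cases: int, population: int) -> float:
    """Point prevalence = cases observed at the designated point in time
    divided by the population assessed at that same point."""
    if population <= 0:
        raise ValueError("population must be positive")
    return cases / population

# Example: on the audit day, 34 of 120 assessed patients reported a
# current pain score of 4 or greater (illustrative numbers only)
rate = point_prevalence(34, 120)
print(f"Point prevalence: {rate:.1%}")
```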

b. Applications
This method has wide applicability and can be helpful in reducing the burden of data collection, especially when the cost of continuous data collection exceeds the benefit. Alternatively, when targeted goals of performance have been achieved, it can be used for ongoing monitoring to ensure that performance is not deteriorating. For example, if a performance improvement team has demonstrated 100% compliance with an improvement initiative to document an initial pain assessment for each patient, the team may decide to monitor performance using a sample of charts at regular intervals (e.g., 1 day per month) rather than reviewing all charts.


Organizational Example

Directly Observe the Care

The Orthopedic Department of Mercy Medical Center-North Iowa, Mason City, Iowa, has incorporated direct observation of care into annual competency assessment related to pain management. Clinical staff and managers worked together to develop a standardized observation tool that reflects important components from the hospital's pain management policy, the pain management flow sheet, and the University of Iowa guideline on acute pain management in the elderly.77 The tool focuses on pain assessment and the planning, implementation, and evaluation of interventions. The tool is further refined for different staff positions (e.g., registered nurse, licensed practical nurse, nursing assistant).

Observers use cognitive, technical, and interpersonal skills to assess competency in pain management. These observers are fellow co-workers who have undergone additional training related to pain management, how to conduct an observation, and how to use the documentation tool. These individuals are recognized for their skills and knowledge related to orthopedic surgical pain, their ability to teach, and their nonthreatening approach. The pain management competency observation is conducted before the staff member's annual review and is documented on his or her evaluation.

As part of the direct observation process, the observer and staff member review the written tool before the observed patient interaction. They introduce themselves to the patient, and the observer quietly observes as care is provided. Special actions taken by the clinician to address age- or patient-specific needs are noted in a section of the observation tool concerned with special patient considerations. Staff performance in measured competencies is checked as acceptable or unacceptable, and individualized comments are added. If areas for improvement are identified, the observer and clinician identify an action plan and target date for reassessment. The rich complexity of each patient situation provides a great environment for learning, and the observation process supports the underlying goal of providing an individualized educational experience that is positive rather than punitive.


c. Resource considerations
Data collection may be automated or require manual review depending on the data source. These considerations, as in all methods, will determine the type of resources necessary. A sound method for determining the interval and time frame for data collection should be used to ensure that measurement objectives are met.

6. Assess Patient Status and Outcome Over Time

a. Definition/description of method
This method involves the repeated measurement of strategic patient factors and characteristics with established assessment instruments in an effort to demonstrate change over time.
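Where repeated measurements are stored electronically, change from baseline can be summarized with minimal code. The Python sketch below is illustrative only; it assumes each patient's scores are listed in chronological order, and the data and names are hypothetical.

```python
from statistics import mean

def mean_change_from_baseline(series_by_patient: dict) -> float:
    """Average difference between each patient's latest score and the
    first (baseline) score; negative values indicate improvement."""
    changes = [
        scores[-1] - scores[0]
        for scores in series_by_patient.values()
        if len(scores) >= 2
    ]
    return mean(changes)

# Example: worst-pain ratings recorded at baseline and two follow-ups
scores = {
    "patient_a": [8, 6, 4],
    "patient_b": [7, 7, 5],
    "patient_c": [5, 3, 3],
}
print(f"Mean change from baseline: {mean_change_from_baseline(scores):+.1f}")
```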

b. Applications
This essential but underutilized approach provides critical information about various aspects of patient status and outcomes. Measurement for pain management can focus on areas of patient functioning and experience such as pain intensity, satisfaction, functional status, treatment response, and knowledge and attitudes.

i. Pain intensity/severity
It has been suggested that measuring pain severity is one of the most important indicators of pain management.80 This outcome usually is measured with pain assessment instruments. These include the pain rating scales and pain assessment instruments previously discussed in Section V.B.1 (Pain-Specific Instruments). Additional information about pain intensity may be available from treatment administration records (pharmacologic and nonpharmacologic) and patient surveys.

ii. Patient satisfaction
This information can provide valuable insight into the quality of care as viewed by the recipient or family caregiver and may identify opportunities for improvement.


Organizational Example

Conduct a Point Prevalence Study

Since 1971, the Office of Statewide Health Planning and Development in California has collected and disseminated important financial and clinical data from licensed health facilities (see www.oshpd.ca.gov). Staff at the University of California Medical Center in San Diego have expanded the mandated data collection effort to include data obtained from pain assessment interviews and chart audits conducted on a single day each quarter. Working in pairs, members of the Nursing Quality Council gather information about all available patients on each designated unit through concurrent patient interview and chart abstraction.

Although these data are maintained for internal use, the organization has built onto an existing data collection mechanism to provide quarterly performance information that can be used to discern trends and track the effectiveness of quality improvement actions regarding pain management over time. For example, in two data collection efforts, the charts of 433 patients were reviewed. It was found that one organization-specified documentation standard was met in only 39% of charts. In their discussion of results, the Nurse Practice Council identified a structural barrier related to the forms used in certain units, which could easily be corrected to improve this score. Of the 217 patients interviewed during the two collection periods, 92% reported that a pain relief action was taken within 20 minutes of their report of pain. Communicating this result to staff provided positive feedback on their responsiveness to patient reports of pain, which was rewarding and satisfying.


Organizational Example

Assess Patient Status and Outcome Over Time

Since 1995, the Kaiser Permanente Northwest Pain Management Program has worked to change attitudes and beliefs about pain while effectively treating plan members who are suffering. The program is built around four components: 1) a multidisciplinary Pain Board to set policy and oversee the program; 2) Multidisciplinary Pain Group visits that support primary care practitioners by providing assessment information, patient education, and therapeutic options and recommendations in the primary care setting; 3) a Multidisciplinary Pain Clinic for tertiary care and complex interventions; and 4) communication systems that offer peer consultation and mentoring via the electronic medical record and telephone.

Together with program development, the Pain Board designed a structured evaluation of the program's effectiveness based on measuring patient status and outcomes over time. Originally they tracked three pain variables (pain now, least pain in the past week, worst pain in the past week); seven functional variables (interference with general activity, work, walking, mood, sleep, relationships with others, enjoyment of life); two satisfaction variables (satisfaction with the efforts of your caregivers to relieve your pain and satisfaction with the outcomes of treatment); and six utilization variables (mental health visits, other outpatient visits, inpatient days, prescriptions filled, laboratory tests, imaging studies).

The pain, function, and satisfaction data were collected by a mail survey conducted 6 months after each series of patient visits. The utilization data were routinely recorded in the electronic medical record and were analyzed annually for the 12 months before and the 12 months after the series of visits by the Pain Group. Analysis of data for the first 5 years indicated that 5 of the original variables were the most sensitive indicators: pain now, interference with sleep, satisfaction with efforts and effects, utilization of outpatient visits, and prescriptions filled.

Through the evaluation, Kaiser Permanente Northwest was able to document statistically significant decreases in the percentage of patients with self-reported pain scores greater than 5 (on a scale of 1 to 10), interference-with-activity scores greater than 5, and interference-with-mood scores greater than 5. Interestingly, they also found that as patients became more knowledgeable about pain management, satisfaction at baseline decreased over time (which is consistent with reports in the literature) and satisfaction after involvement in the program increased over time. The data also showed that baseline pain levels among patients referred to the program decreased over 5 years, providing evidence that the educational benefits of the program have extended beyond those patients in the program and have led to improvements in primary care practice. The number of different clinicians referring to the program has risen to include 75% of all primary care physicians. Finally, utilization statistics show specific cost savings related to fewer hospitalizations among patients with chronic pain. After the Multidisciplinary Pain Group visits, 16% of treated patients were admitted to the hospital in the subsequent year, compared with the admission rate of 47% to 55% reported in the literature83,84; and the number of prescriptions for pain medicine and other drugs decreased 30%, in contrast to a national increase of 33%.85


It is important to remember that satisfaction and pain relief are not synonymous. Patient satisfaction with pain management often is high, even in the presence of moderate to severe pain. The measurement of satisfaction is complicated by several factors that can influence satisfaction ratings. It has been hypothesized that as patients' expectations for pain relief are raised, their satisfaction levels initially may decrease.81 Dawson et al.41 explored the paradox of patients who are in pain yet express satisfaction with their pain management. Factors found to be predictive of satisfaction included the patient's satisfaction with overall pain management, whether their pain over the last year went down and stayed down, and whether the primary provider told the patient that treating pain was an important goal. These researchers also found that the patients' expectations about pain and pain relief were significantly related to their level of satisfaction. Thus, it is important to view satisfaction ratings in conjunction with other organizational and patient variables. Sources include proprietary tools from companies specializing in patient satisfaction measurement.

iii. Functional status
Functional status refers to the evaluation of the ability of an individual to perform activities of daily living such as walking, bathing, and dressing. Sometimes these elements are incorporated into pain-related assessment instruments, or there are specific functional assessment instruments.

iv. Multiple dimensions of functioning
These instruments incorporate pain assessment into tools designed to obtain an overall assessment of health status. Examples of such tools include the Resident Assessment Instrument used in the MDS, which is administered to patients in long-term care facilities, and the Outcome and Assessment Information Set (OASIS), which is administered to Medicare patients receiving home care. A recent study found that a scale based on the MDS (version 2.0) pain items was highly predictive of scores measured by the "gold standard" visual analog scale.82

v. Knowledge and attitudes related to pain management
Testing the knowledge and attitudes of patients can provide significant insight into behavior and potential barriers to pain management. More details about the instruments designed to measure knowledge or attitudes about pain management among patients are provided in Section V.C.3 (Test Knowledge and Attitudes).

c. Resource considerations
Resources often will be determined by the complexity of the instruments used. Analysis of changes in outcome data over time may require statistical support.

7. Collect Indicator Data

a. Definition/description of method
An indicator (also known as a performance measure) is a quantitative tool that provides an indication of performance in relation to a specified process or outcome.53 Table 7 presents examples of some indicators that have been developed by various organizations to measure aspects of pain management. A recent collaborative effort between the Joint Commission, the American Medical Association, and the National Committee for Quality Assurance to develop pain-related performance measures began in September 2001.

Developing a clinical measure can be a complex and difficult task.50 Indicators, like guidelines and instruments, should reflect certain attributes to make them credible and effective. Characteristics or attributes of good indicators have been defined by multiple organizations experienced in health care measurement.67 See Table 8 for some suggested desirable attributes against which indicators should be reviewed.

In general, a good indicator should raise important questions about the processes and outcomes of pain management, such as assessment, documentation, and treatment selection and response, as well as identifying opportunities for improvement.67


Table 7. Examples of Indicator Development for Pain Management(a)

Developer/Reference: Assessing the Care of Vulnerable Elders (ACOVE) Project, RAND Health, Santa Monica, California. Chodosh J, Ferrell BA, Shekelle PG, et al. Quality indicators for pain management in vulnerable elders. Ann Intern Med. 2001;135(8 pt 2):731-738. www.annals.org/issues/V135n8s/full/200110161-00004.html
Population: Vulnerable Elders (defined as individuals 65 and older meeting project-specified criteria)
Indicator focus, by topic area:
■ Prevention/Screening: initial patient evaluation; screening for chronic pain; nursing home residents experiencing a change in condition; screening for pain; initial evaluation of cognitively impaired patients
■ Diagnosis: evaluation for depression in patients with significant chronic pain; evaluating changes in pain patterns; diagnostic evaluation of patients with significant chronic pain conditions; therapy for underlying conditions responsible for significant chronic pain; quantitative pain assessment using standard pain scales in patients with chronic pain
■ Treatment/management: assessment of pain in hospitalized patients with significant chronic pain syndrome; patient education for patients with significant chronic pain syndrome; NSAID therapy for the treatment of chronic pain; monitoring of patients on NSAID therapy for the treatment of chronic pain; route of administration for pain medication regimens; bowel management for patients with chronic pain treated with opioids

Developer/Reference: MetaStar, Inc., 2090 Landmark Place, Madison, WI 53713; www.metastar.com. Source: MetaStar, Inc. Used with permission.
Population: Patients having one of the following surgical procedures: hysterectomy, partial excision of the large intestine, cholecystectomy, coronary artery bypass, and hip replacement
Indicator focus, by topic area:
■ Postoperative care: intramuscular route not used; meperidine not used; use of self-administered or regularly scheduled dosing; frequency of pain assessment; use of self-report pain scales; use of nonpharmacologic pain control

Developer/Reference: Center for Clinical Quality Evaluation (formerly American Medical Review Research Center [AMRRC]). Prepared for the Agency for Health Care Policy and Research. AMRRC-AHCPR Guideline Criteria Project: Develop, Apply, and Evaluate Medical Review Criteria and Educational Outreach Based Upon Practice Guidelines. October 1995. AHCPR Publication No. 95-N01.
Population: Patients age 65 and older who have had noncardiac thoracic,* upper abdominal,* lower abdominal,* or hip or orthopedic extremity* surgery and who are not drug or substance abusers, currently receiving narcotics for chronic pain, or currently under treatment for schizoaffective disorders (*specific ICD-9-CM codes)
Indicator focus, by topic area:
■ Preoperative care: preoperative pain history; preoperative psychosocial history; preoperative pain plan
■ Intra-operative care: intra-operative pain management
■ Postoperative care: mu agonists in first 24 hours after surgery for surgical pain; NSAIDs in first 24 hours after surgery for surgical pain; NSAIDs after first 24 hours after surgery for surgical pain; meperidine not routinely given for surgical pain; analgesics not given intramuscularly except ketorolac; supervision of regional medication for surgical pain; medication dose and route documented for control of surgical pain; behavioral methods to reduce surgical pain; self-assessment tool used to measure surgical pain; surgical pain monitoring; sedation monitoring for post-surgical patients; respiration monitoring for post-surgical patients; catheter site monitoring for post-surgical patients; treatment change for continued post-surgical pain; treatment change for oversedation of post-surgical patients; treatment change for depressed respiration among treated post-surgical patients; treatment change for other complications among post-surgical patients; discharge instruction regarding pain medication
■ Postoperative care (outcome): respiratory arrest among medicated post-surgical patients; infection at catheter site among medicated post-surgical patients; urinary retention among medicated post-surgical patients; pneumonia among medicated post-surgical patients; deep vein thrombosis among medicated post-surgical patients

Developer/Reference: Project to Develop and Evaluate Methods to Promote Ambulatory Care Quality (DEMPAQ). Delmarva Foundation for Medical Care Inc. Final Report to the Health Care Financing Administration: Developing and Evaluating Methods to Promote Ambulatory Care Quality (DEMPAQ). Nov. 15, 1993.
Indicator focus, by topic area:
■ Ambulatory care: appropriate prescription of NSAID (Population: patients taking a newly prescribed NSAID); appropriate prescription of salicylate (Population: patients taking a newly prescribed long-term, high-dose salicylate)

Developer/Reference: The Outcome and Assessment Information Set (OASIS) for use in home health agencies. The OASIS is the intellectual property of the Center for Health Services and Policy Research, Denver, Colorado. See www.cms.gov/oasis/
Population: All non-maternity adult (age > 21 years) home care patients, regardless of whether they improved, as long as it was possible for them to improve
Indicator focus, by topic area:
■ Home care: improvement in pain interfering with activity

Developer/Reference: Post-Operative Pain Management Quality Improvement Project, University of Wisconsin-Madison Medical School. Source: J. L. Dahl. Used with permission.
Population: Patients who have had a major surgical procedure and were able to complete a survey within 72 hours postoperatively
Indicator focus, by topic area:
■ Postoperative care: pain is treated by a route other than IM; pain is treated with regularly administered analgesics; pain is treated with non-pharmacologic interventions in addition to analgesics; the intensity of pain is assessed and documented with a numeric (e.g., 0-10, 0-5) or verbal descriptive (e.g., mild, moderate, severe) rating scale; pain intensity is documented at frequent intervals; a balanced approach is used to treat pain (e.g., regional techniques, opioid and nonopioid analgesics)

IM: intramuscular; NSAID: nonsteroidal anti-inflammatory drug
(a) This table presents a sample of focus areas for which indicators have been developed specific to pain management. For detailed indicator specifications, readers are referred to the source and developer information provided. Clinicians should evaluate indicators for accuracy, applicability, and continuing relevance to current practice recommendations.


A good indicator also should encourage a search for underlying causes or explanations of collected data.44 Toward that end, indicators should focus on pain management processes or outcomes that are within the organization's control; the indicators should be clearly defined, understandable, and able to be reported in a manner that is accurate, easy to interpret, and useful.67 An indicator is useful when it reflects concerns of patients, providers, regulators, accreditation bodies, or other stakeholders and is able to discern variation and improvement in performance. Finally, a good indicator includes complete specifications for consistent implementation and application.50

b. Types of indicators
Indicators may be calculated in various ways. The most common approach is to state the indicator as a proportion. The indicator is expressed as the number of individuals (or events) in the category of interest divided by the total number of eligible individuals (or events) in the group.78 The numerator is therefore a subset of the denominator. This type of indicator often is referred to as rate-based.

Example: The denominator is all patients discharged from a postoperative recovery unit, and the numerator is all patients discharged from a postoperative recovery unit with a pain intensity score of less than 5.

In a second type of indicator, the numerator is not a subset of the denominator. When the sources of data are different for the numerator and the denominator, the relationship is more accurately referred to as a ratio.78

Example: Primary bloodstream infections for patients with central lines per 1000 patient days.

A third type is the continuous variable indicator, in which the value of each measurement can fall anywhere along a continuous scale.

Example: Time from patient request for analgesia to administration of medication.

Indicators are useful in evaluating performance longitudinally and can be used to capture various dimensions of performance such as timeliness or efficacy.11 They can be structured to capture desirable or undesirable aspects of performance. For example, an indicator could be used to determine whether patients' discharge directions included pain management instructions and emergency contact information for problems (desirable) versus discharge directions that did not include pain management instructions and emergency contact information for problems (undesirable).
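To make the rate-based calculation concrete, the Python sketch below computes the proportion described in the first example above: patients discharged from a recovery unit with a pain intensity score below 5. The threshold, data, and function name are illustrative assumptions, not a recommended target or a required method.

```python
def rate_based_indicator(discharge_scores, threshold=5):
    """Proportion of discharged patients whose pain intensity score is
    below the threshold (the numerator is a subset of the denominator)."""
    denominator = len(discharge_scores)
    if denominator == 0:
        return None  # no eligible patients in this reporting period
    numerator = sum(1 for score in discharge_scores if score < threshold)
    return numerator / denominator

# Example: pain scores recorded at discharge from the recovery unit
scores = [2, 6, 3, 8, 1, 4, 5, 0]
rate = rate_based_indicator(scores)
print(f"{rate:.0%} of patients discharged with a pain score below 5")
```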




Process measures focus on specific patient care interventions performed by health care professionals and are distinct from outcome indicators, which measure the results of the patient's interaction with health care professionals. Each has advantages and disadvantages, as described in Table 9.87

c. Applications
Collecting indicator data is useful for the following activities:

■ To observe patterns and trends in performance and stability of processes within an organization over time with objective data.

■ To capture distinct dimensions of performance.

■ To evaluate multiple aspects of performance that are related to the selected improvement, which helps to ensure that attention focused on one aspect of a process does not result in deterioration of other aspects.67

■ To measure factors in addition to the clinical dimension (e.g., financial aspects of care) or patient satisfaction.

Measuring performance on multiple dimensions simultaneously sometimes is referred to as a "balanced dashboard."88-90

Indicators can be derived from criteria such as guidelines, standards, or consensus statements (if available) and thereby can support the assessment of performance against evidence-based practice recommendations.91,92

d. Resource considerations
If new data elements are required for the indicator, these would need to be collected either through enhanced fields in automated programs or manual data collection. This method usually requires some automated data retrieval and analysis capabilities. The steps associated with aggregating data elements to calculate indicator rates may need to be programmed. However, indicators can be calculated manually if the volume of data is not too great and the calculation algorithm is not too complex.

8. Utilize an Externally Developed Performance Measurement System

a. Definition/description of method
The Joint Commission defines a performance measurement system as an entity consisting of an automated database that facilitates performance improvement in health care organizations through the collection and dissemination of process and/or outcome measures of performance. Measurement systems must be able to generate internal comparisons of organization performance over time and external comparisons of performance among participating organizations at comparable times (www.jcaho.org). Donabedian described a clinical performance system (also known as a measurement system) as a tool for rational management that supports assessment of performance with an epidemiologic perspective.93 Data are reported at regular intervals (e.g., daily, monthly, or quarterly) to a central database at the measurement system, most often in electronic formats. However, submission of hard copy data collection forms also is possible. The performance measurement system analyzes and reports the organization's performance on the indicators and provides comparative data aggregated from other participants using standard or customized reports.

b. Applications
Performance measurement systems are useful for the following:

■ Internal performance improvement activities for which no established data collection processes exist. For example, a system that measures patient outcomes over time often will provide detailed data collection tools and training manuals.

■ Provision of comparative outcome data (comparison with other organizations or against benchmarks). Some systems apply statistical risk-adjustment techniques to certain outcome data.


■ Meeting external demands for data for accountability purposes. For example, home health and long-term care organizations collect pain-related data through measurement systems with OASIS and the MDS, as mandated by the Centers for Medicare and Medicaid Services.

c. Resource considerations
Resource requirements are varied and may range from sophisticated information systems to manual data collection. The benefits and costs should be explored when considering use of a performance measurement system. Some possible benefits of using a performance measurement system include:

■ Data collection. In some cases, the performance measurement system may function "invisibly" by abstracting data from existing repositories such as administrative/billing data. In other instances, the system may provide hard-copy forms or specific software for direct data entry, or it may allow for the electronic transfer of data from the organization's own databases to a central system database.

■ Data analysis. Some systems are capable of providing sophisticated data analysis techniques that the organization does not have in-house.

■ Data interpretation. Some systems may provide access to professional staff who can assist in understanding and interpreting the data.

■ Report production. In addition to standard reports, some systems offer custom, ad hoc reporting options.

■ External comparison. Systems generally aggregate data from other system participants, providing an opportunity for external comparison.

Some other issues to consider regarding participation in a performance measurement system include:

■ Cost. There are generally charges associated with participation, although some systems have no specific participation fees (e.g., MDS, OASIS), and there are even a few instances in which the system provides some reimbursement to participants.


Table 8. Desirable Attributes of Measures

From: American Medical Association, Joint Commission on Accreditation of Healthcare Organizations, and National Committee for Quality Assurance. Coordinated Performance Measurement for the Management of Adult Diabetes. Consensus statement. Chicago, IL: AMA; April 2001.86

1. Importance of topic area addressed by the measure

1A. High priority for maximizing the health of persons or populations: The measure addresses a process or outcome that is strategically important in maximizing the health of persons or populations. It addresses an important medical condition as defined by high prevalence, incidence, mortality, morbidity, or disability.

1B. Financially important: The measure addresses a clinical condition or area of health care that requires high expenditures on inpatient or outpatient care. A condition may be financially important if it has either high per-person costs or if it affects a large number of people.

1C. Demonstrated variation in care and/or potential for improvement: The measure addresses an aspect of health care for which there is a reasonable expectation of wide variation in care and/or potential for improvement. If the purpose of the measurement is internal quality improvement and professional accountability, then wide variation in care across physicians or hospitals is not necessary.

2. Usefulness in improving patient outcomes

2A. Based on established clinical recommendations: For process measures, there is good evidence that the process improves health outcomes. For outcome measures, there is good evidence that there are processes or actions that providers can take to improve the outcome.

2B. Potentially actionable by user: The measure addresses an area of health care that potentially is under the control of the physician, health care organization, or health care system that it assesses.

3. Measure design

3A. Well-defined specifications: The following aspects of the measure are to be well defined: numerator, denominator, sampling methodology, data sources, allowable values, methods of measurement, and method of reporting.

3B. Documented reliability: The measure will produce the same results when repeated in the same population and setting (low random error). Tests of reliability include: a) test-retest (reproducibility): test-retest reliability is evaluated by repeated administration of the measure in a short time frame and calculation of agreement among the repetitions; b) interrater: agreement between raters is measured and reported using the kappa statistic; c) data accuracy: data are audited for accuracy; and d) internal consistency for multi-item measures: analyses are performed to ensure that items are internally consistent.

3C. Documented validity: The measure has face validity: it should appear to a knowledgeable observer to measure what is intended. The measure also should correlate well with other measures of the same aspects of care (construct validity) and capture meaningful aspects of this care (content validity).

3D. Allowance for risk: The degree to which data collected on the measure are risk-adjusted or risk-stratified depends on the purpose of the measure. If the purpose of the measure is continuous quality improvement and professional accountability, then requirements for risk adjustment or risk stratification are not stringent. If the purpose of the measure is comparison and accountability, then either the measure should not be appreciably affected by any variables that are beyond the user's control (covariates) or, to the extent possible, any extraneous factors should be known and measurable. If case-mix and/or risk adjustment is required, there should be well-described methods for either controlling through risk stratification or for using validated models for calculating and adjusting results that correct for the effects of covariates.(a)

3E. Proven feasibility: The data required for the measure can be obtained by physicians, health care organizations, or health care systems with reasonable effort and within the period allowed for data collection. The cost of data collection and reporting is justified by the potential improvement in care and outcomes that result from the act of measurement. The measure should not be susceptible to cultural or other barriers that might make data collection infeasible.

3F. Confidentiality: The collection of data for the measures should not violate any accepted standards of confidentiality.

3G. Public availability: The measure specifications are publicly available.

(a) In some cases, risk stratification may be preferable to risk adjustment because it will identify quality issues of importance to different subgroups.
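Table 8 notes that interrater agreement is commonly reported with the kappa statistic. As a minimal, generic Python sketch (not tied to any particular instrument or measurement system), Cohen's kappa for two raters making the same categorical judgment, for example whether a pain assessment was documented, can be computed as follows; the rater data shown are invented for the example.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same set of cases:
    (observed agreement - expected agreement) / (1 - expected agreement)."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must score the same, non-empty case list")
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    if expected == 1:
        return 1.0
    return (observed - expected) / (1 - expected)

# Example: two reviewers auditing whether pain was documented (illustrative)
a = ["yes", "yes", "no", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```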


■ Predetermined measures. Participants are sometimes limited to the measures offered within the system.

■ Lack of customization. To enable comparisons between participants, measures must be standardized, and individual customization may be limited or unavailable.

■ "Black box" analysis methods. Some systems may provide risk adjustment where applicable but will not share specifics of the methodology used with participants for proprietary reasons.


Table 9. Comparing Process and Outcomes Indicators

Adapted from reference 87.

Process Indicators

Advantages:
1. They can directly measure what was done for an individual or group (e.g., screening mammography for detection of breast cancer).
2. They can assess care within a relatively short time window (e.g., weekly or monthly for run charts, annually for some preventive services, episodes for acute and chronic disease care).
3. They can use relatively small sample sizes for common processes.
4. They can frequently be assessed unobtrusively (e.g., from data stored in administrative or medical records).
5. They can be influenced by clinically appropriate actions taken by the health care organization or clinician.
6. They can be interpreted by clinicians who may need to modify their care delivery.

Disadvantages:
1. They may have little meaning for patients unless the link to outcomes can be explained.
2. If the process measure is a rate, the "right" rate may not be known (e.g., emergency room use rates, procedure rates).
3. They are often quite specific to a single disease or a single type of medical care, so that process measures across several clinical areas or aspects of service delivery may be required to represent quality for a particular group of patients.

Outcomes Indicators

Advantages:
1. They tend to be more meaningful to some of the potential users of performance measures (e.g., consumers, purchasers).
2. They more clearly represent the goals of the health care system.
3. They can provide a summary measure of the effectiveness of medical care across a variety of conditions, types of care, or processes of care.

Disadvantages:
1. They tend to be influenced by many factors that are outside the control of the health care organization.
2. They may be insensitive for making health care organization comparisons, particularly if poor outcomes are rare (e.g., mortality rates for children).
3. They may require large sample sizes to detect a statistically significant difference.
4. They may require obtaining data directly from patients.
5. They may take a long period of time to observe.
6. They may be difficult to interpret if the process that produced the outcome occurred far in the past and/or in another health care organization.


Organizational Example

Collect Indicator Data

The use of indicators in the postanesthesia care unit at Middlesex Hospital in Middletown, Connecticut, began when the nurses wanted to examine patterns of patient pain levels on admission and discharge from the unit. Working with a member of the organization's Pain Management Team and Quality Improvement Department, they reviewed clinical practice guidelines and scientific literature related to postoperative pain control. Initially, the measurement focus was on whether patients needed pain medication on admission, whether they needed pain medication between 1/2 and 1 hour after admission, and whether they reported a pain score greater than 5 (0-10 numeric rating scale) on discharge from the unit.

After review of the initial data, it was determined that collecting the reported pain intensity scores would increase the usefulness of the data. Therefore, the indicators were redefined. The denominator was defined as "patients entering the postanesthesia care unit." Three numerators were defined, each of which is concerned with the number of patients reporting pain intensity scores in four categories: none (0), mild (1-3), moderate (4-6), and severe (7-10). Numerator 1 refers to the number of patients reporting each category on admission. Numerator 2 refers to the number of patients reporting each category between 1/2 and 1 hour postadmission. Numerator 3 refers to the number of patients reporting each category on discharge.

To facilitate calculations, a specific data collection tool was created for nursing staff to enter patient-level data. Data elements include date, diagnosis, pain level at three designated intervals, and a comments section for documentation of medication administration, side effects, adverse events, or other pertinent information. Quality improvement staff members then aggregated the patient-level data monthly on all entries with no missing data (range of 46 to 81 patients per month). Results, reported as percentages, were shared with postanesthesia care unit nurses at staff meetings and quarterly with the Quality Improvement Committee, the Pain Management Committee, the Department of Anesthesiology, and the Nursing Quality Council. Results for the first 3 months of 2002 showed an initial downward trend in pain intensity levels on admission and at discharge. In April 2002, use of the indicators was initiated in outpatient surgery and at an off-site surgery center.
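A calculation of this kind can be scripted in a few lines. The sketch below, offered only as an illustration, aggregates hypothetical patient-level pain scores into the percentage of patients falling into each pain intensity category at admission, 1/2 to 1 hour postadmission, and discharge; the field names and data are invented and do not come from Middlesex Hospital.

def category(score):
    """Map a 0-10 pain intensity score to the four reporting categories."""
    if score == 0:
        return "none"
    if score <= 3:
        return "mild"
    if score <= 6:
        return "moderate"
    return "severe"

# One record per patient: scores at admission, 1/2-1 hour postadmission, discharge.
records = [
    {"admission": 7, "interval": 4, "discharge": 2},
    {"admission": 5, "interval": 3, "discharge": 1},
    {"admission": 0, "interval": 0, "discharge": 0},
    {"admission": 8, "interval": 6, "discharge": 5},
]

# Denominator: patients entering the unit with no missing data.
complete = [r for r in records if all(v is not None for v in r.values())]
denominator = len(complete)

for point in ("admission", "interval", "discharge"):
    counts = {"none": 0, "mild": 0, "moderate": 0, "severe": 0}
    for r in complete:
        counts[category(r[point])] += 1
    percentages = {cat: 100 * n / denominator for cat, n in counts.items()}
    print(point, {cat: f"{pct:.0f}%" for cat, pct in percentages.items()})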


Organizational Example

Utilize an Externally Developed Performance Measurement System

Crouse Hospital in Syracuse, New York, began using the Total Quality Pain Management™ (or TQPM) system provided by Abbott Laboratories in January 1998. Crouse made the decision to adopt TQPM as its primary adult postprocedural pain assessment and management tool because it was based on the patient survey first developed by the American Pain Society, and because it provides access to regularly updated national data. It also affords a broad overview of organizational performance over time, as well as access to aggregated national data for use in comparisons and identification of benchmark performance.

Using the system survey tool, the Acute Pain Management Quality Assessment Survey, each unit targets completion of at least 10 forms on a monthly basis. Staff and volunteers collect the surveys, which then are submitted to the Quality Improvement Department for entry into the TQPM software database. After setting desired filters, the quality improvement professionals at Crouse can produce standard graphs and tables. The data also are exported to other programs (e.g., Microsoft Excel [Microsoft Corporation, Redmond, WA]) for additional analysis and display options. Approximately twice each year, a call for data is issued by TQPM to incorporate the local databases of the participant institutions into an aggregated national database, which is then disseminated to each of the affiliate hospitals.

The results are shared with the Pain Continuous Quality Improvement Committee, which evaluates the data and makes recommendations to the unit managers and administration based on its findings. These data have been used by Crouse to identify opportunities for improvement (e.g., perceived wait times for pain medication), as well as to track the mean pain levels of the surveyed population over time. Because the data can be compared with national means, and through the use of control charts with limits set at three standard deviations, Crouse has been able to identify and plot any changes that occur within the system on a hospital-wide as well as a unit-specific basis. Each of the nine indicators tracked (from each major section of the survey) provides an objective measure of the overall effort to maintain control of patient pain and gauge overall patient satisfaction with pain relief methods and medications. Together, they provide a composite picture of the hospital's success in meeting and exceeding pain management expectations. Crouse plans to continue to track its performance, measure change, and document successful improvements in pain management initiated through continued use of the TQPM database.


SECTION VI:

Assessing and Analyzing Your Processes and Results

This section describes the assessment phase in the cycle for improving performance. Assessment of data means translating data into information one can use to make judgments and draw conclusions about performance. The assessment phase allows one to compare performance, determine causes, and set priorities for actions/interventions (baseline data assessment), and to evaluate the effect of actions (results of improvement interventions).44

Broadly defined, the assessment phase includes the following processes and activities:
■ Analysis of the problem
■ Analysis of data
■ Interpretation of the results of data analysis
■ Display of data and results
■ Dissemination of information.

Assessment also requires that performance (baseline or follow-up) be compared with some reference point.44 Examples of reference points include:
■ Historical patterns of performance within the organization.
■ Internal policies and procedures.
■ Desired performance goals, targets, or specifications.
■ The performance of other organizations provided in external reference databases.
■ Established practice guidelines, standards, and consensus statements.

Assessment is not a one-time activity. It is usually done at several points in the process, such as problem identification, baseline assessment, and reassessment after intervention. In fact, assessment often continues beyond the immediate quality improvement project time frame at regular intervals (e.g., annually or every 6 months) to ensure that desired levels of performance are maintained.

A. Using Quality Improvement Tools

The assessment phase is greatly facilitated by the use of relevant quality improvement tools. The tools provide an essential common structure for the analyses of problems and results and are useful for ensuring that the improvement activity is planned and systematic, based on reliable data and accurate analysis, and carried out with effective teamwork and communication.44 It is important to understand the purpose and capability of each tool (see definitions in Table 10) so they are used appropriately. Although developed for distinct purposes, the tools may be used in several stages of a project, including planning, identification of the problem, analysis of baseline and follow-up data, planning solutions to the problem, and evaluation of the results.95 A grid to help organizations select tools appropriate to the phase of the project is provided in Figure 4. Because space constraints do not permit an adequate discussion of the use and value of these tools, readers desiring more information may find the following texts to be helpful: Using Performance Improvement Tools in a Health Care Setting, Revised Edition,96 Managing Performance Measurement Data in Health Care,47 and The Team Handbook.95

Many of these tools were originally developed for use in industrial quality control.94,96 The application of industrial quality improvement methodologies to health care was tested successfully by the National Demonstration Project on Quality Improvement in Health Care. This project paired industrial quality experts with teams from 21 health care organizations. The results are reported in Curing Health Care: New Strategies for Quality Improvement97 and summarized in the accompanying text box.

B. Analysis of Data

Analysis involves sorting, organizing, and summarizing data into a form that enables people to interpret and make sense of the raw data. Understandably, raw data should not be used to draw conclusions about a process or outcome.47


Table 10. Quality Improvement Tools: Definition and Use

Brainstorming: A structured process for generating a list of ideas about an issue in a short amount of time. Phases of quality improvement process: problem identification, data analysis, solution planning, result evaluation.

Affinity diagram: A diagram designed to help teams organize large volumes of ideas or issues into major groups. Phases: problem identification, solution planning.

Multivoting: A voting process that narrows a broad list of ideas to those that are most important. Phases: problem identification, data analysis, solution planning, result evaluation.

Selection grid (prioritization matrices): A grid designed to help teams select one option out of several possibilities. Phases: problem identification, solution planning.

Cause-and-effect diagram: A diagram designed to help teams picture a large number of possible causes of a particular outcome. Phases: problem identification, data analysis.

Control chart: A plotting of data on a graph indicating an upper and lower control limit on either side of the average. Phases: problem identification, data analysis, result evaluation.

Run chart: A plotting of points on a graph to show levels of performance over time. Phases: problem identification, data analysis, result evaluation.

Check sheet: A form designed to record how many times a given event occurs. Phases: data analysis.

Flowchart: A diagram illustrating the path a process follows. Phases: problem identification, data analysis, solution planning.

Scatter diagram: A plotting of points on a graph to show the relationship between two variables. Phases: data analysis, result evaluation.

Pareto chart: A bar graph depicting in descending order (from left to right) the frequency of events being studied. Phases: problem identification, data analysis, result evaluation.

Histogram: A bar graph displaying variation in a set of data and distribution of that variation. Phases: data analysis, result evaluation.

Adapted from reference 94. For more information on the use of these tools, the following texts may be helpful: Using Quality Improvement Tools in a Health Care Setting,94 Managing Performance Measurement Data in Health Care,47 and The Team Handbook.95


Typically, the data to be analyzed will have been entered (either directly from the data sources or secondarily from paper forms) into an automated database software program such as a spreadsheet (e.g., Lotus 1-2-3 [IBM Lotus, Cambridge, MA], Microsoft Excel [Microsoft Corporation, Redmond, WA]) or a database management and analysis software package (e.g., SAS [SAS Institute Inc., Cary, NC]; Epi Info [Centers for Disease Control and Prevention, Atlanta, GA]; SPSS [SPSS Inc., Chicago, IL]).

Data analyses can be done through the use of relatively simple descriptive statistics (e.g., medians, means and standard deviations, frequency distributions, proportions) or more complex, inferential techniques. Analyses can involve single variables, two variables (e.g., bivariate correlations), or multiple variables (e.g., multiple regression analyses).
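As a simple illustration of such descriptive statistics, the sketch below summarizes a hypothetical set of discharge pain scores using Python's standard library; the data are invented and the threshold shown (a score greater than 4) is an arbitrary example.

import statistics
from collections import Counter

# Hypothetical discharge pain scores (0-10) abstracted for one month.
scores = [2, 3, 5, 1, 0, 7, 4, 3, 2, 6, 8, 3, 2, 1, 5]

mean = statistics.mean(scores)
sd = statistics.stdev(scores)           # sample standard deviation
median = statistics.median(scores)
distribution = Counter(scores)          # frequency distribution
proportion_over_4 = sum(s > 4 for s in scores) / len(scores)

print(f"n = {len(scores)}, mean = {mean:.1f}, SD = {sd:.1f}, median = {median}")
print("Frequency distribution:", dict(sorted(distribution.items())))
print(f"Proportion with score > 4: {proportion_over_4:.0%}")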

Analysis of patient outcomes sometimes requires multivariate statistical techniques to risk-adjust for variables outside the health care organization's control (e.g., intrinsic patient factors) that are related to the outcome of interest. For example, outcomes of decreased pain intensity may be influenced by patient beliefs and attitudes, which often are outside the control of the health care organization. More information on risk-adjustment approaches for performance measurement data can be found in Risk Adjustment for Measuring Healthcare Outcomes.98

The sophistication of the analyses and ability to draw inferences from a sample using statistical tests depend on many factors, including:
■ The design of the study.
■ The quality of the data.
■ The sample size.
■ The level of data collected (e.g., patient, unit, organization).
■ The underlying distributions of the variables (e.g., normal/Gaussian versus binomial).


Figure 4. Selection Grid: Improvement Tools

Tool Selection Grid: the phases of problem-solving activity in which each tool is typically used.
■ Brainstorming: problem identification, data analysis, solution planning, evaluating results
■ Cause-and-Effect (Fishbone Diagram): problem identification, data analysis
■ Check Sheet: data analysis
■ Control Chart: problem identification, data analysis, evaluating results
■ Flowchart: problem identification, data analysis, solution planning
■ Affinity Diagram: problem identification, solution planning
■ Histograms: data analysis, evaluating results
■ Multivoting: problem identification, data analysis, solution planning, evaluating results
■ Pareto Analysis: problem identification, data analysis, evaluating results
■ Run Chart: problem identification, data analysis, evaluating results
■ Scatter Diagram: data analysis, evaluating results
■ Selection Grid (Prioritization Matrix): problem identification, solution planning
■ Task List: solution planning, evaluating results

Adapted from: Joint Commission on Accreditation of Healthcare Organizations. Using Performance Improvement Tools in Health Care Settings. Rev ed. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1996.

Page 61: Improving the Quality of Pain Management Through Measurement and Action

Joint Commission on Accreditation of Healthcare Organizations 53

Section VI: Assessing and Analyzing Your Processes and Results

■ The type of data to be analyzed (nominal, ordinal, interval, ratio).
■ The availability of statistical analysis programs.
■ Familiarity of the analyst with statistics and data analysis techniques.

Individuals without a strong background in data analysis may wish to seek guidance from experts within the organization (e.g., a statistician, data analyst, or other quality improvement professional) or consider use of an outside consultant. Keep in mind that complex analyses are not necessarily better for quality improvement purposes. Often, simple descriptive statistics (means, standard deviations, and proportions) are more revealing and useful for improvement.

One well-known approach to data-driven evaluation of processes is the use of statistical process control. Statistical process control is the application of statistical techniques such as control charts to the analysis of a process.53 It is used to determine whether the process is functioning within statistical norms (in control). It is particularly useful for distinguishing between random variation in a process (common cause variation) and changes due to specific occurrences (special cause variation).99 For more information on this topic, refer to the literature on statistical process control. A few references include Carey and Lloyd,100 Wheeler and Chambers,101 Grant and Leavenworth,102 and Sellick.103
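To make the idea concrete, the sketch below computes a simple p-chart, one common type of control chart for proportions: the center line is the overall proportion of patients reporting severe pain at discharge, and the three-standard-deviation limits vary with each month's sample size. The monthly counts are hypothetical, and the calculation is a simplified illustration rather than a substitute for the statistical process control references cited above.

import math

# Hypothetical monthly data: number of patients reporting severe pain at
# discharge (numerator) out of patients surveyed that month (denominator).
numerators   = [9, 11, 8, 14, 10, 7, 9, 22, 8, 10]
denominators = [60, 64, 58, 66, 61, 59, 62, 63, 60, 65]

proportions = [n / d for n, d in zip(numerators, denominators)]

# Center line: overall proportion across all months.
p_bar = sum(numerators) / sum(denominators)

print(f"Center line (overall proportion): {p_bar:.3f}")
for month, (p, d) in enumerate(zip(proportions, denominators), start=1):
    # p-chart control limits at three standard deviations; the limits vary
    # with each month's sample size.
    sigma = math.sqrt(p_bar * (1 - p_bar) / d)
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)
    flag = "special cause?" if (p > ucl or p < lcl) else "common cause"
    print(f"Month {month:2d}: p = {p:.3f}  limits = ({lcl:.3f}, {ucl:.3f})  {flag}")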

C. Interpretation of the Results of Data Analysis

This often-overlooked activity is an essential component of translating data into information. Defined simply, "data are numbers, information is what the numbers mean."99

O'Leary reports that data interpretation and dissemination are enhanced when the person(s) involved (i.e., the interpreter) demonstrates certain personal and professional attributes.104 These include:
■ Problem-solving skills.
■ Thoroughness.
■ Open-mindedness.
■ Awareness of one's own limitations.
■ A healthy degree of skepticism.
■ The ability to collaborate with other people.
■ Strong communication skills.
■ Numeracy (the ability to think and express oneself in quantitative terms).
■ Computer literacy.

One suggestion for enhancing interpretation involves narrowing the scope of interpretation to a workable amount of focused data (e.g., specific indicator rates or percentages) rather than all available data at once. Another important step in the interpretation process is evaluating the strength of data according to five aspects: 1) their clinical relevance to multiple stakeholders, 2) the range of health care processes and outcomes that they address, 3) the degree of reliability and validity of the methods and findings, 4) the degree of variation (fluctuation in processes and spread/dispersion around an average value), and 5) how much control providers have over the process or outcome measured by the data.104

Ten Key Lessons From the National Demonstration Project on Quality Improvement

1. Quality improvement tools can work in health care.
2. Cross-functional teams are valuable in improving health care processes.
3. Data useful for quality improvement abound in health care.
4. Quality improvement methods are fun to use.
5. Costs of poor quality are high, and savings are within reach.
6. Involving doctors is difficult.
7. Training needs arise early.
8. Nonclinical processes draw early attention.
9. Health care organizations may need a broader definition of quality.
10. In health care, as in industry, the fate of quality improvement is first of all in the hands of leaders.

(97) Source: Berwick D, Godfrey AB, Roessner J. Curing Health Care: New Strategies for Quality Improvement. San Francisco, CA: Jossey-Bass; 1990. Used with permission.

Interpreting the results of multivariate statistical analyses can be complicated; it often is helpful to seek advice from statisticians. For example, when determining the statistical significance of the results, the experts remind us that a p-value of less than 0.05 is not a magical number; p-values can be substantially affected by sample size and variance.78,105 In some cases, p-values between 0.05 and 0.10 can signal important findings.

Another important issue is the difference between statistical significance and clinical importance. Clinicians need to judge whether or not the results are clinically significant—a decision that often is independent of the level of statistical significance.78

For example, comparing a mean patient-reported pain level score of 6.2 for one large department receiving in-service education on treatment approaches with a mean score of 7.1 in a similar control group may be statistically significant but clinically insignificant and unacceptable relative to the target level of performance.
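The distinction can be illustrated with a simple two-sample comparison. The sketch below, which assumes Python with NumPy and SciPy and uses simulated scores resembling the example above, shows that a difference of roughly one point can be statistically significant with large samples even though both groups remain far from a clinically acceptable target.

from scipy import stats
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pain scores for a large department that received in-service
# education (mean ~6.2) and a similar control group (mean ~7.1).
educated = rng.normal(loc=6.2, scale=1.8, size=400).clip(0, 10)
control  = rng.normal(loc=7.1, scale=1.8, size=400).clip(0, 10)

t_stat, p_value = stats.ttest_ind(educated, control)
print(f"Mean (education): {educated.mean():.1f}")
print(f"Mean (control):   {control.mean():.1f}")
print(f"p-value: {p_value:.4f}")

# With samples this large the difference is statistically significant, yet both
# means remain well above a target of, say, 4 or less, so the result may still
# be clinically unacceptable relative to the performance goal.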

D. Display of Data and Results

For interpretation and dissemination to be effective, it is essential to convey the findings and key messages as clearly and accurately as possible. Several considerations in selecting data displays include:
■ Who is the audience?
■ What is the most important message? (Do not drown the audience in data.)
■ What do you need to present a complete and accurate representation of your findings? Be careful to avoid the pitfall of emphasizing only the positive.
■ What graphical display capabilities do you have?

The most commonly used display tools are the pie chart, bar chart, and line graph. Pie charts are most useful for displaying frequencies (or percentages) of single items that add up to the total number (or 100%) (see Figure 5). Bar charts are effective when displaying differences between groups (on the horizontal axis) in frequencies or percentages. Line graphs are particularly useful for spotting trends in a process.99

Akin to the concept of multivariate analyses, more advanced data display tools are useful for demonstrating multiple findings (e.g., rates, priorities, outcomes) on the same page. Examples of multiple measure display tools include 1) the balanced scorecard, 2) a dashboard display, 3) a performance matrix, 4) a radar chart, and 5) a stratified multivariable display. Additional information on these data display tools can be found in Tools for Performance Measurement in Health Care: A Quick Reference Guide.99

Figure 5. Example pie chart.

Displays of data should be as clear and easy to read as possible. For example, use of the three-dimensional option when displaying the bars of a histogram can make it more difficult to determine the y-axis intersection. All legends and axes should be clearly labeled because the audience may not take the time to read accompanying text or they may not grasp the point(s) without subsequent reference to the material. The scale of the horizontal (x) and vertical (y) axes should be appropriate to the range of possible values. For example, depicting a change over time of 10% looks very different (and potentially misleading) on a vertical axis scale ranging from 50 to 70 than on a vertical axis scale ranging from 0 to 100 (Figure 6).
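The effect of scale choice is easy to demonstrate. The sketch below plots the same hypothetical quarterly scores twice, once on a 50-to-70 axis and once on a 0-to-100 axis, mirroring the comparison in Figure 6; it assumes the matplotlib library, and the values are invented.

import matplotlib.pyplot as plt

# Hypothetical quarterly scores showing a change of roughly 10 percentage points.
quarters = ["1st Qtr", "2nd Qtr", "3rd Qtr", "4th Qtr"]
scores = [58, 61, 64, 68]

fig, (ax_narrow, ax_full) = plt.subplots(1, 2, figsize=(8, 3))

# Same data plotted on a narrow scale (50-70) and on the full 0-100 scale;
# the narrow scale exaggerates the apparent size of the change.
for ax, limits, title in [(ax_narrow, (50, 70), "Scale 50-70"),
                          (ax_full, (0, 100), "Scale 0-100")]:
    ax.plot(quarters, scores, marker="o")
    ax.set_ylim(*limits)
    ax.set_title(title)
    ax.set_ylabel("Score")

plt.tight_layout()
plt.show()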

E. Dissemination of Information

Given that substantial time and resources have been invested to collect and interpret the data, it is important to share the findings as widely as is appropriate. Disseminating the results of the analysis process helps raise awareness in the organization of how it is performing with respect to pain management.

Strategies or formats for dissemination include:
■ Verbal presentation (e.g., lecture, presentations at staff meetings, other gatherings).
■ Written reports (e.g., published articles, newsletter).
■ Visual displays (e.g., posters, storyboards).
■ Electronic dissemination (e.g., organization-wide Intranet, external Internet using a list-serv).

It is important to know your organization and how information is best received. Think of creative ways to communicate your results. Make the learning experience interesting—even fun. When possible, use multiple methods, such as a verbal presentation and a written report. Consider sharing your experiences (both successes and challenges) outside your organization by submitting an article to a journal.

Figure 6. Using Appropriate Scales on Graphs

This figure shows that using different vertical scales can affect the perception of the change in scores: the same quarterly scores (1st through 4th quarter) are plotted once on a vertical axis ranging from 50 to 70 and once on a vertical axis ranging from 0 to 100.


SECTION VII:

Improving Your Performance

This section reviews the steps related to designing and instituting actions or interventions (design/act) to achieve desired improvements. Interventions can vary widely, from changing structural aspects that support pain management (e.g., vital sign flow sheets) to adding/enhancing care processes (e.g., assessing patient pain). Several examples of pain improvement interventions are discussed.

A. Designing a Pain Improvement Intervention

After the team has completed analysis of data gathered to assess current practice, hypothesized root causes, and prioritized/selected opportunities for improvement, the team can begin to plan interventions. These interventions will comprise specific actions such as implementing an educational program for patients and staff. Careful planning of the intervention will help ensure that it can be accurately evaluated for success. Aspects of implementation to consider include designing the intervention, conducting a pilot test before widespread application, and identifying an approach to measure the impact of the intervention(s) before integration into normal organizational processes.99

Designing interventions will involve detailed planning of all aspects of the intervention, such as developing or selecting written materials and defining changes to care processes. Other planning considerations include deciding who will be involved in testing, what information they will need, and how it will be communicated. Roles and responsibilities of project team members and other champions of the improvement intervention should be clearly defined. Details of operationalizing the intervention, projected timetables, and identification of measurable success factors also are important considerations. If target performance goals are established for the intervention, care should be taken that these do not represent levels below established standards of care, regulatory requirements, or other requirements when these are applicable. A target performance goal can serve as a reference point for measuring the success of the intervention. For example, one organization set a goal that 100% of their patients receive information on the importance of pain management.48

Assessing the educational needs at your organization is an important part of the evaluation of current practice. Although education alone may not change care, improvements are unlikely to occur without it. Studies examining this area have suggested the need for improved professional education curricula addressing pain management.106-110

When designing educational interventions, consider the needs of:
■ Clinicians. Provide a solid foundation for staff to practice good pain management and continually reinforce these practices through an initial program and regular updates. Tailor programs as necessary based on needs identified through assessments, and support staff attendance by providing time. Changing clinician behavior also will require the influence of role models, the application of theory to actual clinical practice situations, and feedback on performance.
■ Patients. Tailor the extent of information based on needs (e.g., hospice patients probably need more comprehensive information than patients having an outpatient surgical procedure [K. Syrjala, PhD, personal communication, 2001]). Use results of knowledge and attitude assessments to evaluate unique patient needs, including reading and language skills, education, and so on. Use a variety of approaches such as printed materials, video, in-house TV station, and one-on-one teaching.

For example, consider the steps involved in planning an educational program about pain management for staff.


Specifics of the program content must be developed and a method of delivery determined (e.g., lecture, self-directed learning, teleconference). Staff must be identified for participation, the success factors determined (e.g., a minimum test score on a posttest), and a timetable established for staff attendance.

There are many other interventions to consider in developing a comprehensive improvement initiative. A few examples include:
■ Developing policies and standards of care.
■ Enhancing documentation through revision of forms or redesign of electronic data entry systems.
■ Developing pocket cards, analgesic conversion charts, and other clinical tools to assist clinicians.
■ Assigning accountability for pain management to specific staff.
■ Developing competency evaluation strategies for clinical staff involved in pain therapies.
■ Developing or revising standing orders as needed.
■ Defining pain assessment protocols and implementing new instruments and reporting scales.
■ Defining appropriate utilization of pharmacologic agents.
■ Designing effective strategies to communicate pain management information to caregivers, patients and families, and all key stakeholders.
■ Providing feedback on performance to effect changes in clinician behavior.46,93

B. Testing and Implementing a Pain Improvement Intervention

After designing all aspects of the intervention, it is helpful to conduct a pilot test before full implementation111 to identify problems that could affect the success of the intervention.44,67 Results of the pilot test may suggest the need to change aspects of the intervention's design, implementation process, or evaluation approach. Addressing these issues early, before proceeding with large-scale application of the intervention, can prevent implementation failures, an unfortunate outcome that is costly in terms of both money and staff morale. It is worth noting again that improving performance is a repetitive process, and results may lead you to redesign the intervention and conduct further testing.67 Another benefit of the pilot test is that the results can serve to persuade skeptics and cement support from leadership to expand the improvement intervention.

When proceeding with organization-wide implementation, be sure to include all the stakeholders in the process improvement activity. Involve as many clinicians as possible early in the process to foster ownership.45,112 As has been previously discussed, pain management is the responsibility of multiple disciplines, departments, and clinicians as well as patients. Therefore, the success of interventions will be enhanced by securing commitment from key individuals who can serve as champions during implementation in day-to-day practice. Some organizations have found it beneficial to phase in improvement interventions one unit at a time, allowing for learning and adjustment or improvement of the implementation process.46 Others have focused first on organizational attitudes and broad system change. Again, knowing your organization well and adjusting your approach to what will work is important. Finally, allowing sufficient time for implementation of changes is critical to success.32

C. Monitoring for Success

After implementing changes, close monitoring will be required initially to assess whether the intervention is meeting expectations and targeted improvement goals.67 The collection and analysis of measurement data will help in evaluating the appropriateness of the project goals. Data may suggest that the goals need to be adjusted. Monitoring also will help determine whether the intervention is being implemented consistently across the organization and will highlight areas needing additional support.


Influences external to the implementation process should be considered for possible impact. These include organizational factors such as mergers, ownership/personnel changes, or reorganization, as well as other significant environmental changes (e.g., regulatory or reimbursement). Related to that, it is important to regularly review educational materials, documentation tools, guidelines, and standards compliance.113 Ongoing monitoring of the performance improvement literature will alert the organization to the need to make adjustments as pain management and treatment options evolve.

After the success of the intervention has been established, the next priority is to ensure that the improvements are maintained. Again, measurement will be a critical factor in evaluating the intervention over time. Although it is necessary to determine that performance continues to meet desired goals, it may be possible to reduce the intensity (scope and frequency) of the measurement activity. Determining the level of monitoring necessary is part of ensuring sustained success.44,67

D. Sustaining Change

A performance improvement initiative is successful when the improvements become a permanent part of the organization's routine. When an initiative succeeds, temporary changes in practice become operational processes (e.g., clinical pathways).

Be aware that change in practice behavior and improved outcomes can lag months to years behind improvement efforts.112 Ongoing assessment of clinician knowledge114 and a series of educational and motivational activities repeated over time are necessary for lasting improvement in pain management practices.74

It is important to set goals for ongoing improvement. For example, if initial implementation of the intervention was of a limited scope, it could expand to additional areas in the organization or across the entire organization. Measurement data might suggest other improvement opportunities that could be acted on. Performance goals might need to be raised. How does your organization's performance compare with that described in external references, if available? This is the time to secure a commitment for those resources and approvals necessary to maintain progress.

As performance across health care organizations improves, the "bar is raised" and ongoing attention will help ensure that practices remain current. Also, new advances will lead to new opportunities.

Finally, success should be celebrated. Communicating improvements effectively, and in a timely and ongoing way, will help to reinforce the permanency of the change. Positive reinforcement is a strong motivator. Recognize participants and share accomplishments with others in your organization through formal means (e.g., employee spotlight, staff recognition day, poster displays, newsletters, Intranet) and informal means (personal visit by organizational leaders, special lunch or treats).


SECTION VIII:

Understanding Factors That Affect Organizational Improvement

This section examines the potential of various factors to enable change (i.e., to facilitate improvement) or to be a barrier to change (i.e., to inhibit improvement). Also included is how the principles of total quality management can be applied to improving pain management performance.

A. Factors That Influence Change

Many factors influence an organization's ability to implement change and improve performance (see Figure 7). These factors can be grouped into the following four categories:
■ Patient factors
■ Clinician factors
■ Organizational factors
■ External/environmental factors

Many of these factors can be either barriers or enablers, or both barriers and enablers under some circumstances. A particular factor's degree of influence will likely vary according to the health care setting. For example, patient and family factors that affect compliance with pain treatment protocols are likely to have greater influence in the home health environment than in the hospital setting. In long-term care, some research has suggested that barriers include less frequent contact with physicians and increased reliance on health care providers with little training in pain and symptom management, together with the increased incidence of cognitively impaired people who have complex medical problems.115

Maximizing the influence of enablers and minimizing the influence of barriers is key to an effective improvement initiative. Each organization will have unique challenges that will need to be addressed (such as restructuring or new computer systems). The following sections describe issues that have been identified in the literature.

1. Patient Factors

The impact of a patient's pain beliefs and perceptions on his or her response to pain should be considered. In a recent study of chronic pain patients, pain beliefs and cognitions were found to account for a significant amount of variance in general activity, pain interference, and affective distress.116 The experience of pain "is a product not only of physiological mechanisms but also the patient's knowledge and beliefs, emotions, social well-being, as well as system factors, including the multiple caregivers involved with the patient's pain management."117

Tolerance for pain is affected by genetic, cultural, or personal belief systems. Elderly patients may begin with the expectation that they will have pain or that not much can be done.118 Similarly, Warfield and Kahn found that 77% of adults surveyed in 500 U.S. hospitals believed that it is necessary to experience some pain after surgery.119

Patients may be reluctant to report pain32 or unwilling to "bother" staff if they perceive that staff members are too busy.27,120 Sometimes patients use self-distraction behaviors that include laughter, reading, visiting, and so forth, which may be effective for a time but are sometimes misconstrued by staff to mean there is no pain.27 Many patients believe that they should be stoic and bear the pain or that asking for pain medication is a sign of weakness. They may wish to disguise their pain so as not to worry family members.

Lack of knowledge or poor understanding about pain and treatment may contribute to fears about taking medication, such as fear of addiction. Patients may not adhere to treatment due to unpleasant side effects from the medication.121 Patients may be uncomfortable assuming responsibility for self-control of pain,122 or may hesitate to use newer technologies such as patient-controlled analgesia (PCA) devices.


High cost and inadequate insurance coverage for medications can cause patients to use medication sparingly; they may fear that the pain will get worse and they should "save" the medication for later.

On the other hand, patients and their families can be powerful allies and an effective force for change in promoting improvements. Due in part to the increased availability of information from sources such as the Internet, patients have become more equal partners in determining health care choices.76 Patients have increased knowledge and expectations specific to their treatment. The prominence of pain-related issues in the media has led to increased public awareness of the issues of pain management. Patient and family involvement in self-help organizations (e.g., the American Chronic Pain Association) also has increased the visibility of these issues.

2. Clinician Factors

Clinicians maintain a pivotal role in changing pain management practice. Clinicians can be defined broadly to include practitioners who are involved in clinical practice or clinical studies as well as clinical leaders in executive positions.

One well-documented clinician barrier is lack of knowledge about current evidence-based treatment.107-110 This lack of knowledge often stems from inadequate education regarding principles of pain management in educational curricula.107,108,110 Clinicians' misconceptions about pain treatments could include an exaggerated fear of addiction resulting from use of opioids;108 confusion about the differences between addiction, physical dependence, and tolerance; or unwarranted concerns about the potential for the side effect of respiratory depression.107

Like patients, clinicians are affected by personal beliefs and attitudes. For example, some clinicians may believe that pain is an important diagnostic tool, in spite of evidence that supports the use of analgesia in instances where it previously was thought to be contraindicated.123,124


Figure 7. Examples of Factors Affecting Pain Management Improvement Initiatives

Improved pain management is influenced by:
■ Patient factors: knowledge; involvement in care; beliefs/expectations.
■ Clinician factors: knowledge; beliefs/culture; time constraints.
■ Organization factors: policies/procedures; staff/other resources; structure/leadership.
■ External/environmental factors: regulatory/quality oversight bodies; health care payment systems; media influence/societal beliefs.


Others may believe that pain is too difficult to treat or is transitory and poses no harm. Some clinicians believe that a patient who is asking for pain medication regularly or from several doctors is abusing medication. Additionally, there may be a mismatch between the clinician's perceived ability to successfully manage pain and his or her actual knowledge of current practice.107,110

Finally, practice also is influenced by professional culture and mindset. Some clinicians believe that the professional is the best judge of the patient's pain and are unable to accept the patient's report.27 For example, if the patient does not display behaviors associated with pain (such as grimacing or crying), the clinician may find it difficult to accept the patient's report of pain. Members of the health care team may disagree on the preferred treatment plan, or there may be a lack of consensus between the health care team and patients and their families. In today's strained health care environment, competing demands on professional time and other institutional responsibilities sometimes result in inadequate attention to pain management issues.117

On the positive side, changing pain management care begins at the bedside. Therefore, clinicians are critical forces for improvement. Advances in pain medicine specialty education, new certification programs, and other educational options such as preceptorships have produced a cadre of clinicians with heightened awareness of, and expertise in, pain management.

Numerous professional organizations promote scientific approaches to practice, such as the American Pain Society, the American Academy of Pain Medicine, and the American Society of Pain Management Nurses. Many journals are devoted to pain management, such as the Journal of Pain and Symptom Management, Pain, Clinical Journal of Pain, Journal of Pain, Pain Management Nursing, and others. There also is a proliferation of clinical texts, guidelines, position statements, and evidence-based practice recommendations.

3. Organization Factors

Organizations can enhance or impede improvement in pain care in ways that may not be obvious, yet are highly influential. Sometimes, well-intended policies and procedures can hinder the process. For example, documentation forms may not have a space for entry of pain assessment findings. Standing orders may be outdated compared with current recommendations.

Clearly, adequate resources are essential for a successful effort. These include, but are not limited to, the dedication of staff and equipment resources for the improvement initiative. The regular availability of appropriate medications, supplies, and equipment needed in daily patient care is important to organizational improvement. For example, insufficient numbers of PCA pumps could lead to delays in initiation of therapy. Conversely, ready availability of some items can be counterproductive. For example, stocking of meperidine on units makes it an easy choice as a first-line agent, which can reinforce established patterns of prescribing, while preferred agents for pain management may not be as readily accessed.

Of course, the most important resource issue is staffing effectiveness, which includes staffing levels, experience, and education of nurses and other direct and indirect care staff. When decreased numbers of professional staff (or increased numbers of nonclinicians) are caring for increased numbers of patients and/or patients who are more severely ill, it can be difficult to give pain management the attention it deserves. Similarly, staff turnover can have a negative effect on educational interventions and continuity of care.

Organizational structure and culture also influence improvement activities. Experts point to lack of accountability for pain management as one of the greatest barriers to improvement.26,27,120 Related problems include lack of coordination between departments and pain management teams from different specialties and, in some instances, difficulty in gaining organization-wide acceptance for the work of individual departmental initiatives.125


Many of these barriers are readily overcome by key enabling factors associated with the organization. Organizations that have a pain management champion—an individual who acts as a role model and who is highly motivated, energetic, and often charismatic—can be very successful in improving performance.126

Similarly, there is a greater likelihood of success when leadership is interested and involved in the initiative.127 The literature shows that change is most effective when primary change agents (change champions) provide clinical leadership. Sound management practices play a role as well. For example, recent studies of long-term care organizations have found that certain characteristics of top management such as job tenure, educational experience, and professional involvement appear to affect the adoption of innovative practices.115,128 Computerization also can be an effective tool for change by supporting consistent documentation, on-line knowledge, analysis, and timely feedback.

Finally, some organizations have a structure and culture that embraces change. Organizational change has been studied extensively in the organizational management field, and various theories and strategies have been developed.129 For example, Zaltman and Duncan130 identified the following four strategies for implementing change in organizations: 1) reeducation, which is based on unbiased presentation of fact and assumes that rational people will adjust behavior accordingly; 2) persuasion, which attempts to bring change through bias in structuring and presentation of information (i.e., selling an idea); 3) facilitation, which involves interventions designed to make change easier for individuals who already recognize the problem and agree on a remedy; and 4) power, which involves the use of sanctions or coercion to implement and maintain change. Beyer and Trice131 identified seven stages that organizations experience during a change process: 1) sensing of unsatisfied demands, 2) search for possible responses, 3) evaluation of alternatives, 4) decision to adopt a course of action, 5) initiation of action within the system, 6) implementation of change, and 7) institutionalization of the change.

4. External Factors

Patients, clinicians, and organizations are subject to influences exerted by the external environment. Compared with other countries, the United States has a pluralistic health care system. This system, which emphasizes freedom of choice, also increases fragmentation of care, thus complicating the effort to provide individual patients with well-coordinated care across settings over time.

Payment mechanisms relating to medications and specialized pain treatments and services are not always designed to optimize pain management.132 For example, in a national survey of pain specialists, Carr et al.133 found that two thirds of 223 anesthesiologists reported a trend toward reduced reimbursement for intravenous PCA; and of those, 29% believed this trend would lead to decreased use of this approach.

Sociocultural beliefs and trends also affect processes. An example is cultivation of an antidrug philosophy with campaigns such as "just say no," which may foster misconceptions or fear. High-profile drug abuse cases related to pain management also influence behavior among patients and clinicians.134 These beliefs and values often are manifested in statutes, laws, and regulations that seek to control clinicians' prescribing practices.

In part through increased visibility in the media, public opinion and attitudes toward pain relief have begun to change. Recent court cases concerning patients with unrelieved pain have heightened awareness of the need for appropriate pain management.7 Scrutiny of clinician practice includes not only the investigation of the overprescription of opioids but increasingly the study of cases of underprescribing.

Another powerful enabler is the fact that regulatory bodies such as state hospital licensing agencies and accrediting organizations have added specific requirements related to pain management.


A recent example is the announcement by the Secretary of Health and Human Services of the intention to publicly report nursing home comparative performance information specific to pain management and other clinical care issues.135 Similarly, the advances in scientific research, as well as the explosion of electronic and written health-related materials for both consumers and professionals, have greatly enhanced the knowledge base about the need for proper pain management.

B. Improving Pain Management Through the Principles of Total Quality Management

Though this monograph has emphasized continuous quality improvement using a structured process, it is worth noting that continuous quality improvement is one element of the overall principle of total quality management.136 McCaffery and Pasero, in their landmark publication entitled Pain Clinical Manual, have thoughtfully applied the 14 principles of total quality management to improving pain management (see box below).27 The effectiveness of organizational improvement initiatives will be enhanced by application of many of the points contained within these general principles.


The 14 Points of Total Quality Management Applied to Improving Pain Management

1. Create constancy of purpose for improvement of care and service.
Concentrate on a long-term plan based on a patient-focused mission (e.g., providing patients with attentive analgesic care). Consistently model the vision of the organization (e.g., each person is a unique individual with certain rights). Enable staff to continuously improve costs, services, and patient satisfaction through well-designed pain management plans. Invest in a plan for continuing education and a system of rewards to encourage innovation in staff. Treat continuous improvement of pain management as an ongoing obligation to the patient.

2. Adopt the new philosophy.
Quality pain management must become a driving passion of the organization, so that undertreatment of pain in any care setting is immediately recognized as incompatible with the institution's mission and unacceptable to all of its staff and physicians.

3. Avoid dependence on inspection only.
Traditionally, errors and problems are discovered after the fact through a process of quality assurance (inspection). This inspection process must be replaced with an improvement process that prevents errors and problems. Stop endless data collection on unrelieved pain, and establish a pain care committee to analyze and synthesize the data; develop plans to correct current pain problems and prevent the occurrence of new problems.

4. Avoid the practice of awarding business on price alone.
Quality outcomes are possible only when quality materials, supplies, and processes are used. Consider long-term cost and appropriateness of products rather than just their purchase price. Cultivate long-term relationships with vendors, rather than simple short-term purchasing relationships. This requires that the supplier consistently meet the needs of the organization and commit to continually improving its product (e.g., equipment and supply manufacturers become partners in providing attentive analgesic care by requesting input on how their product can be improved to help clinicians provide better pain care).


For example, manufacturers of PCA pumps often rely on suggestions from clinicians who use the pumps to improve the technology. Many suppliers offer a variety of free consultative services and sponsor educational programs for institutions that are interested in improving pain management. This point has implications for the organization's internal "suppliers" (staff and physicians) as well. Cultivating loyalty and trust provides a secure practice environment and long-term relationships with less staff and physician turnover.

5. Constantly improve every process for planning, implementation, and service.
Improving pain management is not a one-time effort. Teamwork is essential. Approve the establishment of an interdisciplinary pain care committee. Empower front-line staff to contribute to the improvement process (e.g., encourage them to serve as members of the pain care committee), to constantly look for ways to reduce waste and improve quality, and to become accountable for pain management (e.g., become a pain resource nurse).

6. Institute teaching and reteaching.
On-the-job training alone encourages worker-to-worker propagation of practice. Many practices are faulty and outdated (e.g., promoting the idea that there is a high risk of addiction when opioids are taken for pain relief). Teach and reteach (continuing education) front-line staff members the important aspects of pain management, give adequate support for providing effective pain relief, and measure the effectiveness of the training. Assign new staff members to trained preceptors who will perpetuate quality performance. Remember to encourage, not drive, staff. Adjust the preceptor's workload so that quality training is possible.

7. Ensure qualified leadership for system improvement.
The job of management is to lead. Leading is moving staff toward a vision; managing is helping them do a better job (e.g., ensure that pain care committee members attend meetings, adjust the workload of the PRN to allow time for assisting others with pain problems).

8. Drive out fear.
Many experts consider this the most important of the 14 points because when staff fear failure, embarrassment, or retaliation, they are unwilling to make suggestions and recommendations for change. This results in a lower level of quality. Consider and value all staff suggestions. Appreciate that front-line staff members are "in the trenches" and have invaluable knowledge of how to improve pain management (e.g., extend the concept of the PRN program to all levels of nursing staff).

9. Break down barriers between departments.

The goals of the various departments must complement one another, or quality is jeopardized. Foster teamwork by dismantling systems that stop staff from working together to accomplish a project. Help staff understand the needs of other departments and work together toward the organization's vision. Promote processes that support the vision (e.g., initiate postoperative intravenous PCA in the postanesthesia care unit to avoid the dangerous scenario of patients receiving intramuscular opioid injections on the clinical unit while waiting for intravenous PCA to be initiated).

10. Avoid trite slogans, exhortations, and targets.

Avoid using derogatory and ambiguous slogans such as "Just Do It!" These types of slogans provide no direction and may offend and repel some staff and physicians. If slogans and motivational phrases are used, explain them (e.g., "Our slogan is 'Preventing Pain Is Easier than Treating Pain.' This means that we remind patients to request analgesia or press their PCA buttons before their pain becomes severe."). Let staff and physicians know exactly what is being done to make it easier for them to provide better pain management (e.g., post in all clinical areas a list of the names and numbers of supervisors and PRNs available to help with pain problems).

11. Eliminate numeric quotas.

Quotas place a cap on productivity and conflict with the continuous nature of quality improvement. Pain issues and problems are ongoing and unending. Encourage staff to look for more than one problem to solve and more than one way to improve pain management. Allow pain improvement work groups to evolve to a permanent standing interdisciplinary pain care committee.

12. Remove barriers to pride of workmanship.

Delegate authority and responsibility to staff members to foster autonomy while promoting the philosophy of an interdisciplinary, interdepartmental approach to pain management. Avoid focusing on individual or department performance (e.g., pain management is everyone's responsibility); support teaching nurses to titrate analgesics, individualize doses, and manage side effects.

13. Institute a vigorous program of staff education, continuing education, and self-improvement.

Encourage staff members' ongoing personal development even in areas not related to their jobs. Continually provide updated pain management information to staff and consider the need for updated pain management technology to improve performance (e.g., evaluate and reevaluate the way pain is being assessed and managed and update approaches on the basis of advances in pain knowledge and technology).

14. Take action to accomplish the transformation.

Put everyone, including top management, to work on sustaining the organization's new mindset (e.g., discuss the institution's philosophy of providing attentive analgesic care with all new employees during their orientation and update long-term employees during their annual cardiopulmonary resuscitation and fire prevention recertification classes).

Source: McCaffery M, Pasero C, eds. Pain Clinical Manual. 2nd ed. St. Louis, MO: Mosby; 1999:714-715. Used with permission.


SECTION IX: Examples of Organizations Implementing Pain Management–Related Measurement and Improvement Initiatives

A. Overview

The purpose of describing these initiatives is to provide examples of how quality improvement techniques can effectively support pain management improvement. The examples illustrate how organizations used systematic processes for improvement and how measurement provided important data to support improvement activities.

These are real-world examples, rather than hypothetical ones. They illustrate how people committed to quality improvement in pain management achieved success within their organizations. The particular examples were chosen both to provide a diversity of settings and experiences and to suggest strategies for overcoming obstacles to pain management improvement initiatives. The settings are a home health agency, a rural hospital, an academic medical center, and the hundreds of facilities under the Veterans Health Administration.

Improving pain management in home care, away from the institutional setting where there is greater control over medication delivery and therapeutic options, has its own special challenges. Patient and family social, cultural, and economic factors greatly influence the care received in the home. Staff involved in quality improvement initiatives are geographically dispersed, thereby adding to the complexity. The first organizational example, which describes the experience of the Visiting Nurse Association & Hospice of Western New England, Inc., highlights the coordination of care across settings and demonstrates the continuous nature of the improvement cycle.

Small and rural hospitals sometimes operate under extremely tight resource constraints, which can result in staff serving multiple roles. With an average daily census of 32, Memorial Hospital of Sheridan County, Wyoming, overcame challenges when the Pain Team implemented their "No Pain, Big Gain" campaign. This example demonstrates the benefits of participation in external educational programs and collaborative activities to maximize internal resources. Strong leadership support, creative improvement strategies, and positive recognition are highlights of the initiative.

The pain management improvement efforts conducted within the University of Iowa Hospitals and Clinics (UIHC), an 831-bed teaching facility, highlight the coordination of pain management initiatives within a large, sophisticated, and complex quality improvement organizational structure. This example illustrates how UIHC uses both centralized and decentralized approaches to pain management.

As health care systems increase in complexity, the process of improving pain management also becomes more complicated. Broad-scale initiatives require a substantial investment of time and resources. One key to success is giving organizations within a health care system the option to customize initiatives to suit their particular needs. The fourth example describes the implementation of the National Pain Management Strategy within the Veterans Health Administration (VHA). The example presents strategies for measurement and improvement driven from the Washington, DC office, as well as the experience of two hospitals participating in a VHA-sponsored Institute for Healthcare Improvement "Breakthrough Series" model collaborative.

There are more similarities than differences in these four case examples. They demonstrate that quality measurement and improvement techniques can be used effectively in organizations of varying size and complexity. Many of the lessons described are consistent with the basic tenets of the quality improvement field, such as ensuring leadership commitment early in the process, using multidisciplinary teams, collecting and disseminating data effectively, sharing success and providing positive feedback to staff, and accepting the fact that change takes time, while understanding that perseverance will pay off in the long run. Of course, each organization, as well as each improvement project, is unique. Readers are encouraged to use these case studies not as blueprints, but as resources to adopt and adapt to their particular needs.

B. Educational Effectiveness in a Home Health Agency: Visiting Nurse Association & Hospice of Western New England, Inc.

Late in 1998, the home and community services division of the Visiting Nurse Association & Hospice of Western New England decided to evaluate pain management practice across its home care and hospice divisions. This agency, originally established in 1907, is now part of the Baystate Health System, which includes three hospitals and an infusion and respiratory services company. The home and community services division has three sites in suburban and rural locations; all three sites offer Visiting Nurse Association (VNA) programs and two offer hospice care. The need to know whether pain was being managed effectively throughout the division, coupled with customer satisfaction opportunities and external factors, led the hospice staff to propose this initiative. In addition, it was one in which the entire division could participate, thus promoting unity and teamwork among staff that includes more than 150 nurses as well as 95 home care aides.

1. Special Challenges

Agency staff recognized that the special challenges of this pain improvement initiative included those that are common to the provision of health care in the home environment (e.g., earlier discharge from acute care settings has resulted in increased acuity of illness among home care patients, while the length of home care service is decreasing) and those specific to this organization (e.g., coordinating activities across multiple offices that are geographically distant and between VNA and Hospice services). Another challenge involved language barriers in the special populations served by this agency, including people of Russian, Spanish, and Vietnamese origin. Also of special concern are the clinically complex patients and those requiring highly technical equipment or procedures. These external and organization-specific factors presented additional dimensions for consideration in designing a successful pain management improvement project.

2. Choosing a Methodology

The Juran Institute's Performance Improvement methodology was used for this project.137 In preliminary meetings, team members received education on the four steps of the Juran process: 1) identification and establishment of the project, 2) determination of the root cause of the problem, 3) implementation of a solution, and 4) holding the gains. To facilitate goals of care coordination across the system and integration with system quality improvement activities, a member of this project team also served on the hospital quality improvement committee. Quarterly reports on the project were provided.

3. Assembling a Team

Members of the pain improvement project team were recruited from volunteers within each program and from each site. Other members included a representative from both the quality assurance and the education departments as well as the clinical supervisor for the infusion and respiratory services company; the team was chaired by the hospice director. Team members were chosen based on an expressed interest in pain management and a willingness to serve as a resource and change agent at their respective offices. Meetings were held monthly until project completion in December 2000 (see Table 11). The medical director received and reviewed minutes, providing input as indicated. Authority for supporting the project-identified changes and allocating resources was provided through the director of risk management and the vice-president of the agency. The team's overall mission was to evaluate, improve, and standardize pain management across the home and community services division.138

4. Identifying the Problem

The members of the team began by looking at criteria, including guidelines (e.g., those of the World Health Organization and the Agency for Healthcare Research and Quality [formerly the Agency for Healthcare Policy and Research]) and standards (e.g., those of the Joint Commission on Accreditation of Healthcare Organizations). They completed a literature search and used brainstorming to identify potential opportunities for improvement. Among the items identified in the brainstorming process, clinician knowledge was selected for further examination. The team decided to evaluate the knowledge and attitudes of nurses. They also decided to conduct a medical record audit and to review patient and family satisfaction surveys and organizational documents (internal agency policies and procedures) to further pinpoint opportunities for improvement.

a. Test knowledge and attitudes

To assess staff knowledge, the team adopted the survey tool developed previously within the Baystate Health System, but made slight modifications to address home care issues. The survey was completed by approximately 80% of the nursing staff. Results pointed to opportunities for improvement related to conducting pain assessments, equianalgesic dosing, pain management in the elderly, and opioid side effects.

b. Review medical records

Chart audits were conducted to evaluate multiple aspects of pain management. The review highlighted the need to improve the consistency and completeness of documentation related to pain assessments and treatment. For example, the documentation of patient response to treatment in visits following interventions for pain management was inconsistent. It also was noted that pain reports were not always described using discrete ratings.

c. Review organizational documents

Organizational documents addressing issues related to pain management were reviewed. Policies and procedures such as a patient rights statement and a pain assessment policy were evaluated. Many of these documents had been developed 4 to 5 years earlier and needed updating or revision.

Table 11. Visiting Nurse Association & Hospice of Western New England Pain Performance Improvement Time Line

Date | Event
March 1999 | First team meeting
December 1999 | Pain assessment tool evaluation and revision to create a standard form; pretest given; pretest results tabulated
January 2000 | Equianalgesic card finalized
March 2000 | Project presented at Baystate Health System performance improvement meeting; home care aides in-service
April 2000 | Protocol/procedure review, revision, and creation; in-services conducted for nursing staff
June 2000 | Competency testing after in-services; electronic medical record enhancement
September/October 2000 | Focus audit; distribution of pain newsletter (self-learning model)
December 2000 | Competency retest
January/February 2001 | Focus audit

Source: C. Rodrigues, Visiting Nurse Association & Hospice of Western New England. Used with permission.

Based on the results of this baseline assessment, some goals for improving pain management across the division were identified. Particular goals included:
■ Education of staff, patients, and family regarding good pain management.
■ Improvement of availability and timeliness of analgesic medications.
■ Enhanced continuity when transitioning from the acute care setting to home care.
■ Improved documentation of pain assessments and treatments.
■ Improvement or maintenance of high levels of patient satisfaction.
■ Increased use of nonpharmacologic approaches.

5. Implementing Interventions

a. Education

Educational interventions were identified by the team to address staff and patient knowledge deficits, and to enhance the transition from acute care to home care. To address staff needs, a 2-hour curriculum (two sessions of 1 hour each) was designed that had multiple learning formats and offered continuing education credits. Staff attended lectures and participated in small group discussions and role-playing exercises. Content for the sessions included:
■ Physical, psychological, and social aspects of pain.
■ Misconceptions in assessing pain.
■ Evidence-based pain interventions.
■ Communicating pain information to physicians.

The pain committee member at each office actively participated in conducting the educational sessions, helping to establish these individuals as mentors and change champions. A laminated trifold pocket card was provided to all staff for quick reference to the World Health Organization analgesic ladder, opioid equianalgesic dosing, and opioid/coanalgesic equivalency tables. Nurses provided one-on-one education for patients and families, focusing on the importance of recognizing and reporting pain, using a pain scale to describe pain, and understanding their treatment regimen. All patients experiencing pain as a problem received a pain scale for use in reporting their pain. Finally, to address the goal of enhancing the transition from acute care to home care, the agency provided information to the hospital about common problems such as patients discharged with prescriptions that are difficult to fill or that require titration every 4 hours.

b. Documentation

Pain assessment documentation was another area addressed by the project team. Combining the best-practice findings from the literature review with previously developed tools, they created a comprehensive but user-friendly tool to capture pain assessment information. Next, the agency's electronic documentation system was reviewed for ways to better capture pain-specific information. Updates were made to include pain as an identified problem. When pain was selected as a problem, the system would trigger specific questions for the clinician to ask at each subsequent visit. Documentation capabilities were further enhanced with the introduction of a completely new electronic medical record system in 2002. The new system offers more detailed care plans, a focus on processes such as reassessment, and the ability to update the central database from any phone line, allowing all staff access to the latest visit information. The central database also will support the analysis of patient-specific data over time to create reports of patterns and trends in care. Finally, organizational pain management policies and procedures as well as the patient's rights statement were updated or revised as necessary.

c. Treatment interventions

To improve the availability and timeliness of analgesic medications, the team identified local pharmacies that maintain a 3-day supply of critical medications. Additionally, an arrangement was established with a specialized hospice pharmacy that provides overnight delivery of medications for hospice patients.


Another intervention involved the addition of nonpharmacologic treatments for hospice patients, including massage, aromatherapy, guided imagery, and heat/cold applications. Treatments were provided by a volunteer specialist.

6. Analyzing Processes and Results Leads to Redesign

a. Education

To assess the results of the clinician educational intervention, the annual nursing competency assessment process was used. A medication exam was developed that included four questions specific to pain management. They were designed to test the areas of weakest performance on the initial knowledge and assessment survey. The results showed that across the agency, scores on these questions averaged a disappointing 63%. The pain project team began to examine the educational interventions and processes to understand why scores were not higher. They identified the following problems:
■ Only 25% of the staff were able to attend both in-services.
■ Timing of the sessions during the summer was complicated by vacations and heavy caseloads.
■ New staff had joined the agency and had not received the educational intervention.

The pain team took action to redesign the educational intervention. A self-guided learning approach was developed with the same materials. A series of one-page newsletters was created and distributed weekly to all staff and also incorporated into the new employee orientation. Examples of covered topics include assessment of pain and types of pain, titration and equianalgesic dosing, and side effects of medication related to pain management (Figure 8 shows a sample newsletter). Staff also received Management of Cancer Pain: Adults139 and the pocket cue card. Retesting of all staff was conducted after the second educational intervention, and results showed significant improvement, with a mean score of 94% (Figure 9).

Figure 9. Mean Score on Staff Pain Management Knowledge Survey: Visiting Nurse Association & Hospice of Western New England, Inc.

[Histogram of the mean score on the staff pain management knowledge survey before and after the educational intervention: pretest 63, posttest 94 (C. Rodrigues).]

Transition and coordination of care between the acute and home care settings have been improved through educational exchanges between the hospital and home care staff. Pain specialists from the health system pain clinic provide home care staff with special in-services on new technology once or twice each year, as well as consulting on complex treatments or difficult-to-manage pain.

As a result of education, patients and families have become increasingly aware of the importance of reporting and managing pain. In fact, phone messages from patients frequently include specific pain rating scores, noted Hospice Director Carol Rodrigues, RN, MS, CHPN.

b. Documentation

Ongoing quarterly audits, focused on documentation related to assessments, interventions, and reassessments, show a trend toward more consistent and complete information. The team anticipates that the newly introduced electronic record will provide additional analysis capabilities and patient-level data.

7. Sharing the Success and Sustaining Change

Improvements were shared with all staff through their office pain team member/mentor. This pain improvement initiative was selected as one of the Baystate Health System's eight most successful initiatives for 2000. Team representatives did a poster presentation and attended a reception in April 2001. A presentation also was made at the Hospice Federation annual meeting, and information was shared at the Massachusetts meeting of the National Association of Home Care. The agency further shared their experience in an article that was published in the American Journal of Hospice & Palliative Care.138 The initiative has engendered excitement in staff and served as the springboard for further improvement activities.

Measures are being taken to sustain the improvements in pain management. The new electronic documentation system will continue to enhance pain-related documentation. Medical record review will enable monitoring of gains. Clinicians will be tested annually regarding their knowledge of pain management. Pain education has been integrated into the orientation program for new staff. Family and patient satisfaction will continue to be monitored via questionnaires. To expand expertise within the agency, two new staff members each year will receive specialized education to serve as pain resource nurses. A new admission booklet is being designed to include important pain-related material along with general agency information. This is part of an ongoing commitment to make pain management a normal part of the organizational culture.

This pain improvement initiative required time, patience, a willingness to redesign interventions, and use of nontraditional approaches. These lessons can be applied to any pain improvement project in any health care setting.

Figure 8. Second Pain Management Newsletter Distributed by the Visiting Nurse Association & Hospice of Western New England

Pain Newsletter 2: Reasons for Good Pain Assessment and Management
■ Improved patient satisfaction and quality of life
■ Required by JCAHO for home health as well as hospice
■ Supported by evidence-based practice
■ May prevent litigation (several cases against RN staff for poor pain management)

What effect does pain have on a person's quality of life?

Physical
• decreased functional ability
• diminished strength, endurance
• nausea, poor appetite
• poor or interrupted sleep, no restorative sleep

Psychological
• diminished leisure, enjoyment
• increased anxiety, fear
• depression, personal distress
• difficulty concentrating
• somatic preoccupation
• loss of control

Social
• diminished social relationships
• decreased sexual functions, affection
• altered appearance
• increased caregiver burden

Spiritual
• increased suffering
• altered meaning
• reevaluation of religious beliefs

There are many common misconceptions about pain held by health care providers. Yet we know that:
■ The patient, and not the health care provider, is the authority on his/her own pain.
■ There are physiologic and behavioral adaptations to pain; the lack of expression does not equal lack of pain.
■ Not all causes of pain are identified; no cause does not mean no pain.
■ Tolerance to opioid-induced respiratory depression develops rapidly.
■ Sleep is possible with pain.
■ Elders experience pain but do not express it as much.
■ Addiction is rare: 0.1% to 0.3%.

Source: C. Rodrigues, Visiting Nurse Association & Hospice of Western New England. Used with permission.

C. Creative Improvement in a Rural Hospital: Memorial Hospital of Sheridan County

Memorial Hospital of Sheridan County, an 80-bed facility nestled beneath the Big Horn Mountains of Wyoming, serves the medical needs of a largely rural county with a population of 26,000 people as well as the surrounding area. The organization's mission, "to provide quality health care services that improve the well-being of our community," is exemplified in their pain management improvement efforts.

1. Doing Some Homework

The first steps toward developing this pain management improvement initiative were taken in November 1999, when the director of nursing at Memorial Hospital, Micki Bonnette, RNC (home care/hospice), and Pam Hall, RN (radiation oncology), attended a 1-day program entitled "Cancer Pain Role Model Program," conducted by the Wisconsin Pain Initiative. Subsequently, Ms. Bonnette and Ms. Hall applied to and completed the pain nurse preceptorship program at the Medical College of Wisconsin–Milwaukee to further develop their pain management expertise and support development of an institutional program.

Upon return to Memorial Hospital, Ms. Bonnette spearheaded activities to assess current organization-wide practice. The first step was an organizational self-assessment utilizing a structured instrument. Second, a knowledge and attitude survey specific to pain management was distributed to nursing staff. One section of the survey asked the nurses to identify barriers they experienced to providing quality pain management. Lastly, an organization-wide sample of charts was reviewed by using a slightly modified version of the Medical Record Audit Form.54

The results of these measures of current performance provided an indication of both the strengths and opportunities for improvement in pain management practice across the organization. Specifically, the chart audits revealed inconsistent documentation of pain assessments and patient-reported pain intensity ratings. In part, this was because there was no specific location to document pain-related information on existing medical record forms. The staff survey results identified the following priority areas for educational interventions: 1) pain assessment and managing pain in specialty populations, 2) pharmacologic management of pain, 3) nonpharmacologic interventions, and 4) psychosocial issues in pain. After compiling these data, Ms. Bonnette and Ms. Hall felt ready to convene a dedicated Pain Team to facilitate a pain improvement initiative.

2. Establishing a Team

A notice calling for interested persons to form a Pain Team was posted in May and June 2000 on the hospital Intranet. Efforts were made to identify potential members from multiple disciplines and departments. The 12-member team met for the first time in August 2000 and included representatives from pharmacy, nursing (inpatient services and home care/hospice), radiation oncology, and anesthesiology. A nurse who specialized in counseling and chronic pain, and a nurse director of complementary medicine joined the team a year later. The Director of Nursing, also an active team member, provided strong management support and validated pain improvement as an institutional priority. The team first evaluated the results of the current practice assessments, considered the aspects of good pain management practice already in place, and developed a mission statement: "All patients have the right to pain relief."

Meeting monthly from August through December 2000, the team selected objectives, defined interventions, and proposed a timeline for 2001. Additionally, a library of pain resource information was established, including a selection of key journal articles and guidelines, a copy of Building An Institutional Commitment to Pain Management, The Wisconsin Resource Manual,26 and 30 copies of McCaffery and Pasero's Pain Clinical Manual27 for distribution on individual units. Two surveys, one for nursing and one for medical staff, were distributed to elicit information on educational needs (content) and preferences for teaching methods as well as scheduling options.

3. Implementing Improvements

The objectives for pain management improvement selected for 2001 (which are described in Table 12) fell into three general areas:
■ Increasing awareness of pain management
■ Staff and patient/family education
■ Assessment and documentation (forms).

a. Increased awareness

The planning and initial work of the Pain Team was showcased in a weeklong kick-off in February 2001. The campaign, designated No Pain, Big Gain, was launched with numerous fun and educational activities to raise awareness among staff and the community of the organizational commitment to pain management (see the No Pain, Big Gain Kick-off Activities text box).

In March 2001, the patients' rights statement was posted in all units and departments of the hospital next to the hospital's mission statement. A pain resource manual was developed and placed on all units and in the physicians' lounge, along with a copy of Pain Clinical Manual.27 Information about the campaign was placed in medical staff mailboxes, and the public was alerted through a special news release in the local paper.

b. Education

Multiple educational interventions were conducted during and after the campaign kick-off week. These included a pain symposium, a movie, a self-directed learning exercise, and printed materials. The traveling educational exhibit included a posttest with an optional extra-credit medication conversion problem. All patient-care staff were encouraged to complete the quiz. Special off-site educational interventions were offered for medical staff through collaboration with other community health care organizations. For example, in March 2001 a program was arranged with guest speaker Michael Ashburn, MD, President of the American Pain Society. A newsletter, Pain News Weekly, was developed to further highlight and reinforce information. Finally, patient education interventions included the provision and review of information about patients' rights and responsibilities, pain reporting and use of scales, and discharge instructions.

c. Documents and forms

One of the first actions taken by the team was to identify and select a designated pain scale. They combined a numeric rating scale of 0-10 with the Wong-Baker Faces scale62 on a laminated card and also selected the FLACC140 pain intensity scale for preverbal, nonverbal, and cognitively impaired patients. Pocket cards with dosing guidelines for analgesics and equianalgesic charts were prepared for clinicians. Policies and procedures were revised and updated during 2001.


Table 12. Memorial Hospital of Sheridan County Pain Management Improvement Initiative

Objective: Establish organization-wide multidisciplinary team
Actions: Invitation placed on Intranet; volunteers identified; team members selected
Measurement approach: Not applicable
Status: Full team convened August 2000; team meets monthly or more frequently, if needed

Objective: Develop organizational mission statement and update policy and procedure documents related to pain management
Actions: Develop mission statement; revise/develop policies and procedures
Measurement approach: Review organizational documents
Status: Completed June 1, 2001; statement posted in rooms December 2001; policy and procedures approved and implemented August 2001

Objective: Patients receive notification of rights/responsibilities regarding pain management
Actions: Develop rights and responsibilities brochure; provide education during the admission process; document teaching; revise admission form to document patient rights education
Measurement approach: Review medical records
Status: Admission process task force evaluates admission forms; brochure completed July 2001; organization-wide in-service on brochure/education in July 2001; admission form revision received December 2001; reaudit April 2002

Objective: Patients will receive information about pain control methods within first 24 hours
Actions: Staff education on objective and process
Measurement approach: Assess patients over time
Status: Survey of patients (n = 101) using the American Pain Society Patient Outcomes Survey in June 2001; resurvey June 2002

Objective: Patients on the medical/surgical units will be assessed for pain using new pain documentation tool
Actions: Develop new assessment tool; pilot-test tool; educate staff on use of tool
Measurement approach: Review medical records
Status: Pain tool template finalized in July 2001; form delayed at printer; final form received January 2001; quarterly chart review showed steady improvement (93% compliance by third quarter 2001)

Objective: Patients are provided with pain education materials before discharge as applicable
Actions: Review discharge process; assess documentation forms; develop new form to document discharge
Measurement approach: Review medical records
Status: Previous change in discharge process documentation created a barrier (November 2001); new discharge form development (ongoing)

Objective: Provide reference materials and guidelines house-wide
Actions: Develop reference manual for each unit and physician lounge; obtain copy of Pain Clinical Manual for each unit; obtain reference materials for new Health Information Center
Measurement approach: Not applicable
Status: Pain reference manuals on every unit and in physician lounge in March 2001; Pain Clinical Manual obtained and distributed to each unit in December 2000; Health Information Center supplied with patient education materials

Objective: Provide staff education
Actions: Develop educational interventions; test for knowledge and attitudes; utilize multiple approaches
Measurement approach: Knowledge assessments
Status: Clinician survey for learning needs/preferences; Pain News Weekly distributed monthly; posters on myths and tips; in-services; off-site physician education; self-directed learning; "No Pain, Big Gain" campaign

Source: M. Bonnette, Memorial Hospital of Sheridan County, Wyoming. Used with permission.


A separate forms committee (including some Pain Team members) focused on revising forms to document patient assessments. This committee developed a new multidisciplinary screening and assessment tool that documents pain intensity, patterns and characteristics, psychosocial/cultural/spiritual issues, nutrition, and functional status. They also created a flow sheet to detail care when a patient's reported pain is greater than or equal to 4 in intensity and frequent monitoring is required.

4. Collaborating for Improvement

The Pain Team decided that the identification of clinician mentors or pain champions from the nursing staff on each shift would further reinforce changes in practice. Invitations were extended for interested parties, and three individuals were selected and added to the Pain Team. Shortly after the selection of the clinician mentors, participation in a multi-hospital pain improvement collaborative was approved by hospital leadership. The project presented an excellent opportunity for the mentors to develop their new roles. In April 2001, the three clinical mentors went to Denver, Colorado, for training associated with a collaborative project offered through the Voluntary Hospitals of America. In addition to project specifics, they received education about quality improvement principles and serving as a role model/preceptor. The project also included monthly "coaching calls," on-site visits, three live audio educational interventions, and networking with other organizations. Through participation in this collaborative project, Memorial Hospital received defined data collection tools, data analysis, and comparative feedback. Four indicators were measured at baseline and again 6 months later. They were:
■ Receipt upon admission of written evidence of the organization's position on pain management.
■ Receipt, prior to discharge, of education/explanation regarding pain and pain management.
■ Screening at initial assessment.
■ Pain assessment at discharge.

No Pain, Big Gain Kick-off Activities in February 2001

■ An activity featuring cake and distribution of campaign buttons took place every day for the first week.
■ There was a 45-minute in-service program on pain assessment and management, provided through a pharmaceutical company.
■ There was a popcorn and movie day, featuring I Got My Life Back, a movie about chronic pain patients and pharmacologic management.
■ The Traveling Show (Lions and Tigers and Pain—Oh NO!), a portable educational exhibit created on a pediatric bed frame, was taken to each unit and left there for at least 24 hours to accommodate all shifts. This exhibit provided key pieces of information about pain management, and introduced selected pain scales and examples of good documentation. Staff could take a quiz after studying the exhibit; 117 individuals did so.
■ Laminated pain scales were distributed to all staff.
■ Raffles and other rewards generated enthusiasm and provided additional incentives for participation.
■ Special materials for physicians were initially placed in their mailboxes and then delivered via fax to their offices.
■ A news release was sent to the local paper.
■ Information on the campaign was provided to the recently opened Health Information Center. The Center is open to any community resident and provides Internet access (online documents), a library, educational materials (including pain-related materials), and access to health information.

From M. Bonnette, Memorial Hospital of Sheridan County, Wyoming.


Based on a target sample of 30 records, improvements were seen in two of the indicators (education and assessment prior to discharge; see Figure 10). Although the related practice changes had been initiated, delays in printing the revised admission form meant there was no evidence of improvement on the other two indicators. The team anticipates documentation will improve after implementation of the form in the first quarter of 2002 and will remeasure the indicators later in 2002.

5. Integrating Nonpharmacologic Approaches

One of the most exciting interventions was the addition of nonpharmacologic approaches to pain management. A registered nurse who is a certified massage therapist was chosen to develop a complementary healing program, now known as the Integrated Healthcare Program. To promote awareness and address misconceptions, she made office visits to physicians to share information and discuss the program. She recently completed certification in imagery and has held in-services for staff on relaxation, imagery, and massage techniques. By providing massage therapy and imagery to inpatients, she helps nonpharmacologic approaches gain acceptance. Some physicians have established standing orders for massage therapy for their patients.

6. Sustaining Change

The Pain Team, with administrative support, has made pain an institutional priority. The initiative has been marked by creativity. Efforts were made to make educational activities interesting, fun, and accessible. Information is shared regularly at staff meetings, and desired changes in pain management practice are recognized. For example, team members may randomly check charts for examples of good documentation or look for staff wearing the campaign buttons. Individuals are immediately acknowledged and treated to a special award such as a movie pass or gift certificate. This use of positive reinforcement has marked all aspects of the initiative.

The No Pain, Big Gain campaign faced challenges such as competing demands for resources from other educational programs and delayed arrival of the revised documentation forms. A major building expansion project requiring unit relocations occurred concurrent with the effort. Despite these challenges, those involved in the campaign continued to believe that the goal of improved pain management was important. They accepted that the process would take a long time and valued the progress they made. The clinical mentors have proven to be a key factor in effecting changes in day-to-day practice, and nurses feel more empowered to advocate for their patients' pain relief needs.

The members of the pain committee are clearly focused on maintaining and continuing the improvement efforts. Annual competency testing has been instituted, and a pain management component has been added to the skills lab and to new employee education. Ongoing medical record audits are being completed to determine trends in important aspects of care. With the arrival of a new pharmacy director in spring 2002, pharmacologic aspects of pain management practice will be studied.

Figure 10. Baseline and Post (6 Months) Indicator Data: Memorial Hospital of Sheridan County Clinical Indicators

[Bar chart showing change from baseline to 6 months post across four indicators at Memorial Hospital of Sheridan County, Wyoming (M. Bonnette). Indicators: 1) receipt, upon admission, of written evidence of the organization's position on pain management; 2) receipt, prior to discharge, of education/explanation regarding pain and pain management; 3) screening at initial assessment; 4) pain assessment at discharge.]

The success of the pain improvement initiative at Memorial Hospital highlights the value of participation in preceptorship programs, collaborative measurement/improvement initiatives, careful assessment of current practice by measuring performance, creative learning experiences, and positive recognition of practice changes. These are valuable insights for organizations of any size or location.

D. Integrating Pain Management Improvement Activities in an Academic Medical Center: University of Iowa Hospitals and Clinics

The University of Iowa Hospitals and Clinics is a comprehensive teaching and health care center serving patients from all counties in Iowa as well as other states and countries. It is one of the largest university-owned teaching hospitals in the country. Established in 1898 as a 50-bed hospital, the current 831-bed facility and clinics serve more than 41,000 patients annually. Located within UIHC, Children's Hospital of Iowa provides family-centered pediatric clinical services. As a major health training resource, UIHC serves students from five health science colleges and provides supervised clinical settings for community college programs in nursing education, surgical technology, and respiratory therapy. A commitment to excellence in research marks another important component of the organization's mission and service.

1. History of Pain Management Activities at UIHC

The management of pain has been a focus in clinical care and research for over a decade in both the health care and the university systems. Multiple pain improvement initiatives have been undertaken by various divisions and clinical departments since the early 1990s. Table 13 lists examples of some recent departmental activities. Similarly, clinical research has been a significant element of pain management practice development at UIHC. For example, the Department of Nursing has collaborated with the University of Iowa College of Nursing to pursue research in pain management that supports evidence-based practice. The investigation of the effectiveness of a brief distraction educational intervention for parents of preschool children undergoing intravenous catheter insertion,141 the evaluation of application time for effectiveness of local anesthetic gels,142 and the development of an infant pain rating scale are just a few examples. Translating research evidence into practice, a long-standing commitment for both the College of Nursing and the Department of Nursing, is exemplified in the Iowa Model of Evidence-Based Practice to Improve Quality of Care.143,144 It is the focus of a current study examining acute pain management in the elderly, funded by the Agency for Healthcare Research and Quality (grant RO1 HS 10482; principal investigator: M. Titler); this study is being conducted by scientists from UIHC and the University of Iowa Colleges of Nursing, Medicine, Pharmacy, and Public Health.

Patient-related pain management services are comprehensive and include:
■ The Acute Pain Service. This service actively manages acute postoperative and cancer pain for inpatients and commonly uses advanced pain management treatment modalities.
■ The Pain Medicine Clinic. The center provides evaluation and treatment for patients with chronic pain, cancer-related pain, and nerve or musculoskeletal injuries.

Currently, a new state-of-the-art facility is being constructed that will locate both the chronic and acute programs in one suite. The Acute Pain Service area will have direct access to the surgical department, and the Pain Medicine Clinic will have an entrance for outpatients.

Table 13. Examples of Departmental Pain Improvement Activities: University of Iowa

Pharmacy
■ Educational program for pharmacists
■ Case presentations (e.g., chronic, acute, pediatric)
■ Addition of pharmacist with specialty in pain management (June 2001) who also works with the pain team
■ Medication use evaluation: evaluate use of meperidine through chart review with a goal of decreasing the rate of use; development of criteria and recommendations for use
■ Participate in review and update of policies and procedures as part of the Interdisciplinary Pain Management Subcommittee to evaluate for consistency between nursing and pharmacy (e.g., patient-controlled analgesia, epidural)
■ Review of preprinted order forms for drug-related issues (e.g., abbreviations, doses)
■ Preparation of articles related to pharmaceutical treatments for pain for the P & T News, a publication of the Pharmacy and Therapeutics Subcommittee

Nursing
■ Development of an analgesic guide (including equianalgesic charts) that was approved by the Pharmacy and Therapeutics Subcommittee of the University Hospital Advisory Committee; available as a pocket guide and poster
■ Chart audits by each clinical nursing division to determine documentation practices regarding pain assessment and reassessment
■ Registered nurse knowledge surveys completed within each clinical nursing division
■ Development and implementation of pain standards for ambulatory care
■ In-services by ambulatory care nurse managers for staff on the ambulatory pain standard
■ Adult pain standard of practice for inpatient pain management presented to clinical staff
■ Standardization of pain intensity scales

Children's Hospital
■ Development of standards of care
■ Creation of an interdisciplinary task force to look at pain management issues
■ Development of a policy regarding the use of a local anesthetic gel before procedures and research on the effectiveness of two products
■ Identification of a pain intensity scale
■ Research and development of a pain rating system for infants
■ Creation of a Pain Resource Manual for specialty units

Source: M. Titler, University of Iowa Hospitals and Clinics. Used with permission.

2. Forming a Standing Committee

The work completed by a multidisciplinary pain committee in the early 1990s and the successful ongoing efforts within and across departments have built a strong foundation for pain management practice. In early 2000, the administration of UIHC proposed the establishment of a standing committee that would act as an advisory body to address critical and long-term issues associated with pain management. The Interdisciplinary Pain Management Subcommittee was created under the Pharmacy and Therapeutics Subcommittee of the University Hospital Advisory Committee. Structurally, the committee is part of the Quality and Performance Improvement Program framework (see Figure 11). This framework illustrates the reporting lines for communication of centralized and decentralized quality improvement activities conducted by other subcommittees of the University Hospital Advisory Committee (n = 20), 19 clinical departments, 12 hospital departments, 4 multidisciplinary clinical centers, and multiple interdisciplinary teams. Central within the structure of the Quality and Performance Improvement Program is the Clinical Outcomes and Resource Management office, which supports quality improvement initiatives through use of data management systems and expert staff.

The Interdisciplinary Pain Management Subcommittee is co-chaired by Marita Titler, PhD, RN (Nursing Services and Patient Care), Richard Rosenquist, MD (Anesthesia), and Steve Nelson, MS (Pharmaceutical Care). The other 15 members also are drawn from multiple disciplines. This committee began their work with the following charges:
■ Increase awareness of the need for adequate pain management for all patients who seek care at the UIHC.
■ Identify and develop effective pain management strategies for UIHC patients.
■ Identify methods for patient referral to appropriate existing clinical consultation services for pain management.
■ Identify ways to reduce adverse drug events through the safe and appropriate use of analgesics.
■ Coordinate educational initiatives for care providers.
■ Coordinate and review data associated with pain management as part of UIHC's quality improvement program.
■ Routinely evaluate the hospital's pain management programs and identify/recommend needed changes.

Figure 11. Quality and Performance Improvement Program Framework Illustrating the Reporting Lines for Communication: University of Iowa Hospitals and Clinics

[Organizational chart showing reporting lines among the University of Iowa Hospitals and Clinics, University of Iowa Health Care, and the College of Medicine; the University Hospital Advisory Committee (HAC), its Professional Practice Subcommittee, and other HAC subcommittees (20), including the Interdisciplinary Pain Management Subcommittee of the Pharmacy and Therapeutics Subcommittee; the Performance, Value and Service Management Committee; clinical departments (19); hospital departments (12); multidisciplinary clinical centers (4); interdisciplinary quality and performance improvement teams (8 types); and Clinical Outcomes and Resource Management (CO/RM).]

Source: University of Iowa Hospitals and Clinics Web site. Used with permission. Available at www.uihc.uiowa.edu/corm/QPOPReport.htm

The co-chairs began by identifying multiple data sources within the organization's documentation (automated and nonautomated) and quality improvement systems that could provide information on current practice. When the full committee met for the first time in August 2000, they were able to review the information and begin exploring opportunities for improvement. They used the quality improvement tool of brainstorming and identified 25 possible focus areas.

3. Establishing an Action Plan

The subcommittee members were then asked to rank the top 10 priority areas in order of importance from the initial list of 25. In the month before the next meeting, they solicited input from colleagues and others in the organization. When the full committee met again in September, they were able to identify 10 priority areas that combined most of the items on the original list. They then grouped the priority areas into four work groups:
■ Assessment and documentation
■ Patient education
■ Patient rights and responsibilities
■ Clinician education.

Each work group was charged with developing an improvement plan, interventions, and a strategy to measure the impact of the interventions. All work groups reported back to the Interdisciplinary Pain Management Subcommittee on their progress.

4. Fulfilling the Charge: The Assessment and Documentation Work Group

The Assessment and Documentation Work Group included individuals who also sat on departmental pain teams such as the Department of Nursing Pain Management Subcommittee of the Quality Improvement Committee. In this way, the group built on the substantial work related to pain management already completed in decentralized initiatives. In their plan to understand and improve the quality of pain assessment and documentation across departments and levels of care, the work group posed the following questions:

1. Do nurses use standardized pain assessment techniques for pain assessment (intensity, location, quality, duration) of patients?
2. How frequently is pain assessed and documented in the inpatient and ambulatory care setting?
3. Is a standardized tool used to assess and document pain intensity?
4. What pain treatments do patients receive for management of:
   Acute pain?
   Cancer pain?
   Chronic pain?
5. What mechanisms are used to provide patient and family education regarding rights and responsibilities for management of pain?
6. What are the educational needs of nurses to provide optimal pain assessment and treatment?
7. Are the departmental standards for pain management congruent with current research evidence and American Pain Society Guidelines?
8. What system changes are needed to provide optimal pain management to patients receiving care at UIHC?

The work plan they developed used multi-ple interventions, including the review andrevision of departmental standards for painmanagement, knowledge assessment of nurs-es, and education of nurses, patients, andfamilies. The work group obtained baselineand postintervention data through the use ofretrospective chart audits, data from theonline Nursing Information DocumentationSystem, and knowledge and assessment sur-veys.

a. Implementing and evaluating interven-tions

i. Departmental/divisional standards ofcare for pain management

The following standards were updated and/or revised by the Department of Nursing Pain Management Subcommittee of the Quality Improvement Committee, the Department of Nursing Policy and Procedure Committee, and the Children's Hospital of Iowa (for pediatrics):
■ Pain Screening in the Ambulatory Setting
■ Pain Management for Adults: Inpatient
■ Pain Management: Pediatrics.

The Interdisciplinary Pain Management Subcommittee then forwarded these standards for review and approval. A "train the trainer" approach was used to educate nurse managers, assistant managers, and supervisors in the new pain management standards so they, in turn, could teach nursing staff. In addition, materials were incorporated into the annual competency review for staff and are included in both the central and divisional new employee orientation.

ii. Knowledge assessment and education

Knowledge surveys for nurses were customized and distributed by care areas (pediatrics, perinatal, adult medical/surgical and critical care, and perioperative). Based on the survey results, each nursing clinical division devised a plan unique to its educational deficits. For example, the adult medical/surgical, perioperative, and critical care nursing divisions addressed the identified educational needs by:
■ Developing and implementing an adult analgesic guide approved by the Pharmacy and Therapeutics Subcommittee of the University Hospital Advisory Committee.
■ Purchasing and distributing the American Pain Society's Principles of Analgesic Use in the Treatment of Acute Pain and Cancer Pain, 4th edition,145 and Pain Clinical Manual, 2nd edition,27 for each nursing division.
■ Displaying posters on units regarding drug interactions, drug dosing, and pain assessment.
■ Providing education regarding the pain standards and use of pain rating scales.

b. Measuring the effectiveness of work group activities

To assess the impact of the interventions to improve the quality of pain assessment and documentation, the work group designed the Quality Improvement Pain Monitor. This chart audit tool used data from charts and the online documentation system to determine the number of times pain was assessed/reassessed and whether the intensity, quality, location, and interventions were noted (quality indicators for pain). As can be seen in Figure 12, there was significant improvement in the number of times pain intensity was documented in 24 hours for the units displayed. Additionally, knowledge surveys were repeated to evaluate for improvement in deficit areas. Between September 2000 and September 2001, the work group had accomplished much. They had stated goals, posed important questions, obtained baseline data, designed and implemented interventions, and monitored change. The work group successfully fulfilled its charge and, as part of the subcommittee, helped meet the overall goals of coordinating educational initiatives for care providers and collecting pain management data for use in quality improvement activities.
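The arithmetic behind a chart audit indicator of this kind is straightforward to automate once the audit data are captured electronically. The sketch below is a minimal illustration only, not UIHC's actual Quality Improvement Pain Monitor tooling: it assumes hypothetical audit records (one per chart, with a count of documented pain-intensity assessments and the length of the audited window in hours) and computes the mean number of documentations per 24 hours for each unit.

```python
# Minimal sketch, assuming hypothetical chart-audit records; the field names
# ("unit", "documented_count", "hours_audited") are illustrative, not the
# actual Quality Improvement Pain Monitor data layout.
from collections import defaultdict

audit_records = [
    {"unit": "ICU #1", "documented_count": 6, "hours_audited": 24},
    {"unit": "ICU #1", "documented_count": 10, "hours_audited": 48},
    {"unit": "ICU #2", "documented_count": 4, "hours_audited": 24},
]

def mean_documentations_per_24h(records):
    """Return {unit: mean number of pain-intensity documentations per 24 hours}."""
    totals = defaultdict(lambda: [0.0, 0])  # unit -> [sum of per-24h rates, chart count]
    for rec in records:
        rate_per_24h = rec["documented_count"] / (rec["hours_audited"] / 24.0)
        totals[rec["unit"]][0] += rate_per_24h
        totals[rec["unit"]][1] += 1
    return {unit: round(rate_sum / n, 1) for unit, (rate_sum, n) in totals.items()}

print(mean_documentations_per_24h(audit_records))
# {'ICU #1': 5.5, 'ICU #2': 4.0}
```

Repeating the same calculation on baseline and postintervention samples yields the kind of before-and-after comparison displayed in Figure 12.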

5. Complementing Centralized Pain Improvement Activities: A Division-level Initiative to Manage Circumcision Pain

The Interdisciplinary Pain Management Subcommittee has worked to coalesce pain management activities across the organization through a centralized approach, while supporting the continued development of complementary initiatives to address special populations, types of pain, and unique procedures. The development of a protocol to manage circumcision pain is an example of a pain management initiative completed within one division at UIHC.

a. Identifying an opportunity

Within the Division of Children's and Women's Services, multiple units care for infants undergoing a circumcision procedure. A recent American Academy of Pediatrics position statement on circumcision that included the management of pain (see "Circumcision Policy Statement," Table 2) caught the attention of Janet Geyer, Advanced Practice Nurse. Together with colleagues on other units, she began to explore the issue of how pain was managed during this procedure at UIHC. The first step was to understand current practice by reviewing medical charts. In fall 1999, 51 charts of infants undergoing a circumcision were examined. Results showed that nearly all babies (96%) received a dorsal block. It also was noted that although acetaminophen was ordered in 71% of the charts reviewed, only 59% of the charts indicated that it was administered at least once. Although the results indicated consistent use of blocks, the nurses believed the administration rate for acetaminophen could be improved and overall pain management could be enhanced through the development of a comprehensive pain management protocol for this procedure.

b. Forming a plan

Once the opportunity was identified, the nurses began by forming a committee and enlisting participation from key stakeholders. The resulting team of eight people included nurse representatives from all applicable care units, a neonatologist, and an obstetrician. Additionally, consultations were obtained from a lactation specialist, pharmacist, dentist, urologist, music therapist, and lawyer for specific issues. Initially, the team began by conducting a literature search for evidence-based pain management recommendations specific to infants or this procedure. Staff physicians were solicited to participate in this process and review articles.

Figure 12. Mean Number of Times Pain Intensity Is Documented Every 24 Hours: Department of Nursing Services and Patient Care Intensive Care Units, University of Iowa Hospitals and Clinics. [Line graph for two intensive care units (ICU #1 and ICU #2), May 2000 through May 2001, showing means at baseline (n=20), post education (n=20), and post flowsheet changes (n=20).] Source: M. Titler, University of Iowa Hospitals and Clinics. Used with permission.


Over the course of several months, the team began to identify a set of specific care processes, such as the use of buffered lidocaine and a smaller-gauge needle for the block, as well as the use of soft music during the procedure. These processes were incorporated into a comprehensive protocol. Examples of recommended practices are shown in Table 14.

c. Implementing the protocol

In planning the implementation of the new protocol, the team worked to ensure that related documents and required services were ready. First, an update to the operative permit was completed to include information on pain management, in addition to procedure-related risks and benefits. Also, educational materials for parents were revised to include similar information. These educational materials are provided to parents during prenatal visits to physician offices and also on admission to UIHC for delivery. The team also met with pharmacy staff to establish a process that ensured the availability of buffered lidocaine, which must be mixed daily.

By spring 2000, all necessary approvals had been obtained for use of the protocol. The next step involved providing education to all clinicians. For nursing and patient care personnel, a self-learning module was developed. Nurses involved in the team also served as resources for other staff and provided one-on-one training. The protocol has been added to new staff orientation materials. Physician members of the team were key in communicating the protocol to colleagues both individually and through faculty or departmental meetings. All residents were sent information via e-mail and were supervised by the chief resident during the first few procedures. This process is repeated every year when new residents begin their training. The protocol also has been included in the agenda for a special 1-day postgraduate course for pediatric residents to be held in fall 2002.

d. Assessing for success

To measure the success of the project, medical records were reviewed a second time 6 months after implementing the protocol (January 2001). An audit tool was developed (Circumcision Evaluation Tool) to check for compliance with each component of the protocol. The tool also documents the number of doses of analgesia given. Twenty-nine charts were reviewed with findings of full compliance (100%) for use of a dorsal block, acetaminophen ordered, infant swaddled, and sucrose pacifier offered. The rate for administration of at least one dose of acetaminophen rose to 84%. The chart review also revealed that the electronic medical records facilitated documentation of the protocol, whereas manual records did not. Therefore, for those units not using electronic medical records, a special sticker was developed with the protocol steps and a place to check completion.

The success of the project has been shared in several ways, including a poster presentation at a regional pediatric conference, a statewide educational conference, and a national evidence-based practice conference.


Table 14. Examples of Pain Management Interventions for Circumcision: University of Iowa Hospitals and Clinics (UIHC)

1. Premedicate with acetaminophen 10-15 mg/kg within 1 hour before procedure per physician or nurse practitioner order.
2. Swaddle infant on padded circumcision board and restrain legs (allows infant to put hand to mouth).
3. Assess infant comfort/pain level and provide interventions:
   a. Shield patient eyes from direct light.
   b. Provide pacifier dipped in sucrose after discussion with parents/guardian.
   c. Music therapy may be utilized per standard of practice.
   d. Ensure patient is kept appropriately warm.
4. Following block, wait 3-5 minutes before proceeding with circumcision.
5. Following procedure, remove infant from circumcision board.
6. Assess infant comfort/pain and provide interventions:
   a. Swaddle and hold infant.
   b. Have parent/guardian or nurse provide feeding.
   c. Provide acetaminophen 10-15 mg/kg as ordered by physician or nurse practitioner 4 hours after initial dose and every 4-6 hours for 24 hours.

Source: J. Geyer, University of Iowa Hospitals and Clinics. Used with permission.


Team members also have shared their success in an article detailing the use of evidence as the basis for the protocol.146 The team noted that some of the keys to the success of the project were involving important stakeholders, engaging physicians in the process, and keeping people informed. They had strong support from nursing leadership, and medical staff was receptive. Finally, conducting chart reviews provided objective measures of performance that validated the successful implementation of the protocol.

E. Improving Care Across Hundreds of Facilities: The Veterans Health Administration's National Pain Management Strategy

The veterans health care system is the largest fully integrated health care system in the United States. In fiscal year 1999, the VHA provided direct care in more than 1100 different sites to more than 3.6 million persons. Those sites included 172 hospitals, more than 600 ambulatory and community-based clinics, 132 nursing homes, 206 counseling centers, 40 residential care domiciliaries, 73 home health programs, and numerous contract care programs. In 1999, the VHA staff included 53,000 nursing personnel, 13,000 physicians, more than 3500 pharmacists, and thousands of other health care professionals.147

In 1995, the VHA, which manages the care, began to reorganize the delivery system into 22 Veterans Integrated Service Networks (VISNs) to enhance the continuum of primary to tertiary care within geographical regions.147 As in the private sector, there is great variation across VISNs in the demand for care, the resources available, and the priority issues. For example, resources for chronic care are greater in southern states with large elderly populations.

Because of its centralized structure, the VHA is able to mandate certain activities, such as use of a nationwide computerized patient record system (CPRS) and a selection of indicators for performance improvement. Although the national leaders in Washington, DC, set general priorities for improvement, it is up to each VISN to determine how to implement these priority initiatives. For example, the VHA central office mandated use of the 0-10 scale for initial pain screening across all facilities, but allowed individual facilities to determine the content and format of a tool for comprehensive pain assessment.

The flexibility to customize initiatives within VISNs and facilities (e.g., by adding specific data elements) helps overcome bureaucratic delays that can occur with very large organizations. One potential drawback is that even within the VHA system, facilities may be using different forms for documentation and different approaches for pain assessment, treatment, and follow-up. The following description of the VHA pain management improvement initiative exemplifies this balance and interplay between national planning and VISN/facility-level implementation. After an overview of the national-level pain management initiatives, a description is provided of how two facilities within VISN 1 (VA New England Healthcare System) implemented one component of the national initiative: participation in a collaborative improvement project using the Institute for Healthcare Improvement (IHI) Breakthrough Series Model for Improvement.


We dedicate this section of the monograph to Margaret Berrio, RN, MS, who died unexpectedly in July 2002. Ms. Berrio was Quality Management Specialist/Management Information System Nursing Coordinator at the Boston VA Healthcare System and a key member of the VISN 1 Pain Management Quality Improvement Team. She played a pivotal role in helping to improve the pain care of veterans at several facilities and in the development of this section. Most importantly, Ms. Berrio was an inspiration to all who knew her. She was passionate about her work, had an unwavering commitment to and compassion for her patients, and a steadfast faith in the positive nature of the human condition. Her smile, sense of humor, and friendship encouraged us all and will not soon be forgotten. (R. Kerns)


The IHI Breakthrough Series model was designed to help organizations make rapid, measurable, and sustainable improvements in the specific focus areas within 9 to 15 months. In addition, this process was intended to help organizations build the capacity to improve beyond the time frame of the collaborative.148

1. Organizing the National Initiative

In the late 1990s, the organization of VHA anesthesiologists conducted a national survey that identified system-wide inconsistencies in the areas of pain assessment, access to treatment, and standards of practice. The group put forth a position paper that recommended development of a national coordinating strategy for pain management–related activities. In late 1998, Dr. Kenneth Kizer, then VA undersecretary for health, announced the vision for a national pain management strategy and established a National Pain Management Coordinating Committee (NPMCC). The NPMCC consisted of a multidisciplinary group of experts from the VHA and was co-chaired by Judith Salerno, MD, MS, and Tony Mitchell, MD. Jane Tollett, PhD, was appointed as program coordinator for the national strategy. The NPMCC was operationalized as several working groups, including pain screening and assessment, clinical guideline development, education, research, and outcome measurement. Each VISN designated a single person as the point of contact to serve as a liaison to the NPMCC and facilitate dissemination and implementation.

2. National Goals and Objectives

The primary goal of the national initiative was to provide a system-wide standard of care for pain management that reduces suffering from preventable pain. Several additional goals were specified:
■ Ensure that pain screening is performed in a consistent manner.
■ Ensure that pain treatment is prompt and appropriate.
■ Include patients and families as active participants in pain management.
■ Provide strategies for continual monitoring and improvement in pain management outcomes.
■ Provide for an interdisciplinary, multimodal approach to pain management.
■ Ensure that VHA clinicians are adequately prepared to assess and manage pain effectively.

3. Implementing Multiple Interventions

In the past 3 years, several major initiatives have been undertaken to improve pain management. Best known is the VHA's decision to consider documentation of pain as "the fifth vital sign." In early 1999, the VHA mandated that every patient in every encounter in every facility should be screened with a 0-10 numeric rating scale for the presence and intensity of pain. At the same time, an initial version of a toolkit, "Pain as the 5th Vital Sign," was published to facilitate national implementation of the mandate. Subsequently, a revised version of the toolkit was authorized, developed under the leadership of Dr. Robert Kerns (chair of the NPMCC toolkit working group and Chief, Psychology Services, VA Connecticut Healthcare System). The toolkit contains details and resource materials for implementing the mandate, including an introduction to comprehensive pain assessment and an extensive bibliography.149 The toolkit is accessible at www.vachronicpain.org.

To implement the pain screening mandate, the committee worked with information management staff to incorporate screening scores on the 0-10 scale into the vital signs component of the CPRS. The Outcomes Measurement Work Group is developing a template and reminder system for completing a comprehensive pain assessment (including duration, location, etc.) if the pain screen score is greater than 3 or if there is some other potential or obvious indication for pain treatment.

The Clinical Guideline Work Group, co-chaired by Jack Rosenberg, MD, of the Ann Arbor VAMC and Richard Rosenquist, MD, of the Iowa City VAMC, worked together with experts from the Department of Defense to develop clinical practice guidelines for the management of acute postoperative pain. The guidelines were accompanied by extensive toolkits, including posters, videotapes, reminder cards, and key articles, to facilitate implementation. Use of the guidelines and associated protocols was also mandated.

The VHA has dedicated substantial effort and resources to provider education; the provider education group was the largest working group of the NPMCC. Two national pain management–related conferences have been held, in conjunction with conferences addressing care at the end of life. Other forms of provider education included satellite broadcasts, Web-based educational programs, pain management list-servs, and extensive reference materials for VISN libraries.

4. Evaluating Effectiveness Through System-wide Measurement

The External Peer Review Program (EPRP) is one of several quality measurement systems for the VHA.150 The VHA contracts for external review of more than 200,000 records annually. Each month, the central office draws a random sample of records from each facility (inpatient, ambulatory, home health, etc.) based on patients who had at least one encounter (visit/admission) in the previous month and an encounter within the past 2 years. Immediately after completing the review, the external reviewers provide patient-specific feedback directly to the clinical staff, usually within days or weeks of the selected encounter, which facilitates rapid improvement on identified opportunities. The data are trended at the VISN level quarterly. The statistically meaningful national sample includes monthly review of all inpatients with diagnoses of acute myocardial infarction, congestive heart failure, and chronic obstructive pulmonary disease and biannual review of 5000 spinal cord injury patients. In the ambulatory setting, the monthly sample includes 1500 general primary care, 1500 mental health, 1500 diabetes, 1000 previous acute myocardial infarction, 1000 chronic obstructive pulmonary disease, and 1000 congestive heart failure patients (D. Walder, personal communication, August 1, 2002).

The EPRP has been highly successful in stimulating improvement in a wide variety of clinical areas, in part because performance on these indicators (monitors) is directly linked to the medical center director's performance evaluation. With established thresholds for each indicator and regular feedback of EPRP scores to individual facilities, the goal is clear. Therefore, all who render care are aware of the expectation for individual and organizational performance.

In 1999, the following pain management indicator was added for ambulatory primary care facilities and eight specialty clinics: the proportion of patient visits during the time period of interest that showed documentation of pain scores at least once within the last year. Results from October 1999 through September 2001 showed steadily improving performance (see Figure 13). Because the most recent scores were over 95%, this indicator was determined by the central office to be less useful for identifying opportunities for improvement. Thus, for fiscal year 2002, three new monitors are in development:
1. Percentage of patients with a documented pain screen on a scale of 0-10 on the most recent visit.
2. Percentage of patients with a pain score greater than 3 for whom an intervention was recorded.
3. Percentage of patients with a pain intervention recorded who had a follow-up assessment.

Since 1994, the VHA has conducted an annual national patient satisfaction survey that is sent to a stratified sample of 175 recently discharged patients and 175 outpatients from each facility.151 The inpatient satisfaction survey, which was modified from instruments developed by the Picker Institute of Boston, includes five questions related to pain management:
1. When you had pain, was it usually severe, moderate, or mild?
2. How many minutes after you asked for pain medicine did it usually take before you got it?
3. Do you think you would have had less pain if the hospital staff had acted faster?
4. Overall, how much pain medicine did you get?
5. Did you experience pain and not report it? If yes, why?

5. VHA-IHI Collaborative

The NPMCC, the IHI, and outside content experts such as Dr. Charles Cleeland, who served as chairman of the initiative, collaboratively planned the content for the program. The VHA provided financial support to allow 70 teams (more than 300 people) from various settings, including long-term care, behavioral health, and home care, to participate in the initiative over a 9-month period.

One of the first priorities of the VHA-IHI collaborative planning group was to develop a set of attributes of a successful pain management system (see Table 15). Like other topic areas in the Breakthrough Series initiatives (e.g., see Leape et al.152 and Wagner et al.153), the VHA-IHI pain management collaborative was built around the following principles and activities:
■ Explicit leadership support and commitment to change.
■ Establishment of organization-specific, measurable goals based on the agreed-upon theory for improvement.
■ Creation of teams that include persons with system leadership to institute change, persons with technical expertise, persons who can provide day-to-day leadership, and change champions.
■ Sharing knowledge and experience through three 2-day in-person learning sessions (an initial meeting, a meeting at 3 months, and a meeting at 9 months), with action periods and frequent exchange between meetings via e-mail groups.


Figure 13. Percentage of Records With Documented Pain Scores Within the Last Year: Veterans Health Administration, National External Peer Review Program. [Line graph of quarterly results from October 1999 through July 2001.] Source: R. Kerns. Used with permission.


■ Testing changes on a small scale before widespread implementation.
■ Rigorous but parsimonious measurement of a few key processes or outcomes, often using sampling.
■ Objective evaluation of team success by using monthly progress reports and open sharing of data on a secure Web-based bulletin board maintained by IHI.
■ Other educational services, including monthly educational conference calls moderated by experts in quality measurement and pain management, as well as access to Web sites with resource materials.

At a planning group meeting in January 2000, the following improvement goals were specified:
■ Pain scores should be reduced by 25% among patients experiencing moderate or severe pain.
■ One hundred percent of patients should have pain screening documented in their medical record.
■ The number of patients with pain scores of 4 or higher who have a documented plan of care should be increased by at least 20%.
■ At least 50% of patients should receive appropriate education.

Each VISN then decided how many and what types of facilities could participate in this initiative, as well as which clinical areas to focus on (e.g., postoperative pain, oncology, chronic pain). Each team was expected to select two or more of these goals for the pilot projects.

6. Implementing the IHI Initiative in Two New England Hospitals

Since 1999, VISN 1 (VA New England Healthcare System) has taken a leadership role in efforts to improve pain management within the VHA. VISN 1 consists of nine VA medical centers (VAMCs) and 40 ambulatory care clinics across six New England states. VISN 1 chose to focus on primary care pain management because the VHA as a whole has shifted emphasis from inpatient care to primary care.

The VISN 1 team selected the Togus, Maine, VAMC and the Providence, Rhode Island, VAMC as pilot sites for the IHI-VHA initiative. These two facilities were selected because their clinical leaders (one physician and one nurse practitioner) were primary care providers with a commitment to and a passion for improving pain management. Importantly, it was felt that these leaders could commit the necessary time to this initiative (i.e., their sites were considered fertile ground in which to sow the seeds of change).


Table 15. Attributes of a Successful Pain Management Program: VHA-IHI Collaborative Planning Group

1. Pain should be assessed in a routine and timely manner.
   • Pain should be documented as the 5th vital sign during every clinical encounter.
2. Patients should have access to an appropriate level of treatment.
   • A multidisciplinary team should be available for specialty consultations as needed.
3. Treatment protocols should be in place and understood by all staff.
   • Protocols should be available for the most common conditions.
   • Reminder systems should be used for adherence to protocols.
   • Protocols should contain expectations for the time between assessment of pain and action, and the response time to medication requests.
4. Providers should receive education through a variety of strategies.
   • Criteria for assessing the competency of providers also should be established.
5. A comprehensive plan for patient and family education should be in place.
6. Each organization should have standards of practice, some of which should be organization-wide and some of which may be discipline specific.
7. Each organization should have a plan for performance improvement that includes goals and key outcomes as well as a schedule for progress review.

IHI: Institute for Healthcare Improvement; VHA: Veterans Health Administration.
Source: R. Kerns. Used with permission.


Unlike other sites in VISN 1, these two sites were not distracted by competing initiatives such as organizational restructuring or technology upgrades.

The Togus VAMC, the only VA hospital in Maine, is the oldest facility in the Department of Veterans Affairs, having served veterans for 138 years. It has 67 acute medical, surgical, mental health, and intermediate-care beds, along with a 100-bed nursing home, half of which is devoted to residents with dementia. Togus's primary care program, together with five community-based outpatient centers across the state, serves approximately 25,000 veterans annually. The Providence VAMC provides comprehensive outpatient and inpatient health care to veterans residing in Rhode Island and southeastern Massachusetts. The ambulatory-care program of 32 subspecialty clinics is supported by a general medical and surgical inpatient facility currently operating 66 beds.

The VISN 1 team was composed of the following members by function: chair, VISN 1 pain management subcommittee, Robert Kerns, PhD (VA Connecticut Healthcare System); team leader, Fred Silverblatt, MD (Chief, Primary Care, Providence VAMC, Rhode Island); day-to-day leadership and coordination, Bonnie Lundquist, MS (Nurse Practitioner, Togus VAMC, Maine); reviewer support, Marcia Kelly, RN (Performance Management, Northampton VAMC, Massachusetts); patient education support, Elizabeth Fiscella, RN (Nurse Manager, Northampton VAMC, Massachusetts); clinical and technical expertise, Margaret Berrio, RN, MS (Quality Management/Management Information System Nursing Coordinator, VA Boston Healthcare System, Massachusetts); and management analyst, Genez Orejola, MHA, MT, Boston.

Because the VISN 1 initiative involved two sites (unlike most other initiatives, where the teams were drawn from a single site), the team had to modify the IHI model for certain team member functions and implementation strategies. Toward that end, the VISN 1 team undertook the key steps in the IHI initiative. The team developed a comprehensive assessment tool, solicited leadership involvement, and communicated through weekly team meetings by conference call.

Centralized support for data collection, database development, and analysis at VA Boston Jamaica Plain played an important role. Using the CPRS, Ms. Berrio initially generated weekly facility-specific reports to provide feedback on four rates: 1) the proportion of patients with pain scores noted among all patients seen; 2) the proportion of patients with a comprehensive assessment among all patients who had pain scores greater than 3; 3) the proportion of patients with a treatment plan among all patients who had pain scores greater than 3; and 4) the proportion of patients with documentation of patient education among all patients who had pain scores greater than 3. The results, along with questions and answers, were reported to all team members via a bulletin board list-serv so that members did not need to access the Web site directly.
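As a rough illustration of how rates of this kind can be derived from encounter-level data, the sketch below computes the four proportions from a hypothetical list of visit records. It is not the actual CPRS reporting logic; the record structure and field names are assumptions made for the example, and only the score threshold of 3 comes from the description above.

```python
# Illustrative sketch only, not the actual CPRS reporting code: computes the
# four weekly feedback rates described above from hypothetical visit records.
# The field names below are assumptions made for the example.

visits = [
    {"pain_score": 0, "comprehensive_assessment": False,
     "treatment_plan": False, "education_documented": False},
    {"pain_score": 6, "comprehensive_assessment": True,
     "treatment_plan": True, "education_documented": False},
    {"pain_score": None, "comprehensive_assessment": False,
     "treatment_plan": False, "education_documented": False},  # no pain score noted
    {"pain_score": 8, "comprehensive_assessment": True,
     "treatment_plan": True, "education_documented": True},
]

def pct(numerator, denominator):
    """Percentage, or None when the denominator is empty."""
    return round(100.0 * numerator / denominator, 1) if denominator else None

screened = [v for v in visits if v["pain_score"] is not None]
in_pain = [v for v in screened if v["pain_score"] > 3]

report = {
    "pain score noted, all patients seen": pct(len(screened), len(visits)),
    "comprehensive assessment, score > 3": pct(
        sum(v["comprehensive_assessment"] for v in in_pain), len(in_pain)),
    "treatment plan, score > 3": pct(
        sum(v["treatment_plan"] for v in in_pain), len(in_pain)),
    "patient education documented, score > 3": pct(
        sum(v["education_documented"] for v in in_pain), len(in_pain)),
}
print(report)
# {'pain score noted, all patients seen': 75.0, 'comprehensive assessment, score > 3': 100.0,
#  'treatment plan, score > 3': 100.0, 'patient education documented, score > 3': 50.0}
```

Rerunning a calculation like this on each week's visits produces the kind of weekly facility-specific feedback described here.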

Both hospitals implemented a variety of interventions to achieve the pilot project goals. These interventions included:
■ Simplifying and increasing the font size of the patient comprehensive pain self-assessment tool based on feedback from users (patients and staff).
■ Implementing a template that incorporated functional assessment information for clinicians to use when entering comprehensive assessment information in the CPRS.
■ Educating patients through distribution of the Channing-Bete booklet, conducting an ongoing lecture series, posting material on clinic bulletin boards, and providing a detailed pain chart in every exam room.
■ Educating primary care providers, with a focus on review of pain management guidelines and training in the use of the comprehensive assessment tool through lectures and workshops.

7. Operational Strategies

Despite similar interventions, there were hospital-specific differences in the approaches used to screen and educate patients. For example, the Togus VAMC targeted the primary care clinics run by two providers (one physician and one nurse practitioner). The support staff for these clinics consisted of a patient care associate, a nurse, and a clerk. After receiving education on use of the pain screening scale, the group recognized the need to allow additional time for patient assessment and education. The clerk creatively designed bright yellow stickers to attach to clinic appointment letters, asking patients in the pilot clinics to arrive 15 minutes early. Upon arrival, the patient care associate conducted the pain screening. If the pain score was greater than 3, the patient care associate gave the comprehensive self-assessment tool and the educational booklet to the patient and notified the nurse. The nurse completed the comprehensive assessment with the patient and discussed the contents of the booklet. The nurse entered information regarding patient education into the template progress note in the CPRS. To identify which patients had completed the assessment and education, the nurse highlighted a note on top of the chart. The provider developed the pain treatment plan and ordered telephone follow-up by the nurse at 6 weeks to assess the effectiveness of the treatment plan and review patient education. Providers were notified of the results of the calls.

At the Providence VAMC, the medical primary care clinic was targeted. The patient care associate conducted the initial pain screening at the time of the visit. If the pain score was greater than 3, the patient care associate gave the educational booklet and comprehensive assessment tool to the patient. The patient completed the comprehensive assessment tool, and the primary care provider reviewed it with the patient during the physical exam. The provider completed the assessment, provided the education, and arranged the telephone follow-up. The approach at the Providence VAMC excluded the role of the nurse as intermediary between patient and provider.

The effectiveness of providing weekly facility-specific feedback is demonstrated in Figure 14, which shows evidence of substantial improvement on two indicators within 6 weeks of project initiation. The variation in performance between the two sites may have been due to the different steps and personnel involved in operationalizing the assessment and the education processes.

Both medical centers have implemented several other pain management improvement activities not described here. For example, the Togus VAMC has created a full-time pain management coordinator position. After completing the IHI initiative, both hospitals expanded the interventions and data collection activities to include all primary care clinics and community-based outpatient centers, as well as the appropriate medical and geriatric clinics. Other sites within VISN 1 whose staff were part of the team have implemented activities similar to the IHI process in their own facilities. Additional information can be obtained by contacting the team members directly at their respective facilities.

8. Sharing Success and Next Steps

Over time, the VHA and several of its facilities have become recognized as a model of excellence in the pain management field. There have been several presentations and symposia led by NPMCC members at American Pain Society, American Geriatrics Society, and New England Pain Society annual meetings. Several articles have been published, including those by Kerns154 and Montrey.155 A report on the aggregate results of the experience of the 70 VHA teams is forthcoming.156

Next steps at the national level include the development of new guidelines related to the use of opioids in patients with chronic pain. Additionally, the Outcome Measures Work Group is developing a toolkit of pain management–related measures and resources designed to guide staff in quality improvement efforts. The VHA Health Services Research and Development Service established pain as a priority area for research funding in 2002. The NPMCC Pharmacy Work Group (chaired by Charles Sintek, MS, RPh, Denver VAMC) has implemented a Web-based learning system on opioid use for continuing education credits. Finally, the VHA has provided funding for new pain management fellowships in anesthesiology, psychiatry, neurology, and rehabilitation medicine. Additional information about VHA pain management initiatives is available at the Web site www.vachronicpain.org.

Figure 14. Performance on Two Indicators Within 6 Weeks of Initiation of the Veterans Health Administration/Institute for Healthcare Improvement Pain Management Collaborative at the Togus, Maine, and Providence, Rhode Island, Veterans Affairs Medical Centers. [Two line graphs, one tracking the percentage of primary care patients with pain scores and one tracking the percentage of primary care patients with moderate pain, plotted weekly for each site from 6/1 through 7/11.] Source: M. Berrio, VA Boston Health Care System. Used with permission.

9. Summary: Key Lessons

The following lessons learned from participation in the VHA-IHI collaborative should be generally applicable to organizations striving to promote change in existing processes or practices:


■ Build the initiative step by step; show success in one small location; plant the seeds and allow them to take root.
■ Use facility-based pain champions and role models; work with those who will work with you.
■ Involve top administration personnel from the beginning.
■ Implement change when you are not likely to be confronted by conflicting priorities such as major reorganization.
■ Give regular feedback to all involved; data were critical to making the VHA-IHI initiative a success.
■ Make the changes easy for those involved; integrate the changes into daily practice.

■ Publicize successes.

Appendix A: A Selection of Resources Related to Pain Management and Improvement

Agency for Healthcare Research and Quality, 2101 E. Jefferson Street, Suite 501, Rockville, MD 20852; 301-594-1364; www.ahrq.gov
American Academy of Orofacial Pain, 19 Mantua Road, Mount Royal, NJ 08061; 856-423-3629; www.aaop.org
American Academy of Pain Medicine, 4700 West Lake Avenue, Glenview, IL 60025; 847-375-4731; www.painmed.org
American Academy of Pediatrics, 141 Northwest Point Boulevard, Elk Grove Village, IL 60007; 847-434-4000; www.aap.org
American Alliance of Cancer Pain Initiatives, 1300 University Avenue, Room 4720, Madison, WI 53706; 608-265-4012; www.aacpi.org
American Council for Headache Education (ACHE), 19 Mantua Road, Mount Royal, NJ 08061; 856-423-0258; www.achenet.org
American Cancer Society, 1599 Clifton Road, NE, Atlanta, GA 30329-4251; 800-ACS-2345; www.cancer.org
American Chronic Pain Association, P.O. Box 850, Rocklin, CA 95677; 800-533-3231; www.theacpa.org
American College of Rheumatology, 1800 Century Place, Suite 250, Atlanta, GA 30345; 404-633-3777; www.rheumatology.org
American Geriatrics Society, The Empire State Building, 350 Fifth Avenue, Suite 801, New York, NY 10118; 212-308-1414; www.americangeriatrics.org
American Medical Directors Association, 10480 Little Patuxent Parkway, Suite 760, Columbia, MD 21044; 800-876-2632; www.amda.com
American Pain Foundation, 201 North Charles Street, Suite 710, Baltimore, MD 21201-4111; 888-615-7246; www.painfoundation.org
American Pain Society, 4700 West Lake Avenue, Glenview, IL 60025; 847-375-4715; www.ampainsoc.org
American Society of Addiction Medicine, 4601 North Park Avenue, Arcade Suite 101, Chevy Chase, MD 20815; 301-656-3920; www.asam.org
American Society of Anesthesiologists, 520 North Northwest Highway, Park Ridge, IL 60068-2573; 847-825-5586; www.asahq.org
American Society of Clinical Oncology, 1900 Duke Street, Suite 200, Alexandria, VA 22314; 703-299-0150; www.asco.org
American Society of Consultant Pharmacists, 1321 Duke Street, Alexandria, VA 22314; 703-739-1300; www.ascp.com
The American Society of Law, Medicine & Ethics, 765 Commonwealth Avenue, Suite 1634, Boston, MA 02215; 617-262-4990; www.aslme.org


American Society of Pain Management Nurses, 7794 Grow Drive, Pensacola, FL 32514; 888-342-7766; www.aspmn.org
American Society of Regional Anesthesia & Pain Medicine, P.O. Box 11086, Richmond, VA 23230-1086; 804-282-0010; www.asra.com
The Rehabilitation Accreditation Commission (CARF), 4891 East Grant Road, Tucson, AZ 85712; 520-325-1044; www.carf.org
Center to Advance Palliative Care, The Mount Sinai School of Medicine, 1255 5th Avenue, Suite C-2, New York, NY 10029-6574; 212-201-2767
City of Hope, Pain/Palliative Care Resource Center, 1500 E Duarte Road, Duarte, CA 91010; 626-359-8111; www.cityofhope.org/prc/web
Federation of State Medical Boards of the United States, Inc., P.O. Box 619850, Dallas, TX 75261-9741; 817-868-4000; www.fsmb.org
Hospice & Palliative Nurses Association, Penn Center West One, Suite 229, Pittsburgh, PA 15276; 412-787-9301; www.hpna.org
Institute for Healthcare Improvement, 375 Longwood Avenue, 4th Floor, Boston, MA 02215; 617-754-4800; www.ihi.org
International Association for the Study of Pain, 909 Northeast 43rd Street, Suite 306, Seattle, WA 98105-6020; 206-547-6409; www.iasp-pain.org
Joint Commission on Accreditation of Healthcare Organizations, One Renaissance Boulevard, Oakbrook Terrace, IL 60181; 630-792-5000; www.jcaho.org
Joint Commission Resources, Inc., One Lincoln Center, Suite 1340, Oakbrook Terrace, IL 60181; 630-268-7400; www.jcrinc.com
The Mayday Pain Project, www.painandhealth.org
Medical College of Wisconsin, Palliative Medicine, 9200 West Wisconsin Avenue, Milwaukee, WI 53226; 414-805-4605; www.mcw.edu/pallmed
National Chronic Pain Outreach Association, 7979 Old Georgetown Road, Suite 100, Bethesda, MD 20814-2429; 301-652-4948; neurosurgery.mgh.harvard.edu/ncpainoa.htm
The National Guideline Clearinghouse, [email protected]
National Headache Foundation, 428 West Saint James Place, 2nd Floor, Chicago, IL 60614-2754; 888-NHF-5552; www.headaches.org
National Hospice and Palliative Care Organization, 1700 Diagonal Road, Suite 300, Alexandria, VA 22314; 703-837-1500; www.nhpo.org


Oncology Nursing Society, 501 Holiday Drive, Pittsburgh, PA 15220; 412-921-7373; www.ons.org
Pain and Policy Studies Group (PPSG), University of Wisconsin-Madison, 406 Science Drive, Suite 202, Madison, WI 53711-1068; 608-263-7662; www.medsch.wisc.edu/painpolicy
The Resource Center of the American Alliance of Cancer Pain Initiatives, 1300 University Avenue, Rm 4720, Madison, WI 53706; 608-262-0978; www.wiscinfo.doit.wisc.edu/trc
Sickle Cell Information Center, P.O. Box 109, Grady Memorial Hospital, 80 Jesse Hill Jr Drive Southeast, Atlanta, GA 30303; 404-616-3572; www.emory.edu/PEDS/SICKLE
Toolkit of Instruments to Measure End-of-Life Care, Dr. Joan Teno, Center for Gerontology and Health Care Research, Brown Medical School, P.O. Box G-HLL, Providence, RI 02912; www.chcr.brown.edu/pcoc/toolkit.htm
Wisconsin Cancer Pain Initiative, www.wisc.edu/wcpi
World Health Organization, 20 Avenue Appia, CH-1211 Geneva 27, Switzerland; +41-22-791-3634; www.who.int
Worldwide Congress on Pain, Dannemiller Memorial Education Foundation, 12500 Network Blvd., Suite 101, San Antonio, TX 78249; www.pain.com

PRECEPTORSHIPS, OBSERVERSHIPS, FELLOWSHIPS

Beth Israel Deaconess Medical Center, 330 Brookline Avenue, Boston, MA 02215; 617-667-5558
Mercy Medical Center, Pain Management Services, 1111 Sixth Avenue, Des Moines, IA 50314-1101; 515-247-3172 or 515-247-3239
Memorial Sloan-Kettering Cancer Center, Nurse Fellowship in Pain and Palliative Care, Department of Pain and Palliative Care, 1275 York Avenue, New York, NY 10021; 212-639-2662


Appendix B: A Plan-Do-Check-Act Worksheet


1. What are we trying to accomplish?
Some questions to consider: (a) What is our aim? (b) What need is tied to that aim? What exactly do we know about that need? (c) What is the process we are working on? (d) What is the link between our aim and this process? Does this process offer us the most leverage for work in support of our aim? (e) What background information do we have available about this improvement (customer, other)?

2. How will we know that change is an improvement?
Some questions to consider: (a) Who are the customers? What would constitute improvement in their eyes? (b) What is the output of the process? (c) How does the process work? (d) How does the process currently vary?

3. What changes can we make that we predict will lead to improvement?
Some questions to consider: (a) Reflecting on the process described, are there ideas for improvement that come readily to mind? (b) Would a simple decision matrix help you decide which to work on first? (c) Are all our decision criteria worded to make their scoring in the same direction? (d) For the change we'd like to try, what is our prediction? (e) What questions do we have about the change, the process, and our prediction? What will you need to check? Do these questions help us link to the overall aim and need?


4. How shall we PLAN the pilot?
Some questions to consider: (a) Who is going to do What, by When, Where, and How? (b) Is the "owner" of the process involved? (c) How shall we measure to answer our questions, to confirm or reject our prediction?

5. What are we learning as we DO the pilot?
Some questions to consider: (a) What have we learned from our planned pilot and collection of information? (b) What have we learned from the unplanned information we collected? (c) Was the pilot congruent with the plan?

6. As we CHECK and study what happened, what have we learned?
Some questions to consider: (a) Was our prediction correct? (b) Did the pilot work better for all types of customers, or just some of them? (c) What did we learn about planning the next change?


7. As we ACT to hold the gains or abandon our pilot efforts, what needs to be done?
Some questions to consider: (a) What should be standardized? (b) What training should be considered to provide continuity? (c) How should continued monitoring be undertaken? (d) If the pilot efforts should be abandoned, what has been learned?

8. Looking back over the whole pilot, what have we learned?
Some questions to consider: (a) What was learned that we expected to learn? (b) What unanticipated things did we learn? (c) What did we learn about our predictive ability? (d) Who might be interested in learning what we've learned?

Source: Batalden PB, Stoltz PK. A framework for the continual improvement of health care: building and applying professional and improvement knowledge to test changes in daily work. Joint Commission Journal on Quality Improvement. 1993;19(10):446-447.43 Worksheet developed with the help of Tom Nolan, PhD, of Associates in Process Improvement. © 1992 HCA Quality Resource Group, June 1993. Used with permission.


References

1. Cleeland CS, Gonin R, Hatfield AK, et al. Pain and its treatment in outpatients with metastatic cancer. N Engl J Med. 1994;330(9):592-596.

2. Wolfe J, Grier HE, Klar N, et al. Symptoms and suffering at the end of life in children with cancer. N Engl J Med. 2000;342:326-333.

3. Teno JM, Weitzen S, Wetle T, Mor V. Persistent pain in nursing home residents [research letter]. JAMA. 2001;285(16):2081.

4. Donovan M, Dillon P, McGuire L. Incidence and characteristics of pain in a sample of medical-surgical inpatients. Pain. 1987;30:69-78.

5. Bernabei R, Gambassi G, Lapane K, et al. Management of pain in elderly patients with cancer. JAMA. 1998;279(23):1877-1882.

6. Breitbart W, Rosenfield BD, Passik SD, et al. The undertreatment of pain in ambulatory AIDS patients. Pain. 1996;65:243-249.

7. Doctor guilty of elder abuse for undertreating pain. American Medical News. July 23, 2001:1,4.

8. Carr DB, Goudas LC. Acute pain. Lancet. 1999;353:2051-2058.

9. National Pharmaceutical Council, Inc. Pain: Current Understanding of Assessment, Management, and Treatments. Reston, VA: National Pharmaceutical Council; 2001.

10. Ashburn MA. The role of APS in the decade of pain control and research [President's Message]. APS Bull. May/June 2001;11(3). Available at: http://www.ampainsoc.org/pub/bulletin/may01/pres1.htm. Accessed July 12, 2001.

11. Joint Commission on Accreditation of Healthcare Organizations. The Measurement Mandate: On the Road to Performance Improvement in Health Care. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1993.

12. Eddy DM. Performance measurement: problems and solutions. Health Aff. 1998;17:7-25.

13. Casalino LP. The unintended consequences of measuring quality on the quality of medical care. N Engl J Med. 1999;341:1147-1150.

14. Berwick DM. Crossing the boundary: changing mental models in service of improvement. Int J Qual Health Care. 1998;10(5):435-441.

15. Field MJ, Lohr KN, eds, for the Committee to Advise the Public Health Service on Clinical Practice Guidelines, Institute of Medicine. Clinical Practice Guidelines: Directions for a New Program. Washington, DC: National Academy Press; 1990.

16. Joint Commission on Accreditation of Healthcare Organizations. Lexicon: A Dictionary of Health Care Terms, Organizations, and Acronyms. 2nd ed. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1998.

17. Joint Commission on Accreditation of Healthcare Organizations. Comprehensive Accreditation Manual for Hospitals. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2001.

18. American Pain Society Quality of Care Committee. Quality improvement guidelines for the treatment of acute pain and cancer pain. Consensus statement. JAMA. 1995;274:1874-1880.

19. Shekelle PG, Ortiz E, Rhodes S, et al. Validity of the Agency for Healthcare Research and Quality clinical practice guidelines. JAMA. 2001;286:1461-1511.

20. Pain and Policy Studies Group, University of Wisconsin Comprehensive Cancer Center. Glossary of terms. Available at: http://www.medsch.wisc.edu/painpolicy/glossary.htm. Accessed February 4, 2002.

21. Langley GJ, Nolan KM, Nolan TW. The foundation of improvement. Qual Prog. June 1994:81-86.

22. Labovitz G, Rosansky V. The Power of Alignment: How Great Companies Stay Centered and Accomplish Extraordinary Things. New York: John Wiley and Sons; 1997.

23. Leape LL, Bates DW, Cullen DJ, et al. Systems analysis of adverse drug events. JAMA. 1995;274:35-43.

24. Cohen MR, ed. Medication Errors. Washington, DC: American Pharmaceutical Association; 1999.

25. Leape LL. A systems analysis approach to medical error. J Eval Clin Pract. 1997;3(3):213-222.

26. Gordon DB, Dahl JL, Stevenson KK, eds. Building an Institutional Commitment to Pain Management. The Wisconsin Resource Manual. 2nd ed. Madison, WI: The Resource Center of the American Alliance of Cancer Pain Initiatives; 2000.

27. McCaffery M, Pasero C, eds. Pain Clinical Manual. 2nd ed. St Louis, MO: Mosby; 1999.

28. Blau WS, Dalton JB, Lindley C. Organization of hospital-based acute pain management programs. South Med J. 1999;92:465-471.

29. Massoud RK, Reinke AJ, Franco LM, et al. A Modern Paradigm for Improving Healthcare Quality. Bethesda, MD: The Quality Assurance Project, U.S. Agency for International Development; 2001. QA Monograph Series 1(1).

30. Joint Commission on Accreditation of Healthcare Organizations. Pain Assessment and Management: An Organizational Approach. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2000.

31. Ferrell BR, Whedon M, Rollins B. Pain and quality assessment/improvement. J Nurs Care Qual. 1995;9:69-85.

32. Bookbinder M, Coyle N, Kiss M, et al. Implementing national standards for cancer pain management: program model evaluation. J Pain Symptom Manage. 1996;12:334-347.

33. Duggleby W, Alden C. Implementation and evaluation of a quality improvement process to improve pain management in a hospice setting. Am J Hospice Palliative Care. July/August 1998:209-216.

34. Sayers M, Marando R, Fisher S, et al. No need for pain. J Healthcare Qual. 2000;22(3):851-861.

35. Pasero C, McCaffery M, Gordon D. Build institutional commitment to improving pain management. Nurs Manage. 1999;30:27-34.

36. Miaskowski C, Nichols R, Brody R, et al. Assessment of patient satisfaction utilizing the American Pain Society's quality assurance standards on acute and cancer-related pain. J Pain Symptom Manage. 1994;9:5-11.

37. Kelly AM. Patient satisfaction with pain management does not correlate with initial or discharge VAS pain scores, verbal pain rating at discharge, or change in VAS score in the emergency department. J Emerg Med. 2000;19:113-116.

38. Corizzo CC, Baker MC, Henkelmann GC. Assessment of patient satisfaction with pain management in small community inpatient and outpatient settings. Oncol Nurs Forum. 2000;27:1279-1286.


39. Lin C. Applying the American Pain Society's QA standards to evaluate the quality of pain management among surgical, oncology, and hospice inpatients in Taiwan. Pain. 2000;87:43-49.

40. Ward S, Gordon D. Application of the American Pain Society quality assurance standards. Pain. 1994;56(3):299-306.

41. Dawson R, Spross JA, Jablonski ES, et al. Probing the paradox of patients' satisfaction with inadequate pain management. J Pain Symptom Manage. 2002;23(3):211-220.

42. Devine EC, Bevsek SA, Brubakken K, et al. AHCPR clinical practice guideline on surgical pain management: adoption and outcomes. Res Nurs Health. 1999;22:119-130.

43. Batalden PB, Stoltz PK. A framework for the continual improvement of health care: building and applying professional and improvement knowledge to test changes in daily work. Jt Comm J Qual Improv. 1993;19(10):446-447.

44. Joint Commission on Accreditation of Healthcare Organizations. Framework for Improving Performance: From Principle to Practice. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1994.

45. Engaging physicians in the performance improvement process [Quality Corner]. JCAHO Perspectives. July 2001.

46. Miller EH, Belgrade MJ, Cook M, et al. Institution-wide pain management improvement through the use of evidence-based content, strategies, resources, and outcomes. Qual Manage Health Care. 1999;7:28-40.

47. Joint Commission Resources, Inc, Joint Commission on Accreditation of Healthcare Organizations. Managing Performance Measurement Data in Health Care. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2001.

48. Starck PL, Adams J, Sherwood G, et al. Development of a pain management report card for an acute care setting. Adv Pract Nurs Q. Fall 1997;3(2):57-63.

49. Joint Commission on Accreditation of Healthcare Organizations. Examples of Compliance: Pain Assessment and Management. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2002.

50. McGlynn EA, Asch SM. Developing a clinical performance measure. Am J Prev Med. 1998;14:14-20.

51. Scheaffer RL, Mendenhall W, Ott L. Elementary Survey Sampling. 3rd ed. Boston, MA: Duxbury Press; 1986.

52. Fowler FJ. Survey Research Methods. 2nd ed. Applied Social Research Methods Series, Vol 1. Newbury Park, CA: Sage Publications; 1993.

53. Joint Commission on Accreditation of Healthcare Organizations. A Guide to Performance Measurement for Hospitals. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2000.

54. Gordon D, Pellino T. Medical Record Audit Form. In: Gordon DB, Dahl JL, Stevenson KK, eds. Building an Institutional Commitment to Pain Management. The Wisconsin Resource Manual. 2nd ed. Madison, WI: The Resource Center of the American Alliance of Cancer Pain Initiatives; 2000.

55. Melzack R. The McGill Pain Questionnaire: major properties and scoring methods. Pain. 1975;1(3):277-299.

56. Daut RL, Cleeland CS, Flanery RC. Development of the Wisconsin Brief Pain Questionnaire to assess pain in cancer and other diseases. Pain. 1983;17:197-210.

57. Cleeland CS, Ryan KM. Pain assessment: global use of the Brief Pain Inventory. Pain Research Group, University of Wisconsin-Madison. Ann Acad Med Singapore. 1994;23(2):129-138.

57a. Ware JE. Future directions in health status assessment. J Clin Outcomes Manage. 1999;6(3):34-37.

58. Lohr KN, Aaronson NK, Alonso J, et al. Evaluating quality-of-life and health status instruments: development of scientific review criteria. Clin Ther. 1996;18:979-986.

59. Flaherty SA. Pain measurement tools for clinical practice and research. AANA J. 1996;64(2):133-140.

60. Beyer JE, McGrath PJ, Berde CB. Discordance between self-report and behavioral pain measures in children aged 3-7 years after surgery. J Pain Symptom Manage. 1990;5:350-356.

61. Wong D, Baker C. Pain in children: comparison of assessment scales. Pediatr Nurs. 1988;14:9-17.

62. Wong D, Baker C. Reference Manual for the Wong-Baker FACES Pain Rating Scale. Tulsa, OK: Wong & Baker; 1995.

63. McGrath PJ, Johnson G, Goodman JT, et al. CHEOPS: a behavioral scale for rating postoperative pain in children. In: Fields HL, Dubner R, Cervero F, eds. Advances in Pain Research and Therapy. New York: Raven Press; 1985:395-402.

64. Giuffre M, Keanne A, Hatfield FM, et al. Patient controlled analgesia in clinical pain research management. Nurs Res. 1988;37(4):254-255.

65. Serlin RC, Mendoza TR, Nakamura Y, et al. When is cancer pain mild, moderate or severe? Grading pain severity by its interference with function. Pain. 1995;61:277-284.

66. Carey SJ, Turpin C, Smith J, et al. Improving pain management in an acute care setting. Orthop Nurs. 1997;16:29-36.

67. Loeb JM, Braun BI, Hanold LS, et al. Measuring and monitoring for performance improvement. In: Linskey ME, Rutigliano MJ, eds. Quality and Cost in Neurologic Surgery. Philadelphia, PA: Lippincott Williams and Wilkins; 2001.

68. Gordon D, Stevenson K. Institutional needs assessment. Wisconsin Pain Initiative. In: Gordon DB, Dahl JL, Stevenson KK, eds. Building an Institutional Commitment to Pain Management. The Wisconsin Resource Manual. 2nd ed. Madison, WI: The Resource Center of the American Alliance of Cancer Pain Initiatives; 2000.

69. Joint Commission on Accreditation of Healthcare Organizations. Medical Record Review. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1998.

70. Ferrell BR, Wisdom C, Rhiner M, et al. Pain management as a quality of care outcome. J Nurs Qual Assur. 1991;5:50-58.

71. Veterans Affairs Western New York Healthcare System, Amherst, New York. Patient Record Pain Management Assessment Tool. In: Gordon DB, Dahl JL, Stevenson KK, eds. Building an Institutional Commitment to Pain Management: The Wisconsin Resource Manual. 2nd ed. Madison, WI: The Resource Center of the American Alliance of Cancer Pain Initiatives; 2000.

72. Wallace KG, Graham KM, Bacon DR, et al. The development and testing of the patient record pain management assessment tool. J Nurs Care Qual. 1999;13:34-46.

73. Gordon D, Schroeder S. RN needs assessment: pain management. University of Wisconsin hospitals and clinics. In: Gordon DB, Dahl JL, Stevenson KK, eds. Building an Institutional Commitment to Pain Management. The Wisconsin Resource Manual. 2nd ed. Madison, WI: The Resource Center of the American Alliance of Cancer Pain Initiatives; 2000.

74. McCaffery M, Ferrell BR. Nurses' knowledge of pain assessment and management: how much progress have we made? J Pain Symptom Manage. 1997;14:175-188.

75. Barker KN. Data collection techniques: observation. Am J Hosp Pharm. 1980;37:1235-1243.

76. Donabedian A. Reflections on the effectiveness of quality assurance. In: Palmer RH, Donabedian A, Povar GJ, eds. Striving for Quality in Health Care. An Inquiry Into Policy and Practice. Ann Arbor, MI: Health Administration Press; 1991:59-128.

77. Acute Pain Management in the Elderly. Rev ed. Iowa City, IA: University of Iowa Gerontological Nursing Interventions Research Center, Research Dissemination Core; 1999.

78. Kramer MS. Clinical Epidemiology and Biostatistics. A Primer for Clinical Investigators and Decision-Makers. New York: Springer-Verlag; 1988:25-27.

79. MacMahon B, Pugh T. Epidemiology. Boston, MA: Little, Brown & Company; 1970:60-62.

80. Ward S, Donovan M, Max MB. A survey of the nature and perceived impact of quality improvement activities in pain management. J Pain Symptom Manage. 1998;15:365-373.

81. Super A. Improving pain management practice. A medical center moves beyond education to document and manage patient care. Health Prog. 1996;77:50-54.

82. Fries BE, Simon SE, Morris JN, et al. Pain in U.S. nursing homes: validating a pain scale for the minimum data set. Gerontologist. 2001;41:173-179.

83. Turk D. Efficacy of multidisciplinary pain clinics in treatment of chronic pain. In: Cohen MJ, Campbell JN, eds. Pain Treatment Centers at a Crossroad: A Practical and Conceptual Reappraisal: The Bristol Myers Squibb Symposium on Pain Research. Seattle, WA: IASP Press; 1996:257-273.

84. Friedlieb OP. The impact of managed care on the diagnosis and treatment of low back pain: a preliminary report. Am J Med Qual. 1994;9(1):24-29.

85. Lande SD, Loeser JD. The future of pain management in managed care. Manag Care Interface. 2001;14(5):69-75.

86. American Medical Association, Joint Commission on Accreditation of Healthcare Organizations, and National Committee for Quality Assurance. Coordinated Performance Measurement for the Management of Adult Diabetes. Consensus statement. Chicago, IL: AMA; April 2001.

87. McGlynn EA. Choosing and evaluating clinical performance measures. Jt Comm J Qual Improv. 1998;24:470-479.

88. Nelson EC, Mohr JJ, Batalden PB, et al. Improving health care, part 1: the clinical value compass. Jt Comm J Qual Improv. 1996;22(4):243-257.

89. Nelson EC, Batalden PB, Plume SK, et al. Report cards or instrument panels: who needs what? Jt Comm J Qual Improv. 1995;21(4):155-166.

90. Nugent WC, Schults WC, Plume SK, et al. Designing an instrument panel to monitor and improve coronary artery bypass grafting. J Clin Outcomes Manage. 1994;1:57-64.

91. Agency for Health Care Policy and Research. Building and Applying a Guideline-Based Performance Measurement System. Vol 1. Rockville, MD: US Public Health Service, Agency for Health Care Policy and Research; October 1995. AHCPR publication 95-N013.

92. Agency for Health Care Policy and Research. Building and Applying a Guideline-Based Performance Measurement System. Vol 2. Rockville, MD: US Public Health Service, Agency for Health Care Policy and Research; October 1995. AHCPR publication 95-N014.

93. Donabedian A. Institutional and professional responsibilities in quality assurance. Qual Assur Health Care. 1989;1:3-11.

94. Joint Commission on Accreditation of Healthcare Organizations. Using Quality Improvement Tools in a Health Care Setting. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1992.

95. Scholtes PR. The Team Handbook. How to Use Teams to Improve Quality. Madison, WI: Joiner Associates Inc; 1988.

96. Joint Commission on Accreditation of Healthcare Organizations. Using Performance Improvement Tools in a Health Care Setting. Rev ed. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1996.

97. Berwick D, Godfrey AB, Roessner J. Curing Health Care: New Strategies for Quality Improvement. San Francisco, CA: Jossey-Bass; 1990.

98. Iezzoni LI, ed. Risk Adjustment for Measuring Healthcare Outcomes. Chicago, IL: Health Administration Press; 1997.

99. Joint Commission on Accreditation of Healthcare Organizations. Tools for Performance Measurement in Health Care: A Quick Reference Guide. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2002.

100. Carey RG, Lloyd RC. Measuring quality improvement in healthcare: a guide to statistical process control applications. In: Quality Resources. Milwaukee, WI: ASQ Press; 2001.

101. Wheeler DJ, Chambers DS. Understanding Statistical Process Control. Knoxville, TN: SPC Press; 1992.

102. Grant EL, Leavenworth RS. Statistical Quality Control. New York: McGraw-Hill Book Company; 1988.

103. Sellick JA Jr. The use of statistical process control charts in hospital epidemiology. Infect Control Hosp Epidemiol. 1993;14(11):649-656.

104. O'Leary MR. Clinical Performance Data: A Guide to Interpretation. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 1996.

105. Ware JH, Mosteller F, Ingelfinger JA. P values. In: Bailar JC, Mosteller F, eds. Medical Uses of Statistics. Waltham, MA: The Massachusetts Medical Society; 1986: chapter 8.

106. Plymale MA, Sloan PA, Johnson M, et al. Cancer pain education: a structured clinical instruction module for hospice nurses. Cancer Nurs. 2000;24(6):424-429.

107. Brown ST, Bowman JM, Eason FR. Assessment of nurses' attitude and knowledge regarding pain management. J Continuing Educ Nurs. 1999;30:132-139.

108. Lebovits AH, Florence I, Bathina R, et al. Pain knowledge and attitudes of healthcare providers: practice characteristic differences. Clin J Pain. 1997;13:237-243.

109. Sloan PA, Plymale MA, Johnson M, et al. Cancer pain management skills among medical students: the development of a cancer pain objective structured clinical examination. J Pain Symptom Manage. 2001;21:298-306.

110. Glajchen M, Bookbinder M. Knowledge and perceived competence of home care nurses in pain management: a national survey. J Pain Symptom Manage. 2001;21:307-316.

111. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med. 1998;128(8):651-656.

112. Gordon DB, Jones HD, Goshman LM, et al. A quality improvement approach to reducing use of meperidine. Jt Comm J Qual Improv. 2000;26:686-699.

113. Collins PM. Improving pain management in your health care organization. J Nurs Care Qual. 1999;13:73-82.

114. Stratton L. Evaluating the effectiveness of a hospital's pain management program. J Nurs Care Qual. 1999;13:8-18.

115. Weissman DE, Griffie J, Muchka S, et al. Building an institutional commitment to pain management in long-term care facilities. J Pain Symptom Manage. 2000;20:35-43.

116. Stroud MW, Thorn BE, Jensen MP, et al. The relationship between pain beliefs, negative thoughts, and psychosocial functioning in chronic pain patients. Pain. 2000;84:347-352.

117. Ferrell BR, Dean GE, Grant M, et al. An institutional commitment to pain management. J Clin Oncol. 1995;13:2158-2165.

118. Miaskowski C. The impact of age on a patient's perception of pain and ways it can be managed. Pain Manage Nurs. 2000;1(3 suppl 1):2-7.

119. Warfield CA, Kahn CH. Acute pain management. Programs in U.S. hospitals and experiences and attitudes among U.S. adults. Anesthesiology. 1995;83:1090-1094.

120. Carr E. Structural barriers to pain control. Nurs Times. 1997;93:50-51.

121. Ersek M, Kraybill BM, Pen AD. Factors hindering patients' use of medications for cancer pain. Cancer Pract. 1999;7(5):226-232.

122. Donovan M, Evers K, Jacobs P, et al. When there is no benchmark: designing a primary care-based chronic pain management program from the scientific basis up. J Pain Symptom Manage. 1999;18:38-48.

123. American Pain Society. Principles of Analgesic Use in the Treatment of Acute and Cancer Pain. 3rd ed. Skokie, IL: American Pain Society; 1992.

124. American College of Emergency Physicians. Clinical policy: critical issues for the initial evaluation and management of patients presenting with a chief complaint of nontraumatic acute abdominal pain. Ann Emerg Med. October 2000;36:406-415.

125. Caswell DR, Williams JP, Vallejo M, et al. Improving pain management in critical care. Jt Comm J Qual Improv. 1996;22:702-712.

126. Weissman DE, Griffie J, Gordon D, et al. A role model program to promote institutional changes for management of acute and cancer pain. J Pain Symptom Manage. 1997;14:274-279.

127. Committee on Quality of Health Care in America, Institute of Medicine. Building organizational supports for change. In: Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001: chapter 5.

128. Castle NG, Banaszak-Holl J. Top management team characteristics and innovation in nursing homes. Gerontologist. 1997;37:572-580.

129. Kaluzny AD, Hernandez SR. Organization change and innovation. In: Shortell SM, Kaluzny AD, eds. Health Care Management: A Text in Organization Theory and Behavior. 2nd ed. New York: Wiley; 1988.

130. Zaltman G, Duncan R. Strategies for Planned Change. New York: John Wiley and Sons; 1977.

131. Beyer JM, Trice HM. Implementing Change: Alcoholism Policies in Work Organizations. New York: Free Press; 1978.

132. Pain management services pose headache to many health plans. Managed Care Week. December 10, 2001.

133. Carr DB, Miaskowski C, Dedrick SC, et al. Management of perioperative pain in hospitalized patients: a national survey. J Clin Anesth. 1998;10:77-85.

134. Tough P. The alchemy of OxyContin: from pain relief to drug addiction. The New York Times Magazine. July 29, 2001.

135. HHS announces quality initiative as demonstration program in five states. Health Care Policy Report. Washington, DC: Bureau of National Affairs; December 3, 2001.

136. Schiff GD, Goldfield NI. Deming meets Braverman: toward a progressive analysis of the continuous quality improvement paradigm. Int J Health Serv. 1994;24(4):655-673.

137. Juran JM. Juran on Leadership for Quality: An Executive Handbook. New York: The Free Press; 1989.

138. Braveman C, Rodriques C. Performance improvement in pain management for home care and hospice programs. Am J Hospice Palliative Care. 2001;18(4):257-263.

139. Agency for Health Care Policy and Research. Management of Cancer Pain: Adults. Quick Reference Guide for Clinicians, No. 9. Rockville, MD: US Public Health Service, Agency for Health Care Policy and Research; March 1994. AHCPR publication 94-0593.

140. Merkel SI, Voepel-Lewis T, Shayevitz J, et al. The FLACC: a behavioral scale for scoring postoperative pain in young children. Pediatr Nurs. 1997;23(3):293-297.

141. Kleiber C, Craft-Rosenberg M, Harper DC. Parents as distraction coaches during IV insertion: a randomized study. J Pain Symptom Manage. 2001;22(4):851-861.

142. Kleiber C, Sorenson M, Whiteside K, et al. Topical anesthetics for IV insertion in children: a randomized equivalency study. Pediatrics. In press.

143. Titler MG, Kleiber C, Goode C, et al. Infusing research into practice to promote quality care. Nurs Res. 1994;43(5):307-313.

144. Titler MG, Kleiber C, Steelman VJ, et al. The Iowa model of evidence-based practice to promote quality care. Crit Care Nurs Clin North Am. 2001;13(4):497-509.

145. American Pain Society. Principles of Analgesic Use in the Treatment of Acute and Cancer Pain. 4th ed. Skokie, IL: American Pain Society; 1999.

146. Geyer J, Ellsbury D, Kleiber C, et al. An evidence-based multidisciplinary protocol for neonatal circumcision pain management. J Obstet Gynecol Neonatal Nurs. 2002 Jul-Aug;31(4):403-410.

147. Kizer KW, Demakis JG, Feussner JR. Systematizing quality improvement and quality innovation. Med Care. 2000;38(suppl I):I-7–I-16.

148. Kilo CM. A framework for collaborative improvement: lessons from the Institute for Healthcare Improvement's Breakthrough Series. Qual Manage Health Care. 1998;6(4):1-13.

149. Kerns RD, Wasse L, Ryan B, et al. Pain as the 5th Vital Sign Toolkit. V.2. Washington, DC: Veterans Health Administration; 2000.

150. Halpern J. The measurement of quality of care in the Veterans Health Administration. Med Care. 1996;34(3):MS55-68.

151. Young GJ, Meterko M, Desai KR. Patient satisfaction with hospital care: effects of demographic and institutional characteristics. Med Care. 2000;38(3):325-334.

152. Leape LL, Kabcenell AI, Gandhi TK, et al. Reducing adverse drug events: lessons from a Breakthrough Series collaborative. Jt Comm J Qual Improv. 2000;26(6):321-331.

153. Wagner EH, Glasgow RE, Davis C, et al. Quality improvement in chronic illness care: a collaborative approach. Jt Comm J Qual Improv. 2001;27(2):63-80.

154. Kerns RD. Improving pain management in the VA. Fed Practitioner Suppl. August 2001:18-22.

155. Montrey J, ed. Proceedings of the VA National Leadership Conference on Pain Management and End of Life Care. Veterans Health Systems Journal. 2000;5(suppl).

156. Cleeland CS, Schall M, Nolan K, et al. Rapid improvement in pain management: The Veterans Health Administration and the Institute for Healthcare Improvement Collaborative. Submitted for publication.

NATIONAL PHARMACEUTICAL COUNCIL, INC
1894 Preston White Drive, Reston, VA 20191
www.npcnow.org

One Renaissance Boulevard, Oakbrook Terrace, IL 60181
www.jcaho.org
