
CAHPS® Home and Community-Based Services Survey
National Training Call: Part II

January 24, 2017
Centers for Medicare & Medicaid Services

Truven Health Analytics
American Institutes for Research
Connecticut TEFT Grantee Team

This document was made possible under Contract HHSM-500-2010-0025I-T006 from the Centers for Medicare & Medicaid Services. The contents of this presentation are solely the responsibility of the author(s) and do not necessarily represent the official views of the Centers for Medicare & Medicaid Services or any of its affiliates.

• Phone lines will be muted
• Send questions to “all participants” via Chat
• For technical issues, contact Lisa Gold (via Chat) or lisa.gold@truvenhealth.com
• Training is being audio-recorded to serve as a web-based technical assistance module; please anticipate a pause after these announcements and prior to Q&A to accommodate the recording

2

Logistics

CAHPS® Home and Community-Based Services Survey

National Training Part II
January 2017

Centers for Medicare & Medicaid Services
Truven Health Analytics

American Institutes for Research

This document was made possible under Contract HHSM-500-2010-0025I-T006 from the Centers for Medicare & Medicaid Services. The contents of this presentation are solely the responsibility of the author(s) and do not necessarily represent the official views of the Centers for Medicare & Medicaid Services or any of its affiliates.

Connecticut TEFT Grantee Team

Overview

4

• Two web-based events

• Part I was on January 10

• Part II
  o Administering the survey
  o Preparing and analyzing data from the survey
  o State use of the CAHPS® Home and Community-Based Services Survey

5

Overview of the Training

Introduction of Presenters

Mike Smith, Centers for Medicare & Medicaid Services (CMS), Director, Division of Community Systems Transformation (DCST)

Kerry Lida, CMS, Testing Experience and Functional Tools (TEFT) Team Lead, DCST

Allison Weaver, CMS, TEFT Technical Assistance Contracting Officer’s Representative and Project Officer, DCST

6

Introduction of Presenters (Cont’d)

Susan Raetzman, HCBS CAHPS Survey Developer, Truven Health Analytics

Elizabeth Frentzel, HCBS CAHPS Survey Developer, American Institutes for Research (AIR)

Coretta Mallery, HCBS CAHPS Survey Developer, AIR

7

Introduction of Presenters (Cont’d)

Chris Pugliese, HCBS CAHPS Survey Developer, AIR

Kathy Bruni, Connecticut TEFT Grantee, State of Connecticut

Julie Robison, Connecticut TEFT Grantee team, University of Connecticut

8

Medicaid HCBS Expenditures as a Percentage of Total Medicaid Long-Term Services and Supports (LTSS) Expenditures, FY 1995–2014

9

Survey Rationale: Person-Centeredness

• Person-centered thinking helps to establish the means for people to live a life that they and the people who care about them have good reason to value

• Person-centered planning is a way to assist people needing HCBS services and supports to construct and describe what they want and need to bring purpose and meaning to their life

• Person-centered practice is the alignment of service resources that give people access to the full benefits of community living and ensure that they receive services in the way that may help them achieve individual goals

10

• Cross-disability consumer experience survey for eliciting feedback from beneficiaries receiving Medicaid HCBS services and supports
  o Focus on participant experience, not satisfaction

• Allows for comparisons across programs serving different target populations
  o Individuals who are frail elderly
  o Individuals with a physical disability
  o Individuals with an intellectual or developmental disability
  o Individuals with a brain injury
  o Individuals with serious mental illness

11

Overview of the HCBS CAHPS Survey

Overview of the HCBS CAHPS Survey (cont’d)

12

Recap: Part I Training

Topics covered:
  o Research and development of the survey, including background on key features
  o Lessons learned from pilot and field tests
  o National Quality Forum (NQF)–endorsed measures derived from the survey

13

Administering the HCBS CAHPS Survey

14

Major Phases of the Survey Process

15

Survey Activities and Estimated Timeline

Phase: Survey Planning and Set Up
Task and Sequence | Time Required
Pull together team, develop plan and timeline | 2 weeks
Finalize survey and associated materials | 1 week
Develop request for proposal | Dependent on state policies
Choose vendor/hire interviewers | Dependent on state policies
IRB approval | Up to 2 months; varies by state
Pull sample and provide to vendor | 2 to 3 weeks; varies by data quality
Inform stakeholders | Ongoing
Train vendor and/or interviewers | Typically 1 webinar or in-person session
Review and approve field disks | 3 to 4 days
Mail pre-notification letter | 2 to 3 weeks

Phase: Field Work and Data Management
Task and Sequence | Time Required
Begin surveying | N/A
In the field | 8 to 12 weeks
Receive first round of data for quality checks | 1 week after data collection starts
End data collection | N/A

Phase: Data Delivery, Analysis, and Reporting
Task and Sequence | Time Required
Receive data | Typically 1 week to 10 days after data collection ceases
Disposition report | 2 weeks after data collection ceases
Data analysis | 1 month
Internal and public reporting | 1 month

16

Items and Measures in the HCBS CAHPS Survey

• Cognitive screener items
• Service identification items
• Screening items—dictate skip patterns in survey
• Composite measure items
• Items that the technical expert panel (TEP) identified as important, although they were not included in a composite measure
• Global rating items and recommendation items
  o Personal assistant and behavioral health staff, homemaker, and case manager
• Demographic and administration items—for case-mix adjustment and other purposes
• Separate and optional: employment module

17

Opportunities for Tailoring the Survey

• Program- and provider-specific terms
• Survey name changed to be meaningful to the state or program
  o Survey naming conventions
     Formal: CAHPS Home and Community-Based Services Survey
     Short version: HCBS CAHPS Survey
     Tailoring to the state: Missouri HCBS CAHPS Survey
• Add questions
  o Supplemental employment module
  o Flexibility to add state- and program-specific questions

18

Adding Questions

• To use the trademark, CAHPS requires that the core survey remain unchanged
• Flexibility to add state- and program-specific items
• Questions may be added before the “About You” (demographic) section
• Questions may be added in the “Interviewer Questions” section

19

CAHPS Prohibitions

20

Survey Vendor Contracting

• Identify vendors with experience interviewing individuals who receive HCBS
• Vendor contract should include—
  o Security protocols
  o Quality assurance plan for data integrity
  o Interviewer training and oversight
  o Survey software requirements (computer-assisted telephone interviewing [CATI] and/or computer-assisted personal interviewing [CAPI])
  o Standardized data collection protocols
  o Requirements for reporting and communication to the survey sponsor
  o Periodic data submission and data checks
• Potential consequences to vendor of not meeting quality assurance provisions, for example, reinterviewing participants, nonpayment for fraudulent interviews

21

Training of Vendors

22

Key Decisions Affecting the Vendor

1. Survey instrument
  o Sample size
  o Additional items and supplemental employment module

2. Administration
  o Modes (phone, face-to-face, web)
  o Ability to use CATI and/or CAPI software
  o Use of proxy respondents
  o Language(s) offered

23

Ongoing Vendor Quality Assurance

• Check vendor programming before implementation

• Early review of survey data (e.g., first 25 surveys)

• Identify data outliers and need for follow-up with interviewers or programming

• Ensure that survey respondents are appropriately identifying program staff in eligibility questions

• Ensure that all appropriate survey questions are administered

• Ensure that responses seem logical

24

Survey Planning

• Coordination and communication
  o Alert stakeholders about survey (e.g., case managers, providers, advocates, state help line)
  o Prenotification letter to eligible respondents
  o Recruitment calls with eligible respondents
  o Thank-you notes to eligible respondents, with follow-up contact information

• Institutional review board (IRB) review and approval, if required by state

• State-specific abuse and neglect reporting requirements

25

State-Generated Survey Materials

• Consent forms
• Letters to beneficiaries
  o Prenotification letter
  o Thank-you letter
• Recruitment protocol for vendors
• Program-specific term guide for vendors
• Abuse and neglect reporting requirements for vendors
  o State abuse and neglect reporting requirements
  o Abuse and neglect reporting protocol
  o Abuse and neglect reporting form

26

Sample Accuracy

• Challenges to an accurate sample list
  o Mortality (deceased people in sample)
  o Incorrect contact information of beneficiaries
  o Incorrect/unavailable guardian information

• Maximizing sample accuracy may require some planning and effort; however, it is critical to support—
  o Reduced time needed to collect data
  o Lower cost of data collection
  o Ability to use data to analyze survey results

27

Sample Accuracy (cont’d)

• States have used different strategies to ensure accuracy of the sample
  o Working with care coordinators/case managers
  o Working with managed care organizations (MCOs) and coordinating agencies
  o Reviewing records against death records
  o Checking individual data fields for missing information before conveying the sample to the survey vendor
  o Vendors typically offer verification or correction of contact information

28

Decrease Potential for Bias

29

Recruitment: Special Situations

Program respondent:

30

Guardians

• The preferred respondent is always the HCBS beneficiary

• Guardian consent for a beneficiary to participate in the survey may be needed for those with legal guardians

• Even when a guardian provides consent for the beneficiary to participate in the survey, the beneficiary must agree to be interviewed, that is, provide assent

31

Cognitive Screening Questions

• Set of three cognitive screening questions at the beginning of the survey

• Identify individuals who are or are not likely to provide reliable responses to the survey
  o If all questions are answered adequately, administer the rest of the survey
  o If fewer than three questions are answered adequately, stop the interview and seek a proxy

32

Cognitive Screening Questions (cont’d)

1. Does someone come into your home to help you?
   1 YES
   2 NO → END SURVEY
   -1 DON’T KNOW → END SURVEY
   -2 REFUSED → END SURVEY
   -3 UNCLEAR RESPONSE → END SURVEY

2. How do they help you? _________________________________________
   [EXAMPLES OF CORRECT RESPONSES INCLUDE]
   • HELPS ME GET READY EVERY DAY
   • CLEANS MY HOME
   • WORKS WITH ME AT MY JOB
   • HELPS ME DO THINGS
   • DRIVES ME AROUND

3. What do you call them? _________________________________________
   [EXAMPLES OF SUFFICIENT RESPONSES INCLUDE]
   • MY WORKER
   • MY ASSISTANT
   • NAMES OF STAFF (JO, DAWN, ETC.)

33

Use of Proxies

• Sponsoring entities decide on whether and which proxies to include
  o Guardians
  o Friends or family who are unpaid
  o Individuals with regular contact

• IRB suggestions and requirements
  o Consent
  o Assent

• Need for introductory script to account for role in survey

• While fielding the survey, consider monitoring the percentage of surveys that are completed by proxy

• Adjust for proxy responses in analyses

The most reliable respondents are HCBS beneficiaries

34

Who is more likely to be a good proxy respondent?

• Willing to respond on behalf of the individual

• Unpaid caregivers, family members, friends, and neighbors

• Familiar with the services and supports that the individual is receiving

• Has regular, ongoing contact with the individual

• Examples:
   Lives with the individual
   Manages the individual’s in-home care for a majority or all of the day
   Has regular conversations with the individual about the services they receive
   Visits with the individual
   Is present when services and supports are provided

35

Who is less likely to be a good proxy respondent?

• Someone with paid responsibilities for providing services and/or supports to the beneficiary
  o Includes family members and friends who are paid to help the beneficiary

• Guardians or conservators whose only responsibility is to oversee the beneficiary’s finances
  o Unlikely to have sufficient knowledge of the quality of services and supports delivered and whether the beneficiary’s preferences are addressed and goals met

A respondent’s choice not to participate should be honored (i.e., no proxy should be allowed)

36

Administering the Survey to Proxy Respondents

• Have the interviewer check with the beneficiary to see whether the presence of a proxy during the interview is consistent with his or her wishes

• Possible interview scenarios
  o Both the beneficiary and proxy participating (over the phone or in the same room)
  o Proxy may do the interview without the beneficiary present

• Ascertain whether anyone who is paid to provide services to the beneficiary is present and, if so, request privacy

37

Administering the Survey to Proxy Respondents (cont’d)

• Cognitive screening questions: consider applying to anyone answering questions—beneficiary and proxy alike

• In general, there are two options for wording of items addressed to proxy respondents
  1. Ask as written, that is, do not use alternate wording for proxies such as “Did this person . . .”
  2. Tailor to the individual by substituting the beneficiary’s name for first-person references in each item, that is, “Did John . . .” instead of “Did you . . .”

38

Administering the Survey to Proxy Respondents (cont’d)

• If both beneficiary and proxy respondent are present, consider having the interviewer direct all questions to the beneficiary first, because the beneficiary may be able to answer some questions
  o If the proxy responds, consider instructing the interviewer to defer to the beneficiary for confirmation or a different answer
  o If both the beneficiary and proxy answer, consider instructing the interviewer to record the beneficiary’s answer

• If the beneficiary does not seem to understand questions or is not able to answer them—
  o First, follow protocol for alternate responses
  o Otherwise, ask the proxy respondent

39

Proxy Respondents

100. DID SOMEONE HELP THE RESPONDENT COMPLETE THIS SURVEY?
   1 YES
   2 NO → END SURVEY

101. HOW DID THAT PERSON HELP? [MARK ALL THAT APPLY.]
   1 ANSWERED ALL THE QUESTIONS FOR THE RESPONDENT
   2 ANSWERED SOME OF THE QUESTIONS FOR THE RESPONDENT
   3 RESTATED THE QUESTIONS IN A DIFFERENT WAY OR REMINDED/PROMPTED THE RESPONDENT
   4 TRANSLATED THE QUESTIONS OR ANSWERS INTO THE RESPONDENT’S LANGUAGE
   5 HELPED WITH THE USE OF ASSISTIVE OR COMMUNICATION EQUIPMENT SO THAT THE RESPONDENT COULD ANSWER THE QUESTIONS
   6 HELPED THE RESPONDENT IN ANOTHER WAY, SPECIFY __________________________

40

Identifying and Reporting Abuse, Neglect, and Exploitation

• Establish a protocol for interviewers to report abuse, neglect, and exploitation (ANE)

• Beneficiary or other person may report ANE to the interviewer
  o Questions on the survey ask about ANE
• Interviewer may observe ANE directly
• Interviewer may suspect ANE
• Interviewers should be trained on how to report reported, observed, and suspected instances of ANE

41

Example of Neglect Survey Question

The next few questions ask if anyone paid to help you treated you badly in the last 3 months. This includes {personal assistance/behavioral health staff, homemakers, or your case manager}. We are asking everyone the next questions—not just you. [ADD STATE-SPECIFIC LANGUAGE HERE REGARDING MANDATED REPORTING, IF APPROPRIATE—“I want to remind you that, although your answers are confidential, I have a legal responsibility to tell {STATE} if I hear something that makes me think you are being hurt or are in danger.”]

65. In the last 3 months, did any {personal assistance/behavioral health staff, homemakers, or your case managers} take your money or your things without asking you first?

   1 YES
   2 NO → GO TO Q68
   -1 DON’T KNOW → GO TO Q68
   -2 REFUSED → GO TO Q68
   -3 UNCLEAR RESPONSE → GO TO Q68

42

Preparing and Analyzing Data from the HCBS CAHPS Survey

43

Importance of Preparing Data for Analysis

Preparing CAHPS Home and Community-Based Services Survey data for analysis ensures that—
  o Data accurately reflect participant responses
  o Data from the standard and alternate response options are combined appropriately for analysis
  o Respondent characteristics and survey mode are coded correctly for use as case-mix adjusters

44

Cleaning Data: Overview

• Check for several common types of data errors

• These errors should not be an issue for computer-assisted telephone interviewing (CATI) or computer-assisted personal interviewing (CAPI) survey administration, but we still recommend checking for data quality issues
  o Incorrect programming of CATI and CAPI will allow errors in skip patterns and out-of-range values

45

Cleaning Data: Out-of-Range Values

Check for the following types of data quality issues:
  o Out-of-range values (e.g., a respondent has a value of 11 for their “0–10 rating of homemaker”)

46

Cleaning Data: Failed Skips

• Failed skips (e.g., a respondent answered questions from a section that they should have skipped on the basis of a previous question)
  o Keep the response to the screener and set the response to the follow-up as missing

47

Cleaning Data: Indeterminate Eligibility

• Indeterminate eligibility problems (i.e., a respondent left a screener item blank and answered subsequent questions from that section)
  o Keep the response to the follow-up and back-code the screener question to Yes

48

Cleaning Data: Duplicates

Also check for the following type of data quality issue:

• Duplicates (i.e., a respondent is represented more than once in the data set)

49
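The four checks described on the preceding slides (out-of-range values, failed skips, indeterminate eligibility, and duplicates) can be scripted in any statistical package. As an illustration only, here is a minimal pandas sketch; the respondent IDs, item names, and codes are hypothetical placeholders, not the actual HCBS CAHPS variable names.

```python
import pandas as pd

# Toy extract with hypothetical column names (not the official HCBS CAHPS
# variable names): one screener, its follow-up, a 0-10 rating, and an ID.
df = pd.DataFrame({
    "respondent_id":    [101, 102, 103, 103, 104],
    "q10_screener":     [1.0, 2.0, None, 1.0, 1.0],
    "q11_followup":     [3.0, 4.0, 2.0, 3.0, None],
    "rating_homemaker": [9.0, 11.0, 8.0, 8.0, 10.0],
})

# 1. Out-of-range values: a 0-10 rating should never fall outside 0-10.
out_of_range = df[~df["rating_homemaker"].between(0, 10)]

# 2. Failed skip: screener answered "No" (2) but the follow-up was answered
#    anyway -> keep the screener, set the follow-up to missing.
failed_skip = (df["q10_screener"] == 2) & df["q11_followup"].notna()
df.loc[failed_skip, "q11_followup"] = None

# 3. Indeterminate eligibility: screener blank but follow-up answered
#    -> keep the follow-up and back-code the screener to "Yes" (1).
indeterminate = df["q10_screener"].isna() & df["q11_followup"].notna()
df.loc[indeterminate, "q10_screener"] = 1

# 4. Duplicates: the same respondent represented more than once.
dupes = df[df.duplicated(subset="respondent_id", keep=False)]

print(f"Out-of-range ratings: {len(out_of_range)}, duplicate rows: {len(dupes)}")
```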

Reverse Coding of Responses

• Several questions will need to be reverse coded to ensure that the highest value corresponds to the most positive response

50
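As an illustration only, a one-line pandas recode for a hypothetical 1–4 item (the column name is invented for the example):

```python
import pandas as pd

# Toy example: a 1-4 item where 1 is currently the most positive response.
df = pd.DataFrame({"q_staff_rushed": [1, 2, 4, 3]})

# Reverse code so that the highest value is the most positive:
# new_value = (scale_max + scale_min) - old_value, i.e. 1<->4 and 2<->3.
df["q_staff_rushed_rc"] = 5 - df["q_staff_rushed"]
print(df)
```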

Determining Complete Surveys

• For CAHPS surveys, analysis should be limited to “complete” surveys

• The CAHPS definition of a complete survey is one in which a respondent provided a substantive response to at least half of the items that all respondents are eligible to answer in the survey

• Each sponsoring entity should determine which questions every respondent is eligible to answer, specific to their beneficiaries

51

Determining Complete Surveys (cont’d)

• Use the following steps to determine complete surveys:
  1. Determine the number of key items, or items that all respondents are eligible to answer
      Key items may vary depending on the types of services that all beneficiaries receive in a program and additional items added to the survey
  2. Sum the number of substantive responses (responses other than Don’t Know/Refused/Unclear) from these key items for each respondent
  3. If the number is equal to or greater than half the number of key items, then that respondent’s survey is considered complete
  4. If the number is fewer than half, then that respondent’s survey is considered incomplete

• For proxy surveys, follow the same steps with the total number of items that proxy respondents are eligible to answer

52
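A minimal pandas sketch of this completeness rule, assuming the key items have already been identified and that Don’t Know/Refused/Unclear answers are stored as negative codes; all column names and codes here are hypothetical:

```python
import pandas as pd

# Toy extract: three hypothetical key items; -1/-2/-3 denote Don't Know/
# Refused/Unclear, and None denotes no answer at all.
df = pd.DataFrame({
    "q3":  [4.0, -1.0, 2.0],
    "q7":  [3.0, None, -2.0],
    "q12": [1.0, 4.0, None],
})
key_items = ["q3", "q7", "q12"]

# Substantive responses are answers that are present and not DK/REF/UNCLEAR.
substantive = df[key_items].gt(0).sum(axis=1)

# Complete = substantive responses to at least half of the key items.
df["complete"] = substantive >= len(key_items) / 2
print(df)
```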

Combining Data From Response Option Modes

• The HCBS CAHPS Survey offers two response option modes for participants

• Example:
  o Standard question/response: In the last 3 months, how often did personal assistance staff come to work on time? (Never/Sometimes/Usually/Always)
  o Alternate question/response: Do personal assistance staff come to work on time? (Mostly Yes/Mostly No)

• Offering both response options necessitates combining the responses before the data can be analyzed

53

Combining Data From Response Option Modes (cont’d)

• Because there are two different formats of alternate response options in the survey, two types of transformations are needed to analyze all survey items:
  o Alternate two-point Mostly Yes/Mostly No responses are transformed to the standard four-point Never/Sometimes/Usually/Always scale
  o Standard 0–10 responses are transformed to the alternate five-point global rating Excellent/Very Good/Good/Fair/Poor scale

• Use your statistical package of choice to merge the separate variables for standard and alternate responses so that all data are represented in one variable

54

Combining Data From Response Option Modes: Mostly Yes/Mostly No With Never/Sometimes/Usually/Always Variables

• Use the following logic to combine standard Never/Sometimes/Usually/Always responses with analogous Mostly Yes/Mostly No responses
  o Ensure that standard responses are coded as:
     Least positive response = 1
     3rd most positive response = 2
     2nd most positive response = 3
     Most positive response = 4
  o Recode alternate responses as:
     Least positive response = 1
     Most positive response = 4

• Use your statistical package of choice to merge the separate variables for standard and alternate responses so that all data are represented in one variable

55
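As an illustration, a pandas sketch of this merge; it assumes the standard item is already coded 1–4, that the alternate item uses raw codes 1 = Mostly No and 2 = Mostly Yes, and that the variable names are invented for the example:

```python
import pandas as pd

# Toy data: each respondent answered either the standard 1-4 item or the
# alternate item (assumed raw coding: 2 = Mostly Yes, 1 = Mostly No).
df = pd.DataFrame({
    "on_time_std": [4.0, 2.0, None, None],
    "on_time_alt": [None, None, 2.0, 1.0],
})

# Recode alternate responses onto the 1-4 scale:
# least positive = 1, most positive = 4.
alt_recoded = df["on_time_alt"].map({1: 1, 2: 4})

# Merge standard and recoded alternate responses into one combined variable.
df["on_time_combined"] = df["on_time_std"].fillna(alt_recoded)
print(df)
```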

Combining Data from Response Option Modes: Global Rating Variables

• Use the following logic to combine standard global ratings with alternate global ratings:
  o Keep alternate responses on the five-point Excellent/Very Good/Good/Fair/Poor scale. Recode alternate responses as:
     Poor = 1
     Fair = 2
     Good = 3
     Very Good = 4
     Excellent = 5
  o Recode standard responses as:
     0, 1, 2 = 1
     3, 4 = 2
     5, 6 = 3
     7, 8 = 4
     9, 10 = 5

• Use your statistical package of choice to merge the separate variables for standard and alternate ratings so that all data are represented in one variable

56
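A companion sketch for the global ratings, applying the binning shown above; again the variable names and storage assumptions (0–10 standard, 1–5 alternate) are illustrative:

```python
import numpy as np
import pandas as pd

# Toy data: 0-10 standard global ratings and 1-5 alternate ratings
# (1 = Poor ... 5 = Excellent); each respondent answered only one format.
df = pd.DataFrame({
    "rating_std_0_10": [9.0, 6.0, None, None],
    "rating_alt_1_5":  [None, None, 5.0, 2.0],
})

# Bin the standard 0-10 ratings onto the five-point scale:
# 0-2 -> 1, 3-4 -> 2, 5-6 -> 3, 7-8 -> 4, 9-10 -> 5.
std = df["rating_std_0_10"]
std_recoded = pd.Series(
    np.select(
        [std <= 2, std <= 4, std <= 6, std <= 8, std <= 10],
        [1, 2, 3, 4, 5],
        default=np.nan,
    ),
    index=df.index,
)

# Merge into a single combined five-point rating variable.
df["rating_combined"] = df["rating_alt_1_5"].fillna(std_recoded)
print(df)
```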

Case-Mix Adjustment

• To fairly compare one program or group against others, it is important during analysis to adjust the results for case mix
  o Collect data on case-mix factors during the survey

• Traditional factors recommended
  o General health rating, mental health rating
  o Age, sex, education, whether living alone

• Factors specific to survey administration
  o Mode, if both phone and in-person interviews are used
  o Response option

• Respondent status, if proxy respondents are used

57

Producing Case-Mix Adjusted Scores

• A multivariable analysis is required to produce case-mix adjusted scores
  o This can be done using general linear models that include, and can be an extension of, ordinary least squares regression
  o Adjusted scores are the predicted scores generated by such a model

• This can be accomplished in two ways:
  o Using the SAS® CAHPS Analysis Scoring Program (“CAHPS Macro”)
  o Using a statistical software package of your choice (e.g., SPSS, Stata, R)

58
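As a rough illustration of the regression approach (not the CAHPS Macro itself), the sketch below fits an ordinary least squares model of a score on common case-mix factors and treats each respondent’s residual plus the grand mean as the adjusted score before averaging by program. The file name, column names, and the residual-based shortcut are assumptions for the example; consult the CAHPS Macro documentation for the authoritative method.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file; column names are placeholders.
# Sketch only: listwise deletion of missing values.
df = pd.read_csv("hcbs_cahps_clean.csv").dropna()

# Model a 0-100 item score as a function of common case-mix adjusters.
model = smf.ols(
    "score ~ age + general_health + mental_health "
    "+ C(sex) + C(education) + C(lives_alone) + C(mode) + C(proxy)",
    data=df,
).fit()

# One simple adjustment: respondent-level residual plus the grand mean,
# so differences explained by case mix are removed before comparing programs.
df["adjusted_score"] = model.resid + df["score"].mean()

print(df.groupby("program")["adjusted_score"].mean().round(1))
```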

Link to CAHPS Macro and Documentation

• The CAHPS Macro and documentation can be found here
  o This link goes to a zip file containing the following guidance documents:
     Instructions for Analyzing Data From CAHPS Surveys: provides general guidelines for using the CAHPS analysis program
     SAS Document Package: includes SAS files needed for the CAHPS analysis program

59

Presenting HCBS CAHPS Survey Scores

• Two recommended ways to present scores:
  1. Average score: the mean across all of the response categories
  2. “Top-Box” score: the percentage of survey respondents who chose the most positive score for a given item response scale
     Always for composite items
     9 or 10 for global ratings
     Definitely Yes for recommend items

• Each type of score presentation will require you to recode your data
  o For example, to produce average scores, data will need to be recoded to a 0-to-100 scale

60

Transforming Data for Top-Box Scores

• To produce top-box scores, create a new variable for each question and assign it a value of 1 if the respondent chose the most positive category and a value of 0 if the respondent did not choose the most positive category

• The most positive category is:
  o Always for composite items (or Never for reverse-coded items)
  o 9 or 10 for global ratings
  o Definitely Yes for recommendation items

61
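As an illustration, a pandas sketch producing both presentations for one combined composite item, assuming it is coded 1–4 with 4 = Always; the column name is invented:

```python
import pandas as pd

# Toy combined data for one composite item, coded 1-4 with 4 = Always.
df = pd.DataFrame({"on_time_combined": [4, 3, 4, 2, 4, 1]})

# Top-box score: 1 if the respondent chose the most positive category, else 0.
df["on_time_topbox"] = (df["on_time_combined"] == 4).astype(int)
top_box_pct = 100 * df["on_time_topbox"].mean()

# Average score: rescale 1-4 to 0-100 before taking the mean.
df["on_time_0_100"] = (df["on_time_combined"] - 1) / 3 * 100
mean_score = df["on_time_0_100"].mean()

print(f"Top-box: {top_box_pct:.0f}%  Mean (0-100): {mean_score:.1f}")
```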

Presenting HCBS CAHPS Survey Scores: Examples

Measure | Mean Score | Top-Box Score
Global Rating of Personal Assistance/Behavioral Health Staff | 89.5 | 77% (rating PCA 9 or 10)
Would Recommend Personal Assistance/Behavioral Health Staff | 93.5 | 82% (would "definitely" recommend their PCA)
Transportation to Medical Appointments | 96.1 | 91%

Abbreviation: PCA, personal care assistant.

62

State Use ofthe HCBS CAHPS Survey

63

Considerations for Using the HCBS CAHPS Survey

1. Person-centered
2. Cross-disability
  o Ability to compare programs
3. Increased accessibility via in-person and phone modes, alternate response options, proxy respondents
4. Development aligned with CAHPS
  o Reflects what is important to beneficiaries
  o Rigorous methods, for example, psychometric testing
  o Trademark that providers recognize
5. Measures available (endorsed by the National Quality Forum)
6. Publicly available from CMS
  o Free of charge to access
  o Resources for help in using survey

64

Using the Survey for Program Quality Management

• Assess program performance
  o Point-in-time snapshot
  o Track changes over time
• Document successes
• Identify areas for program improvement
• Assess impact of program improvement initiatives and projects
• Provide information to stakeholders on program performance
  o Internal staff, providers, managed care organizations, beneficiaries, legislators, and the general public
  o Measures align with some CMS quality requirements

65

HCBS Final Rule for Person-Centered Planning

• Section 2402(a) of the Affordable Care Act requires the Secretary of Health and Human Services to ensure that all states receiving federal funds:
  o Develop service systems that are responsive to the needs and choices of beneficiaries receiving HCBS and community-based long-term services and supports (LTSS)
  o Maximize independence and self-direction
  o Provide support coordination to assist with a community-based supported life
  o Achieve a more consistent and coordinated approach to administration of policies and procedures across programs

66

How are TEFT Grantees Using the HCBS CAHPS Survey?

67

Populations Participating in HCBS CAHPS Pilot & Field Tests by State

State | Individuals Who Are Frail Elderly | Individuals With a Physical Disability | Individuals Who Are Frail Elderly and/or With a Physical Disability | Individuals With an Intellectual or Developmental Disability | Individuals With a Brain Injury | Individuals With Serious Mental Illness
Arizona | . | . | X | X | . | .
Colorado | . | . | X | X | . | .
Connecticut | X | . | . | . | X | X
Georgia | . | X | X | . | . | .
Kentucky | . | . | X | X | X | .
Louisiana | . | . | X | X | . | .
Maryland | . | . | X | . | . | .
Minnesota | X | . | . | . | X | X
New Hampshire | . | . | X | X | X | X
Tennessee | . | . | X | . | . | .

68

Populations Participating in TEFT Grantee Demonstration of HCBS CAHPS Survey by State

State | Individuals Who Are Frail Elderly | Individuals With a Physical Disability | Individuals Who Are Frail Elderly or With a Physical Disability | Individuals With an Intellectual or Developmental Disability | Individuals With a Brain Injury | Individuals With Serious Mental Illness
Arizona | X | X | . | X | . | .
Colorado | X | . | . | X | . | .
Connecticut | X | X | . | . | X | .
Georgia | X | X | . | . | X | .
Kentucky | . | X | X | . | . | .
Maryland | X | X | . | . | X | .
New Hampshire | . | . | X | X | X | X

69

TEFT Grantees’ Planned Use of the HCBS CAHPS Survey and/or Demonstration

State | Planned Use
Arizona | Facilitate discussion with stakeholders about findings, lessons learned, and next steps. The Arizona Health Care Cost Containment System and managed care organizations will isolate and address improvement opportunities identified by the data.
Colorado | Inform services and delivery; determine usability, accessibility, and functionality features of multiple survey administration modes; and develop beneficiary messaging and notification about survey participation.
Connecticut | Implement as a single quality improvement survey for all Medicaid HCBS programs and to set and measure quality benchmarks for Access Agencies and LTSS providers across all LTSS programs. Connecticut hopes to provide web-based access to the survey through the Personal Health Record (PHR) in the future.
Georgia | After analysis of demonstration data, Georgia and its stakeholders will discuss the possibility of using the survey with the Georgia customized questions to augment the current surveys being conducted by the state’s Medicaid waivers.
Kentucky | Compare content and survey results with the Money Follows the Person Quality of Life Survey.
Maryland | Inform whether to implement the survey through the PHR/Client Profile solution and possibly guide what information is in the Client Profile; and determine the survey’s effectiveness in other waiver programs and what areas could be improved.
New Hampshire | Possibly introduce the survey into the state’s LTSS information system, seek to compare survey results across LTSS programs, and require Medicaid Care Management Programs to use the survey when they begin managing LTSS services.

70

IN PROGRESS! The Experience of Care Survey: CT Demonstration

Julie Robison, PhD, Professor, UConn Health, Center on Aging
&
Kathy Bruni, MPA, LCSW, Director, HCBS Unit, Connecticut Department of Social Services

Funding from CMS and Connecticut Department of Social Services

71

CT Plans for HCBS CAHPS Survey Data

• Results from the survey will be used to develop performance benchmarks for providers

• Goal: improve the experience of care for all HCBS recipients

72

Waiver Case Management Structure

• For Elder, Disabled, and Brain Injury Waivers, case management is a contracted service with contractors in five different regions in the state

• Quality varied among these providers
• Difficult to compare one provider with another
• HCBS CAHPS Survey offers that opportunity

73

2013 Case Management Contract

• The Department of Social Services added performance bonus incentives to the older adult waiver contracts in 2013

• Pool is divided by the number of performance standards

• Pool total available is $500,000

• Four performance incentives tied to HCBS CAHPS Survey in the contracts
  o Access to care
  o Having choice and control over assistance received
  o Being treated with respect and dignity
  o Feeling included in the community

74

Connecticut Experience of Care Survey: State Demonstration

• Participants from three HCBS waivers
  o Older adults
  o Personal Care Assistance
  o Acquired Brain Injury

• 400 surveys needed from each for representative samples and cross-group comparisons
  o Connecticut fielded the pretrademark version of the instrument because of the timing of implementation

• Participants choose: telephone or in-person
• Assisted or proxy allowed if needed

75

Response So Far

Category | PCA | Older Adult
Total available to call | 828 (all) | 982 (random sample)
Attempted to contact | 620 | 874
Ineligible* | 48 | 189
TOTAL ELIGIBLE | 572 | 685
Refused | 57 | 179
Not reached | 115 | 106
Completed | 400 | 400
Response rate | 70.0% | 58.4%

Abbreviation: PCA, personal care assistant.
*Died, institutionalized, non-English/Spanish speaker, wrong contact information, or cognitively incompetent.

76

Interview Breakdown

Abbreviation: PCA, personal care assistant.

77

Lessons Learned

• Flexibility is CRITICAL
  o Choice in survey mode (phone, in-person, web . . .)
  o Allow assistance if individual (or legal guardian) desires or needs
  o Choice in language; accommodate nonverbal

Lessons Learned (Cont’d)

• Stakeholder input is CRITICAL
  o Share survey information
  o Get suggestions
  o Collaborate with other involved providers
     Access agencies designated one person to help contact consumers

79

Lessons Learned (Cont’d)

• Organization is critical
  o Before sending notification letters
     Get IRB resolved (University of Connecticut IRB: Quality Improvement)
     Program and test English and Spanish surveys
     Train all interviewers
     Obtain and clean contact information
  o Start calls within a week of letters
     Batch letters depending on interviewer capacity
  o Easy movement within CATI system
     Side buttons to start survey/save/go back
  o Simple recruitment script with natural flow

80

Lessons Learned (Cont’d)

• Public trust in vendor is critical
  o Use known entity if possible
  o Independence from state agencies and service providers (e.g., used University of Connecticut letterhead)
  o Emphasize confidentiality, privacy, choice

81

HCBS CAHPS Survey Resources

• CMS webpage on HCBS CAHPS Survey
  o Survey instruments in English and Spanish
  o Technical assistance documents

• HCBSCAHPS@Truvenhealth.com mailbox for questions

• See the CAHPS webpage for rules about modifying and naming the survey instrument
  o Flexibility to add items borrowed from other surveys

82

THANK YOU!

83

Questions and Discussion

84

Polling

• Please answer brief poll questions at the end of this training.

85
