
Centers for Disease Control and Prevention (CDC) Office of Infectious Diseases (OID)

National Center for Immunization and Respiratory Diseases (NCIRD)

Office of the Director

Contract No: 200-2014-60994

FORMAL DELIVERABLE 10A

IMMUNIZATION-RELATED GUIDANCE ATTACHMENT D:

USER CENTERED DESIGN (UCD) FORECASTING AND DATA QUALITY

METHODS AND FINDINGS

SEPTEMBER 28, 2015


VERSION HISTORY

Formal Deliverable 10a, Attachment D, will be updated to reflect changes that incorporate the Centers for Disease Control and Prevention's review and feedback, as well as any new requirements or changes to the business environment in which the project exists. The following table will track changes and updates to the document.

Version #: Initial Draft
Implemented By: CNIADV
Revision Date:
Description:
Approved By:
Approval Date:


TABLE OF CONTENTS

VERSION HISTORY ...................................................................................................................... i

1 Application of UCD Methods to Immunization Workflows ................................................... 1

1.1 Define Project Scope and Select Workflows ................................................................... 3

1.1.1 Selection of Workflows Based on Risk Assessment ................................................ 4

1.1.2 Identify and Obtain Vendor Participation ................................................................. 4

1.1.3 Review Work Products from Prior Phases of Immunization Project ....................... 8

2 Forecasting Workflow ............................................................................................................. 8

2.1 Discovery and Definition of Forecasting Workflow User Requirements ........................ 8

2.1.1 Cognitive Task Analysis Activities for the Forecasting Workflow .......................... 8

2.1.2 Task Mapping Activities for Forecasting Workflow ................................................ 9

2.1.3 Additional Discovery Activities for Forecasting Workflow ................................... 10

2.2 Define Early Design Concepts for Forecasting Workflow ............................................ 11

2.3 Obtain Input from Subject Matter Experts on Early Forecasting Workflow Concepts . 11

2.4 Create Low-Fidelity Forecast Workflow Prototype ....................................................... 12

2.5 Conduct Forecasting Workflow Formative Usability Test (Round 1) ........................... 15

2.5.1 Objectives ............................................................................................................... 15

2.5.2 Methods................................................................................................................... 15

2.5.3 Prototype Description ............................................................................................. 16

2.5.4 Participants .............................................................................................................. 16

2.5.5 Pilot Testing ............................................................................................................ 17

2.5.6 Test Sessions ........................................................................................................... 17

2.6 Discuss/Review Findings and Determine Design Updates for Forecasting Workflow . 18

2.6.1 Patient Summary Screen ......................................................................................... 18

2.6.2 Immunization Forecasting Calendar View ............................................................. 18

2.6.3 Immunization Forecasting Table View ................................................................... 19

2.6.4 Pop-ups and Tooltips .............................................................................................. 19

2.7 Iteration based on Forecasting Workflow Round 1 Usability Test ................................ 19

2.8 Finalize Round 1 Forecasting Workflow Iteration ......................................................... 23

2.9 Conduct Forecasting Workflow Formative Usability Test (Round 2) ........................... 24

2.9.2 Methods................................................................................................................... 25

2.9.3 Prototype Description ............................................................................................. 26

2.9.4 Participants .............................................................................................................. 27

2.9.5 Pilot Testing ............................................................................................................ 28

2.9.6 Test Sessions ........................................................................................................... 28

2.10 Discuss/Review Findings and Determine Design Updates for Forecasting Workflow . 29


2.10.1 General Findings – Forecasting Views ................................................................... 29

2.11 Iteration based on Forecasting Workflow Round 2 Usability Test ................................ 30

2.12 Finalize Round 2 Forecasting Workflow Iteration ......................................................... 33

2.13 Conduct a “Mock” Summative Usability Test on Forecasting Workflow (Round 3) ... 35

2.13.1 Limitations of Sample Size ..................................................................................... 35

2.13.2 Objectives ............................................................................................................... 35

2.13.3 Methods................................................................................................................... 35

2.13.4 Prototype Description ............................................................................................. 36

2.13.5 Participants .............................................................................................................. 36

2.13.6 Pilot Testing ............................................................................................................ 37

2.13.7 Test Sessions ........................................................................................................... 37

2.13.8 Tasks ....................................................................................................................... 38

2.13.9 Usability Results for Task “Order Due Vaccines Using the Calendar View” ........ 38

2.13.10 Usability Results for Task “Order Due Vaccines Using the Table View”.......... 39

2.13.11 Discussion of the Findings .................................................................................. 40

2.13.12 Areas for Improvement ....................................................................................... 42

2.13.13 Satisfaction and Use of the System Usability Scale (SUS) ................................. 43

2.13.14 Reporting the Mock Summative Test.................................................................. 43

3 Immunization Administration Workflow & Data Quality .................................................... 43

3.1 Discovery and Definition of Documentation & Data Quality Workflow User Requirements ............................................................................................................................. 43

3.1.1 Cognitive Task Analysis Activities for the Documentation & Data Quality Workflow ............................................................................................................................... 44

3.1.2 Task Mapping Activities for Documentation & Data Quality Workflow .............. 45

3.1.3 Additional Discovery Activities for Documentation & Data Quality Workflow ... 45

3.2 Define Early Design Concepts for Immunization Administration Documentation & Data Quality Workflow ............................................................................................................. 46

3.3 Obtain Input from Subject Matter Experts on Early Immunization Administration Documentation & Data Quality Workflow Concepts ............................................................... 46

3.4 Create Low-Fidelity Immunization Administration Documentation & Data Quality Prototype ................................................................................................................................... 47

3.5 UCD Pilot Demonstration Round 1 ............................................................................... 51

3.6 Conduct Immunization Administration Documentation & Data Quality Workflow Formative Usability Test (Round 2).......................................................................................... 51

3.6.2 Methods................................................................................................................... 52

3.6.3 Prototype Description ............................................................................................. 52

3.6.4 Participants .............................................................................................................. 52


3.6.5 Pilot Testing ............................................................................................................ 53

3.6.6 Test Sessions ........................................................................................................... 53

3.7 Discuss/Review Findings and Determine Design Updates for Immunization Administration Documentation & Data Quality Workflow ...................................................... 54

3.7.1 Iteration based on Immunization Administration Documentation & Data Quality Workflow Round 2 Usability Test ......................................................................................... 54

3.8 Finalize Round 2 Immunization Administration Documentation & Data Quality Workflow Iteration .................................................................................................................... 55

3.9 Conduct a “Mock” Summative Usability Test on Immunization Administration Documentation & Data Quality Workflow (Round 3) .............................................................. 56

3.9.1 Objectives ............................................................................................................... 57

3.9.2 Methods................................................................................................................... 57

3.9.3 Prototype Description ............................................................................................. 58

3.9.4 Participants .............................................................................................................. 58

3.9.5 Pilot Testing ............................................................................................................ 58

3.10 Test Sessions .................................................................................................................. 59

3.10.1 Tasks ....................................................................................................................... 59

3.10.2 Usability Results for Task “Document Vaccine Administration” .......................... 60

3.10.3 Discussion of the Findings ...................................................................................... 61

3.10.4 Areas for Improvement ........................................................................................... 61

3.10.5 Satisfaction and Use of the System Usability Scale (SUS) .................................... 62

3.10.6 Reporting the Mock Summative Test ..................................................................... 62

4 Describe and Report the UCD Pilot Demonstration Methods and Findings ......................... 63

4.1 Conduct Findings Review Meetings with Vendors........................................................ 63

4.1.1 Feedback from Vendors on Participating in the UCD Pilot Demonstration ........... 63

4.2 Prepare Written Reports for Delivery to CDC ............................................................... 64

5 References ............................................................................................................................. 64


LIST OF FIGURES

Figure 1. User-Centered Design (UCD) Process Executed in the Immunizations Pilot Demonstration Project .................................................................................................................... 2

Figure 2. Detailed Steps for Conducting Formative and Summative Testing ................................ 2

Figure 3. Cognitive Task Analysis for Forecasting Workflow. ...................................................... 9

Figure 4. Task Map of Immunization Forecasting Workflow ...................................................... 10

Figure 5. Summary Screen Prototype (Round 1 Forecasting Workflow). .................................... 12

Figure 6. Calendar View Prototype (Round 1 Forecasting Workflow). ....................................... 13

Figure 7. Table View Prototype (Round 1 Forecasting Workflow). ............................................ 14

Figure 8. Examples of Informational Vaccine Tooltips. ............................................................... 14

Figure 9. Examples of Informational Vaccine Pop-ups. ............................................................... 15

Figure 10. Evolution of the Immunization Forecasting Calendar View. Wireframes are shown in chronological order (top to bottom, earliest to latest). .................................................................. 20

Figure 11. Evolution of Informational Vaccine Tooltips and Pop-ups. Wireframes are shown in chronological order (top to bottom, earliest to latest). .................................................................. 22

Figure 12. Examples of Interactive Prototype User Interfaces. .................................................... 24

Figure 13. Examples of Interactive Prototype Screens Used in Round 2 Forecasting Usability Test. ............................................................................................................................................... 27

Figure 14. Evolution of Immunization Forecasting Calendar View. ............................................ 31

Figure 15. Evolution of Forecasting Table View.......................................................................... 32

Figure 16. Evolution of Forecasting Information Tooltips & Pop-ups. ........................................ 33

Figure 17. Examples of Interactive Prototype User Interfaces. .................................................... 34

Figure 18. Cognitive Task Analysis for Immunization Administration Documentation Workflow. ..................................................................................................................................... 44

Figure 19. Task Map that includes the Immunization Administration Documentation Workflow....................................................................................................................................................... 45

Figure 20. Prototype Screen with Field Level Formatting Cues Displayed within the Field and Field Level Validation with Inline Error Messaging .................................................................... 48

Figure 21. Field Level Validation with Error Messaging that Interrupts the User’s Workflow. .. 49

Figure 22. Interaction that Restricted Data Entry Fields to Force Top to Bottom Form Completion. ................................................................................................................................... 50

Figure 23. Constrained Dropdown Options based on Previously Entered Data. .......................... 51

Figure 24. Immunization Administration Documentation Form with Tooltip. ............................ 56

Figure 25. Immunization Administration Documentation Form with Inline Error Message. ...... 56


LIST OF TABLES

Table 1. Basic UCD Activities with Vendor Participation ............................................................. 5

Table 2. Vendor Team and End-User Participation

Table 3. Participant Demographic Background for Round 1 Forecasting Workflow .................. 16

Table 4. Examples of Changes Made between Wireframes in Figure 10 ..................................... 21

Table 5. Examples of Changes Made between Wireframes in Figure 11. .................................... 22

Table 6. Participant Demographic Background for Round 2 Forecasting Workflow .................. 28

Table 7. Participant Demographic Background for Round 3 Forecasting Workflow .................. 37

Table 8. Usability Test Results for Each Subtask in the Order Due Vaccines Task using the Calendar View. ............................................................................................................................. 39

Table 9. Usability Test Results for Each Subtask in the Order Due Vaccines Task using the Table View .............................................................................................................................................. 40

Table 10. Participant Demographic Background for Round 2 Immunization Documentation Workflow ...................................................................................................................................... 52

Table 11. Participant Demographic Background for Round 3 Immunization Administration Documentation & Data Quality Workflow ................................................................................... 58

Table 12. Usability Test Results for Each Subtask in the Document Vaccine Administration Task .............................................................................................................................................. 60

Note: This document contains references to artifacts used in the UCD process. These artifacts accompany the document as zip files, highlighted in blue, and contents of the zip files highlighted in green.


1 APPLICATION OF UCD METHODS TO IMMUNIZATION WORKFLOWS

This document describes the portion of the immunizations pilot demonstration project that focused on improving usability. A team of human factors specialists and designers performed a user-centered design (UCD) process on prototype EHR applications to address immunization functionality. The team developed the immunization content and user interface designs from business workflow requirements to create functional prototypes through the UCD activities described in this document. This document provides detailed descriptions of, and artifacts from, the activities that were performed as part of the UCD process. For background information regarding optimizing usability with a UCD process, see the document titled "UCD Primer" (Attachment C).

Figure 1 provides a high-level overview of the usability evaluation process the CNIADV Team executed in the immunization pilot demonstration project. As illustrated in the graphic, the project addressed two workflows: (1) forecasting, addressing patient age and immunization history, and (2) data quality, with respect to documenting vaccine administration. The team conducted a series of UCD activities for each workflow in three rounds (Round 1, Round 2, and Round 3). Vendors provided customer end-users to participate in the project. Each round consisted of a two-week interval. Round 1 included early prototype formative testing for forecasting. Round 2 included a second iteration of formative testing for forecasting and early prototype testing for data quality. Round 3 included mock summative testing for both forecasting and data quality. The data quality analysis included only one iteration of formative testing because it involves nursing documentation, and vendors found it more difficult to recruit nursing participants. Each workflow otherwise included similar UCD activities.

The first steps in the UCD demonstration pilot included the following:

1. Define project scope and select workflows;
2. Identify and obtain vendor participation; and
3. Review work from prior phases of the immunization project.

Once the workflows were selected, the following UCD activities were carried out for each immunization workflow that was included in the project:

1. Discovery and definition:
• Cognitive Task Analysis;
• Task Mapping; and
• Additional discovery activities.
2. Iterative formative design with stakeholder and user feedback.
3. Conduct a "mock" summative usability test with the final prototype (including early round pilot testing).

To complete the UCD demonstration pilot, the following activity was planned and executed:

4. Describe and report the methods and findings.


Figure 1. User-Centered Design (UCD) Process Executed in the Immunizations Pilot Demonstration Project

Figure 2. Detailed Steps for Conducting Formative and Summative Testing


1.1 Define Project Scope and Select Workflows

The project scope limited the number of workflows that could be included in the UCD portion of the project due to the amount of time that could be spent planning and conducting each UCD activity. The CNIADV Team focused the usability analysis on providing value for workflows specifically related to immunization. In order to identify the most challenging issues for immunization, the analysis reviewed data from the following sources:

• Interviews with a wide range of stakeholders in Phase 1 of the project;
• Stakeholder input during in-person meetings held in June 2014 and September 2014 as part of Phase 1 of the project;
• Discussions with usability experts and review of usability literature, including specific publications regarding usability-related safety issues for pediatrics; and
• Observations of demonstrations of the 12 vendor products with the highest market share to determine immunization-related function and identify aspects important to usability.

Data were reviewed by subject matter experts (SMEs), including physicians, the IIS community, usability experts, and CDC NCIRD, to select the most challenging issues to address within the scope of the project. These efforts identified the five areas of greatest concern:

1) Patient matching;
2) Presentation and use of the immunization forecast;
3) Documentation of administered immunizations with respect to influencing data quality;
4) Immunization reconciliation (e.g., EHR history and IIS reported history); and
5) VFC eligibility documentation.

The scope of the project limited the evaluation to two of the workflows mentioned above. While each of the five identified workflows is significant, the team prioritized two that would provide actionable information with near-term benefits. Discussions with SMEs and CDC NCIRD led to the selection of immunization forecasting and data quality for the project. Patient matching is an essential component of patient safety, yet it is applicable to all aspects of EHR use and not necessarily specific to immunizations. Immunization reconciliation is equally important, yet Meaningful Use requirements for reconciliation of medication lists, allergy lists, and problem lists may provide some industry focus in this area. VFC eligibility documentation remains problematic, yet a full root cause analysis and standardization of individual state requirements will better inform how to approach usability.

The team defined the following specific workflows and target user groups for inclusion in the UCD portion of the pilot demonstration project:

(1) Forecasting Workflow: View and interpret an immunization forecast that contains patient-specific context of age, immunization history, and health history, and determine what vaccines should and should not be given during "today's" encounter. The target user group associated with the Forecasting workflow was identified as the Provider (with prescribing authority). Note: other user groups participate in the forecasting workflow in practice, but project scope limited UCD activities to one user group per workflow.

(2) Documenting Immunization Administration Workflow: Document the administration of an immunization, with the focus on improving data quality during documentation. The target user group for the documentation workflow included those in a nursing role (including credentialed professionals from medical assistant to registered nurse).

1.1.1 Selection of Workflows Based on Risk Assessment

Given the project constraints related to applying a UCD process, the selection criteria for which workflows and features to evaluate must include the topic of identifying and reducing risk. Once workflows are selected for inclusion, UCD activities must align with a sound risk management process. Workflows and features might also be selected for inclusion in UCD activities based on other criteria. For example, a workflow or feature might be included because it offers an opportunity to improve efficiency (e.g., time spent completing the workflow multiplied by the frequency with which the workflow is completed); because the accuracy and completeness with which a user can achieve a task goal may be improved; or because the workflow or feature might positively influence user satisfaction (e.g., address end-user complaints about a feature).

The workflows in this project were selected based on a global assessment of eight general user workflows established in the conceptual model for immunization end-to-end processing. CDC and participating SMEs also reviewed analyses by immunization registries regarding vaccine reporting data quality. The group selected immunization forecasting and improvement in the quality of data that is transmitted to registries. The selection of these two workflows does not necessarily imply that they are more critical than other workflows that could have been selected for the study.
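The efficiency criterion above (time per completion multiplied by completion frequency) lends itself to a simple calculation. The sketch below is not from the deliverable; it uses hypothetical workflow names, times, frequencies, and risk weights to illustrate how a team might combine that efficiency impact with a judged risk weight when ranking candidate workflows for UCD attention.

```python
# Hypothetical sketch (not from the deliverable): ranking candidate workflows by
# combining the efficiency impact described above (time per completion multiplied
# by completion frequency) with a judged risk weight. All names, numbers, and
# weights are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class CandidateWorkflow:
    name: str
    minutes_per_completion: float  # average time a user spends completing the workflow once
    completions_per_day: float     # how often the workflow is completed
    risk_weight: float             # judged severity of use-error consequences (1 = low, 5 = high)

    def daily_time_burden(self) -> float:
        """Efficiency impact: time spent multiplied by frequency (minutes per day)."""
        return self.minutes_per_completion * self.completions_per_day

    def priority_score(self) -> float:
        """Simple composite score: time burden scaled by the risk weight."""
        return self.daily_time_burden() * self.risk_weight


candidates = [
    CandidateWorkflow("Immunization forecast review", 3.0, 20, 4.0),
    CandidateWorkflow("Vaccine administration documentation", 4.0, 15, 5.0),
    CandidateWorkflow("VFC eligibility documentation", 2.0, 10, 3.0),
]

# Highest composite score first; such a ranking would be one input among many,
# alongside the risk management process described in the text.
for wf in sorted(candidates, key=lambda w: w.priority_score(), reverse=True):
    print(f"{wf.name}: {wf.priority_score():.0f}")
```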

1.1.2 Identify and Obtain Vendor Participation

The CNIADV Team contacted vendors and invited them to participate in the usability efforts for the pilot demonstration project. The CNIADV Team then scheduled kickoff meetings with those vendors who agreed to participate. During the kickoff meetings, the CNIADV Team presented a summary of the basic activities required of the vendor for participation in the project. The document titled "CNIADV Pilot Participant Guidance Template - UCD - v 0 4 Generic_18jun2015" contains the summary that was presented to the vendor during the kickoff meeting and provided via email after the meeting.

Basic activities were associated with general design prototypes created by the CNIADV Team. In addition to the required basic activities, vendors were also invited to participate in optional activities involving the use of their own product or prototype. None of the vendors participated in the additional activities. (The pilot guidance document contains a summary of vendor participation requirements for the additional activities.) The basic activities that the vendors participated in as part of the pilot demonstration UCD process are presented in Table 1.


Table 1. Basic UCD Activities with Vendor Participation

Activity: Kickoff Meeting
Purpose and Description: Describe the pilot demonstration project, present the basic usability activities that the vendor would participate in, and describe the need for the vendor to recruit product end-users for participating in the usability testing.
Attendees: CNIADV Lead; CNIADV usability team member(s); vendor representative responsible for coordinating the vendor team's involvement in the project.
Estimated Level of Effort from Vendor: 1 hour.

Activity: Prototype Review
Purpose and Description: Share the current status of prototype(s) developed by CNIADV for use in usability testing during the next round of testing and obtain feedback from vendor team members who attended.
Attendees: CNIADV usability team member(s); vendor team members (as determined by the vendor).
Estimated Level of Effort from Vendor: 1 – 2 hours.

Activity: Participant Recruiting (identify and provide contact with end-users)
Purpose and Description: Each vendor was asked to provide contact information for up to five (5) end-users in each of two user groups: (a) providers who use the vendor's product and have prescribing authority, and (b) nurses/CMAs who use the vendor's product for documenting immunizations that are administered during a patient encounter. Vendors could schedule their end-users across the three (3) rounds of testing. Slots were filled on a first-come basis.
Attendees: CNIADV usability team member assigned to scheduling test sessions; vendor representative responsible for coordinating contact with the vendor's end-users.
Estimated Level of Effort from Vendor: Unable to estimate the time required by the vendor to identify and recruit end-users.

Activity: Round 1 Formative Usability Test Sessions
Purpose and Description: Moderated individual 30-minute sessions with end-users provided by the vendor. During these sessions, end-users were shown a low-fidelity prototype and interviewed about information needs to support the Forecasting workflow (with providers). Findings were used to inform the next design iteration of the prototype(s). Vendor team members observed sessions where their end-users participated in the test. (Note: no vendor observed a session where another vendor's end-user participated.)
Attendees: CNIADV usability team moderator; CNIADV usability team data logger; other CNIADV Team members (optional); vendor team members (as determined by the vendor).
Estimated Level of Effort from Vendor: Each end-user participant = 30 minutes; each vendor team member = 30 minutes per session attended.

Activity: Round 2 Formative Usability Test Sessions
Purpose and Description: Moderated 30-minute sessions with individual end-users provided by the vendor. During these sessions, end-users were asked to interact with an interactive low-fidelity prototype to perform tasks that were part of the Forecasting workflow (with providers) and the documentation of an administered vaccine (with nurses/CMAs). Findings were used to inform the next design iteration of the prototype(s). Vendor team members observed sessions where their end-users participated in the test. (Note: no vendor observed a session where another vendor's end-user participated.)
Attendees: CNIADV usability team moderator; CNIADV usability team data logger; other CNIADV Team members (optional); vendor team members (as determined by the vendor).
Estimated Level of Effort from Vendor: Each end-user participant = 30 minutes; each vendor team member = 30 minutes per session attended.

Activity: Round 3 "Mock" Summative Usability Test Sessions
Purpose and Description: Moderated 30-minute sessions with individual end-users provided by the vendor. During these sessions, end-users were asked to interact with an interactive low-fidelity prototype to perform tasks that were part of the Forecasting workflow (with providers) and the documentation of an administered vaccine (with nurses/CMAs). Findings were used to prepare a Summative Test Report. Vendor team members observed sessions where their end-users participated in the test. (Note: no vendor observed a session where another vendor's end-user participated.)
Attendees: CNIADV usability team moderator; CNIADV usability team data logger; other CNIADV Team members (optional); vendor team members (as determined by the vendor).
Estimated Level of Effort from Vendor: Each end-user participant = 30 minutes; each vendor team member = 30 minutes per session attended.

Activity: Findings Review Meeting
Purpose and Description: Review the overall usability project and activities with the vendor. A separate meeting was held with each participating vendor.
Attendees: CNIADV usability team member(s); other CNIADV Team members (optional); vendor team members (as determined by the vendor).
Estimated Level of Effort from Vendor: 1.5 hours.

Activity: On-going Communications
Purpose and Description: Schedule meetings, request end-user contact information, and provide updates or request additional assistance related to scheduling end-user participants in testing sessions.
Attendees: CNIADV usability team member assigned to scheduling test sessions; vendor representative responsible for coordinating the vendor team's involvement in the project.
Estimated Level of Effort from Vendor: As needed.

Four (4) vendors participated in the UCD portion of the pilot demonstration project. Representatives from the vendor organizations attended meetings and testing sessions. Vendors also provided contact with end-users of their products for participation in usability testing sessions. Vendor team and end-user participation is summarized in Table 2.

Table 2. Vendor Team and End-User Participation
(Round 1 Formative sessions were held 6/16 – 6/18, Round 2 Formative sessions 6/30 – 7/1, and Round 3 "Mock" Summative sessions 7/14 – 7/17.)

Vendor 1
Usability Kickoff: 6/12
Prototype Review: 6/16
Round 1 Formative: 3 providers
Round 2 Formative: 1 nurse/MA
Round 3 "Mock" Summative: 2 providers
Totals: 5 providers, 1 nurse/MA
Final Review Meeting: 7/21

Vendor 2
Usability Kickoff: 6/11
Prototype Review: 6/15
Round 1 Formative: 1 provider
Round 2 Formative: 3 providers, 2 nurse/MAs
Round 3 "Mock" Summative: 1 provider, 1 nurse
Totals: 5 providers, 3 nurses/MAs
Final Review Meeting: 7/29

Vendor 3
Usability Kickoff: 6/17
Prototype Review: 6/25
Round 1 Formative: Did not participate
Round 2 Formative: 1 provider, 3 nurse/MAs
Round 3 "Mock" Summative: 1 provider, 1 nurse/MA
Totals: 2 providers, 4 nurse/MAs
Final Review Meeting: 7/31

Vendor 4
Usability Kickoff: 6/18
Prototype Review: 7/7, 7/9
Round 1 Formative: Did not participate
Round 2 Formative: Did not participate
Round 3 "Mock" Summative: 1 provider
Totals: 1 provider
Final Review Meeting: 7/28

All Vendors
Round 1 Formative: 4 providers
Round 2 Formative: 4 providers, 6 nurse/MAs
Round 3 "Mock" Summative: 5 providers, 2 nurse/MAs
Totals: 13 providers, 8 nurse/MAs

1.1.3 Review Work Products from Prior Phases of Immunization Project

CNIADV human factors team members reviewed work products from prior phases of the immunization project. This review:

• Provided immunization-centric business workflow requirements;
• Identified the range of functional abilities currently in use by providers in managing immunizations for patients today; and
• Revealed existing product characteristics of the user interface and screen flows that might impact usability.

Using what was learned from the prior phases of the immunization project and drawing from previous within-context observations and interview knowledge, the team recognized the need to include UCD activities that focus on “two bins of usability” (Ratwani, Fairbanks, Hettinger, and Benda, 2015). The first bin, User Interface Design, addresses displays and controls, screen design, clicks & drags, colors, and navigation. The second bin, Cognitive Task Support, focuses on workflow design, data visualization, support of cognitive work, and functionality.

2 FORECASTING WORKFLOW

2.1 Discovery and Definition of Forecasting Workflow User Requirements

Discovery activities are conducted to define user requirements. As described in the UCD Primer document (see "Attachment C: UCD Primer"), the richest data collection methods for understanding the user, the user environments, user workflows, and user tasks are in-person observations and interviews. In-person discovery activities were not conducted for the project. Instead, the CNIADV usability team completed the following activities in order to discover and define the users, user environments, user workflows, user tasks, and user information needs so that early design concepts could be created and tested:

• Cognitive Task Analysis;
• Task Mapping; and
• Additional discovery activities.

2.1.1 Cognitive Task Analysis Activities for the Forecasting Workflow

A task analysis is a breakdown of the tasks and subtasks required to successfully operate a system. A cognitive task analysis is appropriate when there are large mental demands on the user. The CNIADV Team conducted a cognitive task analysis on the forecasting workflow. The purpose of the cognitive task analysis was to determine the tasks and subtasks that the user needs to perform in order to complete the workflow and to identify the mental demands of those tasks (especially those tasks with high cognitive demands, including perception, memory, information processing, and decision making). The task analysis informed the early design concepts (e.g., prioritization and layout of information on a display). Figure 3 illustrates the documentation of a cognitive task analysis. The complete analysis can be found in the file titled "Forecasting Task Analysis_06may15.xlsx."


Figure 3. Cognitive Task Analysis for Forecasting Workflow.
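As a purely illustrative aside (the project's actual analysis is recorded in the spreadsheet cited above), a cognitive task analysis row can be thought of as a small record that links a subtask to its cognitive demands, the information the interface must supply, and the resulting design implication. The field names and example content below are hypothetical, not drawn from the project spreadsheet.

```python
# Hypothetical sketch of how one row of a cognitive task analysis might be recorded.
# The project's actual analysis lives in "Forecasting Task Analysis_06may15.xlsx";
# the fields and example content here are illustrative, not drawn from that file.
from dataclasses import dataclass
from typing import List


@dataclass
class CognitiveTaskRow:
    task: str                      # top-level task in the workflow
    subtask: str                   # step the user performs
    cognitive_demands: List[str]   # e.g., perception, memory, information processing, decision making
    information_needed: List[str]  # data the display must make available at this step
    design_implication: str        # how the finding should influence information priority and layout


row = CognitiveTaskRow(
    task="Determine which vaccines are due at today's visit",
    subtask="Compare the patient's immunization history against the recommended schedule",
    cognitive_demands=["memory", "information processing", "decision making"],
    information_needed=["patient age", "doses already received", "allergies/contraindications"],
    design_implication="Keep age, history, and due status visible together without extra navigation",
)
print(row.design_implication)
```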

2.1.2 Task Mapping Activities for Forecasting Workflow

A task map is a diagram showing the tasks and subtasks users might perform in a given system along a timeline. The CNIADV usability team created task maps for the forecasting workflow early in the discovery and definition phase. Figure 4 provides an example of such a task map. The task map helped the CNIADV Team understand the steps in the user's workflow associated with forecasting. For example, the task maps visually displayed the steps that the user might complete in order to understand what immunizations a patient is due for. This allowed the team to develop early design concepts to include in prototypes for testing and eliciting end-user feedback. The task maps also aided the team in developing the tasks that would be included in formative and summative usability tests. The actual image of the task map is contained in the file titled "DiscoveryAndDefinition_Task Map v1.JPG."


Figure 4. Task Map of Immunization Forecasting Workflow

2.1.3 Additional Discovery Activities for Forecasting Workflow

As part of the Discovery and Definition activities, the CNIADV Team performed additional reviews and analyses to better understand issues and guidance related to forecasting. These activities included:

1. Review and use of CDC’s “Recommended Immunization Schedule for Persons Aged 0 Through 18 Years” (CDC, 2015). (Available at http://www.cdc.gov/vaccines/schedules/hcp/imz/child-adolescent.html). The version the team used is contained in the file titled “0-18yrs-child-combined-schedule.pdf.”

2. Review and use of North Carolina Immunization Registry (NCIR) guidelines and information including “NCIR Quick Reference Guide” (NCIR, 2013). (Available at http://immunize.nc.gov/providers/ncirmaterialsforms.htm). The version the team used is contained in the file titled “Final Quick_Reference_Guide 07 08 13.pdf.”

3. Review and use of CDC’s “Clinical Decision Support for Immunization (CDSi): Logic Specification for ACIP Recommendations” dated December 16, 2014 (CDC, 2014). (Available at http://www.cdc.gov/vaccines/programs/iis/interop-proj/cds.html). The team saved and highlighted a version of this document. The website now has a newer version. The version the team used is contained in the file titled “highlightedbyCNIteam20may2015_CDCdocument_logic-spec-acip-rec.pdf.”

The results of this analysis helped define the early forecasting design concepts used in the usability evaluation. (A simplified illustration of the style of date logic described in the CDSi specification follows below.)
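To make the third item above more concrete: the CDSi logic specification expresses forecasting in terms of dates derived from a patient's birth date, prior doses, minimum ages, and minimum intervals. The sketch below is a simplified, hypothetical illustration of that style of date arithmetic; the parameter values are assumed examples, and the code is not an implementation of the ACIP rules or of the CDSi specification itself.

```python
# Simplified, hypothetical illustration of CDSi-style date arithmetic for a next dose.
# Real ACIP/CDSi evaluation involves many more conditions (series selection,
# conflicts, contraindications); the ages and interval below are assumed example
# values, not taken from the specification.
from datetime import date, timedelta


def next_dose_dates(birth_date: date,
                    previous_dose_date: date,
                    minimum_age_days: int,
                    recommended_age_days: int,
                    minimum_interval_days: int):
    """Return (earliest_date, recommended_date) for the next dose in a series.

    Earliest date: the later of reaching the minimum age and satisfying the
    minimum interval since the previous dose. Recommended date: the later of
    the earliest date and the recommended age for this dose.
    """
    by_age = birth_date + timedelta(days=minimum_age_days)
    by_interval = previous_dose_date + timedelta(days=minimum_interval_days)
    earliest = max(by_age, by_interval)
    recommended = max(earliest, birth_date + timedelta(days=recommended_age_days))
    return earliest, recommended


# Example with assumed values: dose 2 of a series with a 28-day minimum interval.
earliest, recommended = next_dose_dates(
    birth_date=date(2015, 1, 15),
    previous_dose_date=date(2015, 3, 20),
    minimum_age_days=42,        # assumed minimum age for dose 2 (6 weeks)
    recommended_age_days=122,   # assumed recommended age (~4 months)
    minimum_interval_days=28,   # assumed minimum interval after dose 1
)
print("Earliest:", earliest, "Recommended:", recommended)
```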


2.2 Define Early Design Concepts for Forecasting Workflow

The CNIADV usability team reviewed the information from the Discovery and Definition phase and produced alternative design concepts to consider for further design and testing. Design alternatives included the following:

• Calendar view: CDC recommended immunization schedule with patient context (e.g., age, allergies, prior immunization history) applied.

• Table view: Immunization history presented in a tabular display with immunization series status and next-due information displayed.

• “Lite” view: Immunizations due at “today’s visit” in a “lite” format that might display on a summary sheet along with other health maintenance activities that need to be addressed in “today’s visit” e.g., “Due Fors” indicating that the patient is due for specific immunizations or procedures.

• Timeline view: Alternative view of the CDC recommended immunization schedule with patient context (e.g., age, allergies, prior history) applied in a view that made use of timelines.

The Timeline view was not included in the UCD activities beyond the early design activities due to time constraints. The "Lite" view was incorporated into a patient summary screen and included in the first round of formative usability testing. The Calendar view design was influenced by the forecasts available on the CDC website. The human factors team felt it was important to include a forecasting concept similar to the CDC forecasts because providers are familiar with this layout and the team is not aware of any available usability test data associated with it; its inclusion therefore provides a usability baseline. Inclusion of the Table view was influenced by a review of forecast screens currently in use by providers in managing immunizations for patients today; the team observed several products that use a table layout during an earlier phase of this project.

2.3 Obtain Input from Subject Matter Experts on Early Forecasting Workflow Concepts

The human factors team conducted an interview over WebEx with a clinical member of the CNIADV Team acting as an SME. This interview served to demonstrate planning and executing stakeholder interviews and to identify information needs and priorities for defining the early forecasting prototypes to be used in the first formative testing round. The SME provided opinions and answered questions related to vocabulary, functionality, experience with other products, and what information was needed and when (e.g., persistent on the screen, on mouse over, 1-click away) in order to make decisions related to what immunizations were due and should be given. The stakeholder interview guide and screen examples used in the interview are contained in the document titled “SMEReview_ImmunizationConceptWireframes_29may2015_Final.pdf.”


2.4 Create Low-Fidelity Forecast Workflow Prototype

The CNIADV usability team created a low-fidelity prototype for use in the first round of formative usability testing. The primary goal of this testing phase was to understand the previously identified user information needs and priorities during the immunization forecasting workflow. The prototype was created using Microsoft PowerPoint. The focus was on displaying the information that users might need to make decisions about vaccine status and which vaccines they will give. The prototype was not interactive. It included basic screen wireframes to convey information that was persistently displayed, displayed on mouse over, and displayed 1-click away. The prototype included a patient summary screen (see the sample screen in Figure 5), an immunization information layout using a Calendar view (see the sample screen in Figure 6), an immunization information layout using a Table view (see the sample screen in Figure 7), and immunization-related mouse-over tooltips and 1-click pop-ups (see the sample screens in Figure 8 and Figure 9). A version of the actual prototype can be found in the document titled "R1_ForecastPrototype_t1D4_18jun2015.pdf."

The patient summary screen (see Figure 5) is a visual representation of sets of summary information that a provider might use as he/she is preparing to see a patient. The summary screen included an Immunizations summary section that displayed Vaccine Group, Dose, Date Given, and Next Due immunization information.

Figure 5. Summary Screen Prototype (Round 1 Forecasting Workflow).

The calendar view (see Figure 6) of a patient's immunization forecasting information is a visual representation of a patient's past immunization record, current immunization status, and future immunization schedule. The top of the screen contains basic patient information such as name, age, allergies, and family. The majority of the screen is made up of the immunization calendar, which displays the official CDC immunization schedule for the given vaccines and age ranges as blocks (http://www.cdc.gov/vaccines/schedules/hcp/imz/child-adolescent.html). The bottom of the screen contains a legend for icons and symbols used in the calendar. Each row in the calendar is a vaccine and each column represents a time period corresponding to the patient's age. The calendar view aims to give users the ability to examine a patient's immunization history and forecasting recommendations and make subsequent informed vaccine administration decisions.

The table view (see Figure 7) presents largely the same informational content as the calendar view but in a different visual style. The patient's immunization history and status are presented in a table format with vaccines as rows and dose numbers as columns. Dates when vaccines were received or become due are shown in the table cells.

The prototype screens also included mouse-over tooltips and pop-up overlay windows. Figure 8 shows a tooltip which displays when the user mouses over the vaccine label "poliovirus" in the leftmost column. Information is displayed regarding the patient's history with the vaccine, including any alerts (e.g., in the example the patient had an adverse reaction at some point in the past). Figure 9 shows a pop-up modal window which would display if the user clicked on an individual dose within the calendar grid (in this case the third dose of MMRV). The modal window presents information regarding the dose that is due, including dates when it should be administered.

Figure 6. Calendar View Prototype (Round 1 Forecasting Workflow).


Figure 7. Table View Prototype (Round 1 Forecasting Workflow).

Figure 8. Examples of Informational Vaccine Tooltips.


Figure 9. Examples of Informational Vaccine Pop-ups.
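The Calendar and Table views described above present largely the same per-dose content, indexed differently: vaccine by age period in the Calendar view versus vaccine by dose number in the Table view. The sketch below is a hypothetical illustration of that shared structure; the record fields, vaccine names, statuses, and age bands are assumptions for the example, not taken from the prototype files.

```python
# Hypothetical sketch of the shared content behind the Calendar and Table views:
# the same per-dose forecast records, indexed two different ways. Field names,
# vaccine names, statuses, and age bands are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass
class DoseRecord:
    vaccine_group: str         # e.g., "DTaP"
    dose_number: int           # 1-based dose number within the series
    status: str                # e.g., "given", "due", "overdue", "not yet due"
    date_given: Optional[str]  # ISO date string if administered, else None
    age_period: str            # age band the dose maps to on the calendar, e.g., "4 months"


records = [
    DoseRecord("DTaP", 1, "given", "2015-03-20", "2 months"),
    DoseRecord("DTaP", 2, "due", None, "4 months"),
    DoseRecord("IPV", 1, "given", "2015-03-20", "2 months"),
]

# Calendar view: rows are vaccines, columns are age periods.
calendar_cells: Dict[Tuple[str, str], DoseRecord] = {
    (r.vaccine_group, r.age_period): r for r in records
}

# Table view: rows are vaccines, columns are dose numbers.
table_cells: Dict[Tuple[str, int], DoseRecord] = {
    (r.vaccine_group, r.dose_number): r for r in records
}

print(calendar_cells[("DTaP", "4 months")].status)  # -> "due"
print(table_cells[("DTaP", 2)].status)              # -> "due"
```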

2.5 Conduct Forecasting Workflow Formative Usability Test (Round 1)

The first round of formative testing was conducted as part of the UCD process for the pilot demonstration. The formative usability testing was performed to evaluate design prototypes for immunization forecasting that accounted for patient age, allergies, and immunization history.

2.5.1 Objectives

The objectives of the first round of testing included the following:

• To understand user needs, issues, and opinions related to early design concepts for immunization forecasting;

• To further define information needs and priorities that support decision making related to determining what vaccines are due and what vaccines should be given; and

• To use the findings to update design concepts and prototypes in support of a second round of formative testing.

2.5.2 Methods

The test was conducted by executing the following steps:

(1) Identified activities that would address the objectives of the test.
(2) Wrote a test plan that identified the approach, scope, objectives, materials, participants, and methods. The test plan was written in a lightweight format using Microsoft PowerPoint. The test plan can be found in the document titled "R1_FormativeUsabilityTestPlan_09jun2015_v1_Final.pdf."

(3) Developed a moderator guide for use in the testing. The moderator guide outlined the flow of each test session from the opening welcome, introduction, and consent through the testing session including the wrap-up questions and thanks. Moderators used the guide but were allowed flexibility to include or exclude questions depending on time and participant responses. The moderator guide can be found in the document titled “R1_Forecasting_ModeratorGuide_FormativeTest1_16jun2015_Final.pdf.”

(4) Scheduled participants into 30-minute test sessions with a minimum of 30 minutes between sessions. All testing took place during the same week.

(5) Conducted a pilot test session with an internal CNIADV Team member and updated test materials as needed.

(6) Conducted the end-user participant test sessions and logged the data in a Microsoft Excel file during each session. The data log can be found in the file titled “R1_Forecasting_RAWdata_FormativeDataLog_06aug2015.xlsx.”

(7) Reviewed the findings and revised the design/prototypes.

2.5.3 Prototype Description

The low-fidelity prototype described earlier was used for testing. Participants viewed the prototype as the moderator moved through the screens and asked the participant questions. Participants did not take control of the prototype during the Round 1 sessions. Participants responded to the moderator's questions related to the information that was displayed, missing information, and the information needed to determine what immunizations to order or not order for the visit. Immunization-related alerts were included in the prototypes and as part of the moderator questions. The initial prototypes are described in Section 2.4 of this document. Adjustments were made to the prototype screens during Round 1 formative testing.

2.5.4 Participants

Four (4) providers participated in this round of usability test sessions. The participants were recruited by two vendors from their product end-users (refer to Table 2 for vendor end-user participation across testing rounds). Due to timing constraints of kick-off meetings with other vendors, the number of providers in this round of testing was lower than planned. Participants were not compensated for their participation. Participant demographic background information is provided in Table 3.

Table 3. Participant Demographic Background for Round 1 Forecasting Workflow
• Role: 4 medical doctors (MD)
• Specialty: 2 pediatrics, 1 internal medicine, 1 family practice
• Years of Experience: Range from 9 – 28 years


• Number of patients seen per day: Range from 15 – 25
• Percentage of patients 18 years old or less: 1 participant - 95%; 1 participant - 85%; 1 participant - 35-40%; 1 participant - 0%

2.5.5 Pilot Testing

Prior to conducting actual test sessions with end-user participants, the CNIADV usability team conducted one or more internal pilot test sessions as needed. The purposes of pilot testing were (1) to ensure that the test materials were complete, (2) to assess whether the test length was appropriate for the planned session lengths, and (3) to provide the moderator with an opportunity to practice moderating the session. After pilot testing was conducted, the team updated all materials as needed to continue with testing.

2.5.6 Test Sessions

Each 30-minute session was conducted via WebEx and recorded for internal CNIADV Team use. Each participant session included a welcome, an informed consent explanation, collection of background information, and usability test tasks. Participants viewed low-fidelity prototypes displayed over WebEx and responded to questions included in the moderator guide. (The moderator guide can be found in the document titled "R1_Forecasting_ModeratorGuide_FormativeTest1_16jun2015_Final.pdf.") The moderator also used a talk-aloud protocol: participants were asked to talk out loud while they were thinking about how they would complete the task (e.g., to tell what they were thinking, what they were clicking or thought they would click, what they were looking for, and what information they needed). The sessions were moderated by a human factors specialist from the CNIADV Team. Observers from the individual vendor team and the CNIADV Team attended the sessions as silent observers. A data logger from the CNIADV Team took notes in an Excel file. Only one (1) vendor organization was involved in an individual session. The data log can be found in the file titled "R1_Forecasting_RAWdata_FormativeDataLog_06aug2015.xlsx." The following background information was collected from each participant:

• Role (e.g., MD, NP, PA);
• Years of experience;
• Specialty;
• Practice type;
• Practice size;
• Number of patients provider sees per day; and
• Percentage / number of patients that are age 18 or younger.


The following usability data were collected:

• Participant impressions and reactions to concepts;
• Participant feedback, opinions, preferences;
• Participant expressed needs for information display and priority; and
• Usability issues observed by the CNIADV human factors team.

Note: No quantifiable measures (e.g., task times, number of clicks) were collected for this formative test.

2.6 Discuss/Review Findings and Determine Design Updates for Forecasting Workflow

The human factors team reviewed the findings from the usability test. During the review, the team discussed and analyzed the findings, referenced best practice user interface design principles, and brainstormed and sketched possible updates. The findings from this round of testing included the following:

2.6.1 Patient Summary Screen
• Viewing immunization information on the Patient Summary Screen was considered lower priority/not necessary.
• Tasks due, allergies, and vitals were received positively.

2.6.2 Immunization Forecasting Calendar View
• The calendar view was received more positively than the table view.
• There was confusion with some of the icons. Alert icons with an exclamation point were perceived to be associated with overdue vaccines.
• The use of a highlighted background to indicate a patient's current age was received positively overall.
• Sometimes "due" and "completed" doses in the highlighted column (indicating the patient's current age) were confused due to a lack of visual differentiation.
• The indication of when a dose was given within a given CDC time block via a small visual indicator (e.g., circle, badge) received mixed responses.
• Dose count should emphasize counting valid doses.
• Invalid doses should be visually distinct and separated from valid doses and should not contribute to the dose count for a particular vaccine.
• In at least one case, the sibling's name in the patient banner was mistaken for the patient whose chart was being viewed.
• Filter options should include the ability to filter by overdue doses and alerts.


2.6.3 Immunization Forecasting Table View
• A "print" option from this screen was received positively.

2.6.4 Pop-ups and Tooltips
• Click-to-access pop-ups and hover-over tooltips were both received positively.
• Earliest and latest vaccine due dates were received positively.
• Presenting alerts and related information (e.g., allergies, prior reactions) was received positively for both hover and 1-click access.
• Alert descriptions were received with mixed responses. Descriptions should be succinct.
• Vaccine trade names were not considered useful elements for doses that were recommended or due but were important for recording which vaccines had been given.
• Action buttons (order, defer, close) in 1-click overlays were received positively. The "order" label would be more accurately named "add to order" so that multiple vaccines could be added but then ordered as single or combination vaccines.
• Combination vaccines that have been given should be clearly indicated with the antigens that are contained in them.
• Testers noted the lack of the following useful information:
o Person who administered the vaccine;
o Site the vaccine was given (e.g., to provide additional information if a parent calls about a possible reaction); and
o Interval between last dose given and next due.

2.7 Iteration Based on Forecasting Workflow Round 1 Usability Test

This iteration included design changes and updates to the prototypes based on the findings from the usability test. In keeping with an agile process, after each usability test session the usability team looked for patterns and strong findings in the qualitative data collected up to that point. The data were used to inform further iterations of the wireframes. The user feedback data were used in conjunction with the team's expert knowledge of human factors principles to update wireframe designs with the intention of increasing the usability of the concept. Figure 10 and Figure 11 show examples of how user feedback informed wireframe iterations throughout the first round of formative usability testing. Figure 10 shows the evolution of the calendar view wireframe over three iterations within the first round of formative usability testing. As mentioned previously, the alterations made between versions reflect the feedback received from users during the first round of formative usability testing combined with the expert input from our team of human factors specialists. Table 4 provides examples of changes made between the iterative wireframe designs.


A. Wireframe Iteration – Pre-testing

B. Wireframe Iteration – Early First Round Formative Testing

C. Wireframe Iteration – Late First Round Formative Testing

Figure 10. Evolution of the Immunization Forecasting Calendar View. Wireframes are shown in chronological order (top to bottom, earliest to latest).


Table 4. Examples of Changes Made between Wireframes in Figure 10

• Visual indication of dose administration time along CDC recommended timeline
o Wireframe A: Star-like icon.
o Wireframe B: Blue dot. Reason for change: user feedback indicated the icon was too large and uninterpretable.
o Wireframe C: Green badge with dose number. Reason for change: the blue dot icon did not contain sufficient dose information for the user.
• Catch-up region visual indicator
o Wireframe A: N/A.
o Wireframe B: N/A.
o Wireframe C: Green box with associated legend. Reason for change: users indicated catch-up regions help inform administration decision making.
• Invalid dose icon
o Wireframe A: Red circle with exclamation point (same as alert icon).
o Wireframe B: Grey circle with an "X" within it. Reason for change: red indicated a more severe alert to users, and an exclamation point indicated an overdue dose to users.
o Wireframe C: Grey circle with an "X" within it. Reason for change: no change was made.
• MMR/varicella invalid dose indicator in calendar
o Wireframe A: The invalid dose is labeled "Dose 1" in the calendar.
o Wireframe B: The invalid dose is no longer a numbered dose but is indicated along the timeline by the invalid dose icon. Reason for change: users indicated that they do not think of invalid doses as completed doses.
o Wireframe C: N/A.

The table view for immunization forecasting did not receive substantial alterations during the testing period because more emphasis was placed on the calendar view, mouse-over tooltips, and 1-click pop-ups. The table view was not altered until the post-test analysis, which is described in the next section.


Figure 11 shows the evolution of the pop-ups and tooltips wireframes over two iterations. Only two iterations are shown because no substantial iteration was made mid-testing. Wireframes were made prior to testing and a single iteration was made near the end of the first round of testing. Examples of changes made between the iterative wireframe designs in Figure 11 are provided in Table 5.

A. Wireframe Iteration – Pre-testing

B. Wireframe Iteration – Late First Round Formative Testing

Figure 11. Evolution of Informational Vaccine Tooltips and Pop-ups. Wireframes are shown in chronological order (top to bottom, earliest to latest).

Table 5. Examples of Changes Made between Wireframes in Figure 11

• Vaccine name
o Wireframe A: Full vaccine name given.
o Wireframe B: Abbreviated/alternate vaccine name given in parentheses. Reason for change: the alternate vaccine name adds an additional useful cue that the user can use to identify the drug.
• Order functionality
o Wireframe A: Button labeled "Order" allows the user to order the selected vaccine.
o Wireframe B: Button changed to "Add to Order". Instead of ordering, the button adds the selected vaccine to a potential set of immunizations which can be ordered at once at a later time. Reason for change: ordering immunizations independently may lead to mistakes in ordering.
• Trade name on tooltip
o Wireframe A: Vaccine trade name is present on the tooltip.
o Wireframe B: Trade name is not present on the tooltip. Reason for change: users indicated the trade name is not important enough to include on the tooltip for a vaccine series; additionally, a vaccine series may have multiple trade names.

2.8 Finalize Round 1 Forecasting Workflow Iteration

The data gathered in Round 1 of usability testing were analyzed and interpreted by a team of human factors specialists. Relevant findings were implemented in the construction of an interactive prototype used in Round 2 of formative testing. Functional and visual improvements were made. Figure 12 demonstrates examples of the final prototype developed from Round 1 of formative usability testing. Through the course of testing there were too many findings to report exhaustively here. Findings from each round of usability testing were analyzed and compiled into a list of evidence-based guidance for developing immunization functionality within EHRs to support positive usability outcomes. Further rounds of usability testing are discussed later in this document.


A. Final Prototype Screen - Immunization Forecasting Calendar View

B. Final Prototype Screen - Immunization Forecasting Table View

C. Final Prototype Screen – Informational Tooltips and Pop-ups Figure 12. Examples of Interactive Prototype User Interfaces.

2.9 Conduct Forecasting Workflow Formative Usability Test (Round 2)


A second round of formative usability testing was conducted to evaluate design prototypes for immunization forecasting that accounted for patient age, allergies, and immunization history. The prototypes were created based on findings from the first round of formative testing (described earlier).

2.9.1.1 Objectives

As part of the multi-phase UCD process, the human factors team planned and carried out a formative usability test. The objectives of the second round of testing included:

• Observing end-users interacting directly with limited functionality prototypes for specific immunization workflows.

• Validating updated design details that were informed by the first round of formative testing.

• Better understanding user needs, issues, and opinions related to design concepts for immunization forecasting.
• Defining information needs and priorities that support decision making related to determining what vaccines are due and what vaccines should be given.

• Beginning to define measurement of objective usability measures to support planning for summative testing (e.g., define task begin and end points for collecting task times).

Findings from this test were used to update design concepts and prototypes before the third round of testing, a “mock” summative usability test.

2.9.2 Methods

The test was conducted using the same steps as in Round 1:

(1) Identified the activities to address the objectives of the test.
(2) Wrote a test plan that identified approach, scope, objectives, materials, participants, and methods. The test plan was written in a light-weight format using Microsoft PowerPoint. The test plan can be found in the document titled "R2_FormativeUsabilityTestPlan_29jun2015_v1_Final.pdf."

(3) Developed a moderator guide for use in the testing. The moderator guide outlined the flow of each test session, including the opening welcome, introduction, consent, the testing session, as well as the wrap-up questions and thanks. The moderator used the guide but had the flexibility to include or exclude questions depending on time and participant responses. The moderator guide can be found in the document titled “R2_Forecasting_ModeratorGuide_FormativeTest2_29jun2015_v1_Final.pdf.”

(4) Scheduled participants into 30-minute test sessions with a minimum of 30 minutes between sessions. All testing sessions were held the same week.

(5) Conducted a pilot test session with an internal CNIADV Team member and updated test materials as needed.

(6) Conducted the end-user participant test sessions and logged the data in a Microsoft Excel file during each session. The data log can be found in the file titled “R2_Forecasting_RAWdata_FormativeDataLog_06aug2015.xlsx.”

(7) Reviewed the findings and revised the design/prototypes.


2.9.3 Prototype Description

The interactive low-fidelity prototype described earlier (see Section 2.8) was also used for Round 2 testing. Figure 13 provides example screen shots from the prototype used in this round of usability testing. The prototype used in the usability test can be found in the archived zip file titled "R2_Forecasting_Prototype_GRIDSTART.zip." Participants were given control of the prototype over the WebEx. Participants interacted with the prototype to complete tasks as instructed by the moderator.


A. Final Prototype Screen - Immunization Forecasting Calendar View

B. Final Prototype Screen - Immunization Forecasting Table View

C. Final Prototype Screen – Informational Tooltips and Pop-ups Figure 13. Examples of Interactive Prototype Screens Used in Round 2 Forecasting Usability Test.

2.9.4 Participants

Four (4) providers representing two vendors' end-users participated in this round of usability test sessions. One of the vendors had participated in the first round; the second had not. Due to time constraints in scheduling kick-off meetings with other vendors and difficulty recruiting participants, the number of providers in this round of testing was lower than planned.


Participants were recruited by each participating vendor from the vendor's customer end-users (refer to Table 2 for vendor end-user participation across testing rounds). Participants were not compensated for their participation. Participant demographic background information is provided in Table 6.

Table 6. Participant Demographic Background for Round 2 Forecasting Workflow
• Role: 4 medical doctors (MD)
• Specialty: 3 pediatrics; 1 pediatrics / infectious disease / vaccine safety
• Years of Experience: Range from 11 – 20 years
• Number of patients seen per day: 3 participants range from 16 – 30; 1 participant varies from 0 – 20 per day
• Percentage of patients 18 years old or less: 2 participants - 100%; 2 participants - 99%

2.9.5 Pilot Testing

Prior to conducting actual test sessions with end-user participants, the CNIADV usability team conducted one or more internal pilot test sessions. The purposes of pilot testing were (1) to ensure that the test materials were complete, (2) to assess whether the test length was appropriate for the planned session lengths, and (3) to allow the moderator to practice moderating the session. After the pilot testing, the team updated all materials as needed.

2.9.6 Test Sessions

Each 30-minute session was conducted via WebEx and recorded for internal CNIADV Team use. The sessions were moderated by a human factors specialist from the CNIADV usability team. Individuals from the vendor teams and the CNIADV Team attended the sessions as silent observers. Each session was limited to one vendor organization; no vendor attended a session where another vendor's customer was a test participant. Each participant session included a welcome, informed consent, collection of background information, performance of test tasks, and responding to questions. Participants viewed and interacted with the medium-fidelity prototype displayed over WebEx. Participants attempted clinical-based tasks and responded to questions included in the moderator guide. The guide can be found in the document titled "R2_Forecasting_ModeratorGuide_FormativeTest2_29jun2015_v1_Final.pdf." Participants were asked to talk out loud while they were completing their tasks or thinking about how they would complete the task (e.g., to tell what they were thinking, what they were clicking or thought they would click, what they were looking for, and what information they needed). Participants were asked to provide feedback, opinions, and responses to brief background questions. A data logger from the CNIADV Team took notes in an Excel file.


The data log can be found in the file titled "R2_Forecasting_RAWdata_FormativeDataLog_06aug2015.xlsx." The following background information was collected from each participant:

• Role (e.g., MD, NP, PA);
• Years of experience;
• Specialty;
• Practice type;
• Practice size;
• Number of patients provider sees per day; and
• Percentage / number of patients age 18 or younger.

The following usability data were collected:

• Task time;
• Participant feedback, opinions, preferences; and
• Observations made by the CNIADV usability team.

2.10 Discuss/Review Findings and Determine Design Updates for Forecasting Workflow

The human factors team reviewed the findings from the usability tests. During the review, the team discussed and analyzed the findings, referenced best practice user interface design principles, and brainstormed and sketched possible updates. The findings from this round of testing included the following:

2.10.1 General Findings – Forecasting Views
• The invalid dose information was associated with the wrong dose. Participants did not necessarily see that a valid dose had been given later.
• Having no access to specific CDC recommended date ranges was confusing/frustrating (e.g., when clicking on a CDC recommended dose inside the grid).
• The overdue dose for which the catch-up range had ended was confusing. The participants did not necessarily see the explanatory text about the catch-up period having ended.
• There appeared to be some confusion about the grouping of related elements in the grid when mousing and clicking (e.g., an invalid dose, a CDC recommended dose, and a valid dose grouped together).
• In the table view it was not always clear what doses were due (especially for the overdue dose). The dates provided for the vaccines that were due were confusing.


2.10.1.1 Iterative Changes – Forecasting Views

• Created a separate table view prototype with a different patient and patient data that was considered to be equal in complexity to the data in the calendar view.
• Added tooltips and pop-ups to the table view.
• Made elements of the grid individually clickable and hoverable.
• Updated CDC recommended dose elements to show the CDC recommendations when clicked.
• Overdue doses with an expired catch-up range were colored gray instead of orange. The catch-up region was made more visually prominent on mousing.
• For the table view, due cells were filled with blue color to convey button-like elements.
• For the table view, a status column was added to contain open-ended notes or other non-critical status information such as invalid dose information.

2.11 Iteration Based on Forecasting Workflow Round 2 Usability Test

The findings from the second round of testing informed updates to the prototype. All design changes and updates were made to the prototypes after all the testing was completed for the round; unlike the first round of testing, no changes were made to the prototypes between test participants. Three primary user interface elements were tested: a calendar view, a table view, and informational tooltips and pop-ups. The evolution of the system prototypes (pre-test and post-test) is shown in Figure 14. The final version of each prototype served as the benchmark prototype to be used in the final round of summative usability testing. The changes between the pre-test and post-test calendar views shown in Figure 14 were primarily interactive and thus not fully captured by static screenshots. The primary change made across iterations is the method of interaction with individual elements of the calendar. In the earlier version, selecting an individual dose selected all of the elements that were associated with that CDC recommended dose. In the post-test iteration, individual elements are selectable independently (e.g., the "Due" badge or "Alert" icon, the invalid dose, the CDC recommended dose, a valid given dose). This approach to calendar interaction was intended to eliminate confusion as to what information is associated with a particular dose, as well as to meet users' expectations of what is and is not interactive. Some visual design changes were also made; some (e.g., pneumococcal being complete versus deferred) were made to expand the prototype's ability to convey design concepts. An example of a visual change made with the intention of mitigating user error is the greying of an overdue dose (Hib dose four in Figure 14) that has passed its catch-up period (the maximum time that can pass before the dose is no longer recommended to be given).
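
As a hypothetical illustration of this interaction model (the sketch below is not taken from the prototype code, and all names in it are ours), individually selectable calendar elements might be represented so that each element supplies only its own pop-up content:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum, auto


class DoseStatus(Enum):
    """Dose states observed in the prototype calendar view."""
    VALID_GIVEN = auto()       # completed dose that counts toward the series
    INVALID_GIVEN = auto()     # given but not counted toward the series
    DUE = auto()               # recommended and currently due
    OVERDUE = auto()           # past due but still within the catch-up period
    CATCH_UP_EXPIRED = auto()  # past due and past the catch-up period (shown greyed)


@dataclass
class CalendarElement:
    """One independently clickable element in the forecasting calendar."""
    vaccine: str                # e.g., "MMR", "Hib"
    dose_number: int | None     # None for invalid doses, which are not numbered
    status: DoseStatus
    date_given: date | None = None

    def popup_text(self) -> str:
        """Return only this element's information, mirroring the post-test
        behavior in which each grid element opens its own pop-up."""
        if self.status is DoseStatus.INVALID_GIVEN:
            return f"{self.vaccine}: invalid dose given {self.date_given} (not counted)"
        if self.status is DoseStatus.CATCH_UP_EXPIRED:
            return f"{self.vaccine} dose {self.dose_number}: catch-up period has ended"
        return f"{self.vaccine} dose {self.dose_number}: {self.status.name.lower()}"


# Example: the greyed Hib dose four described above
print(CalendarElement("Hib", 4, DoseStatus.CATCH_UP_EXPIRED).popup_text())
```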


C. Wireframe Iteration – Pre-testing

D. Wireframe Iteration – Post-Testing

Figure 14. Evolution of Immunization Forecasting Calendar View.

The changes between the pre-test and post-test table views are shown in Figure 15. In order to increase salience for vaccines that are due, the table cells were filled entirely with blue color. Overdue doses were color coded in the same manner. This revision was made in order to mitigate errors caused when users could not immediately identify which vaccines were due. In addition to the visual treatment of cells, a new column labeled "Notes" was inserted. This addressed the desire of users to enter customized information which may not have a place in the system otherwise. This could also serve as a location for non-critical vaccine information such as invalid dose indicators.


A. Wireframe Iteration – Pre-testing

B. Wireframe Iteration – Post-Testing

Figure 15. Evolution of Forecasting Table View.

As seen in Figure 16, some alert and pop-up elements remained unchanged (e.g., the poliovirus alert tooltip). The primary change occurred due to the interaction with individual elements of the calendar. For example, the pre-test second dose MMR elements shown in Figure 16 were separated in the post-test prototype into unique pop-up elements for the invalid dose that was given and the second dose that is due. This resulted from the decision to make individual grid elements clickable and separate from other grid elements. For instance, if the user selects the invalid dose icon, only information concerning the invalid dose is displayed. If the user clicks the "Due" badge, then information about the due dose is displayed. Therefore, the information that is displayed in the tooltip or pop-up depends on the individual calendar element that is selected.


A. Wireframe Iteration – Pre-testing

B. Wireframe Iteration – Post-Testing

Figure 16. Evolution of Forecasting Information Tooltips & Pop-ups.

2.12 Finalize Round 2 Forecasting Workflow Iteration

Once Round 2 testing was complete, the interactive prototypes were updated. The changes included functional, visual, and interactive improvements based on a review and analysis of the findings from the first and second rounds of testing by the human factors team. Figure 17 below demonstrates examples of the final prototype user interfaces developed from rounds one and two of formative usability testing.


A. Final Prototype Screen - Immunization Forecasting Calendar View

B. Final Prototype Screen - Immunization Forecasting Table View

C. Final Prototype Screen – Informational Tooltips and Pop-ups Figure 17. Examples of Interactive Prototype User Interfaces.


2.13 Conduct a “Mock” Summative Usability Test on Forecasting Workflow (Round 3)

A summative usability test is a method used to validate that usability goals have been achieved. In the case of health IT, the summative usability test is the method used to provide objective evidence that the application is safe and effective to use. As discussed in the UCD Primer (see "UCD Primer: Conducting User-Centered Design Processes to Improve Usability in Electronic Health Records"), the summative usability test is a rigorous evaluation method. Best practices must be followed during planning and execution of the test to gather objective evidence and validate usability goals. As part of the UCD pilot demonstration project, the CNIADV Team developed a "mock" summative usability test. We refer to the activity as a "mock" summative usability test because some best practices were not carried out; namely, only 5 providers participated in the usability test sessions (instead of the recommended number of 15 to 20). We conducted the mock summative test with a smaller number of participants in order to (1) illustrate differences in the planning and execution of a summative test compared to a formative test, (2) highlight the differences in the artifacts (e.g., test plan, moderator guide, report) that are associated with a summative test, (3) provide examples of types of tasks and questions that might be included in the summative test, and (4) provide examples of objective and subjective usability metrics that might be collected.

2.13.1 Limitations of Sample Size

None of the specific findings and metrics obtained from the "mock" summative test should be used or reported as actual summative test results for the forecasting workflow. With a sample size of only 5 participants, confidence in the reliability of the data is extremely low. Confidence intervals are statistical ranges that indicate how well the findings from a sample are likely to represent what exists in the overall population; with only 5 participants those ranges are very wide. Given that there are roughly 60,000 general practice pediatricians in the US (AMA, 2011), the feedback of 5 participants cannot represent the needs and opinions of the field.
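
To illustrate how wide those ranges become at n = 5, the sketch below computes an adjusted-Wald (Agresti-Coull) confidence interval, a common choice for small-sample task completion rates. It is an illustrative example only; the function name and the sample values are ours and are not drawn from the study data.

```python
import math


def adjusted_wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% adjusted-Wald (Agresti-Coull) confidence interval for a task
    completion rate; a common choice for small usability samples."""
    adj_n = n + z ** 2
    adj_p = (successes + (z ** 2) / 2) / adj_n
    margin = z * math.sqrt(adj_p * (1 - adj_p) / adj_n)
    return max(0.0, adj_p - margin), min(1.0, adj_p + margin)


# Two of five participants passing a subtask (an observed 40% pass rate)
low, high = adjusted_wald_ci(2, 5)
print(f"Observed 40%; 95% CI spans roughly {low:.0%} to {high:.0%}")
```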

2.13.2 Objectives

The human factors team planned and carried out a mock summative usability test as part of a multi-phase UCD process. The objectives of this round of testing included the following:

• To validate the safe and effective use of the user interface as currently designed.
• To evaluate if stated usability goals have been achieved.
• To document the test methods and results using a standard report format.

2.13.3 Methods

The test was conducted by executing the following steps:

(1) Identified the activities needed to address the objectives of the test.
(2) Wrote a test plan that identified approach, scope, objectives, materials, participants, and methods. The test plan was written in Microsoft Word using the template provided for use by Usability.gov (2013). The test plan can be found in the document titled "R3_MockSummativeUsabilityTestPlan_july2015_Final.pdf."

(3) Developed a moderator guide for use in the testing. The moderator guide outlined each test session from the opening welcome, introduction, and consent, through the testing process, and included the wrap-up questions and thanks. The moderator guide can be found in the document titled "R3_Forecasting_ModeratorGuide_MockSummativeTest_july2015_Final.pdf."

(4) Scheduled participants into 30-minute test sessions with a minimum of 30 minutes between sessions. All testing sessions were conducted during the same week.

(5) Conducted a pilot test session with an internal CNIADV Team member and updated test materials as needed.

(6) Conducted the end-user participant test sessions and logged the data in a Microsoft Excel file during each session. The data log can be found in the file titled “R3_Forecasting_RAWdata_SummativeDataLog_06aug2015.xlsx.”

(7) Analyzed and reported the results of the summative usability test. The full summative usability test report can be found in the document titled "R3_Forecasting_MockSummativeUsabilityTestReport_07aug2015_Final.pdf."

2.13.4 Prototype Description

The interactive low-fidelity prototypes described earlier (see Section 2.12) were used for testing. The prototypes included (1) an interactive Calendar view prototype that continued with the patient data that had been used in earlier rounds and (2) an interactive Table view prototype that included a different patient and different patient data that was considered to be comparable in complexity to the Calendar view content. The actual prototypes can be found in the following archived zip files:

• “R3_Forecasting_Prototype_Patient1GridView.zip” • “R3_Forecasting_Prototype_Patient3TableView.zip”

Participants were given control of the prototypes over the WebEx. Participants interacted with the prototypes to complete tasks as the moderator instructed.

2.13.5 Participants

Five (5) providers participated in this round of usability test sessions. As a note, the constraints of this demonstration project did not allow for a sufficient number of participants to conduct a summative usability test. Participants were recruited by each of the four (4) participating vendors from their customer end-users (refer to Table 2 for vendor end-user participation across testing rounds). Participants were not compensated for their participation. Participant demographic background information is provided in Table 7.


Table 7. Participant Demographic Background for Round 3 Forecasting Workflow
• Participant 1 (t3D1): MD, Family Practice, 12 years of experience, 20 patients seen per day, 0% of patients 18 years old or less
• Participant 2 (t3D4): MD, Pediatrics, 26 years of experience, 18-20 patients seen per day, 95% of patients 18 years old or less
• Participant 3 (t3D5): MD, Pediatrics, 15 years of experience, 99.9% of patients 18 years old or less
• Participant 4 (t3D2): MD, Pediatrics, 19 years of experience, 15-20 patients seen per day, 95% of patients 18 years old or less
• Participant 5 (t3D3): MD, General Pediatrics, 23 years of experience, 10 patients seen per half day, 98% of patients 18 years old or less

2.13.6 Pilot Testing

Prior to conducting actual test sessions with end-user participants, the CNIADV usability team conducted one or more internal pilot test sessions as needed. The purposes of pilot testing were (1) to ensure that the test materials were complete, (2) to assess whether the test length was appropriate for the planned session lengths, and (3) to allow the moderator to practice moderating the session. After pilot testing was conducted, the team updated materials as needed to continue with testing.

2.13.7 Test Sessions

Each session was conducted via WebEx and recorded for internal CNIADV Team use. No session recordings will be shared outside of the CNIADV Team. The sessions were moderated by a human factors specialist from the CNIADV usability team. Individuals from the vendor teams and the CNIADV Team attended the sessions as silent observers. There was a single vendor organization involved in each session. No vendor attended a session where another vendor's customer was a test participant. Participants viewed and interacted with the prototype displayed over the WebEx. Participants completed tasks as moderated via a script. The moderator did not interact with the participant during the task. Participant actions were logged by a data logger from the CNIADV usability team. The following background information was collected from each participant:

• Role (e.g., MD, NP, PA);
• Years of experience;
• Specialty;
• Practice type;
• Practice size;
• Number of patients provider sees per day; and


• Percentage / number of patients that are age 18 or younger.

The following usability data were collected:

• Effectiveness
o Percentage of tasks successfully completed within the allotted time without assistance (Pass).
o Percentage of task failures (Fail).
o Types of errors.
• Efficiency
o Task time.
o Types of errors.
• System Satisfaction
o Participant's satisfaction rating of the system as indicated by responses to the 10 statements on the System Usability Scale (SUS).
o Participant's verbalizations (e.g., comments).
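
For teams that want to compute the effectiveness and efficiency measures listed above from their own session data logs, the following sketch summarizes a single subtask. It is illustrative only; the function name and example values are hypothetical and are not taken from the study's raw data.

```python
from statistics import mean, stdev


def summarize_subtask(outcomes: list[str], task_times_min: list[float]) -> dict:
    """Summarize effectiveness (% Pass / % Fail) and efficiency (task time)
    as defined above; any outcome other than "pass" is treated as a failure."""
    n = len(outcomes)
    passes = sum(1 for outcome in outcomes if outcome == "pass")
    return {
        "n": n,
        "pct_pass": 100 * passes / n,
        "pct_fail": 100 * (n - passes) / n,
        "task_time_mean_min": mean(task_times_min),
        "task_time_sd_min": stdev(task_times_min) if len(task_times_min) > 1 else 0.0,
    }


# Illustrative values only, not the study's raw data
print(summarize_subtask(["pass", "fail", "pass", "pass", "fail"],
                        [4.1, 6.5, 3.9, 5.0, 4.6]))
```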

2.13.8 Tasks

Tasks used in a summative test should be realistic and representative of the kinds of activities a user might actually do with the system in the real world. The team constructed a single task to demonstrate summative testing for the forecasting workflow. Provider participants were asked to complete the following task:

• Order vaccines that are due today based on the patient's immunization forecast data.

Each participant completed the task twice, once using the Calendar View and once using the Table View. The order of Calendar View versus Table View was varied across participants. With the study objectives in mind, tasks were selected based on their frequency of occurrence, criticality of function, and potential to be troublesome for users.

2.13.9 Usability Results for Task "Order Due Vaccines Using the Calendar View"

The task was broken into the subtasks required to complete it. Performance was assessed for each subtask and the overall task. Each of the following subtasks was used to assess task performance with the Calendar View:

• Subtask 1.1: Identify and make clinical decision regarding adverse reaction alert on poliovirus vaccine.

• Subtask 1.2: Order poliovirus vaccine (dependent on Subtask 1.1 completion).
• Subtask 1.3: Order MMR vaccine.
• Subtask 1.4: Order varicella vaccine.


Five (5) participants attempted the task. Table 8 provides the usability test results for each subtask using the forecasting Calendar view.

Table 8. Usability Test Results for Each Subtask in the Order Due Vaccines Task Using the Calendar View
Task: Order vaccines that are due today – Calendar View Prototype. Overall task time, mean (SD): 4.8 (2.2) minutes.
• Identify and make clinical decision regarding adverse reaction alert on poliovirus vaccine: n = 5, 40% Pass, 60% Fail
• Order poliovirus vaccine (dependent on Subtask 1.1 completion): n = 5, 40% Pass, 60% Fail
• Order MMR vaccine: n = 5, 80% Pass, 20% Fail
• Order varicella vaccine: n = 5, 80% Pass, 20% Fail

2.13.10 Usability Results for Task "Order Due Vaccines Using the Table View"

The task was broken into the subtasks required to complete it. Performance was assessed for each subtask and the overall task. Each of the following subtasks was used to assess task performance with the Table View:

• Subtask 2.1: Identify and make clinical decision regarding adverse reaction alert on DTaP vaccine.
• Subtask 2.2: Order DTaP vaccine (dependent on Subtask 2.1 completion).
• Subtask 2.3: Order poliovirus vaccine.
• Subtask 2.4: Order MMR vaccine.
• Subtask 2.5: Order varicella vaccine.

Five (5) participants attempted the scenario. Table 9 provides the usability test results for each subtask using the forecasting Table view.


Table 9. Usability Test Results for Each Subtask in the Order Due Vaccines Task Using the Table View
Task: Order vaccines that are due today – Table View Prototype. Overall task time, mean (SD): 3.8 (1.1) minutes.
• Identify and make clinical decision regarding adverse reaction alert on DTaP vaccine: n = 5, 100% Pass, 0% Fail
• Order DTaP vaccine (contingent on Subtask 2.1 completion): n = 5, 100% Pass, 0% Fail
• Order poliovirus vaccine: n = 5, 80% Pass, 20% Fail
• Order MMR vaccine: n = 5, 80% Pass, 20% Fail
• Order varicella vaccine: n = 5, 80% Pass, 20% Fail

2.13.11 Discussion of the Findings

The following sections discuss the results, organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, or that are known industry risk issues, are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section and the associated mitigation strategy is provided in the Areas for Improvement section. Because effectiveness was measured with task success and failure, use errors and issues stemming from task failures are also discussed in the Effectiveness section. Noncritical system usability issues related to efficiency are discussed in the Efficiency section. Associated recommendations are provided in the Areas for Improvement section. Satisfaction was assessed for both the Calendar view and the Table view using the System Usability Scale (SUS) instrument. The SUS scores are discussed in the Satisfaction section.

2.13.11.1 Risk Analysis
• Vaccine ordering with the Forecasting Calendar View:


o Some users experienced difficulty recognizing and acknowledging alerts associated with past vaccination administration for a vaccine due today. The task required that participants either order every vaccine due today or make a clinically informed decision to not order a vaccine. Use errors were observed in which users ordered a vaccine without making a clinical decision regarding an associated adverse reaction alert. Such errors are of high severity as they could potentially lead to patient harm.

o Use errors associated with failing to order a vaccine that was due were observed. The task required that participants either order every vaccine due today or make a clinically informed decision to not order a vaccine.

o Errors classified as near-misses were also observed when some users attempted to order a vaccine which was not due. The prototype did not provide a way to order vaccines that were not due; therefore, this use error was classified as a near-miss. In a fully implemented system it is not recommended that clinicians be absolutely restricted from ordering vaccines, even those vaccines that the system does not recognize as currently due. Thus, in a fully implemented system this use error would be a critical error that could result in patient harm.

• Vaccine ordering with the Forecasting Table View:
o Some users experienced difficulty recognizing and acknowledging alerts associated with a past vaccination administration for a vaccine that was due today. The task required that participants either order every vaccine due today or make a clinically informed decision to not order a vaccine. Use errors were observed in which users ordered a vaccine without making a clinical decision regarding an associated adverse reaction alert. Such errors are of high severity as they could potentially lead to patient harm.

o Use errors associated with failing to order a vaccine that was due were observed. The task required that participants either order every vaccine due today or make a clinically informed decision to not order a vaccine.

o Errors classified as near-misses were also observed when some users attempted to order a vaccine which was not in fact due. The prototype did not provide a way to order vaccines that were not due; therefore, this use error was classified as a near-miss. In a fully implemented system it is not recommended that ordering clinicians be absolutely restricted from ordering vaccines, even those which the system does not recognize as currently due. Thus, in a fully implemented system this use error would be a critical error that could result in patient harm.

2.13.11.2 Effectiveness

• Vaccine ordering with Forecasting Calendar View:
o Performance for ordering vaccines fell below the 90% acceptance criterion for all subtasks. It is possible that some task failures were due to unfamiliar task scenarios and test configuration. However, end-users in actual clinical situations will need to order vaccines in infrequent clinical scenarios.


• Vaccine ordering with Forecasting Table View:
o Performance for ordering vaccines fell below the 90% acceptance criterion for all but two of the subtasks. It is possible that some task failures were due to unfamiliar task scenarios and test configuration. However, even in actual clinical situations, end-users will need to order vaccines in infrequent clinical scenarios.

2.13.11.3 Efficiency

• Vaccine ordering with Forecasting Calendar View:
o The most common efficiency-related comments made by participants were related to the expectation of a "shopping cart" feature which would allow for the tracking of vaccines which had been selected for ordering as well as the ability to order all selected vaccines simultaneously. This expectation led to participants spending additional time looking for such a feature. There was a button labeled "Today" which displayed all vaccines listed as due today in a single dialogue window, but some users failed to find it.
o Participants also experienced confusion as to where tooltips would display and what information would be displayed. The most common expectation was that tooltips would be displayed when an icon (e.g., an alert icon) was hovered over. This expectation most likely impacted time on task. Some participants expected CDC recommendations regarding vaccine details to be displayed somewhere in the system but had trouble locating them. This difficulty most likely affected system satisfaction and time on task.

• Vaccine ordering with Forecasting Table View:
o The most common efficiency-related comments made by participants were related to the expectation of a "shopping cart" feature which would allow for the tracking of vaccines which had been selected for ordering as well as the ability to order all selected vaccines simultaneously. This expectation led to participants spending additional time looking for such a feature. There was a button labeled "Today" which displayed all vaccines listed as due today in a single dialogue window, but some users failed to discover the button.
o Participants also experienced confusion as to where tooltips would display and what information would be displayed. The most common expectation was that tooltips would be displayed when an icon (e.g., an alert icon) was hovered over. This expectation was seen as mostly impacting time on task.

2.13.12 Areas for Improvement

Observed use errors associated with failure to recognize and make clinical decisions regarding vaccine alerts can be addressed through further user-centered design processes and research. Alerts could be made more salient or better aligned with user expectations. Observed use errors associated with failing to order vaccines which are due can also be addressed through further user-centered design processes and research. Vaccines that are due could be presented in a manner that is more in line with expectations given a user's typical workflow or could be visually treated so they are more salient and meaningful.


This can also apply to errors associated with attempting to order a vaccine which is not due. Efficiency-related errors associated with forgetting which vaccine has been ordered, as well as individual manual entry of orders, could be addressed through the implementation of a shopping cart feature. Such functionality would allow the user to keep track of which orders will be placed as well as order all vaccines at once, which might also include making decisions about what vaccines to order as combination vaccines. This would reduce time on task and frustration associated with long task times.
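
To make the shopping cart recommendation concrete, the sketch below outlines one possible shape for such a feature. It is a hypothetical illustration rather than a specification; the class and method names are ours.

```python
from dataclasses import dataclass, field


@dataclass
class VaccineOrderCart:
    """Hypothetical 'shopping cart' for immunization orders: vaccines are added
    one at a time as decisions are made, then submitted together."""
    pending: list[str] = field(default_factory=list)

    def add_to_order(self, vaccine: str) -> None:
        # "Add to Order" rather than an immediate "Order", per the Round 1 finding
        if vaccine not in self.pending:
            self.pending.append(vaccine)

    def review(self) -> list[str]:
        # Lets the clinician see everything queued before committing, including
        # opportunities to substitute combination vaccines.
        return list(self.pending)

    def place_order(self) -> list[str]:
        # Submit every queued vaccine as a single order.
        placed, self.pending = self.pending, []
        return placed


cart = VaccineOrderCart()
cart.add_to_order("IPV")
cart.add_to_order("MMR")
cart.add_to_order("Varicella")
print(cart.place_order())  # ['IPV', 'MMR', 'Varicella']
```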

2.13.13 Satisfaction and Use of the System Usability Scale (SUS)

The SUS is considered to be a reliable and valid measure of system satisfaction. The instrument contains 10 statements to which participants provide ratings of agreement. The responses from the 10 items are used to compute a single usability score from 0 to 100. Sauro (2013) reports that, based on 500 studies across various products (e.g., websites, cell phones, enterprise systems) and across different industries, the average SUS score is 68. A SUS score above 68 is considered above average and anything below 68 is below average. We encourage vendor teams to use the SUS as a measure to compare their own usability improvement in an application as changes are made. Five (5) physicians completed the SUS questionnaire after each task, rating their experience using the Calendar and Table views. The average SUS scores for each view were as follows:

• Calendar view: Average score of 57.0 (SD=21.1) based on responses from 5 providers.
• Table view: Average score of 71.0 (SD=14.0) based on responses from 5 providers.
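
For vendor teams adopting the SUS, the standard scoring procedure can be implemented in a few lines. The sketch below is illustrative; the example ratings are hypothetical and do not represent any participant's actual responses.

```python
def sus_score(responses: list[int]) -> float:
    """Compute a System Usability Scale score from the ten item ratings
    (each 1 = strongly disagree ... 5 = strongly agree)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten ratings between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # odd-numbered items vs. even-numbered items
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)


# Hypothetical ratings, not an actual participant's responses
print(sus_score([4, 2, 4, 2, 3, 2, 4, 1, 4, 2]))  # 75.0
```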

2.13.14 Reporting the Mock Summative Test

The mock summative testing round was the final end-user activity that the CNIADV Team conducted as part of the usability pilot demonstration project. The detailed methods and results of the summative usability test were documented in a report using the NISTIR 7742 Customized Common Industry Format Template for Electronic Health Record Usability Testing (NIST, 2010). The mock summative test report can be found in the document titled "R3_Forecasting_MockSummativeUsabilityTestReport_07aug2015_Final.pdf."

3 IMMUNIZATION ADMINISTRATION WORKFLOW & DATA QUALITY

3.1 Discovery and Definition of Documentation & Data Quality Workflow User Requirements

Discovery activities are conducted to inform the definition of user requirements. As described in the UCD Primer document (see "UCD Primer: Conducting User-Centered Design Processes to Improve Usability in Electronic Health Records"), the richest data collection methods for understanding the user, user environments, user workflows, and user tasks are in-person observations and interviews conducted with users in their own environments.


Improve Usability in Electronic Health Records”), the richest data collection methods for understanding the user, user environments, user workflows, and user tasks are in-person observations and interviews conducted with users in their own environments. In-person discovery activities were not conducted for this project. Instead, the CNIADV usability team completed the following activities in order to discover and define the users, user environments, user workflows, user tasks, and user information needs so that early design concepts could be created and tested:

• Cognitive Task Analysis;
• Task Mapping;
• Additional discovery activities.

3.1.1 Cognitive Task Analysis Activities for the Documentation & Data Quality Workflow
A task analysis is a breakdown of the tasks and subtasks required to successfully operate a system. A cognitive task analysis is appropriate for situations that place large mental demands on the user. The CNIADV Team conducted a cognitive task analysis on the documentation workflow. The purpose of the cognitive task analysis was to determine the tasks and subtasks the user needs to perform to complete the activity and to identify the mental demands of those tasks (especially tasks with high cognitive demands, including perception, memory, information processing, and decision making). The task analysis informed the early design concepts (e.g., prioritization and layout of information on a display). Figure 18 illustrates the output of a cognitive task analysis on immunization administration documentation with a focus on maintaining data quality. The complete analysis can be found in the file titled "DataQuality_Task Analysis_24june15.xlsx." A sketch of one way to capture such an analysis as structured data follows the figure.

Figure 18. Cognitive Task Analysis for Immunization Administration Documentation Workflow.
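A minimal sketch of how a cognitive task analysis row might be represented as structured data (the field names and example values are illustrative assumptions, not the columns of the team's spreadsheet):

```typescript
// Illustrative structure for one row of a cognitive task analysis.
type CognitiveDemand =
  | "perception"
  | "memory"
  | "information processing"
  | "decision making";

interface TaskAnalysisRow {
  task: string;               // e.g., "Document administered vaccine"
  subtask: string;            // e.g., "Enter lot number"
  cognitiveDemands: CognitiveDemand[];
  designImplication?: string; // e.g., "Provide a format cue within the field"
}

const exampleRow: TaskAnalysisRow = {
  task: "Document administered vaccine",
  subtask: "Enter lot number",
  cognitiveDemands: ["perception", "memory"],
  designImplication: "Provide a format cue within the field",
};
```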

3.1.2 Task Mapping Activities for Documentation & Data Quality Workflow
A task map is a diagram showing the tasks and subtasks users might perform to complete an activity. The CNIADV usability team created a task map for the forecasting workflow early in the discovery and definition phase (see Figure 4). The task map included a high-level task path that proceeded from selecting a patient, through reviewing the immunization forecast, to ordering. The task map was updated to include administering and documenting a vaccine. The task flow included the key aspects (e.g., required fields and formats, error messages) that the team would analyze in greater detail. The actual image of the task map is contained in the file titled "DataQuality_DiscoveryAndDefinition_Task Map v1.JPG."

Figure 19. Task Map that includes the Immunization Administration Documentation Workflow

3.1.3 Additional Discovery Activities for Documentation & Data Quality Workflow
As part of the Design and Discovery activities, the CNIADV Team performed additional analyses to better understand data quality issues related to immunization administration documentation. These analyses included:

1. Review and use of the AIRA Modeling of Immunization Registry Operations Work Group “Data Quality Assurance in Immunization Systems: Selected Aspects” report (AIRA-MIROW, 2013). The document used by the team is contained in the file titled “AIRA-MIROW DQA Selected Aspects best practice guide 05-17-2013.pdf.”

2. Review and use of the AIRA-MIROW "Lot Numbers Validation best practices micro-guide" (AIRA-MIROW, 2014). The document used by the team is contained in the file titled "AIRA-MIROW Lot Numbers Validation best practices micro-guide 05-08-2014-1.pdf."

3. Development of a human factors system level perspective on addressing the data quality issue(s) associated with documentation of an administered immunization. This is included in the document file titled “Imms_IncludingDataQualityinUCDPlan_11may2015_v1.pdf.”

4. Outlining a human factors perspective on potential guidance and limitations for addressing data quality with focus on user interface designs. This is included in the document file titled “Imms_IncludingDataQualityinUCDPlan_11may2015_v1.pdf.”

5. Conducting a root cause analysis to determine reasons that might be associated with data quality errors identified in AIRA reports. The analysis included email communication with Eric Larson, Nathan Bunker, and Noam Arzt, as well as discussion with vendors. The results of the analysis are included in the document file titled "CNIADV Workflow Issues for Usability Evaluation 19June2015_Final.pdf."

6. Review of “Recording of Lot Numbers for Reconstituted Vaccines” for information on lot number format by manufacturer. [Warren Williams, Personal communication.] The information can be found in the file titled “LotNumberRecording-2.pdf.”

7. Review and discussion of other information (e.g., resources that define HL7 fields, information on lot number formats for individual drug manufacturers, resources on valid administration route and site values for different vaccines, and other information related to data fields and value sets).

The results of this analysis defined the data elements that would be included in the usability evaluation.

3.2 Define Early Design Concepts for Immunization Administration Documentation & Data Quality Workflow

The CNIADV usability team reviewed the information from the Discovery and Definition phase of the project and produced alternative design concepts to consider for further design and testing. Design alternatives included the following:

• Automated data entry: Barcoding technology allows for data entry without manual input. One concept simulated technology that allowed critical fields displayed on the user interface to be populated from system-known information once the nurse selected the administered order to document.
• Restrict data entry fields to force top-to-bottom form completion.
• Provide field-level format cues (e.g., for a phone number, xxx-xxx-xxxx).
• Provide data validation with error messaging at the field level (see the sketch after this list).
• Provide data validation with error messaging that interrupts the user's workflow.
• Limit dropdown options based on previously selected options (e.g., if the order is for an injectable route, allow selection only of options associated with injectable vaccines and prevent selection of options not associated with injectable vaccines).
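As a concrete illustration of the format cue and field-level validation concepts, here is a minimal sketch (the field names, pattern, and message text are illustrative assumptions, not the prototype's actual implementation):

```typescript
// Illustrative field-level validation: a format cue shown in the empty field
// and an inline error message shown when the entry does not match the format.
interface FieldSpec {
  label: string;
  formatCue: string;    // placeholder text shown inside the empty field
  pattern: RegExp;      // expected entry format
  errorMessage: string; // inline message shown on a format error
}

const phoneField: FieldSpec = {
  label: "Phone Number",
  formatCue: "xxx-xxx-xxxx",
  pattern: /^\d{3}-\d{3}-\d{4}$/,
  errorMessage: "Phone Number format is xxx-xxx-xxxx. Check your entry.",
};

// Returns the inline error message to display, or null when the entry is valid.
function validateField(spec: FieldSpec, value: string): string | null {
  return spec.pattern.test(value.trim()) ? null : spec.errorMessage;
}

console.log(validateField(phoneField, "404-639-3311")); // null (valid)
console.log(validateField(phoneField, "4046393311"));   // error message
```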

3.3 Obtain Input from Subject Matter Experts on Early Immunization Administration Documentation & Data Quality Workflow Concepts

The human factors team conducted an interview over WebEx with a clinical member of the CNIADV Team acting in the role of a SME. This interview served two purposes: (1) to demonstrate planning and executing stakeholder interviews, and (2) to identify information needs and priorities for defining the early prototypes to be used in the first formative testing round. The SME provided opinions and answered questions related to vocabulary, functionality, experience with other products, and what information was needed and when (e.g., persistent on the screen, on mouse-over, one click away) in order to document immunization administration. The wireframes and questions discussed in the interview can be found in the following files:

• "SMEinterview with Concrete Example of Data Quality Wireframes.pdf"
• "SMEinterview_MMRV_Wireframes_23jun2015.pdf"
• "SMEinterview_Influenza_Wireframes_v2_23jun2015.pdf"

3.4 Create Low-Fidelity Immunization Administration Documentation & Data Quality Prototype

The CNIADV usability team created interactive prototypes for use in the first round of formative usability testing of the documentation workflow. (Note: this was executed in Round 2 of testing for the pilot demonstration project.) The primary goals of the testing phase included the following:

• To employ task-based formative testing that allows end-users to interact directly with limited functionality prototypes for specific immunization workflows.

• To demonstrate capture of objective usability measures of effectiveness, efficiency, and user satisfaction.

• To validate design details that were informed by discovery and definition activities.
• To use the findings from this test to update design concepts and prototypes in support of a second round of testing that would represent a "mock" summative usability test.

The prototypes were created using HTML and C+. The focus was on the display of information and error messaging in order to evaluate the impact on maintaining data quality during manual data entry. Prototypes included data field-level formatting cues displayed within the field (see Lot Number in Figure 20), field-level validation with inline error messaging (see Figure 20), field-level validation with error messaging that interrupts the user's workflow (see Figure 21), and an interaction that restricted data entry fields to force top-to-bottom form completion (see Figure 22). The actual prototypes can be found in the archived zip file titled "R2_DataQuality_Prototype_vaccine.zip."

Figure 20. Prototype Screen with Field Level Formatting Cues Displayed within the Field and Field Level Validation with Inline Error Messaging

Figure 21. Field Level Validation with Error Messaging that Interrupts the User's Workflow.

In addition to the concepts above, the prototypes included options in some dropdown selection controls that were constrained based on previously selected options. As shown in Figure 23, MMRV was selected as the administered vaccine to document. Therefore, the Route field dropdown selection options, other than the valid choices of Subcutaneous and Other, were gray and not available for selection.
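A minimal sketch of this constraint logic, assuming a simple vaccine-to-route mapping (only the MMRV entry is taken from the prototype description above; the other route names are illustrative):

```typescript
// Valid route choices per vaccine; entries other than MMRV are illustrative.
const VALID_ROUTES: Record<string, string[]> = {
  MMRV: ["Subcutaneous", "Other"],
};

const ALL_ROUTES = ["Intramuscular", "Subcutaneous", "Intranasal", "Oral", "Other"];

// Returns each route option together with whether it should be selectable;
// invalid options stay visible but grayed out, as in the prototype.
function routeOptions(vaccine: string): { route: string; enabled: boolean }[] {
  const valid = new Set(VALID_ROUTES[vaccine] ?? ALL_ROUTES);
  return ALL_ROUTES.map((route) => ({ route, enabled: valid.has(route) }));
}

console.log(routeOptions("MMRV"));
// Only "Subcutaneous" and "Other" are enabled; the rest are grayed out.
```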

Figure 22. Interaction that Restricted Data Entry Fields to Force Top to Bottom Form Completion.

Figure 23. Constrained Dropdown Options based on Previously Entered Data.

3.5 UCD Pilot Demonstration Round 1
No formative usability testing for immunization administration documentation and data quality was conducted during Round 1 of the UCD process for the pilot demonstration because the review of initial business requirements was underway.

3.6 Conduct Immunization Administration Documentation & Data Quality Workflow Formative Usability Test (Round 2)

The first round of formative testing for immunization administration documentation and data quality was conducted during Round 2 of the UCD process for the pilot demonstration. This round of formative usability testing was performed to evaluate design prototypes for influencing data quality during the documentation of immunization administration.

3.6.1.1 Objectives
The objectives of this round of testing included the following:

• To employ task-based formative testing that allows end-users to interact directly with limited functionality prototypes for specific immunization workflows.

• To demonstrate capture of objective usability measures of effectiveness, efficiency, and user satisfaction.

• To validate updated design details that were informed by discovery and definition activities.

• To use the findings from this test to update design concepts and prototypes in support of a second round of testing that will represent a “mock” summative usability test.

3.6.2 Methods
The test included the following steps:

(1) Identified the activities to address the objectives of the test.
(2) Wrote a test plan that identified approach, scope, objectives, materials, participants, and methods. The test plan was written in a lightweight format using Microsoft PowerPoint. The test plan can be found in the document titled "R2_FormativeUsabilityTestPlan_29jun2015_v1_Final.pdf."

(3) Developed a moderator guide for use in the testing. The moderator guide outlined the flow of each test session, from the opening welcome, introduction, consent, through the session, to the wrap-up questions and thanks. The guide was used by the moderator but allowed flexibility to include or exclude questions depending on time and participant responses. The moderator guide can be found in the document titled “R2_DataQuality_ModeratorGuide_FormativeTest1_26jun2015_Final.pdf.”

(4) Scheduled participants into 30-minute test sessions with a minimum of 30 minutes between sessions. The testing was conducted over two days.

(5) Conducted a pilot test session with an internal CNIADV Team member and updated test materials as needed.

(6) Conducted the end-user test sessions and logged the data in a Microsoft Excel file during each session. The data log can be found in the file titled “R2_DataQuality_RAWdata_FormativeDataLog_06aug2015.xlsx.”

(7) Reviewed the findings and revised the design/prototypes.

3.6.3 Prototype Description
The low-fidelity prototypes described earlier (see Section 3.4) were used for testing. Participants viewed and interacted with the prototypes over WebEx. For each prototype, the moderator asked questions related to documentation of immunization administration. The actual prototype can be found in the archived zip file titled "R3_DataQuality_Prototype_vaccine.zip."

3.6.4 Participants
Six (6) nurses participated in this round of usability test sessions. Each participating vendor recruited test participants from end-users of the product (refer to Table 2 for vendor end-user participation across testing rounds). Participants were not compensated for their participation. Participant demographic background information is provided in Table 10.

Table 10. Participant Demographic Background for Round 2 Immunization Documentation Workflow

Demographic | Number or Range Across Participants
Role | 1 RN, 2 LPN, 3 CMA
Specialty | 3 pediatrics, 3 other
Years of Experience | Range from 10 to 35 years
Number of patients seen per day | 3 see more than 15 patients per day; 3 do not currently see patients due to a change in role
Percentage of patients 18 years old or less | 2 participants: 100%; 2 participants: 85 to 99%; 2 participants: 65% or less

3.6.5 Pilot Testing
Prior to conducting actual test sessions with end-user participants, the CNIADV usability team conducted internal pilot test sessions as needed. The purposes of pilot testing were (1) to ensure that the test materials were complete, (2) to assess whether the test length was appropriate for the planned session lengths, and (3) to allow the moderator to practice the session. After pilot testing was conducted, the team updated materials as needed before continuing testing.

3.6.6 Test Sessions
Each 30-minute session was conducted via WebEx and recorded for internal CNIADV Team use. The sessions were moderated by a human factors specialist from the CNIADV Team. Each participant session included a welcome, an informed consent explanation, collection of background information, and the usability test tasks. Participants interacted with prototypes displayed over WebEx and responded to questions included in the moderator guide. The guide can be found in the document titled "R2_DataQuality_ModeratorGuide_FormativeTest1_26jun2015_Final.pdf." Participants were asked to talk out loud while they were completing their tasks or thinking about how they would complete the task (e.g., to tell what they were thinking, what they were clicking or thought they would click, what they were looking for, what information they needed, etc.). Participants were also asked to provide feedback, opinions, and responses to brief background questions. Observers from the individual vendor team and the CNIADV Team attended the sessions as silent observers. A data logger from the CNIADV Team took notes in an Excel file. Only one (1) vendor was involved in an individual session. The data log can be found in the file titled "R2_DataQuality_RAWdata_FormativeDataLog_06aug2015.xlsx." The following background information was collected from each participant:

• Role (e.g., nurse, RN, LPN, medical assistant);
• Years of experience;
• Specialty;
• Practice type;
• Practice size;
• Number of patients the nurse/CMA sees per day; and
• Percentage / number of patients that are age 18 or younger.

The following usability data were collected:

• Task time;
• Task effectiveness (Pass, Fail);
• Participant impressions and reactions to concepts; and
• Participant feedback, opinions, and preferences.

3.7 Discuss/Review Findings and Determine Design Updates for Immunization Administration Documentation & Data Quality Workflow

The human factors team reviewed the findings from the usability test. During the review, the team discussed and analyzed the findings, referenced best practice user interface design principles, and brainstormed and sketched possible updates. The findings from this round of testing included the following:

• The content of the inline error message for the Lot Number field was confusing and misunderstood (e.g., what does # or x mean?), although participants appeared to view it favorably.
• The formatting tooltip was confusing in the same manner as the inline error message.
• The field label "Order" was confusing. At least one participant commented that they would not order anything.
• The lack of automatically filled fields and the need for manual entry of data were frustrating.
• Progressive disclosure of sections in the form was confusing.
• The location of the Status drop-down in the form was not appropriate. At least one participant commented that s/he would not be completing all of the information above that field if the status of the immunization was not "Completed."

Additional findings were observed and reflected in the changes that were made to the prototype. These are described in the following section.

3.7.1 Iteration based on Immunization Administration Documentation & Data Quality Workflow Round 2 Usability Test

This iteration included design changes and updates to the prototypes based on the findings from the usability test. The user feedback data were used in conjunction with the team’s expert knowledge of human factors principles to update designs with the intention of increasing the usability of the concepts.

• Changed the inline error message from "Lot Number can only have digits and letters. The correct format is #x##x####" to "Lot Number format is #x##x####. Check your entry for the correct format." (A validation sketch for this format follows this list.)
• Added a tooltip message to the lot number box to clarify formatting requirements.
• Changed the language from "Order" to "Vaccine Administered".
• Moved the Status dropdown to the top of the form.
• Autofill-enabled fields were added/changed: Dose, Date Administered.
• The interruptive pop-up alert was eliminated.
• The progressive disclosure prototype was eliminated.
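To make the "#x##x####" convention concrete, here is a minimal, illustrative validation sketch. Reading "#" as a digit and "x" as a letter is an assumption based on the message wording, and actual lot number formats vary by manufacturer, so this is not a general-purpose validator:

```typescript
// Assumed reading of the prototype's lot number pattern "#x##x####":
// "#" = digit, "x" = letter. Real lot number formats differ by manufacturer.
const LOT_NUMBER_PATTERN = /^\d[A-Za-z]\d{2}[A-Za-z]\d{4}$/;

// Returns the revised inline error message from the iteration above,
// or null when the entry matches the expected format.
function validateLotNumber(entry: string): string | null {
  if (LOT_NUMBER_PATTERN.test(entry.trim())) {
    return null;
  }
  return "Lot Number format is #x##x####. Check your entry for the correct format.";
}

console.log(validateLotNumber("1A23B4567")); // null (matches the pattern)
console.log(validateLotNumber("ABC123"));    // returns the error message
```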

The team finalized the prototype titled “Task 2: Incorrect Lot Number Format Error” for use in the Mock Summative testing round. Changes to the prototype included the following:

• Changed label for selecting the vaccine that was administered from “Order” to “Vaccine Administered”.

• Moved “Date Administered” under “Vaccine Administered” (previously known as “Order”).

• Auto-filled "Date Administered" with the computer's date.
• Auto-filled "Expiration Date" with the computer's date.
• Changed the dose from 0.2 mL to 0.5 mL.
• Made it so the 0.5 mL dose did not show until MMRV was selected in the Vaccine Administered field.
• Changed the Site field default option from "NA" to "Select One".
• Made two radio buttons, with "No Reaction" filled by default and "Reaction" not filled.
• Changed the VIS type drop-down selection from "Print" to "Printed Copy".
• Auto-filled the "VIS Date Given to Patient" field with the computer's date.

No other changes were made to the HTML prototype. The C+ prototype used to evaluate the interaction that restricted data entry fields to force top to bottom form completion was not carried forward due to project time and cost constraints. It is a design that teams should continue to investigate.

3.8 Finalize Round 2 Immunization Administration Documentation & Data Quality Workflow Iteration

The data gathered during formative usability testing in this round were analyzed and interpreted by a team of human factors specialists. Relevant findings informed the construction of an interactive prototype used in the final round of mock summative testing. The pre-test designs included three variations of the same documentation form. The first was a simple form featuring inline error messages that displayed when a format error was made (e.g., an incorrect lot number format). The second was identical to the first but, instead of inline messaging, used a modal dialogue window that interrupted the user's workflow and forced interaction with the dialogue before continuing. The third version used a progressive disclosure approach, which initially disabled form fields and forced users to complete the form from top to bottom. For instance, the manufacturer field was not selectable initially but became selectable once the administered order option was selected. The post-test system was reduced to a single approach. The interruptive dialogue approach was discarded entirely based on feedback from participants who suggested there was no use error critical enough to warrant interrupting the user's workflow. Likewise, the progressive disclosure approach was not pursued further due to feedback indicating that users became confused and did not understand the logic of the progressive disclosure model; the design team did not have the time or resources to continue developing this design. Other design changes included resolving field autofill issues, such as an incorrect dose amount and default administration date, as well as adjusting the order of presentation of some form fields. These changes were made to mitigate use errors observed in formative testing. Figure 24 and Figure 25 provide examples of the final prototype user interfaces developed from the first round of formative usability testing for this workflow.

Figure 24. Immunization Administration Documentation Form with Tooltip.

Figure 25. Immunization Administration Documentation Form with Inline Error Message.

3.9 Conduct a “Mock” Summative Usability Test on Immunization Administration Documentation & Data Quality Workflow (Round 3)

A summative usability test is a method used to validate that usability goals have been achieved. In the case of health IT, the summative usability test is the method used to provide objective evidence that the application is safe and effective to use. As discussed in the UCD Primer (see "UCD Primer: Conducting User-Centered Design Processes to Improve Usability in Electronic Health Records"), the summative usability test is a rigorous evaluation method. It is essential to follow best practices during the planning and execution of the summative usability test in order to ensure objective evidence and to validate usability goals. As part of the UCD pilot demonstration project, the CNIADV Team carried out a "mock" summative usability test. We refer to the activity as a "mock" summative usability test because some best practices were not carried out. Namely, only two nurses participated in the usability test sessions (instead of the recommended number of 15 to 20). We conducted the mock summative test with a smaller number of participants in order to (1) illustrate differences in the planning and execution of a summative test compared to a formative test, (2) highlight the differences in the artifacts (e.g., test plan, moderator guide, report) associated with a summative test, (3) provide examples of the types of tasks and questions that might be included in a full summative test, and (4) provide examples of objective and subjective usability metrics that might be collected.

3.9.1 Objectives
The human factors team planned and carried out a mock summative usability test as part of a multi-phase UCD process. The objectives of this round of testing included the following:

• To validate the safe and effective use of the user interface as currently designed.
• To evaluate if stated usability goals have been achieved.

3.9.2 Methods
The test included the following steps:

(1) Identified the activities needed to address the objectives of the test.
(2) Wrote a test plan that identified approach, scope, objectives, materials, participants, and methods. The test plan was written in Microsoft Word using the template provided by Usability.gov (2013). The test plan can be found in the document titled "R3_MockSummativeUsabilityTestPlan_july2015_Final.pdf."

(3) Developed a moderator guide for use in the testing. The moderator guide outlined each test session from the opening welcome, introduction, gaining consent, the testing session, to the wrap-up questions and thanks. The moderator guide can be found in the document titled “R3_DataQuality_ModeratorGuide_MockSummativeTest_july2015_Final.pdf.”

(4) Scheduled participants into 30-minute test sessions with a minimum of 30 minutes between sessions. All test sessions were conducted during the same week.

(5) Conducted a pilot test session with an internal CNIADV Team member and updated test materials as needed.

(6) Conducted the end-user test sessions and logged the data in a Microsoft Excel file during each session. The data log can be found in the file titled “R3_DataQuality_RAWdata_SummativeDataLog_06aug2015.xlsx.”

(7) Analyzed and reported the results of the summative usability test. The full mock summative usability test report can be found in the document titled “R3_DataQuality_MockSummativeUsabilityTestReport_07aug2015_v1.pdf.”

3.9.3 Prototype Description
The interactive low-fidelity prototype described earlier (see Section 3.8) was used for testing. Participants were given control of the prototype over WebEx and interacted with it to complete tasks as the moderator instructed. The actual prototype can be found in the archived zip file titled "R3_DataQuality_Prototype_vaccine_DQSummative.zip."

3.9.4 Participants
Two nurses participated in this round of usability test sessions. As a note, the constraints of this demonstration project did not allow for a sufficient number of participants for a rigorous summative usability test. The number of participants in this round of testing was also lower than planned because the vendors did not provide enough contacts to the CNIADV Team. Participants were recruited by two participating vendors from their customer end-users (refer to Table 2 for vendor end-user participation across testing rounds). Participants were not compensated for their participation. Participant demographic background information is provided in Table 11. A recruiting screener was not used to identify and schedule participants, as might be done in testing to ensure that participants meet critical criteria. However, the data only include participants who were currently providing patient care. (Note: this resulted in one nurse who participated in a session being excluded from the data set because s/he had not seen patients in 3 years. This was not known until the participant session.)

Table 11. Participant Demographic Background for Round 3 Immunization Administration Documentation & Data Quality Workflow

Participant ID | Role | Specialty | Years of Experience | Number of patients seen per day | Percentage of patients 18 years old or less
1 (t3N1) | Medical assistant | Pediatrics | 11 years | 30 | 80%
2 (t3N2) | Nurse (RN) | Family Practice | 14 years | 10 - 20 | 30%

3.9.5 Pilot Testing
Prior to conducting actual test sessions with end-user participants, the CNIADV usability team conducted internal pilot test sessions as needed. The purposes of pilot testing were (1) to ensure that the test materials were complete, (2) to assess whether the test length was appropriate for the planned session lengths, and (3) to allow the moderator to practice moderating the session. After pilot testing was conducted, the team updated all materials as needed to continue with testing.

3.10 Test Sessions
Each session was conducted via WebEx and recorded for internal CNIADV Team use. The sessions were moderated by a human factors specialist from the CNIADV usability team. Individuals from the vendor teams and the CNIADV Team attended the sessions as silent observers. Only one vendor organization was involved in each session; no vendor attended a session where another vendor's customer was a test participant. Participants viewed and interacted with the prototype displayed over WebEx and completed tasks as moderated via a script. The moderator did not interact with the participant during the task. Participants were asked to talk out loud while they were completing their tasks to aid data collection. Participant actions were logged by a data logger from the CNIADV usability team. The following background information was collected from each participant:

• Role (e.g., nurse, RN, LPN, medical assistant);
• Years of experience;
• Specialty;
• Practice type;
• Practice size;
• Number of patients the provider sees per day; and
• Percentage / number of patients that are age 18 or younger.

The following usability data were collected (a brief summarization sketch follows the list):

• Effectiveness
  o Percentage of tasks successfully completed within the allotted time without assistance (Pass).
  o Percentage of task failures (Fail).
  o Types of errors.
• Efficiency
  o Task time.
  o Types of errors.
• System Satisfaction
  o Participant's satisfaction rating of the system as indicated by responses to the 10 statements on the System Usability Scale (SUS).
  o Participant's verbalizations (e.g., comments).
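A minimal sketch of how the effectiveness and efficiency measures might be summarized from a session data log (the record shape and field names are assumptions, not the team's actual analysis files; the standard deviation here divides by n):

```typescript
// Illustrative summary of effectiveness (% Pass / % Fail) and efficiency
// (mean and standard deviation of task time) from logged results.
interface LoggedResult {
  participant: string;
  pass: boolean;
  taskTimeMin?: number; // present only where task time was recorded
}

function summarize(results: LoggedResult[]) {
  const n = results.length;
  const passCount = results.filter((r) => r.pass).length;
  const times = results
    .map((r) => r.taskTimeMin)
    .filter((t): t is number => t !== undefined);
  const mean = times.reduce((a, b) => a + b, 0) / times.length;
  const sd = Math.sqrt(
    times.reduce((acc, t) => acc + (t - mean) ** 2, 0) / times.length
  );
  return {
    n,
    percentPass: (100 * passCount) / n,
    percentFail: (100 * (n - passCount)) / n,
    meanTimeMin: mean,
    sdTimeMin: sd,
  };
}

// Example shaped like the two mock summative sessions (values illustrative):
console.log(summarize([
  { participant: "t3N1", pass: true, taskTimeMin: 4.0 },
  { participant: "t3N2", pass: true, taskTimeMin: 4.0 },
]));
// -> { n: 2, percentPass: 100, percentFail: 0, meanTimeMin: 4, sdTimeMin: 0 }
```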

3.10.1 Tasks
Tasks used in a summative test should be realistic and representative of the kinds of activities a user might actually do with the system in the real world. The team constructed a single task to demonstrate summative testing for the immunization administration documentation and data quality workflow. Nurse participants were asked to complete the following task:

• Document an administered vaccine.

Each participant completed the task once. The task was selected based on frequency of occurrence, criticality of function, potential to be most troublesome for users, and influence on data quality. Tasks should always be constructed in light of the study objectives.

3.10.2 Usability Results for Task "Document Vaccine Administration"
The task was broken into the subtasks required to complete it. Performance was assessed for each subtask and for the overall task. Each of the following subtasks was used to assess task performance:

• Subtask 1.1: Select correct vaccine administered.
• Subtask 1.2: Enter correct date of administration.
• Subtask 1.3: Enter correct manufacturer.
• Subtask 1.4: Enter correct lot number.
• Subtask 1.5: Enter correct expiration date.
• Subtask 1.6: Enter correct NDC code.
• Subtask 1.7: Enter correct dose amount.
• Subtask 1.8: Enter correct site.
• Subtask 1.9: Save completed form.

Two participants attempted the task. Table 12 provides the usability test results for each subtask.

Table 12. Usability Test Results for Each Subtask in the Document Vaccine Administration Task.

Task: Document vaccine administration. Task time, Mean (SD): 4.0 (0.0) minutes.

Subtask | n | % Pass | % Fail
Select correct vaccine administered | 2 | 100% | 0%
Enter correct date of administration | 2 | 100% | 0%
Enter correct manufacturer | 2 | 100% | 0%
Enter correct lot number | 2 | 100% | 0%
Enter correct expiration date | 2 | 100% | 0%
Enter correct NDC code | 2 | 100% | 0%
Enter correct dose amount | 2 | 100% | 0%
Enter correct site | 2 | 100% | 0%
Save completed form | 2 | 100% | 0%

3.10.3 Discussion of the Findings
The following sections discuss the results organized around a risk analysis of use, test performance, and error rates. The risk analysis of use includes identification of use errors and user interface design issues, as well as classification of severity based on the consequence of the error. Use errors and user interface design issues that resulted in subtask failures, or that are known industry risk issues, are considered more severe than noncritical system usability issues related to efficiency. As such, the discussion of more serious errors and issues is provided in the Risk Analysis section, and the associated mitigation strategy is provided in the Areas for Improvement section. Because effectiveness was measured with task success and failure, use errors and issues stemming from task failures are also discussed in the Effectiveness section. Noncritical system usability issues related to efficiency are discussed in the Efficiency section, and associated recommendations are provided in the Areas for Improvement section. Satisfaction was assessed at the task level using the System Usability Scale (SUS) instrument; the SUS scores are discussed in the Satisfaction section.

3.10.3.1.1 Risk Analysis
• Vaccine administration documentation
  o Although no critical use errors were observed, there were errors related to efficiency, which are discussed in Section 3.10.3.1.3.

3.10.3.1.2 Effectiveness
• Vaccine administration documentation
  o Performance in documenting administered vaccines was above the acceptance criterion of 90% for all subtasks.

3.10.3.1.3 Efficiency
• Vaccine administration documentation
  o The most prominent efficiency-related observation concerned manual entry. Participants commented that auto-fill features are preferred to manual entry for most information. The need to manually enter data strongly influenced time on task.
  o Some participants had difficulty interpreting the inline error messages concerning field entry format errors. Participants corrected the errors, so no critical errors arose from this confusion, although it did have an impact on task time.

3.10.4 Areas for Improvement
Efficiency-related errors associated with manual entry of form fields could be addressed by the use of technology (e.g., barcode scanning of vaccines as they are entered into inventory and as they are administered) and by system design (e.g., automatic population of fields with information from the vaccine order) that minimizes manual entry for fields where direct user input is not needed. Autofill functionality would also improve effectiveness by reducing the probability of critical errors associated with manual entry (e.g., typos). A drawback of autofill functionality is that users are less likely to check and correct any errors in the prefilled information, and autofill should not be used when the information is highly variable. A brief sketch of the prefill idea follows.
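A minimal sketch of prefilling the documentation form from the vaccine order record (the record shape and field names are illustrative assumptions, not the prototype's implementation):

```typescript
// Illustrative prefill of a documentation form from a vaccine order record,
// leaving genuinely variable fields (e.g., site, reaction) for manual entry.
interface VaccineOrderRecord {
  vaccine: string;
  manufacturer: string;
  lotNumber: string;
  expirationDate: string; // ISO date, e.g., "2016-03-31"
  ndcCode: string;
  doseMl: number;
}

interface DocumentationForm extends Partial<VaccineOrderRecord> {
  dateAdministered?: string;
  site?: string;     // left for the nurse to select
  reaction?: string; // left for the nurse to record
}

function prefillForm(order: VaccineOrderRecord): DocumentationForm {
  return {
    ...order,
    // Default the administration date to today; the user can still edit it.
    dateAdministered: new Date().toISOString().slice(0, 10),
  };
}
```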

3.10.5 Satisfaction and Use of the System Usability Scale (SUS)
The SUS is considered a reliable and valid measure of system satisfaction. The instrument contains 10 statements to which participants indicate their agreement on a 5-point scale. The responses to the 10 items are used to compute a single usability score from 0 to 100. Sauro (2013) reports that, based on 500 studies across various products (e.g., websites, cell phones, enterprise systems) and across different industries, the average SUS score is 68. A SUS score above 68 is considered above average and anything below 68 is below average. We encourage vendor teams to use the SUS as a measure of their own usability improvement in an application as changes are made. Two nurse/CMA participants completed the SUS questionnaire after completing the task using the prototype system. The average SUS score was as follows:

• Average score of 78.8 (SD=26.5) based on responses from two participants.

3.10.6 Reporting the Mock Summative Test
The mock summative testing round was the final end-user activity that the CNIADV Team conducted as part of the usability pilot demonstration project. The detailed methods and results of the summative usability test were documented in a report using the NISTIR 7742 Customized Common Industry Format Template for Electronic Health Record Usability Testing (NIST, 2010). The mock summative usability test report can be found in the document titled "R3_DataQuality_MockSummativeUsabilityTestReport_07aug2015_v1.pdf."

4 DESCRIBE AND REPORT THE UCD PILOT DEMONSTRATION METHODS AND FINDINGS

4.1 Conduct Findings Review Meetings with Vendors
Upon completion of the UCD process executed in the pilot demonstration, meetings were scheduled with each vendor to review the methods and findings from the project. The artifacts described throughout this document were used in the final meeting with each vendor. Meetings with vendors covered the following topics:

• A brief description of the project, including the conceptual model with eight workflows, how we arrived at the two workflows evaluated for usability, and what the vendor might gain if CDC makes the information publicly available.

• A summary of the individual vendor’s participation (e.g., kickoff meeting and prototype review meeting dates).

• A summary of the individual vendor’s end-user participation (e.g., number of providers in each of the three rounds of testing).

• A demonstration of the prototypes developed and used in each round (e.g., the Forecasting prototype iterated from a non-interactive Microsoft PowerPoint prototype in Round 1, to an interactive HTML prototype in Round 2, to a more interactive HTML prototype in Round 3).
• A description of how the CNIADV usability team processed the findings from each round, with example findings, emphasizing how the findings changed from the formative rounds to the mock summative round.
• Questions, comments, and discussion from the vendor team.
• Feedback from the vendor about participation in this process.

4.1.1 Feedback from Vendors on Participating in the UCD Pilot Demonstration
The findings review meetings were rich in discussion and very interactive. Vendors were extremely interested in the process. They appeared to appreciate and engage in the final briefing on the overall process and the different activities that had taken place, including how the CNIADV Team had used the findings from each round to inform the next round of activities. Not all vendors participated in every round of testing, and no vendor was present during every end-user session in any round (since no vendor observed a session where another vendor's end-user was the test participant). Therefore, no single vendor had a clear view into the details of every activity. At the end of the findings review meetings, the CNIADV Team asked the vendors to share their feedback on participating in the UCD process. Vendors appeared to be very positive about their participation in the process and indicated that they had learned information that they would carry forward in their own processes. Some of the specific feedback we received from the vendors during these sessions included the following (paraphrased):

• I'm glad we participated in this process.
• I appreciate the time you spent setting it all up and giving us a debrief.
• This was a really interesting process.
• It was very interesting to compare your usability process to our usability process.
• It was great timing because we are in the middle of doing something similar.
• We benefited by getting our end-users engaged in the process. This helps our internal team by increasing awareness among our end-users about this type of process. This should help us with continued engagement of our end-users in our own internal UCD efforts.

• It was helpful to hear how CNIADV moderators phrased the questions to users, allowing users to give their opinions without bias.

• This is a really good start toward understanding and improving things. There are a lot of users involved in answering vaccine questions in the office, from the person making the appointment to when the patient can walk out the door.

• A summary document of the processes and the mockups would be very beneficial for us. We could refer to that for use in our own processes. It would give us an opportunity to spend time on the concepts and think about our approach to some of these topics based on CNIADV experience and findings.

• The process has gone very smoothly and has been very positive. The team has kept us up-to-date and been very flexible with scheduling our users.

All vendors were interested in having access to a report that described the UCD pilot demonstration activities, methods, and findings.

4.2 Prepare Written Reports for Delivery to CDC
The CNIADV Team documented the methods and findings from the usability portion of the pilot demonstration project for delivery to CDC. The documented reports include the following:

• UCD Primer that defines usability and describes the user-centered design (UCD) process and activities that are a part of the process.

• Pilot Demonstration Methods and Findings (this document), which describes in detail all of the methods that were carried out as part of the pilot demonstration, includes the artifacts that were created and used in the process, and describes the findings obtained from executing the activities in the process.

• UCD Guidance for Vendors that provides non-prescriptive guidelines related to including immunization functionality in EHRs.

5 REFERENCES

Ratwani, R.M., Fairbanks, R.J., Hettinger, A.Z., and Benda, N.C. (2015). Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. Journal of the American Medical Informatics Association. 2015 Jun 6. pii: ocv050. doi: 10.1093/jamia/ocv050. [Epub ahead of print]

Centers for Disease Control and Prevention (CDC). (2015). Recommended Immunization Schedule for Persons Aged 0 Through 18 Years. Available at http://www.cdc.gov/vaccines/schedules/hcp/imz/child-adolescent.html. Accessed May 2015.

North Carolina Immunization Registry (NCIR). (2013). NCIR Quick Reference Guide (dated 07/08/2013). Available at http://immunize.nc.gov/providers/ncirmaterialsforms.htm.

Centers for Disease Control and Prevention (CDC). (2014). Clinical Decision Support for Immunization (CDSi): Logic Specification for ACIP Recommendations, version 1.8 (dated December 16, 2014).

Tullis, T. and Albert, W. (2013). Measuring the User Experience. 2nd edition. Morgan Kaufmann, MA.

Sauro, J. and Lewis, J.R. (2012). Quantifying the User Experience. Morgan Kaufmann, MA.

Usability.gov (2013). Usability Test Plan Template. Available at http://www.usability.gov/how-to-and-tools/resources/templates/usability-test-plan-template.html.

Sauro, J. (2013). Measuring Usability with the System Usability Scale (SUS). Available at http://www.measuringusability.com/sus.php. Accessed 14 March 2013.

National Institute of Standards and Technology (NIST) (2010). NISTIR 7742: Customized Common Industry Format Template for Electronic Health Record Usability Testing.

AIRA-MIROW (2013). Data Quality Assurance in Immunization Systems: Selected Aspects. American Immunization Registry Association (AIRA) Modeling of Immunization Registry Operations Work Group (MIROW) (eds). Atlanta, GA: American Immunization Registry Association. May 2013.

AIRA-MIROW (2014). Lot Numbers Validation best practices micro-guide 05-08-2014. American Immunization Registry Association (AIRA) Modeling of Immunization Registry Operations Work Group (MIROW) (eds). Atlanta, GA: American Immunization Registry Association. May 2014.
