
EDJ2Edge Design Solutions
414 E Clark St, Vermillion, SD 57069

Western Digital Easystore External Hard Drive
Usability Test Report
Model: WDBDNK0010BBK-WESN

Ellie Arndt, Denis Kozhokar, & Jon Vogl
April 9, 2018

(Template borrowed from Usability.gov)


Table of Contents

Executive Summary
Background Information
    Product Description
    Test Objectives
Methodology
    Participants
    Procedure
    Usability Tasks
    Usability Metrics
Results
    Task Completion
    Error Count
    Time on Task
    Subjective Measure: SUS
Recommendations
    Problem Severity
Conclusion
References


Executive Summary

This report contains the results of a laboratory-based user study conducted to evaluate 5 participants' out-of-box experience with the Western Digital Easystore External Hard Drive. The objectives of this study were to determine whether the external hard drive fits the description of "plug and play" and whether the core features of a typical external hard drive can be executed both effectively and efficiently by its users.

Maintain (Hardware) – The device packaging was appropriate in that it was not excessive, and users did not have issues or commit errors when unboxing and first connecting the device to the desktop. Regarding the ability of users to "plug and play," users were able to "plug" effectively and efficiently.

Maintain (Software) – The Western Digital Easystore External Hard Drive provided the user easy access to the installation program for the WD Discovery software in a way that was both optional and easy to understand. Most participants were able to upload and delete files to and from the hard drive using the file explorer provided by the Windows operating system. Users were also able to check the available space remaining on the hard drive in multiple ways, allowing freedom of use of the device.

Improve – The software used to interact with the product proved troublesome for each user, for varying reasons. Installing the WD Backup software was not intuitive for most users: WD Backup is labeled as an "App" in the WD Discovery software interface, causing users to overlook the most obvious way to perform the task. WD Backup is not actually needed to back up and upload files onto the hard drive, but it is required to schedule an automatic backup plan. Users had difficulty locating the correct software, WD Utilities and WD Backup, to set a sleep schedule and a backup schedule for the device, respectively. When attempting to disconnect, the device was still considered to be "in use" even when the software was closed; users had to explicitly exit the software in order to follow Windows protocol to disconnect the device.

Negative findings are accompanied by recommendations, grounded in the evaluation, to improve effective and efficient use of the Western Digital Easystore External Hard Drive.

Overall, the Western Digital Easystore External Hard Drive is worthy of the "plug and play" title. The product is easy for users to interact with when performing basic connection and interaction procedures such as unboxing the product, connecting it to the computer, installing the WD Discovery software, managing remaining space, removing saved content, and disconnecting the product from the computer.


Background Information

Product Description

The Western Digital Easystore External Hard Drive is designed to offer an effective, efficient, and portable storage and backup solution for users of all computer experience levels. The external hard drive is advertised as "plug and play," allowing users to store files and interact with the hard drive's settings immediately, without tackling a pile of documentation. Accompanying software gives the user complete control over the external hard drive. With varying capacity options (1, 2, and 4 TB) and a slim portable design, the Western Digital Easystore External Hard Drive (henceforth referred to as the 'product') offers users the space they need wherever they may need it.

Test Objectives

The current evaluation was brought to EDJ2Edge Design Solutions by the marketing team at Western Digital Corporation, based in San Jose, CA. The requested focus of the usability test was to answer the question: "Is the Western Digital Easystore External Hard Drive truly plug and play?" More specifically, the product was to be evaluated to determine whether the core features of a typical external hard drive can be executed both effectively and efficiently by its users.

To answer this question, the following test objectives were explored using a user testing approach:

• Locate indirect or unintended task completion sequences (i.e., extra, unneeded steps for completion).
• Identify how much time it takes to complete tasks.
• Discover where users get lost and/or commit errors in the process of completing a task.
• Evaluate the visual interface, navigation, and cognitive constraints.
• Identify overall user satisfaction with the usability of the product.
• Collect findings for future design recommendations.

To reach these test objectives, a laboratory-based user study was conducted to evaluate 5 participants' out-of-box experience with the product. Each participant fit the target user profile of a college student with various digital storage needs. Each participant was tasked with attempting to complete a set of representative task scenarios in line with the functionality of the product. As participants completed each task, they were required to engage in a 'think aloud' procedure, allowing for a more thorough analysis of their actions, intentions, and errors.

Three objective performance measures (completion rate, error count, and time on task) and a subjective usability measurement (System Usability Scale) were analyzed to determine user performance and satisfaction. A problem severity measurement is provided by the usability team to categorize findings for product recommendations.


Methodology

The methods employed in the user test aimed to evaluate an out-of-box experience with the product in a controlled laboratory setting similar to a participant's home office. The laboratory setting offered participants a desktop computer running the Windows XP operating system and the usability data collection software Morae, which recorded each participant's actions on the desktop computer, their physical interactions with the product, and their think-aloud vocalizations. The evaluation examined participant background information, three objective performance measures (completion rate, error count, and time on task), and one subjective performance measure (System Usability Scale) for five participants.

Participants

A total of 5 participants (male=4, female=1, mean age=22.4) were recruited for the usability test of the product. The participants were recruited from the population of the usability team's co-workers, friends, and other contacts. Moderated user testing took place at the Heimstra Human Factors Laboratory in Vermillion, South Dakota.

All participants were required to be over 18 years of age and to have at least minimal experience with modern computers and the Windows operating system. All participants engaged in a think-aloud training session prior to the usability test to ensure they understood the procedure.

Procedure

Two members of the usability team were present for each study. One member acted as the facilitator, tasked with guiding, assisting, and prompting the participant throughout the study. The other team member acted as a data logger, tasked with collecting video data of the participant's physical interactions with the product and taking notes. The remainder of the data, i.e., the user's on-screen interactions and think-aloud vocalizations, was logged by the Morae usability software.

The facilitator briefed the participants on the Western Digital Easystore External Hard Drive and instructed each participant that they were evaluating the product, rather than the facilitator evaluating the participant. Participants signed an informed consent form acknowledging that their participation was voluntary, that participation could cease at any time, and that the session would be videotaped but their identity would be safeguarded. Participants then completed a pretest demographic and background information questionnaire.

The facilitator explained that the amount of time taken to complete each test task would be measured and that exploratory behavior outside the task flow should not occur until after task completion. At the start of each task, the participant read aloud the task description from their printed copy of the task scenario/list and began the task. Time-on-task measurement began when the participant started the task.

After all task scenarios were attempted, the participant completed the System Usability Scale, a subjective evaluation questionnaire designed to evaluate the usability of a product. All SUS and participant opinion data were presented and recorded by the Morae usability software.


Usability Tasks

The usability tasks were derived from a list of the product's functionality and from user reviews of products in the same category. Additionally, tasks were selected in a way that limits the testing to features offered by the product, rather than by the operating system used to interact with it. Given the range of functionality provided by the product and the short time for which each participant was available, the chosen tasks were the most common and relatively complex of the available functions. The tasks were identical for all participants in the study.

Each participant attempted to complete ten different tasks after being given two scenarios, which are more relatable to the participant than a task list. The scenarios chosen for this user test encompass a typical user interaction with external hard drive technology and address all ten tasks analyzed in the current study. The two scenarios are listed below, with task number references in brackets.

"Unfortunately, you lost your old external hard drive and you had to buy a new one to back up some files on your PC. So, you purchased the Western Digital Easystore External Hard Drive to act as your replacement. [Tasks 1, 3, 4] You have a folder filled with scenic landscape pictures you took on your recent trip to the Rocky Mountains that you would like to back up immediately. [Tasks 2, 5] Assuming you have enough space remaining on the hard drive after saving your photo collection to it (at least 500 GB) [Task 6], you also want the new external hard drive to back up your PC on a weekly basis, every Sunday at 1:00 PM, automatically. [Task 7] You also want your external hard drive to go into sleep mode after 45 minutes of inactivity to save on power consumption. [Task 8] Finishing these tasks will help you rest assured your files are safe."

The participant was then asked to close every open window on the desktop and was presented with the next scenario, designed to enable the performance of two additional tasks.

"Luckily, you happened to find your old external hard drive hidden away on your bookshelf. Since you no longer require a second external hard drive, you plan to return the Western Digital Easystore External Hard Drive to the store. Once your saved content is completely erased from the hard drive [Task 9], you can disconnect [Task 10] and repackage the device to return it later."

After each scenario was read to the participant, they were given a copy of the scenario with a checklist of tasks to be performed. The checklist was written in a way that provides no algorithmic hints regarding the completion of the tasks. A more technical description of the tasks analyzed in this user test, including expected use frequency, task prerequisites, and success criteria, is provided in Table 1.


Task 1 (Low Use): Unbox product.
    Requirements: 1. Sealed product.
    Success Criteria: The product and all components are out of the packaging.

Task 2 (High Use): Connect the product to the computer.
    Requirements: 1. Product and USB cable; 2. Computer with USB input (assumed for other tasks).
    Success Criteria: The drive appears in the file explorer directory.

Task 3 (Low Use): Install WD Discovery software.
    Requirements: 1. Access to WD Discovery software.
    Success Criteria: WD Discovery software is successfully installed.

Task 4 (Low Use): Install WD Backup software.
    Requirements: 1. Access to WD Backup software.
    Success Criteria: WD Backup software is successfully installed.

Task 5 (High Use): Transfer files to the product.
    Requirements: 1. Product connected to computer; 2. Folder of photos to transfer.
    Success Criteria: Files are accessible through the external hard drive.

Task 6 (Medium Use): Check remaining space on product.
    Requirements: 1. Product connected to computer.
    Success Criteria: Participant makes note of remaining space.

Task 7 (Medium Use): Set automatic backup frequency.
    Requirements: 1. Product connected to computer; 2. WD Backup software installed.
    Success Criteria: The backup frequency is saved as Sunday at 1:00 PM.

Task 8 (Low Use): Customize the product settings (example: set sleep schedule).
    Requirements: 1. Product connected to computer; 2. WD Utilities software installed.
    Success Criteria: External hard drive sleep setting is saved as 45 minutes.

Task 9 (Low Use): Erase all content saved on the hard drive during the user test.
    Requirements: 1. Product connected to computer; 2. Folder of photos.
    Success Criteria: The software was used to erase all content on the hard drive.

Task 10 (High Use): Properly disconnect the product from the computer.
    Requirements: 1. Product connected to computer.
    Success Criteria: Hard drive and computer are not connected.

Table 1. Task list with use frequency, requirements, and defined success criteria used in the current study.


Usability Metrics

Usability metrics refer to user performance measured against specific performance goals necessary to satisfy usability requirements. Scenario completion success rates, error rates, and a subjective evaluation were used. Time-to-completion of scenarios was also collected.

Scenario Completion

Each scenario required or requested that the participant obtain or input specific data that would be used in the course of a typical task. The scenario was considered complete when the participant indicated the scenario's goal had been reached (whether successfully or unsuccessfully), or when the participant requested and received sufficient guidance to warrant scoring the scenario as a critical error.

Critical Errors

Critical errors are deviations at completion from the targets of the scenario. Obtaining or otherwise reporting the wrong data value due to participant workflow is a critical error. Participants may or may not be aware that the task goal is incorrect or incomplete. Independent completion of the scenario is a universal goal; help obtained from the other usability test roles is cause to score the scenario as a critical error. Critical errors can also be assigned when the participant initiates (or attempts to initiate) an action that will result in the goal state becoming unobtainable. In general, critical errors are unresolved errors during the process of completing the task, or errors that produce an incorrect outcome.

Non-Critical Errors

Non-critical errors are errors that the participant recovers from or that, if not detected, do not result in processing problems or unexpected results. Non-critical errors can always be recovered from during the process of completing the scenario. Procedural errors, such as using excessive steps to reach a goal, are considered non-critical errors. Brief exploratory behavior, such as opening the wrong menu while searching for a function, is not coded as a non-critical error. However, if more than 15 seconds are spent searching for a function, the exploratory behavior is coded as a non-critical error.

Scenario Completion Time (Time on Task)

The time to complete each scenario, not including subjective evaluation durations, was recorded. Additionally, the time to actually complete each task was compared to the time spent exploring the product's features before beginning the task.

Subjective Evaluations

Subjective evaluations regarding ease of use and usability of the product were collected via the System Usability Scale (SUS) and during debriefing at the conclusion of the session. The SUS is a questionnaire consisting of ten items, each with five possible responses ranging from "Strongly agree" to "Strongly disagree". The SUS is a time-tested, valid subjective evaluation metric that can effectively differentiate between usable and unusable systems.
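For reference, a SUS score can be computed from the ten item responses using the conventional scoring arithmetic; the Python sketch below illustrates it with a hypothetical set of responses, not data from this study.

def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten item
    responses on a 1-5 scale (1 = "Strongly disagree", 5 = "Strongly
    agree"): odd-numbered items contribute (response - 1), even-numbered
    items contribute (5 - response), and the sum is scaled by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical participant's responses (illustrative only):
print(sus_score([4, 2, 4, 1, 3, 2, 5, 2, 4, 3]))  # -> 75.0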


Results

Task Completion

Task completion is determined by whether each test participant was able to successfully complete the task with the correct output and without critical errors. Unboxing the hard drive, connecting the hard drive, installing the WD Discovery software, and erasing saved content from the hard drive were achieved by all participants. Installing the WD Backup software and setting a sleep schedule had low completion rates (2 of 5 participants successfully completed each), and setting the backup schedule had the lowest completion rate (1 of 5 participants successfully completed). Each participant failed to complete at least 2 of the 10 tasks.

Task Completion for Each Participant

Task                     P1     P2     P3     P4     P5
Unbox                    100%   100%   100%   100%   100%
Connect                  100%   100%   100%   100%   100%
Install WD Discovery     100%   100%   100%   100%   100%
Install WD Backup        100%   0%     0%     100%   0%
Transfer Folder          100%   100%   100%   0%     100%
Check Space Remaining    100%   100%   0%     100%   100%
Set Backup Schedule      0%     0%     0%     0%     100%
Set Sleep Schedule       0%     100%   0%     0%     100%
Erase Saved Content      100%   100%   100%   100%   100%
Disconnect Hard Drive    100%   100%   100%   100%   0%

Table 2. Task completion rate for each participant (0% indicates a failed task).

Figure 1. Task completion distribution for all participants. The task of setting a backup schedule was failed by 4 out of 5 participants.
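The completion rates above can be recomputed directly from the outcomes in Table 2; a minimal Python sketch, with outcomes transcribed from Table 2:

# Completion outcomes from Table 2 (1 = completed, 0 = failed), ordered P1-P5.
completion = {
    "Unbox":                 [1, 1, 1, 1, 1],
    "Connect":               [1, 1, 1, 1, 1],
    "Install WD Discovery":  [1, 1, 1, 1, 1],
    "Install WD Backup":     [1, 0, 0, 1, 0],
    "Transfer Folder":       [1, 1, 1, 0, 1],
    "Check Space Remaining": [1, 1, 0, 1, 1],
    "Set Backup Schedule":   [0, 0, 0, 0, 1],
    "Set Sleep Schedule":    [0, 1, 0, 0, 1],
    "Erase Saved Content":   [1, 1, 1, 1, 1],
    "Disconnect Hard Drive": [1, 1, 1, 1, 0],
}

for task, outcomes in completion.items():
    rate = 100 * sum(outcomes) / len(outcomes)
    print(f"{task:<24} {rate:>5.0f}%")  # e.g. "Set Backup Schedule     20%"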


Error Count

Critical errors left participants unable to complete a task; non-critical errors interfered with task completion but did not prohibit it. The number of critical and non-critical errors committed by each participant is shown below. Participants encountered the most errors (both critical and non-critical) when attempting to set a sleep schedule, followed by installing the WD Backup software and setting a backup schedule. Setting a backup schedule and setting a sleep schedule for the hard drive resulted in the highest numbers of critical errors.

Critical (C) and Non-Critical (NC) Error Count for Each Participant

Task                     Individual error counts (C and NC, P1-P5)   Total
Unbox                                                                 0
Connect                                                               0
Install WD Discovery                                                  0
Install WD Backup        2, 1, 1, 1, 1, 1                             7
Transfer Folder          1, 2, 1                                      4
Check Space Remaining                                                 0
Set Backup Schedule      1, 1, 1, 2, 1                                6
Set Sleep Schedule       1, 2, 1, 1, 1, 1, 1, 1                       9
Erase Saved Content      1, 1                                         2
Disconnect Hard Drive    1                                            1

Table 3. Critical and non-critical error count for each participant; blank rows indicate tasks completed without errors. Setting a sleep schedule had the greatest number of errors across all participants.

Figure 2. Error distribution for all participants. If a task is error free 80% or more of the time, the usability goal was met.
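The 80% goal referenced in Figure 2 amounts to checking, per task, what fraction of participants committed no errors. A minimal sketch follows; the per-participant counts are hypothetical, since Table 3's aggregates do not identify individual participants:

def error_free_rate(error_counts):
    """Percentage of participants who performed a task with zero
    errors (critical and non-critical combined)."""
    return 100 * sum(1 for n in error_counts if n == 0) / len(error_counts)

# Hypothetical per-participant error totals for one task (P1-P5),
# for illustration only:
counts = [0, 0, 1, 0, 0]
rate = error_free_rate(counts)
print(f"{rate:.0f}% error free -> usability goal {'met' if rate >= 80 else 'not met'}")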


Time on Task

The time to complete a task scenario is referred to as "time on task". It is measured from the time the participant begins the scenario to the time they signal completion. While there is no standard against which to compare the time-on-task values, it is informative to note relationships such as exploration time versus task time, and time on task relative to the errors experienced during the tasks.

A difference between exploration time and task time was noticed in 5 tasks. Exploration time is defined as time spent searching for the location of, or prerequisites to, beginning the task; task time is defined as the time it took to complete the task after the location or prerequisites were found. The largest exploration times occurred during the 'Install WD Backup' and 'Set Sleep Schedule' tasks. It is clear that the location of the installation application for the WD Backup software is not readily apparent to the user, causing users to spend a significant amount of time searching for the application. The same can be said of the option to set a sleep schedule. Participant 4's time-on-task data for the 'Transfer Folder' and 'Erase Saved Content' tasks are removed from Figure 3 due to uncorrected failures that occurred during the testing procedure.

Time on Task for Each Participant (in seconds)

Task                     P1       P2       P3       P4        P5       Mean
Unbox                    31.53    18.22    28.36    47.81     34.39    32.06
Connect                  17.36    15.98    25.17    46.89     18.97    24.87
Install WD Discovery     38.12    52.69    35.56    42.53     78.20    49.42
Install WD Backup        112.96   169.73   76.01    37.01     115.64   102.27
Transfer Folder          27.74    26.79    49.28    192.36*   115.42   82.32
Check Space Remaining    17.78    15.38    33.61    26.00     9.91     20.53
Set Backup Schedule      183.55   158.29   140.54   50.45     32.46    113.05
Set Sleep Schedule       125.94   42.71    138.58   140.48    95.56    108.65
Erase Saved Content      19.86    27.83    65.85    145.17*   17.23    55.19
Disconnect               21.44    47.34    36.92    34.38     52.34    38.48
Total Time on Tasks      596.28   574.96   629.88   763.08    570.12   626.84

Table 4. Time on task for each participant. P4's data for the two tasks marked with * are outliers.

Figure 3. Average time on task for all participants (P4's outlier data removed).
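The Figure 3 averages for the two affected tasks simply exclude P4's outlier values before taking the mean; a minimal Python sketch using the values from Table 4:

# Times (seconds) from Table 4; None marks P4's outlier values,
# which are excluded from the Figure 3 averages.
times = {
    "Transfer Folder":     [27.74, 26.79, 49.28, None, 115.42],
    "Erase Saved Content": [19.86, 27.83, 65.85, None, 17.23],
}

for task, values in times.items():
    kept = [v for v in values if v is not None]
    mean = sum(kept) / len(kept)
    print(f"{task}: mean {mean:.2f} s over {len(kept)} participants")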


Subjective Measure: SUS

Subjective evaluations regarding ease of use and usability of the product were collected using the System Usability Scale (SUS); the results can be seen in Figure 4 below. The commonly accepted average score for the SUS is 68, a value obtained through the analysis of over 500 studies using the metric (Sauro, 2011).

For the tested product, the group mean of the SUS scores was 65. Relative to the accepted average, the product's group mean is below average (below the 50th percentile). Converted to a more relatable percentile rank, the product scores at the 43.5th percentile, roughly equivalent to a D letter grade (Sauro, 2011).

Individual participant ratings varied. Participant 3, who identified as a beginner user, rated the product higher than the group mean. However, the beginner user had the most trouble locating and successfully completing tasks (4 of 10 tasks failed) and spent the most time on task among all participants. Given the significant amount of guidance offered to Participant 3 by the facilitator, the rating may be skewed higher by Participant 3's limited direct interaction with the product (more guidance from the facilitator means less interaction) relative to the other participants.

Participant 5, who identified as an average user, ranked the product as very poor. Participant 5, who also claimed to have experience with other external hard drives, called the product "the least user friendly external hard drive he has ever used", explaining the low rating.

Figure 4. SUS scores for the tested product across all participants.


Recommendations

Problem Severity

To prioritize recommendations, a problem severity classification was used in the analysis of the data collected during user testing. The approach treats problem severity as a combination of two factors: the impact of the problem and the frequency with which users experienced the problem during the evaluation.

Impact

Impact ranks the consequences of a problem by the level of effect it has on successful task completion. There are three levels of impact:

• High: prevents the user from completing the task (critical error)
• Moderate: causes user difficulty, but the task can be completed (non-critical error)
• Low: minor problems that do not significantly affect task completion (non-critical error)

Frequency

Frequency is the percentage of participants who experience the problem when working on a task:

• High: 60% or more of the participants experience the problem
• Moderate: 21% to 59% of the participants experience the problem
• Low: 20% or fewer of the participants experience the problem

Problem Severity Classification

The identified severity for each problem implies a general reward for resolving it, and a general risk for not addressing it, in the current release.

Severity 1 - High-impact problems that often prevent a user from correctly completing a task. They occur at varying frequencies and are characteristic of calls to the Help Desk. Reward for resolution is typically exhibited in fewer Help Desk calls and reduced redevelopment costs.

Severity 2 - Moderate- to high-frequency problems with moderate to low impact, typical of erroneous actions that the participant recognizes need to be undone. Reward for resolution is typically exhibited in reduced time on task and decreased training costs.

Severity 3 - Either moderate-impact problems with low frequency or low-impact problems with moderate frequency; these are minor annoyances faced by a number of participants. Reward for resolution is typically exhibited in reduced time on task and increased data integrity.

Severity 4 - Low-impact problems faced by few participants; there is low risk in not resolving these problems. Reward for resolution is typically exhibited in increased user satisfaction.
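Read as a lookup rule, the classification maps an impact level and an observed frequency to a severity rank. A minimal Python sketch, consistent with the definitions above and spot-checked against Table 5 (the function name and signature are illustrative):

def severity(impact, frequency_pct):
    """Map impact ('high' | 'moderate' | 'low') and the percentage of
    participants who experienced the problem to a severity rank
    (1 = most severe), per the classification above."""
    freq = ("high" if frequency_pct >= 60
            else "moderate" if frequency_pct >= 21
            else "low")
    if impact == "high":
        return 1                       # prevents task completion
    if impact == "moderate":
        return 2 if freq != "low" else 3
    return {"high": 2, "moderate": 3, "low": 4}[freq]

# Spot checks against Table 5: moderate impact at 80% -> 2,
# low impact at 60% -> 2, low impact at 20% -> 4.
print(severity("moderate", 80), severity("low", 60), severity("low", 20))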

The issues and recommendations are ranked by severity in Table 5 below.


Major Issues and Recommendations

Failure to locate the WD Backup installation application. (Impact: High, Frequency: 60%, Severity: 1)

Recommendations: The WD Discovery interface that gives the user access to the WD Backup software is unnecessarily hidden in a tab labeled "Apps". No participant made the connection that the software would be considered an "app", a term commonly associated with mobile devices. The best fix would be to combine WD Backup, WD Discovery, and WD Utilities into one software package, ensuring each utility is installed for the user automatically. A simpler solution would be to make the WD Backup software easier to locate.

Failure to locate WD Utilities to set the sleep timer. (Impact: High, Frequency: 60%, Severity: 1)

Recommendations: The WD Utilities software is automatically installed when WD Discovery is installed. However, the system offers no immediate indication that WD Utilities was installed on the computer. Making it apparent that two pieces of software were installed, or, ideally, combining all software into a single package, would reduce search time and failure rates for less common tasks such as managing the product's settings.

Failure to save a folder to the product. (Impact: High, Frequency: 20%, Severity: 1)

Recommendations: One participant had trouble with the simple task of saving a folder onto the product, having tried to perform the task using only the WD Backup software. Although the task should be completable this way, the folder was not transferred to the drive even after pressing 'Backup Now'. Ensuring that the WD Backup software can back up files immediately and intuitively, rather than relying on future schedules and abstract timelines, would make the multiple routes to task completion more efficient.

Failure to disconnect the product while operational. (Impact: High, Frequency: 20%, Severity: 1)

Recommendations: Trying to disconnect the product while it is running in the background alerts the user with a popup that provides no option to pause or quit the currently running processes. Making this option available on the popup would be desirable.

Failure to set a proper backup schedule meeting all desired criteria. (Impact: Moderate, Frequency: 80%, Severity: 2)
Failure to set a backup for the desired folder. (Impact: Moderate, Frequency: 60%, Severity: 2)
Failure to set the correct time schedule. (Impact: Low, Frequency: 20%, Severity: 4)

Recommendations: Setting a backup schedule proved difficult due to errors in selecting a proper time and folder for the backup process. Redesigning this process, with proper user testing during design, would be the most efficient way to increase its usability.

Folder selection issue: Navigating the folder selection interface proved difficult for most users, as they expected the information to be interactive in a way similar to how they normally interact with files on their computer and the product. Mimicking the typical interface with a select or drag-and-drop process would be desirable.

Time schedule issue: Hourly, Daily, and Monthly options are provided to the user; when participants were prompted to set a weekly schedule, the current design caused confusion. Providing an additional option for weekly backups would be simple and would reduce confusion.

Unnecessary toggling of the sleep timer switch. (Impact: Low, Frequency: 60%, Severity: 2)

Recommendations: The switch indicating whether the sleep timer was on or off confused most participants. When the sleep timer was toggled on, it was perceived to be off, causing participants to toggle the switch multiple times. Making this switch's status more definitive would reduce unnecessary user frustration.

Table 5. Major issues and recommendations, ranked by severity. Issues that had major overlap with the issues listed here are omitted, as fixing these will fix the others.


Conclusion

Is the Western Digital Easystore External Hard Drive truly 'plug and play'? The current user test answered this question with a qualified 'Yes'. The product proved easy to interact with when performing tasks related to basic connection and interaction procedures such as unboxing the product, connecting the product to the computer, installing the WD Discovery software, managing remaining space, removing saved content, and disconnecting the product from the computer. While two participants had minor issues when transferring files to the hard drive, these issues are believed to be a result of the testing environment rather than issues inherent in the product. With the success of these tasks in mind, it is easy to state that the product is worthy of the 'plug and play' title.

However, major issues concerning interaction with the product's accompanying software drastically dent the product's usability performance, in both objective and subjective measures. The hidden placement of the installation files for the WD Backup software proved extremely problematic, as failing to install the software prevents the user from using a desirable feature of the product. Should the user manage to find and install the software, the interface of the WD Backup software proves problematic as well: unintuitive options and poor displays of information confused most participants. Lastly, finding and scheduling a sleep timer forced the user to use a new piece of software that offered no indication it was installed (as it was installed alongside WD Discovery).

These major issues, while critical, can be fixed with a simple reconfiguration of the software used to interact with the product. Combining all of the features of WD Discovery, WD Utilities, and WD Backup into one wrapper would reduce confusion and exploration time. This simple fix, paired with the successes found in the other tasks, would drastically increase the overall usability of the product.

The current study confirmed that the basic features of the Western Digital Easystore External Hard Drive are worthy of the 'plug and play' description, offered recommendations for immediate fixes, and gave Western Digital Corporation a clear direction for improving its product's software suite.


References

Sauro, J. (2011). SUStisfied? Little-known System Usability Scale facts. User Experience Magazine, 10(3).