CS305: HCI in SW Development Continuing Evaluation: Asking Experts • Inspections and walkthroughs


TRANSCRIPT

Page 1:

CS305: HCI in SW Development

Continuing Evaluation: Asking Experts

• Inspections and walkthroughs

Page 2:

Overview

Page 3:

The aims

Describe how to do heuristic evaluation & walkthroughs. Discuss strengths & limitations of these techniques.

Old topics in this unit (not covered this year)

Discuss the role of interviews & questionnaires in evaluation.

Teach basic questionnaire design. Describe how to collect, analyze & present data.

Page 4:

Overview: Evaluation By Experts

• Several approaches
  – Heuristic evaluation
  – Walkthroughs (several flavors)
• In general:
  – Inexpensive and quick compared to asking users
    • “Discount evaluation”
  – Experts may suggest solutions (users probably don’t)

Page 5:

Asking experts

• Experts use their knowledge of users & technology to review software usability

• Expert critiques (crits) can be formal or informal reports

• Heuristic evaluation is a review guided by a set of heuristics

• Walkthroughs involve stepping through a pre-planned scenario noting potential problems

Page 6:

Heuristic evaluation

• Developed by Jakob Nielsen in the early 1990s
• Based on heuristics distilled from an empirical analysis of 249 usability problems
• These heuristics have been revised for current technology, e.g., HOMERUN for the web
• Heuristics are still needed for mobile devices, wearables, virtual worlds, etc.
• Design guidelines form a basis for developing heuristics

Page 7:

Nielsen’s “general” heuristics

(Remember these?)
• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Help users recognize, diagnose, recover from errors
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help and documentation

Page 8:

Nielsen HOMERUN

• Derived from general heuristics
  – More specific for commercial Web sites:
    • High-quality content
    • Often updated
    • Minimal download time
    • Ease of use
    • Relevant to users’ needs
    • Unique to the online medium
    • Netcentric corporate culture

Page 9:

Discount evaluation

• Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used.
• Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems (a rough model is sketched below).
• Note how similar this is to the results of “quick and dirty” user studies.
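
The 5-evaluator figure is usually explained with a simple probability model of problem discovery (Nielsen & Landauer). The sketch below is illustrative only: it assumes each evaluator independently finds any given problem with probability p, and the value p = 0.26 is a hypothetical choice that reproduces the slide's 75-80% figure, not a number taken from these slides.

# Illustrative sketch: proportion of usability problems found by i independent
# evaluators, modeled as 1 - (1 - p)**i. The per-evaluator detection rate p is
# an assumption; p = 0.26 is chosen so that 5 evaluators land in the 75-80% range.

def proportion_found(evaluators: int, p: float = 0.26) -> float:
    """Expected fraction of all usability problems found by the given number of evaluators."""
    return 1.0 - (1.0 - p) ** evaluators

if __name__ == "__main__":
    for i in range(1, 11):
        print(f"{i:2d} evaluators -> {proportion_found(i):5.1%} of problems found")
    # With p = 0.26, five evaluators find roughly 78% of the problems, which is
    # why around five experts is the usual "discount evaluation" recommendation.

Under this model, adding evaluators beyond five yields diminishing returns, which is the usual argument for keeping the expert panel small.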

Page 10:

3 stages for doing heuristic evaluation

• Briefing session to tell experts what to do
• Evaluation period of 1-2 hours, in which:
  – Each expert works separately
  – Take one pass to get a feel for the product
  – Take a second pass to focus on specific features
• Debriefing session in which experts work together to prioritize problems (a findings log is sketched below)
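
A minimal sketch of how the separate evaluation passes and the debriefing might be supported in practice: each expert logs findings individually, and the debriefing merges duplicates and ranks them. The field names and the 0-4 severity scale are assumptions for illustration, not something prescribed on the slides.

# Illustrative sketch: individual findings logged during the evaluation passes,
# then merged and ranked at the debriefing. The 0-4 severity scale and the
# field names are assumptions for illustration only.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Finding:
    evaluator: str    # who reported the problem
    heuristic: str    # which heuristic it violates, e.g. "Visibility of system status"
    location: str     # where in the UI it was observed
    description: str  # what the problem is
    severity: int     # 0 = not a problem ... 4 = usability catastrophe

def debrief(findings: list[Finding]) -> list[tuple[str, float, int]]:
    """Merge duplicate reports by (heuristic, location) and rank by mean severity."""
    groups: dict[tuple[str, str], list[Finding]] = defaultdict(list)
    for f in findings:
        groups[(f.heuristic, f.location)].append(f)
    ranked = [
        (f"{heuristic} @ {location}",
         sum(f.severity for f in fs) / len(fs),  # mean severity across evaluators
         len(fs))                                # how many evaluators reported it
        for (heuristic, location), fs in groups.items()
    ]
    return sorted(ranked, key=lambda r: r[1], reverse=True)

Ranking by mean severity, and noting how many evaluators reported each problem, gives the debriefing a concrete agenda and helps separate important problems from trivial ones.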

Page 11:

Advantages and problems

• Few ethical & practical issues to consider
• Can be difficult & expensive to find experts
• Best experts have knowledge of application domain & users
• Biggest problems:
  – important problems may get missed
  – many trivial problems are often identified
• One study has shown:
  – For each true problem, 1.2 false alarms and 0.6 missed problems (worked out below)
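
One hedged way to read those ratios (this interpretation is an assumption, not stated on the slide): taking the true problems found as the baseline, the ratios translate into rough precision and recall figures.

# Illustrative arithmetic only, assuming the ratios are per true problem found.
false_alarms_per_hit = 1.2   # false alarms reported for each true problem found
misses_per_hit = 0.6         # real problems missed for each true problem found

precision = 1 / (1 + false_alarms_per_hit)   # share of reported problems that are real
recall = 1 / (1 + misses_per_hit)            # share of real problems that get reported

print(f"precision ~ {precision:.0%}, recall ~ {recall:.0%}")
# -> roughly 45% of reported problems are real, and roughly 62% of real problems are found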

Page 12:

Evaluating Websites

• Heuristics like Nielsen’s “general list” are less applicable for websites
  – Nielsen’s HOMERUN
• Important: Heuristics can become more useful when they become guidelines for development
• Analysis of outcomes might lead to particular suggestions, e.g., for one set of published results:
  – (1) layout of pages, (2) arrangement of topics, (3) depth of navigation

Page 13:

Preece’s Web Heuristics/Guidelines

• Navigation
  – (1) avoid long pages with wasted space, (2) navigation support like a site map, (3) good menu structures, (4) standard link colors, (5) consistent look and feel
• Access
  – (1) avoid complex URLs, (2) avoid long download times
• Information Design (both content and comprehension)
  – (1) avoid outdated or incomplete info, (2) good graphical design, (3) good use of color, (4) avoid gratuitous graphics and animation, (5) consistency

Page 14:

Topic: Heuristics for…

• What is a good set of heuristics for a cellphone’s UI?
• Status
  – status of call should be visible (call, connection, roaming, battery)
  – mode (vibrate, etc.)
  – unread text messages, voice mails
• Navigation
  – one-button access to phonebook numbers
  – consistent navigation button for back, etc.
• Error prevention
  – prevent accidental button presses in pocket, backpack, purse
• Efficiency

Page 15:

Topic: Heuristics for…

• What is a good set of heuristics for a cellphone’s UI?
• Status

• Navigation

• Error prevention

• Efficiency

Page 16:

Overview: Walkthroughs

• Like heuristic evaluation because:
  – Experts are involved
  – Criteria are used to evaluate things
• Different because:
  – Defining characteristic: they “walk through” one or more tasks
  – In addition to experts, may involve designers and/or users

Page 17:

Cognitive walkthroughs

• Focus on ease of learning
• Designer presents an aspect of the design & usage scenarios
• One or more experts walk through the design prototype with the scenario
• The experts are told the assumptions about user population, context of use, task details
• Experts are guided by 3 questions (on next slide)
• Disadvantages? Time-consuming, laborious, narrow focus (maybe that’s OK)

Page 18:

The 3 questions

• Will the correct action be sufficiently evident to the user?
• Will the user notice that the correct action is available?
• Will the user associate and interpret the response from the action correctly?

As the experts work through the scenario they note problems (a per-step record is sketched below).
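
A minimal sketch of a per-step walkthrough record built around these three questions; the field names, and the idea of attaching a note to every “no” answer, are illustrative assumptions rather than a standard form.

# Illustrative sketch: one record per step of the task scenario, answering the
# three walkthrough questions. Field names are assumptions, not a standard form.
from dataclasses import dataclass, field

@dataclass
class StepRecord:
    step: str                   # e.g. "Select 'Add contact' from the menu"
    action_evident: bool        # Will the correct action be sufficiently evident to the user?
    action_noticed: bool        # Will the user notice that the correct action is available?
    feedback_interpreted: bool  # Will the user associate and interpret the response correctly?
    notes: list[str] = field(default_factory=list)  # problems noted for any "no" answer

def problem_steps(records: list[StepRecord]) -> list[StepRecord]:
    """Return the steps where at least one of the three questions was answered 'no'."""
    return [r for r in records
            if not (r.action_evident and r.action_noticed and r.feedback_interpreted)]

Collecting the records this way makes it easy to see, afterwards, which steps of the scenario break down on which of the three questions.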

Page 19:

Pluralistic walkthrough

• Variation on the cognitive walkthrough theme
• Performed by a carefully managed team
  – Experts, users, and developers
• The panel begins by working separately
  – Goes through a task scenario, using screens from a prototype (perhaps)

• Then there is managed discussion that leads to agreed decisions

• The approach lends itself well to participatory design

• Disadvantages: a larger group to schedule, and only a few scenarios can be covered

Page 20:

Key points

• Expert evaluation: heuristics & walkthroughs
  – Relatively inexpensive because no users are involved
  – Heuristic evaluation is relatively easy to learn
  – May miss key problems & identify false ones
  – Walkthroughs: more task-focused, more time and cost

Page 21:

Heuristic Categories from ID-Book

• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recover from errors
• Help and documentation
• Navigation
• Structure of information
• Physical constraints
• Extraordinary users

Page 22:

Class Exercise: Choose One

• (A) Heuristic evaluation for a cell phone
  – Define a few criteria for cell phones
    • Use the categories but make them phone-specific
      – E.g., visibility of status; or error prevention; or recognition vs. recall
    • Produce your own list of important criteria
  – Be experts and review the UI for issues
    • List issues for one (or more) phones

• (B) Cognitive walkthrough for one or more department web pages
  – Identify tasks
    • Find requirements for major electives for that major
    • Find rules for major(s) in that department
    • Find faculty who are working in a certain research area
    • etc.
  – Walk through these tasks, identifying issues based on particular criteria (ease of learning)

• (B1) Same, but do a shopping site, etc.

Page 23:

End, Spring 2008