HCI Usability Evaluation Portfolio Presentation

Team 1: Matthew Lewis, Lee Richardson, Gareth Gerrard, Daniel Ashmore, Rachael Stephenson

Upload: conan-brewer

Post on 30-Dec-2015


DESCRIPTION

HCI Usability Evaluation Portfolio Presentation. Team 1: Matthew Lewis, Lee Richardson, Gareth Gerrard, Daniel Ashmore, Rachael Stephenson. Context: as a team we chose to evaluate Jet2 and Ryanair, with justification of choice. PowerPoint PPT presentation.

TRANSCRIPT

Page 1: HCI Usability Evaluation Portfolio Presentation

HCI Usability Evaluation Portfolio Presentation

Team 1: Matthew Lewis, Lee Richardson, Gareth Gerrard, Daniel Ashmore, Rachael Stephenson

Page 2: HCI Usability Evaluation Portfolio Presentation

Context

As a team we chose to evaluate Jet2 and Ryanair.

Justification of Choice

A wide range of user experience will give us a variation of perspective for evaluation. The sites have the same purpose, but serve a wide audience. Giraffe Forum (2007) states that one site is dramatically superior in regards to the user interface design.

Dix et al. (2004) say that evaluation has the goal of assessing the accessibility of a system. According to Webcredible (2005), 14% of the UK population were registered disabled, while Cowen (2010) states that half the population take at least one flight each year. These sites would therefore allow us to assess accessibility thoroughly.

Page 3: HCI Usability Evaluation Portfolio Presentation

Context

Evaluation Criteria

We chose to form our own evaluation criteria, based on those taken from existing HCI principles and heuristics, as we felt this was more applicable to our choice of websites and would allow for a large scope of evaluation:

◦ Consistency/Model (Sutcliffe, 1995; Norman, 2004).
◦ User control (Sutcliffe, 1995).
◦ Error prevention (Nielsen, 1994).
◦ Aesthetic and minimalist design (Nielsen, 1994).
◦ Match between system and the real world (Nielsen, 1994).
◦ Design dialogs to yield closure (Shneiderman, 1998).
◦ Universal Design (Connell et al., 1997).

Page 4: HCI Usability Evaluation Portfolio Presentation

Context

Evaluation Methods

Eye Tracking
Questionnaires
Human Observation
Heuristic Evaluation

Page 5: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Eye Tracking

Why eye tracking is appropriate for our project:

It tests interface usability.

It allows us to understand how information is processed by a user when looking at visual information.

It provides us with evidence to undertake our evaluation.

Because our tasks were mainly search-based, eye tracking was able to provide us with very accurate data.

Page 6: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

The Three Tasks

1. Find the telephone number of the head office / customer service desk.

2. Locate the site map / A-Z index of website content.

3. Continue the booking process up until the website requests your personal details, to find flights from Leeds/Bradford airport to Malaga airport on 2nd January 2011, and a returning flight on 8th January 2011, for a 40-year-old couple with a 15-year-old daughter.

Page 7: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Reasons for choosing the tasks

The third task was cognitive, as it tested a potential usability failure of Jet2: the site stipulates that an adult is classed as a person over 12 (in small grey writing), whereas Ryanair classes an adult as a person over 16.

We thought this would be a good usability check and a possible test of the 'Error prevention' evaluation criterion: booking a flight for someone who is classed as an adult on one site (Jet2) and a child on the other (Ryanair).
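The age-threshold mismatch the third task was built around can be sketched as a small function. The thresholds come from the slides; the function itself is purely illustrative and not either airline's actual code.

```javascript
// Illustrative sketch of the adult/child classification mismatch.
// Per the slides: Jet2 classes anyone over 12 as an adult,
// Ryanair anyone over 16. Names here are our own invention.
function passengerType(age, airline) {
  const adultOver = { jet2: 12, ryanair: 16 };
  return age > adultOver[airline] ? 'adult' : 'child';
}

// The 15-year-old daughter from task 3 is classed differently on each site:
console.log(passengerType(15, 'jet2'));    // 'adult'
console.log(passengerType(15, 'ryanair')); // 'child'
```

This is exactly the inconsistency that makes the task a useful probe of the 'Error prevention' criterion.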

Page 8: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

What did the eye tracking data show?

Participant 1 took much longer to complete task 3 on Jet2 than Participant 2 did, while on Ryanair both achieved roughly the same time. This could suggest that Ryanair is the more usable of the two; however, human observation during the eye tracking showed that a faulty return of results for Participant 1 on Jet2 meant they had to complete the task twice.

Both websites (for tasks 1 and 2) also adhered to Zeldman's (2001) three-click rule, which suggests that all information should be accessible within three clicks or the user will become frustrated and leave.

However, Porter (2003) says 'The number of clicks isn't what is important to users, but whether or not they're successful at finding what they're seeking.'

Page 9: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Questionnaire

In creating the questionnaire, we wanted to obtain both qualitative and quantitative data, so we incorporated different types of questions, ranging from open and closed questions to Likert scales.

We asked 'Rate on a scale of 1-10 how hard you found it to find the flight in task 3 using each website'. A bar graph below shows the results.

Analysis

It would seem that there is no clear correlation between ease of use on each of the sites based on these results. It may help to gain more participants for the study, as this could allow a correlation to emerge. Niles (2006) states that the larger the sample size, the smaller the margin of error will be, therefore offering more reliable results. However, Nielsen (2004) proposes that after testing fifteen participants diminishing returns set in and correlations increase very little; he therefore concludes that using 15 participants would return a useful set of results.
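Niles' point about sample size can be illustrated with the standard margin-of-error approximation for a sampled proportion. This is a textbook formula applied to hypothetical sample sizes, not a calculation from the study's own data.

```javascript
// 95% margin of error for a proportion: z * sqrt(p(1-p)/n),
// using the worst case p = 0.5 and z = 1.96 for 95% confidence.
function marginOfError(n, p = 0.5, z = 1.96) {
  return z * Math.sqrt((p * (1 - p)) / n);
}

console.log(marginOfError(2).toFixed(2));  // 0.69 with two participants
console.log(marginOfError(15).toFixed(2)); // 0.25 with Nielsen's fifteen
```

With only two participants the margin of error dwarfs any plausible effect, which is consistent with the lack of a clear correlation above.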

Page 10: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Questionnaire

We also asked 'Please tick the appropriate emotion(s) regarding how you felt while using each website', and produced pie charts based on the results.

Analysis

The results suggest that Ryanair is the more usable of the two websites, as Participant 1 found Jet2 'frustrating' while finding Ryanair 'comfortable' and 'happy' in use. However, this may be a result of the participant receiving a faulty data set on the third task, which may have influenced his feelings.

We must still take this into consideration, though, and state that Jet2 violated our evaluation criteria of 'User control' (Sutcliffe, 1995) and 'Error prevention' (Nielsen, 1994).

Page 11: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Appropriateness of Questionnaires

Advantages

Gain direct feedback from the user.
Trustworthy sample from the whole user population.
Gathered in a standardised way, which suggests they are more objective.

Disadvantages

Only tell you the user's reaction as they perceive the situation.
They occur after the event, so participants may forget things.
Participants may answer superficially.

Page 12: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Heuristic Evaluation

A discount method:
◦ Quick
◦ Cheap
◦ Easy

Cost/benefit vs. "deluxe" methods.

Page 13: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Heuristic Evaluation: Score Consistency

Page 14: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Heuristic Evaluation: Mean Score Per Section (chart covering: visibility of system status; user control and freedom; consistency and standards; recognition rather than recall; aesthetic and minimalist design; help, documentation, recovery from errors; language)

Page 15: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Heuristic Evaluation: usability problems encountered
◦ Jet2: current location

Page 16: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Heuristic Evaluation: usability problems encountered
◦ Ryanair: consistency

Page 17: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

We used four observers during the evaluation:

One looked out for irregular facial expressions that might indicate what the participant felt at a specific time.

Another monitored the screen, looking out for unexpected decisions that the participant may have perceived to be relevant to the task.

The other two looked out for the participant's body language, which again might indicate what they were feeling during the evaluation.

Page 18: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

We also used recording tools:

The discreet webcam set into the monitor was a useful tool for video recording the participants' facial expressions.

Simultaneous audio recording was useful for reviewing what was said by the participant in response to the system, i.e. what they were feeling, which may be reflected in the video.

Page 19: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

The Hawthorne effect:

"A form of reactivity whereby subjects improve an aspect of their behaviour being experimentally measured simply in response to the fact that they are being studied, not in response to any particular experimental manipulation" (Miller, 2010).

This may affect our results slightly, as the participant may 'improve their performance' relative to normal conditions. We made the participant feel as comfortable as possible to reduce the Hawthorne effect.

Page 20: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Participant A expects to see the telephone number on the 'contact' page, but is confused when he cannot find it (fig. 1: confusion demonstrated). Supported by Hagen (2008): "raised eyebrows can also indicate confusion…"

Task 1 - Jet2:

In comparison, both participants were able to complete the task extremely quickly and seemingly comfortably on the Jet2 website.

Task 1 - Ryanair:

Participant B shows signs of concentration or concern (fig. 2: lip biting), supported by Miles (2003, p. 3): "humans bite their lip during times of concentration or concern", neither of which should be present in excess for a user performing a relatively simple task.

Page 21: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

Task 2 - Ryanair:

The home page took 14 seconds to load for Participant A, risking users leaving the site under normal conditions. Supported by McGrath (2006, p. 41): "…web users will wait about 6 seconds for a web page to load. Beyond 6 seconds, it is likely the user will leave the web site".

The sitemap opens in a new window. This was not expected by Participant A, who stated at the time, "oh… it opens a new window". Fig. 3 (surprise) also shows signs of surprise, supported by Huron (2006, p. 26): "people will often exhibit the characteristic 'surprise' face with the gaping mouth and wide-open eyes".

Participant B clicked on a link labelled 'route map' rather than 'site map', which is understandable due to its position and terminology. The frowning on his face (fig. 4: frowning) shows how confused he is when the page for 'route map' loads.

Page 22: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

Task 2 - Jet2:

Again the home page took 14 seconds to load for Participant A, risking users leaving the site under normal conditions.

The task was quick to complete: both participants identified the link in the footer.

Page 23: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

Task 3 - Ryanair:

Text boxes highlighting the selected flight data provide error prevention.

No flight was pre-selected due to availability, and Participant B seemed unaware of how to select a relevant flight. He clicked on 'select flight', which looks like a button but is in fact a request.

Neither participant had any trouble identifying which category the passengers would be classed in on the Ryanair website; both successfully identified two adults and one child.

Page 24: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Human Observation

Task 3 - Jet2:

Participant A seemed surprised that the 15-year-old who would be classed as a child on the Ryanair website is classed as an adult on the Jet2 website. Good error prevention, as he realised before submitting any data.

An error occurred during the flight search; the information box was crammed with other irrelevant information and failed to provide accurate and specific feedback to the user, who found it difficult to recover.

Participant B seemed relaxed due to the ease and speed of finding the relevant flight.

Page 25: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Evaluation of the X-Stream discussion tool

Based on our current use of the X-Stream discussion tool, our team discussed how it can help or hinder communication as a group. These were our findings.

Positives

• The ability to show and hide content, so that users can choose what they want to see. "…users do not read on the Web; instead they scan the pages, trying to pick out a few sentences or even parts of sentences to get the information they want" (Nielsen, 1997).

• No distracting advertisements; this helps user experience as the user's attention isn't being diverted from its original purpose. "…users detest anything that seems like marketing fluff or overly hyped language ('marketese') and prefer factual information" (Nielsen, 1997).

Page 26: HCI Usability Evaluation Portfolio Presentation

Evaluation & Critical Analysis

Evaluation of the X-Stream discussion tool

Negatives

• One of the main problems we encountered was the page constantly refreshing whenever you do anything. For example, if you click the arrow that drops down the list of all messages within one topic, X-Stream refreshes the whole page and sends you back to the top, which adds to a frustrating user experience. It could be made much better by using AJAX and refreshing in-page.

• The page can also refresh whilst you are scrolling down, and all of a sudden you find yourself at the top again.

• Most of the links are JavaScript and don't give an indication of what the link is going to do without looking at the status bar. Some links open a new window without the user knowing. This "undermine[s] users' understanding of their own system" (Nielsen, 2002).

It should also be noted that this was a top-ten design mistake of 2002, meaning the system failed to take into consideration a significant problem noted eight years earlier.
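The in-page refresh idea mentioned above can be sketched as follows. The fetch function is injected so the sketch can run without a real server; the endpoint path and container are illustrative assumptions, not X-Stream's actual API.

```javascript
// Sketch: update only the thread container rather than reloading the whole
// page, so scroll position and the rest of the page are left untouched.
// fetchFn would be something like window.fetch in a browser; here it is a
// parameter so the idea can be demonstrated with a stub. The URL scheme is
// a made-up example.
async function refreshThread(threadId, fetchFn, container) {
  const html = await fetchFn(`/discussion/${threadId}/messages`);
  container.innerHTML = html; // only this element changes
}

// Demonstration with a stubbed fetch and a plain object standing in for a DOM node:
const stubFetch = async (url) => `<ul><li>loaded from ${url}</li></ul>`;
const threadNode = { innerHTML: '' };
refreshThread(42, stubFetch, threadNode)
  .then(() => console.log(threadNode.innerHTML));
```

Fetching and swapping a single fragment this way avoids the full-page reload that sends X-Stream users back to the top of the page.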

Page 27: HCI Usability Evaluation Portfolio Presentation

Conclusion, Further Work & References

Conclusion

The two websites that we evaluated were www.jet2.com and www.ryanair.com. Two participants completed three tasks each on both websites; they were recorded by the eye tracker, with observers taking notes on what happened.

Afterwards they both filled in a questionnaire on the user experience of these websites and how they felt overall about them.

The questionnaire provided us with both qualitative and quantitative data. This gave us a large scope of feedback, which was helpful when evaluating both websites alongside the statistical data from the eye tracker.

Page 28: HCI Usability Evaluation Portfolio Presentation

Conclusion, Further Work & References

Further Work

If we had the opportunity to take part in the project again, we would look at ways in which we could have done things differently.

What could we have done?

- We could have used different user profiles. In our study both participants were computing students and therefore could have a slight advantage over the average computer user. We could have used one computing student and one average user and compared their eye-tracker results against each other.

- We could have changed the environment. The room we did the experiment in was rather small, and during our session there were about eight people in the room, making it fairly crowded.

Page 29: HCI Usability Evaluation Portfolio Presentation

Conclusion, Further Work & References

Cookies Problem

During the second participant's third task (booking a specific flight), we noticed that the data from the booking form was still filled in from the first participant. This made the completion time of the third task inaccurate, as little time was required to fill in each section. The problem was caused by the cookies from the internet browser. If we were to do this again, we would delete the browsing history between participants.

Page 30: HCI Usability Evaluation Portfolio Presentation

Bibliography

Nielsen, J. (1994) Guerrilla HCI [online]. Available at: <www.useit.com/papers/guerrilla_hci.html> [Accessed 18 November 2010].

Nielsen, J. (2004) Change the Color of Visited Links [online]. Available at: <www.useit.com/alertbox/20040503.html> [Accessed 18 November 2010].

Finkernet Marketing (2005) Basics of Search Engine Marketing [e-book]. Finkernet Marketing. Available at: <http://www.finkernet.com/sem/consistency/> [Accessed 17 November 2010].

Nielsen, J. (1997) Concise, SCANNABLE, and Objective: How to Write for the Web [online]. Available at: <http://www.useit.com/papers/webwriting/writing.html> [Accessed 31 October 2010].

Nielsen, J. (2002) Top Ten Web-Design Mistakes of 2002 [online]. Available at: <http://www.useit.com/alertbox/20021223.html> [Accessed 21 October 2010].

Miller, F., et al. (2010) Hawthorne Effect. VDM Publishing House Ltd.

Hagen, S. (2008) The Everything Body Language Book. USA, Adams Media.

Miles, S. (2003) Don't Take Me to Your Leader. USA, iUniverse.

McGrath, B. (2006) 100 Steps for Improving Your Website and eBusiness. Printed Owl.

Huron, D. (2006) Sweet Anticipation: Music and the Psychology of Expectation. MIT Press.