Transcript
Page 1: 090511 Appleby Magna Overview Presentation


User testing and evaluation: why, how and when to do it

Evaluating and user testing… Appleby Magna Centre

11 May 2009

Martin Bazley, Martin Bazley & Associates

www.martinbazley.com

Page 2: 090511 Appleby Magna Overview Presentation


Intro: Martin Bazley

• Consultancy / websites / training / user testing – ICT4Learning.com (10+ yrs)

• Chair of E-Learning Group for Museums

Previously:

• E-Learning Officer, MLA South East (3 yrs)

• Science Museum, London, Internet Projects (7 yrs)

• Taught science in secondary schools (8 yrs)

Page 3: 090511 Appleby Magna Overview Presentation


Why evaluate websites?

Why do evaluation and user testing? Isn’t it really expensive and time-consuming?

1. Save money – avoid substantial, hurried redevelopment later in the project

2. Audience feedback improves the resource in various ways – new activity ideas, etc.

3. Demonstrate involvement of key stakeholders throughout the project

Page 4: 090511 Appleby Magna Overview Presentation


Making websites effective

3 key success factors

• Understanding the audience

• Learning experience and learning outcomes – right for the audience and clearly stated

• Evaluation – especially in the classroom or home (observe in the ‘natural habitat’ wherever possible…)

Page 5: 090511 Appleby Magna Overview Presentation


Who for, what for…

• Who for? (audience)
– Need to be clear from the start, e.g. ‘for teachers of yr 5/6 in the local area with whiteboards’

• What ‘real-world’ outcomes? (learning outcomes)
– What will they learn or do as a result? e.g. plan a visit to a museum, learn that Romans wore funny clothes, discover that they enjoy using a digital camera…

• How will they use it? (learning experiences)
– What do they actually do with the site? e.g. work online or need to print it? In pairs or alone? With or without teacher help?

• Where, when and why will they use it?
– Context is important

Pages 6–14: (no text content)
Page 15: 090511 Appleby Magna Overview Presentation


Website evaluation and testing

Need to think ahead a bit:

– what are you trying to find out?

– how do you intend to test it?

– why? what will you do as a result?

The ‘Why?’ should drive this process

Page 16: 090511 Appleby Magna Overview Presentation


Test early

Testing one user early on in the project…

…is better than testing 50 near the end

Page 17: 090511 Appleby Magna Overview Presentation


When to evaluate or test and why

• Before funding approval – project planning

• Post-funding – project development

• Post-project – summative evaluation

Page 18: 090511 Appleby Magna Overview Presentation


Testing is an iterative process

Testing isn’t something you do once

Make something => test it => refine it => test it again…

Page 19: 090511 Appleby Magna Overview Presentation


Before funding – project planning

• *Evaluation of other websites
– Who for? What for? How will they use it? etc.
– awareness raising: issues, opportunities
– contributes to market research
– possible elements, graphic feel, etc.

• *Concept testing – check the idea makes sense with the audience
– reshape the project based on user feedback

Methods: focus group; research

Page 20: (no text content)

Page 21: 090511 Appleby Magna Overview Presentation


Post-funding – project development

• *Concept testing
– refine project outcomes based on feedback from intended users

• Refine website structure
– does it work for users?

• *Evaluate initial look and feel – graphics, navigation, etc.

Methods: focus groups; one-to-one tasks

Pages 22–25: (no text content)

Page 26: 090511 Appleby Magna Overview Presentation


Post-funding – project development 2

• *Full evaluation of a draft working version – usability AND content: do the activities work, how engaging is it, what else could be offered, etc.

Observation of actual use of the website, by intended users, using it for the intended purpose, in the intended context – classroom, workplace, library, home, etc.

Pages 27–31: (no text content)

Page 32: 090511 Appleby Magna Overview Presentation


• Video clip: Moving Here – key ideas, not lesson plans

Pages 33–36: (no text content)

Page 37: 090511 Appleby Magna Overview Presentation


Post-funding – project development 3

• Acceptance testing of ‘finished’ website
– last-minute check, minor corrections only
– often offered by web developers

• Summative evaluation
– report for funders, etc.
– learn lessons at project level for next time

Page 38: 090511 Appleby Magna Overview Presentation


Two usability testing techniques

“Get it” testing – do they understand the purpose, how it works, etc.?

Key task testing – ask the user to do something (e.g. find the museum’s opening times), then watch how well they do.

Ideally, do a bit of each, in that order.

Page 39: (no text content)

Page 40: 090511 Appleby Magna Overview Presentation


User testing – who should do it?

• The worst person to conduct (or interpret) user testing of your own site is… you!

• Beware of hearing what you want to hear…

• Useful to have an external viewpoint

• The first 5 minutes in a genuine setting tells you 80% of what’s wrong with the site

• etc.

Page 41: 090511 Appleby Magna Overview Presentation


User testing – more info

User testing can be done cheaply – tips on how to do it are available (MLA SE guide): www.ICT4Learning.com/onlineguide

Page 42: 090511 Appleby Magna Overview Presentation

Strengths and weaknesses of different data gathering techniques

Page 43: 090511 Appleby Magna Overview Presentation

Data gathering techniques

• User testing – early in development, and again near the end

• Online questionnaires – emailed to people or linked from the website

• Focus groups – best near the beginning of a project, or at redevelopment stage

• Visitor surveys – link online and real visits

• Web stats – useful for long-term trends, events, etc.

Page 44: 090511 Appleby Magna Overview Presentation

Need to distinguish between:

Diagnostics – making a project or service better

Reporting – to funders, or for advocacy

Page 45: 090511 Appleby Magna Overview Presentation

Online questionnaires

(+) once set up, they gather numerical and qualitative data with no further effort – given time, can build up large datasets

(+) the datasets can be easily exported and manipulated, can be sampled at various times, and structured queries can yield useful results

(–) respondents are self-selected and this will skew results – best to compare with similar data from other sources, like visitor surveys

(–) the number and nature of responses may depend on how the online questionnaire is displayed and promoted on the website
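As a concrete illustration of the export-and-query point, here is a minimal sketch assuming the questionnaire tool can export responses as a CSV file; the file name and the ‘date’, ‘audience’ and ‘rating’ columns are illustrative assumptions, not part of this presentation:

```python
# Minimal sketch: structured queries over an exported questionnaire dataset.
# The file name and column names are illustrative assumptions.
import pandas as pd

responses = pd.read_csv("questionnaire_export.csv", parse_dates=["date"])

# Structured query: response count and average rating per audience type
print(responses.groupby("audience")["rating"].agg(["count", "mean"]))

# Sample a fixed period, e.g. to set against a spring visitor survey
spring = responses[(responses["date"] >= "2009-03-01") &
                   (responses["date"] < "2009-06-01")]
print(len(spring), "responses in the spring sample")
```

The same exported dataset can be re-queried later as new questions arise, with no further collection effort.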

Page 46: 090511 Appleby Magna Overview Presentation

Focus groups

(+) can explore specific issues in more depth, yielding rich feedback

(+) possible to control participant composition to ensure a representative sample

(–) comparatively time-consuming (expensive) to organise and analyse

(–) yield qualitative data only – small numbers mean numerical comparisons are unreliable

Page 47: 090511 Appleby Magna Overview Presentation

Visitor surveys

(+) possible to control participant composition to ensure a representative sample

(–) comparatively time-consuming (expensive) to organise and analyse

(–) responses can be affected by various factors, including the interviewer, the weather on the day, the day of the week, etc., reducing the validity of numerical comparisons between museums

Page 48: 090511 Appleby Magna Overview Presentation

Web stats

(+) Easy to gather data – can decide what to do with it later

(+) Person-independent data generated – it is the interpretation, rather than the data themselves, which is subjective. This means others can review the same data and verify or amend the initial conclusions reached

Page 49: 090511 Appleby Magna Overview Presentation

Web stats

(–) Different systems generate different data for the same web activity – for example, the number of unique visits measured via Google Analytics is generally lower than that derived via server log files

(–) Metrics are complicated and require specialist knowledge to appreciate them fully
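To make the divergence concrete: a log file records every request, including search-engine crawlers and browsers with JavaScript disabled, which a JavaScript-based tracker such as Google Analytics never sees. A minimal sketch of deriving unique visitors from a combined-format access log; the file name and bot keyword list are illustrative assumptions:

```python
# Minimal sketch: unique visitors from a combined-format server access log.
# Raw log figures usually exceed JavaScript-tracker figures because logs
# also record crawlers and script-less clients.
import re

BOT_HINTS = ("bot", "crawler", "spider", "slurp")  # illustrative list

# combined log format: IP ident user [date] "request" status bytes "referer" "agent"
LINE = re.compile(r'(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

all_ips, human_ips = set(), set()

with open("access.log") as log:  # assumed file name
    for line in log:
        match = LINE.match(line)
        if not match:
            continue
        ip, agent = match.group(1), match.group(2).lower()
        all_ips.add(ip)
        if not any(hint in agent for hint in BOT_HINTS):
            human_ips.add(ip)

print("Unique IPs, raw:", len(all_ips))
print("Unique IPs, bots removed:", len(human_ips))
```

Counting unique IP addresses is itself only an approximation of unique visitors (shared and dynamic addresses), another reason the two systems disagree.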

Page 50: 090511 Appleby Magna Overview Presentation

Web stats

(–) As the amount of off-website web activity increases (e.g. Web 2.0-style interactions), the validity of website stats decreases, especially for reporting purposes, but also for diagnostics

(–) Agreeing a common format for presentation of data and analysis requires collaborative working to be meaningful
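As one illustration of what a common format might involve, a sketch that maps figures from two different systems onto a single agreed CSV schema; every field name and number here is invented for the example, and agreeing the definitions (e.g. what counts as a ‘page view’) is exactly the collaborative part:

```python
# Minimal sketch: normalising stats from two systems into one agreed schema.
# All field names and figures are invented for illustration.
import csv

google_analytics = {"uniqueVisitors": 1200, "pageviews": 5400}
log_analyser = {"unique_ips": 1850, "page_requests": 9100}

common_rows = [
    {"source": "Google Analytics",
     "unique_visitors": google_analytics["uniqueVisitors"],
     "page_views": google_analytics["pageviews"]},
    {"source": "Server logs",
     "unique_visitors": log_analyser["unique_ips"],
     "page_views": log_analyser["page_requests"]},
]

with open("monthly_stats.csv", "w", newline="") as out:
    writer = csv.DictWriter(
        out, fieldnames=["source", "unique_visitors", "page_views"])
    writer.writeheader()
    writer.writerows(common_rows)
```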

Page 51: 090511 Appleby Magna Overview Presentation


Who for, what for… (recap of Page 5)

Page 52: 090511 Appleby Magna Overview Presentation


Who for, what for…

• Who is it for?
• What are the real-world outcomes?
• How will they use it?
• Also: when, where, why?

• How can you ensure you get these right?
– Build questions into the planning process
– Evaluate/test regularly
– Get informal feedback whenever possible – and act on it

Page 53: 090511 Appleby Magna Overview Presentation


More information

Martin Bazley
0780 3580 737
www.martinbazley.com

