Usability Service Enhancements to Digimap (USeD)
DESCRIPTION
Presentation delivered by Addy Pope at IS Usability Workshops, 1 November 2012, University of Edinburgh.
TRANSCRIPT
Usability Service Enhancements to Digimap
JISC Research infrastructure programme - Usability/Learnability
Aim
A short project that aims to improve the usability and learnability of a widely used research tool providing access to valuable geospatial research data.
http://www.flickr.com/photos/ogimogi/2223450729/sizes/l/in/photostream/
What do we know
• The Workflow is based on a single linear sequencing of interaction.
• It presumes a significant familiarity with Ordnance Survey products.
• It doesn’t allow multi-product extractions.
• User preferences cannot be saved, nor do they persist between downloads.
• Focused on accommodating data
The existing workflow:
1. Define area
2. Pick data tiles
3. Define delivery format
Stats (based on Jan 2010 – Jan 2011):
• 48,000+ Digimap users
• 74,000 data requests
• 1,000,000 data tiles served
Usability-focused Methodology
Geoforum 2012
[Project timeline diagram, months 0–6: Start → Personas → Tech/User Recommendations → User Testing → Prototype Review → Revise → Implement → Deploy]
Tech Requirements
• Must mesh with existing database
• Must work in IE/Chrome/Safari/Firefox
• Must build on existing skills/knowledge base
• Must be future-proof
• Solution must conform to OGC standards
Personas
http://used.blogs.edina.ac.uk/files/2011/09/USeD_Persona.pdf
User Requirements
Based on this we know that users generally:
• know where they want data for
• may not know what data best suits their needs
• want more than one dataset for their area of interest
• know roughly what they need for a specific area

In addition we know that some users:
• download multiple datasets to work out which suits their needs
• find the tile-based selection process frustrating
• find the data download limits confusing and frustrating
• get annoyed by repetition when ordering data
• want to download a lot of detailed data
Requirements were prioritised as:
• a user must be able to
• a user should be able to
• a user could be able to
Version 1
Task-based Testing:
• 1 day
• 5–6 people
• 45 mins per user
• 6 tasks for users to complete, designed to allow them to explore the interface
• 15 min de-brief between users
Not how we will be conducting the interviews..... we had tea and coffee!(http://www.flickr.com/photos/portlandcenterstage/1485915421/)
Usability Lab
http://used.blogs.edina.ac.uk/2012/01/27/usability-lab-on-a-shoe-string-budget/
Usability need not cost the earth: we set up a usability lab for less than £50.
Presenting your findings
Seems obvious, but presenting your findings clearly is vital
1. Categorise your findings
2. Provide examples of alternative functionality
3. Don’t forget to mention the positive aspects
So what did we find out?
This particular issue prevented a couple of testers from completing tasks, so they did not get the data they wanted. This is a serious issue, as the design has essentially failed.
If there is an issue with the interface, suggest an alternative. Provide a visual mock-up if you can.
Not all changes work….
If it caused confusion, change it and see if your solution is better. Solutions aren’t always better, but with iterative testing you shouldn’t be afraid to experiment.
Changing the basket worked
Removing the pan/zoom didn’t
Version 2
Version 3
Version 4
Final
Lessons Learned
• Save time - find out what is wrong with an interface early rather than when it is released.
• Buy skills – no shame in getting an expert in to help. Great if you can use them to help up-skill staff and ensure best practice is followed.
• Fresh eyes – putting aside all baggage is difficult, an external partner (could still be within the University) helps strip away legacy terms and processes.
• Low tech - usability labs need not be expensive
• Documentation - effective documentation makes buy-in from stakeholders much easier
Lessons Learned
• Personas work - even with contrived names such as Explorer Evie or Work-around Walter. These make it easier to discuss issues and problems with the project team and relate them back to a “real” user.
• The obvious is not obvious until it is pointed out - fresh eyes, no a priori knowledge and no hang-ups on terms/process
• Salvage something – even when a test is going horribly you can get something out of it.
• Don’t make extra work for yourself - 5-6 users to test an interface is fine; by the 4th person you are uncovering very little in the way of new issues.
• Go beyond the interface - users may be using your service for something other than its primary purpose. This may be because they don’t know there is another service that would be better suited, or because your service is the best thing out there that almost does what they want.
• It’s in your head - write up user tests immediately, you cannot write down everything at the time
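The "5–6 users is fine" observation above matches a well-known estimate from Nielsen and Landauer, who modelled the proportion of usability problems found by n testers as 1 − (1 − L)^n, with L ≈ 0.31 (the average probability that a single tester hits any given problem). A back-of-the-envelope sketch of that model, not part of the USeD project itself:

```python
# Nielsen-Landauer estimate of usability problems found by n testers.
# L is the average chance a single tester uncovers any given problem;
# 0.31 is the figure reported in Nielsen & Landauer's 1993 study.
L = 0.31

def problems_found(n, L=L):
    """Estimated proportion of usability problems found by n testers."""
    return 1 - (1 - L) ** n

for n in range(1, 7):
    print(f"{n} testers: {problems_found(n):.0%}")
```

Under this model five testers already surface roughly 84% of the problems, and each extra tester adds less than the one before, which is why little new emerges after the fourth person.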
Success and how to measure it
In the broadest terms, we probably wanted:
• a more usable interface for Digimap Data Downloader
• to develop usability skills in-house
• to promote the use of usability as a tool for effective service development
What did we set out to achieve?
Improve service interface
A more usable interface for Digimap Data Downloader
Well I think we scored a big tick on this one, but how would you measure it?
Testing showed that users were getting through the process of ordering the data that they needed. So the interface is usable. Fewer users struggled using the later iterations.
“Data Download Beta beats the old version hands down as far as I‘m concerned. The rapidity with which you can select a map extent and download all of the relevant mapping data in one go is by far much better than the slow and more manual way things used to work. Top notch stuff.” Lecturer – Northumbria University.
Develop in-house usability skills
To develop usability skills in EDINA
• We employed a Usability consultant and they provided input on best practice throughout the project.
• EDINA staff ran the usability tests.
Usability is only part of the process. You still need:
• to have clear aims
• to understand your users’ needs
• to have a strong design that is clear and consistent
Promote the use of usability
To promote the use of usability as a tool for effective service delivery
We have implemented small usability studies for other projects such as:
• the Digimap Home page (it was clear from USeD that some users were using the wrong tools in Digimap)
• FieldMap GB - essentially a mobile mapping/data collection app. Mobile devices are more complex:
  • multiple operating systems
  • multiple screen sizes
  • data connection
  • screen real-estate
• Personas have been developed for other projects to help stakeholders appreciate the variety of users there are (LOCKSS Alliance)
Conclusions
Saves time and money, and results in a better service.
Only works if it is an integral part of service design and service transition, not an afterthought.
http://www.addletters.com/pictures/bart-simpson-generator/3207130.htm
Find out more USeD Blog – http://used.blogs.edina.ac.uk
What did we learn?
• Personas work - they help demonstrate who the interface is for and how it will be used
• The blindingly obvious is not always obvious until it is pointed out
• How you present your UI testing report after testing is very important
• Usability labs need not be expensive