The Live Usability Lab: Open Access Archives and Digital Repositories

SIG Sponsors: SIG-DL, SIG-SI, SIG-STI, and SIG-USE (All Confirmed)

Organizers: Anita Coleman and Paul Marty

Panel Format: This is a new, interactive lab/panel format; details provided below.

Time Requested: minimum 90 minutes; 120 minutes preferred.

Panelists: 5 total, all confirmed

Usability Evaluators & Panel Leaders

Paul Marty, College of Information, Florida State University (confirmed)

Michael Twidale, Graduate School of Library and Information Science, University of Illinois at Urbana-Champaign (confirmed)

Archives’ Site Representatives

Representing Eprints software will be dLIST; Repository Editor Anita Coleman (confirmed)

Representing DSpace will be IDEALS; Repository Research Programmer Tim Donohue (confirmed)

Representing DSpace will be MINDS @ UW; Repository Librarian Dorothy Salo (confirmed)

Overview

While the past ten years have seen great advances in the willingness of most organizations to concede the value of usability analysis (Dumas, 2002), misconceptions about the value of user testing persist, and consumers still contend daily with poorly designed and unusable interfaces (Shneiderman, 2002). Even today, many need to be convinced of the value of usability analysis for improving information interfaces (Bias & Mayhew, 2005). Usability has emerged as a particularly vexing problem and a formidable barrier to scholarly self-archiving and the growth of open access archives, whether they are institutional or disciplinary digital repositories (Foster and Gibbons, 2005; Coleman, 2005; Salo, 2006; Sale, 2006).

This session proposes a solution to the usability problem in open access archives by using an innovative and interaction-driven usability demonstration method developed and tested over the past five years: the Live Usability Lab (Marty & Twidale, 2005). While this would be its first demonstration at ASIS&T, the Live Usability Lab has been presented at seven different national and international conferences over the past five years, each time evaluating different websites selected by the audience. At these conferences, this method has consistently and successfully a) demonstrated the potential and power of user testing, and b) engaged the audience by illustrating the process with live data instead of canned examples. Since there is very little in the literature about the usability testing of open access archives and digital repositories (which are, after all, another type of website), the Live Usability Lab format will provide an exciting demonstration of the potential of usability analysis for evaluating diverse information interfaces.

Globally, the three leading OAI-compliant, open source software packages of choice for open access archives and digital repositories are Eprints, DSpace, and Fedora. Other available software includes the commercial Digital Commons, with services by Bepress, and the open source CDSware (developed at CERN), which is OAI-compliant and uses MARC21 as its bibliographic base. The Live Usability Lab will focus on the usability analysis of three different repositories running Eprints and DSpace, as these are the most widely used. Potential roles and scenarios to be evaluated for usability in digital repositories include: the scholar/end-user's task of self-archiving, the searcher/seeker's task of information discovery and search, and the digital repository editor/manager's task of reviewing the content and metadata and approving or rejecting a deposit into the archive.
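To make the interoperability point concrete, the short Python sketch below shows how a harvester might request Dublin Core records from an OAI-compliant repository over the standard OAI-PMH protocol. The base URL is a placeholder, not the actual endpoint of dLIST, IDEALS, or MINDS @ UW, and the sketch fetches only the first page of results (it does not follow resumption tokens).

# Minimal OAI-PMH harvesting sketch. Assumptions: the repository exposes a
# standard OAI-PMH endpoint; BASE_URL is hypothetical, not one of the sites
# evaluated in this session.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://repository.example.edu/oai"  # placeholder endpoint

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def list_records(metadata_prefix="oai_dc"):
    """Issue a ListRecords request and yield (identifier, title) pairs."""
    params = urllib.parse.urlencode(
        {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    )
    with urllib.request.urlopen(f"{BASE_URL}?{params}") as response:
        tree = ET.parse(response)
    for record in tree.iter(f"{OAI_NS}record"):
        header = record.find(f"{OAI_NS}header")
        if header is None:
            continue
        identifier = header.findtext(f"{OAI_NS}identifier")
        title = record.findtext(f".//{DC_NS}title")
        yield identifier, title

if __name__ == "__main__":
    for identifier, title in list_records():
        print(identifier, "-", title)

Because the same request works against any OAI-compliant repository, the protocol layer is essentially uniform; the usability differences the Lab will surface lie in the web interfaces built on top of it.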

About The Live Usability Lab Format

During the Live Usability Lab at ASIS&T, the usability of three different open access archives will be evaluated in an exciting and unscripted live-action format, with at least three audience members volunteering to serve as user testers. Each open access archive will be evaluated in a thirty-minute session, which includes time for introductions to the archives by their representatives and time for audience questions and involvement. During this time, the value of user testing will be explained, how user testing works will be demonstrated in practice, and a small amount of user testing data will be analyzed in real time in front of the audience.

Each thirty-minute session involves spending ten minutes assessing an interface and developing representative tasks, ten minutes administering these tasks to representative users, and ten minutes analyzing the results of those tasks to identify usability flaws and recommendations for design (Marty & Twidale, 2005). During each test, the volunteer user testers leave the room while the site representatives describe what they consider a typical scenario of use: something the average user would be trying to do with their site. The evaluators convert these scenarios into specific tasks and ask the user testers to perform those tasks while the evaluators, representatives, and audience members observe. After each test, the user testers, representatives, evaluators, and audience members discuss lessons learned from the usability test.
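As an illustration of the kind of lightweight, real-time analysis performed in the final ten minutes, the Python sketch below tallies task observations into completion rates and average times. The observations, task names, and field layout are invented for the example; they are not drawn from an actual session, which would record such data live as the tasks are performed.

# Quick tally of user-testing observations, of the sort an evaluator might
# compute live during the ten-minute analysis segment. All data are
# hypothetical example values.
from statistics import mean

observations = [
    # (task, completed, seconds taken) -- invented example data
    ("self-archive a preprint", True, 312),
    ("self-archive a preprint", False, 540),
    ("find a paper by author name", True, 95),
    ("find a paper by author name", True, 120),
]

def summarize(observations):
    """Group observations by task and report completion rate and mean time."""
    tasks = {}
    for task, completed, seconds in observations:
        tasks.setdefault(task, []).append((completed, seconds))
    for task, results in tasks.items():
        completion_rate = sum(1 for done, _ in results if done) / len(results)
        avg_time = mean(seconds for _, seconds in results)
        print(f"{task}: {completion_rate:.0%} completed, {avg_time:.0f}s average")

summarize(observations)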

The Live Usability Lab provides an excellent format for presenting and analyzing specific information interfaces as well as different usability analysis techniques. It has been wildly successful each time it has been presented and has frequently been called “the best thing offered at the entire conference.” Assuming this session is as successful at ASIS&T as it has been at other conferences, one could easily imagine making the Live Usability Lab a recurring event (as it is, for example, at the international conference for Museums and the Web), with different types of information interfaces being evaluated each year.

About the Open Access Archives and Digital Repositories To Be Evaluated

dLIST, the Digital Library of Information Science and Technology, was established in 2002 as a cross-institutional, disciplinary/subject repository for the Information Sciences, including Archives and Records Management, Library and Information Science, Information Systems, Museum Informatics, and other critical information infrastructures. Any scholar can submit to dLIST, as its vision is to serve as a dynamic archive for the Information Sciences, broadly understood, and to positively impact and shape scholarly communication in these closely related fields. dLIST is based on Eprints.

IDEALS, the Illinois Digital Environment for Access to Learning and Scholarship, disseminates, preserves, and provides persistent and reliable access to the research and scholarship of faculty, staff, and students on the University of Illinois at Urbana-Champaign campus. IDEALS is a set of collections and related services that together constitute the campus institutional repository. IDEALS is built upon DSpace software, with customizations to meet local policies and procedures (Donohue and Salo, 2006).

MINDS @ UW is designed to store, index, distribute, and preserve the digital materials of the University of Wisconsin. Content, which is deposited directly by UW faculty and staff, may include research papers, pre-prints, datasets, photographs, videos, learning objects, theses, student projects, conference papers, or other intellectual property in digital form. The content is then distributed through a searchable Web interface. MINDS @ UW is based on DSpace.

References

Bias, R.G. & Mayhew, D.J. (Eds.) (2005). Cost-Justifying Usability. Morgan Kaufmann.

Coleman, A. (2005). dLIST 2005 Survey: Self-archiving and scholarly communication behaviors in LIS. Instrument. http://dlist.sir.arizona.edu/1000/

Donohue, T. & Salo, D. (2006). DSpace How-To Guide: Tips and tricks for managing common DSpace chores. http://ideals.uiuc.edu/handle/2142/11

Dumas, J. (2002). User-based evaluations. In Jacko, J.A. & Sears, A. (Eds.), The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications (pp. 1093-1117). Mahwah, NJ: Lawrence Erlbaum Associates.

Foster, N.F. & Gibbons, S. (2005). Understanding faculty to improve content recruitment for institutional repositories. D-Lib Magazine, 11(1). http://www.dlib.org/dlib/january05/foster/01foster.html

Marty, P.F. & Twidale, M.B. (2005). Usability@90mph: Presenting and evaluating a new, high-speed method for demonstrating user testing in front of an audience. First Monday, 10(7). http://www.firstmonday.org/issues/issue10_7/marty/index.html

Nielsen, J. (1999). Designing Web Usability: The Practice of Simplicity. Berkeley, CA: New Riders Publishing.

Sale, A. (2006). The acquisition of open access research articles. First Monday, 11(10). http://www.firstmonday.org/issues/issue11_10/sale/index.html

Salo, D. (2006). Caveat Lector (posts on self-archiving, registration, etc.). http://cavlec.yarinareth.net/

Shneiderman, B. (2002). Leonardo's Laptop: Human Needs and the New Computing Technologies. Cambridge, MA: MIT Press.