www.elsevier.com/locate/intcom

Interacting with Computers 19 (2007) 70–82

Disrupting digital library development with scenario informed design

Ann Blandford a,*, Suzette Keith b, Richard Butterworth b, Bob Fields b, Dominic Furniss a

a UCL Interaction Centre, University College London, Remax House, 31-32 Alfred Place, London WC1E 7DP, United Kingdom
b Interaction Design Centre, Middlesex University, The Burroughs, Hendon, London NW4 4BT, United Kingdom

Received 4 January 2006; received in revised form 23 July 2006; accepted 24 July 2006. Available online 30 August 2006.

doi:10.1016/j.intcom.2006.07.003

* Corresponding author. Tel.: +44 20 7679 5288; fax: +44 20 7679 5295. E-mail address: [email protected] (A. Blandford).

Abstract

In recent years, there has been great interest in scenario-based design and other forms of user-centred design. However, there are many design processes that, often for good reason, remain technology-centred. We present a case study of introducing scenarios into two digital library development processes. This was found to disrupt established patterns of working and to bring together conflicting value systems. In particular, the human factors approach of identifying users and anticipating what they are likely to do with a system (and what problems they might encounter) did not sit well with a development culture in which the rapid generation and informal evaluation of possible solutions (that are technically feasible and compatible with stable system components) is the norm. We found that developers tended to think in terms of two kinds of user: one who was exploring the system with no particular goal in mind and one who knew as much as the developer; scenarios typically work with richer user descriptions that challenge that thinking. In addition, the development practice of breaking down the design problem into discrete functions to make it manageable does not fit well with a scenario-based approach to thinking about user behaviour and interactions. The compromise reached was scenario-informed design, whereby scenarios were generated to support reasoning about the use of selected functions within the system. These scenarios helped create productive common ground between perspectives.
© 2006 Elsevier B.V. All rights reserved.

Keywords: Digital libraries; Scenario based design; Usability evaluation; Software development processes

1. Introduction

The development of any complex interactive system is demanding. There are various factors to take into account: reliability, maintainability, interoperability, usability etc. It is very difficult for a development team to keep all of these requirements in focus simultaneously, or even to pay attention to each one as it becomes relevant. In particular, usability is often sidelined within traditional software development practice.

In the work reported here, we investigated ways of introducing user concerns into traditional software development processes that had previously focused on functionality and reliability. The approach taken was based on participant observation, in which a human factors specialist joined each development team at appropriate points within the development process and investigated the use of scenarios (Rosson and Carroll, 2002) and Claims Analysis (Carroll and Rosson, 1992) as techniques for including a stronger user focus within that development process. One of our partner organisations used both scenarios and Claims Analysis; the other only used scenarios. The findings related to Claims Analysis are reported elsewhere (Blandford et al., 2006). In this paper, we focus on those relating to the development and use of scenarios.

Before presenting the study and findings, we summarise the background under three headings:

• Understanding the target system: design and usability evaluation of digital libraries;

• Scenarios; and

• Previous work on making usability usable.


1.1. The development context: development and usability of digital libraries

While the focus of this paper is not on digital libraries (DLs) per se, but on the experience of working with developers, DLs provide an essential strand to the context of this work.

Historically, much work on usability of DLs has been post hoc – at least in the sense that a fully functioning prototype system exists to be tested (e.g., Papadakis et al., 2002; Komlodi and Marchionini, 1998). More recently, there has been an interest in check-list based approaches that make explicit some of the desirable features of DLs (e.g., Kyrillidou and Giersch, 2005). While this work has advanced the understanding of usability issues for DLs, these approaches do not lead to techniques that can be directly embedded within development processes.

Few descriptions of DL development processes exist. Bates (2002) gives one account of why this might be, describing a digital library in terms of a ‘‘cascade’’ of interacting components. Although not a development model itself, Bates’ work sets out to establish the relationships between the various entities (content, search mechanism, user interface etc.) that make up a DL system, with the intention that a DL designer who better understands what the entities are and how they interrelate is likely to come to better design decisions.

The majority of existing development descriptions are case studies giving accounts of the design and testing of individual DL projects, typically for use in tertiary education. In particular, various researchers (e.g., Champeny et al., 2004; Spasser, 2003) have given accounts of development processes that they have been directly involved in. Champeny et al. focus on the overall design process, which they describe as involving ‘‘iterative, formative evaluation, a process in which system design is studied in parallel with user needs and requirements’’. In contrast, Spasser (2003) is more concerned with the selection of material to be included within the DL – i.e., with collection building.

Most (though not all) DL projects reported in the literature are experimental, exploratory research projects, and this may go some way towards explaining why there is a paucity of design process models reported for DL systems. A design process should reduce the risk of failure in a development project by steering the designers towards good practice. By its nature a research project is high risk, and therefore should not be bound by possibly restrictive design processes. However, as DL systems transfer into public domains, it becomes more important to develop design processes which can ensure the delivery of working, usable DL systems that are fit for purpose.

One particular problem for DL systems implemented outside the educational domain is the difficulty in identifying well defined user groups. Users and their tasks can be fairly easily identified in an educational setting, but many libraries do not sit in such well defined contexts. Butterworth et al. (2005) argue that many DL projects set out explicitly to ‘broaden access’ to library resources into new user domains, and that in this case it is theoretically impossible to collect a priori user requirements from an unidentified user group. Typical design processes (even sophisticated, iterative, requirements driven processes) start from the assumption of there being a user group which can be delimited by some homogeneous need, and are therefore not well suited to DL development. Butterworth et al.’s solution to the problem is similar to Champeny et al.’s: to iteratively develop a DL system in parallel with assumed user requirements, and to refine both the DL system and the assumptions about the users’ requirements as part of the development process. While both of the development contexts studied here include iteration in design, there has historically been little user focus within those iterations: the use of scenarios as a means of taking a user perspective within an iterative development process is the focus of the work reported here.

1.2. Scenarios

The use of scenarios is an established approach to thinking about users within design. Indeed, Carroll (2002, p. 621) asserts that ‘‘it is impossible to find HCI methods today that do not incorporate scenario-based design’’. However, a variety of approaches to generating and using scenarios can be found in the literature.

Young and Barnard (1987) propose and illustrate the use of scenarios to test cognitive theories, arguing that HCI scenarios can play an important role in science by challenging cognitive psychologists to give theory-based accounts of human behaviour (as encapsulated within those scenarios). More recently, the focus of scenario use has shifted towards supporting design rather than science. Carroll and Rosson (1992, p. 185) describe a scenario as a description of ‘‘the activities a user might engage in while pursuing a particular concern’’. Different kinds of scenarios can be produced, focusing on existing or envisaged tasks and interactions, and described at different levels of detail according to purpose. Carroll and Rosson discuss the selection of scenarios in terms of typical and critical use scenarios: focusing on the tasks people perform most frequently and those that are most likely to cause problems. The main source of scenarios discussed by Carroll and Rosson is empirical studies of how users work; in principle, it is also possible to create them by reflecting on how users are intended to work (notably when creating new interaction possibilities or systems for new user populations). Sutcliffe and Carroll (1999) extend this to describe three different kinds of scenario: problem initiation (describing the situation prior to re-design); usage (describing a sequence of user–system interaction, based on empirical evidence); and projected usage (describing anticipated interaction with a re-designed artefact).

Cooper (1999) proposes a slightly different approach to scenarios that focuses on ‘‘personas’’. He describes personas as ‘‘hypothetical archetypes of actual users’’. He argues strongly that personas should be at the centre of the design process, and that a system should be designed for a very small number of personas – ideally, just one. Personas, which include a detailed description of the hypothetical user, include information about how that user might work with a system. However, since the system might not exist yet, there is typically less emphasis on details of interactions, and more on user features and requirements.

Within the HCI literature, there are many reports on the use of scenarios within design; these take a variety of perspectives – from the use of scenarios to support requirements acquisition (e.g., Maiden and Robertson, 2005) to their use in fostering creativity and co-operation (e.g., Bødker et al., 2000). However, as far as we can establish, there have been no studies of how to incorporate scenarios within an established function-oriented design process – an approach that can be regarded as ‘‘scenario informed design’’ rather than the more familiar ‘‘scenario based design’’.

1.3. The challenge: making usability usable

The challenge of making usability-oriented design and evaluation techniques themselves useful and usable has been with us for a long time (e.g., Hammond et al., 1983). Bellotti (1989) observed that usability techniques from the HCI research community had poor take-up within design practice, and presented a range of reasons for this, from users being difficult to contact to many user-oriented techniques being inappropriate to particular design situations. The problem is highlighted by O’Neill (1998, p. 65) when he refers to ‘‘the largely undisturbed arsenal of system development methods’’. With a view to alleviating this problem, Rosson et al. (1988) and Bellotti et al. (1995) are amongst those that argue for a better understanding of the practitioners, their context and their tasks, so as to overcome barriers to method use. Buckingham Shum and Hammond (1994) discuss the same issue in terms of ‘gulfs’ between research and practice. There are arguably at least two gulfs: between research and practice and between protagonists with a user and a technology focus.

Heinbokel et al. (1996) go further than identifying gulfs, and actually suggest that user participation, or even a user orientation, within design practice can hinder software development. The measures of success they use explicitly avoid the ultimate fitness for purpose of the designed artefact (including, of course, its perceived usability and usefulness); rather, they use software engineering metrics. One interpretation of their findings is that actually designing for users makes design problems more complex, hence ‘better’ designs ignore how systems will ultimately be used. Suggested reasons for this include ‘‘higher levels of aspiration and, therefore, [. . .] a higher degree of stressors and a lower degree of team effectiveness [particularly] when only little knowledge and experience are available on how to put user orientation into practice’’ (Heinbokel et al., 1996, p. 234). In the study reported here, it has not been possible to assess the quality – on any measures – of the design revisions made as a consequence of introducing user-oriented development methods, but it has been clear that the additional demands placed on developers, who already had difficult technical problems to deal with, were disruptive, for reasons discussed below.

Even in the 21st century, there are surprisingly few accounts of the incorporation of HCI techniques within (non-research-focused) design practice, and what little evidence there is (e.g., Paddison and Englefield, 2003) points to the use of largely post hoc, discount usability techniques such as heuristic evaluation (Nielsen, 1994) and lightweight empirical user testing (Vredenburg et al., 2002). Reliance on such techniques means that many inappropriate design decisions may have been made much earlier in the design process, and become design commitments, long before user perspectives are taken on board. There are counter-examples; for example, Spencer (2000) describes the incorporation of the Cognitive Walkthrough technique within an ongoing design process. However, Spencer’s description shows that the process of assimilating CW with an existing design process demanded adaptation on both sides: of the CW technique (simplifying it to make it more efficient) and of the design practice.

Whether or not it is true (as Carroll (2002) asserts) that all HCI methods today incorporate scenario-based design, there are still design processes that are not user-centred; a shift towards scenario-based design has not been apparent in our studies – either of DL development (as in the study reported here) or of other systems. For example, Blandford and Rugg (2002) report on an industrial design process in which there was a strong focus on systems reliability, but minimal consideration of users and use – whether in requirements acquisition or subsequent design and evaluation of the product. In their study, Blandford and Rugg introduced a suite of user-oriented techniques within one design cycle, and design insights from the exercise were adopted within the following design iteration, but there was palpable resistance to further change of the established design process.

2. Methodology

The study reported here is located in the real world of system development (as advocated by John (1998)). This account draws on various experiences over a three-year project that started with the broad aim of developing user-oriented tools that were applicable in DL design practice. The work has been exploratory and participatory rather than following a more traditional science research paradigm.

In the early stages of the project, various user-oriented design and evaluation techniques were investigated, with two particular concerns in mind:

• They should be adaptable to fit within existing DL design practice: to make any impact at all, we needed techniques that were acceptable to DL developers, and that could co-evolve with their design practices.


• Techniques should be ones that could accommodate the kinds of issues that really make DLs difficult to use.

From the earliest days of the project, we were working with librarians and developers in a large telecommunications company (BT), learning about their design processes and their users’ needs, and developing and testing candidate usability tools. Data collection and analysis were qualitative, based on interviews and observations.

Scenarios and Claims Analysis were adopted as a focus for the work because:

• Some other techniques that we had tested, including Cognitive Walkthrough (CW: Wharton et al., 1994) and Heuristic Evaluation (Nielsen, 1994), gave useful insights about superficial aspects of design, such as the suitability of labels and the quality of feedback, but did not give leverage on deeper usability issues concerning information seeking (Blandford et al., 2004). Scenarios and Claims offered the promise of addressing these deeper issues.

• Rather than being purely evaluative, scenarios and Claims are explicitly designed to fit within ongoing design practice; thus, the analysts’ reflection on design strengths and weaknesses could directly inform design changes.

• They appeared to be sufficiently light-weight to be acceptable to developers without a strong user-oriented analytical background.

• Compared to general design rationale (e.g., QOC: MacLean et al., 1991), the psychological basis of Claims Analysis provides more explicit support for viewing design from a user perspective.

Later in the project, we worked with a second development team – this time, developers of the Greenstone DL software, based at the University of Waikato. This study was necessarily time-limited: initial meetings were held with the lead developer when he was in the UK, but the main study was conducted in a series of interviews and team meetings over a 2-week period in New Zealand.

Each study comprised three main phases:

1. Initial exploratory participant observation work that investigated the design approach taken by the development team and collaboratively developed scenarios. By coincidence, this phase with each team focused on the development of a novel feature (Jasper at BT and the Gatherer in Waikato).

2. Formal training in development of scenarios and Claims Analysis. In particular, this training focused on the idea of a simple action cycle, in which users form goals, execute actions and receive feedback from the system, to structure some of the discussion of scenarios (a minimal, purely illustrative sketch of this cycle follows this list).

3. More structured investigation of scenarios, in which the developers created their own scenarios of use and the discussion focused on the implications for design. Again, these sessions took a participant observation approach. These sessions happened to focus on review and redesign of existing system features with both development teams.
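
The cycle referred to in point 2 can be made concrete with a small illustration. The following sketch is ours, for exposition only – it is not an artefact produced during the training or by either development team, and all names and example values in it are hypothetical. It simply shows how each step of a scenario can be interrogated in terms of the user's goal, the action the interface affords, and the feedback the system returns.

    from dataclasses import dataclass

    @dataclass
    class ScenarioStep:
        """One step of a scenario, viewed through the simple action cycle."""
        goal: str      # what the user is trying to achieve at this point
        action: str    # what the interface invites or allows them to do
        feedback: str  # what the system shows once the action is taken

    def review_step(step: ScenarioStep) -> list[str]:
        """Questions an analyst might ask of each step (our formulation)."""
        return [
            f"Would this user form the goal '{step.goal}' here?",
            f"Would they recognise '{step.action}' as the way to pursue it?",
            f"Does '{step.feedback}' tell them whether the action worked?",
        ]

    # Hypothetical example, loosely based on the Jasper scenario in Section 2.1.1.
    step = ScenarioStep(
        goal="share an interesting web page with colleagues",
        action="enter the digital library and locate the Jasper form",
        feedback="(none: the original form gave no confirmation)",
    )
    for question in review_step(step):
        print(question)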

Clearly, the approach employed in this study is not a standard scientific method. It is not reproducible: if a different team of people had conducted the study, the process and findings would have been different in detail. The ‘evaluator effect’ (Hertzum and Jacobsen, 2001) has been found to apply even to one individual evaluating a stable system, so it is unavoidably much greater where teams are involved in evaluating an evolving system. This is action research, in which insights have been achieved through the process of working with developers and reflecting on the experience. As we show in the accounts that follow, we have not chosen design problems to be particularly well suited to the approach adopted; neither have the development processes adopted by our collaborators been text-book development lifecycles. We have adapted our investigation to fit with development constraints, and the development teams, in turn, have worked hard to accommodate us and create constructive interactions. Thus, although the work has limited reproducibility, it has much higher ecological validity than most studies that have looked at applicability of user-oriented techniques in non-user-centred design practice.

We describe each of the studies in more detail here, before drawing out general findings and conclusions. As noted above, the Claims Analysis work with BT is described elsewhere (Blandford et al., 2006); here we focus on their generation and use of scenarios. The Greenstone developers did not engage with Claims Analysis, preferring to work with only scenarios. To better understand how scenarios fit within the development context, we start each section by outlining key features of the development team, their access to users and the processes they follow.

2.1. Work at BT: the library and Jasper

Like many public- and private-sector organisations, BT provides a DL interface to a range of digital resources to which it subscribes on behalf of its staff. It also aims to provide ‘value added’ services. These include a set of ‘Information Spaces’ that allow users to keep track of new information in their specialist areas, and collaborative tools enabling users to share information they find.

The development is coordinated by three key members of library staff, who deal with some aspects of the development (notably interface implementation) themselves, and who also have regular contact with library users visiting the physical library. Development of underlying technical infrastructure is devolved to specialists. The DL had undergone three major upgrades, in terms of features offered and user interface, in the previous six years. Development does not follow any standard methodology; the BT developers described their development process as very informal – that they tended to have an idea, implement a prototype and then discuss the design-as-instantiated with colleagues to get reactions.

The developers consulted users (i.e., BT employees) in various ways. Around the time of each new release, they would conduct user trials, focusing on finding flaws in the system; users in these trials were given detailed instructions on what to do and what features to test. The developers’ knowledge of user difficulties and user requirements was derived primarily from users calling them or coming into the library with queries and requests. They also accessed system logs, to ascertain levels of use and which facilities or resources were being most extensively used.
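
As a hedged illustration of the last point – the paper does not describe BT's actual log format or tooling – counting which facilities appear in access logs might look like the following sketch; the log lines and field layout are invented.

    from collections import Counter

    # Hypothetical access-log lines: "<timestamp> <user> <facility>"
    log_lines = [
        "2005-11-02T09:14 u017 information-spaces",
        "2005-11-02T09:21 u042 keyword-search",
        "2005-11-02T10:05 u017 keyword-search",
    ]

    facility_use = Counter(line.split()[-1] for line in log_lines)
    for facility, hits in facility_use.most_common():
        print(f"{facility}: {hits} uses")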

2.1.1. Development of a novel feature: Jasper

The preliminary study at BT focused on the design of Jasper. This was a web site notification feature that enabled users to send interesting URLs to nominated colleagues, including notes about why they were interesting. It had been developed by another group within BT and tested by them as a stand-alone item, and the plan was now to incorporate it within the library. There was no data about user experience working with Jasper to inform scenario development.

The question was where to locate it within the library, and how to present it to users. In particular, the developers believed that the data entry form should be redesigned, and they wanted to focus on that. The first step was to create a scenario accounting for how a user would have reached this point. The scenario developed was very informal, but involved a user surfing the web, seeing a page that was interesting, and then entering the digital library (while keeping track of the interesting URL) to access the Jasper feature. This highlighted the fact that finding and using Jasper would be non-trivial, and that the user would have to be aware of this feature and how it functions before it could be used. Further analysis also highlighted the fact that the user was given no feedback when they had completed and sent the form to confirm their action. The latter was a solvable problem, so one of the BT developers immediately went to the whiteboard to map out how he could create a preview/feedback page that addressed the identified difficulty.
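
As a purely illustrative sketch of the kind of fix discussed – the developer's actual preview/feedback page was only mapped out on a whiteboard and is not described further in this paper – a submission handler for a Jasper-style form might echo back what has been sent. All function and parameter names below are hypothetical.

    def send_to_colleagues(url: str, note: str, recipients: list[str]) -> None:
        """Stand-in for the existing Jasper notification back end (assumed)."""
        pass

    def submit_jasper_notification(url: str, note: str, recipients: list[str]) -> str:
        """Handle a 'share this URL' submission and return explicit feedback.

        The usability problem surfaced by the scenario was the absence of any
        confirmation after the form was sent; this sketch returns a message
        summarising what was sent and to whom.
        """
        if not url or not recipients:
            return "Nothing sent: please supply a URL and at least one recipient."
        send_to_colleagues(url, note, recipients)
        return (f"Sent {url} to {len(recipients)} colleague(s): "
                f"{', '.join(recipients)}. Note attached: '{note}'")

    print(submit_jasper_notification(
        "http://example.com/interesting-page",
        "Possibly relevant to our current project",
        ["colleague@example.com"],
    ))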

Even this initial case study highlighted several important features of the way the developers integrated the scenario with their design reasoning:

• The scenario was developed around a pre-selected function and questions about that function.

• The scenario evolved in parallel with reasoning about usability insights from it.

• Solution generation was also interleaved with scenario development; in fact, the development team expressed the view that they did not want to find out about problems unless they could also identify solutions. Thus, they were highly solution-focused.

2.1.2. Development of an existing resource: information spaces and other features

A subsequent, more structured, session at BT focused on existing features such as the Information Spaces (shown in the centre of the screen in Fig. 1). An information space is a browsable collection of documents on a specified topic, to which new documents are regularly added. Each is specialised so that it will not grow unmanageably large.

Fig. 1. The BT Library interface.

Three scenarios, based around personas, were developed by the BT developers. The personas covered a less skilled user, one more demanding, and one in the middle; in creating them, the developers drew heavily on people they knew (exploiting the fact that they have contact with their end users). They named their three personas after the Goodies (a UK TV show from the 1970s).

• Tim was an expert in his subject, and had done a preliminary search on credit card fraud. He had requested help on following links from a relevant journal.

• Graeme was an expert in his field but new to the DL. Working against a tight deadline, he had no time to waste. The scenario involved him conducting a basic search and then selecting appropriate resources, changing key terms, and using the keyword search feature.

• Bill was doing an MBA, and therefore using the resource for professional development, pushing the boundaries of the resources beyond the core business content. He had a preference for books, and therefore wanted to search the book catalogue, but was having difficulty using phrase search.
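
For illustration only, such personas could be captured as lightweight structured records; the sketch below is ours (the BT developers kept their personas as informal verbal descriptions, not as code or data), and the attribute names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Persona:
        name: str
        domain_expertise: str   # knowledge of their own subject area
        library_expertise: str  # familiarity with the DL and its search features
        goal: str               # what has brought them to the library

    personas = [
        Persona("Tim", "expert in his subject (credit card fraud)", "moderate",
                "follow links from a relevant journal after a preliminary search"),
        Persona("Graeme", "expert in his field", "new to the DL",
                "find appropriate resources quickly against a tight deadline"),
        Persona("Bill", "MBA student", "having difficulty with phrase search",
                "search the book catalogue for professional development reading"),
    ]

    for p in personas:
        print(f"{p.name}: {p.goal}")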

These scenarios, generated by the developers, share some notable properties:

• They all reflect real cases where users had come to the library seeking help;

• The formulated scenarios were (possibly as a direct consequence) very sketchy;

• They were all based on existing library features that were known to be difficult to use;

• All the personas lacked a good understanding of the search process, library features, the structure of the database, and the choice of resources; however, two of the personas represented people who were experts in their own fields.

Because the scenarios were based on real incidents, there were times when the scenario as discussed merged into memories of real incidents. For example, while discussing Graeme, one of the developers remembered:

D1: What actually happened was that at this stage we showed him how to use ‘browse by key words’ to pick a better word.

Similarly, while discussing Tim, two of the developers tried to recall the user’s experience:

D1: If it was credit card fraud, he went into places he knew, though, because I think you said that it’s a similar
D2: That’s a similar one, but he didn’t find much: looked and searched
D1: How did he go in, do you know? Ranking technology or
D2: I really don’t know. I think he probably just did a keyword search, really tie it down. I think he probably just did keyword searching
D1: but a bad choice of keywords perhaps
D2: Yeah because the terms, I don’t think he was familiar with them

The developers expressed a strong preference, as exemplified in the descriptions of Tim, Graeme and Bill, for generating scenarios where the information seeking was less than instantly successful, because such scenarios would provide more useful insights into problems and solutions. Scenarios tested different parts of the library, including the keyword browser, the ‘search for similar’ feature and the Information Spaces.

Although the scenarios developed look impoverished, because they referred to incidents of which the developers were all aware, there was substantial common ground on details that were never explicitly articulated. Thus, the scenarios are richer than they appear, but the developers did not perceive any value in producing the enriched descriptions that are advocated in texts on scenario based design. The initial sketchy descriptions were gradually embellished through the dialogue, and (implicitly) through the shared knowledge of the developers.

The scenarios, described from a user perspective, were then tested against the system description to assess whether users with the specified skill sets (as defined within the personas) would know what to do and would understand the results they were getting back. For example, when analysing the Information Spaces, it became apparent that they are much more appropriately regarded as a monitoring feature for expert users than as a good place for a novice user to start, unless the user happens to be lucky and have a requirement that neatly matches an existing topic area.

In practice, the developers got distracted several times by trying to enact the scenarios as proposed and having difficulty understanding how the system returned results. For example:

D1: I was trying to think of something that would turn up. (typing) I know we de-emphasised databases we thought were less interesting, I can’t remember if it’s a set order or if it’s. Yep, we must have, I think we sometime manually set this, perhaps when ABI was new and we wanted to make sure it was at the top, so I think we emphasised certain databases here and de-emphasised others.

This illustrates one difficulty with working with scenarios in a domain such as DL development: that the effects of user actions can be unpredictable to the developers, so that describing the scenario at an appropriate level of detail is a real challenge: very abstract scenarios can make assumptions (e.g., about the success of a search or the appropriateness of the resources used) but lack the detail needed to support design reasoning, while very detailed scenarios can become behaviour traces, which typically results in the detail obscuring the principles.

Quite soon, the developers extended the discussion into problem solutions:

D1: Um, well if you don’t find anything, there’s a link to us for help.
D2: We should make that more prominent for a start!
D1: ‘‘I suggest you check your spelling and use the phone number for help’’.

In discussing solutions, they were considering the costs of making changes as well as the benefits:

D1: We could easily do that.
D2: I thought that would be quite difficult.
D1: The difficulty would be if you went down to the issue level: then the search box would, you think you were just searching for that issue, because I couldn’t do that, I could only have search making strategies. I couldn’t have searched for January/February 2003 issues.
D2: Yes, you’d have to qualify it, wouldn’t you? Interesting thought.

The developers worked readily with the scenarios, and switched seamlessly between scenarios, problem diagnosis and solution generation. Through their choice of scenarios, the developers focused on problems, and in the discussion their interest was in problems that were in principle solvable, as the developers did not perceive any value in identifying problems that could not be fixed.

This session enabled the developers to get a richer sense of the difficulties less experienced users face and a richer understanding of the user experience than they previously had. Personas (based on real users) and scenarios (based on real incidents) were central to this richer understanding.

2.2. Work on Greenstone

To test the broader validity of the findings from the work with BT, a similar approach was taken with the developers of the Greenstone system. Greenstone is not a DL per se, but a ‘‘suite of software for building and distributing digital library collections’’ (www.greenstone.org, accessed 14/12/5). A digital library may consist of any number of collections; for example, Fig. 2 illustrates the home page of the New Zealand Digital Library (www.nzdl.org, accessed 14/12/5), showing six separate collections; however, in much DL development, the term DL may be used to refer to either a collection or a set of collections.

Fig. 2. Six collections in the NZDL.

Greenstone (Witten et al., 2001) is developed by the New Zealand Digital Library Project at the University of Waikato; at the time of the study, Greenstone 2 was in widespread use, including by international organisations such as UNESCO and the Human Info NGO. Greenstone 3 was under development. This was envisioned to be a substantial re-design that retains the essential features of the current version with an improved system architecture and some re-designed or new features.

There were two main phases of interaction between Greenstone developers and the human factors team. The first occurred when the lead developer visited the UK, on sabbatical. At that time, he was developing a tool called the Gatherer (Bainbridge et al., 2003), which was the focus for one analysis phase.

The second occurred when the second author, referred to below as HF, visited Waikato for a fortnight to work on the development of Greenstone 3. At the time of the visit, the core development team for Greenstone comprised seven academic staff and research fellows. This team is supplemented by Masters students, who complete projects based around Greenstone, and a worldwide network of developers who contribute functions according to their particular specialisms.

2.2.1. Analysis of a new feature: The Gatherer

The Gatherer is a tool that supports a collection-builder in gathering together and organising documents to create a digital library collection. It was at a point of development where scenarios could be used for informing design and encouraging reflection within the design process. Data collection and analysis remained informal throughout this interaction.

Initially, the developer and human factors specialist worked through some of the screens and the developer told her about the pages he was having difficulty with. Over time, they constructed a representation of a person who was using the Gatherer to collect information together and what they were going to be doing. The developer’s first description of what the user would be doing was in terms of procedures: they would look at this and they could add in a few documents and then they can go to this bit which is the metadata and then they can come to this page and redesign it, etc. Gradually, the developer and HF specialist constructed a more user-centred description of how the person would create a collection, including where they would get documents from, how they would collect them onto the site and what they would be expecting to do at each point in the interaction.

At this stage, the focus was on telling stories about use and the user experience. HF and the developer investigated ways of talking about what the effects of a design on a user would be. The approach eventually taken was to consider the user’s goals, intentions and actions in the situation of working with the Gatherer. Together, they worked out what the user’s actions were and what had triggered them, what the designer expected the user to do and what the user was able to do. By this route, they created an integrated account that related the developer’s intentions and user’s behaviour. In this phase, there was a substantial cultural gap between the HF and developer perspectives on design. Both HF specialist and developer had much to learn from each other, which involved substantial negotiation over shared understanding and values.

The process of analysing the design in this way was generating useful insights – for example, in identifying pages where the user would be unable to predict the effects of actions or choose a sensible action at all. The developer started with two conceptions of users, as being either the exploratory user who will click boxes and just happen to see what is there or the skilled user who already has all the system knowledge they need. Understanding the user as being goal directed proved invaluable for considering a richer spectrum of users. Through the process of generating scenarios, they created a user who knows what they want to do but does not know how to do it.

The Gatherer was intended to become a component of Greenstone 3, to which we now turn our attention.

2.2.2. Greenstone 3

In contrast to the BT developers, the Greenstone development team’s contact with end-users of libraries built using Greenstone is mixed: a couple of team members have structured interactions with end users, but most only ever interact directly with collection builders, who therefore act as intermediaries between the development team and end-users. There was therefore relatively little group knowledge to inform scenario development.

In a 2-week visit to Waikato, the HF specialist engaged with the developers in a systematic (user-focused) review of Greenstone. The review took place in three stages, defined by the development team.

The first session was a functional review in which the development team listed out the different functions of Greenstone 2 on a whiteboard, noting whether they were part of the core development or prototypes that had not yet been integrated into the core system, and discussing whether or not they were perceived as being successful. If they were considered successful, they were to be retained in Greenstone 3. A function was considered successful if the software was reliable and functionally correct, and users were able to use it fruitfully (i.e., it was perceived as being both usable and useful). A function was regarded negatively if any team member came up with a reason why it was not working well; this might be that the code was unreliable, that users were known to have difficulty working with it, or that collection-builders had to write substantial extra code to turn it into exactly what they wanted.

Functions that were to be considered further were retained for subsequent group sessions.
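
The triage rule applied in this session can be paraphrased in a few lines of code. The sketch below is our reconstruction for exposition only – the team's actual review was an informal whiteboard exercise – and the record fields and example entries are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class FunctionReview:
        name: str
        reliable: bool           # software works and is functionally correct
        usable_and_useful: bool  # users are able to work with it fruitfully
        objections: list[str] = field(default_factory=list)  # reasons raised against it

        def successful(self) -> bool:
            """Retained as-is for Greenstone 3 only if nothing counts against it."""
            return self.reliable and self.usable_and_useful and not self.objections

    # Hypothetical entries in the spirit of the whiteboard review.
    reviews = [
        FunctionReview("full-text search", True, True),
        FunctionReview("query refinement", True, False,
                       ["novice users struggle to refine an unsatisfactory results set"]),
    ]

    carried_forward = [r.name for r in reviews if not r.successful()]
    print("To be reconsidered in the scenario sessions:", carried_forward)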

In the second session, again, most of the development team participated. A set of 11 scenarios, including personas for different kinds of users, was constructed by the team members. The users included novices and others with more knowledge, and end-users as well as collection builders. At this stage, the scenarios were fairly general, avoiding too much detail about individual user actions since these would depend in part on the content of a particular library. From the set of 11, three scenarios were selected for detailed analysis. These represented situations where the developers agreed it would take some work to fix known problems with the system, but also situations that would affect many users, so that there would be a high gain in correcting known difficulties. For example, one of these scenarios was as follows:

Scenario 5 – Historic manuscripts

Victoria and Albert are developing a new collection of historic manuscripts and books, digitised from original source material. They want to maintain a close fidelity to the pagination and other physical properties of the originals. Victoria and Albert are working in a small archive/library of interest both to expert professional researchers, family historians and those with an interest in 19th century literature – the latter including adults and, especially, children and students of literature. Victoria and Albert want to create a collection which will attract these users, some of whom already visit the physical library, both to the online and physical collections. Three typical users include Janet, a regular visitor (two or three visits a year), an English Literature academic who is particularly interested in the manuscripts of poets; Jeremy, who wants to find out if his ancestors are mentioned in the memoirs of a writer; Sally and Paul, two 13 year-olds who are working on a project for their English literature class on the use of industrial motifs in 19th century literature.

One of the developers almost immediately launched into design solutions:

DA: To me, if I’m doing this, I want plug-ins. You could say plug-ins are a good feature that helps and means the system is adaptable to changes in document format.


The discussion developed fluidly covering design solutions, requirements and scenario embellishment. For example, a bit later they further considered the likely IT skills of the personas and more about how they would use the system:

DA: V and A: do we think they have a lot of IT skills? They are working in a small archive.
Group response: No, not high IT.
DA: Do we think this is something they would have to attempt using the Gatherer only?
DB: Ideally the gatherer should have everything they want to do – that rather than command line: they ain’t going to hack that.
DA: I guess the strong specialist feature is that the pagination in the digital library is preserved.

This last statement is an example of discussion that teeters on the borderline between being an element of a scenario (that Victoria and Albert are working with facsimile pages) and a design requirement.

Others returned to discussing particular system features (or potential features) several times through the discussion. Another soon gave an account of the source of inspiration for this particular scenario:

DB: I wrote this scenario – confession. I was thinking of something that looks like the Niupepa Collection [3], where you have got something that looks like the original thing, not just the raw text.

At one point, the team focused on Sally and Paul, considering their specific needs. This was a very tightly coupled discussion with four participants contributing ideas:

DB: Classifiers.
DC: Yes.
DB: Seems damn important because probably want to classify by who was mentioned, who wrote it.
DC: Date, location, particularly for the 13 year olds, probably aren’t going to be.
DB: Classifiers are going to be pretty big, I think.
DA: Do we feel that classifiers are done in a good way? We could obviously reconstruct the Niupepa classifiers by date quite easily. However, if you wanted to do.
DC: Different things.
DB: We could improve on how we do that, we could do things nicer and make the collection nicer as a finished result. It would make their job a lot easier. A lot of what they are going to end up doing is after they have done the OCR, is the classifiers.
DD: Provide some automated help in checking the quality of the data that we are getting out of OCR process.

[3] A collection of Maori newspapers.

Similar (though less tightly coupled) exchanges occurred for the other two scenarios considered.

Overall, the design team collaboratively embellished the scenarios (though these embellishments were never written down) through the process of considering designs and requirements, and the three kinds of activity were interwoven in a reflective design interaction. The tightly coupled discussions ensured that common ground between participants was reinforced.

Personal experience was also invoked routinely to inform consideration of design possibilities and costs. For example:

DA: The next goal is to brand the website – that is what the scenario is about.
DC: Out of interest, so how hard was this?
DA: This was very close to something. I did it – so it wasn’t hard at all.

For each scenario, the team considered what functions users would make use of and which functions were causing problems, in order to focus on the most critical ones that needed attention. In practice, the developers were not particularly interested in what the consequences of a problem would be for the user; it was enough for them to know that the user was likely to have a problem. Once they were sure this was the case, they simply needed to know enough about the nature of that difficulty to be able to make design changes so that the user would be successful in future.
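
For illustration of this kind of bookkeeping – something the team did through discussion rather than in code – cross-referencing the selected scenarios against functions known to be problematic gives a simple way to see where a fix would pay off across the most scenarios. The assignments below are hypothetical.

    from collections import Counter

    # Functions each analysed scenario exercises (hypothetical assignment).
    scenario_functions = {
        "historic manuscripts": ["classifiers", "OCR quality checking", "facsimile display"],
        "branding a site": ["interface customisation"],
        "iterative searching": ["query refinement", "classifiers"],
    }
    problematic = {"classifiers", "query refinement", "OCR quality checking"}

    # Rank problem functions by how many of the selected scenarios they affect.
    affected = Counter(
        fn
        for functions in scenario_functions.values()
        for fn in functions
        if fn in problematic
    )
    for fn, count in affected.most_common():
        print(f"{fn}: affects {count} scenario(s)")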

The third session moved beyond evaluation into the next stage of development. There was extensive discussion of the difficulties experienced by novice and inexperienced searchers in developing an iterative search (such users typically receive a results set that is not quite right for their needs but do not have the skills to refine the search). The meeting achieved consensus that this was indeed a problem that needed to be tackled. At that point, the team resorted to considering how other developers have tackled the same difficulty and whether those other solutions could be exploited in Greenstone. From that point on, the discussion moved away to new design solutions and the practicalities of implementing them. The discussion considered costs (the likely time to implement new solutions) and benefits (what user problems the new solutions would fix).

Overall, structuring the review around scenarios made it possible for the more technically oriented developers to gain a new perspective on how a DL would be used, particularly by relatively inexperienced users. The discussion also enabled the team to identify feasible design changes that would improve matters for users.

Greenstone 3 is a challenging object of study. It is not a system that is directly used by end-users. Some of the development team view it simply as a toolset that is made available so that others can create good collections, so that their responsibility is to make collection building as straightforward as possible. Other team members argue that there should be defaults built in to Greenstone to enable collection builders to quickly create collections that are likely to be easy to work with. This is a development context where responsibility is shared across remote design teams who may have minimal contact with each other, but where the decisions of one team constrain and inform what is possible for another; in this context, the generation and use of scenarios to guide design is difficult. Nevertheless, we believe that this ‘‘scenario informed design’’ approach was more effective in bringing usability into design than any other strategy we could have adopted.

3. Discussion

We structure the findings according to the two main themes that have emerged from the work: the use of scenarios and the cultural gulf between the development and human-factors teams.

3.1. The generation and use of scenarios

Within this project, scenarios were found to be more difficult to work with than anticipated. Had we been operating, as Rosson and Carroll (2002) advocate, within a scenario-based design context, the challenge of generating scenarios would probably have been smaller. In function-oriented design, for a complex and partly unpredictable system, defining scenarios at an appropriate level of detail proved difficult. As a craft skill, of course, there is no such thing as a ‘correct’ scenario: just one that is more or less useful.

For both sets of developers, in the early stage of the collaboration it proved difficult to establish common ground between their traditional technical orientation and our user-centred orientation. The early scenarios they generated were technology-focused: assuming a user with perfect knowledge, or an exploratory user without particular information goals, and considering what was possible or sensible with the existing system. Personas and a vocabulary of goals, actions and feedback were the two thinking-tools that supported us, collectively, in shifting towards user-centred scenarios. In particular, personas enabled them to view the DL from the perspective of externally motivated but novice users.

The early design situations with each development team (Jasper and Gatherer) were novel ones: designs that were creating new interaction possibilities rather than upgrading existing ones. For these, the only option was to create projected usage scenarios. The developers’ strong preference was to focus on scenarios that predicted user difficulties that could be fixed with obvious design changes. They were not interested in either scenarios that highlighted no user difficulties or ones that highlighted difficulties that they could not address.

The other two cases (the Information Spaces and Greenstone 3) involved iterative development so that there was an existing system to refer to. At BT, the experiences of real users were the main source for generating scenarios. The development team had difficulty divorcing the abstract scenario from the details of real user interactions and low level implementation details that typically have a substantial influence on the user experience (e.g., in determining how search results are ranked). There were no such empirical resources in relation to Greenstone, so that the team had to rely on anecdotal evidence of how users worked with Greenstone 2. At BT, the developers favoured scenarios of known problematic situations that could be used to directly inform re-design; the Greenstone team favoured relatively abstract scenarios – but again ones that gave insights into problems with the current system.

In this ‘‘scenario informed design’’, all the scenarios that were generated originated in the system features that were the current focus of concern, rather than the more abstract user considerations that might be the focus in scenario based design. These developers had no interest in completely transforming their design processes to make them user centred. The function-centred, scenario-informed design compromise appears to have achieved an acceptable balance between technology- and user-centred design. In addition, there was no text-book progression from problem statement to solution generation: from almost the beginning of each session, design possibilities were being generated alongside the development and discussion of the scenarios. Where possible, these solutions drew on the developers’ prior experience. Personal experience was clearly a highly valued resource for considering both problems and solutions.

3.2. Bridging gulfs between perspectives

Although the focus of this study was on contextualising and testing scenarios in DL development, key issues also emerged regarding the contrasting perspectives of the human factors team and the two development teams we worked with.

Both development teams shared a functional approach to thinking about design; because of this functional perspective, both had difficulty perceiving how, from the user’s perspective, all the functions needed to be joined up into a continuous interaction experience. Both teams started several discussions by presenting the human factors specialist with a particular issue – for example, where should this page be linked in to the rest of the system, or how should it be laid out? – without considering the broader context of how the user would have got to that point in the interaction, or why they were there. This was inconsistent with a scenario based approach.

There was a cultural gulf to be bridged: the human factors team had difficulty comprehending the interaction from a technical perspective, while the developer teams had equal difficulty in comprehending it from a user perspective. This made the early stages of working together challenging, as there were substantial communication difficulties. Over time, scenarios were used to create common ground but the process of building this ground was slow and difficult. This is consistent with the view put forward by Bødker and Christiansen (1997) that scenarios can serve as ‘boundary objects’ between members of a design team who come from very different backgrounds. It is worth noting, however, that the process of generating scenarios was reversed from the text-book account: we typically worked from functions that developers were concerned about to generate scenarios that involved users working with those functions.

As well as thinking about design in terms of the functionality a system supports, another important characteristic of the developer perspective was that it was solution-focused. Developers were, frankly, not interested in post hoc reflection on the qualities of a design; consequently, they only really engaged with scenario-based discussion when it could help identify solvable problems, and analysis was often interleaved with solution generation and discussion. They valued the insights from theory, but only when they could guide problem-fixing.

4. Conclusions

Scenarios explicitly set out the assumptions that designers make about the users and their tasks. Where possible, the developers chose to base scenarios on instances from their own prior experience. However, in other cases there was no a priori empirical evidence gathered from actual DL system users, and therefore developers had to rely on their intuitions and make design guesses. Clearly this is not ideal, but the risks are mitigated by making those assumptions explicit and testable. Scenarios would seem to be a good vehicle for doing this, but our work has exposed various practical difficulties in putting it into practice.

In electing to work with scenarios, one initial assumption was that it should be possible to create a library of scenarios that could then be readily re-used by DL developers. This proved more challenging than anticipated for several reasons:

• The cultural gulf between the development teams and ourselves was substantial. They were function- and solution-focused, whereas we were scenario- and user-focused. Scenarios proved useful for creating common ground between these perspectives, but the gulf remains.

• DL development is even more ill-structured than most other software development, typically involving interoperating components developed by distributed teams who may have no direct contact with each other. At the least, this creates distributed responsibility for delivering a user experience that is an emergent product of a complex web of design decisions.

• Two of the functions we worked with were novel designs that created new interaction possibilities, so that there was neither relevant theory nor empirical data on which to base the design of scenarios.

• Developers readily invoked their own prior experience (of observing users, developing functions, experiencing other people's designs as users themselves), but showed much less inclination to draw on theory or the empirical findings of others.

Nevertheless, the use of scenarios helped deliver important design insights and bridge the gulf between the conflicting perspectives. In particular, the process of discussing the scenarios, and who the users are and what they are trying to do, generated a productive dialogue between the developers and human factors specialists. Thus, the power lies in the dialogue and the potential for creating shared understanding (a point also noted by Bødker and Christiansen (1997)).

Scenarios and personas enabled developers to explore the 'hidden middle' of user profiles: the users who are neither simply exploring nor ready-made experts. Cooper (1999), in his discussion of programming practice, asserts that programmers tend to think in terms of three numbers: 0, 1 and infinity (e.g., if something can be done more than once, it can typically be done an infinite number of times), and that this prevents them from thinking of good design solutions based around smallish numbers. There is a parallel here with thinking about users: good designs are often those that assume users have some knowledge (and purpose), but not infinite knowledge.
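
To make Cooper's contrast concrete, the following sketch (ours, not drawn from either development project; the function names and data fields are hypothetical) contrasts a '0, 1, infinity' presentation of DL search results with one designed around the smallish number of results that a user with some knowledge and purpose will realistically scan:

# Hypothetical illustration (not from the study) of Cooper's '0, 1, infinity'
# point, applied to presenting digital library search results.

def render_results_unbounded(results):
    # 'Infinity' thinking: treat the result set as arbitrarily long and give
    # every item the same terse, uniform treatment.
    for i, doc in enumerate(results, start=1):
        print(f"{i}. {doc['title']}")

def render_results_smallish(results, scan_limit=5):
    # 'Smallish number' thinking: assume the user will realistically scan
    # only the first few entries, so give those richer detail and summarise
    # the remainder.
    for doc in results[:scan_limit]:
        print(f"{doc['title']} ({doc['year']}): {doc['summary']}")
    remaining = len(results) - scan_limit
    if remaining > 0:
        print(f"... and {remaining} further results; refine the query to narrow them")

The point is not the code itself but the framing: the second function encodes an assumption about a small, realistic bound on user attention, which is exactly the kind of assumption that a scenario can make explicit and testable.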

Our experience is necessarily coloured by the nature of the development environments in which we were working. DL development is unavoidably ill-structured. There is a web of interdependent developments. The whole process is much more organic and opportunistic than that assumed in descriptions of design processes, including scenario-based design (Rosson and Carroll, 2002). Design and analysis are interspersed fluidly, with each other and sometimes even with exploration of the current system. Both of the design teams we worked with had a strong culture of close team working, as illustrated in some of the transcripts above where individuals work in tightly coupled ways, often even contributing elements of the same sentence. The rich complexity of the development process makes it difficult to apply scenario-based design in anything like a textbook fashion. More critically still, the cultures of the development processes we have investigated are highly function-focused, and developers are interested in solutions, not problems. The sheer complexity of the technical challenges that underlie the development of well engineered DLs (i.e., ones that work at a functional level, with components that interoperate correctly, with a minimum of bugs) demands a strong technical focus, which is culturally at odds with user-centred design processes. This technical orientation is essential to creating products that are well enough engineered to be useful or usable; the need is not to change the development culture to a user-centred one, but to value both cultures and find ways to enable them to work together. We have called this 'scenario informed design'.

Making usability usable demands that it be acceptable and valued. There are many different development contexts, and those we have been involved in are very different from those described by Carroll et al. There are good reasons for retaining a strong technology perspective in development and for focusing on the development of individual functions. Newell and Card (1985) propose that "The race is between the tortoise of cumulative science and the hare of intuitive design". In the work reported here, it would appear that the hare is still leading the race – that, in the design contexts studied, intuition and solutions continue to take precedence over science and understanding problems. What is needed is not for the hare and tortoise to continue to compete, but for them to find ways of co-existing and working together more effectively.

Acknowledgements

We are grateful to the development teams at BT and Waikato who worked with us on this project and created such fruitful interactions. David Bainbridge and George Buchanan gave useful feedback on a draft of this paper. The work was funded by EPSRC Grants GR/N37858 and GR/S67494.

References

Bainbridge, D., Thompson, J., Witten, I.H., 2003. Assembling and enriching digital library collections. In: Proc. Joint Conference on Digital Libraries, Houston, Texas, pp. 323–334.

Bates, M., 2002. The cascade of interactions in the digital library interface. Information Processing and Management 38, 381–400. Available from <www.gseis.ucla.edu/faculty/bates/articles/cascade.html>.

Bellotti, V., 1989. Implications of current design practice for the use of HCI techniques. In: Jones, D., Winder, R. (Eds.), People and Computers IV, Proceedings of HCI'89. Cambridge University Press, pp. 13–34.

Bellotti, V., Buckingham Shum, S., Maclean, A., Hammond, N., 1995. Multidisciplinary modelling in HCI design... in theory and in practice. In: Proceedings of CHI '95 ACM Conference on Human Factors in Computing, pp. 146–153.

Blandford, A., Rugg, G., 2002. A case study on integrating contextual information with usability evaluation. International Journal of Human–Computer Studies 57 (1), 75–99.

Blandford, A., Keith, S., Connell, I., Edwards, H., 2004. Analytical usability evaluation for Digital Libraries: a case study. In: Proceedings of ACM/IEEE Joint Conference on Digital Libraries, pp. 27–36.

Blandford, A., Keith, S., Fields, B., 2006. Claims Analysis 'in the wild': a case study on digital library development. International Journal of Human–Computer Interaction 21 (2), 197–218.

Bødker, S., Christiansen, E., 1997. Scenarios as springboards in design. In: Bowker, G., Gasser, L., Star, S.L., Turner, W. (Eds.), Social Science Research, Technical Systems and Cooperative Work. Erlbaum, pp. 217–234.

Bødker, S., Nielsen, C., Graves Petersen, M.G., 2000. Creativity, cooperation and interactive design. In: Proc. DIS 2000. ACM, pp. 252–261.

Buckingham Shum, S., Hammond, N., 1994. Transferring HCI modelling and design techniques to practitioners: a framework and empirical work. In: Proceedings of HCI'94: People and Computers IX, Glasgow, 23–26 August, 1994. Cambridge University Press, Cambridge, pp. 21–36.

Butterworth, R., Ghosh, M., Wise, C., 2005. Designing for broadening access: model based iterative design of digital library resources for lifelong learners. Presented at Digital Resources for the Humanities 2005. Extended abstract available from: <http://ahds.ac.uk/drh2005/viewabstract.php?id=24&cf=1>.

Carroll, J.M., 2002. Making use is more than a matter of task analysis. Interacting with Computers 14, 619–627.

Carroll, J.M., Rosson, M.B., 1992. Getting around the task-artifact cycle: how to make claims and design by scenario. ACM Transactions on Information Systems 10 (2), 181–221.

Champeny, L., Borgman, C., Leazer, G., Gilliland-Swetland, A., Millwood, A., D'Avolio, L., Finley, J., Smart, L., 2004. Developing a digital learning environment: an evaluation of design and implementation processes. In: Proc. JCDL 2004, pp. 37–46.

Cooper, A., 1999. The Inmates are Running the Asylum. Sams Publishing, Indianapolis.

Hammond, N., Jorgensen, A., MacLean, A., Barnard, P., Long, J., 1983. Design practice and interface usability. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 40–44.

Heinbokel, T., Sonnentag, S., Frese, M., Stolte, W., Brodbeck, F., 1996. Don't underestimate the problems of user centredness in software development projects – there are many! Behaviour and Information Technology 15 (4), 226–236.

Hertzum, M., Jacobsen, N.E., 2001. The evaluator effect: a chilling fact about usability evaluation methods. International Journal of Human–Computer Interaction 13 (4), 421–443.

John, B., 1998. On our case study of claims analysis and other usability evaluation methods. Behaviour and Information Technology 17 (4), 244–246.

Komlodi, A., Marchionini, G., 1998. Key frame preview techniques for video browsing. In: Proc. ACM Digital Libraries 1998, pp. 118–125.

Kyrillidou, M., Giersch, S., 2005. Developing the DigiQUAL protocol for digital library evaluation. In: Proc. 5th ACM/IEEE-CS Joint Conference on Digital Libraries, pp. 172–173.

MacLean, A., Young, R.M., Bellotti, V., Moran, T., 1991. Questions, options, and criteria: elements of design space analysis. Human–Computer Interaction 6 (3–4), 201–250. In: Carroll, J.M., Moran, T.P. (Eds.), Special Issue on Design Rationale.

Maiden, N., Robertson, S., 2005. Developing use cases and scenarios in the requirements process. In: Proc. ICSE 2005, pp. 561–570.

Newell, A., Card, S.K., 1985. The prospects for psychological science in human–computer interaction. Human–Computer Interaction 1, 209–242.

Nielsen, J., 1994. Heuristic evaluation. In: Nielsen, J., Mack, R. (Eds.), Usability Inspection Methods. John Wiley, New York, pp. 25–62.

O'Neill, E., 1998. User–developer cooperation in software development: building common ground and usable systems. Ph.D. Thesis. Queen Mary and Westfield College, University of London.

Paddison, C., Englefield, P., 2003. Applying heuristics to perform a rigorous accessibility inspection in a commercial context. In: Proc. CUU 2003, pp. 126–133.

Papadakis, I., Andreou, I., Chrissikopoulos, V., 2002. Interactive search results. In: Agosti, M., Thanos, C. (Eds.), Proc. ECDL 2002. Springer LNCS 2458, pp. 448–462.

Rosson, M.B., Carroll, J.M., 2002. Usability Engineering. Morgan Kaufman, San Francisco.

Rosson, M., Maass, S., Kellogg, W., 1988. The designer as user: building requirements for the design tools from design practice. Communications of the ACM 31 (11), 1288–1298.

Spasser, M., 2003. The Flora of North America Project: making the case [study] for social realist theory. In: Bishop, A.P., Van House, N.A., Buttenfield, B.P. (Eds.), Digital Library Use. MIT Press, Cambridge, MA.

Spencer, R., 2000. The streamlined cognitive walkthrough method, working around social constraints encountered in a software development company. In: Proc. CHI'2000, pp. 353–359.

Sutcliffe, A.G., Carroll, J.M., 1999. Designing claims for reuse in interactive systems design. International Journal of Human–Computer Studies 50, 213–241.

Vredenburg, K., Mao, J.-Y., Smith, P., Carey, T., 2002. A survey of user-centred design practice. In: Proc. ACM CHI 2002, pp. 471–478.

Wharton, C., Rieman, J., Lewis, C., Polson, P., 1994. The cognitive walkthrough method: a practitioner's guide. In: Nielsen, J., Mack, R. (Eds.), Usability Inspection Methods. John Wiley, New York, pp. 105–140.

Witten, I.H., Bainbridge, D., Boddie, S.J., 2001. Greenstone: open-source digital library software with end-user collection building. Online Information Review 25 (5), 288–298.

Young, R.M., Barnard, P.J., 1987. The use of scenarios in human–computer interaction: turbocharging the tortoise of cumulative science. In: Carroll, J.M., Tanner, P.P. (Eds.), Proc. CHI + GI '87 – Human Factors in Computing Systems and Graphics Interface. ACM, New York, pp. 291–296.