DESCRIPTION

"A survey of performance measurement and assessment practice in SCONUL member libraries" Delivered at the 8th Northumbria International Conference on Performance Measurement in Libraries and Information Services.

TRANSCRIPT

SPEC Kit 303 in the UK and Ireland:
a survey of performance measurement and assessment practice in SCONUL member libraries

Selena Killick, Tracey Stanley and J. Stephen Town

SPEC Kit 303 in the UK and Ireland

Summary

• Background, methodology and approach

• Findings
• Comparative observations
• Conclusions

Background, methodology and approach

Origins and process

• Web survey of performance measurement and assessment activities in academic and research libraries

• The ARL SPEC Kit 303 on Library Assessment was published in December 2007

• 60% response rate amongst 123 ARL libraries
• Conducted by Stephanie Wright and Linda White

Rationale for UK version

• The aim in both cases was to provide ‘an overview of precisely how library assessment activities are being implemented and developed’ within member libraries

• Assistance with best practice for developing performance measurement programmes

• Awareness of tools, techniques and structures
• Direct comparison between ARL and SCONUL libraries

Findings

Sample & characteristics

• 77 libraries (43% of SCONUL membership but 60% of University institutions)

• Majority engaged with PM from late ’80s onwards

• User surveys were first assessment activities in most cases

• Rationale was internal and user driven

PM Activities in use

Range of 3-19 of listed methods; median of 10; average of 10.6

Most used

• Statistics (96%)
• Suggestions (91%)
• Data mining (72%)
• Outcome evaluation (67%)
• Benchmarking (63%)
• KPIs (63%)

Least used

• Value/ROI assessment
• Impact assessment
• Balanced scorecard
• Physical orientation studies
• Mystery shopper studies

Functions assessed

Every one of 27 library functions reported as assessed by at least six respondents

• Enquiry services (92%)
• Electronic resources (92%)
• Circulation (89%)
• Acquisitions, ILL and Web site (all 84%)
• Information literacy and online catalogue (82%)

Organisation

• 1 respondent has a f/t coordinator
• 26% have p/t coordination
• 25% through Committees
• 9% within a specific department
• Majority of posts and committees created since 2000

Outcomes and improvements include …

• Opening hours most frequent improvement
• Web site
• IT facilities
• Reshelving processes
• E-resources
• Space
• Staff structure

Strategy and development

• 79% have strategic commitment to evaluation, and most have a plan

… but 51% have no particular training
• Further training needed on:
  – Data analysis tools (Atlas.ti)
  – Understanding survey techniques
  – Survey design methodology

Culture of assessment

• Results used to improve library (75%)
• Evaluation for service quality (69%)
• Assessment is a library priority (67%)

• Staff development is adequate (13%)
• Staff have necessary skills (26%)
• Staff accept responsibility (34%)

Comparative observations

ARL & SCONUL

ARL

• North America (US & Canada)
• Selective membership of large-scale research libraries
• 123 members
• Tradition of measurement

SCONUL

• The British Isles (UK & Ireland)
• Inclusive membership of all higher education institutional libraries
• 180 members (of which 129 are universities)
• Tradition of measurement

Basic comparisons

SPEC Kit 303

• 73 of 123 (60%)
• 99% active
• 91% customer driven
• 29% accreditation driven
• Majority survey first
• Improvement 76%
• No particular training 29%

UK & Ireland

• 77 of 129 (60%)
• 100% active
• 84% customer driven
• 9% accreditation driven
• Majority survey first
• Improvement 75%
• No particular training 51%

Variation in methods

• User interface usability testing features strongly in ARL libraries, and is used frequently to test web sites (the most assessed area)

• Internally developed surveys used widely in the SCONUL sample, including for the web site

Organisation

• More full time coordinators in ARL (16% vs 1)

• More departments charged with assessment (13% vs 9%)

• Fewer part-time and ad hoc committees in ARL, although ad hoc teams are a feature in both contexts

Development and culture

• 71% support for training in ARL
• Strong senior management commitment in both, but not necessarily translating to the organisation as a whole in either

Conclusions

Speculative reasons for divergence

• Governance differences
  – UK public service context for advocacy and reporting

• Quality assurance pressures
  – NSS and other review pressures accentuate particular aspects of library performance (at the expense of others?)

• Technique availability
  – Variation between ARL and SCONUL initiatives and products

• Culture
  – Depth of ‘research’ and data reliance in the US?
  – Local creativity and pragmatism in the UK & Ireland?

Conclusions

• SPEC Kit approach was transferable to the UK & Irish context (and potentially beyond)

• Revealed details of performance measurement and evaluation in this context

• Provides a tool for international comparison
• ‘Assessment’ not recognised as a synonym for Performance Measurement, but this did not affect responses

Afterword

The richness of data and the range of activities described on both sides of the Atlantic demonstrate a very strong commitment in libraries to delivering value to their communities, through measurement and assessment, and an enthusiasm for using any techniques which will assist in the process of developing a customer-focused culture.
