electronic collection management: how statistics can, and can't, help
DESCRIPTION
Presentation delivered at the ASLIB Engineering & Technology Group and the Aerospace & Defence Librarians Group event titled "Surviving the recession: maximising your value", held at Imperial College on 15 November 2011.
TRANSCRIPT
Surviving the recession: maximising your value
ASLIB Engineering & Technology Group
Aerospace & Defence Librarians Group
Electronic Collection Management: How statistics can, and can't, help.
John Harrington, Head of Information Services
Selena Killick, Library Quality Officer
Introduction
• Institutional, financial and strategic context
• Previous methods used to review journals collections
• Role of qualitative and quantitative measures
• What these measures can and cannot tell us
Cranfield University
• The UK's only wholly postgraduate university, focused on science, technology, engineering and management
• One of the UK's top five research-intensive universities
• Annual turnover £150m
• 40% of our students study whilst in employment
• We deliver the UK Ministry of Defence's largest educational contract
Key Drivers
• Financial realities
• Demonstrating value for money
• Strategic alignment
• Research Excellence Framework (REF)
• Income
• Reputation
Mission critical
Expenditure on Journals
[Chart: journal spend by year, 2006-07 to 2009-10]
Expenditure on Resources
[Pie chart: Cranfield University information provision expenditure by format, 2009-10. Total Journals 58%; Other databases 29%; Books inc. special collections 8%; e-Books 4%; Other digital documents 0%]
How do we demonstrate that the collection is meeting the needs of the University?
Previous Techniques Used:
Annual journals review using the following data:
• Circulation figures – issues and renewals
• “Sweep survey” to capture in-house use
• Journal contents page requests
• Download figures
• Journal prices vs. the cost of ILL requests
More recent focus on “cost per download”
New Approach
Quantitative:
• Size
• Usage
• Coverage
• Value for Money
Qualitative:
• Academic Liaison
• Reading Lists Review
• REF Preferred
Quantitative Reporting
• Systematic
• Sustainable
• Internal benchmarking
• Elevator pitch
• So what?
• Enable informed decision making
• Demonstrate smart procurement
Brought to you by the letters…
&
Our Approach
• What has everyone else done?
• Analysing Publisher Deals Project
• Storage centre
• Excel training
• Template design
Basic Metrics
• Number of titles within a package
• Total annual full-text downloads
• Cost:
• Core titles
• e-Access Fee
• Total costs
Value Metrics
• Average number of requests per title
• Average cost per title
• Total cost as % of information provision expenditure
• Cost per full-text download
• Average download per FTE student/staff/total
• Average cost per FTE student/staff/total
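The value metrics listed above are all simple ratios over a package's cost, downloads, title count and user population. As a minimal sketch, the calculation might look like the following; all figures are invented for illustration and are not Cranfield data.

```python
# Illustrative calculation of per-title, per-download and per-FTE value
# metrics for a journal package. Input figures are made up for the example.

def value_metrics(total_cost, downloads, titles, fte):
    """Return the value metrics for one publisher package."""
    return {
        "avg_requests_per_title": downloads / titles,
        "avg_cost_per_title": total_cost / titles,
        "cost_per_download": total_cost / downloads,
        "avg_downloads_per_fte": downloads / fte,
        "avg_cost_per_fte": total_cost / fte,
    }

# Hypothetical package: £100,000 total cost, 250,000 full-text downloads,
# 2,000 titles, 4,000 FTE users.
metrics = value_metrics(total_cost=100_000, downloads=250_000,
                        titles=2_000, fte=4_000)
print(metrics["cost_per_download"])  # prints 0.4
```

Splitting the FTE figure into student/staff/total populations, as the slide suggests, is just a matter of calling the same function with each denominator.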
The Long Tail
[Charts: downloads plotted against titles for three packages, showing a long tail, a short tail, and no tail]
Subscribed Titles
• Reviewing performance of core collection
• REF Preferred?
• Popular?
• Three year trends in cost / downloads / CPD
• Cost / Downloads / CPD categorised:
• Zero
• Low
• Medium
• High
• Cancel?
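The Zero/Low/Medium/High banding above can be sketched as a simple threshold function. The thresholds used here (under 50 downloads for Low, under 500 for Medium) are assumptions for illustration; as the Considerations slide notes, what actually constitutes Low, Medium or High is an open question for each library.

```python
# Hypothetical banding of subscribed titles by annual full-text downloads.
# The 50/500 cut-offs are invented for this sketch, not taken from the talk.

def band(downloads, low=50, medium=500):
    """Place a title in a usage band by its annual download count."""
    if downloads == 0:
        return "Zero"
    if downloads < low:
        return "Low"
    if downloads < medium:
        return "Medium"
    return "High"

# Example titles (names and counts are made up).
titles = {"Journal A": 0, "Journal B": 12, "Journal C": 340, "Journal D": 2100}
for name, downloads in titles.items():
    print(name, band(downloads))
```

The same banding can be applied to cost and cost-per-download, giving the three-way Zero/Low/Medium/High grid the slide describes for cancellation decisions.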
Popular Titles
• Which titles are the most popular?
• Top 30 titles in the package
• Three year trends in downloads
• REF Preferred?
• Subscribed title?
Considerations
• When to measure from/to? Calendar, financial/academic, or contract year?
• Which titles make up our core collection?
• Do we have access to all of the 'zero use' titles?
• What constitutes Low/Medium/High?
• What about the aggregator usage statistics?
• Do we trust the usage statistics?
• What is the size of the target population?
Electronic Collection Management: How statistics can, and can't, help.
Qualitative Measures
Academic Liaison
• Who's using it?
• Why?
• How?
• How valuable is it?
• What will be the impact if we cancel?
• Teaching?
• Research?
Quantitative on the Qualitative:
Analysis of the five REF Preferred recommended journals lists:
• Overlapping titles
• Unsubscribed titles
• Financial shortfall
• Current recommended subscribed titles
• Usage data
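The overlap analysis described above is essentially set arithmetic over the recommended lists and the library's holdings. A minimal sketch, with invented journal names and only two lists rather than the five in the talk:

```python
# Sketch of the REF-preferred list analysis: overlapping titles,
# unsubscribed titles, and the basis of the financial shortfall.
# All list contents and holdings below are invented examples.

ref_lists = {
    "School A": {"J. Fluid Mech.", "AIAA J.", "Nature"},
    "School B": {"Nature", "Science", "AIAA J."},
}
subscribed = {"Nature", "AIAA J."}  # hypothetical current holdings

all_recommended = set().union(*ref_lists.values())
overlapping = set.intersection(*ref_lists.values())  # on every list
unsubscribed = all_recommended - subscribed          # gaps to cost up
currently_held = all_recommended & subscribed        # check usage data here

print(sorted(unsubscribed))  # prints ['J. Fluid Mech.', 'Science']
```

Pricing the `unsubscribed` set gives the financial shortfall; joining the `currently_held` set to the download figures shows whether the recommended titles the library already pays for are actually used.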
Reading List Review
Qualitative analysis of course reading lists:
• What are our academics recommending?
• How often is it recommended?
• Are there alternatives?
Using the results
What they can do:
• Both qualitative and quantitative measures tell the story of the resource
• Aid decision making
• Justify procurement
• Safeguard budgets
What they can’t do:
Conclusions
Closing thoughts
• Is it worth investing in this?
• Qualitative & Quantitative
• Danger of relying on cost-per-download
Looking Ahead
• Review of all budgets
• All Resources
• Systems
• Staff
• Services
• Demonstrating Value and Impact
• Resources
• Services
Thank You
Selena Killick, Cranfield University
[email protected]
Tel: 01793 785561
John Harrington, Cranfield University
[email protected]
Tel: 01234 754477