
Value, Impact and Performance: the SCONUL VAMP Programme

Stephen Town, Cranfield University

Open University

Wednesday 27th June, 2007

Summary

• An introduction to the issues

• The Value & Impact Measurement Program (“VAMP”)

• The VAMP Deliverables
  – A Community of Practice in Library Measurement
  – The "Performance Portal"

The University Context (from the Library Assessment Conference, Charlottesville, VA, September 2006)

Universities have two “bottom lines”

1. Financial (as in business)

2. Academic, largely through reputation in:
  • Research (the priority in "leading" Universities)
  • Teaching (& maybe Learning)

Library Pressures for Accountability

The need is therefore to demonstrate the Library's contribution in these two dimensions:

1. Financial, through “value for money” or related measures

2. Impact on research, teaching and learning

This also implies that “competitive” data will be highly valued

Cautions for measurement

The Aim & Role of Universities & their Libraries

• Research
  – ‘Mode 1’ Research & impact ‘transcendental’
  – ‘Mode 2’ Research & impact ‘instrumental’

• Reductionism
  – Value, Price & ‘Mandarinisation’ of research and its support
  – Libraries as symbols and ‘transcendent’ services

The SCONUL Experience

The SCONUL Working Group on Performance Improvement

• Ten years of “toolkit” development to assist in performance measurement and improvement

• The SCONUL ‘Top concern survey’ (2005) suggested an inability to prove value, impact or worth, leading to VAMP

Examples of tools developed 1

• Integration

• Efficiency & Comparability

Quality Assurance Guidelines

SCONUL Statistics & interactive service

HELMS national performance indicators

E-measures project

Benchmarking Manual

Examples of tools developed 2

• Satisfaction

• Impact

SCONUL Satisfaction Survey

SCONUL LibQUAL+ Consortium

LIRG/SCONUL Impact Initiative

Information Literacy Success Factors

VAMP Objectives

• New measurement instruments & frameworks to fill the missing areas

• A full coherent framework for performance, improvement and innovation

• Persuasive data for University Senior Managers, to prove value, impact, comparability, and worth

Missing methods?

• An impact tool or tools, for both teaching & learning and research (from the LIRG/SCONUL initiative?)

• A robust Value for Money/Economic Impact tool

• Staff measures
• Process & operational costing tools

Benefits?

1. Attainment & retention of Library institutional income

2. Proof of value and impact on education and research

3. Evidence of comparability with peer institutions
4. Justification of a continuing role for libraries and their staff
5. Meeting national costing requirements for separating spend on teaching and research

VAMP Project Structure

• Phase 1 (March–June 2006)
  – Critical review
  – SCONUL Members Survey
  – Gap analysis & synthesis
  – SCONUL Conference Workshops

• Phases 2 & 3 (July 2006 – June 2007)
  – Development of new measures & techniques
  – Review and re-branding of existing tools
  – Web site development
  – Dissemination & maintenance strategy

Critical Review Method

Review of:

• SCONUL initiated or promoted services
• Other UK & European initiatives
• Initiatives from other UK library sectors
• International initiatives

starting from the perspective of ‘The Effective Academic Library’, 1995

Review Findings

• The Impact Initiative work will be key, but needs to be solidified and embedded

• Eight JISC/EU projects in the last eight years relevant to the assessment of value and impact, many relating to e-resources

• Significant work in the USA, Australia and South Africa

Review Conclusions

• A vast amount of relevant work, but without wide take-up

• More critical analysis required of most products and tools

• Further development and simplification required to create credible and applicable instruments for SCONUL members

Member Survey Findings

• 38 respondents; 27% of population
• 70% had undertaken value or impact measurement
• Main rationales are advocacy, service improvement and comparison
• Half used in-house methodologies; half used standard techniques
• Main barrier is lack of tools, making time an issue
• Buy-in of stakeholders is an issue

Member Survey Conclusions

• There is a need to demonstrate value and that libraries make a difference

• Measurement needs to show ‘real’ value
• Need to link to University mission
• Libraries are, and intend to be, ahead of the game
• Impact may be difficult or impossible to measure
• All respondents welcomed the programme, and the prospect of an available toolkit

Synthesis

• Terminological confusion?
• Is ‘impact’ equal to ‘effect’ or to ‘outcome’?
• ‘Higher order effects’ and level
  – Individual, course, institutional, vocational, societal, national, international
• ‘Value’ and ‘impact’ are not a single item?
• ‘Value’, ‘adding value’, ‘value for money’, ‘cost-effectiveness’

Expert Comment

• Return to why?
  – Advocacy or management?
• Critical gap in measurement
  – Effect of the Library on educational attainment
  – Effect of the Library on research attainment
• Robust and simple tools
• Only a few existing tools are effective, so simplify and focus the range of offerings

SCONUL Conference Workshops

• Accountability to a variety of structures and individuals… therefore a range of approaches required

• SCONUL Statistics heavily used
• Directors want help with all VAMP lines
• Pedagogic “big project” needed?
• Re-engineer processes rather than measure!

Overall conclusions

• Wider sectoral involvement?
  – Health evidence-based methods
  – National MLA Measures
  – British Library Contingent Valuation

• Not only new measures… but also supporting & directing processes

Deliverable Plan 1

“Content” Products

2.1 Value & Impact Guidelines

2.1.1 Institutional Value (eg VFM & Economic Impact)

2.1.2 Impact on Teaching & Learning
2.1.3 Impact on Research

Deliverable Plan 2

“Content” Products

2.2 Staffing & Operational Measures Guidelines

2.2.1 Staff Costing
2.2.2 Staff Added Value measures
2.2.3 Other operational costing methods

2.3 Re-branding & packaging of existing tools

Deliverable Plan 3

“Process” Products

3.1 Web Site

3.2 Community of practice establishment

3.3 Maintenance & sustainability strategy

Progress on Content 1

2.1.1 Institutional Value (eg VFM & Economic Impact)

VFM tool in negotiation now (with 2.2)
Contingent Valuation methods available

2.1.2 Impact on Teaching & Learning
2.1.3 Impact on Research

Tool for both areas delivered (Information Management Associates)

Progress on Content 2

2.2.1 Staff Costing
2.2.2 Staff Added Value measures
2.2.3 Other operational costing methods

Costing & value method in negotiation
‘Transparency’ Meeting, May 2007

2.3 Re-branding & packaging of existing tools

Mainly included within the Web Site development

Progress on Process

3.1 Web Site

Portal to be launched in June 2007

3.2 Community of practice establishment

Invitations for beta testing of site May 2007

Communities of Practice

“groups of people who share a passion for something that they know how to do, and who interact regularly to learn how to do it better”

“coherence through mutual engagement”

Etienne Wenger, 1998 & 2002

[Site structure diagram] VAMP Home Page, linking to: Simple Introductions; Detailed Techniques; Techniques; Members’ Forum (Blog? Chat?); Techniques in Use (Wiki?); Community of Practice

The ‘Performance Portal’

• A Wiki of library performance measurement containing a number of ‘approaches’, each (hopefully) with:
  – A definition
  – A method or methods
  – Some experience of their use in libraries (or links to this)
  – The opportunity to discuss use

The Ontology of Performance

• ‘Frameworks’
• ‘Impact’
• ‘Quality’
• ‘Statistics’
• ‘Value’

Frameworks

Mounted

• The EFQM Excellence Model (European Foundation for Quality Management)

Desired

• Key Performance Indicators

• The Balanced Scorecard

• Critical Success Factors
• The Effective Academic Library

Impact

Mounted

• Impact tools

Desired

• Detailed UK experience from LIRG/SCONUL Initiatives

• Outcome based evaluation

• Information Literacy measurement

• More on research impact

Quality

Mounted

• Charter Mark
• Customer Surveys
  – LibQUAL+
  – SCONUL Survey
  – Priority Research

• Investors in People

Desired

• Benchmarking
• Quality Assurance
• ISO 9000s
• ‘Investors in People’ experience
• Opinion meters
• Quality Maturity Model

Statistics

Mounted

• SCONUL Statistics & interactive service

• HELMS statistics

Desired

• Institutional experience of using SCONUL statistics for local advocacy

• COUNTER
• E-resource tools

Value

Mounted

• Contingent valuation

Desired

• ‘Transparency’ costing
• Staff & process costing, value & contribution
• E-resource value tools

Discussion Tools

• An experiment in social networking & Web 2.0 technologies

Acknowledgments

• Angela Conyers, Evidence Base, UCE

• Claire Creaser & Suzanne Lockyer, LISU, Loughborough University

• Professor Peter Brophy, Manchester Metropolitan University

• The VAMP Subgroup of SCONUL WGPI: Maxine Melling, Philip Payne, Rupert Wood

• The Cranfield VAMP Team: Darien Rossiter, Michael Davis, Selena Lock, Heather Regan
