
TRANSCRIPT

Bibliometrics meeting, Open University

5 March 2013

Dr Lisa Colledge

Snowball Metrics Program Director

[email protected]

Snowball Metrics

1

Snowball Metrics are…

• Endorsed by a group of distinguished UK universities to support their strategic decision making

• Tried and tested methodologies that are available free-of-charge to the higher education sector

• Built on absolutely clear, unambiguous definitions that enable apples-to-apples comparisons, so universities can benchmark against their peers and judge the excellence of their performance

Snowball Metrics are unique because:

• Universities drive this bottom-up

• An academia–industry collaboration

2

Snowball Metrics address shared needs

• Growing recognition of the value of data/metrics to inform and monitor research strategies

• Dissatisfaction with available tools: bespoke implementations, incompatibility of systems

• Frustration over the lack of a manageable set of standard metrics for sensible measurements

• Frequent similar data requests from external bodies looking at aspects of performance that are not necessarily of most interest to universities themselves

Background

Imperial College London and Elsevier conducted a joint study of English research information management, funded by JISC. Recommendations from the study:

• An agreed national framework for data and metrics standards is needed, and suppliers should participate in the development of these standards

• Universities need to benchmark to know their position relative to their peers, so they can strategically align resources to their strengths and weaknesses

• Universities and funders should work more collaboratively, and develop stronger relationships with suppliers

The REF alone is not a suitable tool for a university

4

CURRENT SITUATION

• REF/RAE provides a snapshot every 5-6 years
• Focused approach to measuring outputs and impacts
• Strategic allocation of researchers
• Changing methodologies

DESIRED SITUATION

• Current snapshots, at least every year
• Broad range of measures across research and enterprise
• Comparable allocation of researchers between universities
• Stable approach

Desired situation = vision for Snowball Metrics

Snowball Metrics drive quality and efficiency across higher education’s research and enterprise activities, regardless of system and supplier, since they

• Are the preferred standards used by research-intensive universities to view their own performance within a global context

• Encompass the scope of key research and enterprise activities of a research-intensive university

5

Snowball Metrics Project Partners

Main roles and responsibilities

• Everyone is responsible for covering their own costs

• University project partners
– Agree the metrics to be endorsed by Snowball
– Determine feasible methodologies to generate the metrics in a commonly understood manner

• Elsevier
– Ensure that the methodologies are feasible, prior to publication of the recipes, by building and hosting the Snowball Metrics Lab as a test environment
– Distribute the recipes using our communications networks
– Day-to-day project management of the global program

• Outside the remit of the Snowball Metrics program
– Nature and quality of data sources used to generate Snowball Metrics
– Provision of tools to enable the global sector to generate and use Snowball Metrics

6

Snowball Metrics Recipe Book

7

Agreed and tested methodologies for new Snowball Metrics, and versions of existing Snowball Metrics, are and will continue to be shared free-of-charge.

None of the project partners will at any stage apply any charges for the methodologies.

Any organisation can use these methodologies for their own purposes, public service or commercial.

(Extracts from Statement of intent, October 2012)

Elsevier’s approach

Any organisation can use the recipes to prepare the metrics in their own kitchen from their own ingredients free of charge.

If an organisation approaches Elsevier for help to implement and use the metrics, we will charge: that is eating at our restaurant.

The Lab tests metrics’ feasibility

8

Metrics can be size-normalised

9
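Size-normalisation divides a raw metric by a measure of institutional scale, so that institutions of different sizes can be compared on a like-for-like basis. A minimal Python sketch, using invented institution names and figures (the actual recipes define the exact numerators and denominators):

```python
# Hypothetical illustration of size-normalising a publication-count metric.
# All institution names, counts and FTE figures are invented.

publications = {"Uni A": 4200, "Uni B": 950, "Uni C": 1800}
academic_fte = {"Uni A": 2000, "Uni B": 400, "Uni C": 600}

def size_normalise(metric, denominator):
    """Divide each institution's metric value by its size denominator."""
    return {inst: metric[inst] / denominator[inst] for inst in metric}

per_fte = size_normalise(publications, academic_fte)
for inst, value in sorted(per_fte.items(), key=lambda kv: -kv[1]):
    print(f"{inst}: {value:.2f} publications per FTE")
```

On these invented figures the smallest institution comes out on top per FTE, which is exactly the effect normalisation is meant to surface.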

Metrics can be “sliced and diced”

10
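"Slicing and dicing" means aggregating the same underlying records along different denominator dimensions (institutional unit, subject area, funder type, and so on). A minimal sketch with invented award data:

```python
# Hypothetical illustration of "slicing and dicing" a research-award metric:
# the same records aggregated along different dimensions. All data invented.
from collections import defaultdict

# Each record: (institutional_unit, subject_area, funder_type, award_value_gbp)
awards = [
    ("Engineering", "Energy", "Research council", 1_200_000),
    ("Engineering", "Materials", "Industry", 300_000),
    ("Medicine", "Oncology", "Charity", 850_000),
    ("Medicine", "Energy", "Research council", 400_000),
]

def slice_by(records, dimension):
    """Total award value grouped by one dimension (0=unit, 1=subject, 2=funder)."""
    totals = defaultdict(int)
    for record in records:
        totals[record[dimension]] += record[3]
    return dict(totals)

print(slice_by(awards, 0))  # same data sliced by institutional unit
print(slice_by(awards, 2))  # ...and diced by funder type
```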

Viewing options…

11

Chart / table

Testing addressed feasibility issues

Issues encountered:
• Wide range of metrics
• Data availability across landscape
• Sensitivity of inputting data into a shared system
• Researcher-level data (Data Protection Act)
• Manual labour in data collection

How testing addressed them:
• Experts group formed to select and define phase 1 metrics – impactful, do-able, require data from 3 sources
• Data sharing agreement
• "Unlocking" model in the SM Lab
• Share metrics, not data
• Used only where needed
• Not revealed in metric granularity
• University, proprietary and third party data used in as close to native format as possible

Snowball Metrics are feasible

• Feasibility means that they are S(S)MART:
– Specific – not open to interpretation
– Scalable – can be generated across a whole university
– Manageable – data can be collected in an acceptable amount of time
– Agreed – project partners have agreed both metric and methodology
– Realistic – can be generated by multiple universities despite distinct systems
– Time-bound – can be updated regularly to ensure information currency

13

Metrics for 2013

• Aim is to publish Recipe Book v2 by end 2013
• It is anticipated that this will add to v1 by including:
– New "group 2" recipes covering additional areas of the Snowball Metrics Landscape…

14

Snowball Metrics landscape

Numerators, spanning Research (Inputs, Processes, Outcomes), Post-Graduate Education and Enterprise Activities:
• Research applications; Research awards; Research income
• Publications & citations; Collaboration (co-authorship); Impact / Esteem
• Post-graduate research; Post-graduate experience; Completion rates
• Industrial income and engagement; Contract turnaround times; Industry research income; Patenting; Licensing income; Spin-out generation / income

Denominators, used to "slice and dice" and to normalise for size: People, Organisations, Themes / Schemes. Dimensions include Researchers, Role, Institution, Institutional unit, External groupings, Funder type, Award type, and Subject area / keywords.

Metrics for 2013

• Aim is to publish Recipe Book v2 by end 2013
• It is anticipated that this will add to v1 by including:
– New "group 2" recipes covering additional areas of the Snowball Metrics Landscape…
– Adoption of existing standards: translation of "group 1" metrics into CERIF (Common European Research Information Format), a common language produced by euroCRIS that supports data sharing between different tools
– Enriched "group 1" recipes
• Metric update and data governance approaches
• National (non-UK) versions

16

Global vs national standards for benchmarking

Snowball Metrics start life with a national perspective, currently UK. The aim is to "promote" all aspects of Snowball Metrics as far as possible to a global standard.

17

[Diagram: overlapping circles for UK metrics, A.N.Other metrics and Elsewhere metrics; illustrative only, testing underway]

• Common core: where benchmarking against global peers can be conducted
• Shared features: where benchmarking between Elsewhere and A.N.Other, but not UK, can be conducted
• A national peculiarity can support benchmarking within Elsewhere, but not globally