Snowball Metrics as Standard Information Agreements - Anna Clements and Peter Darroch
TRANSCRIPT
Snowball Metrics and CASRAI Project: Global Standards for Institutional Benchmarking
Peter Darroch, Elsevier
Senior Product Manager, Research Metrics
Co-Chair, CASRAI-Snowball
Anna Clements, University of St Andrews, UK
Board member, euroCRIS
Chair, CASRAI-UK Data Management Plans
Co-Chair, CASRAI-Snowball
Member, ORCID Business Steering Group
Snowball Metrics addresses university-driven benchmarking
Snowball Metrics UK Project Partners
Universities need standard metrics to benchmark themselves and know their position relative to peers, so they can strategically align resources to their strengths and weaknesses
Snowball Metrics approach
Vision: Snowball Metrics enable benchmarking by driving quality and efficiency across higher education’s research and enterprise activities, regardless of system and supplier
• Bottom-up initiative: universities define and endorse metrics to generate a strategic dashboard; the community is their guardian
• Draw on all data: university, commercial and public
• Ensure that the metrics are system- and tool-agnostic
• Build on existing definitions and standards where possible and sensible
Main roles and responsibilities
• Everyone covers their own costs
• Universities
  – Agree the metrics to be endorsed as Snowball Metrics
  – Determine methodologies to generate the metrics in a commonly understood manner to enable benchmarking, regardless of systems
• Elsevier
  – Ensures that the methodologies are feasible
  – Distributes the outputs using global communications networks
  – Handles day-to-day project management of the global program
• Outside the remit of the Snowball Metrics program
  – Nature and quality of data sources used to generate Snowball Metrics
  – Provision of tools to enable generation and use
Snowball Metrics
Globalizing Snowball Metrics
US
• University of Michigan
• University of Minnesota
• Northwestern University
• University of Illinois at Urbana-Champaign
• Arizona State University
• MD Anderson Cancer Center
• Kansas State University
Australia / New Zealand
• University of Queensland
• University of Western Australia
• University of Auckland
• University of Wollongong
• University of Tasmania
• Massey University
• The University of Canberra
• Charles Darwin University
Interest and support from:
• Japan RU11 metrics group
• Association of Pacific Rim Universities (APRU)
• European Commission for H2020
• Fundação para a Ciência e a Tecnologia (FCT) in Portugal
The output of Snowball Metrics
www.snowballmetrics.com/metrics
“Recipes” – agreed and tested metric methodologies – are the output of Snowball Metrics
From the Statement of Intent:
• Agreed and tested methodologies… are and will continue to be shared free-of-charge
• None of the project partners will at any stage apply any charges for the methodologies
• Any organization can use these methodologies for their own purposes, public service or commercial
Statement of Intent available at http://www.snowballmetrics.com/wp-content/uploads/Snowball-Metrics-Letter-of-Intent.pdf
The Snowball Metrics Landscape
Denominators
Snowball Metrics are feasible
Note: this pilot was built for the UK project partners, and is not available more widely.
Metrics can be size-normalized
Metrics can be “sliced and diced”
Trusted comparison of metrics on a robust standard (comparing apples to apples)
Methods (recipes) are not proprietary. They are agnostic to systems or suppliers – anyone can use them for their own purposes
Ability to choose and control with whom one shares/benchmarks (the crossroad/traffic light model)
Ability to benchmark nationally and internationally
Benefits for universities
Key deliverables for 2015
• Recipe book – the final recipe book will be produced, completing the metrics matrix with the recipes for postgraduate education, which will then be shared with the community
• CASRAI profiles for all recipes
• The Snowball Metrics Exchange API will be completed
• euroCRIS: review/approval of Snowball CERIFication work
• Barcelona membership meeting: Nov 9-11, 2015
• Indicators & CERIF Task Groups
In more depth: Success Rates
• More competition for research funding
• Are we doing better, worse or about the same as our peer institutions?
• We want to compare apples to apples
• Agreement reached by the expert working group over 12 months
• What do we count? Record three states: success / pending / rejection. Use the requested price rather than the awarded value: pragmatic, since almost all institutions can provide this; recommend revisiting when there are better linkages between application and award systems
• When do we count it? Record against the year of award: rates could change retrospectively
• How do we cope with ‘no shows’? Agree a 12-month write-off across all funders
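As an illustrative sketch only (the record layout, field names, and the counting year for written-off applications are assumptions of this sketch, not part of the published recipe), the agreed rules above could be applied like this:

```python
from datetime import date, timedelta

def success_rate(applications, year, today):
    """Illustrative value-based success rate following the agreed rules:
    three states, requested price, 12-month 'no show' write-off.
    Field names ('status', 'requested', 'submitted', 'decided') are
    assumed for this sketch."""
    won = total = 0.0
    for app in applications:
        status, decided = app["status"], app["decided"]
        # 'No shows': treat as rejections 12 months after submission
        # (counting them against the write-off year is a sketch assumption)
        if status == "pending" and today - app["submitted"] > timedelta(days=365):
            status = "rejection"
            decided = app["submitted"] + timedelta(days=365)
        if status == "pending" or decided.year != year:
            continue  # still undecided, or counted in another year
        total += app["requested"]  # requested price, not awarded value
        if status == "success":
            won += app["requested"]
    return won / total if total else None
```

Because rates are recorded against the year of award, a late decision (or a write-off) can change an earlier year's rate retrospectively, exactly as the recipe warns.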
CASRAI Snowball Working Group
AIMS:
• Extend community participation
• Codify metrics in CASRAI dictionary as an international standard
• Extend community participation
  – Commercial: Thomson Reuters
  – Research organisations: US, Canada & UK
  – Funders: Wellcome Trust & MRC
  – Standards / existing data collections: euroCRIS, HESA, STAR Metrics
• Codify metrics in the CASRAI dictionary as an international exchange standard
  – Terms, objects and fields
  – Show “what’s under the hood” => adds value
  – Wider review circle => Snowball SG
Deliverables:
• Standard exchange agreement per metric [24]
• Update/extension of the CASRAI dictionary
• Agreed streamlined process for new metrics
“Under the hood”
Snowball metric, Institution, time period, funder type, number
To determine, for Income Volume: Snowball metric, Institution, time period, funder type, number
To calculate, need to know institutional information, such as:
• Institutional financial year
• Funding awards – identification and classification
  – HESA cost centre of Principal Investigator
• Funding organisations – identification and classification
  – HESA funder type
• Funding awards – relevant dates
  – Date spent
• Funding awards – relevant values
  – Amount spent
To determine, for Income Volume and Awards Volume: Snowball metric, Institution, time period, funder type, number
To calculate, need to know institutional information, such as:
• Institutional financial year
• Funding awards – identification and classification
  – HESA cost centre of Principal Investigator
  – Supplementary award
• Funding organisations – identification and classification
  – HESA funder type
• Funding awards – relevant dates
  – Date spent
  – Date entered into system
• Funding awards – relevant values
  – Amount spent
  – Amount awarded
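A minimal sketch of turning award records into the exchange tuple above. The field names and the calendar-year simplification are assumptions of this sketch; the recipe itself works on the institutional financial year and HESA funder-type classifications:

```python
from collections import defaultdict
from datetime import date

def awards_volume(awards, institution, year):
    """Illustrative aggregation into the exchange tuple:
    (Snowball metric, Institution, time period, funder type, number).
    Uses calendar year and made-up field names for simplicity."""
    totals = defaultdict(float)
    for a in awards:
        if a["date_entered"].year == year:  # date entered into system
            totals[a["funder_type"]] += a["amount_awarded"]
    return [
        {"metric": "Awards Volume", "institution": institution,
         "time_period": year, "funder_type": ft, "number": n}
        for ft, n in sorted(totals.items())
    ]
```

Because every party computes the same tuple from the same agreed inputs, the resulting numbers can be benchmarked across institutions regardless of which system produced them.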
Progress update:
• Metrics to be published to the dictionary in draft form mid-Nov
• Feedback from review group / Snowball Experts WG
• Snowball Steering Group – late Nov
• Dissemination and engagement plan
  – Webinar(s)
  – Workshop
2016: plus CERIF-XML, plus Snowball Metrics Exchange API
Streamlined process:
• Use a 3-month sprint model for new metrics
Peter Darroch
Elsevier, Senior Product Manager, Research Metrics
Anna Clements
University of St Andrews, UK
[email protected]
Twitter: @annakclements
Snowball Metrics http://www.snowballmetrics.com/