Dr Vicky Jones, Senior Research Policy Adviser, HEFCE


Research Assessment and The Metric Tide

Vicky Jones

Senior Policy Adviser

Research impact: delivering excellence

5th July 2016

http://www.hefce.ac.uk/rsrch/metrics/

http://www.responsiblemetrics.org

“I can announce today that I have asked HEFCE to undertake a review of the role of metrics in research assessment and management. The review will consider the robustness of metrics across different disciplines and assess their potential contribution to the development of research excellence and impact…”

David Willetts, Minister for Universities & Science, Speech to UUK, 3 April 2014

REF 2014: Evaluation programme

Evaluation activity

• Two-phase evaluation of impact

• Feedback from participating institutions

• REF panel feedback

• Review of costs, benefits and burden

• Multi- and inter-disciplinary research in the UK

• Equality and diversity analysis

Wider work relating to REF

• Analysis of impact case studies and database

• Independent Review of Metrics

• Open access

The Metric Tide

Headline findings

Across the research community, the description, production and consumption of ‘metrics’ remain contested and open to misunderstandings.

Peer review, despite its flaws and limitations, continues to command widespread support across disciplines. Metrics should support, not supplant, expert judgement.

Inappropriate indicators create perverse incentives. There is legitimate concern that some quantitative indicators can be gamed, or can lead to unintended consequences.

Correlation analysis of the REF2014 results at output-by-author level has shown that individual metrics cannot provide a like-for-like replacement for REF peer review.

Within the REF, it is not currently feasible to assess the quality of units of assessment (UOAs) using quantitative indicators alone, or to replace narrative impact case studies or the impact template.
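
As a purely illustrative sketch of the kind of output-level comparison behind these findings (invented data and a generic rank correlation, not the REF2014 dataset or HEFCE's actual methodology), the snippet below correlates a citation-based indicator against peer-review quality scores:

# Illustrative sketch only: hypothetical data, not HEFCE's analysis.
from scipy.stats import spearmanr

# Each pair is (citation count, peer-review quality score on the 1*-4* scale) for one output.
outputs = [(120, 4), (35, 3), (8, 2), (60, 4), (2, 1), (15, 3), (40, 2), (5, 3)]
citations = [c for c, _ in outputs]
peer_scores = [s for _, s in outputs]

# Spearman's rank correlation between the indicator and the peer-review scores.
rho, p_value = spearmanr(citations, peer_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")

Even a moderate positive correlation at this level would not make a single indicator a like-for-like substitute for expert review of individual outputs, which is the point of the finding above.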

Responsible metrics

Responsible metrics can be understood in terms of:

• Robustness: basing metrics on the best possible data in terms of accuracy and scope;

• Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;

• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;

• Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research & researcher career paths;

• Reflexivity: recognising the potential & systemic effects of indicators and updating them in response.

The Metric Tide

Recommendations

At an institutional level, HEI leaders should develop a clear statement of principles on their approach to research management and assessment, including the role of indicators.

Research managers and administrators should champion these principles and the use of responsible metrics within their institutions.

HR managers and recruitment or promotion panels in HEIs should be explicit about the criteria used for academic appointment and promotion decisions.

Individual researchers should be mindful of the limitations of particular indicators in the way they present their own CVs and evaluate the work of colleagues.

Like HEIs, research funders should develop their own context-specific principles for the use of quantitative indicators in research assessment and management.

Data providers, analysts & producers of university rankings and league tables should strive for greater transparency and interoperability between different measurement systems.

Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance.

The UK research system should take full advantage of ORCID as its preferred system of unique identifiers. ORCID iDs should be mandatory for all researchers in the next REF.
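
As a small illustration of what machine-readable researcher identifiers enable (a sketch only, not part of any REF specification), the snippet below validates the format of an ORCID iD and its ISO 7064 MOD 11-2 check character:

import re

def is_valid_orcid(orcid: str) -> bool:
    # ORCID iDs are 16 characters in four hyphenated blocks, ending in a
    # check character computed with the ISO 7064 MOD 11-2 algorithm.
    if not re.fullmatch(r"\d{4}-\d{4}-\d{4}-\d{3}[\dX]", orcid):
        return False
    digits = orcid.replace("-", "")
    total = 0
    for ch in digits[:-1]:              # all digits except the check character
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    expected = "X" if result == 10 else str(result)
    return digits[-1] == expected

# 0000-0002-1825-0097 is the sample iD used in ORCID's own documentation.
print(is_valid_orcid("0000-0002-1825-0097"))   # True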

Further investment in research information infrastructure is required to improve the interoperability of research management systems.

HEFCE, funders, HEIs and Jisc should explore how to leverage data held in existing platforms to support the REF process, and vice versa.

For the next REF cycle, in assessing outputs, we recommend that quantitative data, particularly around published outputs, continue to have a place in informing peer review judgements of research quality.

In assessing the research environment, there is scope for enhancing the use of quantitative data.

In assessing impact, we recommend that HEFCE builds on the analysis of the impact case studies from REF2014 to develop clear guidelines for the use of quantitative indicators in future impact case studies.

What have we learned?

• Preparing for impact has provided benefits and strategic insight to universities

• The assessment of impact worked well, but there are areas for improvement

• Considerable and diverse impacts were submitted for assessment

• Impact derives from the integration of disciplinary knowledge

• The systematic collection of impact data has generated an important national asset, and provided new insight into the relationship between research and impact


• Impact is not confined to the place where research was carried out

• Research across the UK leads to impact in London

• There is some regional focus of impact

• Some regions are more locally focused in terms of impact than others

Original areas for consultation

• No evidence to suggest the removal of the impact element or a radical change to the approach…

• …but some areas for review/reform:

• Impact template

• Evidence and data requirements

• FTE thresholds

• Changes to the guidance

• Case study database is an important source of evidence


Thank you for listening

v.jones@hefce.ac.uk
