
Evaluation Frameworks: Make the Most of Your Research

REUTERS/PAUL HANNA


Maintaining a research program is a complex and expensive undertaking. The administrators charged with overseeing such programs—whether in a university, government or commercial setting—are increasingly under pressure to document results and mark progress in specific, measurable terms. Therefore, rigorous and thorough evaluation is essential.

The prospect of evaluation, however, presents its own challenges. How can one determine the best approach and make certain that all the necessary data points are being collected? And will the results lend themselves to clear interpretation, helping to elucidate whatever changes in strategy, policy and practice are called for?

One proven answer to these questions is the application of a precise, fixed set of criteria and metrics to monitor and evaluate research: evaluation frameworks. These frameworks can help in formulating an approach to assessment, ensuring that all pertinent factors—research goals, inputs, outputs, desired outcomes—are considered. Frameworks can also clarify the steps necessary to adjust or redirect efforts in order to meet long-term objectives.

To be sure, careful measurement of research impact is a critical activity in gauging the status of a research program and in allocating and directing resources toward a specific end. And adherence to a clearly formulated evaluation framework is an essential element in the process.

Planning Your Evaluation Strategy to Improve Research Management

One key to obtaining useful insights into research management from an evaluation framework is to pose the right questions at the outset. For example, what are the precise objectives of conducting the evaluation? What information and specific data points need to be collected, and how? What are the quantifiable benchmarks or achievements, both in the short and long term, that will enable conclusions regarding whether the evaluation met the specified goals?

An evaluation framework might be applied to a range of situations and needs, such as advocating for increased support for research, demonstrating accountability by showing that resources have been used efficiently, deepening the understanding of the ways in which research is effective, or determining where and how to allocate funds in the future.

For example, in order to identify researchers especially deserving of support, administrators might work to identify the authors at their institution whose published work has the most influence in the scientific community. Adherence to a well-formulated framework, with citation impact chosen as the specific measurement, would lead to a resource such as the Web of Science™ and its publication and citation data. The metrics and tools found there provide detailed figures on an author’s output of published papers, as well as how often, collectively, those papers have been cited. Essential Science Indicators™, a component of the analytics and benchmarking resource InCites™, features citation-based rankings of individual scientists in 22 main fields, pointing to the authors who rank in the top 1% by citation impact.

Armed with this information, administrators have concrete figures on which individuals are contributing work of consistent interest to their scientific peers, and which subject fields are notable areas of strength. This forms a solid basis for apportioning support to these top producers—or, conversely, for pointing out underperforming areas in which the
administrators may choose to allocate resources in order to boost performance. In either case, the conclusions and resulting actions derive from a sound framework of inquiry and solid, supporting data.

In all, when properly formulated and applied, an evaluation framework can turn the abstract idea of “research excellence” into a detailed array of specific benchmarks and measurements by which to assess progress and chart future activities.

Accreditations and Funding

In addition to its utility in monitoring an institution’s research program for the purposes of management and administration, an evaluation framework can be invaluable in the process of accreditation. Tracking the scope and impact of research output—for individuals, departments and institutions as a whole—provides detailed documentation of an organization’s particular areas of emphasis and strength.

The precisely quantified data resulting from an evaluation framework also proves its value in securing and maintaining funding support. Demonstrating concentration and influence in a given field, or showing a clear progression of increasing strength over time, provides compelling evidence of research strength for funding agencies to take into account.

Using Frameworks to Benchmark with Peers

In organizations of nearly every description, from governmental to commercial to athletic to academic, an abiding question for administrators is, “How do we stack up against the competition?”

For institutions concerned with research, formalizing this question within an evaluation framework will help to crystallize specific points of comparison. For example, how does our overall output of published papers compare with that of peer institutions? In terms of citation impact, how does our influence rate against peers in specific fields, as well as against the world baseline figures in those specialty areas? What percentage of our research output ranks among the top world percentiles in impact, and how does this aspect of performance compare against other, pertinent institutions?

An evaluation framework allows these questions to be posed and—with the use of pertinent, comparative data, such as that available in InCites—answered with precision and thoroughness.

Implementing a Clear Framework

Evaluation frameworks come in many varieties. For example, a 2013 guide by S. Guthrie et al., prepared for the Association of American Medical Colleges and published by the RAND Corporation, reviews a few general approaches. These include quantitative approaches, which produce numerical output and do not require extensive judgment or interpretation, and formative approaches, which focus on learning and improvement across a wide variety of areas and do not produce comparisons between institutions.

Guthrie and colleagues mention 16 different evaluation frameworks, with six examined in detail. These include the Research Excellence Framework (see sidebar), designed to evaluate UK universities, including for impact that falls outside the realm of academia; Excellence in Research for Australia, which relies primarily on bibliometric measures concerning publication statistics; Productive Interactions, a framework developed in several European countries; and STAR METRICS, an initiative led by the US National Science Foundation and the US National Institutes of Health to evaluate the impact of federal investment in research and development.

These frameworks, and others, differ in various attributes, including whether they are formative or quantitative, the extent to which they are flexible for different applications, whether they can be scaled up for larger analyses without significant cost, and how burdensome the evaluative process is to participants and to the organization managing the evaluation.

As the 2013 report states, designing a research evaluation framework requires trade-offs, and there is no “silver bullet.” In other words, settling upon an appropriate framework will itself require research and deliberation.

Rankings: Research Strength, To Order

In virtually every sphere of activity, including scientific and scholarly research, performance-based lists and rankings command attention. In the case of research performance, however, rankings must be properly applied, particularly when using citation data from the Web of Science. A key guideline is to draw comparisons carefully, making sure that like is being compared with like.

Fields differ significantly in size and citation activity. Attempting to assess entities of different sizes in different fields—for example, by simply compiling citation totals or calculating citations per paper, whether for individuals or institutions—will not provide meaningful results. A large and prolific life-sciences field such as genetics tends to produce higher output and citation figures than does a physical-sciences discipline such as materials science, so a comparison by raw citation numbers between researchers in the respective fields would not impart useful insight. A well-designed ranking addresses entities whose scope and activities are comparable from the beginning.

InCites offers nuanced data, allowing users to select institutions to compare side by side via measures that control for varying size and output. These measures include comparing citation impact against a world baseline in a particular field, gauging the degree of research collaboration undertaken with other institutions, and determining what percentage of output ranks among a field’s most-cited papers. InCites also allows the user to select institutions for comparison through a customized search on specific desired attributes.
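The size-independent comparisons described above can be expressed as simple arithmetic. The sketch below is illustrative only: the field baselines, citation counts and highly-cited threshold are invented for the example, not actual Web of Science or InCites figures.

```python
# Two size-independent citation comparisons, with invented data.

# Hypothetical world baselines: average citations per paper in each field.
WORLD_BASELINE = {"genetics": 28.0, "materials science": 9.0}

def normalized_impact(cites_per_paper: float, field: str) -> float:
    """Citation rate divided by the field's world baseline; values
    above 1.0 indicate performance above the field average."""
    return cites_per_paper / WORLD_BASELINE[field]

def share_highly_cited(paper_cites: list[int], threshold: int) -> float:
    """Fraction of an entity's papers at or above a field's
    highly-cited threshold (itself derived from field baselines)."""
    return sum(c >= threshold for c in paper_cites) / len(paper_cites)

# Raw citation rates favor the larger, more citation-active field...
print(normalized_impact(42.0, "genetics"))           # -> 1.5
print(normalized_impact(18.0, "materials science"))  # -> 2.0
# ...but after normalization the materials scientist stands
# further above the baseline of his or her own field.

print(share_highly_cited([3, 12, 55, 40, 7], threshold=30))  # -> 0.4
```

This is why the raw-count comparison between the geneticist and the materials scientist misleads: only the baseline-relative figures compare like with like.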
Frameworks in Action

One useful tool for initiating and tracking an evaluation is a visual representation of the process. A review of monitoring and evaluation frameworks produced by the organization UN Women (for evaluating programs aimed at ending violence against women and girls) highlights three main types of these graphical approaches: conceptual frameworks, which identify and illustrate the various factors—organizational as well as individual—that might affect a program and the attainment of goals and objectives; results frameworks, which show the relationships between the intermediate results of activities all the way to the main goals; and the most widely used approach, logical frameworks, usually referred to as logic models.

Logic models provide a diagrammatic view of the inputs (that is, the steps and activities undertaken to address the pertinent situation or problem), outputs (intermediate products resulting from the initial changes, permitting assessment of progress) and outcomes (the expected or desired results). By clearly illustrating the resources necessary to achieve the objectives, and by representing the “if-then” relationships between the various elements, logic models provide a thorough guide for envisioning and monitoring an evaluation.

An advisable approach to evaluation is to begin with the ultimate goals or objectives clearly formulated and stated. This initial step will be conducive to asking the necessary questions and envisioning the steps that will mark progress and, ultimately, deliver the desired results.

For those seeking practical guidance, the end matter of the report by Guthrie and colleagues includes a logic tree for formulating an approach to a framework, along with detailed appendices that review several established models.

Whatever the situation or need faced by an institution as it assesses its progress and charts the way forward, finding and applying the proper evaluation framework will maximize the prospects for success.

Research Excellence Framework: The Future Standard?

In 2007, the Research Excellence Framework (REF) was adopted for higher-education institutions in the UK, replacing the previous framework, the Research Assessment Exercise. Stated goals for the new framework included lowering the administrative burden compared with the older system and an emphasis on equality and diversity, with the ultimate aim of fostering an internationally competitive UK research sector. Covering the period from 2008 to 2013, the initial REF assessment of 154 universities was published in December 2014.

The REF is divided into three main measures. “Outputs” account for 65% of the overall score and are defined as any form of research, including journal articles, monographs and books, along with other forms of creative output such as designs, performances or exhibitions. “Impact” accounts for 20% and is defined as any effect on, change in or benefit to the economy, culture, society, public policy and general quality of life beyond academia. Lastly, “Environment,” accounting for 15%, refers to the resources and infrastructure that support research, with supporting data that include the university’s income from research and the number of doctoral degrees granted.
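The REF weighting described in the sidebar reduces to a simple weighted sum. In the sketch below, only the 65/20/15 split comes from the text; the component scores and the 0-4 scale are invented for illustration.

```python
# Weighted REF-style overall score. The weights are those given in
# the sidebar; the example sub-profile scores (0-4) are invented.
REF_WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

def overall_score(profile: dict) -> float:
    """Weighted sum of the three REF component scores."""
    return sum(REF_WEIGHTS[part] * profile[part] for part in REF_WEIGHTS)

# A hypothetical submission: strong outputs, weaker environment.
submission = {"outputs": 3.4, "impact": 3.0, "environment": 2.6}
print(round(overall_score(submission), 2))  # -> 3.2
```

Because outputs carry 65% of the weight, a strong outputs profile dominates the overall score even when the environment component lags.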


Checklist for Starting Your Evaluation Framework

1. Begin with the end in mind, envisioning the desired goals or outcomes for addressing institutional performance

2. Construct a logic model, a graphical representation of the problem or situation, incorporating pertinent inputs and outputs and their alignment with the desired outcome

3. Identify the appropriate framework, whether primarily quantitative, i.e., producing numerical outputs, or formative, focused more on learning and improvement as opposed to assessment, or some combination

With these as your foundation, you’ll be well on your way to ensuring that your work achieves its desired impact.
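Step 2 of the checklist can be started as plain data before any diagramming tool is involved. The sketch below is a minimal, hypothetical logic model; every entry is invented, and the three stages follow the inputs/outputs/outcomes structure described earlier.

```python
# A minimal logic model as plain data; all entries are hypothetical.
logic_model = {
    "inputs": ["research staff", "grant funding", "lab infrastructure"],
    "outputs": ["peer-reviewed papers", "doctoral degrees granted"],
    "outcomes": ["citation impact above the field baseline",
                 "growth in external research income"],
}

def if_then_chain(model: dict) -> str:
    """Render the model's 'if-then' progression as a single line."""
    return " -> ".join(model)  # relies on insertion-ordered dicts

print(if_then_chain(logic_model))  # -> inputs -> outputs -> outcomes
```

Listing the stages explicitly in this way makes it easy to check that every desired outcome traces back to a concrete input before the model is drawn up graphically.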

Laura Gaze
Thomson Reuters
+1 203 868 [email protected]

Christopher King
Thomson Reuters
+1 215 823 [email protected]

Copyright © 2016 Thomson Reuters 2/2016