
Best Practice at the Frontiers of Program Evaluation

Irwin Feller, Senior Visiting Scientist

American Association for the Advancement of Science

New Frontiers in Evaluation
Vienna, Austria

April 24-25, 2006

Functions of a Keynote Presentation

• If a government or foundation official, announce a new initiative and promise funds

• If a distinguished honoree, summarize one’s life work and offer grand challenges about the future

• If an academic scholar, summarize a field’s existing state of knowledge

Keynoter Qualifications

• Not a government or foundation official; have no resources to offer

• Not a distinguished honoree; indeed am honored to be invited to participate in workshop

• No longer an academic scholar; rather, currently engaged in advisory, committee, and consulting activities in U.S. and international policy-making settings on how best to make scientific and technological choices

Subtext of Title

• Reports from the Frontier

• What Policy Makers Want from Evaluators

• Help in Finding My Way

Past Experiences

• (Economic) Evaluation of S&T Programs

• Extensive Use of Mainstream Techniques Embodied in Conference Papers

Recent Experiences

• National Research Council/NIH, Extramural Center Programs: Criteria for Initiation and Evaluation

• NRC/NIA: Assessing the Vitality of the Behavioral and Social Sciences

• National Science Board: Task Force on Transformative Research

• AAAS/Harvard: Assessing the Quality of Interdisciplinary Research

Observations on Best Practice

• Workshop papers represent best practice in evaluation of technology and science programs

• Papers primarily represent retrospective evaluations

• Improvements in best practice on retrospective evaluations loosely coupled to the frontiers of science and technology policy questions

Observations on the Frontier

The Saliency of Items within the Portfolio of Science and Technology Policy Questions Is Changing

Different Thrust to Questions

• Is it Working/Has it Worked?

• What Should We Do?

• How Should We Decide What to Do?

General and Specific Themes

• General: Selection of Scientific and Technological Priorities

• Specific: Treatment of Interdisciplinarity

Science and Technology Policy Decision Cycle

[Diagram: Initial Conditions → Annual Planning, Budgeting, & Assessment Cycle → Outyear Cycle (New Conditions); within the cycle, Prospective Science Decisions → Organizational Assignments → Activities → Outcomes → Retrospective Assessment]

Initial Conditions:

• Existing State of Science

• Mission Objectives

• National Priorities

• Organizational Capabilities

• Budget

The Science Manager’s Decision Cycle

[Time axis: T0, T1, T1…n]

(Implicit) Roles of Evaluation in Decision Cycle

• Evidence-based Decision Making

• Organizational Learning

• Utilization of Evaluation Findings

Uncertainty at the Frontiers of Knowledge

• Problematic Value of Retrospective Assessments

• Power of Accumulated Experience (“Everybody Knows”) as Decision Algorithm

Discontent with Existing Procedures

“Conservatism” of Peer Review Procedures, but (Variable) Resistance to Alternative Methodologies (Foresight/Bibliometrics)

International Dimensions of Issues: I

“With the European Research Council a new funding instrument is about to enter the European stage while still little is known about the interaction of different funding instruments, e.g., what a well-balanced funding mix actually means with regard to the strategic orientation of research policy and the role of evaluation within this endeavor”

Platform, Introductory Letter

International Dimensions of Issues: II

“Peer review already seems to have been pushed beyond its limits in its decision-making capacity, thus leaving room to review peer review and its roles in the science system once again”

Platform, Introductory Letter

Battles About Funding Priorities

“…last week, two dozen senior researchers met in a windowless Washington, D.C. conference room to try to avert what some fear could turn into a civil war among earth and space science disciplines scrambling for science’s decreasing share of the space agency’s budget”

Science, March 17, 2006

Criteria for Scientific Choice Redux

• Weinberg Criteria Revisited

• Internal/External Criteria

The Case of Interdisciplinarity

• “Interdisciplinary connections are absolutely fundamental. They are synapses in this new capability to look over and beyond the horizon. Interfaces of the sciences are where the excitement will be the most intense” (Colwell, 1998).

• “Virtually any meeting on the current state and future of science is leavened by obligatory statements about the importance of enabling researchers to work seamlessly across disciplinary boundaries and by solemn declarations that some of the most exciting problems in contemporary research span the disciplines.” (Metzger and Zare, 1999)

The Debate about Quality

• “Quality is the Watchword of Our Time” - M. Patton

• Debates about quality entail aesthetic as well as quantitative dimensions

Types and Levels of Decision-making

• Agency Level: Funding of interdisciplinarity (& centers) relative to discipline based research

• Proposal Selection: Competitive (Quality) Position of interdisciplinary proposals relative to discipline-based proposals

• Performance Evaluation: Output/Outcome of interdisciplinary research relative to discipline-based research

Proposed Changes in Treatment of Interdisciplinary Research

• Change composition of panels

• Change decision rules

• Establish separate funding pools

• Workshop on Quality Assessment in Interdisciplinary Research and Education, February 2006

Look Before You Leap

• Debated, uncertain, unknown effects of proposed changes

• True need for “evidence-based” decision-making

Implications for “Evaluation” at the Frontier

• Emphasis on prospective decisions calls attention to mechanisms and criteria

• More than a call for building in ex ante evaluations

• Contribute to (experimental) design of new decision making mechanisms

• Requires broader range of understanding and technical mastery than best practice evaluation techniques alone