Crafting Your Project’s Evaluation Plan
Presentation to Hands-on Proposal Development Workshop
June 18, 2009
Agenda
• Goals and Objectives
• Evaluation Definition and Types
• Writing Your Evaluation Plan
• Data Analysis and Reporting
• What Reviewers Look For
• Publicizing Your Results
Do not reproduce without permission
Goals and Objectives
Goals/Objectives
• The most important element of a successful program is the development of attainable goals and measurable objectives
– Guides program planning and design
– Communicates to stakeholders
– Enables evaluation
• Success is dependent upon realistic goals
Goals: Characteristics
• Describe the overall purpose of the program
• Describe broad outcomes and concepts (what we want to accomplish)
• Are expressed in general terms
Goals: Development Steps
• Research the topic (define needs)
• Involve stakeholders (gains commitment)
• Brainstorm goals
• Select the goals that have priority (decide on what matters)
• Limit the program to two to five goals (select realistic goals)
Goals: Samples
• The NLM databases will become an integral component of the institution’s public health department instruction
• The NLM databases will become a valuable public health resource for senior citizens in the community
• The project will identify the methods most effective in increasing the utilization of the NLM databases
Objectives
• Specifically state how the goals will be achieved
• Are measurable: Define what you want to see
• Encourage a consistent focus on program functions
Objectives Are Not… Tasks
• Conducting a training session is a task
– Poor objective: We will conduct a training session
• An effective objective defines intent
– Better objective: Faculty who attend the training session will create one or more activities to instruct students on the NLM database
How to be SMART
SMART Objectives
• Specific: Be precise about what you are going to achieve
• Measurable: Quantify the objectives
• Appropriate: Align with the needs of the target audience
• Realistic: Do you have the resources to make the objective happen?
• Time-Specific: State when you will achieve the objective
SMART: Specific Objectives
Specific: Be precise about what you are going to achieve
– Specify the target
– Specify the intended output
– One output per objective
– Avoid vague verbs (e.g. know, understand)
– Make sure the objective is linked to the goal
– Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
SMART: Measurable Objectives
Measurable: Quantify the objectives
– Use measures as indicators of program success
– If possible, establish a baseline (e.g. In January 2009, 5% of the public health majors utilized NLM resources in their final project)
– Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
SMART: Appropriate Objectives
Appropriate: Align with the needs of the target audience
– Meeting the objective will advance the goal
– Identify a specific target audience
– Be inclusive of diversity within your group
– Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
– Note: The “A” is sometimes called “Attainable” or “Achievable” in the literature.
SMART: Realistic Objectives
Realistic: Do you have the resources to make the objective happen?
– Are important to stakeholders
– Are adequately resourced
– Can be achieved
– Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
Take care with what you say you can do! Is it realistic for all (100%) students to utilize NLM resources in their final project?
SMART: Time-Specific Objectives
Time-Specific: State when you will achieve the objective
– Provide a timeframe indicating when the objective will be met
– Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
Goals and Objectives
Goal
Objective One
Objective Two
Objective Three
Maintain a clear connection between your goals and objectives. By maintaining this connection, you are articulating your theory of goal attainment.
Goals and Objectives
Final Note:
– The Goals and Objectives communicate your intended results
– Know your story. When stakeholders want to know what your program will do, connect all of your activities to your goals and objectives
– Have an elevator speech
SMART Tool
SMART Tool
Goal: The National Library of Medicine’s databases will become an integral component of the institution’s public health department instruction

Objective: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
Breakdown:
– Verb: utilize
– Metric: number
– Population: Public Health students
– Object: NLM resource
– Baseline Measure: --
– Goal Measure: all
– Timeframe: January 2010

Objective: By January 2010, at least 10 public health courses from a baseline of zero courses will apply an NLM database in course instruction
Breakdown:
– Verb: apply
– Metric: number
– Population: Public Health courses
– Object: NLM databases
– Baseline Measure: 0
– Goal Measure: 10
– Timeframe: January 2010
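A breakdown like the one above lends itself to a simple completeness check before an objective is finalized. The sketch below is illustrative only; the field names mirror the SMART Tool table and are not prescribed by any framework:

```python
# A minimal sketch: represent a SMART-objective breakdown as a dictionary
# and flag any component that is still missing. Field names mirror the
# table above; "--" marks a baseline that was never measured.

REQUIRED_FIELDS = ["verb", "metric", "population", "object",
                   "baseline_measure", "goal_measure", "timeframe"]

def missing_components(breakdown):
    """Return the SMART-table fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not breakdown.get(f)]

objective = {
    "verb": "apply",
    "metric": "number",
    "population": "Public Health courses",
    "object": "NLM databases",
    "baseline_measure": "0",
    "goal_measure": "10",
    "timeframe": "January 2010",
}

print(missing_components(objective))            # complete breakdown
print(missing_components({"verb": "utilize"}))  # most fields still missing
```

An empty result means every column of the breakdown table has an entry, which is exactly what the tool above is checking by eye.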
SMART Benefits and Costs
Benefits
• Facilitates communication with program stakeholders
• Informs on what data should be collected
• Enables effective program management
• Facilitates the linkage of activities and intended effects/goals
• Enables a focus on evaluation
– Process level (activities)
– Output level
– Outcome level
• Facilitates replication
Costs and Limitations
• Impression that creativity is limited
• Time-consuming
• GIGO (garbage in, garbage out)
• Encourages too great a focus on discrete measures
Comment on Metrics
• A well-written objective suggests the metric(s)
• Example:
– By January 2010, all students in the Public Health Class will complete a project that uses one or more NLM resources
• Metrics:
– Total number of students
– Total number that use an NLM resource in their project
• While this may appear obvious, this is an area where programs often fail.
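As a sketch of that point, once each student's resource use is actually recorded, the two metrics fall straight out of the data. The roster and field names below are hypothetical:

```python
# Illustrative only: computing the two metrics named above from a
# hypothetical class roster. The records and field names are invented.
students = [
    {"name": "A", "used_nlm_resource": True},
    {"name": "B", "used_nlm_resource": True},
    {"name": "C", "used_nlm_resource": False},
]

total_students = len(students)
total_using_nlm = sum(1 for s in students if s["used_nlm_resource"])

# The objective says "all students", so success means the two counts match.
objective_met = total_using_nlm == total_students
print(total_students, total_using_nlm, objective_met)
```

The failure mode the slide warns about is skipping the middle step: if nobody records which students used a resource, the metric cannot be computed at the end.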
Evaluation: Definitions and Types
Evaluation
What is it? Why do I HAVE to do it??
We will:
– Define Evaluation
– Introduce Evaluation Types
– Review Why Evaluation is Important
Evaluation
What is it?
Evaluation…
– Assesses program achievements or progress
– Enables a data-based judgment on program quality
– Enables the program to build on strengths and minimize challenges
Evaluation
What kinds are there?
– Monitoring: Keeping track of what the program is doing
– Formative: Progress towards meeting goals and objectives
– Summative: Documents what goals have been met
Evaluation
Why do I have to evaluate?
– Provides systematic information to continually improve your program
– When people ask if you have met your goals, you need to be able to say, “yes,” “no,” or “maybe.” Your answer should not be “I don’t know.”
Evaluation Types
1. Process Monitoring
2. Formative evaluation
3. Summative evaluation
• Outputs
• Outcomes
Process Monitoring
What are we doing?
– This evaluation type is a continuous activity
• Is the data being captured in an organized fashion?
• Is the program collecting the type of information that is needed for final reports and evaluations?
Tip: The answers should be at your fingertips!
Formative Evaluation
Are we making progress towards meeting our goals? Are mid-course corrections needed?
– Occurs during program operation
• Is the program maintaining a focus on its planned activities?
• Does the monitoring data give evidence that goals and objectives will be met?
• What steps need to be taken to continue progress, or are adjustments needed?
• If adjustments are needed, have all stakeholders been informed?
Summative Evaluation
What impact did we have? What were our outputs and outcomes?
– Occurs at conclusion of program
• What were our program outputs?
• What were our program outcomes?
• Did we achieve our goals and objectives?
• What improvements can be made to make the program stronger?
Summative Evaluation
What are outputs and outcomes?
– Outputs are the program results that can be quantified (e.g. the number of training sessions, the number of participants in the training sessions)
– Outcomes are the goals of the program that typically need a more in-depth evaluation (e.g. How do senior citizens utilize the NLM databases?)
Reminder: your program outputs and outcomes are specifically related to your program goals and objectives
Writing Your Evaluation Plan
Evaluation Plan
• Think about and plan for how you will do evaluation now.
• If you don’t think about evaluation now, you may miss important opportunities to collect data that could improve your program.
Evaluation Plan Components
Your evaluation plan should have these components:
– Evaluation Questions
– Methodology
• Process Monitoring: Outputs
• Formative Evaluation: Progress towards meeting outputs and outcomes
• Summative Evaluation: Outcomes
Evaluation Questions
Similar to how goals and objectives guide program development, Evaluation Questions guide program assessment and evaluation. We will review:
– Definition
– Development
Evaluation Questions
What are they?
– A good evaluation question specifically outlines what is being assessed and suggests the data needed
– Evaluation question types reflect the program stage (i.e. formative, summative)
• Formative: What are the best media channels to reach the target audience about NLM resources?
• Summative: How effective are select media channels in reaching our target audience?
Evaluation Questions
How do I develop evaluation questions?
– The first step is to refer back to the SMART objectives. If the objectives reflect what the program was trying to do, the evaluation should assess this
– The second step is to form the questions
• Objective: By January 2010, ten courses will utilize NLM resources in their classes
– Evaluation Question One: How many courses are utilizing NLM resources?
– Evaluation Question Two: What do faculty identify as factors in their decisions to utilize NLM resources?
– Evaluation Question Three: How do faculty utilize NLM resources?
– The third step is to re-evaluate the questions. Are these the kind of questions needed to inform on program success?
Evaluation Plan
I have evaluation questions… what do I do now?
– Develop a methodology
– What data and data sources will you need?
• Develop an evaluation crosswalk
• How will you analyze the data?
• Can you do it or do you need help?
– How will you report the results?
Evaluation Plan
Methodology
• Outline your steps by Evaluation Stage
– Process Monitoring: In the first month of the grant, an Excel spreadsheet will be created to track all project activities
– Summative Evaluation: In the first month of the grant, all surveys will be created and tested.
Evaluation Plan
• An evaluation crosswalk defines what data sources will be used to inform on the evaluation question.

Evaluation Question | Course Syllabi | Faculty Interviews | Faculty Survey | Student Surveys
How many courses are utilizing NLM resources? | √ | | √ |
What do faculty identify as factors in their decisions to utilize NLM resources? | | √ | √ |
How do faculty utilize NLM resources? | | √ | √ | √
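A crosswalk is just a mapping from questions to data sources, so it can be sketched in a few lines. The question-to-source pairings below are illustrative; inverting the mapping is a quick check that no instrument is collected without a purpose:

```python
# A minimal sketch of an evaluation crosswalk as a mapping from each
# evaluation question to its data sources (the pairings are illustrative).
crosswalk = {
    "How many courses are utilizing NLM resources?":
        ["Course Syllabi", "Faculty Survey"],
    "What do faculty identify as factors in their decisions to utilize NLM resources?":
        ["Faculty Interviews", "Faculty Survey"],
    "How do faculty utilize NLM resources?":
        ["Faculty Interviews", "Faculty Survey", "Student Surveys"],
}

# Invert the mapping to see which questions each instrument must serve.
instruments = {}
for question, sources in crosswalk.items():
    for source in sources:
        instruments.setdefault(source, []).append(question)

for source, questions in instruments.items():
    print(f"{source}: informs {len(questions)} question(s)")
```

An instrument that informs zero questions is a sign the data collection plan has drifted from the evaluation questions.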
Data Analysis and Reporting
Data Analysis
I have data… what do I do now?
We will review
– Data types
– Analysis
Data Analysis
Data Types: Quantitative Data
– Measurable and tangible
– Involves the counting of people, behaviors, conditions, or other events
– Enables the use of statistics to answer questions
Data Analysis
Steps
– Understand your data!
• Be able to explain what the numbers mean
– Organize the data
• Enter the data in a program like Excel or SPSS. (Learn how to use the pivot table function in Excel!)
– “Clean” the data
• Look at the data. Are there data entry mistakes? Does something look odd? Check and fix mistakes!
– Compile the data
• Summarize the data in tables or graphs
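The organize/clean/compile steps above can be sketched with nothing but the standard library (a spreadsheet tool would work equally well). The records and field names here are hypothetical:

```python
# A sketch of the clean/compile steps on hypothetical participation data.
from collections import Counter

raw_records = [
    {"role": "faculty", "sessions_attended": "2"},
    {"role": "student", "sessions_attended": "1"},
    {"role": "student", "sessions_attended": "-1"},   # data entry mistake
    {"role": "Student", "sessions_attended": "3"},    # inconsistent casing
]

# "Clean": normalize casing and drop impossible values.
clean = []
for r in raw_records:
    n = int(r["sessions_attended"])
    if n >= 0:
        clean.append({"role": r["role"].lower(), "sessions_attended": n})

# "Compile": summarize participation by role, as a pivot table would.
by_role = Counter(r["role"] for r in clean)
total_sessions = sum(r["sessions_attended"] for r in clean)
print(by_role, total_sessions)
```

The cleaning pass is where the "does something look odd?" question gets answered: a negative attendance count and a stray capital letter would both distort the summary if left in place.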
Data Analysis
Tips
– Make it simple! It is not effective when no one understands your results
– Increase the white space! Graphs and tables are effective communicators
– When the number of people is less than 30, report counts rather than percentages.
Data Analysis
Data Types: Qualitative Data
– Data is rich in detail and description
– Text or narrative format
– Examples: interviews, case studies, focus groups, or document review.
Data Analysis
Steps
– Organize the data
• Enter data into a program like Excel
– “Clean” the data
• Read the data. Correct data entry errors. (Caution: Do not change the wording of what was recorded)
– Label/Code the data
• Give each group of data a label/code. Iterative process to finalize the labels/codes
– Compile the data
• Group like labels/codes together to see what the data is telling you.
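The label/code and compile steps can be sketched the same way: tag each response with one or more codes, then group like codes so the qualitative data can also be quantified. The responses and codes below are invented for illustration:

```python
# A sketch of coding qualitative responses and grouping like codes.
from collections import defaultdict

coded_responses = [
    ("I didn't know the databases existed", ["awareness"]),
    ("No time to redesign my course", ["time"]),
    ("A colleague showed me PubMed", ["awareness", "peer influence"]),
]

grouped = defaultdict(list)
for response, codes in coded_responses:
    for code in codes:
        grouped[code].append(response)

# Compile: e.g. "awareness was raised in 2 responses".
for code, responses in sorted(grouped.items()):
    print(f"{code}: {len(responses)} response(s)")
```

Note the responses themselves are carried through unchanged, matching the caution above about not rewording what was recorded.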
Data Analysis
Tips
– Maintain objectivity. Recognize and limit your biases
– Where possible, quantify the qualitative data (e.g. 25 people said something was important)
– Visuals can help readers understand qualitative data
Data Analysis
Overall Tips
– Remember: The data is used to answer the evaluation questions
– Quantitative data and qualitative data can be used to support one another
– When possible, have others look at your analysis
Reporting
Tell your story!
– Report Sections
– Tips
– Understand Required Reporting
Reporting
Report Sections
– Program Background: What are the program goals and objectives?
– Program Activities: What did you do? Be specific. Include dates, number of activities, activity types, etc.
– Methodology: State how you are evaluating program effectiveness.
Reporting
Report Sections
– Results: List the results from your data collection instruments
– Analysis/Conclusions: What do the results mean? Was the program successful? Were there things that could have been improved?
– Next Steps: How will you use these results to keep the program growing?
Reporting
Tips
– You are doing important things. Do not brag, but do not minimize your accomplishments
– Be honest… if the data indicates something did not go well, state this
– All conclusions on strengths and weaknesses should be linked to the data
– Know your audience: customize your report for each audience
– Do not extrapolate past the data
Reporting
Required Reporting
– Funders often have reporting requirements
– Ask the funder to define the data that will be required
– Incorporate this data collection into your process monitoring
Sample Evaluation Plan
Sample Evaluation Plan
The project has three evaluation questions:
1. How many courses are utilizing NLM resources?
2. What do faculty identify as factors in their decisions to utilize NLM resources?
3. How do faculty utilize NLM resources?
The crosswalk below displays the data sources for each evaluation question.

Evaluation Question | Course Syllabi | Faculty Interviews | Faculty Survey | Student Surveys
How many courses are utilizing NLM resources? | √ | | √ |
What do faculty identify as factors in their decisions to utilize NLM resources? | | √ | √ |
How do faculty utilize NLM resources? | | √ | √ | √

The evaluation components are Process Monitoring, Formative Evaluation, and Summative Evaluation.
Sample Evaluation Plan
Process Monitoring
An Excel workbook will be created to track the following activities:
• Courses utilizing NLM resources by semester
• Training sessions conducted
• Number of participants per training session by type (faculty, students)
• How activities were evaluated
The Principal Investigator will be responsible for entering and maintaining the data. Tracking this information will provide the program managers with a comprehensive listing of program activities and participants. The Excel workbook will be created in the first month of the program.
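An activity log like the one described above can equally be kept as a plain CSV file. The sketch below uses the standard library and an in-memory buffer in place of a file on disk; the column names are hypothetical:

```python
# A sketch of the activity log described above, kept as CSV with the
# standard library instead of Excel. Column names are invented.
import csv
import io

FIELDS = ["date", "activity_type", "participants", "participant_type"]

log = io.StringIO()  # stands in for a file on disk
writer = csv.DictWriter(log, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({"date": "2009-09-10", "activity_type": "training session",
                 "participants": 12, "participant_type": "faculty"})
writer.writerow({"date": "2009-09-17", "activity_type": "training session",
                 "participants": 30, "participant_type": "students"})

# Reading it back answers the monitoring questions at any time.
rows = list(csv.DictReader(io.StringIO(log.getvalue())))
total_participants = sum(int(r["participants"]) for r in rows)
print(len(rows), total_participants)
```

Whatever the tool, the point is the same as in the plan: the log is started in month one and updated continuously, so the answers are always at hand.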
Formative Evaluation
To evaluate program status, the Principal Investigator will create a Project Activity Checklist that lists all activities. On a quarterly basis, each project team member will independently rate two areas for each activity. The first area is whether the activity contributes to obtaining the project goals. The second area is whether each activity is complete, on schedule, behind schedule, or not started. The Principal Investigator will organize all the responses of the team members. The team will review the results to determine if any interventions are needed to maintain progress towards the goals. A status report will be submitted to the program manager each quarter prior to the close of the grant.
Sample Evaluation Plan
Summative Evaluation
The summative evaluation will respond to each of the evaluation questions. The data sources and how they will be utilized are:
• Course Syllabi
• Faculty Interviews
• Faculty Survey
• Student Survey
A course syllabi checklist will be created to document whether the course includes NLM content. Syllabi will be checked at the beginning of the grant program and the end of the grant program to determine whether there has been an increase in NLM resource utilization. The faculty interview protocol, faculty survey, and student survey will be created in the first month of the grant program and submitted to the institution’s Institutional Review Board for approval. Data collection and analysis will occur in the final month of the grant period. Data analysis will inform on whether the project goals have been met. The final evaluation report will have these sections: Program Background, Program Activities, Project Description, Methodology, Results, Analysis/Conclusion, Next Steps.
What Reviewers Look For
Proposal Evaluation Plans
There have been many workshops on how to write evaluation plans for proposals.
– What do the reviewers see?
– What should the reviewers see?
What Reviewers See…
• “We will conduct surveys at the end of every training session”
• “We will conduct pre/post tests to…”
• “We will have a focus group…”
What Reviewers See…
Sounds good!!!
What’s the problem???
– The evaluation plan has no clear connection to the goals and objectives specified in the proposal.
• Writers often lay out plans to evaluate activities that are part of the program but not the overall program
What Reviewers Should See
• Evaluation Questions based on the project goals and objectives
• Defined metrics
• Defined data sources
A quality evaluation plan in a proposal describes how the overall success of the program will be determined.
Sustainability
– What do the reviewers see?
– What should the reviewers see?
What Reviewers See…
• “We will work to identify more funding”
What Reviewers Should See…
• A more in-depth understanding of sustainability
– What will be the lasting impacts of the program if it is successful?
Publicizing Your Results
Publishing Your Results
• Identify a publication that would have an interest in your study
• Have a full understanding of the publication’s submission guidelines (style, deadlines, etc.)
• PROOFREAD! You may have an outstanding study but your submission loses credibility if there are grammar or spelling errors
• Do not be discouraged by a “no”! Keep trying and listen to the reviewer comments. RESUBMIT!!
Sample Publication Guidelines
http://www.ehealthinternational.org/guidelines.htm
• AUTHOR'S GUIDELINES
Manuscripts should be submitted electronically to the Managing Editor, Hasan Sapci, M.D. to the following email address: [email protected]. The submission should include a cover letter to provide a very brief description of the topic of the paper together with an explanation that the manuscript is original and not submitted elsewhere.
• Manuscript Preparation
Each manuscript should have a title page including a short running title as well as a listing of all authors. The list of authors should include names, degrees and institutional affiliation, as well as a complete postal mailing address, fax, telephone, and e-mail address for each author. In addition, a corresponding author should sign the cover letter.
Each manuscript should have an abstract of not more than 250 words, without any citations or references. The abstract must include a statement of the problem addressed in the paper, the methodology used in the analysis, the main findings, and conclusions, as appropriate.
Manuscripts should be prepared in Microsoft Word or WordPerfect. Tables and Figures should be submitted separately, preferably in uncompressed TIFF, BMP or PNG format.
We recommend reading these articles about formatting and style:
- Uniform Requirements for Manuscripts Submitted to Biomedical Journals: Writing and Editing for Biomedical Publication http://www.icmje.org/index.html
- American Medical Association Manual of Style: A Guide for Authors and Editors. 9th ed. Baltimore, Md.: Williams & Wilkins, 1998.
Sample Publication Guidelines
http://www.ehealthinternational.org/guidelines.htm
• References
References are numbered consecutively in the text as superscripts, beginning with number 1. When the reference is at the end of a sentence, punctuation should precede the superscript. When in the middle of a sentence, superscripts are included in the text without punctuation. The list of references at the end of the manuscript should be numbered consecutively according to the order in which they appear in the text. Journal names must be abbreviated according to the style of Index Medicus.
Copyright
Authors are responsible for obtaining permission to use published material, including their own work. Permission must be provided in writing from the original copyright holder (typically the publisher, not the editor, except for unpublished material where the original author is the copyright holder).
Questions?
Barry Nagle
Director
Center for Assessment, Planning, and Accountability
United Negro College Fund Special Programs Corporation
2750 Prosperity Avenue, Suite 600
Fairfax, VA 22031
703-205-8139