BUILDING EVALUATION CAPACITY
FINAL EVALUATION REPORT
Class of 2017
Submitted To: HARTFORD FOUNDATION FOR PUBLIC GIVING
NONPROFIT SUPPORT PROGRAM
Submitted By: Anita M. Baker, Ed. D.
August 2017
Evaluation Services, 101 E. Blair Tr, Lambertville, NJ 08530
"We have been able to make the entire organization more data driven. Where previously we were using mostly professional hunches to make decisions, now we have data to back up those suggestions, and as a leader I can require that others back up their suggestions with data before we invest resources. My Board has also adopted this mindset and is looking to our evaluation to guide future strategy as we adapt to a changing funding and growth landscape."
BEC Executive Leader, Class of 2017

"BEC helped give us the tools, time, and expertise to make evaluation a priority."
BEC Participant, Class of 2017
EXECUTIVE SUMMARY
The BEC Class of 2017
* Connecticut Association for Human Services
* Connecticut Radio Information System (CRIS)
* Foodshare
* Hartford Neighborhood Centers, Inc.
* Hartford Performs
* Hartford's Camp Courant
* Hockanum Valley Community Council, Inc.
* Latino Community Services
* North Central Regional Mental Health Board, Inc.
* The Discovery Center
* The Mark Twain House & Museum
* Unified Theater
* World Affairs Council of CT
* YWCA Hartford Region, Inc.

Class of 2017 BEC participants, like those from the Classes of 2008, 2010, 2013, and 2015, delivered different types of services and were broadly representative of Hartford-area nonprofit organizations.
EXECUTIVE SUMMARY, BEC 2016 - 2017
The Building Evaluation Capacity (BEC) program was initiated in the fall of 2006 by the Hartford Foundation for Public Giving's Nonprofit Support Program (NSP). It was designed to give participating organizations the knowledge, skills, and tools to evaluate, improve, and communicate about their work. The Class of 2017 is the fifth group of Hartford-area nonprofit organizations to participate.

BEC is a multi-year program that includes evaluation capacity development for selected organizations and ongoing study for participating organizations that have completed the initial evaluation capacity building work. The evaluation capacity building training operates in two phases (phase I = initial training and evaluation project design; phase II = project implementation and continued training). Each phase is designed to provide comprehensive, long-term training and coaching to increase both evaluation capacity and organization-wide use of evaluative thinking for participating organizations. The program, adapted from the similar Bruner Foundation-sponsored Rochester Effectiveness Partnership,* was developed and delivered by Anita Baker, Ed.D., an independent evaluation consultant who has led four other similar projects in Rochester, New York; New York City (2); and the Metrowest (Framingham) area of Boston, Massachusetts. From January 2016 through June 2017, BEC was delivered to representatives of 14 selected nonprofit organizations.

NSP initially undertook the development of BEC because evaluation was an area of organizational capacity NSP had not been addressing. Many organizations were requesting help with evaluation in response to requirements by their funders to collect data and answer outcomes-focused questions. It was felt that helping them not only to obtain better data, but also to use those data for decision-making, would benefit the organizations.
NSP elected to continue BEC for a fifth class (see next section) because the previous classes had been well received and participating organizations clearly benefitted from BEC.

__________________
* REP was a self-governing partnership of funders, nonprofit service provider organizations, and evaluation professionals committed to increasing knowledge and use of participatory program evaluation through comprehensive training and guided evaluation projects.
** Though Class of 2017 team sizes varied from two to four members, every participant organization included at least one senior official capable of decision-making (12 of the 14 organizations directly involved their Executive Directors in the training). The organizations also involved individuals from various positions (e.g., Director of Grants and Program Development, Director of Education) according to their own needs for training.
As with all the earlier BEC training classes, the phase I training period for the Class of 2017 participants included didactic training sessions with opportunities to practice and apply new skills. It culminated with the development, by each participating organization, of a rigorous evaluation design for a selected program of its own. In response to requests for increased flexibility, Executive Leaders were given the opportunity to opt out of the three sessions focused specifically on data collection, though most participated. Additionally, efforts were made at all of the initial sessions to encourage interactions between participating organizations.

Phase II focused on the implementation of the evaluation project and included five team consultation sessions and four group sessions with both review of phase I concepts and delivery of new, more advanced topics such as effective use of pre-/post-surveys, data visualization, and organization-wide integration of evaluative thinking. The ultimate outcomes for all BEC participants were enhanced knowledge about evaluation; enhanced skill to conduct evaluation and use evaluation findings (for decision-making and fund development); extension ("Ripple") of evaluation skills to other projects and personnel; and enhanced knowledge about and use of evaluative thinking in organizations.

By all accounts, the BEC program was very productive for the Class of 2017. Though there were multiple staff transitions, all 14 organizations successfully completed their training. Participants from all teams demonstrated they were learning about evaluation and developing evaluative capacity. Their feedback regarding BEC program design, content, and especially their own evaluation projects was very positive (32 of the 34 participants who received the survey responded, 94%).
Like their predecessors in prior classes, the Class of 2017 participants gained or honed numerous evaluation-related skills, such as the ability to ask clear evaluation questions, design and select data collection methods, and construct evaluation designs. They conducted their own evaluation projects, integrated evaluative thinking into their work, and initiated strategies to continue extending evaluation capacity throughout their organizations. Their evaluations included collection and analysis of data, summarization and presentation of findings, and development of proposed action steps. Additionally, nine of the Class of 2017 organizations opted to participate in the 2017-18 alumni group. Most will involve new participants from their organizations, and all will continue doing evaluation-related project work.

The evaluation work of the Class of 2017 participants was particularly useful and noteworthy. All project work had to conform to standard professional evaluation practice, and it clearly showed that BEC participants were able to apply what they learned. Project reports were presented at the final BEC conference to BEC organizations and the Hartford Foundation's NSP stakeholders. Those in attendance, including many senior Hartford Foundation staff, were once again consistently impressed with both the clarity and thoroughness of the efforts. Most importantly, all of the organizations obtained information through their evaluation projects that informed their ongoing work. All were able to identify program-specific action steps in direct response to their findings, and most had initiated at least some of those actions before their participation in the Class of 2017 ended. As they concluded their work, Class of 2017 participants assessed their own abilities to extend ("Ripple") the work beyond the class trainees, and they considered integration of
evaluative thinking at their organizations. All but one of the respondents to the final BEC evaluation survey (97%) indicated BEC had prepared them to extend their learning. Each organization indicated they had extended the training at least a little (half reported they had done so a lot), for example by involving others in the evaluation projects, presenting findings to board and staff, and using evaluation skills to address additional evaluation needs at their organizations (like revising survey instruments or writing evaluation design sections into new proposals). Additionally, on the final survey, 100% of the responding participants indicated that participating in BEC had enhanced evaluative thinking in their organizations, and they were able to provide specific examples to clarify and substantiate the changes in their use and understanding of evaluative thinking as it applies to multiple organization capacities.

Conclusion

The Class of 2017 reinforced that their BEC experiences had been important on multiple levels and accomplished what the program was designed to do. Specifically, all or almost all of the participants who responded to the survey indicated BEC was important because it helped them: improve the quality of the data they obtain; look at their programs from different perspectives; understand participatory evaluation; build evaluation into the program planning process; revise programs based on real data; increase capacity to analyze data about the outcomes they value; and incorporate evaluation practices into daily practice. Additionally, the one area that the Class of 2015 identified as insufficient (the attention to relationship-building and networking across participating agencies) was identified as important by all Class of 2017 respondents. Three-fourths of the respondents to the final survey, including one or more from every organization, indicated they are very likely to continue or expand their evaluation work.
Next Steps and Issues for Further Consideration

A new alumni group will be initiated for the nine Class of 2017 organizations that opted to participate (see the full report for details), and plans for a new class are under serious consideration. Continued vigilance will be necessary to ensure that Alumni Group participants get meaningful opportunities to analyze real data from their own organizations' programs, continue to successfully plan for and conduct evaluations, and integrate new staff into BEC. It will also be important to attract and inform a suitable new cohort of participants, to continue to use productive strategies for supporting their needs and interests, and to help them stay focused on development of evaluation capacity while also managing other organizational demands.

Both the alumni group and any future BEC classes will need assistance to handle the rigor required to fully analyze evaluation data, utilize newly available tools to collect, analyze, and present data, and summarize and use findings. NSP staff and consultants will need to continue striving to integrate technology/automation (such as mapping and hand-held electronic surveying) and use of analytical software where possible, and to continue supporting programs such as the Evaluation Roundtable and the Evaluation Capacity Grants to increase organizational connections and networking while remaining focused on ensuring that BEC increases evaluation capacity and enhances evaluative thinking for participating organizations.
TABLE OF CONTENTS

EXECUTIVE SUMMARY

I. INTRODUCTION
   BEC Design Overview 2016-2017 .......................... 2

II. BEC TRAINING AND EVALUATION COACHING
   BEC Class of 2017 Participants .......................... 4
   BEC Class of 2017 Training: Phase I .......................... 6
   BEC Class of 2017 Training and Evaluation Coaching: Phase II .......................... 8
   A Note about Attendance and Transition .......................... 8
   Participant Assessment of BEC .......................... 11
   Longer Term Importance of BEC .......................... 12
   Participants' BEC Experiences .......................... 13
   Comparative Assessment of the BEC Experience .......................... 14

III. BEC FINAL RESULTS, CLASS OF 2017
   Participants Developed Important Skills to Conduct Evaluations .......................... 16
   Participants Used the Skills they Acquired/Honed During BEC .......................... 18
   Participants are More Involved in Evaluation at their Organizations .......................... 19
   Participants Successfully Completed Evaluation Projects .......................... 19
   Participants' Projects were Comprehensive and Useful .......................... 27
   BEC Projects Informed Changes; Participants Used Their Findings .......................... 32
   BEC Participants Understand and Have Begun to "Ripple" .......................... 35
   Evaluative Thinking is Being Enhanced Through BEC .......................... 37

IV. NEXT STEPS
   Importance of BEC .......................... 39
   Future Classes and Alumni Study .......................... 39
   A Note About BEC Evaluation .......................... 41
   Conclusions and Issues for Further Consideration .......................... 41

APPENDIX
   Comparative Tables: Leaders v. Other Staff, A1 - A5
   Alumni Group Overview
   NSP Support, including findings from Spring Evaluation Roundtable, April 2017

List of Tables
   Table 1: BEC Class of 2017, Participating Organizations .......................... 4
   Table 2: BEC Class of 2017, Phase I Training Descriptions .......................... 7
   Table 3: BEC Class of 2017, Phase II Training Descriptions .......................... 9
   Table 4: Assessment of BEC Training Value .......................... 11
   Table 5: Percent of Trainees who Rated BEC Components/Features as Excellent, Over Years .......................... 12
   Table 6: Percent of BEC Participants who Reported BEC Elements were Important .......................... 13
   Table 7: Programs and Evaluation Questions Selected by BEC Class of 2017 Participants .......................... 21
   Table 8: Selected Evaluation Data Collection Methods: Class of 2017 Participants .......................... 26
   Table 9: Specific Examples of Program Changes Informed by Evaluation Results, BEC 2017 .......................... 32
   Table 10: Percent of Respondents Agreeing that Statements about BEC Importance were True .......................... 40
BEC Desired Outcomes
* Participants will develop skills to conduct comprehensive evaluations and use those evaluations in their regular work.
* Participants will complete full evaluations including data collection and analysis, summarization of results and reporting.
* Participants will communicate results of their studies to stakeholders and use their findings.
* Participants will extend their evaluation skills to others in their organization.
* Participants will increase their knowledge about and use of evaluative thinking.
I. INTRODUCTION
BEC was initiated in the fall of 2006 by the Hartford Foundation for Public Giving's Nonprofit Support Program. It was designed to give participating organizations the knowledge, skills, and tools to evaluate, improve, and communicate about their work (see desired outcomes in the box above). The Class of 2017 is the fifth cohort of Hartford-area nonprofits to participate. BEC is a multi-year program that includes evaluation capacity development for selected organizations (in this case, the Class of 2017) and opportunities for ongoing study for participating organizations that have completed the initial evaluation capacity building work (i.e., the BEC alumni group). The evaluation capacity building training operates over two phases (phase I = initial training and evaluation project design; phase II = project implementation and continued training). Each phase is designed to provide comprehensive, long-term training and coaching to increase both evaluation capacity and organization-wide use of evaluative thinking for participating organizations.
BEC was adapted from the Bruner Foundation-sponsored Rochester Effectiveness Partnership.1 It was developed by, and has since its inception been conducted by, Anita Baker, Ed.D., an independent evaluation consultant who has led other similar projects in Rochester, New York; New York City (2); and the Metrowest (Framingham) area of Boston, Massachusetts. From January 2016 through June 2017, BEC was delivered to representatives from 14 selected nonprofit organizations that are the focus of this report (see Section II for full descriptions of the Class of 2017).2
1 REP was a self-governing partnership of funders, nonprofit service provider organizations, and evaluation professionals committed to increasing knowledge and use of participatory program evaluation through comprehensive training and guided evaluation projects. Visit the Bruner Foundation at www.EvaluativeThinking.org for more details.

2 Note that during fall 2015, participants for the Class of 2017 were recruited and alumni study for the Class of 2015 was initiated. During winter/spring 2016, the Class of 2017 phase I training was conducted and the Class of 2015 alumni study concluded. Phase II training for the Class of 2017 was conducted summer 2016 through spring 2017. This sequencing has been used since the fall of 2013.
BEC curricula and training sessions were re-designed somewhat for the Class of 2013 and then used with minor updates and adjustments for both the Class of 2015 and the Class of 2017. The streamlined BEC phase I training curriculum focused particularly on evaluation design (see following), while the phase II training and consultation were presented with more emphasis on data analysis (see Evaluation Report 2012 for descriptions of the related initiatives, MWEI, Framingham, MA, and Anchoring Evaluative Capacity, Hartford, CT, that informed the curriculum changes).
BEC Project Design Overview 2016-2017*

BEC Phase I
* Attended six didactic sessions with practice application activities
* Attended one independent consultation session on evaluation design
* Completed assignments to demonstrate understanding of evaluation and evaluative thinking
* Developed rigorous evaluation designs for a program of choice
* Presented designs at final conference to peers and other stakeholders

BEC Phase II
* Attended four didactic sessions on advanced evaluation capacity development, specifically data analysis and evaluation reporting
* Actively participated in five customized individual technical assistance sessions focused on evaluation projects
* Completed analytical assignments, conducted evaluations of own design, and summarized findings into evaluation reports
* Participated in a critical read session, provided and received feedback from peers
* Presented results of evaluations at final conference to peers and other stakeholders

* Please note that the project design for BEC Class of 2017 was the same as the one used for the Class of 2013 and Class of 2015. Individual lessons were modified slightly to enhance the earlier focus on analysis and to integrate the use of electronic surveys and databases.
As with all of the earlier BEC training classes, the phase I training period for the Class of 2017 participants included didactic sessions with opportunities to practice and apply new skills (see details, Section II). It culminated with the development, by each participating organization, of a rigorous evaluation design for a selected program of its own. In response to requests for increased flexibility, Executive Directors were given the opportunity to opt out of the three sessions focused specifically on data collection, though most participated. Additionally, efforts were made at all of the initial sessions to encourage interactions between participating organizations. Training during phase II continued on a monthly basis, with four sessions held with all participants together and five sessions used for individual consultations regarding participants' evaluations, including their final reports (for additional details regarding the history of BEC, its purpose, and design considerations, please see the prior evaluation reports: Class of 2008, Class of 2010, Class of 2013, Class of 2015).
Throughout both the phase I training period and phase II for the Class of 2017, feedback was provided by participants regarding the program. Additionally, evidence of evaluation learning and evaluative thinking enhancement was collected. Specifically, the Class of 2017 participants completed brief assessment forms after every group session and responded to a comprehensive survey at the end of each program year.3 They also completed assignments to demonstrate their understanding of evaluation-related and evaluative thinking concepts and, most importantly, developed and conducted evaluation projects. This report presents a description of BEC 2016-17 and provides details about who participated, how and what training was provided, and what resulted for the Class of 2017.
Like their predecessors in earlier classes, participants of the BEC Class of 2017 achieved desired outcomes. Though there were many staff transitions, all 14 organizations successfully completed their training, and as shown in Section III they conducted their own evaluation projects, integrated evaluative thinking into their work, and initiated strategies to continue extending evaluation capacity throughout their organizations. Additionally, they agreed to serve as a resource, as needed, to other organizations in the Hartford community. Nine of the 14 participating Class of 2017 organizations plan to continue doing specific, guided evaluation work with the BEC evaluation consultant/trainer as part of the Alumni Group of 2017-18. Assuming sufficient community interest, a new Class of 2019 will be initiated in January 2018. Details about Class of 2017 training, participants, results, and issues for further consideration follow.
3 Please note, all quotations are responses of BEC Class of 2017 participants to open-ended questions on the BEC final survey.
"I understand the entire evaluation process so much better now. With this program, I became involved in structuring the evaluation instruments, the data collection plan, evaluation plan and report creation. We are making significant changes in the time frame for collecting our data, and will be automating the process going forward. I also have used the observation protocol we created to add an extra layer to our evaluation process."
BEC Executive Leader, Class of 2017
II. BEC TRAINING AND EVALUATION COACHING
As described in the previous section, the BEC program included both training and completion of coached evaluation projects for the Class of 2017. The Class of 2017 initial training (phase I) took place from January 2016 through June 2016 and concluded with the successful development of evaluation designs. During the second year of BEC (July 2016 – June 2017, phase II), all Class of 2017 organizations conducted the evaluations they had designed and developed evaluation reports. This section of the report provides details regarding implementation of BEC and feedback from participants.
BEC Class of 2017 Participants
A total of 14 organizations, the largest class to date, comprised the BEC Class of 2017. Table 1 identifies these organizations and provides some details about the BEC teams. As shown, this group of BEC participants, like those from prior classes, delivered different types of services and were broadly representative of Hartford-area nonprofit organizations.
Table 1: BEC Class of 2017, Participating Organizations

Organization | Primary Service Areas | Team Members
Connecticut Association for Human Services* | Anti-poverty human services training and assistance | Team (3): Executive Director; Program Director; Director of Community Research and Evaluation
Connecticut Radio Information System (CRIS)* | Audio access to those who are blind or print-challenged | Team (2): Executive Director; CRISKids Coordinator
Foodshare* | A food bank focused on getting food donations from the food industry and distributing them to food pantries, community kitchens, homeless shelters, and other partner programs | Team (5): President & CEO; Executive VP & COO; Grants and Program Impact Manager; Agency Services Manager; VISTA Volunteer
Hartford Neighborhood Centers, Inc.* | Basic human needs, workforce development, youth development, school readiness | Team (2): Executive Director; Accounting Clerk/Exec. Assistant
Hartford Performs* | Art education for public schools | Team (3): Executive Director; Programming Director; Grants and Ofc. Manager
Table 1 (continued): BEC Class of 2017, Participating Organizations

Organization | Primary Service Areas | Team Members
Hartford's Camp Courant* | Summer learning, youth development, recreation | Team (3): Executive Director; Dir. of Grants and Program Development; Volunteer
Hockanum Valley Community Council, Inc.* | Case management, behavioral health, transportation, food pantry | Team (3): Chief Executive Officer; Accounts Receivable/Data Management; Executive Assistant
Latino Community Services* | HIV/AIDS prevention and community services | Team (3): Executive Director; Deputy Director; Program Coordinator
North Central Regional Mental Health Board, Inc. | Behavioral health and community services, assistance to those in recovery | Team (4): Executive Director; Review & Evaluation Coordinator; Board Member; Board Member
The Discovery Center* | Diversity, equity and social justice for school children and communities | Team (4): Executive Director; Program Director; Consultant and Fellow
The Mark Twain House & Museum* | Museum and cultural programs | Team (2): Executive Director; Director of Education
Unified Theater | Performing arts, youth leadership | Team (3): CEO; Director of Programs; Director of Programs
World Affairs Council of CT | Global education and engagement | Team (3): Executive Director; Program & Membership Manager; Education Director
YWCA Hartford Region, Inc.* | Community services, youth development, housing, economic development | Team (4): Executive Director; Director of Youth Dev.; Grants and Contracts Administrator; YWLC Prg. Coordinator

* Organization team changed due to staff transitions, including Executive Directors at Foodshare, Hockanum Valley Community Council, and Mark Twain House & Museum, and other staff positions at noted organizations.
As in all the earlier BEC classes, senior-level officials (i.e., those with decision-making authority) from the selected organizations attended and fully participated in the Class of 2017 BEC training. They were specifically involved to increase the potential for both extending and sustaining evaluation capacity and evaluative thinking in the organizations. As shown in Table 1, the organizations chose teams of various sizes and involved individuals from various positions according to their own needs for training (e.g., Program and Membership Manager, Director of Education). Despite multiple transitions, including some at the executive level, all teams completed BEC 2017.
BEC Class of 2017 Training: Phase I
The BEC Class of 2017 used the content and delivery structure developed for the Class of 2013 and also used with the Class of 2015. A total of six sessions were conducted, each with multiple hands-on activities4 and homework assignments that built toward the final evaluation design that was the culminating project. Training for Class of 2017 members included six 3.5-hour sessions (three of which were optional for Executive Leaders), one 1-hour independent consultation session for each organization, and the final conference session (25 hours total). Each training session included some lecture-style presentation, opportunities for individuals and groups to try out new material and to work on applications for their own organizations, and time for teams to confer regarding their work, evaluation, and evaluative thinking. Specifically for the Class of 2017, there were also multiple and enhanced opportunities for individuals from different organizations to confer and share about their work. All sessions included homework that resulted in usable products (e.g., logic models, surveys, action plans) and, as stated above, components of evaluation designs. BEC participants were also exposed to the concept of evaluative thinking, and how organizations can enhance and sustain evaluative thinking and evaluation capacity. Topics covered at each session of the phase I training period for the Class of 2017 are shown in Table 2 (additional details about training topics, activities, and homework are available in the Evaluation Report 2012).
In addition to the training sessions, the evaluation consultant/trainer also provided individual technical assistance for all participants, as needed, via email, phone calls, or face-to-face or web-based meetings. This individual technical assistance was mostly conducted to help participants complete their homework or directly apply what they had learned in their own organizations (e.g., to revise an existing survey, assess existing data collection strategies, or review a logic model or an evaluation design being proposed for one of their programs). Additionally, five organizations requested other evaluation-related help (e.g., assistance with presentations for boards, advice regarding engagement with evaluation consultants, oversight/assistance for new staff doing data collection, assistance with design and completion of a large-scale evaluation).
4 Examples of activities include: program logic model development, trial development of e-surveys for laptops and handheld devices, analysis of quantitative survey data from 25 completed surveys using Excel and Survey Monkey; analysis of open-ended survey data; summarization of interview findings using completed interviews; level of effort predictions and workplan/timeline development using automated forms.
Table 2: BEC Class of 2017, Phase I Training Session Descriptions
Session | Date | Session Content
1 | 1/20/16 | EVALUATION BASICS: Intro/importance, terminology/background, context, evaluation questions, evaluation stakeholders, evaluation design, evaluation logic, evaluative thinking
2 | 2/10/16 | LOGIC MODELS AND EVALUATION LOGIC: Logic model overview, assessing logic models, outcomes, indicators, targets
3 [ED OPT] | 2/24/16 | DOCUMENTING IMPLEMENTATION, DATA COLLECTION OVERVIEW, SURVEY DEVELOPMENT: Documenting program strategies, data collection overview, introduction to surveys
4 [ED OPT] | 3/9/16 | SURVEYS AND RECORD REVIEWS, ANALYZING QUANTITATIVE DATA: Developing electronic surveys, using record reviews, basics of quantitative data analysis
5 [ED OPT] | 3/23/16 | OBSERVATIONS AND INTERVIEWS, ANALYZING QUALITATIVE DATA: Using observations to collect evaluation data, conducting interviews, analyzing qualitative data
Team Consult | 4/18, 4/19, 4/20
6 | 5/11/16 | PUTTING IT ALL TOGETHER, FINAL CONFERENCE PLANS: Developing level of effort and timeline summaries, budgeting and paying for evaluation, Introduction to Evaluative Thinking (part 2), planning for the final conference
Final Session | 6/8/16 | FINAL CONFERENCE: Development and presentation of final evaluation design project boards
Figure 1: Class of 2017 Phase I Work Session at the Lyceum
BEC Class of 2017 Training and Evaluation Coaching: Phase II

During phase II, there were a total of 10 sessions. As stated previously, this included five
individual consultation sessions, four group meetings for review and continued training and planning,
and the final conference session. As in the first phase, each of the group sessions included some
formal presentation and many hands-on activities, including opportunities for participants to present
their projects and findings, in particular, and to critically read other participants’ reports. Throughout
phase II, as in all prior classes, the participants worked on completing their own evaluation projects –
collecting and analyzing data according to their designs, summarizing their findings, and determining
action steps.
The topics and activities addressed during phase II are shown in Table 3. All sessions focused
on skills needed to complete the evaluation projects and reports and to prepare for further integration
of evaluation capacity at the participating organizations. Specifically, the sessions addressed data
analysis and reporting, including use of tables and figures, and development of full reports that
illustrated and discussed evaluation findings and planned actions. The sessions were also used to
revisit evaluative thinking concepts and to continue planning for integration of evaluative thinking and
extension of the evaluation training more broadly in the participating organizations.
The individual consultation sessions provided opportunities for each group of participants to
report on the status of its work and to get individualized attention regarding data analysis and
summarization and reporting. Every organization met at the Hartford Foundation for Public Giving for
five one-hour consultation sessions with the evaluation trainer for discussions focused on their work.
Participants set the agendas for the meetings so that they could practice actively engaging in
participatory evaluation, and the evaluation consultant/trainer helped to keep the focus on progress
according to participants’ designs. Additional sessions were conducted via phone or at the Foundation,
with any organization that requested additional help.
A Note about Attendance and Transition: Class of 2017
As for all prior classes, both group meetings and the individual consultation sessions were generally well attended and always included representatives from each BEC organization. There was, however, a considerable amount of transition on the Class of 2017 teams, as members retired or took other jobs, and this included changes in Executive leadership at three organizations. Based on input from prior CEO/Executive Director participants, attendance for Executive Leaders was optional for some Phase I sessions, but most elected to attend. Participation overall and support for evaluation work by leaders was very consistent.
Table 3: BEC Class of 2017, Phase II Training Session Descriptions
Session Date Session Content
TEAM CONSULT 9/14/16 OR 9/15/16 Initiating and implementing designs, evaluation reporting plans
1 10/19/16 Evaluation report development, survey and record review data analysis, effective use of pre-post surveys, introduction to graphics
2 11/16/16 Data visualization (developing tables and graphs for evaluation reports), Math for Evaluators
TEAM CONSULT 12/6/16 OR 12/7/16 Evaluation work, evaluative thinking actions, data collection and analysis
3 1/11/17 Using your findings, Enhancing Evaluative Thinking
TEAM CONSULT 2/14/17 OR 2/15/17 Evaluation work, evaluative thinking actions, data collection and analysis
TEAM CONSULT 3/14/17 OR 3/15/17 Evaluation work, evaluative thinking actions, data collection, analysis and report writing
4 4/5/17 Draft Reports Due – Peer Critical Read Re-thinking Proof and Attribution
TEAM CONSULT 5/16/17 OR 5/17/17 Final consultations on evaluation reports and evaluative thinking assessments
5 6/10/17 FINAL CONFERENCE Peer presentations of Evaluation Reports: Questions, Strategies, Findings
As phase II drew to a close, all Class of 2017 participating organizations developed and
submitted draft evaluation reports and prepared for and then participated in the final conference.
During the 8th session, participants read and commented on each other’s work. Following that, each
draft report was thoroughly reviewed by the evaluation consultant/trainer, and suggestions were
made to strengthen (and standardize5) the reports, if needed. During the final consultation session,
final revisions were made, and the participants and evaluation consultant/trainer reviewed how best
to present the work at the final conference.

5 As for all previous classes, BEC participants were all required to use a standard reporting format for their final documents. This included an introduction with evaluation questions, description of the methodologies used, summary and discussion of findings, and presentation of conclusions and suggested action steps. All were required to use either graphics or tables (or both) to summarize some of their findings.
Both phase I and phase II ended with final conferences. The phase I conference provided an
opportunity for all participants to present their planned designs. The phase II conference provided an
opportunity for all participants to showcase their evaluation projects, including findings and action
steps. At both conferences, participants displayed their work using tri-fold project presentation boards
and discussed their efforts with classmates and stakeholders from the Hartford Foundation and other
organizations.
Figure 2: Final Conference 2017
Training Innovation: The NSP Evaluation Roundtable
The NSP Evaluation Roundtable, initiated during Phase II of the Class of 2015, provided a new opportunity for area nonprofit organizations and evaluation professionals to meet, share and learn about evaluation practice. Roundtables are open to NSP BEC program participants and alumni and area evaluation professionals. To date there have been five Evaluation Roundtables covering topics including (1) data visualization, (2) conducting and learning from focus groups, (3) evaluating youth programs, (4) evaluation in arts and cultural organizations, and (5) inclusive evaluation, with Class of 2017 organizations presenting at both of the Roundtables in 2016-17. Participation in these sessions has included many BEC alumni and feedback has been consistently positive regarding their value.
Participant Assessment of BEC
On the final surveys administered at the end of each project year, responding participants6
provided summary ratings about BEC (see Table 4 and Figure 3). Overall, all respondents (100%)
indicated the training was worthwhile to them personally, and all the respondents indicated that BEC
was worthwhile to their organizations (including 78% who said it was Very Worthwhile). All
respondents described the evaluation consultant/trainer as Excellent (89%) or Very Good (11%), all
responding participants indicated the assistance they received from the evaluation consultant/trainer
to complete their final projects was Excellent (91%) or Very Good (9%), and all respondents indicated
the experience of completing a full evaluation project was Excellent (48%) or Very Good (52%).
Table 4: Assessment of BEC Training Value (n=32)

                        Personally    For the Organization
Somewhat Worthwhile          9%                0%
Worthwhile                  44%               22%
Very Worthwhile             44%               78%
In addition, all respondents reported they would recommend BEC to others if future classes are held,
and about one-third (34%) indicated they had done so already. As shown in Figure 3 (following), they
also rated key components/features of BEC very favorably. All but two respondents rated both phases
of training as Excellent (phase I 54%, phase II 53%) or Very Good (43% phase I, 41% phase II). As stated
previously, all responding participants rated the experience of completing a BEC evaluation project and
the assistance they received to complete their project as Excellent or Very Good. The proportions of
respondents providing high marks for BEC overall and the individual program features were similar to
those for the Class of 2015, but continued to be greater than those from all earlier classes, except the
Class of 2013 (see Table 5). These results provide ongoing evidence that earlier changes in curriculum
and other program strategies were effective. Additionally, modifications undertaken for the Class of
2017, specifically incorporation of additional interactive activities between organizations and
designation of optional sessions for Executive leaders, were described as important and beneficial.
6 A total of 34 Class of 2017 participants received the final survey, all but 2 participants answered (94% response rate).
Figure 3: Final Ratings for Key Components/Features of BEC, Class of 2017, n=32
Table 5: Percent of BEC Trainees who Rated BEC Components/Features as Excellent, Over Years
                                   Class of 2008   Class of 2010   Class of 2013   Class of 2015   Class of 2017
Phase I training                        39%             32%             59%             48%             54%
Phase II training                       38%             35%             65%             52%             53%
Experience Completing a Project         52%             48%             71%             64%             48%
Assistance to Complete Project          83%             65%             97%             85%             88%
Longer-Term Importance of BEC
BEC participants were also asked, on the final survey, about changes in their level of involvement
in evaluation. Fewer than 15 percent of the respondents indicated they had frequently been involved in
evaluations at their organizations before BEC. By the time BEC concluded, close to half (41%) reported
they were frequently involved in evaluation work at their organizations. Additionally, 75 percent of
respondents indicated their organizations were very likely to continue or expand evaluation work (the
rest thought their organizations were somewhat likely to do so).
[Figure 3 chart: ratings on a Not Good / Okay / Very Good / Excellent scale for Phase I Training, Phase II Training, Experience Completing a Project, Project Coaching/Assistance, BEC Trainer, and BEC Overall; the Excellent percentages shown are 54%, 53%, 48%, 88%, 91%, and 76%.]
I am more involved than I used to be. I am involved in making sure that evaluation is a core part of program design. BEC Executive Leader, Class of 2017
My supervisor’s position within the organization changed so that there is now more of a focus on evaluation and sharing information and we are therefore dedicating more effort to evaluating our programs. BEC armed us with appropriate tools to be able to move forward in this. BEC Participant, Class of 2017
Participants’ BEC Experiences
Through their responses shown in Table 6, and more importantly through their completion of
assignments and ultimately development of evaluation designs and completion of full projects, Class of
2017 participants acknowledged BEC’s importance. Table 6 further clarifies what was important to the
participants.
Table 6: Percent of BEC Participants Who Reported the Following Were Important About BEC (N=32)
BEC Feature | Somewhat Important | Very Important | TOTAL*
Requirement to design an actual evaluation for the selected program 0 100% 100%
Requirement to complete an actual evaluation for the selected program 0 100% 100%
Opportunities to learn about evaluation 6% 94% 100%
Feedback from the trainer regarding the evaluation project 6% 94% 100%
Opportunities for consultations from BEC evaluator/trainer 9% 91% 100%
Writing the evaluation report 16% 84% 100%
Opportunities to interact with colleagues in other organizations 31% 69% 100%
Opportunities to interact with peers within the organization 40% 53% 93%
Reviewing the work of BEC colleagues 45% 52% 95%
Opportunities to showcase evaluation work 48% 45% 93%
* Note: The difference between the total and 100% reflects those who indicated a BEC feature was not important.
All, or almost all, respondents indicated that all key aspects of BEC were Important, with most
respondents indicating they were Very Important. Specifically, 100 percent of respondents indicated
the requirements to design and complete an actual evaluation were Very Important. Additionally, 94
percent of the respondents indicated it was Very Important to have opportunities to learn about
evaluation, and to get feedback about projects from the BEC evaluation consultant/trainer. A total of
91 percent indicated it was Very Important to have opportunities for consultations from the trainer,
and 84% indicated it was Very Important to write an evaluation report. More than two-thirds of the
respondents (69% compared to only 44% for the Class of 2015) also identified the enhanced
opportunities to interact with colleagues from their peer organizations as Very Important. Interacting
with peers from their own organizations, reviewing the work of colleagues, and opportunities to
showcase evaluation work were also identified as important (about half or more of respondents
indicated those features were Very Important). More importantly, through the diligence with which
they undertook all BEC tasks during each training session and in preparation for the final conference,
they demonstrated their commitment to and acceptance of BEC. Like their predecessors, the Class of
2017 took on challenging projects with a range of complexity. They also used multiple strategies to
collect data, effectively used electronic data collection and analysis tools, and summarized their
findings in interesting and compelling ways.
Comparative Assessment of the BEC Experience
Selected final survey responses were also disaggregated by participant type to help answer
questions about whether and how Executive Leaders7 experience BEC differently from other staff. 8
These results provide limited additional guidance related to ongoing executive level participation.
Comparative results show the following (see also the appendix tables A1 – A5 for details):
• Leaders and other staff experienced and assessed some elements of BEC the same and some differently.
• About the same proportion of leaders (46%), as compared to other staff (42%), rated BEC as Very Worthwhile for themselves, but many more leaders rated BEC as Very Worthwhile for their organizations (100% of leaders compared to 63% of other staff).
• Proportionately fewer staff members rated the phase I or phase II training, the experience of completing a project, or the assistance to do so as Excellent; most staff rated all of the above as Very Good.
• Half of the key BEC features we asked about were identified as Very Important by equal proportions of leaders and other staff. This included the requirement to design an actual evaluation for the selected program (100% of both leaders and other staff); the requirement to complete an actual evaluation for the selected program (100% of both leaders and other staff); writing the evaluation report (85% leaders, 84% others); reviewing the work of BEC colleagues (54% of leaders, 50% others); and opportunities to showcase evaluation work (46% of leaders, 44% others).

7 Each BEC team has at least one senior official able to make decisions at the organizational level, and some teams have more than one senior manager (e.g., the Executive Director and the Deputy Director). Respondents were asked to identify their role as either senior leader, staff member, board member or “other.” Note that two organizations did not have multiple respondents because of staff transitions, and one of these did not have any senior leadership participating in BEC for most of the cycle, or represented on the survey.

8 The disaggregated data sets were very small so data were reviewed at the extremes (e.g., Very Worthwhile, instead of Very + Somewhat Worthwhile) where differences were more pronounced.
• The other BEC features were identified as Very Important by larger proportions of leaders. This included opportunities to learn about evaluation (100% of leaders, 89% others); feedback from the trainer regarding the evaluation project (100% of leaders, 89% others); opportunities for consultation from the trainer (100% of leaders, 84% others); opportunities to interact with colleagues in other organizations (77% of leaders, 63% others); and opportunities to interact with peers within the organization (69% of leaders, 41% others).
Both leaders and other staff, in roughly equivalent proportions, indicated that the following
statements about BEC importance were Very True.
• BEC helped improve the quality of data they obtain (92% of leaders, 89% of other staff) and BEC helped their organization understand why evaluation is valuable (77% of leaders, 71% of staff).
• BEC taught participants how to look at their programs from different perspectives (73% of leaders, 68% of other staff); BEC helped the participants better understand participatory evaluation (67% of leaders and 72% of other staff); and it strengthened relationships with their organizations (36% leaders, 39% staff).
With the exception of helping to incorporate evaluation into daily practice (42% of leaders, 53% of
other staff), proportionately more leaders than other staff indicated that all the other statements
about BEC importance were Very True.
• BEC helped participants build evaluation into the planning process (92% of leaders, but only 56% of other staff); BEC helped them revise their program based on real data (91% of leaders, but only 56% of other staff).
• BEC organizations now have increased capacity to measure the types of outcomes they value (77% of leaders, 50% of staff) and BEC taught them the importance of involving multiple stakeholders in the evaluation process (75% of leaders, 59% of other staff).
The variations described above confirm that leaders experience some elements of BEC differently,
which is expected given their different organizational responsibilities. The findings also confirm that
leaders obtain benefits for themselves and their staff/organization. Differences between staff and
leader experiences will continue to be reviewed for each new BEC class.
III. BEC FINAL RESULTS Class of 2017
This section of the report presents a summary of findings about the BEC Class of 2017. Findings
were compiled from the final participant survey that was administered in summer 2017, review of all
classwork/homework products, and assessment of the final evaluation designs and final reports. All
analyses were conducted by the BEC evaluation trainer using an analysis strategy jointly developed
with the Hartford Foundation’s Nonprofit Support Program, in concurrence with the analysis plan used
for the Class of 2008.9
By all accounts, as for prior classes, the BEC program was very productive for the Class of 2017.
In accordance with desired outcomes, participants demonstrated they learned about evaluation and
developed evaluation skills by successfully completing evaluation projects. They also reported using
their evaluative capacity beyond BEC projects. They gained or honed numerous evaluation-related
skills (such as asking clear evaluation questions, designing evaluation projects, selecting data collection
methods, collecting data, analyzing data and summarizing findings), and put each of these skills to use.
Additionally, every group assessed their agency’s levels of evaluative thinking and identified what was
needed to enhance evaluative thinking organization-wide. Also in accordance with desired outcomes,
they developed evaluation designs for selected programs, which they completed, and determined and
used findings that affected ongoing efforts. Details about skill development/learning, completion of
evaluation projects and reports, use of evaluation findings, efforts to extend (“Ripple”) BEC, and self-
reported assessments of BEC-inspired changes in evaluative thinking follow.
Participants Developed Important Skills to Conduct Evaluation
By the end of BEC phase II, all Class of 2017 participants developed skills to conduct
comprehensive evaluations. As demonstrated through their completed projects and their class
homework assignments they could:
Develop, assess and use logic models
Document program implementation/service delivery (summarize recruitment, retention, target population descriptions, develop basic information tracking strategies)
9 The BEC evaluation analysis plan was externally vetted (original materials are available for inspection upon request).
Design evaluations (clarify the purpose, specify questions, select data collection methods, specify timelines and levels of effort, estimate cost of evaluation)
Design surveys, identify/fix bad surveys, determine how many surveys are needed, develop survey administration plans including those using web-based platforms, and develop analysis plans for use with Survey Monkey and Excel
Design and conduct interviews, observations and record reviews and analyze resultant data
Write evaluation reports, use results to inform action steps, and present findings to stakeholders.
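One skill listed above is determining how many surveys are needed. The report does not say which method BEC taught; purely as an illustration, one common rule of thumb for survey sample size is Slovin's formula, n = N / (1 + N·e²), sketched here in Python (the population figure is hypothetical):

```python
import math

def sample_size(population, margin_of_error):
    """Slovin's formula: n = N / (1 + N * e^2).
    A rough rule of thumb for survey sample size, shown for
    illustration only -- not necessarily the method taught in BEC."""
    n = population / (1 + population * margin_of_error ** 2)
    return math.ceil(n)

# e.g., a program serving 200 clients, surveyed at a 5% margin of error
print(sample_size(200, 0.05))  # 134
```

Formulas like this assume simple random sampling; for the small, purposive samples typical of nonprofit program evaluations, they give only a starting point.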
While many participants came with knowledge/experience about various BEC topics, most also
enhanced and added to their knowledge through BEC. By June 2017, all engaged participants of the
Class of 2017 shared a common language about evaluation, and every group demonstrated
they could apply what they knew to the development of evaluation designs and completion of
evaluation projects. Most respondents were also clear that they were incorporating evaluation skills
into their everyday work, were providing enhanced evaluation capacity for their organizations, and
were beginning to extend (“Ripple”) these capacities to others in their organizations. Specifically:
• With the exception of logic models, very few (2 – 4) respondents indicated they already knew BEC topics when they started. All or almost all of those who had not previously learned about each of the basic evaluation topics covered in BEC indicated BEC had helped them to learn something about it (many respondents said BEC helped them learn a lot.)
• A total of 96% of those who did not know about developing logic models before BEC indicated BEC had helped them learn something (38%) or a lot (58%) about it. Additionally, 61% reported BEC helped them learn a lot about using Logic Models to inform evaluation.
• All respondents who did not know about specifying evaluation questions learned about it through BEC, and about 71% indicated BEC had helped them learn a lot.
• All respondents reported learning how to develop evaluation designs through BEC, and 75% indicated BEC had helped them learn a lot.
• All respondents who did not know about choosing methods to address evaluation questions learned about it through BEC, and about 65% indicated BEC had helped them learn a lot.
BEC Class of 2017 participants also reported that BEC helped them develop data collection and
analysis skills, as well as organizational capacity. All or almost all of those who had not previously
learned about the four major evaluation data collection methods indicated BEC had helped them to
learn something about it (many respondents said BEC helped them learn a lot.) Specifically:
• All or almost all the respondents learned about developing surveys, interviews, record review protocols, and observation protocols through BEC, and many (38% - 59%) of the respondents indicated they had learned a lot.
• All those who, before BEC, did not know about collecting and analyzing the various types of evaluation data, indicated they learned how to do it. (Also, more than half said that through BEC they learned a lot about working with surveys, interviews and observation data; about one-third said they learned a lot about working with record review data.)
Most BEC Class of 2017 respondents who had not previously learned about organizational
evaluation capacities indicated they learned about them through BEC — and again, in many cases BEC
had helped them learn a lot. For example:
• All respondents who did not know about incorporating evaluation into their daily practices reported they learned about it through BEC, including 63% who said they learned a lot.
• All respondents who did not know about completing a full evaluation project and communicating about evaluation findings reported they learned to do so through BEC, including 69% who said they learned a lot about completing evaluation projects, and 67% who said they learned a lot about communicating evaluation findings.
• All respondents who did not know about projecting level of effort for evaluation reported they learned about it through BEC, including 56% who said they learned a lot.
• Almost all (97%) of those who, before BEC, did not know about “Ripple” or about reviewing evaluation designs from external providers, indicated they learned about those strategies through BEC (47% reported they learned a lot about “Ripple” and 47% said they learned a lot about reviewing others’ designs).
Participants Used the Skills they Acquired/Honed During BEC
The BEC Class of 2017 participants used varying and multiple data collection strategies for their
evaluation projects. In addition, like their predecessors in all the earlier classes, many participants
have already been called on one or more times to put their evaluation skills to use for other
organizational work, especially tasks related to surveys. Many of the respondents
indicated that, in addition to their BEC projects, they had:
• developed surveys (84%)
• created administration plans for surveys (59%)
• administered surveys (66%)
• revised a survey (68%)
Many of those respondents analyzed their own survey data using Survey Monkey (42%) or Microsoft
Excel (41%).
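The tabulation participants performed in Survey Monkey or Excel is simple percent-of-respondents arithmetic. As an illustrative sketch only (the ratings data below are hypothetical, and BEC participants did not use Python), the same calculation looks like this:

```python
from collections import Counter

def tabulate(responses):
    """Percent of respondents selecting each answer option (rounded)."""
    counts = Counter(responses)
    total = len(responses)
    return {option: round(100 * n / total) for option, n in counts.items()}

# Hypothetical ratings from 32 survey respondents
ratings = ["Very Good"] * 14 + ["Excellent"] * 11 + ["Okay"] * 7
print(tabulate(ratings))  # {'Very Good': 44, 'Excellent': 34, 'Okay': 22}
```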
Participants are More Involved in Evaluation at their Organizations
As stated previously, at the beginning of BEC, most of the respondents (63%) had only Some
Involvement in evaluation at their organizations (13% had been Very Involved and 25% Had Not Been
Involved at All). By the end of BEC, most (97%) of the respondents were involved in evaluation
(including 41% who were Very Involved), most were planning for ongoing involvement, and most had
begun to use what they learned for projects in addition to their BEC work (see also sections on use of
training and “Ripple”). They clarified in their own words that they had learned about what was
involved in conducting evaluation, that they had changed and improved their strategies, and that they
were committed to using evaluation to answer their own questions of interest and using the
information to change strategies as needed.
Participants Successfully Completed Evaluation Projects
The final project for the phase I training period was development of evaluation designs. These
designs had to conform to standard professional evaluation practice, and they showed that Class of
2017 participants were able to apply what they had learned. Each design described the subject
program and why it was selected, specified evaluation questions, and identified which data collection strategies
participants had chosen. The designs also included projections of level of effort (i.e., who would do
each task and how much time in days or hours would be reserved for them), proposed timelines for
evaluation activities (i.e., in which months, days, or seasons evaluation activities would happen), and
plans for use of the evaluation results. All participating organizations either developed new
The organization is looking more closely at the usage and value of our databases for its ability to facilitate the collection and curation of data and its ability to inform our programs and partners. BEC Participant, Class of 2017
instruments to collect data or revised existing instruments, and all developed administration and
analysis plans for their multiple data collection instruments.
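A level-of-effort projection of the kind described above is, at bottom, a small budget of person-time summed per person across tasks. The task names and hours below are hypothetical, included only to illustrate the shape of such a projection:

```python
# Hypothetical level-of-effort projection: hours reserved per person per task.
effort = {
    "design survey":     {"program manager": 6,  "evaluation lead": 4},
    "administer survey": {"program manager": 10},
    "analyze data":      {"evaluation lead": 12},
    "write report":      {"program manager": 8,  "evaluation lead": 8},
}

# Total hours committed by each person across all tasks.
totals = {}
for task, hours in effort.items():
    for person, h in hours.items():
        totals[person] = totals.get(person, 0) + h

print(totals)  # {'program manager': 24, 'evaluation lead': 24}
```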
During 2016-17, participants implemented these designs, further developed their analysis plans,
collected and analyzed data according to their plans, and developed reports about their findings. All
reports included proposed action steps. Tables 7 and 8 provide details about participant projects:
Table 7 clarifies which programs were evaluated and what evaluation questions were pursued. While they were quite varied, and obviously more complex for some groups than others, the depth of inquiry and seriousness of these studies was clear. The pursuits of the Class of 2017 were as substantial as, or more substantial than, those of all previous classes.
Table 8 shows data collection methods/choices. All participants were required to use more than one method, but each identified which methods made the most sense to obtain data to address their questions. As shown, all participants used multiple methods, and many undertook complex efforts utilizing multiple data collection strategies (all but 1 conducted surveys – including 2 who did electronic surveys, 1 that conducted phone surveys using a random sample, and 4 who surveyed multiple respondent groups; 7 conducted interviews including 1 that did intercept interviews and 1 that conducted a focus group interview; 4 conducted observations and all of them developed rubrics for scoring their observations; all but 1 conducted record reviews including 3 who disaggregated by site, 6 who looked at data over time, and 2 who mapped their data; and 8 groups used three or more strategies).
As with the prior classes, the Class of 2017 BEC organizations did not substantially change their designs
after they were finalized (June 2016). All took their projects very seriously and conducted
comprehensive short-term studies including analysis of their own data, and three organizations
(Hartford Performs, Unified Theater and the YWCA Hartford Region) extended their studies beyond the
BEC timeline to further develop reports for external stakeholders.
Table 7: Programs and Evaluation Questions Selected by BEC Class of 2017 Participants
Organization & Program Evaluation Questions
Connecticut Association for Human Services Connecticut Money School
EQ1 How is CT Money School valuable to our partners?
EQ2 Do our programs help clients change their confidence and financial behaviors?
EQ3 How do the CTMS staff and volunteers understand their role and help to integrate their components into an operating system?
Connecticut Radio Information System (CRIS) CRIS Radio
EQ1 How and to what extent do CRIS listeners use the services and what is their assessment of them (what are their favorite aspects and what is missing)?
a. How often and when do listeners listen?
b. What is CRIS used for (information, socialization, other)?
EQ2 How and to what extent does access to CRIS benefit listeners and add to their quality of life?
a. Help them to engage more with family and friends
b. Help to keep listeners informed about social issues
c. Help listeners feel connected.
Foodshare Hunger Action Teams
EQ1 To what extent have Hunger Action Teams been successful?
EQ2 What factors have contributed to HAT successes?
Hartford Neighborhood Centers, Inc. Food Pantry
EQ1 Who is using the service and to what extent is the HNC Food Pantry helping clients?
EQ2 Is the Pantry sustainable? Can the agency continue to support the needs of the community?
EQ3 Do we contribute to self-sufficiency, i.e., are we really making a difference or just being a food Band-Aid? (This question is part of a planned alumni study.)
Hartford Performs Evaluation of Classroom Outcomes
EQ1 How and to what extent are programs helping students? Are programs supporting learning in the specified content areas? Are they enhancing the development of other skills, such as active listening, creative thinking and expression, problem solving, critical evaluation, collaboration and social skills?
EQ2 Are programs being delivered as expected? Are programs delivered as described in the program descriptions prepared by the artist? Do artists encounter unanticipated situations in the classroom that alter how a program is delivered?
Hartford’s Camp Courant Healthy Choices Program
EQ1 How many participating campers learned key concepts and could demonstrate skills in the topic of Nutrition and Healthy Eating?
EQ2 How many participating campers learned key concepts and could demonstrate skills in the topic of Active Living?
EQ3 How many participating campers learned key concepts and could demonstrate skills in the topic of Mind Over Matter?
Hockanum Valley Community Council, Inc. Dial-a-Ride
EQ1 How and to what extent does our program work?
a. Who is served and does that match the desired target population?
b. How do clients use the service, how does that contribute to their well-being?
c. How satisfied are our stakeholders (clients)? What do they identify as our strengths and weaknesses?
d. How efficient (in terms of time, expense) is our work and what might make it more so?
EQ2 Are there gaps in our service or new opportunities for service and if so what can we do about them?
a. What can we do differently?
b. What new services can we offer?
c. Are there any areas of need that we don’t currently address?
Latino Community Services Medical Transportation Services
EQ1 How and to what extent has the program facilitated clients’ ability to adhere to medical appointments and contributed to client health?
EQ2 How do clients experience and assess their time traveling in the medical transportation vehicle?
a. Were they consistently greeted by the driver and treated with respect?
b. Would they recommend the program to someone else?
North Central Regional Mental Health Board, Inc. Catchment Area Councils
EQ1 How and to what extent are the CACs achieving their purposes aligned with the mission of NCRMHB?
EQ2 How and to what extent are the CAC meetings addressing the needs of our volunteer members?
The Discovery Center Residential Program
EQ1 Are participants of the Residential Program having meaningful experiences that meet TDC’s organizational goals and mission?
EQ2 Do teachers value the experiences they are having with their students during the Residential Programs?
EQ3 Are parents satisfied with the experiences their children are having during the Residential Program?
EQ4 How did students, staff and facilitators experience the academic components of the program?
The Mark Twain House & Museum Public Programming, Autumn/Winter 2016
EQ1 Who are our program attendees?
EQ2 How can we convert our program attendees into more engaged members and supporters?
Unified Theater Spotlight School Program
EQ1 How many and which programs are successful?
a. What does a “successful” Unified Theater program look like?
b. How many and which schools are reaching benchmarks?
EQ2 What factors must be present at a school to create a successful program?
a. What are the logistical, implementation components of the Spotlight School program that indicate a school should be a high performer?
b. Is Advisor commitment crucial?
c. What are the key factors that are most commonly found in these programs?
d. How do these factors interact and influence each other?
EQ3 How can the elements of successful programs be promoted across all programs?
a. How can the program staff ensure these components of the program are replicated at all schools?
b. Which of these key success factors should be the priority?
World Affairs Council of CT Fall 2016 Programs
EQ1 How is what we offer unique and attractive to the community? What are our program attendees’ perceptions of our role?
EQ2 Did people leave the program having been exposed to a differing or more nuanced perspective on the featured topic?
EQ3 Did people feel engaged and able to share in a meaningful exchange of ideas?
EQ4 Did people leave wanting more engagement with the Council?
YWCA Hartford Region, Inc. Young Women’s Leadership Corps
EQ1 To what extent did YWLC meet participant attendance and retention goals?
EQ2 To what extent did participants develop key skills associated with program content?
EQ3 To what extent were new and returning participants satisfied with 2016-17 service delivery?
Table 8: Selected Evaluation Data Collection Methods: BEC Class of 2017 Participants
Organization Surveys Interviews Observations Record Reviews
Connecticut Association for Human Services a* (focus group) ++
Connecticut Radio Information System (CRIS) (phone)
Foodshare ++ +++ (mapping)
Hartford Neighborhood Centers, Inc. +++ +++ (mapping)
Hartford Performs a** +++ +++
Hartford’s Camp Courant
Hockanum Valley Community Council, Inc.
Latino Community Services +++
North Central Regional Mental Health Board, Inc. * ++ ++ ++
The Discovery Center a ++ **
The Mark Twain House & Museum +++ +++
Unified Theater a ++ ++ ++ ++
World Affairs Council of CT +++ +++ +++
YWCA Hartford Region, Inc. a (intercept) +++
Key:
* = multiple surveys with same respondents
** = e-survey
++ = cross-site data collection
+++ = over-time data collection
a = multiple respondent types
Participants’ Projects Were Comprehensive and Useful
As stated previously, all 14 organizations completed projects, and all of the projects were
summarized into reports that were extensively reviewed by the evaluation consultant/trainer for
adherence to evaluation reporting standards. As with the previous classes, respondents (100% for
the Class of 2017) indicated that the requirement to complete an actual evaluation project was Very
Important (see Table 6). Project reports were also presented at the final conference to BEC
organizations and the Hartford Foundation’s Nonprofit Support Program stakeholders. Those in
attendance, including many senior Hartford Foundation staff, were once again consistently impressed
with both the clarity and thoroughness of the efforts. Most importantly, all of the organizations
obtained information through their evaluation projects that informed their ongoing work. The
following are examples of some findings from the Class of 2017 projects:
CT Association for Human Services learned through their BEC project that the Connecticut Money School is evolving and changing in line with the mission and vision of CAHS. They also determined that there is clear evidence that they are: a) partnering with agencies that value both helping low-income families understand their finances and having quality services regularly available, b) attracting appropriate and effective volunteers, and c) working with staff who understand integrated service delivery and strive to help families using a comprehensive approach that includes both program and policy approaches. The evaluation also made clear that they need to: strengthen the training process for volunteers so that they better understand any barriers to conveying important information during the classes; institute more quality control processes and evaluation of participants’ satisfaction with the volunteer instructors conducting classes; and review partner, volunteer, and pre-post surveys that do not meet expectations (e.g., regression of knowledge) and pilot new questions with more effective wording and concepts.

CRIS Radio learned that systematically collecting feedback from CRIS listeners and their families about the importance of their services allowed them to understand the value of the services differently and to make future programming decisions. Results of their study highlighted, in quantitative terms, the importance of having access to print information, including store sales flyers, to the quality of their clients’ lives. They also learned that newspapers, magazines, obituaries and store sales flyers were among the most preferred programming, but also that the diverse programming offered by CRIS is important to listeners and reflects the diverse preferences of the general population. This supported their ongoing efforts to broadcast articles from more than 50 newspapers and magazines.
Foodshare conducted a comprehensive evaluation of its Hunger Action Teams using both direct data collection and extensive review of historical information about the teams. Through their work, the evaluation team concluded that program success lies in the impact HATs have both on their community (through projects completed) and within Foodshare itself by supporting various
departments within the organization. Specifically, there are 20 teams representing 25 towns and 6 Hartford neighborhoods across Foodshare’s service area, and these teams have completed over 70 projects that have implemented local solutions to hunger. Highlights of 2016 projects included helping two area elementary schools become eligible for Universal Free School Meals and creating a summer meals site to serve children in Bristol. Through community food drives and gleaning events, over 25,000 pounds of food were rescued and distributed to those in need. In addition, six Hunger Action Teams were awarded Partnership funding to support the expansion of programs which provide meals to hungry children (summer meals, backpack programs, etc.) and support nutrition education programs for children and families. The evaluation also emphasized the ability of Hunger Action Teams to support Foodshare’s mission to end hunger in the Greater Hartford region by providing a network of hunger-fighting allies throughout the community. In addition to supporting Foodshare in the fight against hunger, the teams provide participants with a network of passionate individuals committed to improving their communities.

Through their evaluation project, Hartford Neighborhood Centers, Inc., confirmed that their clients are satisfied with and appreciate the Food Pantry (over 90% of the clients asked said that they most appreciated help with food, and many also said their anxiety was reduced; all respondents said they were grateful for the assistance). They also learned that their clients come mostly from Frog Hollow, zip code 06106, but also from other nearby areas, and that they are a diverse group who use the services regularly and need help because they are un- or under-employed (by definition, all clients are low income). During the study period they distributed more than 2,000 pounds of food per month, and many of their clients used the services on a monthly basis.
Most critically, they learned that though need is consistent and considerable, the Food Pantry does not have sufficient support to meet expenses: income for the Food Pantry totaled less than half of what is needed to cover expenses. These specific findings about the disparity between need and support will help inform next steps for the Food Pantry.

Results of data analyses showed that, with few exceptions, Hartford Performs programs are contributing in important ways to student outcomes. Their findings also showed that Hartford Performs programs are being delivered as expected. Further, with few exceptions, the results for the 2015-16 and 2016-17 school years were very consistent across all categories. A total of 96% of the teachers concurred that Hartford Performs is positively contributing to every key program outcome. This included having a positive impact on students’ confidence, meeting learning goals for the grade level, engaging students, and increasing students’ knowledge about and interest in the subject matter. The teachers also indicated that the programs were age appropriate and met their expectations. Additionally, though most teachers were generally very pleased with the programs their students received, 25% indicated there were areas of the program that could have been strengthened in some way, and they were able to provide specific suggestions for improvement. A total of 93% of teacher respondents said that they would select the same program again. Through their BEC project, Hartford Performs began asking arts providers for their impressions of how successful their 2016-17 programs were in the classroom, to gain a broader understanding of how programs are being delivered.
Overall, arts providers reported that their programs went as expected and that both students and teachers were engaged in the program as they delivered it (91% of Arts Providers reported that students were actively engaged, and 92% reported that teachers were engaged for the duration of the program). Their evaluation work will continue during the summer of 2017.
Hartford’s Camp Courant undertook an evaluation project to help them understand outcomes for one of their educational/developmental summer programs and also to pilot evaluation strategies to determine how best to document and use outcome data to inform next steps for other programs. Their results showed that their older campers were definitely achieving topic/lesson objectives set by the Healthy Choices Program Director and that camper achievement did not vary whether the older camper participated in topic sessions during the first or second half of the summer season. They also concluded that they could effectively incorporate evaluation into the delivery of other educational/developmental programs without a lot of additional work if they involve the Program Director at the outset and centrally administer the collection of data. They determined that their older campers can appropriately self-assess their achievement of lesson/program objectives on a weekly basis; that paper-based evaluation surveys could be used successfully in a camp environment, if the surveys are short and have questions clearly linked to program objectives; and they recognized the need to collect data on the number and age groups of campers attending evaluated programs.

Hockanum Valley Community Council, Inc., learned that overall, their clients are happy with the Dial-a-Ride services and that they are providing a service that is needed by the senior and disabled community of the Tri-Town area. Further, their data collection and analysis showed that they provide the service effectively and efficiently. However, while the clients were satisfied with the service, it was concerning to them that only 41% of the respondents felt services had improved. Over the past few years HVCC has upgraded its buses, phone and radio systems, its scheduling software, and other areas. Client feedback did not indicate that the upgrades have had the noticeable effect that was hoped for.
This may require HVCC to revisit whether all of the upgrades are being used to their potential. Additionally, the evaluation highlighted the need for expansion of service areas.

Overall, responses to Latino Community Services’ survey of Medical Transportation clients showed that they were actively engaged in medical transportation and that it had a positive impact on their lives. Clients confirmed that they had good relationships with the driver, that the driver consistently greeted them and inquired about their health, and that the service made it possible for them to get to their medical appointments on time. In addition, respondents reported that this service improved their treatment adherence. Consistent and appropriate service delivery was documented both in terms of driver-client interaction and compliance with the requirements for documentation on each client. Further, services were associated with clients’ stabilized health outcomes. In addition to positive results, LCS also learned that clients often had to wait longer than desirable for return rides, which frustrated both clients and drivers. Another key finding, one that was not part of the initial evaluation design, was the effect that conducting the evaluation had on the drivers themselves. The drivers were made full partners in the process, first by being trained on how to give the clients the surveys, and then on how to conduct the interviews and observations themselves along with some coordinators. Recognition by the drivers of the importance of their medical transport and evaluation roles showed that bringing evaluation down to the line staff level had additional benefits for program improvement.
The North Central Regional Mental Health Board (NCRMHB) conducted a comprehensive evaluation of its Catchment Area Councils (CACs). Through surveys and record reviews they found that participants are very passionate about the CACs’ work and remain committed to NCRMHB’s mission to serve as a voice for the communities. Overall, findings indicated the CACs were diverse and representative of the communities they are designed to serve (some additional involvement from
males would be beneficial) and that for the most part attendance by members is consistent and sufficient. CAC members were engaged in their regular meetings, they connected with legislators and decision-makers, and they reported they were learning from and benefiting from the CACs. Additionally, NCRMHB documented that members were getting opportunities to be more engaged in their communities. Participating CAC members reported they learned how to navigate the mental health system, and they appreciate opportunities to network with other people who work in the mental health field or who have direct experience with mental health on a personal or family level. They expressed an interest in continuing to participate at the CAC level. The data showed that the NCRMHB creates a lasting impact for those who attend the meetings.
The Discovery Center took on a comprehensive evaluation project including development of many new evaluation procedures. They learned that their Residential Program met or exceeded its objectives overall (78% of students increased their knowledge of academic strands, 94% of teams successfully accomplished team building challenges, 94% of students made new friends with different life experiences, and 91% of students reported intentions to take positive action to interrupt an incidence of discrimination). They concluded that their program as delivered in Fall 2016 was effective in reaching its goals, but that there is room for growth in strengthening their relationships with parents and teachers so that both feel positively about students’ experiences, are positively impacted by the teaching, and feel more connected to the content. They also learned some lessons about evaluation implementation and survey design, and about the importance and challenge of asking respondents about social identities, and they recognized that they need to begin planning the evaluation process as a part of the camp planning process instead of as a secondary component.

The Mark Twain House & Museum surveyed participants during all their Fall 2016 programming and acquired important feedback. They found that their programs were overwhelmingly popular with attendees, including many who are not constituents of other local cultural organizations (such as the Wadsworth, CHS or Hartford Stage), but that they do not rely on a loyal core audience (all of their fall programs had many people visiting MTHM for the first time for the program). Instead, their surveys showed that they drew widely, albeit within typical museum demographics: attracting a largely female, white, suburban, older crowd with multiple non-typical attendees (such as younger visitors) attending too.
They also learned that the majority of the program attendees who responded to the surveys were not members, and many of them visited only annually, infrequently, or were actually first-timers. This helped the Mark Twain House & Museum staff realize that evening programs help them meet the important goal of drawing new individuals. Further, their study also showed that though members were in the minority among attendees, public programs were avenues to serve the current membership while also serving as a potentially fertile field for establishing new members from among program attendees. This heightened their recognition that if they want programs to drive membership, they need to do a better job of making membership attractive to non-member attendees, while also recognizing that programs may draw a new crowd who are not inclined toward membership but might not have attended the Museum otherwise.

Unified Theater also conducted an extensive and detailed evaluation of its Spotlight School Program. The evaluation will ultimately include attention to 18 different program elements at all 21 of the schools they served. For BEC, they looked at only three program elements (student leadership, theater fundamentals, inclusion) at six schools using comprehensive surveys and comprehensive program observations. After analyzing results of the observations and surveys, they found that five of
these six programs excelled in the category of student leadership and most of them had room for improvement in the areas of theater fundamentals and inclusion. When the results of the final production at each school were factored in, all of the schools were rated as very strong, and student leaders, participating students, and the adults they worked with all rated the program favorably. Their evaluation work will continue during the summer of 2017.

World Affairs Council of CT recorded high levels of satisfaction among participants in their Fall 2016 programming. They also found that their events were attractive to their constituents due to the content being addressed, and that their visitors left events feeling informed, confident to speak on program topics with others, and inspired toward action as a result of participation. World Affairs Council documented consistent attendance during their Fall 2016 programs (3 of 4 were sold out) and consistent opportunities for attendees to interact with each other and the speakers. Almost all of their respondents (98%) indicated they would attend another World Affairs Council event and would recommend it to others.

YWCA Hartford Region, Inc., collected and analyzed data about their Young Women’s Leadership Corps program, including groups at local high schools, one middle school, and their Saturday program. They focused specifically on attendance and participant skill development. Results showed that attendance varied by program site, but most sites met attendance goals (girls attending half or more of the sessions available during their period of enrollment), and about half of the sites met dosage goals (girls attending 9 or more sessions) by the end of the study period (this evaluation will be continued through the end of June 2017).
They also obtained specific and consistent feedback from multiple participants illustrating that they were acquiring improved time management, organizational, leadership and interpersonal skills. Most responding participants also reported a safe peer environment and acquisition of leadership, communication, conflict resolution and financial literacy skills. The Saturday program provided an opportunity for participants to engage in more in-depth workshops and field trips.
While it was clear that the BEC Class of 2017 organizations used the BEC opportunity to learn
about evaluation, they also obtained many important findings about their programs, as shown above.
In addition, all of the participating organizations were able to clarify action steps in response to their
findings, and most had initiated at least some of those actions before their participation in the Class
of 2017 ended. All of the organizations developed new tools and evaluation strategies that can be
continuously used and in many cases expanded for other programs. Additionally, the evaluation tools
and strategies developed by Foodshare were shared as a model to evaluate Action Teams within their
larger national network.
BEC Projects Informed Changes, Participants Used Their Findings
Like their predecessors in the prior classes, every Class of 2017 organization identified specific
changes they made or were planning to make to their programs and to other organizational work as a
result of their BEC participation. These changes included both minor and more extensive
adjustments to strategies, staff or space, and in many cases increased attention to the need to
regularly evaluate work. Identifying specific action steps from findings (rather than perfunctory
recommendations) is often one of the most challenging tasks for those learning about and
conducting evaluations. As evidenced in Table 9, participating BEC organizations clearly
accomplished this.
Table 9: Examples of Program Changes Informed by Evaluation Results
Building Evaluation Capacity, Class of 2017
Organization Resulting Program Changes
Connecticut Association for Human Services
We are becoming more proactive about working on evaluations and learning from the results. We have not yet completed our review, but know that we need to revise the CT Money School pre- and post-tests, and maybe the curriculum itself, due to findings. We are meeting with the staff soon to discuss the report and start coming up with ideas.
Connecticut Radio Information System (CRIS)
Our findings supported the testimony and feedback that we receive on an ongoing basis. Our BEC survey gave those testimonies more credence and value because of the randomized survey we conducted.
Foodshare
We have shared the evaluation results with program staff and are in the process of discussing next steps and how to apply the findings to make adjustments to the program overall. There were also some changes to the program that took place throughout the evaluation process. While all changes responded to identified needs in the community, I also think that the prospect of being evaluated motivated some of the staff to make necessary improvements.
Hartford Neighborhood Centers, Inc.
Preliminary information has been shared with the board and we plan to share the final results also so that they can see the extent of the funding challenges we are facing and how consistent our food pantry use is.
Hartford Performs
We are planning to continue collecting data from teaching artists and to automate that process. We are also planning to incorporate more observation work into our review of programming.
Hartford’s Camp Courant
Camp will implement an attendance record for all programs, and implement an evaluation design/project for at least 2 more programs. Camp requested a grant from HFPG to hire people to complete final steps in the summer of 2017, and as of now we have brought on people to assist with evaluation this summer. We are still working on survey questions and an observation checklist. We will begin evaluation in two weeks.
Hockanum Valley Community Council, Inc.
We have already begun the process of putting our evaluation into action. The managers of our program were presented with our findings and were asked to begin putting together ideas as to how to complete our new goals and action steps. We are also looking into which of our other programs would benefit most from a similar evaluation. Additionally, we have extended our area of ride destination and added weekend trips to desired locations.
Latino Community Services
Because of the findings of the LCS project that we selected, staff have begun to build strategies that will improve delivery of service to the program. Staff and administration alike were able to see in real time the program areas that needed to be improved.
North Central Regional Mental Health Board, Inc.
Identified areas where we need to recruit members that are more reflective of the diverse cultures of their communities. Have begun working on recruitment. Other changes to be implemented when teams resume meeting this Fall.
The Discovery Center
We moved our Residential Program from the Fall to the Spring, we have developed programs for the parents of the student participants, we have extended the Camp experience throughout the year, with a pilot program that involves six schools. We are revamping our counselor training, and our academic program offerings.
The Mark Twain House & Museum
Shared information with the board and are planning to collect similar data for all programs in the Fall 2017.
Unified Theater
We aren't done yet but based on the preliminary results I am anticipating some changes to how we deliver our programs and specifically where our focus is/who our focus audience for our information and curriculum is. I am sure that we will have even more changes once we finish our full analysis!
World Affairs Council of CT
Just recently, we brainstormed a 2018 program schedule that incorporates our key action steps, which were products of our evaluation report.
YWCA Hartford Region, Inc.
Changes were happening throughout the project. We were more aware of how often we adjust because of BEC. In Youth Development you are only as good as your ability to adapt. BEC helped us to pay attention to those adjustments. That is useful in reporting to funders and to remind us to consider those issues or concerns when we are making new plans. We will continue to look for ways to increase dosage and we anticipate more changes in the new school year.
In addition to the specific program changes described in Table 9, many respondents were able to
clarify how the BEC experience had resulted in some broader changes to their organizations’ work.
The following quotes illustrate these changes which include general approaches, more active
involvement in evaluation work conducted by others, and application of evaluation strategies to
other programs.
Our involvement in the BEC program gave us confidence to conduct our own program evaluations now in contrast to hiring a consultant to conduct them. It also informed us about how to evaluate other programs going forward and to continue to conduct surveys on a regular, ongoing basis.
Yes, BEC helped give us the tools, time, and expertise to make evaluation a priority.
Last July, all of our programs were asked to complete RBA by one of our funders. While this was the third year of completing RBA for us, I feel like this year, because of BEC, our RBA was the best we had done to date. These positive outcomes developed even before the completion of the program and I'm looking forward to seeing how much better our RBA reports and other evaluation work becomes in the future.
We are planning to do an organization-wide evaluation for 2017-18. We were able to see how our programs fit together through the process of completing the Logic Model for the Residential Program. We started to work "with the end in the mind" and now create all our programs that way. We have incorporated a consultant, [named], in the planning of our programs, in order to think about evaluation as we create our programs, but also to create the programs thinking of what we want to accomplish.
We have certainly improved our instruments and we plan to continue to adapt and revise these instruments to improve data collection in future years.
This sense of change applied beyond the individual programs. BEC respondents confirmed
that though challenging, they were prepared to ensure that evaluation capacity was
extended more broadly in their organizations.
BEC Participants Understand and Have Begun to “Ripple”
In order for BEC to have the broadest impact, participants are required to extend or
“Ripple” what they learned through BEC. That expectation was clear and desirable for NSP, the evaluation
consultant/trainer, and all participant organizations at the outset. Throughout the BEC training,
participants were briefed about strategies for doing this, and also asked to report about their
“Ripple” plans and current activities. In addition, participants were asked on the final survey to
summarize their efforts to date. As indicated previously, 97% of the respondents who were unsure
how to “Ripple” indicated BEC had prepared them to do so. In addition, all but one participant
reported that “Ripple” had begun at their agencies before the end of their BEC Class of 2017 training
(55% of respondents indicated they had already extended the training a little, and 42%, representing
7 of the 14 organizations, indicated they had done so a lot). Many were able to provide specific
examples of ways they had begun to or would provide training to others, involve others in evaluation
work, and initiate evaluation in additional programs using materials and strategies they acquired
through BEC.
Respondents also provided some specific examples of “Ripple” on the final surveys. In their
own words, they acknowledged the need to make evaluation more consistent and useful
throughout their organizations, and to actively use their new/enhanced skills for that purpose.
Being a volunteer Board member, I'm a step removed from the daily operations of the organization. However, I have rippled BEC within my department where I work full time.
We have a very small organization - 3 of our 4 total staff members participated in BEC - so there isn't a lot of room to "ripple." We have certainly been sharing progress with our board and when we have our final report, we will be sharing both internally and externally.
Everyone in our organization understands how important evaluation is to what we do. We have created a committee on the Board, called "Mission & Measurement" and it is the most popular committee on the board. We have tied the ideas of the work and the measurement of the work together.

Because of the rigor in our plan, I was directly involved in some observation work for our evaluation. I have also been directly involved in the planning and analysis, but BEC actually allowed me to move more responsibility for evaluation to our program team this year [because] of the shared understanding and staff interest shown through the program.

This program has definitely enhanced evaluative thinking in our organization, at least among those who participated in the program. Our task now is to take that knowledge and share with others in the organization.

We have already completed a record review of another part of our program to further the evaluation we did for BEC. We are undergoing the process of selecting which program would benefit most from a similar evaluation.
Some respondents also reported that they have continued evaluation-related agendas, and they
expect more change in the near future.
BEC will affect conducting evaluations now. I learned the value of determining a better focus at the outset of the process and the importance of simplicity in communicating what was learned.
Changes were happening throughout the project. We were more aware of how often we adjust because of BEC. In Youth Development you are only as good as your ability to adapt. BEC helped us to pay attention to those adjustments. That is useful in reporting to funders and to remind us to consider those issues or concerns when we are making new plans.
Because of the findings of the [named] project that we selected, staff have begun to build strategies that will improve service delivery in the program. Staff and administration alike were able to see in real time, through the evaluation, the areas that needed to be improved.
We have already begun the process of putting our evaluation into action. The managers of our program were presented with our findings and were asked to begin putting together ideas as to how to complete our new goals and action steps. We are also looking into which of our other programs would benefit most from a similar evaluation.
Finally, their comments clarified that respondents recognized they could and should use their new
expertise to benefit their organizations and strengthen their programs.
As a result of this program, we have now invested some of our staff time into integrating evaluative thinking into other aspects of our organization, in addition to the program we evaluated as part of our BEC project. We hope to continue this expanded focus during the BEC Alumni Group in the fall.
By having program staff participate in BEC, the programs are more connected to outcomes.
We have been able to make the entire organization more data driven. Where previously we were using mostly professional hunches to make decisions, now we have data to back up those suggestions, and as a leader I can require that others back up their suggestions with data before we invest resources. My Board has also adopted this mindset and is looking to our evaluation to guide future strategy as we adapt to a changing funding and growth landscape.
We have a common language now amongst staff and leadership, and the need for evaluation is fully understood and embraced.
The respondents also indicated that despite focused efforts, they need more resources, specific
directions, and time to do this. Some will participate in the ongoing NSP-sponsored Alumni Group to
address this ongoing need (see Alumni Study 2016-2017) and all will be invited to future NSP-
sponsored Evaluation Roundtables. All BEC Class of 2017 organizations developed Ripple Plans
before they completed their training.
Evaluative Thinking is Being Enhanced Through BEC
Evaluative thinking is a type of reflective practice that incorporates use of systematically
collected data to inform organizational actions. Key components of evaluative thinking include:
• Asking questions of substance and determining what data are needed to address the questions.
• Gathering appropriate data in systematic ways.
• Analyzing data and sharing results.
• Developing strategies to act on evaluation findings.
As was discussed many times during the BEC training, evaluative thinking can be applied to
many organizational functions (e.g., mission development, human resources (HR) decision-making,
communications/marketing) in addition to program development and delivery. Additionally, during
the BEC training, participants had the opportunity to conduct assessments of evaluative thinking in
their own organizations, and to discuss ways evaluative thinking could be enhanced. On the final
survey, 100% of respondents indicated that participating in BEC had enhanced evaluative thinking in
their organizations. Although more than half of the respondents (58%) indicated BEC had enhanced
evaluative thinking a lot at their organizations, there is continued awareness, as illustrated through
the following comments, that more is needed to continually use and enhance evaluative thinking.
We talk about the need to continually evaluate the program we just completed and extend that to other programs. Also, our need not to just throw a few questions together and casually ask people as we meet them, but to plan and execute.
Overall, our organization has been moving toward incorporating more evaluation into our strategy. The BEC program helped us to cement the importance of evaluation moving forward, as we yielded a pretty successful report on our program. I believe evaluation will continue to be a priority and ripple through our organization based on the staff members who participated... our organization has been moving toward incorporating evaluation into all aspects of the company, even creating a new position to measure impact of our programs. I think with the new director in place, evaluation will be incorporated into our overall work.
We used to collect a large amount of data, but never really analyzed it carefully or used it to improve programming. We also were sloppy about our time frames for collecting data. Now we are very deliberative about what data we collect, how and when we collect it, and are diligent in analyzing and reporting on it. We also are starting to build more solid evaluations into programs that lacked them before -- or where we weren't as rigorous as we needed to be.
I believe it has been enhanced but I would like to see it spread beyond the three of us that participated in BEC. We should strive to have our managers have at least some grasp of the major evaluation concepts.
Some in our organization always valued evaluation, but others did not. [Evaluation] is now becoming more of our "culture” because of BEC.
IV. NEXT STEPS
The BEC training for the Class of 2017 ended in June 2017. Participants reinforced that their
BEC experiences had been important on multiple levels and accomplished what the program was
designed to do. The Foundation elected to initiate a new Alumni Group for the Class of 2017
graduates who opted to participate and to conduct an information session (Fall 2017) to determine if
there is sufficient interest in the Hartford nonprofit community to establish a BEC Class of 2019. This
final section of the report shows responses regarding the importance of BEC and describes plans for
continuation of BEC.
Importance of BEC
The final summary items on the Class of 2017 final survey addressed the importance of BEC to
participants. As described in Section II, all or almost all respondents indicated desired BEC outcomes
had been achieved. In other words, all or almost all respondents indicated that it was true that BEC
was important on multiple levels. Specifically, respondents indicated BEC was important because it
helped them improve the quality of data they obtain, taught them to look at programs from different
perspectives, helped them build evaluation into the program planning process, helped them
revise programs based on real data, and helped them increase capacity to measure the outcomes
they value (see Table 10 following). Overall results for the Class of 2017 regarding the importance of
BEC, like those from each of the prior classes, were very positive. Additionally, the one area that Class
of 2015 respondents identified as insufficient (the attention to relationship-building and networking
across participating agencies) was identified as important by all Class of 2017 respondents.
Future Classes and Alumni Study
Assuming there is sufficient interest, a new cohort of agencies will be involved in the BEC
program starting in January 2018. As stated above, an information session is planned for Fall 2017
and given results from this evaluation and any additional feedback from participants and other
stakeholders, it is expected that the process for the new Class of 2019 will be very similar to that used
for the Class of 2017, with continued attention to increased participant interaction and leadership
involvement. Additionally, in 2017-18, nine of the Class of 2017 organizations will begin alumni
study. Alumni study will allow Class of 2017 graduates who opted in to continue their studies or
initiate new projects, to include additional staff members in the training process, and to delve deeper
into advanced evaluation topics while continuing to obtain consultation as needed (see appendix for
schedule and training details). The following Class of 2017 organizations will participate: Connecticut
Radio Information System, CT Association for Human Services, Foodshare, Hartford Neighborhood
Centers, Inc., Hartford Performs, Hartford’s Camp Courant, Hockanum Valley Community Council,
Inc., Unified Theater, Inc., and the YWCA Hartford Region.
Table 10: Percent of Respondents Agreeing that Statements about BEC Importance were True
BEC is important because. . . | Somewhat True | Very True | TOTAL
It improved the quality of data we obtain 9% 91% 100%
It taught us how to look at our programs from different perspectives 30% 70% 100%
It helped us better understand participatory evaluation 30% 70% 100%
It helped us build evaluation into program planning process 30% 70% 100%
It helped us revise our program based on real data 31% 69% 100%
Our organization now has increased capacity to measure the types of outcomes we value 39% 61% 100%
It helped me form new relationships with other providers in Hartford 61% 39% 100%
It taught us the importance of involving multiple stakeholders in the evaluation process 31% 66% 97%
It helped our organization understand why evaluation is valuable 23% 73% 96%
It helped our organization incorporate evaluation practices into daily practice 48% 48% 96%
It strengthened relationships within our organization 48% 38% 86%
A Note about BEC Evaluation
As described in the introduction, BEC is an evaluation capacity building project that is also
being evaluated. The BEC evaluation is a participatory evaluation commissioned by NSP. Analyses
are conducted and findings are summarized by the BEC evaluation consultant/trainer according to
plans reviewed by Foundation officials. All data are available for inspection by external reviewers and
reports are disseminated to BEC participants from all the classes. For 2017-18, the evaluation will
include summarized information about the experiences and outcomes of the 2017-18 Alumni Group.
Specifically, Alumni Group participants will answer a final survey at the end of the 2017-18 program
year, and their projects will be reviewed. Additionally, initial results about Class of 2019 and their
evaluation design work will be reported as available. Presentations of findings from Alumni Group
evaluation work and Class of 2019 proposed designs will be made to Foundation staff and other key
stakeholders of the participants during the final conference in June 2018. The results of the
evaluation will inform decision-making regarding ongoing Alumni study and other evaluation training
opportunities sponsored by NSP.
Conclusion and Issues for Further Consideration
The Class of 2017 accomplished much and the program ended smoothly. Everything is in
order for initiation of the new Alumni Group and plans for a new class are under serious
consideration. The following will deserve ongoing attention as the BEC program continues:
For the Alumni Group 2017-18
• Ensuring that Alumni Group participants get meaningful opportunities to use multiple evaluation data collection and analysis strategies, to analyze real data from their own organizations, and to continue to successfully plan for and conduct evaluations.
• Ensuring that staff new to the BEC project have effective experiences.
For the new Class of 2019 (pending)
• Attracting and informing a suitable new cohort, including continued involvement of new teams from organizations from prior classes.
• Continuing to use modified strategies to keep all Executive Leaders more engaged in the program, and encouraging all leaders to very carefully consider team formation and succession plans if team members must transition.
• Helping participants stay focused on BEC evaluation learning, especially their assignments and projects, while also managing other organizational demands.
For Both the Alumni Group 2017-18 and the new Class of 2019 (pending)
• Helping participants deal with the rigor required to analyze evaluation data and summarize findings for external communication (i.e., develop evaluation reports).
• Continued pushing to embed/institutionalize evaluation capacity and to inspire and support efforts to use multiple “Ripple” strategies (apply knowledge to other evaluation needs, involve others in evaluation, and provide training to others about evaluation).
• Integrating technology/automation as available (e.g., mapping, hand-held electronic surveying), and use of analytical software (including SPSS, Excel, Survey Monkey and Google applications) wherever possible.
• Continued use of activities to promote organizational connections and networking, including possibly having organizations conduct some evaluation tasks for each other as a way of learning new skills and also increasing interaction.
With assistance and support from NSP, the evaluation trainer/consultant will continue to
modify efforts as needed and focus on stated issues to ensure BEC increases evaluation capacity and
enhances evaluative thinking for all participant organizations.
Great opportunity to learn fully on evaluation. It helped me become keener in understanding this process and how as an agency we can deliver findings for improvement and perhaps change the course of a program. BEC Executive Leader, Class of 2017
Table A1: Assessment of BEC Training Value, by Participant Type
Executive Leaders (n=13)
Personally: Somewhat Worthwhile 8% | Worthwhile 46% | Very Worthwhile 46%
For the Organization: Somewhat Worthwhile 0 | Worthwhile 0 | Very Worthwhile 100%
Other Staff (n=19)
Personally: Somewhat Worthwhile 11% | Worthwhile 42% | Very Worthwhile 42%
For the Organization: Somewhat Worthwhile 0 | Worthwhile 37% | Very Worthwhile 63%
Table A2: Percent of Respondents who Rated BEC Components/Features as Excellent, by Type
Executive Leaders (n=13) | Other Staff (n=19)
Phase I Training: 67% | 44%
2nd Year Training: 69% | 42%
Experience Completing a Project: 62% | 37%
Assistance to Complete Project: 100% | 79%
*Use caution with comparisons, small n’s; Executive Leaders not fully representative.
Table A3: Percent of BEC Respondents Who Reported the Following Were Very Important About BEC, by Type
Executive Leaders (n=13) | Other Staff (n=19)
Opportunities to learn about evaluation 100% 89%
Opportunities for consultations from BEC evaluator/trainer 100% 84%
Feedback from the trainer regarding the evaluation project 100% 89%
Requirement to complete an actual evaluation for the selected program 100% 100%
Writing the evaluation report 85% 84%
Requirement to design an actual evaluation for the selected program 100% 100%
Opportunities to interact with peers within the organization 69% 41%
Reviewing the work of BEC colleagues 54% 50%
Opportunities to interact with colleagues in other organizations 77% 63%
Opportunities to showcase evaluation work 46% 44%
Table A4: Percent of Respondents Agreeing that Statements about BEC Importance were Very True, by Type
Executive Leaders (n=13) | Other Staff (n=19)
It taught us how to look at our programs from different perspectives 73% 68%
It helped us better understand participatory evaluation 67% 72%
It helped us build evaluation into program planning process 92% 56%
It improved the quality of data we obtain 92% 89%
Our organization now has increased capacity to measure the types of outcomes we value 77% 50%
It helped our organization understand why evaluation is valuable 77% 71%
It helped us revise our program based on real data 91% 56%
It taught us the importance of involving multiple stakeholders in the evaluation process 75% 59%
It helped our organization incorporate evaluation practices into daily practice 42% 53%
It strengthened relationships within our organization 36% 39%
Table A5: Participant Assessments, by Type
Executive Leaders (n=13) | Other Staff (n=19)
Organization has Rippled at least a little: 92% | 100%
Evaluative Thinking enhanced at least a little: 100% | 100%
Will continue/expand evaluation: 100% | 100%
Importance of optional attendance: Not Important 8% | Somewhat Important 46% | Very Important 46%
BUILDING EVALUATION CAPACITY (BEC) PROGRAM
ALUMNI GROUP 2017-2018 PROGRAM OVERVIEW
The Nonprofit Support Program of the Hartford Foundation for Public Giving invites your organization to apply to participate in the Building Evaluation Capacity (BEC) Program Alumni Group. This program enables BEC graduates to:
• Learn more about advanced evaluation topics.
• Continue to work on evaluation projects.
• Effectively extend what they have learned about evaluation to other staff.
• Further enhance the use of evaluative thinking in multiple areas within their organizations.
PROGRAM DESCRIPTION
The Building Evaluation Capacity Alumni Group will be offered to graduates of the BEC program and will be led by Anita M. Baker, Ed.D., of Evaluation Services. The program will be conducted over a ten-month period and combines classroom learning, hands-on assignments, and project-based learning. Each agency will be required to assemble a team from its organization that includes at least one (preferably two) of its original team members. (Agencies are encouraged to include at least one additional staff member -- up to 4 team members in total.) All members of the team will be expected to attend all of the training sessions.
The program consists of:
• Eight workshop sessions (2.5 hours each, late afternoon with light refreshments)
• Individual technical assistance (2 consultation sessions + additional help as needed)
• Presentation of evaluation plans and findings developed through the program
• Time at each session to work with your team on your evaluation project
• Transitional consulting support following the final Alumni Group session
There is no charge for participating in this program. However, it is a requirement of the program that all sessions be attended in their entirety by all members of the team and that an evaluation project is completed.
PROGRAM TIMELINE
Applications to participate in the program are due on Friday, July 14, 2017.
Program Sessions:
September 13, 2017 | 1:30 – 4:00 p.m. | Session 1: Evaluation Overview
October 11, 2017 | 1:30 – 4:00 p.m. | Session 2: Using Surveys Effectively
November 15, 2017 | 12:30 – 4:30 p.m. | Session 3: Interviews & Record Reviews: Getting the Most out of Data
December 6, 2017 | one-hour meetings | Session 4: Consultation Sessions
January 10, 2018 | 1:30 – 4:00 p.m. | Session 5: Including Observations in your Evaluation Toolkit
February 7, 2018 | 1:30 – 4:00 p.m. | Session 6: Summarizing Findings/Communicating Results
March 14, 2018 | 1:30 – 4:00 p.m. | Session 7: Integrating Evaluation into Organizational Life
April 10-12, 2018 | one-hour meetings | Session 8: Consultation Sessions
April 25, 2018 | 1:30 – 4:00 p.m. | Make-Up for Snow Day (if needed)
May 9, 2018 | 1:30 – 4:00 p.m. | Session 9: Critical Read
June 6, 2018 | 8:30 – 12:00 noon | Session 10: Final Conference – Present plans/data
NSP Support
The Hartford Foundation for Public Giving’s Nonprofit Support Program was developed to help area
nonprofits build capacity and increase effectiveness. Its aim is to help nonprofits plan, improve their
governance and management, and build strong organizations so that services can be delivered as
efficiently as possible. NSP offers an array of training programs, grants for technical assistance,
assessments, loans, and networking opportunities for nonprofit leaders. BEC is a special training
program.
During all BEC programming for the Classes of 2008 through 2017, NSP has served as the
administrator of BEC. Each year, staff from NSP have handled communication and important
scheduling details and ensured that there were space and refreshments for all training sessions and
consultations. In addition, program officers Amy Studwell and Douglas Shipman (as of Fall 2012)
together with NSP director Annemarie Riemer and special assistants Shirley Beyor, Florence
Galbraith, and Monica Kelly reviewed and participated in training sessions, provided conceptual and
operational support to the trainer, and generally oversaw the effectiveness of the program.
Throughout programming for the Class of 2017, NSP also continued the Evaluation Roundtable
program, which serves as a community of practice for area nonprofit and evaluation professionals to
promote ongoing discussion, sharing and learning about evaluation. For example, in April 2017, the
theme of the Roundtable was, “People with Disabilities and Inclusive Evaluation: Hearing Every
Voice," in which an evaluation professional as well as community organization members shared their
knowledge about inclusive evaluation topics, including:
Inclusive Evaluation Essentials; Anita Baker, Ed.D., Evaluation Services
Hearing Every Voice: Evaluation for Listeners with Visual Impairments; Diane Weaver Dunne,
Executive Director, CRIS Radio
Evaluating Students With and Without Disabilities; Laura McLelland, CEO, and Jen McCool,
Director of Programs, Unified Theater, Inc.
Data Collection for the Visual Learner; Jennifer A. Del Conte, Coordinator of Birth to Three Program/CORE Principal, Pre-K-12th Grade, American School for the Deaf
Group Discussions Using the Universal Design for Evaluation Checklist; Anita Baker, Ed.D., Evaluation Services
Of the 30 local nonprofit staff and executive leaders who attended, 22 completed a follow-up survey
about the event, and most of the feedback from responding participants was positive. Several key
findings from the survey include the following:
• Over three-quarters of respondents (77%) rated the event as good or excellent.
• All but one respondent (95%) indicated they would definitely or possibly use the information they learned at the Roundtable.
• Over two-thirds of respondents rated each section of the event as useful or very useful, and up to 90% indicated specific topics (such as Inclusive Evaluation Essentials) were very useful.
• Almost two-thirds (64%) had previously attended an Evaluation Roundtable event, and 87% indicated they were likely or very likely to attend future sessions.
Respondents also added comments about the event that demonstrate the enthusiastic appreciation
that participants had for the event, and the ways in which they found it was valuable. However, some
respondents indicated that the material was too basic, and others desired additional in-depth
information about working with people with disabilities or other communities. Selected comments
follow:
I think this was the best roundtable yet. There was lots of practical advice that everyone can use on a daily basis. As the presenters said, if you're serving the general public, you are serving individuals with disabilities (or diverse abilities as one presenter said).

Very helpful!

This has nothing to do with [the trainer] and the quality of her presentations. [The trainer] continues to be the finest consultant I have ever worked with and the one I have learned the most from. It was just that this particular roundtable did not meet my needs having to do with practical information I can take back and apply to our evaluation work. Far too much time was used in having the 3 presenters, who are not experts in evaluation, share their experiences. While there is a great deal to be learned from the experiences of others, in this short period of time, particularly the experiences of [presenters] were so specific to their constituents that it didn't apply broadly enough. I also could not get past the repeated lack of use of People First language by [presenter] throughout her presentation. I wonder if she is even aware she uses such language. The exercise in Universal Design was confusing. I was never sure what we were supposed to accomplish. It felt rushed with no opportunity for discussion as a group. Having worked in the disability field for over 25 years, I was hoping for more practical discussion of challenges we face in our evaluation work and perhaps feedback from others on how they would deal with similar challenges. That said, there was great information on People First language and Universal Design that I hope others who have not worked in the field would find useful in their work. HFPG is a true partner to nonprofits and while this presentation did not meet my needs, I very much appreciate all the support that [trainer] and the Foundation provide to nonprofits.

[Responding to the question: Do you think you’ll use any of the information you learned at the Evaluation Roundtable?] A review of our existing processes in which we construct evaluation tools that capture different styles of participation. Rethinking how we ask about ability status; e.g., it is helpful to know what a person's ability status is to better support them in our programs, however we know that we cannot expect or enforce such expectations.

I will use the information as we design surveys or other evaluation tools. Items discussed at the roundtable will help us to plan for more successful and inclusive evaluation.

Presenters were correct in saying that most organizations probably serve people with disabilities, even if we aren't aware of every case. This is certainly true for my institution, and I hope to use the information from the Roundtable to increase the overall inclusiveness of our evaluation methods.
Overall, respondents indicated that the Evaluation Roundtable was a positive experience and
offered valuable tools and information that could be used at participants’ respective
organizations.