Evaluation of the Bay Area STEM Summer Institute Final Report
Submitted to: Growth Sector
Beverly Farr, Ph.D., Principal Investigator
Jennifer Laird, Ph.D., Project Director
Jessica Robles, M.A., Project Coordinator
MPR Associates, Inc.
2150 Shattuck Avenue, Suite 800
Berkeley, CA 94704
April 2012
Contents
Executive Summary
  Evaluation Questions and Methods
  Evaluation Results
Introduction
Evaluation Design Overview
  Evaluation Goals and Audiences
  Data Collection
Evaluation Results
  Background of STEM Summer Institute Participants
  Achievement in Math and Science
  Interest in STEM Teaching Careers and Ability to Serve as STEM Learning Leader
  Quality of the STEM Summer Institute
  Increase in Capacity of After-school Programs to Provide Experiential Science
Conclusions and Recommendations
  Recommendations
Appendices
Appendix A: Evaluation Plan
Appendix B: Student Survey Instruments
Appendix C: Interviews, Focus Group and Observation Protocols
Appendix D: Supervisor Survey Instruments
Executive Summary
MPR Associates, Inc. (MPR) of Berkeley, California conducted an evaluation of the 2011
Bay Area STEM Summer Institute. The Institute is an academic and career preparation
program intended to encourage and support participants from the Bay Area Teacher
Pathway in becoming effective STEM learning leaders and in pursuing careers as STEM
teachers. The Bay Area Teacher Pathway Initiative, which is part of a larger California
Teacher Pathway Initiative, is, in part, a response to the projected shortage in California
of qualified teachers, particularly in math and science. One of the solutions to this prob-
lem is an effort to expand the pool of qualified teaching candidates by training disadvan-
taged youth, veterans, and displaced workers to become teachers in their communities.
Specifically, the five-year pathway incorporates Community College (CC) and California
State University (CSU) training, while participants pursue a teaching credential. Further,
short-term teaching employment in after-school programs during the pathway serves as a
foundation for accelerated teacher training, preparing Teacher Pathway participants for
positions in local school districts.
The STEM Summer Institute is designed to provide California Teacher Pathway (CTP) participants with a combination of science and math coursework, professional development, and direct work experience in delivering hands-on, inquiry-based STEM activities to children and youth in out-of-school-time (OST) programs. Developed and led by Growth Sector, and supported by the S.D. Bechtel, Jr. Foundation, the Noyce Foundation, and the David and Lucile Packard Foundation, the STEM Institute is a broad educational partnership that includes San Francisco State University (SFSU), California State University East Bay (CSUEB), City College of San Francisco (CCSF), Chabot College, the California School-Age Consortium (CalSAC), the Exploratorium, the Lawrence Hall of Science (LHS), and local OST program providers.
For most participants, the STEM Summer Institute occurred during their first summer in the Teacher Pathway program, after they had completed one or two semesters. A number of follow-on activities extended the Institute throughout the year, including a weekend workshop at the Marin Headlands Institute. Thirty-two participants enrolled in the San Francisco program, and thirty enrolled in the East Bay program. While the San Francisco and East Bay STEM Summer Institutes share the same fundamental goals (e.g., increasing the number of disadvantaged participants who go into STEM teaching), they were not identical.
Evaluation Questions and Methods
The MPR evaluation was designed to answer questions about the development of partici-
pants’ knowledge and comfort with relevant areas of math and science, their capacity for
teaching, and their interest in STEM career fields. It was also designed to examine the
quality of the STEM Summer Institute and to gauge whether it increased the capacity of
participating OST programs to provide experiential science activities. To conduct the
study, a research design was developed that included a series of surveys for participants
and summer camp supervisors; observations of professional development activities, math
and science courses, and summer and fall placements; and focus groups with participants.
In addition, we reviewed participants’ course grades.
Evaluation Results
Recruitment
Program designers were successful in recruiting participants from targeted un-
derrepresented groups.
Increase in participants’ knowledge and motivation for teaching
Results from a review of participants’ course grades and from survey responses
demonstrated that participants did well and reported gains in math and science
knowledge and interest. They also reported increased comfort with some areas of
math and science and an increased familiarity with and understanding of inquiry-
based learning.
Two questions posed as part of the evaluation design related to effects of the Institute on participants’ motivation to pursue careers in teaching math or science. Results from the three surveys seemed to indicate that interest levels declined. To explore this further, we added an item to the last survey asking participants whether they had learned that they would like to teach math or science, and a majority indicated they would. In addition, an analysis of open-ended survey items and focus group data showed an openness to and interest in teaching math or science. By the end of the Institute, participants also felt more comfortable with their level of knowledge for teaching science in grades K-8.
Participants reported gains in teaching skills: virtually all participants reported
that they improved “somewhat” or “a lot” over the summer in their ability to
teach math and science to children.
Participants also reported increased knowledge of STEM careers.
Quality of the Institute
Participants generally had very favorable reactions to many components of the
STEM Summer Institute. They especially appreciated the experience of teaching
in the placement sites. They reported that leading or assisting with lessons in the
summer camps increased their confidence in teaching.
Reactions to the courses varied more than reactions to any other component of the Institute. Some of the courses were ones that the designers of the Institute did not control.
Students uniformly responded positively to the professional development they
received. They expressed appreciation for the CalSAC professional develop-
ment—especially the hands-on, interactive aspects—and also gave high marks to
the Headlands Institute.
The program structure of the summer placement sites varied between the two cohorts: participants in San Francisco took independent responsibility for providing instruction, while participants in the East Bay served as aides to teachers in the summer camps at LHS. In both cases, however, the instruction or assistance provided by Institute participants was generally rated as quite strong.
The majority of participants indicated they would re-enroll if given the oppor-
tunity.
Some logistical issues emerged in the East Bay operation, with students indicating that they were unclear about some policies and practices. This seemed to be due in part to management weaknesses on the part of the local partner organization.
Increase in Capacity of After-school Programs
Supervisors reported that having STEM Institute participants increased the ca-
pacity of their organizations to provide STEM activities. While reactions varied
somewhat between the cohorts, they generally rated the participants quite high.
Supervisors reported that the young students were engaged by the activities led or
assisted by the Institute participants.
Many of the supervisors also participated in a one-day training provided by
CalSAC and the LHS, and they rated it favorably and indicated a desire for more.
Introduction
MPR Associates, Inc. (MPR) of Berkeley, California conducted an evaluation of the 2011
Bay Area STEM Summer Institute. The Institute is an academic and career preparation
program intended to encourage and support participants from the Bay Area Teacher
Pathway in becoming effective STEM learning leaders and in pursuing careers as STEM
teachers. The Bay Area Teacher Pathway Initiative, which is part of a larger California
Teacher Pathway Initiative, is, in part, a response to the projected shortage in California
of qualified teachers, particularly in math and science. One of the solutions to this prob-
lem is an effort to expand the pool of qualified teaching candidates by training disadvan-
taged youth, veterans, and displaced workers to become teachers in their communities.
Specifically, the five-year pathway incorporates Community College (CC) and California
State University (CSU) training, while participants pursue a teaching credential. Further,
short-term teaching employment in after-school programs during the pathway serves as a
foundation for accelerated teacher training, preparing Teacher Pathway participants for
positions in local school districts. These opportunities to work in out-of-school-time
(OST) settings began in the STEM Summer Institute and continued throughout the path-
way.
The broader California Teacher Pathway (CTP) program was developed in nine sites
around the state including Chico, the East Bay, Fullerton, San Diego, San Francisco, and
Los Angeles (4 sites). The program has received over $6 million in funding to date, including $3 million in funding from the Governor’s office, $2 million from the David and Lucile Packard Foundation, and over $1 million from local government and private funders. When
fully implemented, the program will train up to 600 new teachers each year.
The STEM Summer Institute is designed to provide CTP participants with a combination
of science and math coursework, professional development, and direct work experience
in delivering hands-on, inquiry-based STEM activities to children and youth in OST pro-
grams. Developed and led by Growth Sector, and supported by the S.D. Bechtel, Jr.
Foundation, the Noyce Foundation, and the David and Lucile Packard Foundation, the
STEM Institute is a broad educational partnership, including San Francisco State Univer-
sity (SFSU), California State University East Bay (CSUEB), City College of San Fran-
cisco (CCSF), Chabot College, California School-Age Consortium (CalSAC), the
Exploratorium, Lawrence Hall of Science (LHS), and local OST program providers.
The STEM Summer Institute was piloted in both San Francisco and the East Bay in the
summer of 2011, with the following primary goals:
- To encourage Teacher Pathway participants to pursue careers in math and science teaching, especially through earning the multi-subject credentials sought by K-8 schools;
- To improve the readiness and preparation of Teacher Pathway participants to succeed in math and science coursework at participating CSUs, including earning Foundational Math and Science credentials;
- To improve the ability of Teacher Pathway participants to serve as STEM learning leaders;
- To develop a sustainable model that can be replicated and implemented statewide throughout the CSU system; and
- To improve the capacity of participating OST programs to provide experiential science learning to participants.
For most participants, the STEM Summer Institute occurred during their first summer in the Teacher Pathway program, after they had completed one or two semesters. A number of follow-on activities extended the Institute throughout the year, including a weekend workshop at the Marin Headlands Institute.

Thirty-two participants enrolled in the San Francisco program, and thirty enrolled in the East Bay program. While the San Francisco and East Bay STEM Summer Institutes share the same fundamental goals (e.g., increasing the number of disadvantaged participants who go into STEM teaching), they are not identical. The designs of the two programs are outlined in the table below.
Growth Sector contracted with MPR to conduct an external evaluation of the pilot STEM
Summer Institute. MPR has conducted at least 20 current or recent science, technology,
engineering, and mathematics (STEM) education research projects, including evaluations
of such National Science Foundation (NSF) funded projects as the STEM Equity Pipe-
line, led by National Alliance for Partners in Equity; Climate Change Education Partner-
ship, led by the University of San Diego; and Innovative Technology Experiences for
Participants and Teachers (ITEST) Learning Resource Center, led by the Education De-
velopment Center. This final report gives a brief overview of the STEM Summer Institute
evaluation design and reports results from surveys and qualitative data collections.
San Francisco STEM Summer Institute Program

Summer: 9 weeks, 2 cohorts, 32 participants
- Weeks 1-2, Summer Kickoff: Math and Science Teacher Conference and Exploratorium Inquiry Institute
- Weeks 3-8: Participants took a physical science lab (1 unit), practical math (3 units), and a physical science online course (3 units).
- Weeks 3-9: Participants worked in summer camp settings at least one day a week leading STEM activities.
- Weeks 4-9: Participants had field trips and professional development on Fridays.

Fall:
- Placement in after-school settings
- Headlands Institute
- Teaching methods seminar
- Courses*

East Bay STEM Summer Institute Program

Summer: 10 weeks, 2 cohorts, 30 participants
- Week 1, Summer Kickoff: Math Science and CTE Conference at CCSF, Exploratorium Institute and Lawrence Hall of Science (LHS) Workshop
- Week 2: Participants took the Engineering 10 course (1 unit).
- Weeks 3-10: After the Engineering 10 course ended, participants took a Chemistry lecture and lab (4 units) and a Math Applications course.
- Weeks 3-10: Participants served as instructional assistants at LHS summer camps, half-days four days a week.
- Weeks 3-10: Participants had field trips and professional development on Fridays.

Fall:
- Placement in after-school settings
- Headlands Institute
- Courses*

* Participants enrolled in fall courses as part of their Teacher Pathway program. The STEM Institute did not design these classes, although Growth Sector helped fund some courses with non-allocated Institute funds.
Evaluation Design Overview
In this section, we present a brief overview of the evaluation design created for the study
of the Bay Area STEM Summer Institute as part of the Teacher Pathway Initiative. While
the primary focus was the Summer Institute, the evaluation was designed to collect data
related to fall activities that followed the Summer Institute, including fall courses and fall
placements in afterschool settings.
Evaluation Goals and Audiences

The goal of the study was to examine both the implementation and impact of the 2011
Summer Institute as well as fall activities. To examine impact, the evaluation included
measures of effects on participant achievement outcomes (course grades and survey report items), on attitudes toward STEM areas of study, and on motivation to work in STEM careers. The work was conducted between February 2011 and March 2012.
The evaluation was guided by seven questions—four related to the impact of the STEM
Summer Institute on Teacher Pathway participants, and three related to the STEM Sum-
mer Institute itself—its effects on the summer and after-school programs in which Insti-
tute participants work and its potential to serve as a model for other programs in the
larger state initiative.
1. What evidence is there that participation in the STEM Summer Institute results in in-
creases in participant knowledge and achievement in math and science?
2. What effects does participating in the STEM Summer Institute have on participants’
attitudes towards and motivation to pursue careers in math and science teaching?
3. What effects does participation in the STEM Summer Institute have on participant
knowledge and competence in STEM-based experiential learning?
4. What effects does participating in the STEM Summer Institute have on participants’
ability to serve as STEM learning leaders?
5. What is the level of quality of the STEM Summer Institute, i.e., how well does it
measure against standards of effective professional development?
6. In what ways does the STEM Summer Institute improve the capacity of after-school
programs participating in the initiative to provide experiential science?
7. In what ways does the STEM Summer Institute serve as a model for the seven other
current and developing Pathway programs around the state?
The primary audiences for this study are Growth Sector, other partners involved in the
Institute, and the Bechtel and Noyce Foundations, the Institute’s funders. Secondary audiences include those interested in teacher preparation programs, out-of-school time programs, and K-12 STEM education.
Data Collection
At the beginning of the evaluation study, MPR conducted a number of activities to gain a
strong understanding of the conceptual underpinnings and implementation plan for the
program. The activities included conducting background interviews with program de-
signers and instructors, attending planning meetings, and developing a logic model and
data collection matrix (see pages 7-10 of the Evaluation Plan in Appendix A for more de-
tails). This work laid the groundwork for the summer, fall, and winter data collection ac-
tivities listed below. (All surveys and focus group protocols are included in Appendices B-D.)
1. Pre-Institute Participant Survey
Constructs assessed: Confidence as a math and science learner; motivation for
math and science learning; comfort with math and science topics; comfort teach-
ing math and science topics; confidence and knowledge to teach math and sci-
ence; beliefs about teaching math and science; familiarity with inquiry-based
learning; career plans, and expectations for the STEM Summer Institute.
Both online and paper versions of the survey were administered between May 19,
2011 and the beginning of the Institute, June 1, 2011.
Response rates:1 East Bay—97%; San Francisco—84%
2. Post-Institute Participant Survey
Constructs assessed: All constructs assessed in the Pre-Institute Participant Sur-
vey as well as feedback on the STEM Summer Institute; usefulness of profes-
sional development activities and courses; frequency of contact with Support
Specialists, and helpfulness of Support Specialists.
Support Specialists administered paper versions of the post-survey to participants
during the final days of the Institute.
Response rates:2 East Bay—100%; San Francisco—93%
1 The pre-Institute survey response rate was calculated based on the number of participants who were enrolled in the program when the survey was administered: 27 participants in San Francisco and 30 in the East Bay. Five additional participants, all in their second year of the Teacher Pathway program, joined the San Francisco program a few weeks later and participated in the employment portion of the Institute, but not the courses. Those participants were not administered surveys.
2 The post-Institute survey response rate was calculated based on the number of participants who were enrolled in the program when the survey was administered: 30 participants in San Francisco
3. Fall Participant Survey
Constructs assessed: All constructs assessed in the Pre- and Post-Institute Partici-
pant Survey as well as feedback on Institute activities that extended into the fall
and expanded questions about interest in math and science teaching careers.
Support Specialists administered paper versions of the fall survey to participants
in the last week of the fall term.
Response rates:3
East Bay—92%; San Francisco—96%
4. Summer Participant Focus Groups
Constructs assessed: Personal goals for participating in the Institute; extent to
which those goals were met; experience with and perceptions of various compo-
nents of the Institute (e.g., courses, placements); effects on career planning, and
suggested improvements for the Institute.
At the midpoint of the STEM Summer Institute, two research associates from
MPR conducted two one-hour focus groups. One focus group included eight participants from San Francisco, and the other included nine from the East Bay.
Support Specialists helped to recruit participants for the focus groups and were
asked by MPR to invite participants with a range of Institute experiences.
5. Winter Participant Focus Groups
Constructs assessed: The same constructs assessed in the Summer Participant Fo-
cus group with particular emphasis on evolving career goals and with additional
questions about fall Institute extension activities.
In February, MPR conducted two one-hour focus groups. MPR provided the
Support Specialists with a randomized ordering of participants to invite to the
group, asking that the Support Specialists confirm at least six participants. The
San Francisco focus group included nine participants and the East Bay group in-
cluded five participants.
6. Summer Observations of Courses, Professional Development Activities, and Placements
Constructs assessed: Task demand; manipulative usage; group work (frequency
and nature); task type; knowledge needed; connections within math and science;
connections to the participants’ lives; real-world connections; classroom discus-
sion; instructional decisions; conversations with peers, and lesson coherence.
The following observations were conducted by MPR:
and 29 in the East Bay. One participant who started the East Bay program was dismissed midway through the summer.
3 The fall survey response rate was calculated based on the number of STEM Summer Institute
participants who were enrolled in the Teacher Pathway program when the survey was adminis-
tered: 26 participants in San Francisco and 26 in the East Bay.
Pre-Institute:
CalSAC and Exploratorium pre-Institute professional development work-
shops for program staff
Beginning of Institute:
Days 1 and 2 of the STEM Summer Institute, which included Exploratorium
and LHS workshops for the Institute participants.
Midway through the STEM Institute:
East Bay:
One day at the Lawrence Hall of Science observing Institute participants
serving as teaching assistants in the summer camps.
Math Applications course
San Francisco:
Four of the summer camp sites in San Francisco where Institute participants
were leading STEM activities.
Math 108B
End of the STEM Institute:
East Bay:
Chemistry Lab
San Francisco:
Physical Science Lab and Teaching Methods Course
7. Fall Placement Observations
Constructs assessed: Instructional decisions; incorporation of STEM activities.
The evaluators observed two afterschool programs in San Francisco and three in
the East Bay, each with one Institute participant working at the program.
8. Summer and Fall Course Grades
MPR obtained participants’ summer and fall course grades from Growth Sector.
9. Survey of Summer Placement Supervisors
Constructs assessed: Performance of Institute participants as STEM learning
leaders at the summer placement sites; effectiveness of the professional devel-
opment activities provided to the supervisors, and interest in working with STEM
Summer Institute participants in the future.
Administered online during October 2011.
Response rate:4 East Bay—73%; San Francisco—62%
4 Thirteen of 21 San Francisco supervisors completed the survey for a response rate of 62%, while
24 of 33 East Bay supervisors completed the survey for a response rate of 73%.
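The response-rate arithmetic in footnote 4 can be checked with a short script (an illustrative sketch; the counts come from the footnote, and the function name is ours):

```python
# Check the supervisor survey response rates reported in footnote 4.
def response_rate(completed, surveyed):
    """Return the response rate as a whole-number percent."""
    return round(100 * completed / surveyed)

# San Francisco: 13 of 21 supervisors completed the survey.
assert response_rate(13, 21) == 62
# East Bay: 24 of 33 supervisors completed the survey.
assert response_rate(24, 33) == 73
```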
Evaluation Results
The discussion of evaluation results is organized, in general, by the research questions,
though they are combined in some cases to allow for better organization of the analysis
results. In the process of analyzing and reporting the data from the research activities,
(e.g., participant surveys, participant focus groups, observations, supervisor surveys), we
strove to triangulate findings whenever possible. For example, when we synthesized fo-
cus group data on a topic on which we also surveyed participants, we integrated the data
with the discussion of survey findings. In mixed methods evaluations, qualitative data can
offer context or possible explanations for survey findings.
The participant survey results presented here are generally based on three data points:
pre-Institute (baseline), post-Institute (first follow-up), and fall (second follow-up). To be
included in the analysis of the participant survey data, Institute participants had to com-
plete a survey at all three data points (N= 22 for the San Francisco program and N=20 for
the East Bay program). In addition, for a few items, there were not comparable items on
the pre- and post- (e.g., items about reactions to the Institute), so the data only reflect two
data points. However, for these items, we still maintained the same analysis sample. Five
second-year Teacher Pathway participants in San Francisco, who participated in the
teaching portion of the Institute, but not the courses, did not take the surveys.
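The analysis-sample rule described above (a participant is included only if they completed all three surveys) amounts to an intersection of the three respondent lists. A minimal sketch, with invented respondent IDs:

```python
# Hypothetical respondent IDs for each survey wave.
pre  = {"p01", "p02", "p03", "p04"}
post = {"p01", "p02", "p04"}
fall = {"p01", "p03", "p04"}

# Only participants who completed all three surveys enter the analysis.
analysis_sample = pre & post & fall
assert analysis_sample == {"p01", "p04"}
```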
The summer placement supervisors were surveyed in the early fall. The sample sizes are
also relatively small: 13 in San Francisco and 24 in the East Bay. The survey data were
analyzed separately for the two programs because the programs differed substantially
(see the program designs outlined above). With relatively small sample sizes, MPR recommends treating a change of at least 15 percentage points as meaningful. That threshold translates to approximately three participants at a site changing on an indicator, and to about three supervisors changing on an indicator in the East Bay and about two in San Francisco. These small sample sizes, together with the differences in cohort demographics and experiences and the difficulty of using precise terminology on survey items for a complex project with many moving parts, necessitate a cautious approach to interpreting the results.
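The 15-percentage-point rule of thumb can be translated into respondent counts with simple arithmetic (a sketch using the sample sizes reported above; the function name is ours):

```python
# How many respondents does a 15-percentage-point change represent?
def respondents_for_change(sample_size, points=15):
    """Number of respondents a given percentage-point shift corresponds to."""
    return sample_size * points / 100

# Participant analysis samples: 22 in San Francisco, 20 in the East Bay.
assert round(respondents_for_change(22)) == 3  # about three SF participants
assert round(respondents_for_change(20)) == 3  # three East Bay participants

# Supervisor samples: 13 in San Francisco, 24 in the East Bay.
assert round(respondents_for_change(13)) == 2  # about two SF supervisors
assert respondents_for_change(24) == 3.6       # three to four East Bay supervisors
```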
Background of STEM Summer Institute Participants
Participants come from underrepresented backgrounds
The California Teacher Pathway Initiative, of which this Institute is a component, aims to
encourage and prepare underrepresented young adults to pursue careers in teaching. Fig-
ure 1 suggests that the San Francisco and East Bay programs have been successful in re-
cruiting underrepresented young adults. In both programs, the largest single racial/ethnic
group is Hispanic (36% of San Francisco participants and 42% of East Bay participants).
In San Francisco, the second largest group is Biracial/Multiethnic/Other (28%), while in
the East Bay the second largest group is Black (27%). Close to 50% of participants in
each program have at least one parent who was not born in the United States.
A minority of STEM Summer Institute participants were employed for pay prior to the Institute (38% of San Francisco participants and 27% of East Bay participants); some had been homeless (24% of San Francisco participants and 12% of East Bay participants), and many were receiving government assistance such as food stamps, low-income housing, or Medi-Cal (48% of San Francisco participants and 39% of East Bay participants). Notable differences between the participant populations in the two programs include a higher percentage of females in the East Bay program (73% versus 43% in San Francisco) and older participants in the San Francisco program (24% are 24 years or older versus 12% in the East Bay).
Figure 1: STEM Summer Institute Participants (percent)
N = 25 for San Francisco, N = 26 for East Bay.

Characteristic                                   San Francisco   East Bay
Hispanic                                               36            42
Biracial/Multiethnic/Other                             28             8
Black                                                  16            27
Asian or Pacific Islander                              16            12
White                                                   4            12
Female                                                 44            73
Age: 17–19                                             16            54
Age: 20–24                                             60            36
Age: Over 24                                           24            12
Has high school diploma                                84            85
English is first language                              64            62
Someone in family is college grad                      52            42
Receiving government assistance                        48            39
Has an immigrant parent                                44            42
Returning to school after a period of 1+ year          44            42
Currently employed for pay                             38            27
Has been homeless                                      24            12
Has been convicted of a crime                          17            12
Is a single parent                                     16            28
Has history of drug/alcohol abuse                      12             0
Achievement in Math and Science
Data related to this question come from participants’ summer course grades and their sur-
vey responses.5 Grades are not a direct measure of knowledge or learning. They are,
however, an indicator of the extent to which participants successfully met the expecta-
tions and demands of the course. Similarly, participants’ self-reports of their mathemat-
ics and science knowledge and achievement are not a direct measure of that knowledge
and achievement. With the pre- and post-Institute surveys, however, we can compare re-
sponses to determine whether participants perceived that their math and science
knowledge and achievement increased.
Participants earned mostly A’s and B’s in summer courses
Figure 2 shows that in three of the four summer courses, most participants earned at least
a B.6 Among the San Francisco participants who took the Physical Science Lab, 59%
earned an A and 26% earned a B. For Math 108B, 80% of the San Francisco participants
earned an A, and 20% earned a B. In the East Bay, 29% earned an A and 35% a B in
Chemistry 10. The only course in which a majority did not earn at least a B was the Engineering 10 course: none earned an A, 29% earned a B, 48% a C, 19% a D, and 3% an F.
5 At the beginning of the contract, MPR was asked to recommend a pre- and post-Summer Institute mathematics or science assessment. We spent considerable time interviewing Institute designers and instructors to determine what the content of the math and science instruction in the summer courses would be, and researching a wide range of existing instruments, including college placement tests (e.g., Accuplacer, Wonderlic, CASAS, TABE). In the end, we determined that we could not identify an appropriate instrument that would be aligned closely enough with what was being taught in the Institute's courses and workshops, that would be reasonable to administer, and that would have the potential for yielding the desired results. We presented our recommendation to Project Managers, and they agreed that neither a math nor a science assessment was appropriate for this evaluation of the STEM Summer Institute. For more details, see pages 8 and 9 of the Evaluation Plan in Appendix A.
6 Twenty-seven participants took the San Francisco Physical Science Lab, and 15 took the San Francisco Math 108B course. Thirty-one participants took the East Bay Chemistry 10 course, and 31 took the East Bay Engineering 10 course.
Evaluation Questions: What evidence is there that participation in the STEM Summer Institute results in increases in participant knowledge and achievement in math and science?
What effects does participation in the STEM Summer Institute have on participant knowledge and competence in STEM-based experiential learning?
Figure 2. Grades in summer courses
N= 27 for Physical Science Lab, 15 for Math 108B, and 31 each for Chemistry 10 and Engineering 10.
Participants reported gains in math and science knowledge and interest during the summer. Reported gains generally continued into the fall.
The post-Institute survey asked participants the extent to which their math and science
knowledge and interest increased as a result of the STEM Summer Institute. The fall
survey repeated the questions, asking the extent to which these things increased as a re-
sult of their summer and fall experiences. In the San Francisco program, in both the post-
Institute survey and the fall survey, almost all participants reported that they improved
“somewhat” or “a lot” in their math and science content knowledge (89-100%; Figure 3).
In the East Bay program, at least 70% of participants reported gains in science
knowledge and math knowledge at the two data points (Figure 4). East Bay participants
reported a notable additional increase in math knowledge from the post-Institute survey
to the fall survey (70% to 89%).
The majority of participants in both the San Francisco and East Bay programs reported
increased interest in math and science as a result of the STEM Summer Institute (post-
Institute survey), and as a result of their summer and fall experiences (fall survey) (Fig-
ures 3 and 4). In both the San Francisco and East Bay programs, a greater percentage of
participants became more interested in science compared with math over the summer
(post-Institute survey). Still, at least 60% of participants in each program reported in-
creased interest in math over the summer. When comparing post-Institute responses to
the fall responses, San Francisco participants reported a greater increase in science
interest in the post-Institute survey compared with the fall survey that asked about their
experiences in both summer and fall. From focus groups we conducted in the winter,
following fall activities, we learned from the San Francisco participants that they generally did
not like the online science course they took in the fall. They indicated that the summer
online science course—though they did not appreciate it much either—was more
acceptable because they had the support of a teaching assistant, but without that support,
they found the fall course unappealing and difficult to keep up with. This may account for
the lower rating in “becoming more interested in science.” An illustrative comment from
a student reflected this possibility: “When I walked through the science process, I would
say ‘yeah, I could teach science,’ and I was interested in the beginning. The online class
turned me off to the subject. Very negative experience. Many of us shared that (feeling).”
Figure 3: Math and science knowledge and interest post-Institute and fall—San Francisco
N= 22 for San Francisco, N=20 for East Bay.
Percent responding “somewhat” or “a lot”:

                                                     Post-Institute   Fall
Became more interested in science.                         95          68
Became more interested in math.                            82          84
Improved my science content knowledge in general.         100          89
Improved my math content knowledge in general.             95         100
Figure 4: Math and science knowledge and interest post-Institute and fall—East Bay
N= 22 for San Francisco, N=20 for East Bay.
Participants reported increased comfort with some math and science content areas over the summer and into the fall. There was some decline in the fall.
By the end of the Institute, more participants reported being “comfortable” or “very com-
fortable” with some specific math and science topics (Figures 5 and 6). It should be noted
that we did not expect gains in all of the topics included in the figure, as the summer
coursework did not cover them all. They were all listed in the survey, in part, because at
the time of survey development, the course curriculum was not finalized.
As shown in Figure 5, notable gains were found in San Francisco in geometry and
measurement between the pre- and post-Institute surveys (56% reported being comfortable
with this topic before the Institute versus 75% after the Institute), graphing (72% versus
86%), engineering (18% versus 42%), and technologies used in science (53% versus 85%).7
Notable declines in comfort between the post-Institute and fall surveys were observed
for biology (70% versus 50%) and engineering (42% versus 27%).
7 San Francisco participants took a physical science course during the summer, but “physical science” was not specifically listed on the survey. Physical science typically covers basic principles of physics, chemistry, and earth science. It is possible that participants did not recognize the terms presented in the survey and would have reported increased comfort in “physical science” if it had been asked about with more specificity.
Figure 4 data (percent responding “somewhat” or “a lot”):

                                                     Post-Institute   Fall
Became more interested in science.                         95          84
Became more interested in math.                            60          74
Improved my science content knowledge in general.          90          84
Improved my math content knowledge in general.             70          89
Figure 5: Comfort with math and science topics—San Francisco
N= 22
As shown in Figure 6, by the end of the Institute, more East Bay participants reported
comfort with statistics and probability (33% reported being comfortable with this topic
before the Institute versus 50% after the Institute), graphing (70% versus 90%), chemistry
(11% to 53%), and technologies used in science (58% to 90%). Between the end of
summer and the fall, fewer East Bay participants reported comfort in Earth Science (95%
to 47%) and Biology (83% to 47%).
Figure 5 data (percent “comfortable” or “very comfortable”):

                               Pre   Post   Fall
Technologies used in science    53     85    90
Technologies used in math       85     86    95
Engineering                     18     42    27
Chemistry                       20     37    23
Biology                         72     70    50
Earth Science                   85     84    73
General Science                 90     90    91
Statistics and Probability      42     42    55
Geometry and Measurement        62     81    73
Graphing                        72     86    82
Algebra and Functions           95     95    95
Number and Operation Sense     100    100    95
Figure 6: Comfort with math and science topics—East Bay
N= 20
Participants reported increased familiarity with and understanding of the term, “inquiry-based learning.”
The term, “inquiry-based learning,” was used on the survey, rather than “experiential
learning” as specified in the evaluation question because it was agreed that the former
term was what was being used in the Institute. On the survey, items related to this term
included one that asked how familiar participants were with it and one that asked them to select
statements that characterized “inquiry-based learning.” We had drawn the statements
from a survey developed for an evaluation by another organization seeking to gain
information about depth of understanding related to the term. Some of the options were
specifically designed to test their understanding of nuanced differences. The survey
results did show that participants reported an increased familiarity (Figure 7), but the
statements did not work well to test the depth of their understanding. We, therefore, did
not include the item on the fall survey.
Figure 6 data (percent “comfortable” or “very comfortable”):

                               Pre   Post   Fall
Technologies used in science    58     90    90
Technologies used in math      100     95    90
Chemistry                       11     55    53
Engineering                     22     33    26
Biology                         61     83    47
Earth Science                   75     95    74
General Science                 90     95    79
Statistics and Probability      33     50    47
Geometry and Measurement        65     70    79
Graphing                        70     90    95
Algebra and Functions           90     89    79
Number and Operation Sense      95     85    89
Figure 7. Familiarity with “inquiry-based learning”
N= 22 for San Francisco, N=20 for East Bay.
An analysis of an open-ended item from the pre- and post-Institute surveys provided
more fruitful data about participants’ knowledge of inquiry-based learning. Following an
item asking whether they were familiar with the term, they were asked to describe it
briefly. For both the San Francisco and East Bay cohorts—for those who provided
responses—there was strong evidence that their understanding had increased. While the
pre-survey responses included statements like, “Trial and error, find what works,” the
post-survey items included many more responses, with generally correct descriptions,
such as, “It involves getting students to engage in observation, questioning,
hypothesizing, predicting, planning and investigation.” There were more
responses from San Francisco than East Bay participants, and their answers were
generally more detailed, but both cohorts showed evidence of gaining knowledge of this
important concept.
Figure 7 data (percent reporting familiarity with the term):

                Pre-Institute   Post-Institute
East Bay             19              95
San Francisco        10              47
Interest in STEM Teaching Careers and Ability to
Serve as STEM Learning Leader
These two questions are considered together in this section because many of the survey
results speak to both. For example, general teaching skills are needed for both a career as
a math or science teacher and as a STEM learning leader. We determined that consider-
ing them together still made sense with the fall data added.
Participants reported increased comfort with level of science knowledge needed to teach grades K-8. Gains found over the summer for both cohorts were also seen in the fall in San Francisco, and there was some decline in the East Bay.
By the end of the Institute, participants felt more comfortable with their level of
knowledge for teaching science in grades K-8. Among the San Francisco participants,
54% felt comfortable before the Institute, while 62% felt comfortable after the Institute
(Figure 8). The perception of increased comfort with their level of knowledge for teach-
ing science continued into the fall, when 71% reported comfort on this indicator. In the
East Bay, the change was much more dramatic: 13% felt comfortable before the Institute
while 89% felt comfortable after the Institute (Figure 9). The percentage dropped to 60%
in the fall, substantially higher than the pre-Institute level, but lower than the post-
Institute level. Participants’ comfort with the level of their knowledge related to the
amount of math needed to teach grades K-8 did not change appreciably across the three
data points, but they started the summer relatively high on this indicator: at least 90% of
participants in both programs entered the Institute comfortable with the amount of math
they knew for teaching elementary and middle grades.
Evaluation Questions: What effects does participating in the STEM Summer Institute have on
participants’ attitudes towards and motivation to pursue careers in math and
science teaching?
What effects does participation in the STEM Summer Institute have on
participants’ ability to serve as STEM learning leaders?
Figure 8: Comfort with math and science knowledge needed to teach K-8—San Francisco
N= 22. Percent reporting “comfortable” or “very comfortable.”

                                                              Pre   Post   Fall
Your level of knowledge for teaching science in grades K-8.    55    62     71
The amount of math you know for teaching grades K-8.           95    91     95

Figure 9: Comfort with math and science knowledge needed to teach K-8—East Bay
N= 20. Percent reporting “comfortable” or “very comfortable.”

                                                              Pre   Post   Fall
Your level of knowledge for teaching science in grades K-8.    13    89     60
The amount of math you know for teaching grades K-8.           90    85     90

Participants reported gains in teaching skills over the summer, and some persisted into the fall. There was some decline, but results were still generally high.
In addition to the math and science content knowledge needed for teaching elementary and
middle school students, STEM Summer Institute participants were asked about gains
related to teaching skills on the post-Institute survey and again on the fall survey. Virtually
all participants reported that they improved “somewhat” or “a lot” over the summer
in their ability to teach math and science to younger students (100% of San Francisco
participants and 90% of East Bay participants in the post-Institute survey; Figures 10 and
11). At the end of the summer, at least 85% of participants from both programs felt that
they improved their ability to increase learner engagement with math, could better help
program teams develop good teaching plans, and learned how to make math and science
exciting for students. By the end of the fall, participants continued to report that they had
improved their math and science teaching abilities over the course of the summer and fall
(responses ranged between 75% and 100% in the fall surveys).
Figure 10: Perceptions of teaching skills post-Institute and fall—San Francisco
N= 22. Percent reporting improvement “somewhat” or “a lot.”

                                                                              Post-Institute   Fall
Help program teams to develop good teaching plans.                                  86           75
Learn how to make math and science exciting for students.                           91           90
Improve my ability to increase learner engagement with math in my classroom.        95           85
Improve my ability to teach math and science to younger students.                  100          100

Figure 11: Perceptions of teaching skills post-Institute and fall—East Bay
N= 20. Percent reporting improvement “somewhat” or “a lot.”

                                                                              Post-Institute   Fall
Improve my ability to increase learner engagement with math in my classroom.        85           84
Help program teams to develop good teaching plans.                                  85           84
Learn how to make math and science exciting for students.                           90           72
Improve my ability to teach math and science to younger students.                   95           89

Participants were also asked, on all three surveys, about various dimensions of math and
science teaching. Figures 12 and 13 report the percentages of participants who “agreed”
or “strongly agreed” with each statement. In the San Francisco program, the percentage
of participants who said they would enjoy the challenge of teaching a new and difficult
math problem declined from 77% to 50% between the pre- and post-survey administrations,
but then increased to 76% in the fall survey.
Survey results showed a noticeable drop in the percentage of participants who responded
positively on various dimensions related to their science teaching abilities. Perhaps most
notably, there was a steady decline for the item “I would understand science concepts
well enough to be effective in teaching” (79% on pre-survey, 65% on post-survey, and
50% on the fall survey). The percentage of San Francisco participants who agreed with the
statement, “I would enjoy the challenge of teaching a new and difficult concept in sci-
ence” dropped from the pre-Institute survey to the fall survey (86% to 55%). A gain was
observed, however, in the percentage of San Francisco participants who felt they would
know the steps necessary to teach science concepts effectively (46% in the pre-Institute
survey to 64% in the fall survey).
As noted earlier, participants in the focus group in San Francisco indicated that they did
not like the experience of the fall online science course. In addition, they did not have the
opportunity to lead science activities in their fall placements. This may account in part for
the drop-offs on some items.
In the East Bay, the percentage of participants who agreed with the statement, “I would
feel secure about the idea of teaching math” increased from 60% in the pre-Institute sur-
vey to 79% in the post-Institute survey but declined to 50% of participants in the fall.
Fewer participants in the fall agreed with the statement, “I would enjoy the challenge of
teaching a new and challenging concept in math” compared to how they felt pre- or post-
Institute (45% in the fall compared with 65% pre-Institute and 68% post-Institute).
Figure 12: Dimensions of Math and Science Teaching—San Francisco
N= 22
In terms of East Bay participants’ perceptions of their own science teaching abilities,
there was a steady decrease in the percentage of participants who said they would wel-
come students’ science questions (95% in the pre-Institute survey, 85% in the post-
Institute survey, and 70% in the fall survey). The percentage of Institute participants who
“would enjoy teaching a new and difficult concept in science” grew from 65% to 80%
over the summer, but then dropped to 53% in the fall. Similarly, the percentage that felt
“secure about the idea of teaching science” increased from 55% to 74% over the summer,
but then dropped back to 55%. Finally, at the beginning of the Institute, 40% of East Bay
participants responded that they “would know the steps necessary to teach science con-
cepts effectively.” At the end of the summer, that percentage climbed to 75%. In fall it
dropped to 55%, which was nevertheless substantially higher than the pre-Institute measure.
Figure 12 data (percent who “agreed” or “strongly agreed”):

                                                                                     Pre   Post   Fall
I would know the steps necessary to teach science concepts effectively.               46    72     64
Generally, I would feel secure about the idea of teaching science.                    65    80     59
I would be able to answer students' science questions.                                68    75     62
I would understand science concepts well enough to be effective in teaching science.  79    65     50
I would enjoy the challenge of teaching a new and difficult concept in science.       86    80     55
When teaching science, I would welcome student questions.                            100   100     91
I would know the steps necessary to teach math concepts effectively.                  68    79     77
Generally, I would feel secure about the idea of teaching math.                       71    71     77
I would understand math concepts well enough to be effective in teaching math.        75    85     73
I would enjoy the challenge of teaching a new and difficult concept in math.          77    50     76
I would be able to answer students' math questions.                                   90    90     86
When teaching math, I would welcome student questions.                               100    95     91
Figure 13: Dimensions of Math and Science Teaching—East Bay
N= 20
Changes in intentions to teach or work with children did not show a clear pattern on the surveys. Focus group interviews and a new item on the fall survey revealed more openness and interest in teaching.
On the pre-Institute, post-Institute and fall surveys, participants were asked their career
plans with this item, “At this point, I plan to pursue a career in (Please check ONE only)”
and were presented 13 categories. Although participants were asked to mark just one
category, many marked more than one. For example, in the pre-Institute survey, the 21
San Francisco participants marked 39 careers. Due to the lack of clarity in the results, we
added an item to the fall survey. In the next section, data will be presented from a ques-
tion that was added to the fall survey to further examine how participation in the Insti-
tutes might have affected participants’ career interests. The results on that item, along
with data collected from summer and fall focus groups, suggest that through their experi-
ence in the STEM Summer Institutes, participants became more open to the idea of teach-
ing. In the section immediately below, however, we present the results from the earlier
survey item about the careers participants planned to pursue, which had decidedly mixed
results.

Figure 13 data (percent who “agreed” or “strongly agreed”):

                                                                                     Pre   Post   Fall
I would know the steps necessary to teach science concepts effectively.               40    75     55
I would be able to answer students' science questions.                                50    55     60
Generally, I would feel secure about the idea of teaching science.                    55    75     55
I would understand science concepts well enough to be effective in teaching science.  58    70     60
I would enjoy the challenge of teaching a new and difficult concept in science.       65    80     53
When teaching science, I would welcome student questions.                             95    85     70
I would know the steps necessary to teach math concepts effectively.                  55    70     60
Generally, I would feel secure about the idea of teaching math.                       60    79     50
I would enjoy the challenge of teaching a new and difficult concept in math.          65    68     45
I would understand math concepts well enough to be effective in teaching math.        65    75     60
I would be able to answer students' math questions.                                   80    90     70
Interestingly, in San Francisco, the percentage of participants who planned to pursue a
STEM teaching career declined over the course of the summer from 48% to 33%, and
was at 27% in the fall. Plans to pursue teaching in a subject other than math or science
also decreased over the summer but then returned to 24% in the fall. Interest in child de-
velopment careers also declined over the summer (from 57% to 33%), but then increased
to 45% in the fall. “Teaching in an afterschool program” was added as a response cate-
gory in the fall survey, after analysis of the pre- and post-Institute surveys raised ques-
tions about whether participants with that career interest were marking the “teaching,”
“child development,” or “other” response categories.
Figure 14: Career plans—San Francisco
N=22. Note that many respondents marked more than one category. All of the responses are counted in this figure. Thus, the summed percentages at each time point exceed 100%. Teaching in an afterschool program was only asked in the fall survey. This item was added after analysis of the earlier surveys led MPR to question whether participants were marking “teaching,” “child development,” or “other” for afterschool teaching career.
Figure 14 data (percent planning to pursue each career):

                                                    Pre   Post   Fall
Teaching in one of the STEM fields                   48    33     27
Teaching in a subject other than math or science     24    15     24
Teaching in an afterschool program                  n/a   n/a     27
Child development and family services                57    33     45
At this point, I am undecided.                        5    14     14
Other                                                52    43     41

Figure 15 presents the San Francisco career-plan survey item results in a different manner
by combining the three teaching categories in Figure 14 into one category, “Careers in
Teaching.” “Careers in Child Development and Family Services” is maintained as a
separate category. “Careers in STEM” combines teaching in one of the STEM fields,
engineering and design, health science and medical technology, and information
technology. “Careers in Teaching, Child Development or STEM” includes all three of the
previous categories. The percentage of San Francisco Institute participants who reported
wanting to pursue a teaching career decreased from 67%
pre-Institute to 43% post-Institute. This figure increased to 56% in the fall. Prior to the
Institute, 48% of participants wanted to pursue a STEM career, including teaching a STEM
subject and careers such as engineering and medical science.
That percentage dropped to 23% in the fall. When the previous categories shown in Fig-
ure 15 were combined into one category, 95% of San Francisco participants reported
planning to pursue a career involving teaching, child development, or STEM at the be-
ginning of the Institute. That figure dropped to 71% post-Institute and then increased to
86% in the fall.
Figure 15: Career plans combined—San Francisco
N=22. “Careers in Teaching” combines the following career categories: teaching in one of the STEM fields, teaching in a subject other than math and science, and teaching in an afterschool setting (fall only). “Careers in Child Development and Family Services” includes just that single category. “Careers in STEM” combines: teaching in one of the STEM fields, engineering and design, health science and medical technology, and information technology. “Careers in Teaching, Child Development or STEM” includes all of the categories in “Careers in Teaching,” “Careers in Child Development and Family Services,” and “Careers in STEM.”
In the East Bay, the one notable change in participant career plans as shown in Figure 16
is an increase between the pre-Institute and the fall surveys in the percentage of partici-
pants who wanted to pursue a career in Child Development (25% compared with 45%).
Combining the three teaching careers, child development careers, and STEM careers
(Figure 17) reveals that 55% of East Bay participants planned to pursue one of these
fields pre-Institute, and by the fall that figure increased to 85%.
Figure 15 data (percent planning to pursue each career category):

                                                    Pre   Post   Fall
Careers in Teaching                                  67    43     56
Careers in Child Development and Family Services     57    33     45
Careers in STEM                                      48    38     23
Careers in Teaching, Child Development or STEM       95    71     86
Figure 16: Career plans—East Bay
N=20. Note that many respondents marked more than one category. All of the responses are counted in this figure. Thus, the summed percentages at each time point exceed 100%. Teaching in an afterschool program was only asked about in the fall survey. This item was added after analysis of the earlier surveys led MPR to question whether participants were marking “teaching,” “child development,” or “other” for afterschool teaching careers.
Figure 17: Career plans combined—East Bay
N=20. See note in Figure 15 for a list of careers included in each category.
Figure 16 data (percent planning to pursue each career):

                                                    Pre   Post   Fall
Teaching in one of the STEM fields                   25    21     30
Teaching in a subject other than math or science     30    31     15
Teaching in an afterschool program                  n/a   n/a     25
Child development and family services                25    26     45
At this point, I am undecided.                       20    16     10
Other combined                                       45    47     50

Figure 17 data (percent planning to pursue each career category):

                                                    Pre   Post   Fall
Careers in Teaching                                  45    42     50
Careers in Child Development and Family Services     25    26     45
Careers in STEM                                      30    21     30
Careers in Teaching, Child Development, or STEM      55    63     85
Fall surveys and focus group interviews revealed more openness and interest in teaching.
To better understand the impact of the Institute on participants’ interest in teaching math
and science, an additional question was added to the fall survey. Participants were asked
the extent to which they learned, through their participation in the Institute, that they
would want to teach math or science. Figure 18 shows that the majority of Institute par-
ticipants from both sides of the Bay reported that they learned that they would like to
teach math or science (68% of San Francisco participants and 58% of East Bay).
Figure 18. Learned they would want to teach math or science
N= 22 for San Francisco, N=20 for East Bay.
Responses both on open-ended items on the fall survey and during the focus groups con-
ducted with both cohorts support the finding that participation in the Institute influenced
their goals for college and career. In an open-ended item on the fall survey, participants
were asked whether the experience changed their goals for college or career. Twenty out
of 22 of those in the San Francisco cohort who provided a response indicated not only
that it had influenced them, but many also indicated that it had helped them decide to go
into teaching or to work with youth. Sample comments include:
It has increased my willingness to become an educator.
I’ve always wanted to work with people and/or students. This pathway helped me
grasp a better “feel” for working with students.
Reassured me that education and youth work will always be my calling and passion
in life and will continue to strive to be a better student and be a better role model to
the youth in my community.
The East Bay participants responded similarly to this item: 14 out of 18 participants who
provided a response indicated that it had influenced them positively to think about teach-
ing and related jobs in STEM. One notable comment was: “The Teacher Pathway has
helped me focus more on becoming a teacher. This experience is great knowing that others
want to pursue the same career. This program helps in finding other school program
jobs too.”
Responses from focus groups drawn from each cohort also reflect positive inclinations
toward teaching. We include the following list from the East Bay cohort because it re-
flects all of their responses to this question and illustrates the degree of positivity and
what they gained:
I think I wanted to be a school counselor, now I’m interested in both.
I thought I wanted to be in an afterschool program, but now I think I want to be a
teacher.
It was a good experience at LHS. It gave me an idea of what it would be like to be a
teacher, even though we didn’t have that authority at LHS. It still gave me a good
idea about being a teacher.
I really enjoyed the summer. I thought it was a good experience for teaching, espe-
cially the young students, about science and animals. It was my first job in program
working with kids. I really liked it.
I really liked it. I have kids of my own, but they’re older. I liked working with the
young kids as a teacher. After 2 weeks I got to lead an activity, and it was cool. I
think it was a once in a lifetime opportunity. If I could do it again, I would.
I liked the experience. Working with kids is very comfortable for me. It helped me
think about a future career as a teacher.
I had experience teaching my own class in afterschool program. But this was differ-
ent because there were so many activities and supplies. It helped me think about in
the future when I have my own class and how to get the students to participate more.
It helped me to observe, will help me with my class. I still want to work with kids
and be a teacher.
Participants reported increased knowledge of STEM careers over the summer. Gains were generally maintained into the fall.
Participants were asked on all three surveys to indicate how much they knew about ca-
reers in STEM fields. Figures 19 and 20 show the percentage of participants who report-
ed knowing “a moderate amount” or “a lot.” In the San Francisco program, an initial
substantial increase was found in the percentage of participants who knew about careers
in engineering (10% pre-Institute versus 36% post-Institute), and careers in technology
(19% versus 50%). Those increases persisted through the fall. Knowledge of math ca-
reers among San Francisco participants did not change substantially between the pre- and
post-Institute surveys, but by fall, the growth was notable (33%, 38%, and 48%, respec-
tively). Knowledge of science careers steadily increased across the three survey periods
(14% for the pre-Institute survey, 27% for the post-Institute survey, and 36% for the fall
survey).
In the East Bay, initial gains in career knowledge were also found from pre- to post-
survey administration for engineering (28% versus 50%) and technology (32% versus
55%). Those gains were largely maintained through the fall. The percentage of East Bay
participants who reported knowing about science careers increased from the pre-Institute
survey to the post-Institute survey (20% to 39%) and settled at 32% in the fall survey.
Knowledge of math careers did not change appreciably over the three time points.
Figure 19: Knowledge of STEM careers—San Francisco
N= 22
Figure 20: Knowledge of STEM Careers—East Bay
N= 20
Figure 19 data (percent who knew “a moderate amount” or “a lot”):

                          Pre   Post   Fall
Careers in math?           33    38     48
Careers in engineering?    10    36     36
Careers in technology?     19    50     48
Careers in science?        14    27     36

Figure 20 data (percent who knew “a moderate amount” or “a lot”):

                          Pre   Post   Fall
Careers in math?           39    35     47
Careers in engineering?    28    50     53
Careers in technology?     32    55     47
Careers in science?        17    40     32
Quality of the STEM Summer Institute
Participants generally had very favorable reactions to most components of the STEM Summer Institute. Placements received high marks, while reactions to courses varied most.
In the post-Institute and fall surveys, participants were asked the extent to which they
found the courses and professional development activities useful. Figures 21 and 22
show the percentage of participants who responded “useful” or “very useful.” In the San
Francisco program, a majority of participants found each component useful, with the ex-
ception of the fall physical science on-line course. Virtually all participants thought the
Teaching Methods Class, the CalSAC training, the Physical Science lab, the Exploratori-
um Inquiry Institute, Lawrence Hall of Science Workshop, the Math Workshop, Speech
20, Child Development 53, and the Marin Headlands Institute training were useful (rang-
ing from 91% to 100% of San Francisco participants). The three components rated useful
by the fewest participants were the Kickoff CCSF Math, Science and Teacher Conference
(60% thought it was useful), the ET 50 course (57%), and the fall Physical Science
On-Line Course (30%). It is important to note that the Math 108B course in the summer
was not a formal component of the Institute, and as such, Growth Sector had less control
over it. In the spring, participants had asked to be able to take the course, which is
required for the Teacher Pathway program, during the summer to accelerate their credit
accrual. The courses in the fall were also not a component of the Institute, but
rather part of the broader Teacher Pathway program. Although Growth Sector did not
design the fall courses and had no control over them, they did help pay for them with sur-
plus Institute funds.
Evaluation Questions: What is the level of quality of the STEM Summer Institute, i.e., how well
does it measure against standards of effective professional development?
In what ways does the STEM Summer Institute serve as a model for the
seven other current and developing Pathway programs around the state?
Figure 21: Usefulness of courses and professional development activities—San Francisco
* Fall activity; N=22
(percentage rating each component "useful" or "very useful")
Teaching Methods Class: 100
CalSAC training: 100
Physical science lab with Jason or Chris: 100
Exploratorium Inquiry Institute: 100
Lawrence Hall of Science Workshop: 95
Math workshop with Mark: 91
Physical science study group: 81
Math class (ET108B) with Lawrence: 69
Kickoff CCSF Math, Science and Teacher Conference: 60
Speech 20*: 100
Child Development 53*: 100
Marin Headlands Institute training*: 100
October Training: Exploring Diversity by CalSAC*: 87
Larry Horvath's teaching workshop*: 85
Juan and Sabrina's monthly team building exercise*: 83
September Training: Building Leaders by CalSAC*: 79
ET 50*: 57
Physical Science 11 (online course)*: 30

In the East Bay program, two components had fewer than half of the participants reporting
them as useful: the Engineering 10 class (45%) and the Math Applications course (35%)
(Figure 22). Virtually all of the East Bay participants thought the Teaching Methods
class, the Chemistry Lab, the Chemistry Self-Directed Lab, the Lawrence Hall of Science
Workshop, and English 7 were helpful (ranging from 94% to 100%).
Figure 22: Usefulness of Institute courses and professional development activities—East Bay
* Fall Activities
N=20
(percentage rating each component "useful" or "very useful")
Teaching methods class: 100
Chemistry lab with Donna: 100
Chemistry self-directed lab and online class: 95
Lawrence Hall of Science Workshop: 94
CalSAC training: 89
Online chemistry class: 85
Exploratorium Inquiry Institute: 82
Engineering 10 class: 45
Math applications course with Shirley and Doris: 35
English 7*: 100
History 8*: 89
Math 55*: 83
Marin Headlands Institute Training*: 83
Biology 10*: 74

Institute participants reported distinctly different experiences in the various
courses they took as part of the Institute, and were particularly satisfied
with the science courses.
Results from focus group interviews and open-ended survey items reflected considerable
variation in reactions to the courses that participants took. (It is important to note
that some of these courses were not controlled by Growth Sector and were officially part
of the Teacher Pathway.) Institute participants in the San Francisco and East Bay
programs praised the hands-on nature of the physical science lab in San Francisco and
the chemistry course in the East Bay, respectively. They appreciated that the courses
were relevant and directly applicable to their teaching experiences; even some of the
East Bay Institute participants commented that they were able to use portions of the
chemistry course to support their work as teaching assistants at LHS. During the focus
group, one San Francisco Institute participant commented, "The [physical science lab]
was interactive and fun. I wish we had more time to do it. Could have delved deeper."
While the students were overwhelmingly positive about the science courses, the math
courses on both sides of the Bay elicited more negative feedback during the focus groups.
It should be reiterated that the Math 108B class was not an official component of the
STEM Summer Institute but rather a component of the Teacher Pathway Program, and the
STEM Summer Institute had no control over its content. During the summer focus group,
participants expressed dissatisfaction with the teaching style of the Math 108B
instructor, expressing concern that he was not the best fit for the types of participants
enrolled in the STEM Summer Institute. During the fall focus group, however, participants
were much more positive, crediting him with helping them “finally get math.” It is un-
clear whether their reactions to the instructor changed, or if the difference was due to the
different set of students involved in the summer and fall focus groups.
The math applications courses offered on each side of the Bay were STEM Institute-
specific courses (Math Workshop in San Francisco, and Math Applications Course in the
East Bay). The general consensus about the East Bay math applications course was that it
was “pointless” because it had nothing to do with the chemistry course they were taking
simultaneously. A few participants acknowledged that some of the math topics were use-
ful, but the overall level of the class was too basic and not challenging enough. Further,
there were problems with scheduling, as the participants reported that the instructors
would sometimes arrive late. Scheduling problems were also mentioned in regard to the
chemistry lab. The participants who took the lab in the morning were not able to leave
until the class officially ended, even when there was no work to do, because they had
their placements directly after the lab. In contrast, the participants who attended lab in
the afternoons were able to leave upon completion of their work, as the lab was their last
Institute commitment for the day.
Uniformly, Institute participants reacted positively to the professional
development sessions they attended.
Both San Francisco and the East Bay Institute participants had primarily positive things
to say about the CalSAC professional development. Participants appreciated the hands-
on, interactive nature of the workshops and the San Francisco participants, in particular,
enjoyed the applicability of what they learned to the lessons they prepared. One partici-
pant commented, “Toward the end they brought in new tactics that were helpful to me. I
wish they had brought them in earlier.” Another participant noted that it would have
been helpful to have the same training at a later point in the program so they could rein-
force what they had learned during the first few days of the Institute.
On the fall survey, participants also responded very positively to the Headlands Institute
training. They especially liked the hands-on nature of the experience, and they were very
positive about the knowledge and experience of the instructors. “Our instructors were
very informative and knowledgeable. It was also an opportunity to build relationships be-
tween cohorts.”
In both the San Francisco and East Bay programs, it was clear that the
Institute participants felt very positive about their teaching placement
experiences and that leading or assisting with lessons in the summer
camps increased their confidence in teaching.
MPR observed the participants’ placements at both the Lawrence Hall of Science (East
Bay) and four of the summer program sites in San Francisco. The San Francisco sites
were selected with help from the Support Specialists who had completed their own eval-
uation of the placements prior to MPR’s observations. Because resources only allowed
observers to go to four sites out of thirteen, MPR organized visits with those that repre-
sented a range in quality (high, medium, and low). In addition, MPR made visits to a
range of fall placement sites for both cohorts.
It is important to note that the designs of the placement experiences for the San Francisco
Institute participants and the East Bay Institute participants were quite different. The San
Francisco Institute participants were placed in independent summer programs around the
city and were responsible for designing and providing instruction for a group of young
students. The East Bay Institute participants, on the other hand, were placed in “summer
camps” run by the Lawrence Hall of Science (LHS). The LHS designs and delivers
camps in different rooms of the Hall based on age and grade level of the children. The In-
stitute participants were assigned to a number of the camps and rotated among them.
Their role was largely that of a teaching assistant, though they sometimes took active
roles in instruction. In the fall, the placement sites were more similar for the two cohorts,
although the experiences the participants had varied across the sites.
In addition, the young students in the placement venues were quite different demograph-
ically. Specifically, the students who attended the LHS summer camps were much less
diverse and appeared to come from a higher socioeconomic status compared with partici-
pants in the San Francisco summer programs.
In interviews conducted with participants after their lessons, they indicated varied levels
of interest in pursuing STEM teaching careers. One Institute participant noted, “This was
my first experience working with kids. I’m not sure about math or science. I’ve thought
about it. The program has definitely given us an interest…I’m going to learn from this.”
Another Institute participant commented that while math and science still made her nerv-
ous, teaching was more of a possibility because of her summer experience. This senti-
ment was similarly echoed during the focus groups. When asked whether their summer
teaching experience had changed their thoughts about a potential career, one Institute par-
ticipant commented, “It’s made me a little closer to teaching. I’m leaning more toward
teaching,” while another echoed, “I agree. I walked in at the beginning nervous, unsure
about my ability to teach. Now I walk in calm, knowing it might not be perfect, but that it
can always be better the next time. I’m open to teaching.”
It should also be noted that the LHS Institute participants expressed some disappointment
in not being able to take more of a lead in providing instruction. While they enjoyed their
experience and clearly benefitted, they would have liked more opportunities to apply the
knowledge they were gaining.
In both the San Francisco and East Bay placement sites, program structure
and assistance provided to Institute participants varied.
While this might be expected, it was noted that the variation seemed, in some cases, to be
due to the structure and/or support provided by the particular summer program. For ex-
ample, one of the sites provided four adult staff members to help the Institute participants
keep track of 12 young students, while another site had high school interns serve as sup-
port for the two Institute participants leading a lesson for approximately 26 younger stu-
dents. While both sets of Institute participants were able to carry out the lesson for the
day, it was clear that those who had the support of trained adults were able to carry out
the lesson with fewer interruptions and more individualized attention than the site with
more participants and less substantial support. One Institute participant commented,
“Once I started, there was no communication between me and the coordinator. She never
had time…I had very limited resources at that site. [But] I loved working with the kids, it
was just like the kids I grew up with.” In contrast, another Institute participant who
worked at a different site said, “[My] coordinator was very responsible, knew what she
had to do for the kids. It’s a safe sanctuary in the community.”
At the LHS, each of the teachers in the summer camps designed the activities and in-
volved the Institute participants to varying degrees. Thus, some Institute participants had
more opportunity to develop and display their growing capacity for teaching than others.
While program quality varied, the instruction or assistance provided by
Institute participants in placement sites was generally strong.
In the San Francisco sites, the observers noted that the Institute participants displayed a
high level of professionalism, age-appropriate teaching techniques, and appropriate con-
tent knowledge. Across all four sites observed in San Francisco, Institute participants
were able to talk about the science required for each lesson, and used age-appropriate
techniques to elicit participant conversation about highly scientific concepts and vocabu-
lary. For example, one of the Institute participants began his lesson by asking younger
students if they knew who Benjamin Franklin was. He then asked students to discuss
electricity. Within five minutes, the first, second and third graders were hypothesizing
how a light switch works. The Institute participants were attentive to the individual needs
of the students and encouraged questions. At all but one site, the ratio of Institute partici-
pants-to-students allowed for Institute participants to occasionally check comprehension.
This did not occur—perhaps because it was not possible—at a site where there were too
many young students and little to no classroom support by the program.
In the LHS camps, Institute participants likewise demonstrated quite high levels of pro-
fessionalism. While they served as teaching assistants, they were very involved in the in-
struction and were clearly able to anticipate the needs of the lessons, to lead sections, or
in general to help keep the flow of the activities going. They seemed to have good
knowledge of the goals and objectives of the lessons and what needed to be accomplished
in order to achieve those goals. Teachers did not typically have to tell the Institute partic-
ipants what to do or explain what they were trying to achieve. There were, as would be
expected, a few exceptions to this. However, both observers, who attended at different
times, reported the high level of engagement of the Institute participants as part of their
general observations. It should also be noted that the Institute participants were exposed
to very strong models of instruction, classroom management, and well-designed STEM
instructional activities.
Fall observations of participants’ placement sites revealed a different range of activities
as would be expected with school in session. The afterschool sessions often had a focus
on helping students with homework, but the Institute participants who were observed
demonstrated a high level of professionalism and effective instructional and management
techniques with students, and they were clearly integral to the afterschool programs. Most did
not have the opportunity to lead STEM activities but expressed a willingness and desire
to do so. One noted that she did not have the materials or lesson plans to allow her to
do so, and thought it would be a great idea for the Institute to provide them to
participants. One student worked with the Techbridge program and did have the opportunity to
help lead interesting STEM activities.
Participants report frequent contact with and helpfulness of Support Specialists
In both the San Francisco and East Bay programs, participants reported frequent contact
with the Support Specialists and found them to be “helpful” or “very helpful” (Figures 23
and 24). In San Francisco, 46% of participants reported contact with Support Specialists
"every day" or "often," and an additional 45% reported "sometimes." Only 10% reported
"rarely" or "never." Contact with East Bay Support Specialists seemed even more frequent,
with 40% reporting it "every day" and an additional 55% reporting it "often" or
"sometimes."
Figure 23: Frequency of contact with Support Specialists
N=22 for San Francisco, N=20 for East Bay
All of the San Francisco participants and almost all of the East Bay participants rated the
Support Specialists “helpful” or “very helpful” (100% and 89%, respectively, Figure 24).
Figure 24: Helpfulness of Support Specialists
N=22 for San Francisco, N=20 for East Bay
Responses in fall focus groups reflected an appreciation for the work of the Support
Specialists, but also a recognition that they were sometimes overwhelmed by the demands
placed on them.
On both surveys and in qualitative data, there were strong indications that many participants would re-enroll in the program if they had the opportunity.
In the fall survey, Institute participants were asked, “If a program similar to last summer's
STEM Institute was going to be offered this coming summer, would you enroll in it? It
would include a Cal State East Bay/San Francisco State class and Headlands Institute
training.” As Figure 25 reports, 53% of San Francisco participants and 61% of East Bay
participants said they would enroll in a second summer.
Figure 25: Plan to enroll in second summer STEM Institute
N= 22 for San Francisco, N=20 for East Bay.
On open-ended items, participants were asked if they would re-enroll if the STEM
Institute were offered the next summer. In San Francisco, 11 out of 20 respondents
indicated they would; all of those who said they would not gave reasons related to
wanting to take a break, graduating, employment commitments, or needing to take other
classes to graduate. Thirteen out of 19 East Bay participants who provided a response
said they would. As one commented, "STEM programs allow training and learning from
angles not normally provided to teachers in training."
Issues related to logistics differentiated implementation of the Summer Institute in San Francisco and the East Bay.
In terms of participants’ perceptions of the management and logistics of the STEM
Summer Institute, a few notable issues arose in both San Francisco and the East Bay.
During the San Francisco focus group, several participants voiced frustration about the
lack of clarity around credit accrual for the summer. Specifically, some participants
reported that they did not take the appropriate math class over the summer, and as a result, did
not earn credits towards graduation. One participant was especially upset about this as the
loss of credits resulted in a temporary hold on his financial aid.
Participants in the East Bay were uneasy with the lack of clarity around the purpose of
the program, rules for getting stipends, and rules for behavior in general. Some partici-
pants expressed frustration about the fact that they were “docked” on their stipends for
breaking rules of which they were not explicitly aware. Further, they felt that these disci-
plinary actions regarding their stipends were unevenly applied. For example, some partic-
ipants were docked for being late while others were not. Similarly, some participants
were unhappy with the lack of guidelines around appropriate demeanor while at the LHS
summer camps. One participant noted that some of her peers would come to the camps
“unprepared” and wished that there could be clearer rules about how to behave while
working with children and additionally, that these rules would be strictly and evenly en-
forced.
In the fall focus group for the East Bay cohort, the participants indicated that they had on-
ly recently been told that the program was not going to continue, and they would not be
able to finish. This, understandably, was causing them considerable distress and may
have affected their responses. It was later established that this had been a
miscommunication, but the episode illustrates that managing the flow of information in a
complex program is one of a number of critical tasks in running such a project
successfully.
The evaluators believe that issues like those participants raised in the focus groups do
not reflect poorly on the overall management of the Summer Institute. These issues were
clearly on participants' minds, but other information from interviews and observations
does not indicate an overall problem with management. The managing organization
identified some problems that resulted from partnering with organizations it could not
control, but in general the picture is one of a smooth operation, given the program's
complexity, with attention to detail and to participants' needs.
Evidence for the strength and versatility of the model suggests that it is one that could be replicated successfully elsewhere.
This Teacher Pathway model has many moving parts, so the fact that it has been imple-
mented in two different locations with largely positive results suggests that it has the po-
tential to be an effective model that could be implemented in other places. The program
leaders demonstrate a high level of skill in managing a complex program as well as an in-
terest in learning from the implementation and making continuous improvement. The fol-
lowing indicators are drawn from the synthesis of the evaluation results and present a
case for considering this a strong replicable model:
- Support from funding organizations that recognize the potential of the model and have
  confidence in the program leaders to carry it off.
- Recruitment strategies that result in the identification of participant cohorts that
  reflect the target group for this model: young adults from underrepresented groups who
  either expressed interest in pursuing a career in teaching or a STEM field or who were
  in the process of searching for a meaningful career direction.
- Involvement of partner organizations with the requisite expertise and skill to provide
  effective and engaging professional development activities.
- Inclusion of coursework in which students could be successful and build the confidence
  to pursue a career in a STEM field.
- Planning for participants to move through as a cohort and provide support to one
  another.
- Deployment of Support Specialists who manage implementation of the program, support
  participants as needed, and troubleshoot issues that might hinder their involvement.
- Partnerships with a range of afterschool organizations that could provide teaching
  placement sites and that would, in return, receive professional development to increase
  their own capacity to enhance STEM teaching.
- A willingness on the part of program leaders to make course corrections as needed.
Increase in Capacity of After-school Programs to
Provide Experiential Science
The findings in this section are based on survey responses by after-school program su-
pervisors and on some observation data. For questions about the Summer Institute Interns
(or Fellows), supervisors were asked to provide responses for two participants they ob-
served the most. The amount of time interns spent working at the sites varied across the
two venues, as did their roles. On the San Francisco side, the interns took responsibility for
providing the instruction in the summer programs, whereas on the East Bay side, the in-
terns served as assistants to the teachers in the LHS summer camps. In addition, the in-
terns at the LHS rotated among the summer camps, so it was a different experience for
them in this respect as well.
Supervisors report that having STEM Institute participants increased the capacity of their organization to provide STEM activities.
Figure 26 shows that 67% of the supervisors in San Francisco felt that the participation
and instruction provided by the STEM Institute participants enhanced to a "great extent"
their organization's capacity for providing STEM activities to their summer camp
participants. The remainder of the San Francisco supervisors, 33%, felt it enhanced their
organization's capacity to a "moderate extent." The East Bay supervisors reported less of
an impact from the Institute participants on their organization's capacity to provide
STEM activities to the summer camp participants. This is perhaps not surprising given that
the East Bay participants were placed at LHS, whose camps were already focused on
science. Nevertheless, 21% of East Bay supervisors felt that the interns increased LHS's
capacity to provide STEM activities to their summer camp participants to a "great
extent," and 54% felt they did to a "moderate extent." In one of the open-ended survey
responses for San Francisco, a supervisor noted, “It’s a good program, and the students
really enjoyed the activities. Also our participants learned a lot about science this summer
that they would not have otherwise." Another San Francisco supervisor suggested,
"Training ideally would be complete well before most summer program start dates," and
further commented, "We missed out on having the fellows provide activities for 1 of the 6
weeks due to the fact that they were still receiving training from the STEM program."

Evaluation Question: In what ways does the STEM Summer Institute improve the capacity of
after-school programs participating in the initiative to provide experiential
science?
Figure 26: STEM Institute participants increased the capacity of the summer program to
provide STEM activities
N= 13 for San Francisco, N=24 for East Bay.
San Francisco summer placement supervisors rated Institute participants quite highly, while East Bay ratings were somewhat lower
The supervisors were asked to rate the STEM Institute participants who served as STEM
learning leaders in their summer camps. The Institute participants were called "STEM
Fellows" in the San Francisco camps and "Chabot Interns" in the East Bay camps. In
San Francisco, the 13 supervisors who participated in the survey rated a total of 25 STEM
Fellows. In the East Bay, the 24 responding supervisors rated a total of 47 Chabot
Interns.
On each of the ten dimensions shown in Figure 27, very few San Francisco STEM
Fellows were rated as “below average” or “poor” (combined percentages ranged from 4%
to 12%). At least 60% of the STEM Fellows were rated “above average” on six of the
ten dimensions: interactions with students, class preparation, organization of lessons,
competence in subject matter, maturity, and communication skills.
Compared with San Francisco, East Bay supervisors’ ratings of the STEM Institute
participants tended to be lower (Figure 28). The percentage of Institute participants who
were rated as “below average” or “poor” in the East Bay ranged from 13% for both
communication skills and Chabot Interns’ interactions with students to 40% for
attendance/punctuality (in San Francisco the combined “below average” and “poor”
ratings ranged from 4% to 12%). On only one of the ten dimensions, Chabot Interns’
interactions with students, did the majority (55%) of East Bay Institute participants
receive an "above average" rating. The lower ratings of East Bay STEM Institute
participants by supervisors may be related to the fact that the East Bay placements
occurred at LHS, which for many years has also hosted interns from UC Berkeley. It is
possible the LHS supervisors were comparing the Institute participants to other interns.
Figure 27: Ratings of Institute participants by summer supervisors—San Francisco
N=13 San Francisco Supervisors rated 25 Fellows; N=24 East Bay Supervisors rated 27 Fellows. Note that many of the East Bay Fellows were supervised, and therefore rated, by more than one Supervisor.
(percentage of STEM Fellows receiving each rating: Above Average / Average / Below Average / Poor / "I did not observe this")
STEM Fellow's interactions with students (e.g., rapport): 72 / 16 / 8 / 0 / 4
Class preparation: 68 / 16 / 12 / 0 / 4
Organization of lessons: 64 / 24 / 8 / 0 / 4
Competence in subject matter: 64 / 28 / 4 / 0 / 4
Maturity: 60 / 32 / 4 / 0 / 4
Communication skills (e.g., grammar, adequate voice level): 60 / 28 / 4 / 0 / 8
Dependability: 56 / 32 / 8 / 0 / 4
Poise: 56 / 32 / 4 / 0 / 8
Attendance/Punctuality: 44 / 44 / 8 / 0 / 4
Classroom management: 36 / 48 / 8 / 0 / 8
Figure 28: Ratings of Institute participants by summer supervisors—East Bay
N=13 San Francisco Supervisors rated 25 Fellows; N=24 East Bay Supervisors rated 27 Fellows. Note that many of the East Bay Fellows were supervised, and therefore rated, by more than one Supervisor.
(percentage of Chabot Interns receiving each rating: Above Average / Average / Below Average / Poor / "I did not observe this")
Chabot Intern's interactions with students (e.g., rapport): 55 / 32 / 13 / 0 / 0
Communication skills (e.g., grammar, adequate voice level): 46 / 41 / 11 / 2 / 0
Maturity: 45 / 34 / 15 / 6 / 0
Poise: 43 / 40 / 15 / 2 / 0
Classroom management: 32 / 36 / 26 / 4 / 2
Dependability: 30 / 34 / 23 / 11 / 0
Attendance/Punctuality: 30 / 30 / 28 / 13 / 0
Class preparation: 19 / 38 / 17 / 4 / 21
Organization of lessons: 11 / 38 / 9 / 6 / 36
Competence in subject matter: 11 / 53 / 19 / 6 / 11

On the survey, LHS supervisors were asked to explain to what extent the Chabot College
Interns contributed to the learning experience of students at the summer camps. On the
whole, the majority of supervisors expressed appreciation for the assistance of the Chabot
College Interns. One supervisor noted, "The Interns maintained the classroom so that
teaching could be smooth and without as many problems." Another echoed, "More
teachers in the classroom was great for assisting students with projects. They were great
with the students." However, while supervisors were happy to have the assistance, a few
mentioned that some interns seemed not to have an interest in a career in education. One
supervisor wrote, "Some were interested in a career in education, and some were not
(and that made a BIG difference), [but] it was good to have their help.” Another noted,
“Some interns did try to enhance the students’ learning experiences, while others did not
try at all. The extent of their contributions depended more on the individual interns
themselves,” One supervisor noted that some interns seemed to be more interested in
observing than leading. The supervior explained, “The interns assisted in the classes,
helping the teacher and were given the opportunity for more responsibility. I would have
liked to have seen more participants step up to lead activites.” Interestingly, during the
East Bay focus group, participants noted that they wished they had been given more
opportunities to lead lessons at LHS.
On both sides of the Bay, supervisors reported their students were engaged by the activities led or assisted by the Summer Institute participants
Figure 29 shows that 77% of San Francisco supervisors and 63% of East Bay supervisors
felt that their summer camp students were "very engaged" by the STEM activities led or
assisted by the Institute participants. An additional 23% and 33% of the San Francisco
and East Bay supervisors, respectively, felt their summer camp students were
"moderately engaged" in the activities; the remaining 4% of East Bay supervisors chose
"minimally engaged."
Figure 29: Extent that your students were engaged in the activities led or assisted by the
STEM Institute participants
N= 13 San Francisco, N=24 East Bay.
Majority of San Francisco supervisors interested in working with the Institute participants in the future; more uncertainty among East Bay supervisors
Supervisors were asked if they would recommend that their organization host the STEM
Institute participants again, if they would recommend Institute participants to other or-
ganizations, and if they would hire them if resources were available. In San Francisco,
92% of supervisors reported they would recommend the Institute participants be hosted
again by their organization, and 92% would recommend them to other organizations
(Figure 30). Seventy-seven percent would hire the Institute participants if resources were
available, while 23% were uncertain. A San Francisco supervisor noted, “It was a great
opportunity for our program, and we would welcome this in the future and to other pro-
grams.” Another commented that while it was a “wonderful experience,” he/she also no-
ticed the participants were “getting antsy or restless [near the end of the summer].” As a
solution for this, the supervisor suggested “it would be great to have a workshop for them
on how to manage their own level of stress (from personal and professional commit-
ments) along with the stress levels of their students.”
Figure 30: Future partnership with STEM Summer Institute participants—San Francisco
N= 13 San Francisco
Figure 31 reveals more uncertainty among East Bay Supervisors about working with
STEM Institute participants in the future. Just over half (54%) would recommend that
their organization, LHS, host the participants again. Seventeen percent would not make
this recommendation, and 29% were uncertain. Twenty-nine percent would recommend
the Institute participants to other organizations, 17% would not, and 54% were uncertain.
[Figure 30 bar chart (percent Yes/No/Uncertain): Would you recommend your organization host these Interns again? 92/0/8. Would you recommend these Interns to other organizations? 92/0/8. Would you hire these Interns if resources were available? 77/0/23.]
Twenty-one percent would hire the Institute participants if resources were available, 17%
would not, and 63% were uncertain.
Figure 31: Future partnership with STEM Summer Institute participants—East Bay
N=24 East Bay
When given the chance to explain their interactions with the Chabot College Interns fur-
ther, many of the supervisors reported frustration with the inconsistent attendance of their
interns. For example, one supervisor wrote, “They supervised stations and read them
books [most] of the day. I couldn’t trust them with greater responsibility because their at-
tendance was unreliable…” Another wrote, “Sometimes they showed up, sometimes they
said they wouldn’t come and they would, sometimes they just never came.” Another rea-
son supervisors may have been frustrated is that they felt overwhelmed with the number
of interns that were assigned to each classroom. (They had interns from multiple pro-
grams.) One supervisor explained, “I think my attention and patience was stretched thin
by having so many interns in the room at once. One would have been better.” Another
wrote that while mentoring was a demanding undertaking, “the positives outweigh the
negative. It might have been better to work with [fewer] interns though.”
It seemed as though the supervisors’ experience with the interns was mixed. In many of
their open-ended responses, supervisors mentioned both negative and positive experienc-
es with their interns. As one supervisor explained, “I found some to be amazing and dedi-
cated and some who perhaps didn’t have the maturity [of the others] but were earnest in
their own way and required perhaps a bit more training. In all, I was pleased to see the
Chabot participants trying to branch out and broaden their horizons, and I feel we bene-
fited from having them around.”
[Figure 31 bar chart (percent Yes/No/Uncertain): Would you recommend your organization host these Interns again? 54/17/29. Would you recommend these Interns to other organizations? 29/17/54. Would you hire these Interns if resources were available? 21/17/63.]
San Francisco supervisors who participated in STEM professional development rated it favorably and indicated desire for more
About half of the San Francisco supervisors participated in a one-day training provided
by CalSAC and the LHS (45%, results not shown in figures). Among those who partici-
pated, half rated the quality of the professional development as “above average”, and the
other half rated it as “average” (Figure 32). A third reported that they have been able to
apply to a “great extent” what they learned at the professional development workshop to
their after-school program (Figure 33). Another third reported they have been able to ap-
ply it to a “moderate extent”, and the final third reported applying it to a “small extent.”
All of the San Francisco supervisors (100%, results not shown in figures) who participated
in the STEM professional development workshop reported that they wanted more STEM
professional development. One San Francisco supervisor noted that he/she would have
liked to have the “habitat experience completed at the program,” and would have liked to
see more creative discipline strategies. Another wrote that it would “be great to have a list
of strategies for helping participants manage their frustration if the projects are not going
as well or as easily as they thought.” A different San Francisco supervisor recalled that
“the possibility of taking participants to the Exploratorium or on a field trip was
mentioned at the initial workshop,” and said he/she would have liked to take such a field
trip with the participants with the support of the program.
Figure 32: Quality of the STEM professional development—San Francisco supervisors
(percent)
N= 13
[Pie chart: Above Average 50%, Average 50%.]
Figure 33: Extent San Francisco supervisors were able to use what they learned at STEM
professional development workshop (percent)
N= 13
Most East Bay supervisors felt the diversity of STEM Institute participants affected the LHS camps in a positive way, some in a negative way
The young students who attended the LHS summer camps were much less diverse and
appeared to come from higher socioeconomic status groups compared with the young
students in the San Francisco summer programs. The most common response to the
question about whether the diversity of the STEM Institute participants affected the
learning experience of the students in the LHS camps was “Yes, in a positive way” (46%,
Figure 34). Similarly, the most common response to the question about whether the
diversity of the STEM Institute participants affected the teaching experience of the staff
in the LHS camps was also “Yes, in a positive way” (38%). Sizeable percentages of
respondents, however, felt it had both positive and negative effects (13% on the learning
experience and 38% on the teaching experience). No respondents felt it had an entirely
negative impact.
[Figure 33 pie chart: Great extent 33%, Moderate extent 33%, Small extent 33%.]
Figure 34: Diversity of the STEM Institute students affected the LHS summer camps
N=24 East Bay
LHS supervisors were asked to explain whether or not they felt that the diversity of the
Chabot College Interns affected the learning experience of the participants in the LHS
summer camps. Overall, supervisors felt that the diversity of the Interns did contribute to
the learning experience of their students. As one supervisor wrote, “The diversity in the
classroom was fantastic. We need more of it at every level of education. In this respect,
the Chabot Interns were unmatched. They were genuine in their interactions and charac-
ter, which allowed them to connect with the students.” Another wrote, “I think we all
benefit—adults and children—from interacting with a more diverse group of people in
terms of age, gender, ethnicity, and life experiences.” Some of the supervisors, while less
sure of the explicit connection between diversity and the students’ learning experience,
still acknowledged that there might be a potential benefit. For example, one supervisor
explained, “Part of the summer camp experience is to meet new people and try new
things. I think it helped to expand the faces of those with whom they interact with in a
comfortable positive setting. That enhanced their general experience, but not sure if you
could link that to learning…or science learning.”
[Figure 34 stacked bar chart: percent responding “Yes, in both positive and negative ways,” “Yes, in a positive way,” “Yes, in a negative way,” “No,” and “I don’t know” for effects on the learning experience of the students and on the teaching experience of the staff in the LHS summer camps.]
Conclusions and Recommendations
As part of a broad education partnership, Growth Sector developed and led the 2011 Bay
Area STEM Summer Institute, designed to provide prospective STEM teachers with math
and science coursework, professional development, and direct work experience in out-of-
school programs for children. Candidates were recruited from the California Teacher
Pathway Initiative and participated in the Institute either in a San Francisco or East Bay
program, which differed somewhat in structure, staffing, and course content.
The evaluation conducted by MPR Associates found that the participants did come from
the groups targeted for the initiative, i.e., those who are underrepresented in STEM
careers and in teaching and those who, in general, are from lower-income
households. For the most part, these were also students who were struggling to determine
a career direction.
Overall, the STEM Institute received high marks from participants and was quite success-
ful on a number of indicators. While there were a few management issues on the East
Bay side that were not within the control of Growth Sector, project operation, which re-
quired attention to many different components, was quite efficient. The deployment of
Support Specialists to assist with the operation was a major factor in ensuring this out-
come. It also seemed clear that this model could be readily replicated.
Participants in the program increased their knowledge and comfort level in the math and
science content areas, based both on the grades they obtained in their courses and their
reports of increased knowledge and comfort. They also indicated that they had achieved
the Institute’s objective of developing a greater understanding of “inquiry-based learn-
ing.” Importantly, participants also reported gains in teaching skills.
Results on participants’ interest in teaching in STEM fields, or in STEM careers in
general, were somewhat difficult to interpret. The surveys appeared to show that interest
in STEM fields declined, but the item used to gauge interest led, in some cases, to
participants marking more than one response when they were asked to mark one. We also
reanalyzed the data, combining some categories, and
this changed the picture somewhat. In addition, we added a question to the last survey
(whether, through their participation, they had learned that they would want to teach
math or science) and focused on the development of interest during the final focus group
in the fall. Responses to the new survey item, to open-ended items, and in the focus
groups revealed openness to teaching in STEM fields.
In responses to a survey, supervisors in placement sites where Institute participants pro-
vided or assisted with instruction as part of their Institute involvement also gave quite
high marks to aspects of the Institute. They indicated that the Institute had increased their
organizations’ capacity for providing experiential science activities, and they responded
positively to the professional development they themselves had received, indicating a de-
sire for more. Their reactions to the performance of the Institute participants varied
somewhat between the two cohorts, but the involvement of the participants was also quite
different. On the San Francisco side, the participants provided the instruction, while on
the East Bay side, they served as aides; so their value was not assessed on the same basis.
They were more highly rated on the San Francisco side, where they operated largely
independently in the classrooms. The quality of their instruction was both rated and
observed (by researchers) to be quite high. While the assistance the participants provided on the
East Bay side was generally rated as positive, there was more variation in how supervi-
sors rated the individual participants.
Recommendations
Recruitment
In the recruitment process, it is important that potential participants gain a clear under-
standing of what the program is about and what expectations they should have for their
involvement. It may be important to gain a greater sense of potential participants’ motiva-
tion for teaching, for working with children in educational settings, or for working in
STEM fields. They do not necessarily have to have a commitment to such work, but it
would be good to assess the possibility of building on a basic level of interest and ad-
vancing their interest through the STEM Summer Institute.
Coherence/Connections
There are many parts to the STEM Summer Institute, and it seems the program could
benefit from establishing and emphasizing the connections not only among the
components of the Summer Institute but also between that experience and the rest of the
participants’ Teacher Pathway program. After the Summer Institute, participants went
back to the regular Teacher Pathway, with the Headlands Institute weekend being the
primary continuing STEM Institute activity. As we observed in their fall placements, few
of them were using what they learned in the summer. In fact, some expressed a desire to
have access to the lessons developed in
the summer so that they could use them. If possible, it would be good to have them end
up with a packet of such lesson plans that they could try out in their fall placements. It
would also be good if they could reflect on their efforts to use them when they are in their
fall classes or workshops.
Another opportunity to build in more coherence would be between their coursework and
the other STEM Summer Institute activities. This would require aligning the coursework
with the other professional development or with the activities they implement in their
summer placements.
Information about Teaching
It would be good to provide more exposure to information about the teaching profession
during the Summer Institute. This might involve assigning mentor teachers or just in-
creasing the information provided so they gain more of a sense of what teaching is all
about. Alongside this information, they need to gain confidence in their ability to teach
math and science, since this is often what dissuades individuals from teaching or working
in STEM fields.
Teaching Placements
The San Francisco model for summer placement teaching seemed more aligned with the
goal of increasing the motivation and capacity for teaching science than the East Bay
model. While it was clearly seen as worthwhile to have the participants observe quality
science teaching at the LHS, it seemed that the experience of actually teaching young
students using what they had learned was more productive and more powerful.
Responsiveness
Those involved in operating all of the components of the STEM Summer Institute exhib-
ited a high degree of agility in responding to program challenges and making midcourse
corrections. This should be maintained. As the model is expanded to more sites, the need
for this responsiveness should be emphasized with program partners—that there will be
times when it is necessary to reflect and quickly make changes as the program moves
ahead.