college success academy: launching a new program with research and evaluation partners


DESCRIPTION

Presented at the 2013 NPEA conference by: Brigham Nahas Research Associates, The Steppingstone Foundation, Kingsbury Center at NWEA http://www.educational-access.org/npea_conference_speakers2013.php

TRANSCRIPT

Launching a new program with research and evaluation partners

NPEA Conference April 11, 2013

Agenda

• Welcome and introductions
• How we got here
• Program design: the scorecard
• Planning evaluation activities
• What we learned
• The next chapter: measuring non-cognitive

Who we are

• Robert Theaker, Senior Research Associate, The Kingsbury Center, Northwest Evaluation Association

• Roblyn Brigham, PhD, Managing Partner, Brigham Nahas Research Associates

• Yully Cha, Chief Program Officer, The Steppingstone Foundation

What we believe about our work

• Evaluate early and often

• Culture matters

• Broad view of what we mean by “data”

• Findings inspire action

• Students front and center

How we got here

• Environmental context
– Summer learning loss and the achievement gap
– College access to persistence/graduation

• The Steppingstone Foundation’s 2009 strategic plan
– The public school venture

Program design

1. Academic achievement
Center for Higher Education Studies, UC Berkeley; Cliff Adelman, U.S. Dept of Ed; National Center for Educational Accountability

2. Socio-emotional competency
Malecki et al. (Measuring Perceived Social Support: Development of the Child and Adolescent Social Support Scale)
• Adult relationships – Learning First Alliance (Every Child Learning: Safe and Supportive Schools)
• Self-efficacy – Robbins et al. (Do Psychosocial and Study Skill Factors Predict College Outcomes? A Meta-Analysis)

3. Positive behavior
Balfanz et al. (Preventing Student Disengagement and Keeping Students on the Graduation Path in Urban Middle Grades Schools)
• Attendance
• School disciplinary action

4. College awareness
The Bridgespan Group (Reclaiming the American Dream); Southern Regional Education Board (Middle Grades to HS: Mending a Weak Link); Choy, U.S. Dept of Ed (Students Whose Parents Did Not Go to College: Postsecondary Access, Enrollment and Persistence)

Academic achievement

• The search for the right tool
– Summative
– Formative
– Measure summer learning
– And we want to compare against national norms

Measures of Academic Progress (MAP)

National Map

• Partners in 50 states
• Over 5,000 partner districts
• Over 6 million students assessed
• Partners in 100 foreign countries

[Figure: the RIT scale as an “academic ruler” – an equal-interval, cross-graded achievement scale, linked to curriculum, that shows growth with greater score precision; the scale spans Beginning Reading to Adult Reading, with sample items plotted at RIT values 184, 207, and 232]

NWEA Uses a RIT Scale (Rasch Unit)
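The RIT scale’s equal-interval property comes from the underlying Rasch model of item response theory. A minimal sketch of the idea (the 200-plus-10-per-logit mapping is a commonly cited description of the RIT transform, assumed here for illustration, not taken from this presentation):

```python
import math

def p_correct(theta, difficulty):
    """Rasch model: probability that a student with ability `theta`
    (in logits) answers an item of `difficulty` (in logits) correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def rit(theta):
    """Map a logit ability estimate onto the RIT scale
    (200 + 10 per logit is an illustrative convention)."""
    return 200 + 10 * theta

# A student matched exactly to an item's difficulty has a 50% chance
# of answering correctly, anywhere on the scale -- the "ruler" property.
print(p_correct(0.0, 0.0))   # -> 0.5
print(rit(0.0), rit(1.0))    # -> 200 210 (one logit = 10 RIT points)
```

Because the scale is equal-interval, a 10-point RIT gap represents the same difference in ability whether it occurs at 184 or at 232.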

[Figure: sample MAP growth charts for three students – Daniel, Devon, and Grace – including a high-performing example, Grace’s test pattern]

Planning with external evaluator

• Logic model: articulate “the model,” define outcomes and sequence, build consensus around data
• Implement observations, focus groups, and interviews
• Internal team debrief sessions
• Build SPSS database
• Mid-year observations and report
• Continue evaluation activities
• Three-year analysis and report

Data collection plan

• Qualitative data
– Observations, interviews, focus groups
– Program staff, teachers, parents, tutors, and students

• Quantitative data
– Scorecard and MAP
– Non-cognitive measures
– Surveys to capture the perspective of parents

• Process of sharing what we learn

What we learned: qualitative

Strengths

• Program administration is strong

• The program “culture” is taking root

• Scholars’ enthusiasm for the academics

Challenges

• Demanding job for teachers/staff

• Culture needs to deepen to transform

• Defining who to serve/who can best benefit from the program

Our response

• Change schedule and temperature

• Admission info and interview sessions

• IEP info collected in admission; improve faculty orientation

What we learned: quantitative

MAP scores (RIT)

                          English Language Arts       Math
                          Score   SGP*   A/P**        Score   SGP*   A/P**
Grade 4 (2011)            230     n/a    13           230     n/a    11
Grade 5 (2012)            232     55     18           238     65     19

Program attendance: average daily attendance

                          Class 1   Class 2
Summer 2011               93%       n/a
Academic Year 2011–2012   87%       n/a
Summer 2012               92%       94%

Grades: MCAS, average GPA, and days missed from school

                          Average GPA   Average days missed from school
Grade 4 (2011)            2.7           8
Grade 5 (2012)            2.8           6

Retention (43% of attrition due to mobility)

                          Class 1        Class 2
Summer 2011               42/46          n/a
Academic Year 2011–2012   35/42          n/a
Summer 2012               29/35 = 63%    26/46 = 78%
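The SGP figures reported for the scholars are student growth percentiles: a student’s RIT growth ranked against academic peers who started at similar achievement levels. Operational SGPs are estimated with quantile regression; this simple percentile-rank sketch (with made-up numbers) only illustrates the concept:

```python
def growth_percentile(student_growth, peer_growths):
    """Percentile rank of one student's RIT growth within a group of
    academic peers (students with similar starting scores).
    Illustrative only -- real SGPs use quantile regression."""
    below = sum(1 for g in peer_growths if g < student_growth)
    return round(100 * below / len(peer_growths))

# Hypothetical peer growth values, not from the presentation.
peers = [0, 1, 2, 3, 4, 6, 7, 8, 9, 10]
print(growth_percentile(5, peers))  # -> 50
```

An SGP of 55 (Grade 5 ELA above) therefore means the scholars grew more than about 55% of comparable students nationally.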

What we learned: measuring academic impact

• Year one
– Correcting mistakes in how the test is administered
– Learning how to read and report the results

• Year two
– Summer learning impact and school-year effects
– English Language Learners
– Boys

Scores and percentiles

How we create a Virtual Comparison Group (VCG)

• We identify included students
• Identify all matching students from GRD: grade, subject, starting achievement, school income, urban vs. rural classification, etc.
• Randomly select the comparison group

Compare your student’s growth to similar students in similar schools

Virtual Comparison Groups (VCGs)
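The match-then-sample steps above can be sketched as follows. The field names, the ±2-RIT matching window, and the group size are illustrative assumptions, not NWEA’s actual procedure:

```python
import random

def build_vcg(target, pool, size=20, seed=42):
    """Build a Virtual Comparison Group: filter the pool to students
    matching the target on grade, subject, starting RIT (illustrative
    +/-2-point window), school income level, and locale, then randomly
    sample the comparison group. Field names are hypothetical."""
    matches = [s for s in pool
               if s["grade"] == target["grade"]
               and s["subject"] == target["subject"]
               and abs(s["start_rit"] - target["start_rit"]) <= 2
               and s["school_income"] == target["school_income"]
               and s["locale"] == target["locale"]]
    rng = random.Random(seed)  # seeded for a reproducible sample
    return rng.sample(matches, min(size, len(matches)))

def growth_vs_vcg(target, vcg):
    """Difference between the target student's RIT growth and the
    VCG's average growth."""
    avg = sum(s["growth"] for s in vcg) / len(vcg)
    return target["growth"] - avg
```

Repeating this for every program student yields a matched, program-free baseline against which summer and school-year growth can be judged.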

Next chapter: measuring non-cognitive

• Virtual Comparison Group (VCG) analysis
• External evaluator year-three report
• Non-cognitive assessments
– Holistic Student Assessment
– Survey of After-School Youth Outcomes (SAYO)
