
Project READS

PROJECT NARRATIVE (Final)

Project READS:

Using Data to Promote Summer Reading and Close the Achievement Gap

for Low-SES Students in North Carolina

The Durham (NC) Public Schools have made significant progress during the past decade in

attempting to reduce the large reading achievement gap that exists between children of high

versus low socioeconomic status (SES). Despite the advances made to date, district

administrators are eager to expand implementation of, and investment in, innovative practices

aimed at closing the reading achievement gap so they can stimulate further progress within the

district and provide a model for other school districts in North Carolina and the U.S. One

promising approach has already been adopted in several of the district's schools: Implementing a

program that provides children with books during the summer and promotes summer reading in

particular ways (e.g., teacher and parent "scaffolding") that have been shown to be effective in

reducing "summer loss." The program is called Reading Enhances Achievement During the

Summer (READS). It was developed by James S. Kim, a researcher at Harvard University who

has conducted experimental studies in another district that provided moderate evidence of

effectiveness. Dr. Kim has already established strong collaborative relationships with key

administrators in both the Durham school district and Communities in Schools of North

Carolina, a nonprofit organization that partners with schools to improve student achievement.

Harvard University, the Durham Public Schools (DPS), Communities in Schools (CIS) of

North Carolina, and other partners are proposing a five-year validation grant addressing

Absolute Priority 2, Innovations that Improve the Use of Data. By the end of the grant

period, we expect to have provided the READS program to 10,000 elementary students in as


many as 10 school districts, and we expect CIS to be scaling up a well-validated and maximally

cost-effective version of READS in 20 additional North Carolina school districts. In addition to

implementing READS, school staff and district administrators in all participating LEAs will

have adopted (or begun to adopt) a set of Data Use Strategies to inform their decision-making

and improve student achievement growth. One of the key strategies is fall testing so students’

achievement growth can be tracked across the summer months and administrators can determine

whether they need a targeted summer intervention to prevent summer loss from wiping out gains

made during the school year.

The proposed project has three phases: Phase 1 Validation, Phase 2 Validation, and Phase 3

Scale-Up. In Phase 1 Validation, an independent evaluator will conduct an experiment in DPS

to test the effects of the READS program that was implemented previously in another state, test

the effects of an enhanced version of READS to determine whether it can be made more

effective, analyze the cost-effectiveness of the basic and enhanced versions of READS to

identify the most cost-effective version, and conduct an experiment to test the most cost-effective

version in a consortium of three school districts including a rural district. In Phase 2 Validation,

the evaluator will conduct an experimental test of the cumulative effect (across 2 summers) of

the most cost-effective version of READS (CE READS), and compare achievement growth in

the summer and school year for students who received CE READS and students who did not

receive READS. In Phase 3 Scale-Up, CIS will scale up CE READS in 20 NC districts.

Comprehensive, high-quality implementation data will be collected in each phase. These data

will be used to ensure fidelity of implementation, build support for the program, understand how

key stakeholders use data in making decisions about expansion, and understand how a non-profit

organization like CIS can successfully facilitate scale-up of READS across districts.


We believe the proposed project presents an extraordinary opportunity. 1) SES-based (and

associated ethnic- and language-based) achievement gaps represent one of the most important

problems that American educators face today. 2) Few if any programs have demonstrated

success at reducing summer loss and achievement gaps with evidence from an experimental

study. 3) The prospect of reducing the SES achievement gap by implementing a simple and

relatively cost-effective summer reading program is extremely attractive.

SELECTION CRITERION A:

NEED FOR THE PROJECT AND QUALITY OF THE PROJECT DESIGN

1. The extent to which the proposed project represents an exceptional approach that seeks to

meet a largely unmet need and is a practice that has not been widely adopted.

In the years following school entry, children of low socioeconomic status (SES) lose ground

in reading relative to their high-SES counterparts. This increasing gap in reading achievement

may be largely the result of different rates of learning during the summer months (e.g.,

Alexander, Entwisle, & Olson, 2001; Burkam, Lee, & LoGerfo, 2004; Cooper, Nye, Charlton,

Lindsay, & Greathouse, 1996; Heyns, 1978; Kim, 2004; Phillips & Chin, 2004). Even small

differences in summer learning can accumulate across years resulting in a substantially greater

achievement gap at the end of elementary school than was present at the beginning (Alexander,

Entwisle, & Olson, 2004). Figure 1 in Appendix H illustrates the cumulative impact of summer

loss in Alexander et al. (2004). The different trajectories for SES groups during the summer and

the widening gap between groups are plainly apparent. (We use the term summer loss to mean

low-SES students' loss relative to high-SES students, i.e., SES differentiation. In this usage,

"summer loss" may or may not involve loss in relation to state or national norms.)


High-SES children learn more than low-SES children during the school year as well as the

summer; however, socioeconomic differences in reading growth rates are larger in the summer

months (Aikens & Barbarin, 2008; Benson & Borman, 2007; Cheadle, 2008; Downey, von

Hippel, & Broh, 2004; McCoach, O’Connell, Reis, & Levitt, 2006). For this reason, summer

reading interventions for poor children make a great deal of sense.

Although the phenomenon of summer reading loss is well-known to educators and

researchers, it remains deeply problematic. There are few available solutions to combat it and no

solutions that are simultaneously (a) evidence-based and data-driven, (b) replicable and scalable,

and (c) cost-effective. Thus, there is a great unmet need for the proposed project.

In the past, school districts have sought to prevent summer reading loss with summer school

programs. A practical difficulty with this approach is that, in an era of shrinking tax bases and

budget cuts, many districts have been forced to eliminate summer school programs to fund their

core instructional program. Even when summer school programs are not in jeopardy, they are

expensive because they involve both facilities costs and substantial personnel costs. Moreover,

research indicates that summer school programs are not effective in closing the SES achievement

gap. Cooper, Charlton, Valentine, and Muhlenbruck (2000) identified 93 studies of summer

school for their quantitative synthesis (meta-analysis). They found an average effect size of .19

standard deviation (SD) units—an overall effect that is not particularly large in light of the costs.

More importantly, middle-class children benefited more than lower-class children.

Some urban school districts have implemented targeted summer programs that are aimed

directly at reducing summer loss. Teach Baltimore and the Chicago Summer School Program are

two well-known programs that have been rigorously evaluated. Teach Baltimore is a multi-year

intervention that recruits and trains college students to provide a full day of instruction for seven


weeks each summer. Borman and Dowling (2006) randomly assigned students from

high-poverty elementary schools to either Teach Baltimore or a control condition. They found no

overall effects for the program. However, when the analysis was restricted to a subset of 46% of

the students with above average attendance rates across two or three summers, there were

statistically significant positive effects of about .30 SD. Cost data were not provided, so the

cost-effectiveness of this program cannot be determined. The Chicago Summer School Program

identified eligible low-performing children based on a cutoff score from a standardized test. This

program feature allowed Jacob and Lefgren (2004) to employ a regression discontinuity design

to estimate the effects, a design that closely approximates an experimental study. They found a

substantial increase in reading achievement among third grade students but not sixth grade

students. Cost-effectiveness was not studied. Neither of these programs relied on data to match

intervention strategies to children's reading skills or interests.

In the next section, we describe in greater detail a plan for validating and scaling up an

exceptional program: Project READS. READS has the potential to be not only effective but also

a highly cost-effective way to tackle the problem of summer reading loss among low-SES

children. It is motivating for children and intuitively appealing, and it encourages educators to

use data in innovative ways. Although READS has been tested experimentally in an ethnically

and socioeconomically diverse school district and implemented in a few schools in DPS without

formal study, it is not widely implemented and needs to be validated, in both the tested version

and an enhanced version, then re-validated and expanded in a systematic way across several

districts prior to statewide scale up.

2. The extent to which the proposed project has a clear set of goals and an explicit strategy for

using data to achieve project goals.


The proposed project has three phases, each with a clear set of goals: Phase 1 Validation,

which includes testing READS in three new districts beginning with DPS and an effort to

increase the effectiveness and cost-effectiveness of the program; Phase 2 Validation, which

involves expansion into more districts and many more schools and testing the cumulative impact

of READS across summers; and Phase 3 Scale-Up, which involves expansion into 20 more

districts. Table 1 in Appendix H lists project goals by phase. In each of the phases we employ

one or more of a set of explicit data use strategies that could be replicated elsewhere; these

"READS Data Use Strategies" are listed in Table 2 of Appendix H along with the intended data

users.

Phase 1 Validation (Years 1 and 2, 2010-11 and 2011-2012)

During Year 1 of the study, the independent evaluator will conduct a validation study of the

READS intervention in DPS that experimentally tests "Basic READS" as implemented

previously in another state (Goal 1.1) while we deliberately vary certain features to determine

whether it can be made more effective (Goal 1.2). As shown in Table 3 in Appendix H, the

project will begin in fall 2010 when we administer a Readiness Survey in 12-18 Durham

elementary schools. DPS will initially identify these schools as high-need on the basis of

school demographics and test scores, and we will select 10 of them based on results from

the Readiness Survey (Data Use Strategy A).

In spring 2011, approximately 1000 grade 3 students in the selected schools will be invited to

participate in the project. Recruited students will be randomly assigned to one of three conditions

and administered pretests. The experiment will conclude in fall 2011 after posttests are

administered (see Table 3 and evaluation plan in Section D for details). The three conditions will

include:


1) Basic READS

2) READS + teacher phone calls (READS + TC)

3) A control condition not receiving books until the fall

The Basic READS intervention is intended to replicate the experimental studies described in

Section B. The enhanced version, READS + TC, will involve weekly teacher phone calls to

children. The proposed enhancement is a logical next step from previously collected focus group

data indicating that teacher calls may enhance the impact of READS (see Section C).

An important Phase 1 goal (Goal 1.3) is to analyze the cost-effectiveness of the two versions

of READS. To do this, the evaluator will create cost-effectiveness ratios based on the relative per

pupil cost of Basic READS and READS + TC and their observed effects. Section D provides

additional details.
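As a hypothetical illustration of the ratio computation (the per-pupil costs and effect sizes below are placeholders invented for this sketch, not project estimates), the comparison could work as follows:

```python
# Hypothetical sketch of the cost-effectiveness comparison described above.
# Per-pupil costs and effect sizes are illustrative placeholders only.

def cost_effectiveness_ratio(per_pupil_cost, effect_size_sd):
    """Dollars required per 0.10 SD of achievement gain per pupil."""
    return per_pupil_cost / (effect_size_sd / 0.10)

basic = cost_effectiveness_ratio(per_pupil_cost=150.0, effect_size_sd=0.15)
enhanced = cost_effectiveness_ratio(per_pupil_cost=220.0, effect_size_sd=0.25)

# The version with the smaller ratio buys a 0.10 SD gain at lower cost.
most_cost_effective = "READS + TC" if enhanced < basic else "Basic READS"
```

In this invented example the enhanced version costs more per pupil but yields a larger effect, so its ratio is smaller and it would be selected as CE READS.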

In the initial experiment and all subsequent ones, several kinds of implementation data will

be collected: student surveys and interviews, student postcards returned, and teacher logs. Focus

groups will also be conducted at each school to elicit information that may be useful in making

additional changes or refinements to the program (Data Use Strategy B). And we will discuss

the results of the experimental study and descriptive and correlational implementation data with

teachers, school leaders, and DPS administrators to increase understanding of the project,

support, and buy-in at all levels (Data Use Strategy C). Importantly, the Phase 1 experiment

(and later ones) requires a pretest administered in the spring and a posttest administered in the

fall. By examining spring-to-fall achievement growth for the control group, school and district

staff will be able to see, perhaps for the first time, the extent to which their students exhibit

summer loss. Summer loss is ordinarily hidden by the practice of annual spring testing that

confounds achievement growth occurring in the school year with growth (or decline) occurring


during the summer. Strategy C also includes data collection and data use procedures used to

select matched books for children. This use of data is unique to our program.
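The decomposition that spring-plus-fall testing makes possible can be shown with hypothetical scale scores for a single student (the numbers below are invented for illustration):

```python
# Hypothetical scale scores for one student, used only to illustrate why a
# fall posttest is needed to separate summer change from school-year change.

spring_score = 210.0       # spring pretest (end of school year)
fall_score = 204.0         # fall posttest (start of the next school year)
next_spring_score = 226.0  # the following spring

summer_change = fall_score - spring_score            # -6.0: summer loss
school_year_change = next_spring_score - fall_score  # +22.0

# Annual spring-only testing reports just the confounded total:
annual_change = next_spring_score - spring_score     # 16.0 = -6.0 + 22.0
```

With only annual spring tests, the +16 total would mask the fact that this student lost 6 points over the summer and gained 22 during the school year.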

In Year 2 (2011-2012) we address the most important Phase 1 goal (Goal 1.4): Implementing

and experimentally testing the most cost-effective version of READS (CE READS) in a

consortium of three school districts including DPS, Montgomery County, and Guilford County.

For this validation study, we will select a total of 10 schools from the three districts, including at

least two schools that did not implement READS in Year 1. Within each school, grade 3

students (about 1000 in total) will be randomly assigned to CE READS or a Control condition receiving

books in the fall.

To select schools for the multi-district validation study for CE READS, each district in the

consortium will be asked to use results (Data Use Strategy D) from the North Carolina End-of-

Grade (EOG) tests to identify high-need candidate schools for the expansion of READS that

meet at least one of following criteria: 1) In relation to the state average, the school average score

decreases across grades 3 to 6; or 2) growth trajectories for low- and high-SES children show a

widening SES gap for students followed longitudinally from grade 3 to grade 6. Next, the

Readiness Assessment developed in Phase 1 will be used to select the final group of 10 schools

from the several districts (Strategy A).

Phase 2 Validation (Years 3 and 4, 2012-2013 and 2013-2014)

The centerpiece of Phase 2 of our validation project will be a 2-year longitudinal and

experimental test of the most cost-effective READS intervention that was identified by means of

the experimental data collected in Phase 1 (Goal 2.1). The study will be conducted in schools

that did not receive READS in Phase 1 and include students in grades 3 to 5. Along with the

additional schools, the added grade levels will increase the generalizability of our findings. Most


importantly, the longitudinal study will allow us to assess the cumulative impact of CE READS

across two summers and compare these effects to the amount of summer reading loss we observe

among poor children who do not receive READS, as estimated from control group results in

Phase 1 studies and the longitudinal study itself (Goal 2.2). It will also allow us to compare

summer gains with gains during the school year. We will share the results each year with school

and district staff, along with correlational data linking student growth and book reading activities

during the two summers. This should increase school and district-level support (Strategy C).

The longitudinal study requires a different design: random assignment of schools to

conditions (i.e., a school-level cluster randomized trial, or RCT). In the within-school designs

above, because grade 3 control group students received books in the fall of grade 4, they could

no longer serve as a control group. The only way to conduct a longitudinal and experimental test

of READS is to implement the treatment and control conditions between schools, in an RCT.

As shown in Table 4 of Appendix H, Phase 2 of the project will begin during the fall of 2012

in the three-district consortium including Durham, Montgomery, and Guilford, and other

districts. A total of 65 schools will be randomly assigned to either Group A (CE READS

implementation beginning in grade 3, with a student cohort to be followed to grade 5) or

Group B (CE READS implementation beginning in grade 4, with a student cohort to be followed

to grade 6). Each cohort will include approximately 4000 students, for a total of 8000 participants.

To carry out the longitudinal study, children in both groups of schools will receive READS

for two consecutive summers, the summers following grades 3 and 4 (Group A) or the summers

following grades 4 and 5 (Group B). The lagged approach provides a control group in Group B

for the treated cohort in Group A, and vice-versa. This design was successfully employed in the


national randomized experiment of Success for All (Borman et al., 2007). It provides a strong

incentive for schools to participate because all schools receive READS for one student cohort.

In the late fall of 2013 and 2014, we will schedule discussions and conduct interviews with

administrators in the participating districts and our nonprofit partner. This will serve two

purposes: 1) building understanding and support among district administrators and CIS, and 2)

helping us understand how key stakeholders use data for budgeting and making decisions about

READS (Strategies C and E).

Phase 3 Scale-Up (Year 5, 2014-2015)

During Phase 3 of the study, we will rely on our non-profit partner, Communities in Schools

(CIS), in an effort to scale up READS in approximately one-half of the 40 North Carolina

districts that have a CIS affiliate (Goal 3.1). Twenty districts will be selected based on the

number or proportion of high-need schools they have, where high-need schools are defined as

before. CIS will assist the 20 districts in implementing the most cost-effective form of READS

that was experimentally tested in Phase 2, CE READS. The participating districts will be asked

to take responsibility for implementing teacher lessons and/or calls, and they will be asked to

work with CIS in an effort to obtain funding for children's books and the additional achievement

testing by 1) identifying existing sources of funding within the district, and 2) soliciting new

sources of funding from the business sector or charitable organizations.

Scale-up will focus on three of the READS data use strategies discussed above: Strategy C,

using data from spring and fall achievement testing and student surveys to select matched books;

Strategy B, using implementation data to ensure fidelity of implementation (e.g., using a

checklist for schools that was developed in earlier phases); and Strategy E, using data from

interviews to understand how district administrators make decisions about READS.


Apart from serving more students, a major goal of the scale-up (Goal 3.2) is to understand

how non-profit organizations like CIS can successfully facilitate district implementation as

READS is scaled up across districts. We will address this goal by surveying the CIS affiliates,

interviewing CIS administrators, and analyzing the survey and interview data (Strategy F).

3. The extent to which the proposed project is consistent with the research evidence.

The research evidence presented in the next section demonstrates that it is feasible to

implement READS in school settings. It further shows that READS is effective in schools with

many low-SES and minority students, schools that are similar to the Durham schools where we

propose to begin validating READS.

SELECTION CRITERION B:

STRENGTH OF RESEARCH, SIGNIFICANCE AND MAGNITUDE OF EFFECT

The extent to which there is (1) moderate evidence that the proposed practice will have a

statistically significant and (2) important effect on improving student growth.

Kim (2006) and Kim and White (2008) conducted two experiments that provide moderate

evidence (as defined in the Notice) that READS will have a statistically significant and important

effect on improving student growth (see also White & Kim, 2008). Both studies were conducted

in a large suburban school district in the Mid-Atlantic region of the United States. The

participating children were predominantly non-white ethnic minorities (67% and 69% in the two

studies were Black, Hispanic, Asian, or other), and 38% and 39%, respectively, were receiving

free or reduced-price meals, an indicator of low SES.

Figure 2 in Appendix H displays the logic model underlying the experiments—essentially

our "theory" of why READS should work. In essence, fall reading achievement is influenced by

the amount of "scaffolded" summer reading that children do when reading books that are well


matched to their reading level and personal interests. The model is supported by research and

theory suggesting, first, that the match between children’s skill levels and the texts they are

reading may be a critically important ingredient in an effective summer reading program (e.g.,

Byrnes, 2000; Carver & Liebert, 1995; Stahl, 2004). Second, to strengthen the efficacy of

summer reading programs, teachers might scaffold silent reading activities by instructing

children how to use strategies to monitor their comprehension of text and reminding them to use

these strategies to improve their reading comprehension in the summer (Kim, 2007).

To provide scaffolding for children’s summer reading, we ask teachers to implement several

lessons at the end of the school year. In these lessons the teacher models research-based

comprehension strategies (National Institute of Child Health and Human Development, 2000)

that students can apply at home during the summer when they are reading independently. The

teacher also provides oral reading fluency practice, encourages students to read aloud to their

parents over the summer, and shows them a simple procedure for doing so. In addition, we ask

parents to listen as their sons or daughters tell them about a book they have read, listen as a short

passage from the book is read out loud by the child, and provide feedback on the degree to which

the child reads smoothly and with expression.

To ensure that books are matched to the children's skills and interests, we administer a survey

to determine students' reading preferences in the spring and use Lexile ratings generated from the

spring pretest to determine students' reading ability and thus an appropriate level of text

difficulty. A computer algorithm creates a list of eight books that represent the best matches for

each child, those with high preference ratings within the child’s Lexile range. For children in the

treatment group, one matched book is mailed each week for eight successive weeks from early

July until the end of August.
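In outline, the matching step works like the following sketch. The function name, catalog, genre labels, ratings, and Lexile band width are all hypothetical; the project's actual algorithm and data are not reproduced here.

```python
# Hypothetical sketch of the book-matching step: select the highest-preference
# titles whose Lexile ratings fall within a band around the child's score.

def match_books(child_lexile, preferences, catalog, band=100, n_books=8):
    """Return up to n_books titles matched to skill level and interests."""
    in_range = [b for b in catalog if abs(b["lexile"] - child_lexile) <= band]
    ranked = sorted(in_range, key=lambda b: preferences.get(b["genre"], 0),
                    reverse=True)
    return [b["title"] for b in ranked[:n_books]]

catalog = [
    {"title": "Soccer Stars", "lexile": 520, "genre": "sports"},
    {"title": "Ocean Animals", "lexile": 560, "genre": "animals"},
    {"title": "Advanced Mysteries", "lexile": 780, "genre": "mystery"},
]
books = match_books(child_lexile=500,
                    preferences={"sports": 5, "animals": 3, "mystery": 4},
                    catalog=catalog, n_books=2)
# "Advanced Mysteries" is excluded because 780 lies outside the 400-600 band,
# even though the child rated mysteries highly.
```

The key property the sketch illustrates is that skill level acts as a hard filter and interest ratings only rank the books that survive it.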


In the Kim (2006) study, 552 fourth-grade children in 10 schools were pretested in the spring

and randomly assigned to a treatment condition in which they received eight matched books over

the summer or a control condition in which they received books in the fall following the posttest.

At the end of the school year all of the children received the scaffolding lessons described above

from their teacher. (We assumed these lessons would not affect the control group students

because they would have little opportunity to practice them over the summer, and this

assumption was borne out by the results.) The Iowa Tests of Basic Reading Skills (ITBS) was

administered as the spring pretest and fall posttest.

Kim (2006) found that reading achievement was higher for children in the treatment group

than children in the control group (see Table 5 in Appendix H, which displays the posttest mean

Total Reading scores on the ITBS adjusted for pretest scores by means of an ANCOVA). The

difference of 2.0 points fell just short of the conventional 0.05 level of statistical

significance (p < 0.06), but it represented 1.3 additional months of school learning, so it was

significant in practical terms. Additional months of school learning was calculated by dividing

the difference between the treatment and control group means by 1.56, because children gain 14

points from the spring of grade 4 to the spring of grade 5 according to the test publisher’s norm

sample, or 1.56 points per month during a 9-month school year.
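The conversion described above reduces to a single division, which can be checked directly:

```python
# Months-of-learning conversion used above: ITBS point differences divided by
# the norm-sample growth rate of 14 points across a 9-month school year.

points_per_month = 14 / 9                # ~1.56 points per month
overall_months = 2.0 / points_per_month  # overall treatment-control difference
black_months = 5.2 / points_per_month    # difference for Black students
```

The quotients reproduce the figures reported in the text: about 1.3 months overall and 3.3 months for Black students.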

Kim (2006) also found that Black and Hispanic children derived the greatest benefit from the

summer reading program, showing treatment effects that were about twice as large as the overall

effect. For Black students, the difference between treatment and control conditions (5.2 points)

represented 3.3 additional months of learning. For Hispanic students, the treatment-control

difference was the equivalent of 2.1 additional months of learning (see Table 5, Appendix H).


In the Kim and White (2008) study, 514 children in grades 3 to 5 in two schools were

randomly assigned to one of four conditions: 1) matched books only (Books Only); 2) matched

books and oral reading (Books With Oral Reading Scaffolding); 3) matched books, oral reading,

and comprehension strategies instruction (Books With Oral Reading and Comprehension

Scaffolding); and 4) control group receiving books in the fall after posttesting and no teacher or

parent scaffolding (Control). In addition, teachers were randomly assigned to one of these

conditions. All other study procedures (book matching, lessons, pre- and post-testing) were

identical to those in Kim (2006).

White and Kim (2008) found that children in the full treatment group, Books With Oral

Reading and Comprehension Scaffolding, significantly outperformed students in the Control

group on the ITBS (p < 0.03). The difference in posttest scores of 3.9 points represented a

learning advantage of 2.5 months (see Table 6 in Appendix H). Consistent with our theory,

matched books alone with no form of scaffolding did not have positive effects. When we examined

subgroup results for this condition (see Table 5), the largest positive effects, ranging from 1.7 to

5.1 additional months of learning, were observed for Black, Hispanic, and low-income children.

Low-income children gained an average of 4.0 months. Notably, this is enough to offset 100% of

the summer loss shown by low-income students in Cooper et al.’s (1996) meta-analysis of studies

of the effect of summer vacation on achievement, 0.34 grade-level equivalents or about 3 months.

These data indicate that teacher and parent scaffolding that includes oral reading and

comprehension strategies instruction, coupled with a careful book matching procedure, has

statistically significant effects that are large enough to be practically important in an effort to

reduce summer loss among low-SES and minority children. This is what we propose to validate

as "Basic READS."


These studies clearly meet the definition of moderate evidence: at least one well-designed

and well-implemented experimental or quasi-experimental study. Both studies are fully

randomized experiments that eliminate or greatly reduce the possibility of selection bias. In both

studies, an important potentially confounding factor, teacher skill, can also be ruled out. Kim and

White (2008) randomly assigned teachers as well as students to conditions; and in the Kim

(2006) study, teacher effects were controlled by randomly assigning half of the students in each

class to the treatment and half to the control condition. Our evidence is not "strong" because the

experiments were relatively small and conducted in a single district, and the program developer

(Kim) was involved with implementation.

"Well-designed and well-implemented" means that a study meets the What Works

Clearinghouse (WWC) standards, with or without reservations. According to the WWC, overall

attrition (from both groups in an experimental design) must be less than 13% for a study to be

accepted without reservation. In the Kim (2006) study, the final sample was 486 students, so

attrition was 12%. Kim (2006) also showed that the treatment and control groups were

equivalent (i.e., no significant difference) on the ITBS pretest, and that there was no systematic

relationship between missing ITBS scores and experimental condition. In the Kim and White

(2008) study, there were 400 children in the final sample, so attrition was higher, 22%. However,

because initial equivalence of the treatment groups was established analytically, this study meets

WWC standards "with reservations."
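The attrition figures cited above follow directly from the reported sample sizes:

```python
# Overall attrition recomputed from the sample sizes reported above.

def attrition_rate(randomized, final_sample):
    """Fraction of randomized students missing from the final sample."""
    return (randomized - final_sample) / randomized

kim_2006 = attrition_rate(552, 486)        # ~0.12, under the 13% threshold
kim_white_2008 = attrition_rate(514, 400)  # ~0.22, hence "with reservations"
```
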

SELECTION CRITERION C:

EXPERIENCE OF THE ELIGIBLE APPLICANT

1. The past performance of the eligible applicant in implementing complex projects.


Harvard University's Graduate School of Education currently houses several research centers

whose mission is to generate knowledge to improve student achievement and help close the

achievement gap, including the Center for Education Policy Research, the Achievement Gap

Initiative, and the National Center for Teacher Effectiveness. Each center is implementing

complex projects and producing cutting edge research. The research of Dr. Kim described in

Section B is an example.

Because effective collaboration is such a critical ingredient in implementing a project like the

one we are proposing, it is important to recognize that cooperative work with Durham Public

Schools (DPS) and Communities in Schools (CIS) has already begun. In consultation with Dr.

Kim in summer 2008, CIS and DPS implemented READS in two elementary schools and

collected data on children's summer reading. They found that children who read 0 to 4 books

showed more summer loss (-50 Lexiles) than children who read 5 to 8 books (+25 Lexiles).

Again in consultation with Dr. Kim, in summer 2009, DPS and CIS conducted a pilot test of

READS + TC. Teachers called 55 grade 3 students and talked to them about their reading

activities. In focus groups the teachers reported that children who were called by them were more

likely to engage with their books and use comprehension strategies learned during the school

year, and that the children enjoyed talking about their READS books. In short, the applicant is

well-qualified and perfectly positioned to enable DPS and CIS to participate in a large,

methodologically rigorous validation study that builds on this initial work and research

conducted in other districts.

2(a). The extent to which the LEA has significantly closed achievement gaps or significantly

increased student achievement for all groups of students.


In the Durham Public Schools, the overall achievement gap has decreased by approximately

one-third in the past decade (Durham Public Schools, 2009). The district is deeply committed to

closing the achievement gap. In 2010, DPS won a $1.25 million grant from the National

Education Association Foundation (NEAF) to support a union-district partnership to develop

sustainable practices to close the achievement gap. DPS and two other districts were selected

from more than 14,000 school districts nationwide. As part of this grant, DPS administrators will

examine the magnitude of existing achievement gaps, teacher capacity, and family, community,

and school partnerships. The goals of the proposed project and the NEAF grant are in nearly

perfect alignment.

2(b). The extent to which the nonprofit organization has significantly improved student

achievement, attainment, or retention through its record of work with an LEA or schools.

Like its parent organization, Communities in Schools of North Carolina is a nonprofit that

connects community resources with the needs of at-risk students. CIS of North Carolina

currently operates in 469 schools in 40 school districts (34% of the state's districts) where it

seeks to forge relationships with stakeholders to implement evidence-based policies to improve

student achievement. In each school where CIS operates, there is a site coordinator who is

responsible for providing community services to students (e.g., helping with homework).

Recently, CIS was part of a national evaluation conducted by an independent research firm,

ICF International (2008). The evaluation included a quasi-experimental study comparing 602

CIS schools with 602 matched comparison schools. The results revealed that CIS schools

outperformed the comparison schools in reducing dropout rates and improving proficiency rates

on state reading and math tests. This study was sufficiently rigorous to meet the WWC standards.

The evaluation is continuing and it will include a study that involves random assignment of


students to CIS programs to assess the causal impact of CIS services on short- and long-term

outcomes. CIS is one of the few non-profit organizations whose efforts have been subjected to a

rigorous evaluation.

SELECTION CRITERION D:

QUALITY OF THE PROJECT EVALUATION

1. The extent to which the methods of evaluation will include a well-designed experimental

study or well-designed quasi-experimental study.

Phase 1 Experimental Studies with Student-Level Random Assignment

Initial validation with enhanced version of READS. Approximately 1000 grade 3 students

and approximately 45 classroom teachers in 10 Durham Public Schools will be invited to

participate. The Basic READS condition will be an exact replication of the two experimental

studies described in Section B. Students in the READS + TC condition will be called by their

teachers each week during the summer. Students will be randomly assigned to one of these

conditions or the control condition (books delivered in the fall). Teachers will be trained to implement the

READS lessons on fluency and comprehension strategies. Teacher training will be led by three

Durham public school teachers who led training for a READS pilot program in 6 schools in

spring 2010. For READS + TC, teachers will make the weekly phone calls using a phone log that

was designed by teachers in collaboration with Dr. Kim in 2010 (Table 7 in Appendix H).

All participating students will be tested with the Gates-MacGinitie Reading Test (GMRT),

Comprehension and Vocabulary, Level 3 Form S, at pretest in June 2011, and with Level 3

Form T at posttest in September 2011. The GMRT includes a total reading score based on a

48-item comprehension subtest and a 45-item vocabulary subtest. The Kuder–Richardson

Formula 20 reliability coefficient for the GMRT Level 4 is .96, and test–retest reliability is .92.


At pretest, the GMRT comprehension scores also yield a Lexile score which provides the data

needed to match children to appropriately leveled books (Maria, Hughes, MacGinitie,

MacGinitie, & Dreyer, 2007). The North Carolina End-of-Grade (EOG) Test in Reading will

yield a second Lexile score that will be used in combination with the GMRT Lexile score to

obtain a more precise estimate of each student’s reading level at pretest. (EOG tests are

administered in May and scores are available a week later.)

Using effect-size estimates from our earlier experimental studies, we

conducted a power analysis to identify the number of students needed for the initial Phase 1

randomized experiment. We used two plausible effect sizes (.15 and .20) and a pretest-posttest

correlation on the Gates-MacGinitie of .85 to assess power to detect effects of this size. Since we

will have between 600 and 700 students in our first-year study after allowing for attrition and
non-consents, there will be more than sufficient power to detect an effect size of .15.
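The logic of this power analysis can be sketched as follows. This is a simplified check, not the project's actual computation: it assumes a single pairwise contrast with roughly 200 students per arm (600 students split across three conditions), with the pretest covariate adjustment applied by deflating the residual standard deviation.

```python
import math
from statsmodels.stats.power import TTestIndPower

d = 0.15     # smaller of the two plausible effect sizes
rho = 0.85   # pretest-posttest correlation on the Gates-MacGinitie

# Covariate (ANCOVA) adjustment: controlling for the pretest shrinks the
# residual SD by sqrt(1 - rho^2), which inflates the detectable effect size
d_adj = d / math.sqrt(1 - rho**2)

# ~600 students split across three conditions -> roughly 200 per arm
# for any pairwise treatment-vs-control contrast
power = TTestIndPower().power(effect_size=d_adj, nobs1=200, ratio=1.0, alpha=0.05)
print(round(power, 2))  # roughly 0.8 or better
```

With a pretest this strongly correlated with the outcome, an unadjusted effect of .15 behaves like an adjusted effect near .28, which is why power is adequate even at these sample sizes.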

Using the pretest and posttest reading data, the evaluator will conduct an intention-to-treat
analysis using the following ordinary least squares regression model: (1) Yics = β0 +
β1(PreGMRT)ics + β2(READS)cs + β3(READS + TC)cs + αs + εics, where Yics is the posttest GMRT
reading score for student i in class c in school s, (PreGMRT)ics is the pretest GMRT reading score
administered in spring 2011, (READS)cs is a dummy variable indicating whether a student is in the
Basic READS condition, and (READS + TC)cs is a dummy variable indicating whether a student is
in the READS + Teacher Call condition. Because random assignment will occur within each
school, the regression model will include a school fixed effect (αs) that captures school effects
and an error term (εics) that combines student- and classroom-specific components. Using
ordinary least squares, the evaluator will estimate parameters β2 and β3 to determine the effects
of the two conditions. In addition, the evaluator will test whether there is a statistically


significant difference in mean posttest scores between students in READS and students in

READS + TC.
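Model (1) can be sketched on simulated data with statsmodels' formula API. The variable names, school counts, and simulated effect sizes below are illustrative assumptions, not the project's data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "school": rng.integers(0, 10, n),                        # 10 schools
    "pre": rng.normal(500, 50, n),                           # pretest GMRT score
    "arm": rng.choice(["control", "reads", "reads_tc"], n),  # random assignment
})
df["post"] = (0.9 * df["pre"]
              + 5 * (df["arm"] == "reads")       # simulated Basic READS effect
              + 8 * (df["arm"] == "reads_tc")    # simulated READS + TC effect
              + rng.normal(0, 20, n))

# Model (1): pretest covariate, two treatment dummies, school fixed effects
m = smf.ols("post ~ pre + C(arm, Treatment('control')) + C(school)", data=df).fit()
print(m.params.filter(like="arm"))  # estimates of beta_2 and beta_3
```

Because assignment is randomized within schools, the `C(school)` fixed effects absorb between-school differences, and the two `arm` coefficients estimate the impacts of the two conditions relative to control.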

To estimate the cost-effectiveness ratio for the two treatment conditions, the evaluator will

collect and analyze data on per pupil costs. The cost per pupil will be estimated using the

ingredients method (Levin & McEwan, 2001). Our non-profit partner, Communities in Schools,

has maintained detailed budgets to estimate the key cost ingredients of READS and the cost per

pupil. As of summer 2010, the estimated per-pupil cost for Basic READS is $312. Our estimate of

the READS + TC per pupil cost is $450. Assuming an effect size of .15 for Basic READS, which

is the equivalent of raising an average student's score from the 50th to the 56th percentile, and an

effect size of .20 for READS + TC, which is the equivalent of raising an average student's score

from the 50th to the 58th percentile, the analysis would indicate that it costs approximately $52

per pupil to improve reading scores by one percentile point in Basic READS and $56 to improve

scores by one percentile point in READS + TC. Thus, while the effect size would be larger for

READS + TC than for Basic READS, the cost-effectiveness ratio would favor Basic READS

over READS + TC.
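The per-percentile figures above follow directly from the normal distribution: an effect size d moves an average student from the 50th percentile to the 100·Φ(d) percentile. A quick check using only the standard library:

```python
import math

def percentile_gain(effect_size):
    # Percentile points gained by a 50th-percentile student moved up by
    # `effect_size` SDs, via the normal CDF: Phi(x) = 0.5*(1 + erf(x/sqrt(2)))
    return 100 * 0.5 * (1 + math.erf(effect_size / math.sqrt(2))) - 50

# Basic READS: d = .15 -> ~6 percentile points at $312 per pupil
print(312 / round(percentile_gain(0.15)))  # ≈ $52 per percentile point
# READS + TC: d = .20 -> ~8 percentile points at $450 per pupil
print(450 / round(percentile_gain(0.20)))  # ≈ $56 per percentile point
```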

Experimentally testing the most cost-effective version of READS in other districts. For this

Year 2 experiment, the final samples will again include 10 schools and about 600-700 grade 3

students; however, some of the schools will be in two new districts. Thus, the year 2 experiment

will include schools from the Durham Public Schools, Guilford County, and Montgomery

County. Table 8 in Appendix H clearly shows that there is, on average, a 30-point gap in pass

rates between economically disadvantaged and non-economically disadvantaged students on the North

Carolina EOG reading and math tests in each of the three districts. Thus, the implementation of

the Year 2 experiment will enable us to test whether CE READS can reduce the reading gap. In


Year 2, there will be only two experimental conditions: CE READS and Control. The

implementation procedures and pretest and posttest measures will be the same as those used in

the first Phase 1 study, and the same power analyses apply.

Phase 2 Longitudinal Study with School-Level Random Assignment

Sample and design. About 65 elementary schools with grades K-6 will be invited to

participate in the longitudinal, school-level experimental study to test the efficacy of CE READS

as identified in Phase 1. The schools will be drawn from as many as 10 districts and selected

based on need (see criteria in Section A). Each school will receive the intervention for two

consecutive summers. Half of the schools will be randomly assigned to Group A where a grade

3-grade 5 cohort will be in the treatment condition and a grade 4-grade 6 cohort will be in the

control condition. The other half of the schools will be assigned to Group B where a grade

4-grade 6 cohort will be in the treatment condition and a grade 3-grade 5 cohort will be in the

control condition. About 8,000 students will be invited to participate, 4,000 in each cohort.

Reading comprehension measures. Each cohort will be administered the GMRT in the spring

and fall of 2013 and the spring and fall of 2014. This testing cycle will allow estimation of

growth during two summers and one school year. The evaluation will also use scores on the NC

EOG reading test to examine whether impacts persist one year after the intervention.

Power. We used Optimal Design software (Spybrook, Raudenbush, Congdon, & Martinez,

2009) to estimate the number of schools needed. We used the following design parameters:

estimated number of students per school after attrition (n = 100), an estimate of the intraclass

correlation (ρ = .15), and an estimate of the proportion of variance explained by the level 2

covariate (R2 = .75). We determined that, with a two-tailed test with alpha set at .05, this study

will be sufficiently powered to detect an effect size of .15 with 65 schools. The effect size


estimate of .15 assumes that the effects of CE READS will be no larger than the effects we have

obtained in previous studies of Basic READS, or that if READS + TC is more effective it will be

less cost-effective (and thus not implemented).
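The Optimal Design result can be cross-checked against the standard minimum-detectable-effect-size formula for a two-level cluster-randomized trial with a cluster-level covariate. The 80% power target is an assumption here; the other parameters are those stated above:

```python
import math

J, n = 65, 100        # schools; students per school after attrition
rho, R2 = 0.15, 0.75  # intraclass correlation; level-2 covariate R-squared

# Standard error of the treatment effect in effect-size units, for balanced
# school-level random assignment with a school-level covariate
se = math.sqrt((4 / J) * (rho * (1 - R2) + (1 - rho) / n))

# Two-tailed alpha = .05 (z = 1.96), 80% power (z = 0.84)
mdes = (1.96 + 0.84) * se
print(round(mdes, 2))  # ≈ .15, consistent with the Optimal Design estimate
```

Note how the covariate matters: without the R² = .75 reduction in between-school variance, the same 65 schools could only detect a substantially larger effect.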

Data analysis. The evaluator will use a multi-level model to estimate the impact of the CE

READS intervention after two summers. The Level 1 model for student i in school j can be

written as follows: (2) Yij = β0j + εij, where Yij is the posttest reading score for student i in school
j, β0j is the mean posttest score for school j, and εij is the error term for student i in school j. The
Level 2 model can be written as (3) β0j = γ00 + γ01(PreGMRT)j + γ02(CE READS)j + u0j, where
the school mean posttest score β0j is predicted by a pretest covariate (the school-mean pretest
GMRT score), a treatment dummy variable denoting whether the school was randomly assigned
to CE READS or the control condition, and a school-level random effect u0j. Combining the
Level 1 and Level 2 equations yields a mixed-effects model, which can be written as (4) Yij =
γ00 + γ01(PreGMRT)j + γ02(CE READS)j + (u0j + εij), where the pretest GMRT score and the treatment dummy variable

are modeled as fixed effects and the student and school residual terms are modeled as random

effects. If the parameter estimate for CE READS is positive and statistically significant, it will

indicate that schools implementing READS for two consecutive summers enjoyed larger gains

than control schools. In addition, NC EOG reading scores obtained after the last fall GMRT

administration will be examined to see if the effects persist during the course of a school year.
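Model (4) can be sketched with statsmodels' MixedLM on simulated data. The school counts, variance components, and simulated treatment effect below are illustrative assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
J, n = 64, 100  # 32 treatment + 32 control schools, 100 students each

schools = pd.DataFrame({
    "school": np.arange(J),
    "treat": np.repeat([0, 1], J // 2),     # school-level random assignment
    "pre_mean": rng.normal(500, 25, J),     # school-mean pretest GMRT
})
df = schools.loc[schools.index.repeat(n)].reset_index(drop=True)

u = rng.normal(0, 5, J)                     # school random effects (u_0j)
df["post"] = (0.8 * df["pre_mean"] + 4 * df["treat"]
              + u[df["school"].to_numpy()]
              + rng.normal(0, 15, len(df))) # student-level error (e_ij)

# Model (4): fixed effects for pretest and treatment, random school intercept
m = smf.mixedlm("post ~ pre_mean + treat", df, groups=df["school"]).fit()
print(round(m.params["treat"], 1))
```

The random school intercept is what makes this analysis honest about the design: because whole schools are assigned to condition, students within a school are not independent, and the school-level variance component keeps the standard error on the treatment estimate appropriately wide.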

2. The extent to which the methods of evaluation will provide high-quality implementation

data and performance feedback, and permit periodic assessment of progress toward

achieving intended outcomes.


In the two Phase 1 studies and Phase 2 longitudinal study, the evaluation will include the

collection of comprehensive implementation data. These data will be used to provide

performance feedback and annually assess progress towards achieving project goals.

Implementation measures. Student implementation measures will include data from postcards

returned by students in the Basic READS and READS + TC conditions, and data from posttest

surveys and interviews. Teacher log data (see Table 7 in Appendix H) will include information

on (a) the number of times teachers talked to students, (b) the titles of books students reported

reading, and (c) the content of the conversations between teachers and students (or parents).

Finally, school- and district-level implementation data will include teacher focus groups and

administrator interviews. These data will be analyzed by both quantitative and qualitative

methods (e.g., content analysis).

Data sharing and data use strategies for schools. Both test score data and implementation

data will be used to provide feedback to teachers on the fidelity of implementation in the 10

schools participating in Year 1 and 10 schools participating in Year 2, and the 65 schools

participating in Years 3 and 4. This will be accomplished by using the protocol for sharing

results that was successfully deployed in the fall 2009 pilot study of READS in Durham. Several

types of data will be shared with teachers and principals. First, we will present and discuss

descriptive data on spring-to-fall and fall-to-spring achievement gains or losses on the GMRT for

both the control group and the READS treatment groups, which will allow school staff to

understand the extent to which children from families of differing socioeconomic status decline,

maintain, or gain in reading during the summer months (all project years) and the school year (in

Year 4 only). Second, we will present and discuss descriptive information extracted from

postcards returned by students in the READS treatment conditions. Third, we will present and


discuss descriptive data from the teacher logs. Fourth, we will present and discuss correlational

data showing whether and how spring-to-fall (summer) gains are related to postcard data (e.g.,

reading comprehension strategies students report using) and teacher log data (e.g., number of

books read, quantified themes emerging from student comments).

Data sharing and data use strategies for district administrators. School and district

administrators will be most interested in the findings on effectiveness and cost-effectiveness. We

will certainly share and discuss with them the results from the two Phase 1 experiments and the

Phase 2 longitudinal experiment including the findings on cost-effectiveness. Following

discussion of findings, we will conduct semi-structured individual interviews with the DPS

leadership team, including the three veteran teachers who conduct training, the Superintendent of

Elementary Curriculum and Instruction, the Chief Financial Officer, and the Superintendent of

Schools.

Performance feedback and progress toward achieving goals. We believe that by sharing the

implementation, program effectiveness, and cost-effectiveness data, we will be providing

powerful performance feedback for the participants. The same evaluation data will be the means

by which the participants assess progress toward achieving the project goals, and the means by

which we ourselves, as Co-Directors of the project, assess progress toward achieving goals.

3. The extent to which the evaluation will provide sufficient information about the key

elements and approach of the project so as to facilitate replication or testing in other settings.

To facilitate replication and testing of the READS data use strategy outside the Durham

Public Schools, we will develop several tools that can be used in scaling up the project in

districts throughout North Carolina and the U.S. First, we will create a Project READS database

that provides checklists for implementing READS and for creating a READS budget (see Table


9a and Table 9b in Appendix H). The detailed implementation checklist will include critical

items that superintendents, principals, and teachers must understand and implement with high

fidelity. We have already developed drafts of checklists and will refine them in Year 1 so that

other districts can easily implement READS. Second, we will use the CIS network in North

Carolina to disseminate checklists thus facilitating replication in other district settings.

4. The extent to which the proposed project plan includes sufficient resources to carry out

the project evaluation effectively.

There are sufficient resources to fund the project evaluation, as described in the Budget

Narrative. The budget includes funds for the work of a Lead Evaluator and Evaluation Team.

5. The extent to which the proposed evaluation is rigorous, independent, and neither the

program developer nor the project implementer will evaluate the impact of the project.

The Phase 1 and Phase 2 studies described above are rigorous randomized experiments.

These experiments and the cost-effectiveness analyses will be carried out by an evaluation team

led by an independent evaluator, Dr. Jonathan Guryan, who is a labor economist at the

University of Chicago’s Booth School of Business. Dr. Guryan was not involved in the

development of READS, and he will play no role in implementing the project during any of its

three phases.

SELECTION CRITERION E:

STRATEGY AND CAPACITY TO BRING TO SCALE

1. The number of students to be reached by the proposed project and the capacity of the

applicant and partners to reach the proposed number of students during the grant period.

In carrying out each of the two experiments in Phase 1 and the 2-year longitudinal

experiment in Phase 2, we will provide the READS program to an estimated total of 10,000


students. Two thousand students will receive READS lessons at the end of the school year, eight

books during the summer, and teacher calls in the first and possibly second year. Eight thousand

students will receive READS lessons in the spring of each of two school years and books and

possibly teacher calls in two successive summers. With the kind of strong cooperation we expect

to get from our partners in the three-district consortium and other NC districts and CIS of

Durham (see Letters of Support 1-5 in Appendix D), we have little doubt that we can accomplish

the task of reaching 10,000 students as planned during the grant period.

2. The eligible applicant’s capacity to bring the proposed project to scale on a State or regional

level working directly, or through other partners, either during or following the grant period.

We propose to work through our non-profit partner, Communities in Schools of North

Carolina (CIS-NC), to begin a process of scale-up that seeks to eventually include all elementary

schools in the state of North Carolina that have a demonstrated need for it (i.e., evidence of

summer loss; see criteria listed previously for Data Use Strategy D). In the final year of the

grant, CIS-NC will work with 20 districts and high-need schools in those districts; in the years

following the grant period, CIS-NC plans to continue expanding into more districts until there

are no more high-need schools left in the state.

CIS has already developed and applied a replicable strategy for scaling up READS in North

Carolina public schools. The scale-up strategy includes three components. First, in each district

the CIS Executive Director (ED) will write grants to for-profit donors requesting seed money to

fund a pilot of READS in 1 or 2 elementary schools. Second, each ED will use data generated by

READS to tap into additional private and public sources of funding. Third, each ED will

leverage both sources of funding to scale READS to additional schools.


This strategy was applied successfully in DPS in 2008, as follows. First, the ED of CIS of

Durham raised corporate funds totaling about $25,000, which provided funding to support 80

students in 2 elementary schools. The data from this small pilot indicated that children who read

5 to 8 books during the summer enjoyed larger reading comprehension gains than children who

read 0 to 4 books (see Section C). CIS staff then shared results from the pilot study with potential

funders in the business community and the superintendent of the Durham Public Schools. In

meetings with both groups, CIS found that business and school leaders were eager to fund

READS because: 1) It focused on improving reading, a core academic skill. 2) It enabled the

business community to address a unique, unmet need—summer reading opportunities for low-

income children—important since the district had been forced by the economic downturn to

eliminate funding for summer school! 3) It yielded clear data on the relationship between

summer reading and student outcomes. And 4) it was a simple and easy intervention to

implement and scale. Consistent with the goals of our validation grant, data is a critical part of

CIS's strategy for raising funds and generating long-term support and buy-in from leaders in

business and education.

As further evidence of the potential to scale READS, for the 2010 program in Durham,

for-profit donors contributed $70,000 and the Durham Public Schools contributed $30,000 (see

Letters of Support 3, and 8-9 in Appendix D). Tapping into a diversified funding stream from

both public and private sources, the CIS ED was able to increase participation in READS from

80 students in 2008 to 500 students in 2010. This track record suggests that CIS EDs can raise

seed money to fund a pilot implementation of READS including the necessary funds for books

and use data from the pilot to generate funding from additional public and private sources.


3. The feasibility of the proposed project to be replicated successfully, if positive results are

obtained, in a variety of settings with a variety of student populations—with fidelity and ease.

Our plan intentionally extends READS implementation from the district where it was

originally tested, to DPS, to two more NC districts (Phase 1), then to as many as 10 districts with

more than one grade level of students in Phase 2, and finally to 20 NC districts in Phase 3. We

have attended to some important elements of successful replication across settings: explicit

strategies for using data to select program sites and improve the program including its

cost-effectiveness, emphasis on building understanding and support for the program, and

marshalling the necessary expertise and resources.

However, given that many innovative programs are notoriously difficult to scale up in

schools and school districts, can we extend READS to a variety of settings and do this with

fidelity and, more importantly, with ease? With regard to fidelity, we have already explained how

we will carefully document procedures and develop and make available instruments (e.g.,

checklists) to check fidelity and guide further implementation, and how we will solicit feedback

from teachers and others, which may help us to identify unanticipated obstacles to faithful

implementation, particularly in Phase 1.

We are confident that we can scale up READS with relative ease. There are two primary

reasons for our optimism. First, the program is simple to implement. We hire trainers to train

teachers to do the end-of-year comprehension strategy and fluency lessons. Any district has the

capacity to do this. The lessons themselves are already developed and validated, and, because

they are fully scripted, extensive teacher training is not necessary. In our previous research,

teachers attended a single 2-hour training session. The labor-saving program we use to match

books to children’s interests and reading levels is easy to use and will be made available to


schools at no cost. In Basic READS, teacher and parent scaffolding is accomplished in the end-

of-year lessons and supplemented with letters and postcards sent to parents and children—again,

not complicated and easily within reach for most school systems.

Our second reason for confidence is that we have found in our previous work that READS

has enormous appeal for practicing educators. As evidence of this, we present in Table 10 of

Appendix H an email from two reading specialists in Illinois who were inspired by our research

and decided to implement their own READS program. This email (received while writing the

proposal!) is similar to many others Dr. Kim has received and is representative of the kind of

enthusiastic comments we have heard from virtually all of the teachers and administrators we

have worked with thus far.

4. The eligible applicant’s estimate of the cost of the proposed project and estimate of the costs

to reach 100,000, 250,000, and 500,000 students.

The proposed project will cost an estimated total of $12.7 million. This includes the cost of

(a) the evaluation that will determine whether READS can be validated, (b) the analyses of

results and implementation data that will be presented to school and district staff to gain their

support, and (c) the costs of supporting project staff to carry out the plans and meet project goals.

Our estimated cost for providing Basic READS is $312 per student. This means that the program

could reach 100,000 students at a cost of $31.2 million, 250,000 students at a cost of $78 million,

and 500,000 students at a cost of $156 million.

5. The mechanisms the eligible applicant will use to broadly disseminate information on its

project to support further development, expansion, or replication.

During the grant period and afterwards, Drs. Kim and White will publish findings from

READS and pursue the possibility of working with the governor and legislators to fund statewide


implementation of READS. To broadly disseminate information about READS throughout North

Carolina and the U.S., we will work with intermediary organizations that broker relationships

between researchers and practitioners. We will 1) conduct webinars with the Mid-Atlantic Equity

Center (MAEC), 2) write articles for the Johns Hopkins Center for Summer Learning and

practitioner-oriented journals such as The Reading Teacher and Phi Delta Kappan, 3) create

interactive pages on the Harvard Graduate School of Education’s Usable Knowledge website
and the Communities in Schools of North Carolina website, and 4) communicate findings at conferences for

education researchers such as the American Education Research Association and Society for

Research in Educational Effectiveness. In the past two years, Dr. Kim has conducted several

webinars, written articles, and presented findings in many of these venues, and his work has been

well-received by a large and diverse audience of policymakers and practitioners.

SELECTION CRITERION F:

SUSTAINABILITY

1. The extent to which the eligible applicant demonstrates that it has the resources, as well as

the support of stakeholders, to operate the project beyond the length of the Validation grant.

In post-grant scale-up, CIS will be greatly aided in its work by: 1) having rigorous evidence

of effectiveness and cost-effectiveness in hand, 2) the ability to point to other districts’ success in

implementing READS and applying the data use strategies, 3) the district implementation

checklist tool developed during the project years, and 4) our study of how districts use data and

research on READS to make decisions about adopting and expanding it.

CIS has several unique fund-raising capacities that will facilitate efforts to fund and scale

READS throughout North Carolina beyond the Validation grant. First, CIS-NC has access to a

diverse network of leaders in the business community. For more than 20 years, CIS-NC has


successfully partnered with corporate donors to fund its core budget and its programs. The Board

of Directors includes numerous business leaders who can potentially support efforts to scale

READS following the grant period in schools throughout North Carolina. Second, fund-raising

is a core responsibility of each CIS Executive Director; the directors devote approximately 25-

50% of their time to fund-raising. Third, CIS of Durham has developed tools for drafting budgets

and project timelines specifically for READS fund-raising and implementation that will be

shared with the EDs in all 40 NC districts where CIS operates.

Starting in Year 2 of the study, CIS and the Project Director will undertake an additional

fund-raising strategy to sustain READS beyond the life of the validation grant: We will work

with state policymakers to create enabling legislation for READS. Currently, there is no stable

and durable source of state funds for evidence-based education programs like READS. However,

such funding does exist in the NC Department of Social Services, and CIS was able to tap into

these state funds for an evidence-based parenting program (e.g., Incredible Years). Although

there is no such funding mechanism in the NC State Department of Public Instruction, we will

work with state legislators to create a grants program for evidence-based education programs like

READS. Thus state dollars may provide a funding stream for CIS affiliates by Phase 3

of the project and afterwards.

2. The potential and planning for the incorporation of project purposes, activities, or benefits

into the ongoing work of the eligible applicant and partners at the end of the Validation grant.

Project purposes (reducing summer loss and improving student achievement), project

activities (READS implementation, data use strategies), and benefits (same as purposes) are all

likely to be incorporated into the ongoing work of our partner school districts and CIS of North

Carolina. As evidence, we offer the letters of support in Appendix D; the plans for CIS

statewide scale-up described above; and the fact that some of the most important work, such as

building partnerships with school systems like DPS and fund-raising by CIS, is already

well underway.

One of the difficulties in sustaining educational innovations is that district leadership changes

frequently, particularly at the level of the superintendent. Too often, the new superintendent has

no interest in maintaining the programs and initiatives that were at the top of his or her

predecessor's agenda. It is important to note that we will rely on institutional partnerships rather

than a single school district leader to sustain READS beyond the life of the validation grant. In

2009, the DPS Superintendent met with Dr. Kim and CIS staff to learn about the research on

summer reading. He said he supported the expansion of READS in future years and shortly

thereafter left his position. However, the READS program has continued to enjoy strong political

and financial support among district leaders, in large part because Dr. Kim made an effort to

involve and empower not only the Superintendent but also the Superintendent of Elementary

Curriculum and Instruction, principals and teacher-leaders who conducted training for their

teacher colleagues implementing READS, and CIS executives and directors.

SELECTION CRITERION G:

QUALITY OF THE MANAGEMENT PLAN AND PERSONNEL

1. The adequacy of the management plan to achieve the objectives of the proposed project on

time and within budget, including clearly defined responsibilities, timelines, and milestones

for accomplishing project tasks, as well as tasks related to the sustainability and scalability of

the proposed project.

The project's Leadership Team will include a Project Director and Co-Director, Lead

Evaluator, the President and CEO of CIS of North Carolina, the Executive Director of CIS of


Durham, and the superintendents of participating districts. The Leadership Team will participate

in monthly phone calls and quarterly in-person meetings to discuss progress toward meeting the

goals in Phases 1, 2, and 3. The responsibilities of the key personnel are defined below. The

milestones for the project are the project goals by phase and year (see Table 1 in Appendix H).

Detailed timelines and budgets will be developed for all tasks related to each goal using project

management software.

2. The qualifications, including relevant training and experience, of the project director and

key project personnel, especially in managing complex projects.

James S. Kim, Ed.D., will serve as Project Director and will have final authority to make

personnel and budget decisions. He is the developer of READS, and has already established

strong collaborative relationships with each of the partners. He has managed several complex

projects, worked effectively with school superintendents, principals, and teachers, and managed

budgets from public and private funding sources. Dr. Kim was a 2008 National Academy of

Education/Spencer Postdoctoral Fellow. He has published experimental studies of literacy

interventions in major psychology, literacy, and education policy journals. His research has been

funded by the Institute of Education Sciences, the National Science Foundation, the Spencer

Foundation, and the W.T. Grant Foundation.

Thomas G. White, Ph.D., of the University of Virginia, will serve as the project's

Co-Director. He will share with Dr. Kim the responsibility for leading the project intellectually

and logistically. Dr. White has a 25-year record of successful work with district and school

administrators on complex research projects. He is currently Co-Principal Investigator for a large

and complex statewide randomized experiment being conducted in collaboration with the

Colorado Department of Education. The study is funded by the Institute of Education Sciences.


Dr. White specializes in reading research and program evaluation. He has published studies of

vocabulary development, phonics instruction, and reading comprehension in major psychology

and education journals.

Linda R. Harrill, President and CEO of CIS of North Carolina, will manage the Phase 3

scale-up of READS in 20 North Carolina school districts by supervising the CIS Executive

Directors (EDs) in each district. In her role as President/CEO of Communities in Schools, she

oversees the replication of the CIS process across the state, works with North Carolina business

leaders, establishes state-level and national partnerships that benefit local programs, and

maintains relationships with other non-profits, state and local agencies, and the legislature. In

2007, she was named a Distinguished Alumna of the School of Education at North Carolina

State University.

Bud R. Lavery, MSW, Executive Director of Communities In Schools of Durham, will be

involved with all implementation activities in all three phases of the project. He will supervise

CIS staff working on READS, direct partnerships with schools, and use the budget and project

timeline tool to support expansion in new districts. His area of expertise is implementing

evidence-based educational and social programs for youth at risk of dropping out of school. He

was the program director for a $25 million research study at Duke University and has extensive

experience leading CIS of Durham’s effort to implement and scale evidence-based programs in

public schools.

Stacey Wilson-Norman, Superintendent of Elementary Curriculum and Instruction, Durham

Public Schools, will work with Bud Lavery and the Project Directors to ensure progress toward

project timelines and milestones. She will hire the READS data manager for Durham Schools,


and manage the DPS READS budget. She will also help administer the readiness survey used to

select READS schools in Durham. She is a former principal in the Durham Public Schools.

3. The qualifications, including relevant training and experience, of the independent evaluator,

especially in designing and conducting experimental and quasi-experimental studies of

educational initiatives.

Jonathan Guryan, Ph.D., will lead and conduct the independent evaluation. He is an

Associate Professor of Economics at the University of Chicago’s Booth School of Business and a

faculty research fellow for the National Bureau of Economic Research. Dr. Guryan’s research in

education has included work on education policy interventions using experimental designs and

advanced multivariate statistical techniques, the financing of public schools, and how students

learn from each other in school. This research has earned him two National Science Foundation

Grants. He has published articles in leading economics journals.

Other Project Personnel will include a full-time, North Carolina-based Field Director, hired

on a subcontract with CIS and reporting to the Project Director; a Data Manager, hired on a

subcontract with DPS and reporting to the Project Director; a Project Manager (assistant to

Dr. Kim); research assistants for Drs. Kim and White; and evaluation data collectors.

We have also formed an Advisory Board that will provide oversight and advice during the

five years. The board includes distinguished scholars in the areas of literacy, the economics of

education, testing and measurement, and educational leadership: Catherine Snow, Richard

Murnane, Daniel Koretz, and Thomas Payzant of the Harvard Graduate School of Education.