This article was downloaded by: [The University of Manchester Library] On: 09 October 2014, At: 05:43 Publisher: Routledge Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK The Journal of Educational Research Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/vjer20 Effects of Technology Immersion on Middle School Students’ Learning Opportunities and Achievement Kelly Shapley a , Daniel Sheehan b , Catherine Maloney b & Fanny Caranikas-Walker b a Shapley Research Associates b Texas Center for Educational Research Published online: 02 Aug 2011. To cite this article: Kelly Shapley , Daniel Sheehan , Catherine Maloney & Fanny Caranikas-Walker (2011) Effects of Technology Immersion on Middle School Students’ Learning Opportunities and Achievement, The Journal of Educational Research, 104:5, 299-315, DOI: 10.1080/00220671003767615 To link to this article: http://dx.doi.org/10.1080/00220671003767615 PLEASE SCROLL DOWN FOR ARTICLE Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content. 
This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http:// www.tandfonline.com/page/terms-and-conditions


The Journal of Educational Research, 104:299–315, 2011
Copyright © Taylor & Francis Group, LLC
ISSN: 0022-0671 print / 1940-0675 online
DOI: 10.1080/00220671003767615

Effects of Technology Immersion on Middle School Students’ Learning Opportunities and Achievement

KELLY SHAPLEY
Shapley Research Associates

DANIEL SHEEHAN
CATHERINE MALONEY
FANNY CARANIKAS-WALKER
Texas Center for Educational Research

ABSTRACT. An experimental study of the Technology Immersion model involved comparisons between 21 middle schools that received laptops for each teacher and student, instructional and learning resources, professional development, and technical and pedagogical support, and 21 control schools. Using hierarchical linear modeling to analyze longitudinal survey and achievement data, the authors found that Technology Immersion had a positive effect on students’ technology proficiency and the frequency of their technology-based class activities and small-group interactions. Disciplinary actions declined, but treatment students attended school somewhat less regularly than control students. There was no statistically significant immersion effect on students’ reading or mathematics achievement, but the direction of predicted effects was consistently positive and was replicated across student cohorts.

Keywords: academic achievement, educational technology, evaluation, middle schools

The present vision for educational technology imagines technology’s infusion into all aspects of the educational system. Many educators, policymakers, and business leaders recognize technology’s pervasive presence in individuals’ daily lives and its ties to future opportunities for students who must compete in a global, knowledge-based economy (Friedman, 2005). Providing the technological, informational, and communication skills needed by 21st century learners, however, challenges schools to move beyond conventional modes of teaching and learning as well as the traditional boundaries of the school day and school walls.

Some researchers believe widespread technology use in society is moving schools inevitably toward more extensive and innovative applications of technology in curriculum and instruction (Dede, 2007; Smith & Broom, 2003). This view acknowledges that students who attend schools today are different from those of previous years because using technology in nonschool settings is altering their “learning styles, strengths, and preferences” (Dede, 2007, p. 11). New technologies are reshaping how students access information, communicate, and learn within and outside of classrooms (Smolin & Lawless, 2007). Schools, accordingly, must capitalize on students’ natural inclinations as learners.

Emerging technologies are also supporting more innovative forms of teaching and learning. For example, lessons supported by technology can involve real-world problems, current and authentic informational resources, virtual tours of remote locations, simulations of concepts, or interactions with practicing experts and global communities. These kinds of experiences are important because research shows that students learn more when they are engaged in meaningful, relevant, and intellectually stimulating work (Bransford, Brown, & Cocking, 2003; National Research Council & Institute of Medicine, 2004; Newmann, Bryk, & Nagoaka, 2001). Technology-enhanced learning experiences also can help students develop 21st century competencies, such as thinking and problem solving, interpersonal and self-directional skills, and digital literacy (Partnership for 21st Century Skills, 2006).

Texas, similar to other states, recognizes that students’ long-term success is tied to their preparation as lifelong learners, world-class communicators, competitive and creative knowledge workers, and contributing members of a global society. Yet, despite high aspirations for technology, the piecemeal way in which most schools have introduced technology into the educational process has been an obstacle to the effective use of technology for teaching and learning (Texas Education Agency [TEA], 2006).

Recognizing this limitation, the Texas Legislature in 2003 set forth a different vision for technology in Texas public schools. Senate Bill 396 called for the TEA to establish a Technology Immersion Pilot (TIP) that would immerse schools in technology by providing individual

Address correspondence to Kelly Shapley, Shapley Research Associates, P.O. Box 11858, College Station, TX 77842, USA. (E-mail: [email protected])


wireless mobile computing devices and technology-based learning resources along with teacher training and support for effective technology use. In response, the TEA has used more than $20 million in federal Title II, Part D monies to fund Technology Immersion projects for high-need middle schools. Concurrently, a research study, partially funded by a federal Evaluating State Educational Technology Programs grant, has investigated whether exposure to Technology Immersion improves student learning and achievement.

The Present Study

The present article reports third-year findings for students involved in a comprehensive experimental study of the effects of Technology Immersion on schools, teachers, and students. Specifically, we contrast outcomes for two cohorts of middle school students who attended Technology Immersion schools with students in control schools on measures of technology-related learning experiences and competencies and measures of academic achievement (reading and mathematics test scores). We present longitudinal outcomes for Cohort 1 students who attended schools across three project implementation years (Grades 6–8) and Cohort 2 students who attended schools during two implementation years (Grades 6–7).

Technology Immersion Model

A state statute provided a general description of Technology Immersion, but the TEA further defined the fundamental attributes of the model to ensure consistent interpretation across schools. Technology Immersion involves six components: (a) a wireless mobile computing device for each educator and student; (b) productivity, communication, and presentation software for use as learning tools; (c) online instructional resources that support the state’s curriculum; (d) online formative assessment tools; (e) professional development for teachers supporting technology integration; and (f) initial and ongoing technical support to maintain an immersed campus. See more complete details of model components in the Appendix.

Technology Immersion assumes that effective technology use in schools and classrooms demands a comprehensive approach. First, technology use requires robust access. Although the ratio of students to instructional computers in Texas has improved over time (Education Week, 2007), survey data show that an average of 2.9 or fewer classroom computers is insufficient to allow every student access (Shapley, Benner, Heikes, & Pieper, 2002; Shapley et al., 2006). In contrast to present circumstances, with computers typically located in school labs, libraries, and media centers, Technology Immersion requires one-to-one student access to laptop computers.

Second, Technology Immersion assumes that increased access to and use of technology in schools requires adequate technical and pedagogical support. Electronic networks in schools must support wireless laptops and digital content, and campus-based personnel must be available to assist teachers in learning to use technology, troubleshooting technical problems, and integrating technology into lessons (e.g., Ringstaff & Kelley, 2002; Ronnkvist, Dexter, & Anderson, 2000; Shapley et al., 2002).

Third, the Technology Immersion model assumes that teachers need effective professional development. High-quality professional development, as research demonstrates, should be of longer duration, and thus provide richer and more comprehensive learning experiences and time for practice and experimentation. Professional development should also focus on subject-specific content or specific teaching methods, and teachers should receive follow-up support as they implement new skills in classrooms (Bradburn & Osborne, 2007; Garet, Porter, Desimone, Birman, & Yoon, 2001; Neugent & Fox, 2007; Penuel, Fishman, Yamaguchi, & Gallagher, 2007; Ringstaff & Kelley, 2002). Leadership development is also crucial because research points consistently to the important role of school leaders in successful implementation of technology (Bradburn & Osborne, 2007; Pitler, 2005).

Fourth, Technology Immersion requires curricular and assessment resources that support the foundation curriculum in English language arts, mathematics, science, and social studies (e.g., online, CD-ROMs, stored on local networks). Thus, all laptops have software that allows students and educators to use wireless computers as a tool for teaching, learning, communication, and productivity. Digital resources and interactive technologies allow students to acquire more and newer information and build new knowledge by doing, receiving feedback, and refining their understanding (Bransford, Brown, & Cocking, 2003). Online formative assessments let teachers diagnose students’ needs and assess their mastery of curricular standards. Taken as a whole, if Technology Immersion components are well implemented, various obstacles that historically have posed barriers to the effective use of technology for teaching and learning in schools should be alleviated.

Research Questions

The overarching purpose of the study was to investigate the effects of Technology Immersion on students’ academic achievement; however, we also examined the relationships among Technology Immersion and intervening factors at the school, teacher, and student levels. The research involved 42 middle schools assigned to either treatment or control conditions (21 schools in each group). In the present study we addressed two research questions:

Research Question 1: What is the effect of Technology Immersion on students’ learning opportunities (i.e., classroom activities, engagement)?

Research Question 2: Does Technology Immersion affect student achievement?


FIGURE 1. Theoretical model of Technology Immersion.

Theoretical Framework

The theoretical framework of Technology Immersion presented in Figure 1 postulates a linear sequence of causal relationships that guided the research. In the framework, students in treatment schools are immersed in technology through the introduction of the six components. Given quality implementation, an improved school technology environment should lead teachers to use technology more effectively for teaching. In turn, improved school and classroom conditions should improve students’ technology proficiency, learning experiences, and engagement in school and learning, and, in due course, their standardized test scores. Student, family, and school characteristics exert their own influence on outcomes.

In theorizing student effects, we looked to studies of technology in general as well as to more recent research on one-to-one technology initiatives. Although research on one-to-one laptop projects has grown in recent years, there are still few large-scale experimental studies or studies with well-matched comparison groups (Penuel, 2006). Even though many studies fail to meet rigorous research standards, cumulative evidence points to important areas for investigation.

Factors Associated with Achievement

Research suggests that changes in students’ experiences allowed through Technology Immersion should contribute to enhanced learning and achievement. Specifically, increased access to technology has been linked to students’ technology use, technology proficiency, and school engagement, and, although less robustly, to academic achievement.

Technology use. One-to-one student access to computers, not surprisingly, leads to increased technology use. Russell, Bebell, and Higgins (2004) found that technology is used more often for instructional and learning purposes in one-to-one laptop classrooms. Additionally, studies show that students involved in ubiquitous technology projects use technology more often outside of school as well. For example, students in one-to-one classrooms used computers at home more frequently for academic purposes (Russell et al., 2004), and students spent less time watching television and more time on homework after receiving laptops (Baldwin, 1999). Moreover, research shows that lessons in technology-rich classrooms involve fewer teacher-centered, lecture-oriented activities and more student-centered ones (Baker, Gearhart, & Herman, 1994). Classroom structures shift from large group to students working independently or to more student-directed activities (Rockman et al., 1998; Russell, Bebell, Cowan, & Corbelli, 2002).

Technology proficiency. Increases in students’ technology proficiencies are also associated with ubiquitous technology. Rockman et al. (1998) reported that laptop students considered themselves more proficient users of Word, Excel,


PowerPoint, the Internet, e-mail, and CD-ROMs than did nonlaptop students. Similarly, elementary students who received laptops reported increased computer skills and better Internet research capabilities (Lowther, Ross, & Morrison, 2001), and high school students with laptops made greater gains than comparison students on measures such as knowledge of hardware and operating systems, productivity tools, and Internet use (Schaumburg, 2001).

Engagement. Numerous studies have associated one-to-one technology with increased student engagement (Maine Educational Policy Research Institute [MEPRI], 2003; Rockman et al., 1998; Russell et al., 2004; Woodul, Vitale, & Scott, 2000). For example, students involved in the Maine Learning Technology Initiative found school and learning more interesting and preferred using laptops for most school-related tasks (MEPRI, 2003). In Henrico County, Virginia, researchers related increased student motivation, engagement, and interest with one-to-one computing (Zucker & McGhee, 2005). Similarly, students in Apple Classrooms of Tomorrow voluntarily used time outside of school to work on technology-based projects, and they often initiated their own computer-related projects (Baker, Gearhart, & Herman, 1994).

Other researchers have examined the relationship between technology access and use and student behavior. A statewide study of middle schools in Florida showed that student conduct violations and disciplinary actions decreased as the number of computers per student increased (Barron, Hogarty, Kromery, & Lenkway, 1999). Other studies, likewise, reported decreased discipline problems associated with one-to-one computing (Baldwin, 1999; MEPRI, 2003). An evaluation of the North Carolina Laptop Notebook Project found that students in the laptop program had fewer absences and late arrivals compared with nonparticipants (Stevenson, 1998).

Academic Achievement

The present study is important because no large-scale, controlled studies have measured the impact of one-to-one computing on student achievement. Still, findings on the effects of laptops on student achievement from a few studies with comparison groups have been generally positive. The evaluation of the laptop project in Beaufort County, West Virginia, found that students participating in the program for 2 years had higher language, reading, and mathematics scores than did nonlaptop students, although there was no statistical control for prior achievement (Stevenson, 1998). The strongest evidence on the effects of laptops on achievement is in the area of writing. Lowther, Ross, and Morrison (2001, 2003) reported statistically significant effects favoring sixth- and seventh-grade students with laptops over control students for dimensions of writing, such as ideas and content, organization, and style. Rockman et al. (1999), likewise, found that laptop students outscored nonlaptop students on measures of writing objectives. More recently, a study conducted in Maine reported that students’ writing improved significantly with laptops (Silvernail & Gritter, 2007).

Additional research is needed to draw definitive conclusions about the effects of ubiquitous technology on student learning and achievement. Our experimental design, as described subsequently, provided the means to test the study’s hypotheses about Technology Immersion’s effects.

Method

Sample Selection

The study included Grades 6–8 middle schools from rural, suburban, and urban locations in Texas. Twenty-one Technology Immersion schools selected through a competitive grant process were matched by researchers with 21 control schools on multiple pretreatment measures.

Treatment sample. In the spring of 2004, the TEA released a series of Requests for Applications inviting school districts to apply for TIP grants for up to two middle schools that met eligibility requirements for federal Title II, Part D funds (high-need schools due to children from families with incomes below the poverty line, schools identified for improvement, or schools with substantial technology needs). The agency held an external review of proposals, with applications scored by five readers. Final selection of TIP schools involved the consideration of several factors, including proposal ratings, size, location, student diversity, and academic achievement. Decisions were influenced by the need for statewide geographic distribution and the availability of comparable schools for the control-group pool.

Control schools. The selection of control schools first involved the generation of a pool of Grades 6–8 middle schools eligible to receive federal funds for participation in the study. As a next step, we used statistical parameters to identify middle schools that matched treatment schools as nearly as possible on (a) the district and campus size, (b) the regional location, (c) the proportion of economically disadvantaged and minority students, (d) the percentage of students passing all Texas Assessment of Knowledge and Skills (TAKS) tests, and (e) the gaps between the percentage of Caucasian students and African American and Hispanic students passing TAKS (all tests). This selection process yielded 21 control-group schools, including controls for six schools that came from within the same districts as treatment schools and controls for 15 schools from closely matched single-middle-school districts.

The study used a control-group delayed intervention model. Each control school received $25,000 annually for study participation, with 25% of funds earmarked for professional development as required by federal guidelines.


TABLE 1. Baseline Characteristics of Technology Immersion (n = 21) and Control Schools (n = 21)

                                                          95% CI for difference
Variable                     Condition    M      SD       Lower     Upper     t(40)
Enrollment                   Immersion    374.9  348.4    −284.6    177.5     −0.47
                             Control      428.5  391.3
Economic disadvantage (%)    Immersion    70.8   17.5     −3.4      19.4      1.42
                             Control      62.8   19.0
Minority (%)                 Immersion    68.1   28.4     −10.4     24.7      0.83
                             Control      60.9   27.8
ESL (%)                      Immersion    13.5   17.2     −1.6      16.0      1.66
                             Control      6.3    9.9
Special education (%)        Immersion    14.7   5.5      −4.0      1.8       −0.76
                             Control      15.8   3.7
Student mobility (%)         Immersion    15.8   4.6      −3.8      2.8       −0.30
                             Control      16.3   5.9
TAKS 2004, passing all (%)   Immersion    52.4   15.7     −9.2      8.5       −0.08
                             Control      52.8   12.5
TAKS 2003, passing all (%)   Immersion    65.9   11.4     −9.1      5.5       −0.50
                             Control      67.6   12.0

Note. ESL = English as a second language; TAKS = Texas Assessment of Knowledge and Skills. Differences between groups were all statistically nonsignificant. Source: Texas Education Agency (2004) Academic Excellence Indicator System reports.

At the end of the second project year, the TEA offered grants to control schools to begin planning for Technology Immersion. Of the 21 control schools, 16 (76%) applied for and received TIP start-up funds. Grant guidelines in the third year (2006–2007) allowed teachers in control schools to receive laptops and instructional resources and participate in grant-supported professional development.

Characteristics of Comparison Groups

Treatment and control schools were drawn from comparable regions in Texas and were well matched by campus and district enrollments. For both groups, middle schools were typically small (about 80% enrolling 600 students or fewer), and they were located either in small or very small districts (two thirds enrolling 2,999 students or fewer) or large districts (one third enrolling 10,000 students or more). Because TIP grants targeted high-need schools, two thirds of students in the study (67%) came from economically disadvantaged backgrounds. Students were ethnically diverse: roughly 58% Hispanic, 7% African American, and 36% Caucasian.

Table 1 displays the baseline characteristics of schools. Comparisons of student characteristics show that the percentages of economically disadvantaged, minority, English as a second language, and special education students were statistically equivalent across the treatment and control schools. In addition, student enrollment, mobility, and TAKS passing rates were statistically comparable across groups. Consequently, the treatment and control schools were well matched initially on key demographic and academic performance measures.

The sample selection process and matching procedures produced an experimental design with good internal validity, in that there were no large, statistically significant treatment–control group differences. Nevertheless, the possibility of selection bias needed to be addressed because treatment schools selected through a competitive grant process may have differed from control schools on other dimensions. Thus, during site visits conducted at each of the 42 schools in fall 2004, researchers collected extensive baseline data on the characteristics of schools, teachers, and students; classroom practices; existing access to technology resources; and existing levels of technical and pedagogical support. Results showed there were no statistically significant or practically important preexisting differences between treatment and control schools that would bias outcomes (Shapley et al., 2006). However, as noted previously, a threat to internal validity was introduced in the third project year when control schools began to plan for immersion. The anticipation of Technology Immersion components in control schools could lead to an underestimate of the magnitude of treatment effects.

Another study limitation was external validity: the extent to which the results can be generalized from the specific sample to the general population. Due to grant funding restrictions, the treatment group was not representative of the average middle school in Texas. Compared with the state as a whole, the sample included greater proportions of economically disadvantaged students (67% vs. 51%) and Hispanic students (58% vs. 37%), and smaller proportions of economically advantaged students (33% vs. 49%), Caucasian students (36% vs. 46%), and African American students (7% vs. 14%). Sample schools were also smaller, on average, than middle schools statewide


(about 400 students vs. 670). Nevertheless, our results generalize to those schools that are smaller and more economically disadvantaged and enroll similar ethnic-racial populations.

Student Cohorts

Two cohorts of students were followed in the study, with Cohort 1 students enrolled continuously in schools over 3 project years (2004–2005 through 2006–2007) and Cohort 2 students for 2 years (2005–2006 through 2006–2007). Cohort 1 included 5,449 students (2,586 in 21 treatment schools and 2,863 in 21 control schools), and Cohort 2 included 5,526 students (2,644 in 21 treatment schools and 2,882 in 21 control schools). Data for multiple student cohorts and measurement occasions allowed researchers to assess the replicability of effects across cohorts and outcome measures.

Measures

Technology Survey. The Technology Survey included items that measured students’ technology proficiency (22 items), classroom activities (12 items), and small-group work (6 items). Cronbach’s alpha reliability coefficients for the scale scores ranged from 0.83 to 0.94. As a measure of technology proficiency, students indicated how well they could use various technology applications on a 5-point Likert-type scale ranging from 1 (I can do this not at all or barely) to 5 (I can do this extremely well). For measures of classroom activities and small-group work, students used a 5-point Likert-type scale to rate the frequency of activities or interactions, ranging from 1 (never) to 5 (almost daily). Survey response rates were in the 80%–90% range across the study’s 3 years. There were only slight, nonsignificant differences in response rates between cohorts and comparison groups. Survey items were drawn from previously validated instruments (i.e., TAGLIT Student Assessment; State Educational Technology Directors Association, Observation Tools for School Observers) and adapted through reviews by content experts to align with the Texas Technology Applications standards and TIP project objectives.
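The Cronbach's alpha coefficients reported above are computed from the item variances and the variance of the summed scale score. As a minimal illustrative sketch (not the authors' code), the coefficient for an items-by-respondents score matrix can be computed as follows:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of summed scale score
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Perfectly consistent responses (all items identical) yield alpha = 1.0
scores = np.array([[1, 1, 1], [2, 2, 2], [4, 4, 4], [5, 5, 5]])
```

Values in the 0.83–0.94 range reported for the survey scales indicate high internal consistency by this measure.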

Disciplinary actions and school attendance. Texas requires that schools report each disciplinary action that results in removal of a student from any part of the regular academic program. Accordingly, we collected Disciplinary Action Reports (Texas Public Education Information Management System [PEIMS], 425 records) for each student from schools at the end of the 2006–2007 school year. Additionally, we collected each student’s annual school attendance data from PEIMS.

Academic achievement. Our academic outcome measures are TAKS reading and mathematics scores. The TAKS is a criterion-referenced assessment that annually measures students’ mastery of the state’s content standards. Reading is assessed at Grades 3–9 and mathematics at Grades 3–11. TAKS internal consistency reliabilities vary slightly across subjects, grade levels, and testing years. For this study, TAKS score reliabilities ranged from 0.87 to 0.90 for reading and from 0.88 to 0.91 for mathematics. Evidence also supports the content, construct, and criterion-related validity of TAKS assessments. The TAKS scale score has a standard set at 2100 for each grade level. Because scores are not equated across grades, we used TAKS scale scores to calculate standardized scores that could be used to measure student progress across years. The standard score is a T score with a mean of 50 and a standard deviation of 10. The mean (50) represents the state average TAKS score for a grade level.
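The T-score transformation described here is a linear rescaling of within-grade z scores onto a metric with mean 50 and standard deviation 10. A sketch (the grade-level mean and SD below are placeholder values for illustration, not actual TAKS parameters):

```python
def taks_t_score(scale_score, grade_mean, grade_sd):
    """Convert a TAKS scale score to a T score (M = 50, SD = 10),
    standardizing against the statewide mean and SD for that grade and year."""
    z = (scale_score - grade_mean) / grade_sd
    return 50 + 10 * z

# A student at the state mean gets T = 50; one SD above the mean gets T = 60.
```

Because each grade is standardized against its own statewide distribution, T scores from different grades share a common metric and can be compared across years even though the underlying scale scores cannot.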

Analyses

This study combined two analytic approaches. First, we used three-level hierarchical growth models (HLM) to estimate the effects of Technology Immersion on students' growth trajectories for mediating variables and academic achievement. Second, we used t tests of differences between treatment and control group means to examine disciplinary actions.

HLM Growth Analyses

HLM models provide statistical tools for studying rates of change using measurements from multiple time points (Raudenbush & Bryk, 2002). For the present study, we collected data for Cohort 1 students at four time points: at baseline and at the end of three project implementation years. For Cohort 2, we collected data at three time points: at baseline and after each of 2 implementation years. We used separate three-level HLM models to estimate student- and school-specific effects (i.e., the extent to which outcome measures varied across time, students, and schools).

In our models, we hypothesized that student and school poverty are related to students' initial status and yearly growth rate. This supposition stems from evidence associating a higher concentration of economically disadvantaged students in a school with a lower level of Technology Immersion (Shapley, Sheehan, Maloney, & Caranikas-Walker, 2008). Similarly, other research reviews have confirmed negative effects of poverty on school reform efforts (Desimone, 2002) and student achievement (Sirin, 2005). Because TIP grants targeted high-needs schools, the percentages of disadvantaged students were high across most of the study's schools. Even so, school poverty varied substantially (ranging from 31% to 100%). For purposes of explanation, we describe the HLM statistical model for our survey data.

Level 1: Repeated measures model. Level 1 is a repeated measures model (i.e., survey time within students) that


captured key features of growth (i.e., initial status and rate of change). In the model, Y_tij is the survey scale score at year t for student i in school j, and survey time is the point at which students completed surveys (Cohort 1, fall 2004 and spring 2005, 2006, and 2007; Cohort 2, fall 2005 and spring 2006 and 2007). The key parameters in the model were π_0ij and π_1ij. The coefficient π_0ij represents the initial status (i.e., the estimated initial scale score) for student i in school j in fall, and π_1ij is the annual growth rate (rate of change) for student i in school j. The e_tij is the error term (within-student measurement error), assumed to be normally distributed with a mean of 0 and a constant variance. Thus, at Level 1, the model was

Y_tij = π_0ij + π_1ij(Survey Time)_tij + e_tij.

Level 2: Student-level model. The Level 2 model (between-students model) determined differences between students in features of growth (i.e., initial status [π_0ij] and rate of change [π_1ij]). In the student-level model, β_00j represents the mean initial status of a more economically advantaged student (advantaged = 0, disadvantaged = 1) within school j, and β_10j represents the mean rate of change for an economically advantaged student within school j. The coefficients β_01j and β_11j represent the effects of student poverty on initial status and school-year rate of change, respectively. The r_0ij and r_1ij are residuals (i.e., random effects). At Level 2, the model was

π_0ij = β_00j + β_01j(Disadvantaged)_ij + r_0ij

π_1ij = β_10j + β_11j(Disadvantaged)_ij + r_1ij.

Level 3: School-level model. At the school level (Level 3), we examined how students' initial status (β_00j) and growth (β_10j) varied across schools as a function of school-level random effects (µ_00j and µ_10j), as well as school conditions, including treatment status (an indicator variable with a value of 0 for a control school and 1 for an immersion school) and school poverty (a continuous variable with a grand mean of 69.8%). That is, we theorized that being in a treatment school was positively related to students' growth on technology-related survey scores, after controlling for the poverty level of the school. Thus, we posed the following school-level model:

β_00j = γ_000 + γ_001(Immersion Status)_j + γ_002(School Poverty)_j + µ_00j

β_10j = γ_100 + γ_101(Immersion Status)_j + γ_102(School Poverty)_j + µ_10j.

In the model, γ_000 was the overall mean initial status of an advantaged student at a control campus with an average level of school poverty, and γ_100 was the overall mean student growth rate (of an advantaged student at a control campus with an average level of school poverty). The coefficients γ_001 and γ_101 represented the direction and strength of association of immersion status with school-level initial status and growth rate, respectively. In addition, γ_002 and γ_102 represented the effect of school poverty on school-level initial status and growth rate, respectively.

The model's simplicity aids in the interpretation of effects. More complex models controlling for additional student demographic characteristics (gender and ethnicity) estimated nearly identical treatment growth coefficients.
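To make the three-level structure concrete, the sketch below simulates data from the composed Level 1–3 equations and recovers the treatment effect on growth (γ_101) by comparing per-school slopes. All parameter values, sample sizes, and variance components are hypothetical choices for illustration, not the study's estimates, and the simple per-school OLS comparison stands in for a full HLM fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration only (not the study's estimates).
gamma = {"g000": 2.0, "g001": 0.2, "g002": -0.3,   # initial-status equation
         "g100": 0.05, "g101": 0.20, "g102": 0.0}  # growth-rate equation

def simulate_school(immersion):
    """Generate 4 survey waves for 60 students in one school."""
    poverty = rng.uniform(0.31, 1.0) - 0.698           # centered at the 69.8% grand mean
    u0, u1 = rng.normal(0, 0.2), rng.normal(0, 0.05)   # school random effects (mu_00j, mu_10j)
    b00 = gamma["g000"] + gamma["g001"] * immersion + gamma["g002"] * poverty + u0
    b10 = gamma["g100"] + gamma["g101"] * immersion + gamma["g102"] * poverty + u1
    rows = []
    for _ in range(60):                                 # Level 2: students within school
        r0, r1 = rng.normal(0, 0.3), rng.normal(0, 0.05)
        pi0, pi1 = b00 + r0, b10 + r1                   # student intercept and slope
        for t in range(4):                              # Level 1: repeated measures
            rows.append((t, pi0 + pi1 * t + rng.normal(0, 0.3)))
    return np.array(rows)

# 21 treatment and 21 control schools, mirroring the study's design:
slopes = {0: [], 1: []}
for j in range(42):
    imm = j % 2
    data = simulate_school(imm)
    slopes[imm].append(np.polyfit(data[:, 0], data[:, 1], 1)[0])

slope_diff = np.mean(slopes[1]) - np.mean(slopes[0])    # rough estimate of gamma_101
```

With enough schools, the mean slope difference between treatment and control schools approaches the generating γ_101, which is the quantity the school-level model estimates after adjusting for poverty.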

Analysis of Disciplinary Data

We compared the frequency of disciplinary actions during the 2006–2007 school year at treatment and control schools for Cohorts 1 and 2. Preliminary statistical tests showed generally nonnormal and negatively skewed distributions of disciplinary data. However, given that t tests of differences between means are robust to violations of the normality assumption (Rasch & Guiard, 2004), we used the parametric procedure to test for differences between groups. Still, as a verification of results, the more conservative nonparametric Mann–Whitney U test yielded comparable conclusions.
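This two-step check, a parametric test verified by a nonparametric one, can be sketched on simulated count data. The Poisson rates and group sizes below are hypothetical stand-ins (loosely modeled on the group sizes reported later in Table 4), not the study's actual records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical skewed count data standing in for per-student disciplinary actions:
treatment = rng.poisson(lam=0.65, size=2584)
control = rng.poisson(lam=0.90, size=2899)

# Parametric test of the mean difference...
t_stat, t_p = stats.ttest_ind(treatment, control)
# ...verified with the nonparametric Mann-Whitney U test.
u_stat, u_p = stats.mannwhitneyu(treatment, control, alternative="two-sided")
```

When both procedures agree, as the authors report they did, the conclusion does not hinge on the normality assumption the count data violate.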

Results

We first report the effects of Technology Immersion on student mediating variables, including changes in students' technology-related learning experiences and engagement (measured by school attendance and disciplinary actions). After that, we examine treatment effects on students' reading and mathematics achievement.

Effects of Technology Immersion on Students' Learning Experiences and Engagement

Analyses of treatment effects on student mediating variables involved Cohorts 1 and 2 students who were continuously enrolled in schools since October 2004 and 2005, respectively. Statistical details for the three-level HLM growth models are reported in Table 2.

First, we estimated the effects of Technology Immersion on growth rates for three self-reported measures of students' learning activities and proficiency. HLM model-based estimations of Technology Immersion effects reported in Table 2 and model estimates in Table 3 show that, after controlling for school poverty (percentage of economically disadvantaged students) and student economic disadvantage (qualification for free or reduced-price lunch), advantaged and disadvantaged treatment group students, compared with their control group counterparts, had statistically significant and positive growth trends for classroom activities, small-group


TABLE 2. HLM Analysis of Technology Immersion Effects (Fixed) on Students' Growth for Mediating Variables

| Dependent variable and predictor | Classroom activities: γ | t | Small-group work: γ | t | Technology proficiency: γ | t | School attendance: γ | t |
|---|---|---|---|---|---|---|---|---|
| Cohort 1 (Grade 8) |  |  |  |  |  |  |  |  |
| Initial status (fall 2004) | 2.02 | 29.37*** | 2.80 | 48.73*** | 2.98 | 49.45*** | 97.71 | 567.25*** |
| Immersion^a | 0.26 | 3.26** | 0.06 | 0.99 | 0.04 | 0.56 | −0.05 | −0.19 |
| School poverty | −0.30 | −1.34 | −0.15 | −0.80 | −0.13 | −0.54 | 2.23 | 3.56** |
| Student disadvantage | 0.01 | 0.43 | −0.05 | −0.96 | −0.34 | −8.75*** | −0.65 | −3.79*** |
| Growth rate | 0.04 | 1.63 | −0.06 | −2.56* | 0.27 | 18.19*** | −0.17 | −2.72* |
| Immersion | 0.21 | 6.34*** | 0.07 | 2.79** | 0.04 | 1.53 | −0.27 | −3.04** |
| School poverty | 0.08 | 0.86 | −0.05 | −0.59 | 0.01 | 0.06 | −0.26 | −1.05 |
| Student disadvantage | 0.04 | 3.08** | 0.06 | 2.79** | 0.01 | 0.64 | −0.12 | −2.03* |
| Disadvantage × Immersion | — | — | — | — | 0.06 | 4.22*** | — | — |
| Cohort 2 (Grade 7) |  |  |  |  |  |  |  |  |
| Initial status (fall 2005) | 2.08 | 32.57*** | 2.79 | 45.50*** | 2.99 | 49.17*** | 97.52 | 671.67*** |
| Immersion | 0.15 | 1.57 | −0.06 | −0.76 | 0.01 | 0.11 | −0.26 | −1.41 |
| School poverty | 0.45 | 1.65 | 0.14 | 0.81 | 0.22 | 0.93 | 1.55 | 2.71* |
| Student disadvantage | −0.01 | −0.34 | −0.01 | −0.25 | −0.29 | −6.75*** | −0.49 | −3.59** |
| Growth rate | 0.06 | 1.40 | −0.01 | −0.35 | 0.27 | 8.23*** | −0.12 | −1.38 |
| Immersion | 0.24 | 4.16*** | 0.15 | 3.62** | 0.16 | 4.25*** | −0.25 | −2.28* |
| School poverty | −0.17 | −1.05 | −0.02 | −0.20 | −0.15 | −1.20 | −0.44 | −1.59 |
| Student disadvantage | 0.04 | 1.62 | −0.01 | −0.61 | 0.01 | 0.20 | −0.31 | −4.26*** |

Note. HLM = hierarchical growth models. Classroom activities and small-group work measure learning experiences; technology proficiency measures competency; school attendance measures engagement. Number of students: Cohort 1 (1,337 treatment and 1,467 control), Cohort 2 (1,595 treatment and 1,671 control). Number of schools: 21 treatment and 21 control.
^a Technology Immersion students had significantly higher initial classroom activities scores. A latent variable regression, controlling for the effect of this initial difference on the growth rate, indicated that the difference between the original (0.138) and adjusted (0.212) immersion coefficients was significant (the difference divided by the standard error of the difference equals −2.63). The growth rate coefficient adjusted for this difference is reported in the table.
*p < .05. **p < .01. ***p < .001.

work, and technology proficiency. Findings for specific scales are explained subsequently.

Classroom activities. Students reported the frequency with which their core-subject teachers (language arts, mathematics, science, social studies) had them use specific technology applications (e.g., use a word processor for writing, use a spreadsheet to calculate or graph, create a presentation) on a 5-point Likert-type scale ranging from 1 (never) to 5 (almost daily). As anticipated given the increased accessibility of hardware and digital resources, the yearly estimated rates of change in class activities involving technology for economically advantaged and disadvantaged treatment students were 0.25 and 0.30 scale-score points for Cohort 1 and 0.30 and 0.34 scale-score points for Cohort 2, respectively. In contrast, their control group counterparts had relatively flat rates of change (0.04–0.10 scale-score points). Average estimated pretreatment scores for students in Technology Immersion schools (2.2–2.3) indicated that they rarely (i.e., a few times a year) used various technology applications in their core-subject classes prior to immersion. However, their classroom usage across applications increased to nearly sometimes (i.e., once or twice a month) by spring 2007 (mean estimated scores from 2.8 to 3.2). In contrast, technology usage in control classrooms increased just slightly over the same time period (mean estimated baseline scores of 2.0–2.1 compared with 2.1–2.3 in spring 2007).

Small-group work. Recognizing established links between one-to-one computing and collaborative classroom structures, we asked students to rate the frequency of their small-group interactions with classmates on the 5-point Likert-type scale, including statements such as "we tutor or coach each other on difficult work," "we brainstorm solutions to problems," and "we produce a report or project." Growth rate coefficients showed that students at treatment schools reported increasing opportunities to work with classmates in small groups. Across cohorts, economically advantaged and disadvantaged treatment students had significantly positive yearly growth trends (0.02 and 0.07 scale-score points for Cohort 1 and 0.14 and 0.13 scale-score points for Cohort 2, respectively). Quite the opposite, students at control


TABLE 3. HLM Model-Based Estimations of Mean Scale Scores and Mean Growth Rates for Student Learning Variables by Treatment and Control Groups

Statistics for students in schools with average school poverty.

| Variable/Cohort/Student economic status | Immersion: Initial M | Immersion: Yearly growth | Immersion: Spring 2007 M | Control: Initial M | Control: Yearly growth | Control: Spring 2007 M |
|---|---|---|---|---|---|---|
| Classroom activities |  |  |  |  |  |  |
| Cohort 1: Grade 8^a — Advantaged | 2.28 | 0.25*** | 3.03 | 2.02 | 0.04 | 2.14 |
| Cohort 1: Grade 8^a — Disadvantaged | 2.29 | 0.30*** | 3.18 | 2.03 | 0.08 | 2.29 |
| Cohort 2: Grade 7^b — Advantaged | 2.23 | 0.30*** | 2.83 | 2.08 | 0.06 | 2.20 |
| Cohort 2: Grade 7^b — Disadvantaged | 2.22 | 0.34*** | 2.90 | 2.07 | 0.10 | 2.27 |
| Small-group work |  |  |  |  |  |  |
| Cohort 1: Grade 8^a — Advantaged | 2.86 | 0.02** | 2.91 | 2.80 | −0.06 | 2.64 |
| Cohort 1: Grade 8^a — Disadvantaged | 2.81 | 0.07** | 3.03 | 2.75 | 0.00 | 2.75 |
| Cohort 2: Grade 7^b — Advantaged | 2.72 | 0.14** | 3.00 | 2.79 | −0.01 | 2.76 |
| Cohort 2: Grade 7^b — Disadvantaged | 2.71 | 0.13** | 2.97 | 2.78 | −0.02 | 2.73 |
| Technology proficiency |  |  |  |  |  |  |
| Cohort 1: Grade 8^a — Advantaged | 3.02 | 0.31 | 3.95 | 2.98 | 0.27 | 3.78 |
| Cohort 1: Grade 8^a — Disadvantaged | 2.68 | 0.38*** | 3.81 | 2.64 | 0.28 | 3.46 |
| Cohort 2: Grade 7^b — Advantaged | 3.00 | 0.43*** | 3.86 | 2.99 | 0.27 | 3.53 |
| Cohort 2: Grade 7^b — Disadvantaged | 2.71 | 0.43*** | 3.58 | 2.70 | 0.27 | 3.25 |

Note. HLM = hierarchical growth models. Classroom activities and small-group work were measured on a 5-point Likert-type frequency scale ranging from 1 (never) to 5 (almost daily). Technology proficiency was measured on a 5-point Likert-type scale ranging from 1 (I can do this not at all or barely) to 5 (I can do this extremely well).
^a Fall 2004 to spring 2007 (3 years' growth) for Cohort 1. ^b Fall 2005 to spring 2007 (2 years' growth) for Cohort 2.
**p < .01. ***p < .001.

campuses reported less frequent small-group activities as they advanced to higher grade levels. Thus, in spring 2007, treatment students' average estimated scores (2.9–3.0) indicated that they sometimes (i.e., once or twice a month) interacted with peers in small groups, whereas control students' average scores (2.6–2.8) suggested that they worked together in small groups less frequently.
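The model-based means in Table 3 tie together arithmetically: the estimated spring 2007 mean equals the initial status plus the yearly growth rate times the years of implementation (3 for Cohort 1, 2 for Cohort 2). A quick check against the classroom activities rows:

```python
def projected_mean(initial_status, yearly_growth, years):
    """Spring 2007 model-based mean = initial status + yearly growth x years."""
    return round(initial_status + yearly_growth * years, 2)

# Cohort 1 advantaged treatment students (3 years of growth); Table 3 reports 3.03:
cohort1_treatment = projected_mean(2.28, 0.25, years=3)
# Cohort 2 advantaged treatment students (2 years of growth); Table 3 reports 2.83:
cohort2_treatment = projected_mean(2.23, 0.30, years=2)
```

The same relation reproduces the control-group columns (e.g., 2.02 + 3 × 0.04 = 2.14), which is how the growth coefficients translate into the end-of-study means discussed in the text.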

Technology proficiency. As a measure of their proficiency relative to the Texas Technology Applications standards, students rated their skills in using applications on a 5-point Likert-type scale ranging from 1 (I can do this not at all or barely) to 5 (I can do this extremely well). Although students at treatment schools reported consistently higher technology proficiency than did their control group peers, economic status differences emerged for Cohort 1 eighth-grade students in the third year. A statistically significant interaction between the treatment and students' economic status showed that economically disadvantaged students at treatment schools grew in proficiency at a significantly faster rate (0.38 scale-score points per year) than their more affluent immersion peers (0.31 scale-score points) and control group students (0.27 scale-score points). Thus, based on estimated mean scores in spring 2007, economically disadvantaged treatment students had narrowed the proficiency gap with advantaged treatment students (3.8 vs. 4.0), closed the proficiency gap with advantaged control students (3.8 vs. 3.8), and surpassed the proficiency of disadvantaged control students (3.8 vs. 3.5).

For Cohort 2, economically advantaged and disadvantaged treatment students grew in technology proficiency at significantly faster rates than their counterparts in control schools (0.43 scale-score points compared with 0.27 scale-score points). Consequently, economically disadvantaged Cohort 2 students in treatment schools surpassed advantaged control students in proficiency by the end of Grade 7 (estimated mean scores of 3.6 and 3.5, respectively).

The Technology Immersion model also posited that greater technology access and use would enhance student engagement as evidenced by increased school attendance and improved conduct. We found positive effects of


immersion on student behavior, but, surprisingly, negativeeffects on school attendance.

School attendance. As reported in Table 2, we used three-level HLM growth models to examine changes in attendance rates over time. For both groups, middle school students' attendance rates decreased as they advanced to higher grade levels; however, treatment students' attendance rates declined at a faster pace. For Cohort 1, the yearly estimated rate of change in attendance for advantaged and disadvantaged treatment students (−0.44 and −0.56 percentage points, respectively) was nearly twice as large as the rates for control students (−0.17 and −0.28 percentage points). Thus, at the end of Grade 8, economically advantaged and disadvantaged students in treatment schools had estimated mean attendance rates of 96.3% and 95.3%, respectively, compared with 97.2% and 96.2% for control students. Similarly, the yearly estimated rate of decline in attendance for Cohort 2 advantaged and disadvantaged treatment students (−0.37 and −0.68 percentage points, respectively) exceeded the change for control students (−0.12 and −0.43 percentage points).

Disciplinary actions. As another measure of engagement, we used independent t tests to compare the frequency of student disciplinary occurrences (removal of a student from the regular academic program for a full school day) at treatment and control schools. Results reported in Table 4 show statistically significant differences indicating less frequent student disciplinary incidents at treatment schools compared with control schools.

Specifically, Cohort 1 eighth-grade students at Technology Immersion schools had an average of 0.65 disciplinary actions per student compared with 0.90 disciplinary actions at control schools, t(5481) = 4.09, p < .001, Cohen's d = −.11. Similarly, Cohort 2 seventh-grade students at treatment schools had significantly fewer disciplinary actions than students at control schools, t(5513) = 5.83, p < .001, Cohen's d = −.16. Third-year findings on student discipline mirrored results for the first and second project years. More active and collaborative classroom learning experiences associated with individual laptops and digital resources seemed to improve students' engagement in class work.
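As a sanity check, the Cohort 1 effect size above can be recomputed from the summary statistics reported in Table 4, assuming the pooled-standard-deviation form of Cohen's d (the variant most consistent with the reported value):

```python
import math

# Cohort 1 summary statistics from Table 4:
n1, m1, s1 = 2584, 0.65, 2.04   # treatment: n, mean, SD
n2, m2, s2 = 2899, 0.90, 2.56   # control: n, mean, SD

pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
d = (m1 - m2) / pooled_sd        # Cohen's d; the article reports -0.11
```

The negative sign reflects fewer disciplinary actions in treatment schools; small rounding in the published means explains minor differences from the reported t statistic.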

Effects of Technology Immersion on Academic Achievement

Given that changes in students and their learning experiences were expected to mediate academic performance, we next estimated treatment effects on students' TAKS T scores. Our analyses concentrated on reading and mathematics scores because students completed TAKS tests for those subjects annually, whereas they completed TAKS tests for writing, science, and social studies at intermittent grade levels. We used three-level HLM growth models to examine how students' TAKS reading and mathematics achievement varied across time (the point at which students completed TAKS assessments each spring), students, and schools. As Table 5 shows, we estimated school mean rates of change as well as the separate effects of student economic disadvantage and school poverty concentration on TAKS reading and mathematics performance. Each HLM analysis included approximately 3,000–3,330 students divided nearly equally between the 21 treatment and 21 control schools. Comparable proportions of students were retained in analyses across years (58%–59% of treatment students, 58%–61% of control students).

TAKS reading. After controlling for student and school levels of poverty, we found no statistically significant effect of Technology Immersion on students' estimated growth rate for TAKS reading for either of the cohorts. For Cohort 1, the reading achievement of advantaged students in treatment and control schools (with average poverty) decreased across years, whereas economically disadvantaged students at treatment and control schools grew in reading achievement at significantly faster rates than their more advantaged peers (0.38 T-score points per year for treatment students vs. 0.17 T-score points for control students). TAKS reading outcomes for Cohort 2 seventh-grade students, likewise, showed no statistically significant treatment effect on

TABLE 4. Differences Between Mean Number of Disciplinary Actions per Student at Treatment and Control Schools by Cohort

| Cohort | Treatment: n | M | SD | Control: n | M | SD | t | Cohen's d |
|---|---|---|---|---|---|---|---|---|
| Cohort 1 (Grade 8) | 2,584 | 0.65 | 2.04 | 2,899 | 0.90 | 2.56 | 4.09*** | −0.11 |
| Cohort 2 (Grade 7) | 2,624 | 0.53 | 1.64 | 2,891 | 0.86 | 2.45 | 5.83*** | −0.16 |

Note. Independent samples t test for differences between average disciplinary actions per student at treatment and control schools. One outlier was removed from the analysis (a Cohort 1 control student with 112 disciplinary actions). Removing the outlier did not affect the conclusion.
***p < .001.


TABLE 5. HLM Statistics for Cohorts 1 and 2 Students: Effects of Technology Immersion on TAKS Reading and Mathematics Growth Rates

| Dependent variable and predictor | TAKS reading: γ | t | TAKS mathematics: γ | t |
|---|---|---|---|---|
| Cohort 1: Grade 8 |  |  |  |  |
| Initial mean status (2004 TAKS T score) | 54.002 | 76.89*** | 53.019 | 70.83*** |
| Immersion^a | −1.346 | −1.89† | −1.201 | −1.36 |
| School poverty | −6.504 | −4.49*** | −4.724 | −2.50* |
| Student disadvantage | −6.170 | −9.40*** | −4.487 | −8.60*** |
| Growth rate | −0.369 | −2.86** | −0.181 | −1.15 |
| Immersion^a | 0.212 | 1.45 | 0.582 | 1.95† |
| School poverty | 0.860 | 1.75† | 1.488 | 1.67 |
| Student disadvantage | 0.536 | 4.07*** | 0.025 | 0.29 |
| Disadvantage × Immersion | — | — | −0.408 | −1.90† |
| Cohort 2: Grade 7 |  |  |  |  |
| Initial mean status (2005 TAKS T score) | 52.770 | 100.87*** | 52.306 | 95.59*** |
| Immersion | −0.488 | −0.81 | −1.029 | −1.54 |
| School poverty | −8.049 | −5.77*** | −4.373 | −2.39* |
| Student disadvantage | −5.591 | −9.33*** | −4.492 | −7.80*** |
| Growth rate | −0.155 | −0.85 | −0.444 | −1.65 |
| Immersion | 0.388 | 1.66 | 0.708 | 1.78† |
| School poverty | 0.905 | 1.30 | 0.317 | 0.31 |
| Student disadvantage | 0.283 | 1.56 | 0.044 | 0.24 |

Note. HLM = hierarchical growth models; TAKS = Texas Assessment of Knowledge and Skills. Number of students: Cohort 1 reading (1,380 treatment, 1,613 control), Cohort 1 mathematics (1,397 treatment, 1,616 control); Cohort 2 reading (1,546 treatment, 1,725 control), Cohort 2 mathematics (1,560 treatment, 1,750 control). Number of schools: 21 treatment and 21 control.
^a Cohort 1 Technology Immersion students had significantly lower initial TAKS reading scores. A latent variable regression, controlling for the effect of this initial difference on the growth rate, indicated that the difference between the original and adjusted immersion coefficients was not significant (the difference divided by the standard error of the difference = 1.30).
†p < .10. *p < .05. **p < .01. ***p < .001.

students' achievement. Economically disadvantaged students in both comparison groups grew in reading at a slightly faster rate than their more advantaged classmates.

Table 6 shows a comparison of the average estimated initial TAKS reading and mathematics scores, yearly growth rates, and spring 2007 TAKS scores for the treatment and control schools. The table also includes the estimated magnitude of the Technology Immersion effects on TAKS reading and mathematics scores in standard deviation units.

TAKS mathematics. The estimated treatment effects on TAKS mathematics scores, as shown in Tables 5 and 6, were not statistically significant. Adjusting for student and school poverty, the estimated Technology Immersion effect on Cohort 1 students' growth rate for TAKS mathematics just missed conventional statistical significance (p < .06 rather than .05), and an interaction effect that approached statistical significance (p < .06) was also detected between the treatment and students' socioeconomic status. That is, economically advantaged students in treatment schools appeared to grow in mathematics achievement at a faster rate than disadvantaged treatment students (0.40 T-score points per year compared with 0.02 T-score points per year). Thus, the estimated mean TAKS mathematics T scores for economically advantaged Cohort 1 eighth-grade students in treatment schools increased from 51.8 to 53.0 across 3 school years, whereas the scores for their counterparts in control schools decreased from 53.0 to 52.5. The mathematics T scores for economically disadvantaged treatment students remained stable across years (47.3–47.4), whereas the scores of their control group peers declined slightly (48.5–48.1).

The estimated effect of Technology Immersion on the TAKS mathematics achievement of Cohort 2 seventh-grade students did not reach statistical significance for either advantaged or disadvantaged students (p < .08). However, the estimated TAKS mathematics scores of economically advantaged and disadvantaged students at treatment schools (with average poverty) increased (0.26 and 0.30 T-score points per year, respectively), whereas the scores for their counterparts in control schools decreased (−0.44 T-score points per year).

Overall, the effects of Technology Immersion on academic achievement, although somewhat more promising for mathematics, were less robust than expected. Despite that, the yearly estimated growth rates displayed in Table 6 revealed generally positive TAKS score growth trajectories for students in Technology Immersion schools, whereas students


TABLE 6. HLM Model-Based Estimations of Mean TAKS Reading and Mathematics T Scores and Mean Growth Rates by Treatment and Control Groups

Statistics for students in schools with average poverty.

| Subject/Cohort/Student economic status | Immersion: Initial M | Immersion: Yearly growth | Immersion: Spring 2007 M | Control: Initial M | Control: Yearly growth | Control: Spring 2007 M | Immersion effect in SD units |
|---|---|---|---|---|---|---|---|
| TAKS reading |  |  |  |  |  |  |  |
| Cohort 1: Grade 8^a — Advantaged | 52.66 | −0.16 | 52.19 | 54.00 | −0.37 | 52.90 | 0.06 |
| Cohort 1: Grade 8^a — Disadvantaged | 46.49 | 0.38 | 47.62 | 47.83 | 0.17 | 48.33 | 0.06 |
| Cohort 2: Grade 7^b — Advantaged | 52.28 | 0.39 | 52.75 | 52.77 | −0.16 | 52.46 | 0.08 |
| Cohort 2: Grade 7^b — Disadvantaged | 46.69 | 0.52 | 47.72 | 47.18 | 0.13 | 47.44 | 0.08 |
| TAKS mathematics |  |  |  |  |  |  |  |
| Cohort 1: Grade 8^a — Advantaged | 51.82 | 0.40† | 53.02 | 53.02 | −0.18 | 52.48 | 0.17 |
| Cohort 1: Grade 8^a — Disadvantaged | 47.33 | 0.02 | 47.39 | 48.53 | −0.16 | 48.06 | 0.05 |
| Cohort 2: Grade 7^b — Advantaged | 51.28 | 0.26† | 51.81 | 52.31 | −0.44 | 51.42 | 0.14 |
| Cohort 2: Grade 7^b — Disadvantaged | 46.79 | 0.31† | 47.40 | 47.81 | −0.40 | 47.01 | 0.14 |

Note. HLM = hierarchical growth models; TAKS = Texas Assessment of Knowledge and Skills. T-score M = 50 (state average TAKS score), SD = 10. Immersion effect in standard deviation units = difference in T-score cumulative growth between treatment and control groups / 10.
^a Fall 2004 to spring 2007 (3 years' growth) for Cohort 1. ^b Fall 2005 to spring 2007 (2 years' growth) for Cohort 2.
†p < .10.

in control schools usually had negative growth trends. The predicted effects on TAKS scores, measured in SDs, although very small, consistently favored students in Technology Immersion schools.
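The final column of Table 6 follows the formula given in the table note: the difference in cumulative T-score growth between groups divided by the T-score SD of 10. A minimal check against two of the table's rows:

```python
def immersion_effect_sd(trt_initial, trt_2007, ctl_initial, ctl_2007, sd=10):
    """Difference in cumulative T-score growth (treatment minus control), in SD units."""
    return round(((trt_2007 - trt_initial) - (ctl_2007 - ctl_initial)) / sd, 2)

# Cohort 1 advantaged students, TAKS mathematics (Table 6 reports 0.17):
effect_math = immersion_effect_sd(51.82, 53.02, 53.02, 52.48)
# Cohort 2 advantaged students, TAKS reading (Table 6 reports 0.08):
effect_reading = immersion_effect_sd(52.28, 52.75, 52.77, 52.46)
```

Because control-group trajectories were mostly negative, even flat treatment-group scores translate into small positive effects on this metric.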

Discussion

The study of Technology Immersion is distinguished from previous research on one-to-one computing environments by its experimental design and use of a theoretical framework to investigate causal mechanisms. The theory of change assumes that treatment students experience technology-rich school and classroom environments that foster more active and meaningful schoolwork, which, in turn, enhances students' personal competencies and engagement and ultimately increases academic achievement. Before discussing results, it is important to note that teachers and students in control schools typically had access to computers and digital resources in computer labs or media centers, as classroom stations (usually 1–3 computers), or on checkout laptop carts. Thus, control schools continued the traditional approach, with technology integration resting largely on the motivation of individual teachers, whereas Technology Immersion schools committed to whole-school integration. In the sections that follow, we discuss key findings relative to the study's research questions and the implications for one-to-one laptop programs at other schools.

Treatment Effects on Students’ Learning Opportunities

Individual laptops and digital resources allowed middle school students to develop greater technical proficiency and reduced their disciplinary problems in classes; however, they attended school somewhat less regularly. Our research confirmed other studies linking one-to-one computing with students' increased technical proficiency (e.g., Lowther et al., 2001; Rockman et al., 1998). Students in treatment schools made significantly greater progress than control students in meeting state standards (e.g., manage documents, use search engines and online references). Especially noteworthy was the positive immersion effect on students from lower socioeconomic backgrounds. Economically disadvantaged students in treatment schools reached proficiency levels that matched the skills of advantaged students in control schools (about 3.8 on the 5-point proficiency scale). Although students' increased technical skills may not raise their standardized test scores, new competencies could have long-ranging effects on students' future academic and career options.

Consistent with other research (Baldwin, 1999; Barron et al., 1999; MEPRI, 2003), students attending treatment


schools exhibited stronger engagement in academic work through more positive classroom behavior. Having fewer disciplinary actions suggests that individual laptops allowed teachers to create more active classroom learning experiences that more closely matched some students' preferred learning styles. Although effect sizes reflecting fewer disciplinary actions for treatment students were small (−.11 and −.16), they were replicated across all student cohorts and evaluation years. Reducing disciplinary actions could also have had practically important benefits, such as reducing behavioral management demands on administrators and teachers and reducing out-of-class time for students.

On the other hand, our study suggests that giving students laptops and raising their expectations for technology use may have unintended consequences. Unexpectedly, students at treatment schools attended school less regularly than control students. Although attendance rate differences between groups were small (about 1 percentage point, on average), the school attendance deficits were replicated across cohorts and years. The reason why students in immersion schools attended school at lower rates is unclear. Conceivably, students who preferred learning with laptops but experienced irregular laptop use in classrooms may have stayed home occasionally to use their laptops. Contrary to what may be expected, however, treatment students' lower average school attendance rates were not associated with lower average academic achievement. Future research studies may shed light on the relationship between individual student laptops and school attendance.

The infusion of technology resources changed the nature of classroom activities. As posited, teachers in Technology Immersion schools, who had more abundant and more convenient access to computers and resources, had their students use technology more often for learning (e.g., use a word processor to produce written work, create a presentation and share the information with classmates, conduct Internet research, and communicate via e-mail about topics studied). Teachers in technology-rich classrooms also organized their classes differently. Treatment students, for example, interacted more often with their peers in small groups to discuss assignments, to help each other with difficult work, and to collaboratively produce reports or projects. It is these kinds of activities that treatment students apparently found more engaging and that reduced behavioral problems in classes.

Treatment Effects on Academic Achievement

The effect of technology immersion on students’ reading or mathematics achievement was not statistically significant, but the direction of predicted effects was consistently positive and was replicated across student cohorts. Data for multiple student cohorts and measurement occasions allowed researchers to examine longitudinal reading and mathematics achievement trends. These kinds of analyses are important when evaluating the effectiveness of educational innovations because small treatment effects are noteworthy when evidence indicates that effects are replicable (Cohen, 1994; Schmidt, 1996) or cumulate into larger effects over time as programs mature (Abelson, 1985).

Evidence for Cohorts 1 and 2 showed that the estimated Technology Immersion effect on students’ TAKS reading achievement was positive (0.06 and 0.08 SDs, respectively) but not by statistically significant margins. Similarly, there was no statistically significant effect of Technology Immersion on students’ TAKS mathematics achievement. For Cohort 1 (eighth-grade students), the predicted immersion effect was positive, but it was stronger for economically advantaged students than for disadvantaged students (estimated effects of .17 and .05 SDs, respectively). For Cohort 2, the predicted effect of immersion on TAKS mathematics achievement was the same for economically advantaged and disadvantaged students (estimated effects of 0.14 SDs).

Several issues help to explain why positive and statistically significant changes in students and their learning experiences (i.e., behavior, technology proficiency, classroom activities, and interactions with peers) did not mediate larger effects on academic achievement. For one thing, our achievement analysis was limited by the available measures. We focused on reading and mathematics because TAKS tests were administered annually in those subjects, whereas yearly TAKS scores were not available for other subject areas potentially affected by Technology Immersion (writing, science, and social studies). Additionally, we had to use standardized scores (i.e., T scores based on state averages) because TAKS test scores were not equated across grade levels. As a result, our reported growth rates measure changes in students’ standing on TAKS reading and mathematics tests relative to state averages rather than true growth in student reading and mathematics achievement. Growth analysis with equated test scores might have yielded different results.

Also, previous studies of school reform have associated a greater number of implementation years with increased effects on achievement outcomes (Borman, 2005; Borman, Hewes, Overman, & Brown, 2003). Some researchers have reported that school change typically takes 3–5 or even more years to fully implement and produce stable student outcomes (Berman & McLaughlin, 1978; Hall & Hord, 2006). Similarly, we found that the effects of Technology Immersion on TAKS scores became stronger over time as teachers and students became more accomplished technology users. In the first, start-up year, the estimated immersion effects on TAKS reading and mathematics scores were negative. In the second year, immersion effects were typically positive but not by statistically significant margins, and in the third year, estimated immersion effects on TAKS reading and mathematics scores remained positive and neared conventional levels of statistical significance for mathematics.

Finally, despite gradual progress, uneven implementation of the Technology Immersion model across schools and classrooms undermined prospects for substantial improvements in student academic achievement. Site visits at schools revealed variations in students’ laptop experiences, with students in some schools using laptops frequently during the school day, whereas laptop use in other schools was sporadic. Middle schools struggled in the first project year to accommodate the complex demands of Technology Immersion. Teachers initially were at different stages of readiness for classroom immersion, and mathematics teachers at all schools found it difficult to integrate laptops into lessons. However, as teachers grew more comfortable with technology in the second and third project years, many drew selectively from a wide range of technology resources to enhance their teaching and students’ learning.

English language arts and reading teachers increasingly had students use laptops to write compositions and create presentations, learn and practice skills, read and comprehend texts, and play educational games. Mathematics teachers, who initially had students use laptops for online activities and mathematics games only after they completed paper-and-pencil assignments, gradually expanded their uses of laptops for diagnostic assessment, enrichment of concepts presented in traditional lessons, online mathematics programs that allowed students to work at their own pace, and individualized test preparation activities. In general, students in Cohort 2 had teachers who facilitated more appropriate learning experiences with laptops than students in Cohort 1, who were in Grade 6 during the start-up year. Shapley, Sheehan, Maloney, and Caranikas-Walker (2009) provided an in-depth examination of the implementation fidelity of Technology Immersion at middle schools and its relationship with student academic achievement.

Implications for Technology in Schools

The relationship between technology and student achievement continues to be an important topic and the focus of considerable research. Some recent and influential studies have raised concerns about the viability of financial investments in educational technology (e.g., Cuban, 2001; Dynarski et al., 2007). Likewise, if improved standardized test scores are the primary justification for investments in one-to-one laptop programs, then results probably will be disappointing. Evidence from this study suggests that large-scale one-to-one laptop programs are difficult to implement, and, as a result, programs may produce either very small or no improvements in test scores. Nonetheless, as the costs of laptops decline and the uses of wireless computers expand (e.g., digital textbooks and resources, online testing, school-to-home communication), interest in laptop programs is increasing (Zucker & Light, 2009). This pilot study of the Technology Immersion model offers lessons for school leaders as well as policymakers who are considering laptop programs for their schools.

Foremost, effective technology use clearly involves more than just buying computers and software. This study and others suggest that laptop programs may be more effective when technology is part of comprehensive school reform initiatives (Ringstaff & Kelley, 2002; Zhao & Frank, 2003). Successful Technology Immersion schools had highly committed administrative leaders who secured teacher buy-in for student laptops and provided the support components specified by the model. Particularly important were investments in technical support for school networks and timely laptop repairs, and the provision of ongoing professional development for teachers (Shapley, Maloney, Caranikas-Walker, & Sheehan, 2008). Consistent with other research, schools that served mainly economically disadvantaged student populations encountered numerous obstacles in trying to implement a complex school reform model (Desimone, 2002; Vernaz, Karam, Mariano, & DeMartini, 2006). Thus, those schools needed additional planning time to build capacity and secure adequate supports prior to implementing an immersion project.

Additionally, one-to-one laptop programs were more likely to be well implemented and sustained if laptops advanced overall goals for student learning and achievement. District and school leaders who embraced Technology Immersion believed that individual student laptops had benefits above and beyond simply raising standardized test scores. Financial investments in laptops were part of an overall migration toward digital school environments, including electronic textbooks, online assessments, and virtual coursework. These leaders believed laptops helped prepare their students for the 21st century, exposed them to worldwide cultures, expanded learning outside of school, and moved students toward product creation and away from drill and practice for tests. Technology Immersion supported their vision for learning opportunities that intellectually challenged and motivationally engaged students, inspired students to learn on their own, and prepared students for life, further education, and careers.

REFERENCES

Abelson, R. P. (1985). A variance explanation paradox: When a little is a lot. Psychological Bulletin, 97, 129–133.

Baker, E. L., Gearhart, M., & Herman, J. L. (1994). Evaluating the Apple classrooms of tomorrow. In E. Baker & H. O’Neil, Jr. (Eds.), Technology assessment in education and training (pp. 173–198). Hillsdale, NJ: Erlbaum.

Baldwin, F. D. (1999). Taking the classroom home. Appalachia, 32(1), 10–15.

Barron, A., Hogarty, K., Kromery, J., & Lenkway, P. (1999). An examination of the relationships between student conduct and the number of computers per student in Florida schools. Journal of Research on Computing in Education, 32(1), 98–107.

Berman, P., & McLaughlin, M. W. (1978). Federal programs supporting educational change: Vol. 8. Implementing and sustaining innovations. Santa Monica, CA: RAND.

Borman, G. D. (2005). National efforts to bring reform to scale in high-poverty schools: Outcomes and implications. In L. Parker (Ed.), Review of research in education, 29 (pp. 1–28). Washington, DC: American Educational Research Association.

Borman, G. D., Hewes, G. M., Overman, L. T., & Brown, S. (2003). Comprehensive school reform and achievement: A meta-analysis. Review of Educational Research, 73, 125–230.


Bradburn, F. B., & Osborne, J. W. (2007, March). Shared leadership makes an IMPACT in North Carolina. eSchool News. Retrieved from http://www.eschoolnews.com/news/top-news/index.cfm?i=45744

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2003). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Cohen, J. (1994). The earth is round (p < .05). American Psychologist, 49, 997–1003.

Cuban, L. (2001). Oversold & underused: Computers in the classroom. Cambridge, MA: Harvard University Press.

Dede, C. (2007). Reinventing the role of information and communications technologies in education. In L. Smolin, K. Lawless, & N. C. Burbules (Eds.), Information and communication technologies: Considerations of current practice for teachers and teacher educators (pp. 11–38). Malden, MA: Blackwell.

Desimone, L. (2002). How can comprehensive school reform models be successfully implemented? Review of Educational Research, 72, 433–479.

Dynarski, M., Agodini, R., Heaviside, S., Novak, T., Carey, N., Campuzano, L., et al. (2007). Effectiveness of reading and mathematics software products: Findings from the first student cohort (NCEE 2007–4005). Washington, DC: Institute of Education Sciences.

Education Week. (2007, March 29). Technology counts 2007: A digital decade [A special state-focused supplement]. Education Week, 26(30). Retrieved from www.edweek.org/rc

Friedman, T. L. (2005). The world is flat: A brief history of the twenty-first century. New York, NY: Farrar, Straus, and Giroux.

Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? American Educational Research Journal, 38, 915–945.

Hall, G., & Hord, S. (2006). Implementing change: Patterns, principles, and potholes. Boston, MA: Allyn & Bacon.

Lowther, D., Ross, S., & Morrison, G. (2001, July). Evaluation of a laptop program: Success and recommendations. Paper presented at the National Educational Computing Conference, Chicago, IL.

Lowther, D., Ross, S., & Morrison, G. (2003). When each one has one: The influence on teaching strategies and student achievement of using laptops in the classroom. ETR&D, 51(3), 23–44.

Maine Educational Policy Research Institute. (2003). The Maine learning technology initiative: Teacher, student, and school perspectives, mid-year evaluation report. Gorham, ME: University of Southern Maine.

National Research Council & Institute of Medicine. (2004). Engaging schools: Fostering high school students’ motivation to learn. Washington, DC: The National Academies Press.

Neugent, L., & Fox, C. (2007, January). Peer coaches spark technology integration. eSchool News. Retrieved from http://www.eschoolnews.com/news/top-news/index.cfm?i=42086

Newmann, F., Bryk, A., & Nagoaka, J. (2001). Authentic and intellectual work and standardized tests: Conflict or coexistence? Chicago, IL: Consortium on Chicago School Research.

Partnership for 21st Century Skills. (2006). Results that matter: 21st century skills and high school reform. Retrieved from http://www.21stcenturyskills.org/index.php?option=com_content&task=view&id=204&Itemid=114

Penuel, W. R. (2006). Implementation and effects of one-to-one computing initiatives: A research synthesis. Journal of Research on Technology in Education, 38, 329–348.

Penuel, W. R., Fishman, B. J., Yamaguchi, R., & Gallagher, L. P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44, 921–958.

Pitler, H. (2005). McREL technology initiative: The development of a technology intervention program: Final report (Report No. 2005–09). Denver, CO: Mid-continent Research for Education and Learning. (ED486685)

Rasch, D., & Guiliard, V. (2004). The robustness of parametric statistical methods. Psychology Science, 46, 175–208.

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods. Thousand Oaks, CA: Sage.

Ringstaff, C., & Kelley, L. (2002). The learning return on our educational technology investment. Retrieved from http://www.wested.org/cs/wes/view/rs/619

Rockman et al. (1998). Powerful tools for schooling: Second year study of the laptop program. San Francisco, CA: Author.

Rockman et al. (1999). A more complex picture: Laptop use and impact in the context of changing home and school access. San Francisco, CA: Author.

Ronnkvist, A., Dexter, S., & Anderson, R. (2000). Technology support: Its depth, breadth and impact in America’s schools. Retrieved from http://www.crito.uci.edu/tlc/findings.html

Russell, M., Bebell, D., Cowan, J., & Corbelli, M. (2002). An AlphaSmart for each student: Does teaching and learning change with full access to word processors? Chestnut Hill, MA: Boston College.

Russell, M., Bebell, D., & Higgins, J. (2004). Laptop learning: A comparison of teaching and learning in upper elementary classrooms equipped with shared carts of laptops and permanent 1:1 laptops. Journal of Educational Computing Research, 30, 313–330.

Schaumburg, H. (2001, June). Fostering girls’ computer literacy through laptop learning: Can mobile computers help to level out the gender difference? Paper presented at the National Educational Computing Conference, Chicago, IL.

Schmidt, F. (1996). Statistical significance testing and cumulative knowledge in psychology: Implications for the training of researchers. Psychological Methods, 1, 115–129.

Shapley, K. S., Benner, A. D., Heikes, E. J., & Pieper, A. M. (2002). Technology integration in education (TIE) initiative: Statewide survey report, Executive Summary. Austin, TX: Texas Center for Educational Research.

Shapley, K. S., Maloney, C., Caranikas-Walker, F., & Sheehan, D. (2008). Evaluation of the Texas Technology Immersion Pilot: Third-year (2006–07) traits of higher Technology Immersion schools and teachers. Austin, TX: Texas Center for Educational Research.

Shapley, K. S., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2008). Evaluation of the Texas Technology Immersion Pilot: Outcomes for the third year (2006–07). Austin, TX: Texas Center for Educational Research.

Shapley, K. S., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2009). Evaluating the implementation fidelity of technology immersion and its relationship with student achievement. Journal of Technology, Learning, and Assessment, 9(4), 5–68. Retrieved from http://www.jtla.org

Shapley, K. S., Sheehan, D., Sturges, K., Caranikas-Walker, F., Huntsberger, B., & Maloney, C. (2006). Evaluation of the Texas Technology Immersion Pilot: First-year results. Austin, TX: Texas Center for Educational Research.

Silvernail, D. L., & Gritter, A. K. (2007). Maine’s middle school laptop program: Creating better writers. Gorham, ME: University of Southern Maine, Maine Education Policy Research Institute.

Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta-analytic review of research. Review of Educational Research, 75, 417–453.

Smith, M. S., & Broom, M. (2003). The landscape and future of the use of technology in K–12 education. In H. F. O’Neal & R. S. Perez (Eds.), Technology applications in education: A learning view (pp. 3–30). Mahwah, NJ: Erlbaum.

Smolin, L., & Lawless, K. (2007). Technologies in schools: Stimulating a dialogue. In L. Smolin, K. Lawless, & N. C. Burbules (Eds.), Information and communication technologies: Considerations of current practice for teachers and teacher educators (pp. 11–38). Malden, MA: Blackwell.

Stevenson, K. R. (1998, November). Evaluation report-year 2: Schoolbook laptop project. Beaufort, SC: Beaufort County School District.

Texas Education Agency. (2004). 2003–04 academic excellence indicator system. Austin, TX: Author.

Texas Education Agency. (2006). Long-range plan for technology, 2006–2020: A report to the 80th Texas Legislature from the Texas Education Agency. Austin, TX: Author.

Vernaz, G., Karam, R., Mariano, L. T., & DeMartini, C. (2006). Evaluating comprehensive school reform models at scale: Focus on implementation. Santa Monica, CA: RAND.

Woodul, C., Vitale, M., & Scott, B. (2000). Using a cooperative multimedia learning environment to enhance learning and affective self-perceptions of at-risk students in grade 8. Journal of Educational Technology Systems, 28, 239–252.

Zhao, Y., & Frank, K. A. (2003). Factors affecting technology uses in schools: An ecological perspective. American Educational Research Journal, 40, 807–840.

Zucker, A., & Light, D. (2009). Laptop programs for students. Science, 323, 82–85.

Zucker, A., & McGhee, R. (2005). A study of one-to-one computer use in mathematics and science instruction at the secondary level in Henrico County Public Schools. San Francisco, CA: SRI International.


AUTHORS’ NOTE

Kelly Shapley, PhD, is the Director of Shapley Research Associates, a private research enterprise that specializes in education research, program evaluations, and policy studies. Her recent work has focused on studies of technology integration in schools and classrooms, whole-school reform, the efficacy of charter schools, and the value of programs and policies aimed at students at risk of academic failure.

Daniel Sheehan, EdD, is a senior research analyst at the Texas Center for Educational Research, a nonprofit research entity. He is a statistician and psychometrician with specializations in hierarchical linear models, measurement, test development, and program evaluation. He has authored or coauthored articles appearing in a wide range of journals.

Catherine Maloney, PhD, is the Director of the Texas Center for Educational Research. Her work at the research center focuses on the use of technology to improve the educational outcomes of underserved student groups, the role of school choice in efforts to reform public education, and the effectiveness of initiatives designed to improve the college readiness of low-income students.

Fanny Caranikas-Walker, PhD, is the training coordinator for the Small Business Development Center at Texas State University–San Marcos. She previously was an Assistant Professor at Washington State University and a research analyst at the Texas Center for Educational Research. Her research interests focus on the behavioral aspects of employment and employer–employee relationships in educational and other organizations, and the factors contributing to the retention and success of students.


APPENDIX
Technology Immersion Components

The Texas Education Agency selected three lead vendors as providers of Technology Immersion packages (Dell Computer, Inc., Apple Computer Inc., and Region 1 Education Service Center [ESC]). The sections that follow describe the package components.

Wireless Laptops and Productivity Software

All vendors offered a wireless laptop as the mobile computing device. Campuses could select either Apple laptops (iBook and Mac OS X) or Dell laptops (Inspiron or Latitude with Windows OS). For Apple laptops, AppleWorks provided a suite of productivity tools, including Keynote presentation software, Internet Explorer, Apple Mail, iCal calendars, iChat instant messaging, and the iLife Digital Media Suite (iMovie, iPhoto, iTunes, GarageBand, and iDVD). For Dell laptops, Microsoft Office included Word, Excel, Outlook, PowerPoint, and Access. In addition, eChalk served as a “portal” to other web-based applications and resources included in the immersion package and as a student-safe e-mail solution. Region 1 ESC provided Dell products.

Online Instructional and Assessment Resources

Immersion packages included a variety of digital resources. Apple included the following online resources: netTrekker (an academic Internet search engine), Beyond Books from Apex Learning (reading, science, and social studies online), ClassTools Math from Apex Learning (complete mathematics instruction), ExploreLearning Math and Science (supplemental mathematics/science curriculum), TeenBiz3000 from Achieve 3000 (differentiated reading instruction), and My Access Writing from Vantage Learning (support for writing proficiency). Dell, Inc. selected netTrekker (an academic Internet search engine) and Connected Tech from Classroom Connect (technology-based lessons and projects). Region 1 ESC selected Connected Tech but also added a variety of teaching and learning resources, including Unitedstreaming (digital videos), Encyclopedia Britannica, EBSCO (databases), NewsBank, and the K12 Teaching and Learning Center. For the Apple package, AssessmentMaster (Renaissance Learning) provided a formative assessment in all four core subject areas. Both the Dell and Region 1 ESC packages provided i-Know (CTB McGraw Hill) for core-subject assessment. In addition, all campuses had access to the online Texas Mathematics Diagnostic System (TMDS) and Texas Science Diagnostic System (TSDS) that were provided free of charge by the state.

In addition to these resources, individual schools had access to software products that had been purchased prior to the Technology Immersion project, digital resources that accompanied adopted textbooks, and educational resources available free of charge on the Internet.

Professional Development

Each immersion package included a different professional development provider. Apple used its own professional development model, whereas the Dell package relied on Pearson Learning Group, a commercial provider (formerly Co-nect), to support professional development. Region 1 ESC used a combination of service center support plus other services offered through Connected Coaching and Connected University. Although the professional development models and providers differed, they all were expected to include some common required elements, including support for immersion package components, the design of technology-enhanced learning environments and experiences, lesson development in the core-subject areas, sustained learning opportunities, and ongoing coaching and support. Individual districts and campuses collaborated with vendors to develop specific professional development plans for their teachers and other staff.

Technical and Pedagogical Support

Each Technology Immersion package provider also was required to provide campus-based technical support that advanced the effective use of technology for teaching and learning. Apple designed a Master Service and Support Program. Dell established a Call Center dedicated to technical support for TIP grantees as well as an 800 telephone number for hardware and software support. Region 1 ESC had an online and telephone HelpDesk to answer questions and provide assistance. Individual districts and schools also provided support.
