
ORIGINAL RESEARCH

Middle School Students’ Writing and Feedback in a Cloud-Based Classroom Environment

Binbin Zheng · Joshua Lawrence · Mark Warschauer · Chin-Hsi Lin

© Springer Science+Business Media Dordrecht 2014

Abstract Individual writing and collaborative writing skills are important for academic success, yet are poorly taught in K-12 classrooms. This study examines how sixth-grade students (n = 257) taught by two teachers used Google Docs to write and exchange feedback. We used longitudinal growth models to analyze a large number of student writing samples (n = 3,537) as they changed across individual writing sessions (n = 18,146), and multiple regression analyses to relate writing quantity and types of feedback to changes in students’ standardized test scores. Additionally, student survey data and content analysis of students’ writing samples were used to contextualize and interpret students’ writing patterns. Overall, students had a positive attitude towards using Google Docs for editing and for the provision and receipt of feedback. They received various types of feedback from both their teachers and peers, but most of it was direct in character and focused on mechanics and grammar errors. However, neither students’ writing quantity nor the types of feedback they received on Google Docs was associated with their writing achievement. Our findings suggest that cloud-based technology could be integrated into K-12 classrooms to support students’ writing and editing, to engage students in collaborative writing, and to improve interactions between writers and readers.

Keywords Google Docs · Collaborative writing · Feedback · Cloud-based technology

B. Zheng (✉) · C.-H. Lin
620 Farm Lane, East Lansing, MI 48824, USA
e-mail: [email protected]

C.-H. Lin
e-mail: [email protected]

J. Lawrence · M. Warschauer
3200 Education, Irvine, CA 92697, USA
e-mail: [email protected]

M. Warschauer
e-mail: [email protected]

Tech Know Learn, DOI 10.1007/s10758-014-9239-z

Writing is a vital skill in a knowledge economy, yet only 27 % of eighth-grade students in the United States write at a proficient level (NCES 2012). Individual feedback is considered essential for learning how to write, but secondary teachers may have difficulty providing feedback to all the students they teach in a day (Cho and Cho 2007). In traditional instructional environments, students write for their teacher rather than with or for their classmates. This format limits not only the amount of feedback students might obtain from their peers, but also the possibilities for collaborative writing; this in turn contributes to the challenges students are likely to face later in office and university environments, in which the majority of documents now have at least two contributors (Ede and Lunsford 1992; Geoffrey 2001; also see discussion in Calvo et al. 2011).

New cloud-based writing environments could potentially enable more writing feedback and collaboration among students, as well as between students and teachers; and with the growth of school laptop programs and free online services, they are becoming more common. However, the cloud-based writing environments that do exist in K-12 schools have not been widely studied. In this paper we present a study of 257 middle school students’ writing and feedback based on their writing samples on Google Docs, standardized literacy test scores, and survey responses. First, we review relevant studies related to collaboration and feedback in writing.

1 Computer-Supported Collaborative Writing

According to Flower and Hayes (1981), writing comprises three main components: planning, translating, and reviewing. This model was further refined from a socio-cultural perspective by Hayes (1996), who suggests that writing is a recursive process during which students plan, reflect, make inferences, solve problems, and produce and interpret texts (see also Lindblom-Ylanne and Pihlajamaki 2003). Students are encouraged to participate in writing activities which foster interactions and collaborations with others (Storch 2005; Yeh et al. 2011). Collaboration in writing tends to help learners reflect on their own language production when they attempt to create meaning (Swain 1995). However, the process of collaborative writing is complex, involving role assignment, planning, brainstorming, drafting, reviewing, revising, and editing (Calvo et al. 2011). It is best accomplished with the support of shared documents that allow for synchronous editing, tracked changes, and shared responsibilities across the writing task.

New digital technologies can provide better environments for carrying out these complex tasks than traditional word-processing tools can: a meta-analysis of 26 studies on student writing with computers (Goldberg et al. 2003) found that computer-based writing is typically more collaborative and more iterative, and involves more peer editing, than writing with paper and pencil. Nowadays, with the rapid emergence of new technologies, collaborative writing can take place on discussion boards, online chat platforms, wikis, or web-based word-processing tools (Kessler et al. 2012). Computer-supported collaborative writing could be of great value for writers because of inherent features such as flexible authoring and content creation, and the support for collective knowledge construction (Elola and Oskoz 2010; Yeh et al. 2011). For example, the open-editing and review functions of wikis have made them effective tools to support collaborative writing (Parker and Chao 2007). These technologies also ease the processes of feedback and revision, which could further benefit students’ collaborative writing by potentially increasing their motivation and promoting their sense of ownership and autonomy (Kessler et al. 2012).


2 Collaborative Writing and Student Learning

Previous research has indicated that collaborative writing could potentially contribute to a better sense of audience and ownership; increased motivation and critical thinking skills; closer attention to grammatical accuracy, vocabulary use, and discourse; and, ultimately, better writing quality (e.g., Chao and Lo 2009; Elola and Oskoz 2010; Kessler et al. 2012; Lindblom-Ylanne and Pihlajamaki 2003; Storch 2005; Yeh et al. 2011).

Nagelhout (1999) argues that collaborative writing is most beneficial in its potential to engage students in the recursive process of writing. During the collaborative process, less proficient writers could learn from others to improve their writing, while more proficient ones could write more critically due to the exchange of ideas with others (Yeh et al. 2011). Elola and Oskoz (2010) examined collaborative writing on a wiki platform by college students taking Spanish as an L2; their respondents perceived that when writing collaboratively, the overall quality of their essays was improved through the process of polishing over multiple drafts. Similarly, another study found that collaborative writing in a wiki environment improved English as a foreign language (EFL) college students’ writing accuracy (Kessler et al. 2012). This echoes Storch’s (2005) finding that EFL college students writing in pairs produced shorter, but higher-quality, texts with greater grammatical accuracy and linguistic complexity than did students writing individually. Other research has demonstrated that undergraduate students significantly improved their writing skills in terms of grammar, mechanics, writing style, and referencing after going through a peer-review process (Fallahi et al. 2006). At the elementary level, Chinese fifth-grade EFL students reported that they enjoyed English writing with wikis, and perceived that collaborative writing helped foster teamwork and improve their writing (Woo et al. 2011).

However, contradictory results about the effects of collaboration on writing quality have also been found. For example, wiki-based collaborative writing practice was shown to have a significant effect on elementary EFL students’ attitudes toward writing, but no significant effect on their writing abilities (Li et al. 2014). Using focus-group interviews and questionnaires, another study indicated that, although college-level EFL students expressed positive attitudes toward collaborative writing on wiki pages, they felt inhibited and somewhat uncomfortable about editing one another’s work (Aydin and Yildiz 2014). Similarly, undergraduate students collaborating on a written assignment indicated that they were more comfortable receiving suggestions than direct edits, since they felt that others sometimes worsened their documents when editing them (Blau and Caspi 2009).

3 Types of Collaborative Feedback

Students writing in shared online environments are able to engage in meaningful processes of peer interaction, through providing feedback to one another and receiving it (Chao and Lo 2009). Research suggests that the improved outcomes observed in regard to collaborative writing may be due in part to enhanced opportunities to give and receive feedback (Ge 2011; Storch 2005). Different types of feedback may lead to different levels of learner uptake, and previous research has identified several distinct feedback types. For example, Sheppard (1992) categorized feedback into two types: discrete-item attention to form, in which feedback providers indicate error types and locations, and holistic feedback on meaning, in which they use general requests for clarification, such as “You might want to be more clear about what this means.” Another study, by Robb et al. (1986), summarized four different feedback types: direct, coded, uncoded, and marginal. Direct feedback refers to a teacher providing correct forms for all categories of lexical, syntactic, and stylistic errors. Coded feedback means that the teacher uses an abbreviated code system, such as “s-v” for subject-verb disagreement. Uncoded feedback means that the teacher highlights the location of errors, but does not specify precisely what they are. Lastly, the most indirect type of feedback, marginal feedback, indicates that the teacher only provides the number of errors per line in the margins. Similarly, Ellis (2009) presents a typology of written corrective feedback (CF) types and describes six strategies for providing CF: (a) direct CF; (b) indirect CF; (c) metalinguistic CF; (d) unfocused and focused CF; (e) electronic feedback; and (f) reformulation. Among these, (a) is similar to the direct feedback identified by Robb et al., while (b) covers all three remaining types of feedback (i.e., coded, uncoded, and marginal) discussed in Robb et al.’s (1986) study.

In addition to CF, feedback providers have the option of providing non-corrective feedback that neither directly nor indirectly points out errors or correct forms, but instead conveys general feelings or evaluation. Affective feedback, a sub-type of non-corrective feedback, is defined as the provision of encouragement or emotional responses (Vigil and Oller 1976). Vigil and Oller differentiated affective feedback from cognitive feedback, and argued that it could have differing impacts on writers, as affective feedback could be positive or negative (see also Han 2002). It has also been suggested that using encouragement in responses could have beneficial effects on students’ motivation to write well (Jago 2001).

When asked to work in technology-supported collaborative environments such as wikis, students tend to avoid giving feedback on others’ written products, and to discourage others from editing their own texts (Dalke et al. 2007). When they do give feedback, it is focused more on their peers’ language at the word and sentence levels than on content in a broader sense (Chamberlain 2010; Lund and Smørdal 2006; Storch 2005). For example, Wang (2009) found that university-level Taiwanese EFL students in a blog environment provided more lexical and grammatical feedback and less organizational and content feedback. Similarly, Ge (2011) found that Chinese adult English learners participating in a network-based peer-review activity focused primarily on pointing out grammatical mistakes in others’ writing, while only a few of them provided comments on style or textual organization. These findings align with models that acknowledge how types of feedback are related to feedback providers’ writing levels: more experienced writers tend to focus on errors of meaning or structure, while less experienced writers tend to focus on surface changes (Flower et al. 1986; Jones 2008). Chaulk (1994), for example, found that while students’ feedback usually focused on the micro level, teachers tended to provide feedback at the macro level. It is perhaps unsurprising, then, that students are more inclined to report adopting teacher feedback to improve their writing, and tend to be more critical of feedback received from peers (Yang et al. 2006). Even so, peer feedback plays an important role and could be an effective complement to teacher feedback in students’ writing process (Zhang 2008). Cho and colleagues found that, compared to expert feedback, peer feedback is more effective in improving writing in both educational and organizational settings (Cho and Cho 2007; Cho and Schunn 2007).

Most prior research on computer-supported writing and revision has been conducted in higher-education or industry settings (see, e.g., Couture and Rymer 1991; Geoffrey 2001), while most research on feedback has favored feedback provided by teachers. There has been little research to date on computer-supported writing and feedback provision in K-12 schools, and virtually none involving new cloud-based tools. This study, therefore, investigates the impact of the use of one prominent cloud-based tool, Google Docs, on middle school students’ writing development, and the types of feedback that both teachers and peers provided using this tool.

4 Google Docs

Wikis provide a favorable online environment for collaborative writing and social interaction, and facilitate the creation of online writing communities in which participants can write, edit, and modify texts collaboratively (Chao and Lo 2009; Godwin-Jones 2003). However, writing on wikis often involves the use of complex code and access to specialized sites. An alternative collaborative writing environment, Google Docs, has come into widespread use in recent years, probably due to its association with other popular Google tools as well as to the simplicity of its writing interface.

Google Docs is widely used by individuals, and has also been made available at no charge to school districts, universities, and nonprofit organizations through Google Apps for Education (Oishi 2007). The formatting functions available to authors in the Google Docs environment are limited compared to those available in some commercial word-processing applications, but they allow for the kinds of composition, communication, and publishing that are typically required in K-12 schools. Google Docs supports both synchronous and asynchronous editing and commenting by multiple users on different computers, allowing students to share documents with others while controlling the level of access that invited collaborators have to a given document (Blau and Caspi 2009; Chamberlain 2010; Conner 2008). Users other than the original author can view, edit, and comment on any document once its author gives them permission to do so. The possibility of content editing by multiple contributors on Google Docs may make it a desirable application for collaborative learning (Educause Learning Initiative 2008). In addition, the revision-history function of Google Docs makes the writing process more transparent and easier to manage for collaborators; knowing that their changes are automatically saved and that previous versions can be retrieved makes students more willing to make changes, and alleviates the tension between individual and collaborative writing (Kessler et al. 2012).

This study examines how middle school students used Google Docs for different writing activities; the relationship between students’ writing on Google Docs and their writing achievement; and the types of feedback students received from peers and teachers on Google Docs. Four research questions were addressed:

1. Perceptions: How did students perceive the use of Google Docs in their English language arts (ELA) classrooms?

2. Writing quantity: What were the patterns of revisions and amendments for documents with various numbers of authors and with authors at different writing skill levels?

3. Writing and reading achievement: How were students’ writing development on Google Docs, and their self-reported use of Google Docs, related to their standardized writing and reading achievement growth?

4. Feedback: What kinds of feedback did students receive from peers and teachers on Google Docs? And do different types of feedback or feedback providers affect students’ writing and reading achievement growth?


5 Methods

5.1 Context

This study took place in one middle school in a suburban Colorado school district. The district employed Calkins’ (1994) Writing Workshop model to implement a district-wide writing curriculum in 2009. This model focuses on writing for authentic purposes and audiences. To support this curriculum, a laptop initiative called Inspired Writing was first implemented among all fifth-grade classes in the district in the 2009–2010 school year. Each student in the program was provided with an Eee netbook for use throughout the school day, using the open-source Linux operating system and, for the most part, open-source software. Netbooks and various social media were extensively used in the school district to support students’ authentic writing. Teachers involved in this laptop program participated in a week of training on the hardware and software and the integration of this technology into the curriculum. From the beginning of the 2011–2012 school year, all ELA teachers in participating middle schools in the district were encouraged to use Google Docs in their classrooms to support students’ authentic writing. All these teachers were asked to share documents with our research team as part of a larger study, but this study focuses on one middle school in particular because the two teachers in that school shared all 3,537 of their 257 sixth-grade students’ documents with us. Descriptive statistics of these students’ individual characteristics are presented in Table 1. Among these participants, 89 % were White and 7 % were Hispanic. Only 2 % were English learners, and 10 % were free- or reduced-price lunch recipients. This closely matched the demographic composition of the school district as a whole.

5.2 Sources of Data

5.2.1 Test Score Data

The statewide Colorado Student Assessment Program (CSAP) writing and reading test scores from the 2010–2011 school year were collected before students began working on Google Docs in the target classrooms, and CSAP writing and reading test scores from the 2011–2012 school year were collected at the end of the year.

5.2.2 Student Survey

All sixth graders in this study completed our district-wide online survey (see Appendix) at the end of the 2011–2012 school year. The survey queried students’ basic demographic information; their self-perceived computer skills; the frequency of their laptop use for particular tasks and activities (including the specific use of various technologies); the frequency of their use of Google products; their opinions of how writing on Google Docs compared with writing on paper and on word-processing software; how they perceived collaborative writing; and their overall evaluation of the use of Google Docs. A total of 231 students responded to the student survey (a response rate of 89.9 %).

5.2.3 Documents

We collected samples of writing created using Google Docs by all 257 students taught by the two teachers during the 2011–2012 school year: in all, this consisted of 3,537 documents and 18,146 revisions. Our research team developed an analytical tool called SCAPES (http://scapes-uci.appspot.com/) to extract records of every revision session of each Google Docs document. In addition, 40 students’ writings were randomly selected for further content analysis, and feedback from both their teachers and peers was retrieved for each of the 919 documents that these 40 students created.

5.3 Measurement

5.3.1 Writing and Reading Achievement

CSAP is Colorado’s state standards-based assessment, and its numeric writing and reading scale scores were used in this study. The writing test asks students to write essays on the basis of a given writing prompt. For example, the sixth-grade writing prompt in 2010 was: “Imagine that your family is moving to a new place and your pet will not be allowed to live there. Write a letter to persuade the person in authority to allow your pet to live there.” The reading test mainly focuses on students’ reading comprehension, asking them to respond to reading materials, often by using details from the reading to describe what happened or to explain its overall meaning. According to the Colorado Department of Education (2011b), the Cronbach’s alpha reliability scores for the 2011 CSAP sixth-grade writing and reading tests were .91 and .93, respectively. The same report also mentioned that IRT models were used to test validity, with the results indicating that less than 4 % of score items across all subjects at all grade levels on the 2011 assessments were flagged for poor model fit, meaning that the resulting scores are interpretable and valid (p. 83). The sixth-grade writing test was graded on four levels: advanced (score range: 600–840), proficient (513–599), partially proficient (423–512), and unsatisfactory (230–422). For the sixth-grade reading test, the score ranges for the four levels were: advanced (696–970), proficient (600–695), partially proficient (543–599), and unsatisfactory (260–542). Among the students in the present study, pre-test writing scores ranged from 419 to 780 (M = 549, SD = 54), and post-test writing scores from 411 to 737 (M = 558, SD = 53). The sample’s pre-test reading scores ranged from 220 to 788 (M = 651, SD = 53), and post-test reading scores from 489 to 819 (M = 669, SD = 46).
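The cut scores above amount to a simple lookup. As a worked restatement of the published ranges for the writing test (the function name and error handling are our own choices, not part of the CSAP materials):

```python
# Map a sixth-grade CSAP writing scale score to its performance level,
# using the cut scores quoted above: 600-840 advanced, 513-599 proficient,
# 423-512 partially proficient, 230-422 unsatisfactory.

def writing_level(score):
    if not 230 <= score <= 840:
        raise ValueError("score outside the reported 230-840 writing scale")
    if score >= 600:
        return "advanced"
    if score >= 513:
        return "proficient"
    if score >= 423:
        return "partially proficient"
    return "unsatisfactory"

print(writing_level(549))  # the sample's pre-test writing mean -> "proficient"
```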

5.3.2 Writing and Revision

In this study, we used time variables, session-level variables, and document-level variables to describe students’ writing and revision.

Table 1 Descriptive statistics of student characteristics

                                 Percentages (%)
Male                             51
Hispanic                          7
White                            89
Others                            4
Free/reduced lunch recipients    10
English learners                  2
Gifted students                  33
IEP students                      5
N                               257

Middle School Students’ Writing and Feedback

123

Editing Sessions This variable records how many times authors made changes to a document. The minimum value for SESSION was 1, since the initial creation of a document was deemed to include one editing session. The document with the largest number of editing sessions had been worked on 74 different times. We treat this variable as our metric of time in our longitudinal analysis, as described in more detail below.

Words Added This variable registers how many words were added to a document during a specific editing session. WORDS_ADD ranged from 0 to 3,911. To help us answer our third research question, we also created a variable reporting the average number of words added per session for each student (AVE_WORDS_ADD).

Words Deleted WORDS_DEL indicates how many words were removed from a particular document during a specific editing session. WORDS_DEL ranged from 0 to 3,106.

Total Words TOTAL_WORDS describes the number of words in a given document after it was edited for the last time. TOTAL_WORDS ranged from 1 to 5,500.

Primary Author This variable describes the contributor who first created a document.

Number of Contributors This variable indicates how many people in total contributed to a specific document as authors or as editors.

Number of Edits This variable describes how many times a specific document was edited.
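As a concrete illustration of how these measures relate to one another, they can all be derived from a document’s revision log. The sketch below is hypothetical: the paper does not describe SCAPES’s output format, so the record structure (one entry per editing session, carrying the author and the document’s running word count) and all names are our own stand-ins, and it simplifies by inferring words added or deleted from the net change per session.

```python
# Hypothetical sketch: derive the session- and document-level variables
# (SESSION, WORDS_ADD, WORDS_DEL, TOTAL_WORDS, PRIMARY AUTHOR,
# NUMBER OF CONTRIBUTORS, NUMBER OF EDITS) from a simplified revision log.
# NOTE: real revision data could report additions and deletions separately;
# here each is inferred from the net word-count change, a simplification.

def session_variables(sessions):
    """`sessions`: chronological list of dicts, one per editing session,
    each with keys 'author' and 'word_count' (total words after the session).
    Returns (per-session rows, document-level summary)."""
    rows, prev_words = [], 0
    for i, s in enumerate(sessions, start=1):
        delta = s["word_count"] - prev_words
        rows.append({
            "session": i,                     # SESSION (1 = document creation)
            "words_added": max(delta, 0),     # WORDS_ADD (net, simplified)
            "words_deleted": max(-delta, 0),  # WORDS_DEL (net, simplified)
        })
        prev_words = s["word_count"]
    summary = {
        "total_words": prev_words,                               # TOTAL_WORDS
        "primary_author": sessions[0]["author"],                 # PRIMARY AUTHOR
        "n_contributors": len({s["author"] for s in sessions}),  # NUMBER OF CONTRIBUTORS
        "n_edits": len(sessions),                                # NUMBER OF EDITS
    }
    return rows, summary
```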

5.3.3 Usage and Perception

Students’ self-reported usage of Google Docs and perceptions of Google Docs were measured using survey items (see Appendix for the complete survey). Usage of Google Docs was measured by three items: (1) “How much do you use the following Google products AT SCHOOL?”; (2) “How much do you use the following Google products AT HOME?”; and (3) “How much do you usually use Google Docs AT SCHOOL to do the following?” For the answers to these questions, the scale was: 0 (never), 0.5 (less than once a week), 1 (once a week), 2.5 (a few times a week), 5 (less than 1 h a day), 10 (1–2 h a day), and 15 (more than 2 h a day).

Students’ perceptions of Google Docs were measured by two survey items with a total of ten sub-items (see Questions #12 and #13 in the Appendix): (1) “Comparing writing with Google Docs to WRITING ON PAPER, please indicate how much you agree or disagree with the following statements about use of Google Docs”; and (2) “Comparing writing with Google Docs to WRITING ON WORD PROCESSING SOFTWARE (Open Office, Microsoft Word, etc.), please indicate how much you agree or disagree with the following statements about use of Google Docs”. For the answers to these ten sub-questions, the scale was: -2 (strongly disagree), -1 (disagree), 0 (neutral), 1 (agree), and 2 (strongly agree).

In addition, information about the participating students was provided by the Colorado Department of Education (CDE) (2011a).

English Learner Status The CDE defines English learners as students who have a language background other than English and are currently being served or monitored by either bilingual or English as a Second Language (ESL) programs. In this study, EL is a dummy variable that distinguishes English learners (EL = 1) from non-English learners (EL = 0).

Ethnicity Dummy variables are also used to represent each ethnicity in this study. Since the majority ethnicities in the district are Whites and Hispanics, two dummy variables (HISPANICS and OTHERS) were generated, with the base category being WHITES.

Free- or Reduced-Price Lunch Status This variable identifies students who meet the eligibility criteria for free- or reduced-price lunch pursuant to the provisions of the federal National School Lunch Act. In this study, FREE LUNCH is a dummy variable, in which “1” identifies students who receive free/reduced-price lunch, and “0” identifies those who do not.

5.4 Data Analysis

For the present study, Stata 13 software was used to conduct all of the quantitative analyses. To answer our first research question, regarding students’ usage and perceptions of Google Docs in their ELA classrooms, descriptive statistics were used to analyze students’ survey responses.

To answer the second research question, which asks about patterns of revisions and amendments, we fit a two-level individual growth model using hierarchical linear modeling with maximum likelihood estimates (Singer and Willett 2003). The equation is as follows:


Level 1: $\text{Length}_{ij} = \pi_{0j} + \pi_{1j}\,\text{Session}_{ij} + e_{ij}$

Level 2: $\pi_{0j} = \beta_{00} + \beta_{01}\,\text{Number of Contributors}_j + \beta_{02}\,\text{Pre}_j + \beta_{03}\,X_j + u_{0j}$

$\pi_{1j} = \beta_{10} + \beta_{11}\,\text{Number of Contributors}_j + \beta_{12}\,\text{Pre}_j + \beta_{13}\,X_j + u_{1j}$ (1)

In this equation, the dependent variable is the total words in the jth document at the ith edit. Level 1 describes within-document variation, which is the word count of the document in different edit sessions, with SESSION as the time variable. Our models used editing sessions rather than days of revision as a marker of time, since the models using editing sessions fit better. In these models, the second editing session is treated as the second opportunity for document growth, even if it occurred many days after the document was created. Level 2 explains between-document variation. The document-level covariates include NUMBER_OF_CONTRIBUTORS, which indicates the total number of editors of the document plus the original author; PRE, consisting of students’ pre-test scores in reading and writing; and X, including ethnicity, English learner status, and lunch-recipient status. PRE and X were measured by aggregating individual writers’ and editors’ PRE and X scores within every document. These variables were used to account for the variance between students and thus entered the equation at Level 2.
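A model of this form can also be fit with standard mixed-effects routines outside Stata. The sketch below, using Python’s statsmodels on simulated data, only illustrates the structure of Eq. (1): all variable names are our stand-ins, the Level-2 covariates enter the combined model as main effects plus session interactions, and the demographic covariates X are omitted for brevity.

```python
# Illustrative only: a two-level growth model of the form in Eq. (1),
# with a random intercept and random session slope per document, fit to
# simulated data. The paper's actual analysis used Stata 13.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for doc_id in range(80):                  # 80 simulated documents
    n_contrib = int(rng.integers(1, 4))   # document-level covariate
    pre_c = rng.normal(0, 50)             # centered pre-test score
    intercept = 50 + 5 * n_contrib + rng.normal(0, 10)  # u_0j
    slope = 20 + rng.normal(0, 2)                        # u_1j
    for session in range(1, int(rng.integers(4, 10))):
        rows.append({
            "doc_id": doc_id,
            "session": session,
            "n_contributors": n_contrib,
            "pre_c": pre_c,
            "total_words": intercept + slope * session + rng.normal(0, 5),
        })
df = pd.DataFrame(rows)

# Level-2 covariates predict both the intercept (main effects) and the
# session slope (interactions); re_formula adds the random slope.
model = smf.mixedlm(
    "total_words ~ session * n_contributors + session * pre_c",
    df, groups=df["doc_id"], re_formula="~session",
)
result = model.fit()
print(result.params["session"])  # estimated average per-session growth
```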

To answer the third research question, we used a multiple regression residualized change model to examine the effects on students’ writing and reading post-tests of (a) their participation on Google Docs, and (b) their self-reported usage of Google Docs, controlling for their writing and reading pre-tests. The equation is as follows:


$\text{Post-test}_i = \beta_0 + \beta_1\,\text{Pre-test}_i + \beta_2\,\text{Edit count}_i + \beta_3\,\text{Words added}_i + \beta_4\,\text{Usage}_i + \beta_5\,\text{Teacher A}_i + \beta_6\,X_i$ (2)

In this equation, the dependent variable is a student's writing or reading CSAP post-test score; writing and reading CSAP pre-test scores are controlled. The covariate Edit_count is the total number of edits the student made on Google Docs, and Words_added is the average number of words the student added on Google Docs. Usage represents how much a particular student used Google Docs for different activities. Since students came from two teachers' classrooms (A and B), Teacher_A is a dummy variable used to control for between-teacher variance.
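Conceptually, a residualized change model asks whether a predictor explains post-test variance left over after the pre-test is accounted for. The sketch below illustrates that idea in two explicit steps with made-up numbers; note that the study itself estimates everything jointly in the single multiple regression of Eq. (2), so this is an illustration of the concept, not the authors' exact procedure.

```python
from statistics import mean

def simple_ols(x, y):
    """Slope and intercept of y regressed on x (ordinary least squares)."""
    mx, my = mean(x), mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical pre/post test scores and edit counts for five students
# (illustrative values only; not data from the study).
pre = [540, 560, 550, 530, 570]
post = [550, 575, 560, 545, 580]
edits = [20, 80, 40, 10, 60]

# Step 1: residualize the post-test on the pre-test.
b, a = simple_ols(pre, post)
residual_change = [p - (a + b * q) for p, q in zip(post, pre)]

# Step 2: regress the residualized change on the predictor of interest.
b_edits, _ = simple_ols(edits, residual_change)
print(round(b_edits, 3))  # 0.034 with these made-up numbers
```

A positive slope in step 2 would mean students who edited more gained more than their pre-test predicted; in the study's actual models this association was not significant.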


The fourth research question—regarding the kinds of feedback students received on Google Docs—was answered using descriptive statistics and content analysis. Two analytical frameworks were used: types of feedback and focus of feedback. For types of feedback, each item was analyzed using a combination of top-down and bottom-up coding. First, a theory-driven coding strategy based on findings by Robb et al. (1986) and Ellis (2009) was employed. Additional codes emerged during the coding process, resulting in a final Online Feedback Analysis Framework (see Table 2). This framework included five types of feedback: direct, commentary, highlighted, affective, and evaluative feedback. The first researcher performed the top-down and bottom-up coding and established the framework; a second researcher then coded all items again to establish inter-rater reliability, which was found to be .93. Where the two coders coded an item inconsistently, they discussed it together and agreed on its final coding. For focus of feedback, a bottom-up coding strategy was used to analyze the area on which each item focused (i.e., mechanics, grammar, word choice, content, organization, conventions, and general feedback). The latter part of the research question—regarding the effect of feedback types on achievement growth—was answered using multiple regression. The equation is as follows:

Post-test_i = β0 + β1·Pre-test_i + β2·N_Tfeed_i + β3·N_Sfeed_i + β4·Feedback_Type_i        (3)

In this equation, the dependent variables are a student's writing and reading CSAP post-test scores; writing and reading CSAP pre-test scores are controlled. The covariate N_Tfeed is the total number of feedback items the student received from the teacher, and N_Sfeed is the total number of feedback items he or she received from peers. Feedback_Type comprises five variables: the numbers of direct, commentary, highlighted, affective, and evaluative feedback items.
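The inter-rater reliability of .93 reported above can be computed with standard agreement statistics. The paper does not state which statistic was used, so the sketch below shows two common choices—raw percent agreement and Cohen's kappa (which corrects for chance agreement)—applied to hypothetical coder labels, not the study's actual codes.

```python
from collections import Counter

def percent_agreement(c1, c2):
    """Proportion of items the two coders labeled identically."""
    return sum(a == b for a, b in zip(c1, c2)) / len(c1)

def cohens_kappa(c1, c2):
    """Chance-corrected agreement between two coders."""
    n = len(c1)
    observed = percent_agreement(c1, c2)
    f1, f2 = Counter(c1), Counter(c2)
    # Expected chance agreement from each coder's marginal label frequencies.
    expected = sum(f1[lab] * f2[lab] for lab in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical feedback-type codes from two coders (illustrative only).
coder1 = ["direct", "direct", "commentary", "evaluative", "direct", "affective"]
coder2 = ["direct", "direct", "commentary", "evaluative", "direct", "evaluative"]

print(round(percent_agreement(coder1, coder2), 2))  # 0.83 (5 of 6 items agree)
print(round(cohens_kappa(coder1, coder2), 2))       # 0.75
```

Kappa is always lower than raw agreement when coders disagree, because part of the observed agreement is attributed to chance.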

6 Results

6.1 Students’ Perception of Using Google Docs in their ELA Classrooms

Among all Google products, our sample of students used Google Docs the most, both in ELA classrooms (one to two hours per day) and at home (several times per week). These self-reported usage patterns are shown in Table 3. In terms of the specific uses to which Google Docs was put, students reported using it most for revising and drafting their own documents (see Table 4) and, interestingly, more for revising than for drafting. This finding suggests that students feel Google Docs provides a favorable environment for revising and editing, and it aligns with previous research findings that computer-based writing is typically more iterative and involves more peer editing than writing with paper and pencil (Goldberg et al. 2003). Following these two most frequent uses, other frequent uses included commenting on others' writing, chatting with others, filling in teacher templates, and taking notes.

When asked how writing on Google Docs compared to writing on paper and writing with word-processing software, students tended to report that using Google Docs helped them become more organized, and that they preferred writing on Google Docs to using other platforms. A majority of students also agreed that they edited their work much more easily and received more feedback from peers when using Google Docs. This agreement was stronger when the point of reference was writing on paper than when it was word-processing software (see Fig. 1), suggesting that students liked writing on Google Docs the most, followed by word-processing software, with paper the least preferred.

6.2 Document Drafting and Revision

Descriptive statistics of students' writing and revision on Google Docs are presented in Table 5. Overall, each student created an average of 13.8 (= 3,537/257) documents and made 67.8 (= 4.93 × [3,537/257]) edits during the 2011–2012 school year. The average document contained 248.0 words in the first session and 429.9 words in the last session, and received 4.9 edits from 1.3 authors over a period of 15.3 days. No more than six contributors worked on any given document. The average edit session saw 118.0 words added and 28.1 words deleted. In addition, descriptive statistics of student writing and reading pre-test and post-test scores are shown in Table 6.
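The per-student averages above follow directly from the reported totals; a quick check of the arithmetic using the counts from the text and Table 5:

```python
# Reported study totals (from the text and Table 5).
n_students = 257
n_documents = 3537
edits_per_document = 4.93  # mean number of edits per document (Table 5)

# Average documents created per student: 3,537 / 257.
docs_per_student = n_documents / n_students

# Average edits per student: mean edits per document x documents per student.
edits_per_student = edits_per_document * docs_per_student

print(round(docs_per_student, 1))   # 13.8
print(round(edits_per_student, 1))  # 67.8
```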

In terms of the number of contributors to each document, a large majority of documents (73 %) were edited by a single author, an additional 22 % were edited by two contributors, and the remaining 5 % were edited by three or more contributors. A closer look reveals that the main author was the most prolific contributor to writing on Google Docs, followed by teachers and then peer collaborators: the main author composed an average of 129 words during each session, teachers 46 words, and peer students 38 words. Previous research on learners' collaborative writing on wikis found that students tended to avoid editing others' work, as they felt it inappropriate to change others' written products (Dalke et al. 2007), and this may partially explain why peers added fewer words than teachers did.

A two-level growth model was used to examine the effects over time of (1) editing and (2) the number of authors on writing quantity on Google Docs, measured by the total number of words composed (see Table 7). Model 1 is an unconditional model including only the time variable Session, without controlling for any other factors. Model 2 adds the number of contributors. Model 3 adds students' individual characteristics as well as their standardized achievement scores in writing and reading. The final model includes all main variables from Models 2 and 3, and adds the interaction between number of contributors and session and the interaction between standardized writing/reading achievement scores and session. The results show that documents had significantly greater

Table 2 Online Feedback Analysis Framework

Types of feedback         Description
Corrective feedback
  Direct feedback         Provides the correct form
  Commentary feedback     Does not provide a correction; instead, the provider indicates that an error exists by identifying it and/or asking for clarification
  Highlighted feedback    Does not indicate the nature of the error, but highlights it to indicate its location
Non-corrective feedback
  Affective feedback      Provides the writer with encouragement or the feedback provider's emotional response to the writing
  Evaluative feedback     Provides a more general evaluation of the written texts

Adapted and modified from Ellis (2009) and Robb et al. (1986)


lengths when edited more times (p < .001). However, more contributors editing a document resulted in significantly fewer words added at each session (p < .001). Thus, when edit sessions were controlled, having a second contributor resulted in fewer words being added to a document.

The demographic characteristics of the students influenced neither the length of the initially drafted document nor the amount of amendment made to it in each editing session. The results also show that students' pre-test scores in writing, but not in reading, significantly predicted the number of words they added during the drafting session (p < .05) and at each editing session (p < .001).

In order to illustrate the relationship between number of contributors, number of edits,

and writing quantity, we plotted the predicted trajectories of documents created by one,

two, or three contributors using the coefficients from the final fitted model (Fig. 2). It

should be noted that at the first editing session, multiple-authored documents were shorter.

Single-authored documents tended to be drafted quickly, and on average were more than twice as long at the end of the first editing session as documents with three contributors.

Furthermore, multiple-authored documents grew slowly at each editing session. The

Table 3 Students' self-reported Google products use at school and at home (times/week)

                  Use at school       Use at home
                  M       SD          M       SD
Gmail             1.09    2.18        1.99    3.34
Google Docs       7.28    3.94        3.40    3.75
Google Sites      1.87    2.83        1.01    2.68
Google Talk       0.43    2.02        0.47    2.31
Google Calendar   0.47    2.04        0.54    2.11
Google Reader     0.23    1.58        0.16    1.13
Google Video      0.21    1.46        0.95    2.86
Google Maps       0.75    2.22        0.86    2.22
EasyBib           0.58    1.63        0.24    1.47
Aviary            0.25    1.48        0.19    1.44

N = 231

Table 4 Students' use of Google Docs for different learning purposes at school

                                                       M       SD
Taking notes                                           1.61    2.83
Writing drafts as the sole author                      3.95    3.82
Writing drafts that have two or more authors           1.29    2.79
Revising or editing something they wrote               4.11    3.72
Giving comments on other students' writing             2.94    3.30
Filling in teacher templates during class activities   1.97    2.91
Chatting with others in Google Docs                    2.01    3.05
Making or working on spreadsheets                      1.11    2.44
Making presentation slides                             1.65    2.52

N = 231


advantage of multiple authorship, therefore, only becomes apparent when we consider that documents with more contributors underwent more editing sessions. Documents with a single contributor were edited 3.7 times on average, whereas documents with two contributors were edited 6.5 times on average. As a result, even though multiple-authored documents start small and grow slowly, they end up larger on average: 448 words for dual-authored documents, as against 433 words for single-authored ones. Documents edited by three contributors, meanwhile, averaged 12.7 edits and ended up with 541 words on average.
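The predicted trajectories can be sketched directly from the Model 4 fixed effects reported in Table 7. The sketch below ignores the other covariates and assumes sessions are coded from zero with the contributor count entered uncentered (both are our assumptions about the model's coding), so it only approximately reproduces the final lengths reported above (433, 448, and 541 words); the remaining gap reflects the omitted covariates.

```python
# Fixed-effect estimates from Table 7, Model 4 (other covariates ignored).
INTERCEPT = 307.7    # constant
B_SESSION = 60.18    # per-session growth
B_CONTRIB = -63.55   # effect of each contributor on initial length
B_INTERACT = -8.84   # contributors x session interaction (reported -8.840)

def predicted_words(sessions: float, contributors: int) -> float:
    """Predicted document length after a given number of editing sessions."""
    return (INTERCEPT
            + B_SESSION * sessions
            + B_CONTRIB * contributors
            + B_INTERACT * contributors * sessions)

# Plug in the average number of edits observed for each contributor count.
print(round(predicted_words(3.7, 1)))   # 434: single author, 3.7 edits
print(round(predicted_words(6.5, 2)))   # 457: two authors, 6.5 edits
print(round(predicted_words(12.7, 3)))  # 545: three authors, 12.7 edits
```

The interaction term is what makes multi-author documents grow more slowly per session (60.18 − 8.84 × contributors words per edit), while their larger edit counts are what let them end up longer.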

6.3 Writing and Reading Achievement

To answer our third research question, regarding the impact of using Google Docs on students' writing and reading achievement, multiple regression analyses were conducted using residualized change models, controlling for students' baseline standardized achievement scores in writing and reading (see Table 8). Models a and b used students' writing post-test achievement as the outcome, and Models c and d used their reading post-test achievement as the outcome. Models a and c included students' number of edits, average words added, demographic information, and the teacher factor; Models b and d included their self-reported Google Docs usage. The results show that writing and reading pre-test scores were positively associated with both writing and reading post-test scores, which supports the notion of reading-writing connections (Graham and Herbert 2010). Neither students' demographic information nor their teacher group had a significant effect on writing or reading post-test scores. Looking at Google Docs in particular, students' average number of words added had no effect on their post-test scores, for either writing or reading; nor did their total number of edits have any significant effect. Students' self-reported usage of Google Docs was then included to determine whether the frequency with which students used the platform for different learning purposes affected their achievement. The results suggest that students' usage of Google Docs had no significant effect on either their writing or their reading post-test scores.

Fig. 1 Agreement with statements about writing on Google Docs compared to writing on paper/writing on word-processing software (y-axis scale: -2 strongly disagree, -1 disagree, 0 neutral, 1 agree, 2 strongly agree)

6.4 Feedback Received on Google Docs

In addition to analyzing students' writing and revision on Google Docs and the relationship of these activities to their academic achievement, we felt it would be useful to look more closely at the specific feedback students received from others. Thus, in this study, a random sample of 40 students' Google documents was chosen for an investigation of the types and foci of feedback that students received. Descriptive statistics of feedback types are presented in Table 9. Within the 40-student sample, the teachers and students collectively provided 344 items of feedback. Among these, 74 % constituted direct feedback, 12 % commentary feedback, 9 % evaluative feedback, 3 % affective feedback, and 1 % highlighted feedback. With regard to who provided it, 229 of the 344 items came from teachers and 115 from peers. Direct feedback made up the largest share for both groups: 72 % for teachers and 79 % for students. For teachers, the

Table 5 Descriptive statistics of student writing and revision on Google Docs

M SD Min Max

Document level (N1 = 3,537)

Number of contributors 1.34 0.62 1 6

Number of edits 4.93 5.63 1 74

Age of document (in days) 15.26 38.93 0 384

Word count of writing at first session 248.00 350.83 0 2,558

Word count of writing at last session 429.89 457.50 1 5,500

Session level (N2 = 17,435)

Words added 118.03 239.24 0 3,911

Words deleted 28.09 120.03 0 3,106

Table 6 Descriptive statistics of student writing and reading pre-test and post-test scores on the Colorado Student Assessment Program

                  M         SD       Min    Max
Year 2010–2011
  Writing         548.92    54.08    419    780
  Reading         650.71    53.08    220    788
Year 2011–2012
  Writing         558.17    53.00    411    737
  Reading         669.39    45.75    489    819


Table 7 A two-level growth model predicting total words from edit session, number of authors, and student-level variables

Dependent variable: total words. Model 1: unconditional model; Model 2: Model 1 plus number of contributors; Model 3: Model 1 plus individual variables; Model 4: final model.

                                      Model 1                  Model 2                  Model 3                  Model 4
Constant                              225.1*** (5.895)         311.7*** (13.89)         224.3*** (6.780)         307.7*** (14.68)
Session                               45.63*** (0.859)         58.96*** (1.932)         46.37*** (0.994)         60.18*** (2.021)
Number of contributors                                         -66.63*** (9.319)                                 -63.55*** (9.502)
Hispanic                                                                                23.61 (23.80)            19.23 (23.70)
English_learner                                                                         -43.74 (38.99)           -51.79 (38.81)
Free_lunch                                                                              -28.72 (23.00)           -27.07 (22.90)
Pre_writing                                                                             22.48** (8.393)          20.68* (8.354)
Pre_reading                                                                             -13.37 (9.078)           -13.20 (9.035)
Number of contributors × session                               -8.672*** (1.074)                                 -8.840*** (1.101)
Hispanic × session                                                                                               -2.208 (3.536)
English_learner × session                                                                                        -1.277 (5.981)
Free_lunch × session                                                                                             1.179 (3.385)
Pre_writing × session                                                                   4.552*** (1.321)         6.033*** (0.927)
Pre_reading × session                                                                                            2.270 (1.349)
Variance (rate of change)             979.0*** (23.54)         935.7*** (22.63)         960.6*** (25.00)         912.4*** (23.88)
Variance (initial status)             112,749.4*** (1,483.1)   111,535.9*** (1,469.1)   109,018.8*** (1,537.9)   108,039.1*** (1,525.9)
Correlation (rate and initial status) 0.340*** (0.0318)        0.313*** (0.0327)        0.305*** (0.0368)        0.271*** (0.0380)
Within-document residual variance     9,005.0*** (57.40)       8,999.1*** (57.36)       8,726.3*** (59.06)       8,721.1*** (59.03)
N                                     17,435                   17,435                   15,549                   15,549
Deviance (-2LogL)                     224,571.32               224,470.22               199,740.49               199,646.04
AIC                                   224,563.3                224,486.2                199,772.5                199,674.0
BIC                                   224,629.9                224,548.3                199,894.9                199,781.2

Standard errors in parentheses
* p < 0.05, ** p < 0.01; *** p < 0.001

next most frequent feedback types were commentary feedback (12.9 %) and evaluative feedback (12.5 %). For students, commentary feedback (10.2 %) was also the second most frequent, but students provided much less evaluative feedback than teachers did (2.5 %) and much more affective feedback (7.6 % as opposed to 1.3 %), suggesting that students might be more comfortable expressing affection or feelings than providing critical evaluation of their peers. When we examined the effects of feedback provider and feedback type on students' achievement growth, the results suggested that the amount of teacher feedback, the amount of student feedback, and the number of different feedback types had no significant effects on students' writing or reading achievement change (see Table 10).
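The provider-specific percentages above can be recovered from the Table 9 counts. A small tally sketch, using each provider's column total as the denominator (a denominator choice we infer from the reported percentages, since the column totals include the dual-coded items noted in Table 9's footnotes):

```python
# Feedback counts by type and provider, as reported in Table 9.
teacher = {"direct": 167, "commentary": 30, "highlighted": 3,
           "affective": 3, "evaluative": 29}
peer = {"direct": 93, "commentary": 12, "highlighted": 1,
        "affective": 9, "evaluative": 3}

def percentages(counts):
    """Share of each feedback type within one provider's column."""
    total = sum(counts.values())
    return {kind: round(100 * n / total, 1) for kind, n in counts.items()}

print(percentages(teacher))  # direct 72.0 %, commentary 12.9 %, evaluative 12.5 %, ...
print(percentages(peer))     # direct 78.8 %, commentary 10.2 %, affective 7.6 %, ...
```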

In addition to feedback types, the content of the feedback students received on Google Docs was also examined (see Table 11 for descriptive statistics). The foci of feedback fall into seven broad categories: mechanics, grammar, word choice, content, organization, conventions, and general. Feedback on mechanics can be further subdivided into feedback on spelling, capitalization, and punctuation. Feedback on grammar can be subdivided into nine categories: articles, noun-pronoun agreement, subject-verb agreement, tenses, prepositions, conjunctions, redundant words, missing words, and sentence structure. Our results show that most feedback focused on pointing out students' errors of mechanics (40.6 %), followed by grammar errors (26.3 %). Among feedback from teachers, 36.3 % of items dealt with mechanics and 25.1 % with grammar, while among students, 49.1 % focused on mechanics and 28.6 % on grammar. After these two major categories, the other feedback provided (averaged across teachers and students) covered content (13.7 %), organization (9.6 %), word choice (4.5 %), conventions (3.3 %), and general feedback (2.1 %). Feedback on content, organization, and word choice was mostly provided by teachers. Students provided almost as much feedback on conventions as teachers did. Also, most general feedback, such as "Excellent essay!" or "Nice!", was provided by students.

Content analysis was also used to examine the different kinds of feedback students received. First, we found that most errors of mechanics, grammar, and word choice were corrected by teachers and peers directly within the written texts. In some cases, the teacher also used highlighted feedback and commentary feedback to point out students' mechanics and grammar errors. For example, in one student's essay, the teacher highlighted "wast" but did not correct it or leave any comment; later, the original writer corrected this word by changing it to the correct form, "wasn't." In another piece of work, the teacher left a comment in the margin: "Spell check. Don't abbreviate like in a text message. Avoid U in a formal essay." In another document, the teacher commented, "Use the spell check help offered in Google with the red underlining of a misspelled word." Instead of correcting an error directly for the student, the teacher provided guidance so that the student could correct it him- or herself.

Second, by providing affective feedback, students expressed encouragement and sympathy for their peers' writing. Most affective feedback provided general statements of approbation, such as "This is very interesting! I can't wait for you to finish" or "I like your poem!" However, some affective feedback provided by students focused on specific content, e.g., "I like [the deep fryer] part," or even on the organization of a peer's writing, such as "I like how you started and ended with new york, new york."

Third, although most of the student-originated feedback focused on the language or micro level, we did find that students also provided feedback on content. Typical comments by peers included, "You might want to be more clear about what this means" and "You need to fix your last sentence." One student not only pointed out the locus of a problem, but also suggested a possible solution, commenting at the end, "Good except you need an ending sentence. Try 'She certainly goes by this motto and wished for others, as well.'" Later, the original author saw this comment and actually used the first part of the sentence that the peer had suggested. This example suggests that, through the process of providing and receiving feedback and revising, writers improved their writing; and that readers also developed their writing abilities by thinking more deeply about the content and structure of the written texts and providing feedback on how to improve. In this way, readers developed their literacy as both readers and writers. Another example further reflects how one feedback provider developed her own writing through the process of providing feedback for others. First, this student identified a possible error in another writer's sentence, "Hoping that he would make it into Canisius College, he did!" She first revised it by adding "and" before "he did!" Later, the reader realized there might be a better way to rephrase the sentence and changed it to, "Hoping, dreaming, wishing, John wanted to make it into Canisius College. He did!" Through revising her own feedback, this reader developed as a thoughtful writer and an active reader. These examples support previous research on reading-writing connections, and specifically the finding that writing during reading can help readers become more reflective, more evaluative, and more engaged in deep thinking (Salvatori 1985; Tierney and Shanahan 1991).

The fourth major finding from our content analysis was that a small number of interactions did occur between writers and readers on Google Docs, all between student writers and student feedback providers. In one case, a peer commented, "Nice poem Morgan. I miss seeing you," to which the writer responded, "Thanks!" In another example, the author first commented on her own writing in the margin, "I was wondering if it was weird to have poem titles," after which one peer commented, "nah I love it it describes spring. Very good job. :D." Although these two examples reflect relatively simple interactions, they suggest that Google Docs provided an environment in which writers and readers could interact with each other conveniently and immediately, which was less possible when writing on paper or with word-processing software.

[Figure 2: line plot of total words (y-axis, 100–500) against number of edits (x-axis, 0–15) for single-author, two-author, and three-author documents]

Fig. 2 Prototypical plots for documents with one, two, or three authors, based on Model 4 in Table 7


Table 8 Residualized change model predicting writing and reading post-test achievement from demographic information, Google Docs writing, and self-reported Google Docs usage

                                  Outcome (post-test)
                                  Writing                          Reading
                                  Model a          Model b         Model c           Model d
Pre-writing                       0.59*** (0.06)   0.62*** (0.06)  0.29*** (0.05)    0.31*** (0.05)
Pre-reading                       0.27*** (0.06)   0.25*** (0.06)  0.47*** (0.05)    0.48*** (0.05)
Male                              -0.01 (4.64)                     -2.78 (3.69)
Hispanic                          3.43 (9.26)                      -11.94 (7.36)
Other                             -11.21 (11.80)                   -11.41 (9.38)
ELL                               -10.20 (17.01)                   8.33 (13.52)
Free lunch recipients             2.43 (8.23)                      -7.25 (6.54)
Teacher A                         0.33 (4.67)                      1.87 (3.71)
Words added                       0.09 (0.05)                      0.04 (0.04)
Edit count                        -0.05 (0.06)                     0.02 (0.05)
Take notes                                         1.80 (0.99)                       0.79 (0.77)
Write drafts as the only author                    0.08 (0.95)                       -0.27 (0.74)
Write drafts with others                           -0.22 (0.99)                      0.17 (0.77)
Revise own writing                                 -0.36 (1.10)                      -0.30 (0.85)
Give comments                                      0.93 (1.33)                       0.81 (1.03)
Fill in templates                                  -0.46 (1.45)                      1.47 (1.13)
Chat                                               -1.05 (1.23)                      -1.60 (0.95)
Spreadsheets                                       -0.63 (1.23)                      0.40 (0.95)
Presentations                                      0.45 (1.32)                       -3.44*** (1.03)
Constant                          57.22 (30.14)    52.83 (29.62)   204.78*** (23.91) 192.54*** (22.98)
Observations                      198              198             198               198
R-square                          0.67             0.67            0.72              0.73

Standard errors in parentheses
* p < 0.05; ** p < 0.01; *** p < 0.001


7 Discussion

Google Docs has emerged as a popular cloud-based writing platform that many school districts have either recently adopted or are considering adopting due to its affordability and accessibility. To date, there has been very little research on the use of Google Docs in K-12 schools. This study examined Google Docs use by sixth graders in one school district that has made a long-term investment in integrating technology into instruction. This case study is not intended to be representative, but by carefully examining Google Docs use over the course of a school year, we feel we have made an important contribution to the literature on digital writing, as well as to a hitherto sparse literature that may be of interest to school district administrators considering Google Docs implementation.

Our first finding is that the students in our sample were enthusiastic about using Google

Docs. In surveys, they exhibited more positive attitudes toward using Google Docs for

Table 9 Types of feedback students received on Google Docs

Types of feedback       Number (N = 344)   Feedback provider
                                           Teacher         Peer student
Direct feedback         260 (74.3 %)       167 (72 %)      93 (78.8 %)
Commentary feedback     42a (12 %)         30 (12.9 %)     12 (10.2 %)
Highlighted feedback    4 (1.1 %)          3 (1.3 %)       1 (0.8 %)
Affective feedback      12 (3.4 %)         3 (1.3 %)       9 (7.6 %)
Evaluative feedback     32b (9.1 %)        29 (12.5 %)     3 (2.5 %)

a One was also categorized as affective feedback
b Five were also categorized as affective feedback

Table 10 The effect of feedback provider and types of feedback on writing and reading achievement gains

                                  Outcome (post-test)
                                  Writing           Reading
Pre_writing                       0.69*** (.12)
Pre_reading                                         0.78*** (.18)
Number of teacher feedback        18.58 (18.42)     9.58 (21.03)
Number of peer feedback           14.86 (18.36)     12.43 (20.92)
Number of direct feedback         -17.19 (18.71)    -11.45 (21.37)
Number of commentary feedback     -14.83 (20.95)    -15.72 (23.86)
Number of highlighted feedback    -18.14 (22.98)    -17.66 (26.31)
Number of affective feedback      -3.89 (14.51)     -8.82 (16.31)
Number of evaluative feedback     -29.79 (17.54)    -11.90 (20.15)

Standard errors in parentheses
* p < 0.05; ** p < 0.01; *** p < 0.001


organizing, writing, and giving and receiving feedback than toward other writing modalities, including traditional word-processing software. We were particularly interested to find that students reported spending so much time editing documents (an average of 4.11 times per week). One explanation for this very high report of time spent editing and revising is that students may have considered any second session working on a document to be a revision or editing session, even if from a process-writing perspective they were still basically drafting the document. If so, these results triangulate well with our finding that most documents had multiple editing sessions. Although we do not have comparison data from paper-based classrooms, these results suggest that Google Docs supports a process-writing approach that encourages students to draft, edit, revise, and share documents over longer stretches of time.

It is not surprising that documents with multiple contributors were drafted more slowly and had fewer words amended during each editing session than single-authored documents; an individual does not have to consult or consider the views of collaborators when drafting or revising. However, our research provides the first empirical confirmation of this trend based on large numbers of documents in real-world contexts. The data from our case study suggest that teachers should not expect rapid development of co-authored documents. Rather, these documents may start slowly, but will benefit from a greater number of editing sessions and end up longer than individually-authored documents in the long run. A comparative investigation of the quality of documents produced in sole- and multiple-

Table 11 Focus of feedback students received on Google Docs

Focus of feedback Number Feedback provider

Teacher Student

Mechanics

Spelling 86 52 34

Capitalization 27 17 10

Punctuation 23 12 11

Total 136 (40.6 %) 81 (36.3 %) 55 (49.1 %)

Grammar

Articles 1 0 1

Noun-pronoun agreement 13 5 8

Subject-verb agreement 1 1 0

Tenses 11 7 4

Prepositions 5 1 4

Conjunctions 5 2 3

Redundancy 13 10 3

Missing words 2 1 1

Sentence structure 37 29 8

Total 88 (26.3 %) 56 (25.1 %) 32 (28.6 %)

Word choice 15 (4.5 %) 12 (5.4 %) 3 (2.7 %)

Content 46 (13.7 %) 38 (17.0 %) 8 (7.1 %)

Organization 32 (9.6 %) 29 (13.0 %) 3 (2.7 %)

Convention 11 (3.3 %) 6 (2.7 %) 5 (4.5 %)

General 7 (2.1 %) 1 (0.4 %) 6 (5.4 %)

Total 335 223 112

Middle School Students’ Writing and Feedback

123

authored conditions did not form part of this study, but is part of ongoing work in our lab.

Our findings also suggest that the average number of words students added on Google Docs had no significant effect on either their writing or their reading test scores. That said, there were a number of factors that could not be controlled in our models. For instance, we do not know how much writing the sample students had done on paper, either independently or in other classes. However, we are continuing this line of research with much larger samples of students and much larger collections of documents; unless and until it is replicated with a better-powered analysis, we will treat this null finding cautiously.

Google Docs facilitated the giving and receiving of editorial feedback among students and their teachers in ways that the other locally available writing modalities could not. Both teachers and students in this study provided feedback to student writers to help them improve their writing. Peer feedback mostly focused on pointing out errors of mechanics and grammar. This finding aligns with previous research (Chaulk 1994; Ge 2011; Lund and Smørdal 2006, August), suggesting that students' edits were more focused on the language or micro level than on the content or macro level. During the process of giving and receiving feedback, both readers and writers exhibited deeper thinking about the texts written by others and their own writing, as students constantly revised their work and provided feedback to others. This suggests that Google Docs or similar systems could potentially help a wider population of students improve their writing techniques and skills over the long term.

The findings of the current study, combined with previous studies on computer-sup-

ported collaborative writing, suggest that collaborative learning could be helpful for

enhancing active participation and active writing, and enriching students’ learning pro-

cesses (e.g., Blau and Caspi 2009; Parker and Chao 2007). However, as observed in our

study, most of the collaboration on Google Docs consisted of individual writing with feedback from others rather than higher levels of collaboration, such as joint writing or parallel writing by separate authors. In addition, most of the feedback student writers

received from others was on the language level rather than the content level. The finding

that students added fewer words to their peers’ documents than teachers did is consistent

with previous research findings regarding students’ reluctance to edit others’ written

products. Students can be said to have psychological ownership of their academic products

(Pierce et al. 2003), and this appears to include their writing on Google Docs: students in

our sample were reluctant to edit others’ work and inclined to protect their own work from being changed. Even when encouraged by their teachers to review and edit

peers’ work, students appeared more comfortable correcting grammar or mechanics errors than editing at higher levels, such as content or structure. When it came to edits

on the content or structure level, students may have felt more secure about leaving comments or suggestions in the margins rather than making changes to others’ writing directly.

It is also important to mention that, since this was the first year in which the school district

encouraged middle school teachers to use Google Docs to facilitate students’ authentic

writing and collaborative writing, teachers were still exploring the educational features of

Google Docs and how best to integrate it into their writing instruction; they may also need

time to adapt to this product as it continues to develop. Most of the writing activities

teachers guided during the studied school year were targeted towards students’ individual

writing with feedback from others, since it seemed to be the easiest type of collaboration for

students to master at this early implementation stage. Teachers could, however, explore

other forms of collaborative writing with Google Docs in the future. Teachers also

expressed frustration with Google Docs, as it sometimes formatted in a strange way, such


that teachers had to copy and paste student writing into word-processing software before

printing it out. Nevertheless, this case study of middle school students’ use of Google Docs for

writing provides an example of how cloud-based tools can be integrated into ELA classrooms

to support student writing, revision, and interaction, with an ultimate goal of improving

students’ literacy development and preparing them for their future study and careers.

The present research also has several limitations. First, it is difficult to argue for the benefits of Google Docs without a comparison group. Although our findings suggest that use of Google Docs did not affect students’ writing and reading achievement growth, a comparison group would provide a sounder basis for strong claims.

Second, the current study focuses on document growth trajectory and the relationship

between students’ use of Google Docs and writing achievement. Because of the large

volume of documents involved and the effort required to analyze them, we were not able

within the scope of this paper to closely analyze document quality, which is undoubtedly of

great importance to research on Google Docs and writing. One focus of our future research

should therefore be the effects of different feedback types on writing quality.

Acknowledgments Funding for this study was provided by a Google Faculty Research Award (Mark Warschauer, PI).

Appendix: Laptops and Learning—Collaborative Writing Survey

1. What grade are you in?

• 6th grade

• 7th grade

• 8th grade

2. Which school do you go to?

• Newton

• Euclid

• Powell

• Goddard

3. Your gender

• Male

• Female

4. How many computers are there in your house?

• 0

• 1

• 2

• 3

• 4

• 5 or more

5. How would you rate your skill using computers?

• Novice: I can turn my laptop on, but I don’t know how to use many programs.

• Beginner: I am able to do a few simple things like browsing the Internet and some writing.


• Intermediate: I do OK with 4 or 5 programs.

• Advanced: I use lots of programs and sometimes help my classmates.

• Expert: I’m often able to help others with their laptops, and generally fix computer problems quickly.

6. How much do you use your computer at school?

• Never

• Less than 1 h a day

• 1–2 h a day

• 2–3 h a day

• 3–4 h a day

• 4–5 h a day

• 5–6 h a day

7. How much do you use your computer for each of these subjects at school?

(Response options: I do not take this class / Less than an hour per week / 1–2 h per week / 3–4 h per week / 5–6 h per week / 7 or more hours per week)

• English language arts (includes reading, writing, spelling, language)

• Social studies/history

• Math

• Science

• Writing papers (total writing time for any subject)

8. How much do you use the following Google products AT SCHOOL?

(Response options: Never / Less than once a week / Weekly / A few times a week / Less than 1 h a day / 1–2 h a day / More than 2 h a day)

• Gmail

• Google Docs

• Google Sites

• Google Talk

• Google Calendar

• Google Reader

• Google Video

• Google Maps

• EasyBib

• Aviary


9. How much do you use the following Google products AT HOME?

(Response options: Never / Less than once a week / Weekly / A few times a week / Less than 1 h a day / 1–2 h a day / More than 2 h a day)

• Gmail

• Google Docs

• Google Sites

• Google Talk

• Google Calendar

• Google Reader

• Google Video

• Google Maps

• EasyBib

• Aviary

10. How much do you usually use Google Docs AT SCHOOL to do the following?

(Response options: Never / Less than once a week / Once a week / A few times a week / Less than 1 h a day / 1–2 h a day / More than 2 h a day)

• Take notes

• Write drafts as the only author

• Write drafts that have two or more authors

• Revise or edit something you have written

• Give comments on other students’ writing

• Fill in teacher templates during class activities

• Chat with others in Google Docs

• Make or work on spreadsheets

• Make presentation slides


11. How much do you usually use Google Docs AT HOME to do the following?

(Response options: Never / Less than once a week / Once a week / A few times a week / Less than 1 h a day / 1–2 h a day / More than 2 h a day)

• Write drafts as the only author

• Write drafts that have two or more authors

• Revise or edit something you have written

• Revise or edit other students’ writing

• Give comments on other students’ writing

• Chat with others in Google Docs

• Make or work on spreadsheets

• Make presentation slides

12. Comparing writing with Google Docs to WRITING ON PAPER, please indicate how much you agree or disagree with the following statements about use of Google Docs.

(Response options: Strongly disagree / Disagree / Neutral / Agree / Strongly agree)

• Using Google Docs helps me keep better organized than using paper

• I like writing on Google Docs more than writing on paper

• It’s easier for me to revise/edit my work on Google Docs than on paper

• I write higher quality drafts on Google Docs than on paper

• I get more feedback on my writing when I write on Google Docs than on paper

13. Comparing writing with Google Docs to WRITING ON WORD PROCESSING SOFTWARE (Open Office, Microsoft Word, etc.), please indicate how much you agree or disagree with the following statements about use of Google Docs.

(Response options: Strongly disagree / Disagree / Neutral / Agree / Strongly agree)

• Using Google Docs helps me keep better organized than using a word processor

• I like writing on Google Docs more than writing on a word processor

• It’s easier for me to revise/edit my work on Google Docs than on a word processor

• I write higher quality drafts on Google Docs than on a word processor

• I get more feedback on my writing when I write on Google Docs than on a word processor

14. Please indicate how much you agree or disagree with the following statements about writing.

(Response options: Strongly disagree / Disagree / Neutral / Agree / Strongly agree)

• I like writing for a real audience

• Writing for a real audience helps improve my writing

• I like getting feedback on my writing

• Getting feedback helps improve my writing

• I like giving others feedback on their writing

• Giving feedback to others helps me improve my writing

• I like writing with multiple authors

• Working with multiple authors helps improve my writing

15. Please describe what you like most about Google Docs:

16. Please indicate any challenges you have experienced with Google Docs:

17. Please include any other comments or suggestions regarding use of Google Docs:

References

Aydin, Z., & Yildiz, S. (2014). Using Wikis to promote collaborative EFL writing. Language Learning & Technology, 18(1), 160–180.

Blau, I., & Caspi, A. (2009). Sharing and collaborating with Google Docs: The influence of psychological ownership, responsibility, and students’ attitudes on outcome quality. Paper presented at the World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Vancouver, Canada. http://www.openu.ac.il/research_center/download/Sharing_collaborating_Google_Docs.pdf

Calkins, L. (1994). The art of teaching writing. Portsmouth, NH: Heinemann.

Calvo, R. A., O’Rourke, S. T., Jones, J., Yacef, K., & Reimann, P. (2011). Collaborative writing support tools on the cloud. IEEE Transactions on Learning Technologies, 4(1), 88–97.

Chamberlain, A. B. (2010). Synchronous computer-mediated collaborative writing in the ESL classroom (Master’s thesis). Michigan State University.

Chao, Y.-C. J., & Lo, H.-C. (2009). Students’ perceptions of Wiki-based collaborative writing for learners of English as a foreign language. Interactive Learning Environments, 19(4), 395–411. doi:10.1080/10494820903298662.

Chaulk, N. (1994). Comparing teacher and student response to written work. TESOL Quarterly, 28(1), 181–188.

Cho, K., & Cho, M.-H. (2007). Self-awareness in a computer supported collaborative learning environment. In D. Schuler (Ed.), Online communities and social computing (Vol. 4564, pp. 284–291). Berlin: Springer.

Cho, K., & Schunn, C. D. (2007). Scaffolded writing and rewriting in the discipline: A web-based reciprocal peer review system. Computers & Education, 48(3), 409–426.

Colorado Department of Education. (2011a). Automated data exchange documentation: Student data elements and definitions. Retrieved March 15, 2011, from https://cdeapps.cde.state.co.us/appendix_sodefs.htm

Colorado Department of Education. (2011b). Colorado Student Assessment Program technical report 2011. Retrieved June 29, 2012, from http://www.cde.state.co.us/assessment/CoAssess-AdditionalResources.asp

Conner, N. (2008). Google Apps: The missing manual. Sebastopol, CA: O’Reilly Media.

Couture, B., & Rymer, J. (1991). Discourse interaction between writer and supervisor: A primary collaboration in workplace writing. In M. M. Lay & W. M. Karis (Eds.), Collaborative writing in industry: Investigations in theory and practice (pp. 87–108). Amityville, NY: Baywood Publishing Company.

Dalke, A., Cassidy, K., Grobstein, P., & Blank, D. (2007). Emergent pedagogy: Learning to enjoy the uncontrollable—and make it productive. Journal of Educational Change, 8(2), 111–130.

Ede, L. S., & Lunsford, A. A. (1992). Singular texts/plural authors: Perspectives on collaborative writing. Carbondale, IL: Southern Illinois University Press.

Educause Learning Initiative. (2008). 7 things you should know about Google Apps. Retrieved June 10, 2013, from http://net.educause.edu/ir/library/pdf/ELI7035.pdf

Ellis, R. (2009). A typology of written corrective feedback types. ELT Journal, 63(2), 97–107.

Elola, I., & Oskoz, A. (2010). Collaborative writing: Fostering foreign language and writing conventions development. Language Learning & Technology, 14(3), 51–71.

Fallahi, C. R., Wood, R. M., Austad, C. S., & Fallahi, H. A. (2006). A program for improving undergraduate psychology students’ basic writing skills. Teaching of Psychology, 33(3), 171–175.

Flower, L., & Hayes, J. R. (1981). A cognitive process theory of writing. College Composition and Communication, 32(4), 365–387. doi:10.2307/356600.

Flower, L., Hayes, J., Carey, L., Schriver, K., & Stratman, J. (1986). Detection, diagnosis, and the strategies of revision. College Composition and Communication, 37(1), 16–55.

Ge, Z.-g. (2011). Exploring e-learners’ perceptions of net-based peer-reviewed English writing. International Journal of Computer-Supported Collaborative Learning, 6(1), 75–91. doi:10.1007/s11412-010-9103-7.

Cross, G. A. (2001). Forming the collective mind: A contextual exploration of large-scale collaborative writing in industry. New York: Hampton Press.

Godwin-Jones, R. (2003). Blogs and Wikis: Environments for on-line collaboration. Language Learning & Technology, 7(2), 12–16.

Goldberg, A., Russell, M., & Cook, A. (2003). The effects of computers on student writing: A meta-analysis of studies from 1992 to 2002. The Journal of Technology, Learning, and Assessment, 2(1).

Graham, S., & Hebert, M. A. (2010). Writing to read: Evidence for how writing can improve reading. A Carnegie Corporation Time to Act report. Washington, DC: Alliance for Excellent Education.

Han, Z. H. (2002). Rethinking the role of corrective feedback in communicative language teaching. RELC Journal, 33(1), 1–34.

Hayes, J. (1996). A new framework for understanding cognition and affect in writing. In C. Levy & S. Ransdell (Eds.), The science of writing: Theories, methods, individual differences and applications (pp. 1–27). Mahwah, NJ: Lawrence Erlbaum Associates.

Jago, C. (2001). Cohesive writing: Why concept is not enough. Portsmouth, NH: Heinemann.

Jones, J. (2008). Patterns of revision in online writing. Written Communication, 25(2), 262–289. doi:10.1177/0741088307312940.

Kessler, G., Bikowski, D., & Boggs, J. (2012). Collaborative writing among second language learners in academic web-based projects. Language Learning & Technology, 16(1), 91–109.

Li, X., Chu, S. K. W., & Ki, W. W. (2014). The effects of a wiki-based collaborative process writing pedagogy on writing ability and attitudes among upper primary school students in Mainland China. Computers & Education, 77, 151–169. doi:10.1016/j.compedu.2014.04.019.

Lindblom-Ylanne, S., & Pihlajamaki, H. (2003). Can a collaborative network environment enhance essay-writing processes? British Journal of Educational Technology, 34(1), 17–30. doi:10.1111/1467-8535.d01-3.

Lund, A., & Smørdal, O. (2006, August). Is there a space for the teacher in a Wiki? Paper presented at the 2006 International Symposium on Wikis, Odense, Denmark. http://Wikisym.org/ws2006/proceedings/p37.pdf

Nagelhout, E. (1999). Pre-professional practices in the technical writing classroom: Promoting multiple literacies through research. Technical Communication Quarterly, 8(3), 285–299.

National Center for Education Statistics. (2012). The nation’s report card: Writing 2011 (NCES 2012-470). Washington, DC: Institute of Education Sciences, U.S. Department of Education.

Oishi, L. (2007). Working together: Google Apps goes to school. Technology & Learning, 27(9), 46–47.

Parker, K. R., & Chao, J. T. (2007). Wiki as a teaching tool. Interdisciplinary Journal of Knowledge and Learning Objects, 3, 57–72.

Pierce, J. L., Kostova, T., & Dirks, K. T. (2003). The state of psychological ownership: Integrating and extending a century of research. Review of General Psychology, 7(1), 84–107.

Robb, T., Ross, S. M., & Shortreed, I. (1986). Salience of feedback on error and its effect on EFL writing quality. TESOL Quarterly, 20(1), 83–95.

Salvatori, M. (1985). The dialogical nature of basic reading and writing. In D. Bartholomae & A. Petrosky (Eds.), Facts, artifacts and counterfacts: Theory and method for a reading and writing course (pp. 137–166). Upper Montclair, NJ: Boynton.

Sheppard, K. (1992). Two feedback types: Do they make a difference? RELC Journal, 23(1), 103–110.

Singer, J. D., & Willett, J. B. (2003). Applied longitudinal data analysis: Modeling change and event occurrence. New York: Oxford University Press.

Storch, N. (2005). Collaborative writing: Product, process, and students’ reflections. Journal of Second Language Writing, 14(3), 153–173.

Swain, M. (1995). Three functions of output in second language learning. In G. Cook & B. Seidlhofer (Eds.), Principle and practice in applied linguistics: Studies in honor of H. G. Widdowson (pp. 125–144). Oxford: Oxford University Press.

Tierney, R., & Shanahan, T. (1991). Research on the reading/writing relationship: Interactions, transactions, and outcomes. In P. D. Pearson, R. Barr, & P. B. Mosenthal (Eds.), Handbook of reading research, Volume II (pp. 246–280). New York: Longman.

Vigil, N. A., & Oller, J. W. (1976). Rule fossilization: A tentative model. Language Learning, 26(2), 281–295.

Wang, H.-C. (2009). Weblog-mediated peer editing and some pedagogical recommendations: A case study. The JALT CALL Journal, 5(2), 29–44.

Woo, M., Chu, S., Ho, A., & Li, X. (2011). Using a wiki to scaffold primary-school students’ collaborative writing. Educational Technology & Society, 14(1), 43–54.

Yang, M., Badger, R., & Yu, Z. (2006). A comparative study of peer and teacher feedback in a Chinese EFL writing class. Journal of Second Language Writing, 15(3), 179–200.

Yeh, S.-W., Lo, J.-J., & Huang, J.-J. (2011). Scaffolding collaborative technical writing with procedural facilitation and synchronous discussion. International Journal of Computer-Supported Collaborative Learning, 6(3), 397–419. doi:10.1007/s11412-011-9117-9.

Zhang, S. (2008). Assessing the impact of peer revision on English writing of tertiary EFL learners. Teaching English in China, 31(2), 47–54.
