
DOCUMENT RESUME

ED 384 008                                             CS 012 172

AUTHOR          Gambrell, Linda B.; And Others
TITLE           Assessing Motivation To Read. Instructional Resource No. 14.
INSTITUTION     National Reading Research Center, Athens, GA.; National Reading Research Center, College Park, MD.
SPONS AGENCY    Office of Educational Research and Improvement (ED), Washington, DC.
PUB DATE        95
CONTRACT        117A20007
NOTE            34p.
PUB TYPE        Reports - Descriptive (141); Guides - Classroom Use - Teaching Guides (For Teacher) (052); Tests/Evaluation Instruments (160)
EDRS PRICE      MF01/PC02 Plus Postage.
DESCRIPTORS     Elementary Education; *Evaluation Methods; *Models; Reading Achievement; *Reading Attitudes; Self Concept; *Student Evaluation; *Student Motivation
IDENTIFIERS     Motivation to Read Profile; *Reading Motivation

ABSTRACT
The Motivation to Read Profile (MRP) is a public-domain instrument designed to provide teachers with an efficient and reliable way to assess reading motivation qualitatively and quantitatively by evaluating students' self-concept as readers and the value they place on reading. The MRP consists of two basic instruments: the Reading Survey (a Likert-type, self-report, group-administered instrument) and the Conversational Interview (which is administered on an individual basis). Item selection for the MRP was based on a review of research and theories related to motivation and included an analysis of existing instruments designed to assess motivation and attitude toward reading. The Reading Survey instrument can be administered to an entire class, a small group, or an individual, while the Conversational Interview is designed to be conducted on an individual basis. Information derived from an analysis of the results of the MRP can be used to plan instructional activities that will support students in their reading development. (Contains 36 references. Appendixes present the Reading Survey, the Conversational Interview, teacher directions for both instruments, scoring directions for the Reading Survey, and a scoring sheet.) (RS)

Source: Education Resources Information Center (ERIC), http://files.eric.ed.gov/fulltext/ED384008.pdf


NRRC
National Reading Research Center

Assessing Motivation to Read

Linda B. Gambrell
Barbara Martin Palmer
Rose Marie Codling
Susan Anders Mazzoni

University of Maryland College Park

INSTRUCTIONAL RESOURCE NO. 14
Summer 1995

The work reported herein is a National Reading Research Project of the University of Georgia and University of Maryland. It was supported under the Educational Research and Development Centers Program (PR/AWARD NO. 117A20007) as administered by the Office of Educational Research and Improvement, U.S. Department of Education. The findings and opinions expressed here do not necessarily reflect the position or policies of the National Reading Research Center, the Office of Educational Research and Improvement, or the U.S. Department of Education.

NRRC National Reading Research Center

Executive Committee
Donna E. Alvermann, Co-Director
University of Georgia

John T. Guthrie, Co-Director
University of Maryland College Park

James F. Baumann, Associate Director
University of Georgia

Patricia S. Koskinen, Associate Director
University of Maryland College Park

Nancy B. Mizelle, Acting Associate Director
University of Georgia

Jamie Lynn Metsala, Interim Associate Director
University of Maryland College Park

Penny Oldfather
University of Georgia

John F. O'Flahavan
University of Maryland College Park

James V. Hoffman
University of Texas at Austin

Cynthia R. Hynd
University of Georgia

Robert Serpell
University of Maryland Baltimore County

Betty Shockley
Clarke County School District, Athens, Georgia

Linda DeGroff
University of Georgia

Publications Editors

Research Reports and Perspectives
Linda DeGroff, Editor
University of Georgia

James V. Hoffman, Associate Editor
University of Texas at Austin

Mariam Jean Dreher, Associate Editor
University of Maryland College Park

Instructional Resources
Lee Galda, University of Georgia

Research Highlights
William G. Holliday
University of Maryland College Park

Policy Briefs
James V. Hoffman
University of Texas at Austin

Videos
Shawn M. Glynn, University of Georgia

NRRC Staff
Barbara F. Howard, Office Manager
Kathy B. Davis, Senior Secretary
University of Georgia

Barbara A. Neitzey, Administrative Assistant
Valerie Tyra, Accountant
University of Maryland College Park

National Advisory Board
Phyllis W. Aldrich
Saratoga Warren Board of Cooperative Educational Services, Saratoga Springs, New York

Arthur N. Applebee
State University of New York, Albany

Ronald S. Brandt
Association for Supervision and Curriculum Development

Marsha T. DeLain
Delaware Department of Public Instruction

Carl A. Grant
University of Wisconsin-Madison

Walter Kintsch
University of Colorado at Boulder

Robert L. Linn
University of Colorado at Boulder

Luis C. Moll
University of Arizona

Carol M. Santa
School District No. 5, Kalispell, Montana

Anne P. Sweet
Office of Educational Research and Improvement, U.S. Department of Education

Louise Cherry Wilkinson
Rutgers University

Production Editor
Katherine P. Hutchison
University of Georgia

Dissemination Coordinator
Jordana E. Rich
University of Georgia

Text Formatter
Ann Marie Vanstone
University of Georgia

NRRC - University of Georgia
318 Aderhold
University of Georgia
Athens, Georgia 30602-7125
(706) 542-3674  Fax: (706) 542-3678
INTERNET: [email protected]

NRRC - University of Maryland College Park
3216 J. M. Patterson Building
University of Maryland
College Park, Maryland 20742
(301) 405-8035  Fax: (301) 314-9625
INTERNET: [email protected]


About the National Reading Research Center

The National Reading Research Center (NRRC) is funded by the Office of Educational Research and Improvement of the U.S. Department of Education to conduct research on reading and reading instruction. The NRRC is operated by a consortium of the University of Georgia and the University of Maryland College Park in collaboration with researchers at several institutions nationwide.

The NRRC's mission is to discover and document those conditions in homes, schools, and communities that encourage children to become skilled, enthusiastic, lifelong readers. NRRC researchers are committed to advancing the development of instructional programs sensitive to the cognitive, sociocultural, and motivational factors that affect children's success in reading. NRRC researchers from a variety of disciplines conduct studies with teachers and students from widely diverse cultural and socioeconomic backgrounds in pre-kindergarten through grade 12 classrooms. Research projects deal with the influence of family and family-school interactions on the development of literacy; the interaction of sociocultural factors and motivation to read; the impact of literature-based reading programs on reading achievement; the effects of reading strategies instruction on comprehension and critical thinking in literature, science, and history; the influence of innovative group participation structures on motivation and learning; the potential of computer technology to enhance literacy; and the development of methods and standards for alternative literacy assessments.

The NRRC is further committed to the participation of teachers as full partners in its research. A better understanding of how teachers view the development of literacy, how they use knowledge from research, and how they approach change in the classroom is crucial to improving instruction. To further this understanding, the NRRC conducts school-based research in which teachers explore their own philosophical and pedagogical orientations and trace their professional growth.

Dissemination is an important feature of NRRC activities. Information on NRRC research appears in several formats. Research Reports communicate the results of original research or synthesize the findings of several lines of inquiry. They are written primarily for researchers studying various areas of reading and reading instruction. The Perspective Series presents a wide range of publications, from calls for research and commentary on research and practice to first-person accounts of experiences in schools. Instructional Resources include curriculum materials, instructional guides, and materials for professional growth, designed primarily for teachers.

For more information about the NRRC's research projects and other activities, or to have your name added to the mailing list, please contact:

Donna E. Alvermann, Co-Director
National Reading Research Center
318 Aderhold Hall
University of Georgia
Athens, GA 30602-7125
(706) 542-3674

John T. Guthrie, Co-Director
National Reading Research Center
3216 J. M. Patterson Building
University of Maryland
College Park, MD 20742
(301) 405-8035


NRRC Editorial Review Board

Peter Afflerbach, University of Maryland College Park
Jane Agee, University of Georgia
JoBeth Allen, University of Georgia
Janice F. Almasi, University of Buffalo-SUNY
Patty Anders, University of Arizona
Harriette Arrington, University of Kentucky
Maas Banning, University of Utah
Jill Bartoli, Elizabethtown College
Janet Benton, Bowling Green, Kentucky
Irene Blum, Pine Springs Elementary School, Falls Church, Virginia
David Bloome, Amherst College
John Borkowski, Notre Dame University
Karen Bromley, Binghamton University
Martha Carr, University of Georgia
Suzanne Clewell, Montgomery County Public Schools, Rockville, Maryland
Joan Coley, Western Maryland College
Michelle Commeyras, University of Georgia
Linda Cooper, Shaker Heights City Schools, Shaker Heights, Ohio
Karen Costello, Connecticut Department of Education, Hartford, Connecticut
Jim Cunningham, Gibsonville, North Carolina
Karin Dahl, Ohio State University
Marcia Delany, Wilkes County Public Schools, Washington, Georgia
Lynne Diaz-Rico, California State University-San Bernardino
Ann Egan-Robertson, Amherst College
Jim Flood, San Diego State University
Dana Fox, University of Arizona
Linda Gambrell, University of Maryland College Park
Mary Graham, McLean, Virginia
Rachel Grant, University of Maryland College Park
Barbara Guzzetti, Arizona State University
Frances Hancock, Concordia College of Saint Paul, Minnesota
Kathleen Heubach, University of Georgia
Sally Hudson-Ross, University of Georgia
Cynthia Hynd, University of Georgia
David Jardine, University of Calgary
Robert Jimenez, University of Oregon
Michelle Kelly, University of Utah
James King, University of South Florida
Kate Kirby, Gwinnett County Public Schools, Lawrenceville, Georgia
Linda Labbo, University of Georgia
Michael Law, University of Georgia
Donald T. Leu, Syracuse University
Susan Lytle, University of Pennsylvania
Bert Mangino, Las Vegas, Nevada
Susan Mazzoni, Baltimore, Maryland
Ann Dacey McCann, University of Maryland College Park
Sarah McCarthey, University of Texas at Austin
Veda McClain, University of Georgia
Lisa McFalls, University of Georgia
Randy McGinnis, University of Maryland
Mike McKenna, Georgia Southern University
Barbara Michalove, Fowler Drive Elementary School, Athens, Georgia
Elizabeth B. Moje, University of Utah
Lesley Morrow, Rutgers University
Bruce Murray, University of Georgia
Susan Neuman, Temple University
John O'Flahavan, University of Maryland College Park
Marilyn Ohlhausen-McKinney, University of Nevada
Penny Oldfather, University of Georgia
Barbara M. Palmer, Mount Saint Mary's College
Stephen Phelps, Buffalo State College
Mike Pickle, Georgia Southern University
Amber T. Prince, Berry College
Gaoyin Qian, Lehman College-CUNY
Tom Reeves, University of Georgia
Lenore Ringler, New York University
Mary Roe, University of Delaware
Nadeen T. Ruiz, California State University-Sacramento
Olivia Saracho, University of Maryland College Park
Paula Schwanenflugel, University of Georgia
Robert Serpell, University of Maryland Baltimore County
Betty Shockley, Fowler Drive Elementary School, Athens, Georgia
Wayne H. Slater, University of Maryland College Park
Margaret Smith, Las Vegas, Nevada
Susan Sonnenschein, University of Maryland Baltimore County
Bernard Spodek, University of Illinois
Steve Stahl, University of Georgia
Roger Stewart, University of Wyoming
Anne P. Sweet, Office of Educational Research and Improvement
Louise Tomlinson, University of Georgia
Bruce VanSledright, University of Maryland College Park
Barbara Walker, Eastern Montana University-Billings
Louise Waynant, Prince George's County Schools, Upper Marlboro, Maryland
Dera Weaver, Athens Academy, Athens, Georgia
Jane West, Agnes Scott College
Renee Weisburg, Elkins Park, Pennsylvania
Allen Wigfield, University of Maryland College Park
Shelley Wong, University of Maryland College Park
Josephine Peyton Young, University of Georgia
Hallie Yopp, California State University

About the Authors

Linda B. Gambrell is Associate Dean of Faculty Research and Professor of Curriculum and Instruction at the University of Maryland. She is also a principal investigator with the National Reading Research Center and a former classroom teacher and reading teacher at the elementary school level. In recent years, Dr. Gambrell's research has focused on comprehension processes and the role of children's literature in the reading program. She has published in The Reading Teacher, Journal of Reading, and Reading Research Quarterly, and is co-author, with Robert M. Wilson, of Reading Comprehension in the Elementary School, published by Allyn and Bacon. She is currently co-editor of the Journal of Reading Behavior and a member of the Board of Directors of the International Reading Association.

Barbara Martin Palmer, Ph.D., is Assistant Professor of Education and Associate Dean of Undergraduate Studies at Mount Saint Mary's College, Emmitsburg, MD. Previously, Dr. Palmer taught high school Spanish. As a Reading Specialist, she taught reading and study skills at a variety of levels. In addition to literacy motivation, her research interests include comprehension, metacognition, and teacher education. Presently, Dr. Palmer serves on the Board of Directors of the College Reading Association, sits on several review boards, and edits Literacy: Issues and Practices, the journal of the State of Maryland International Reading Association Council. Her publications have appeared in the National Reading Conference Yearbook, Educational Psychologist, Contemporary Psychology, and The Reading Teacher.

Rose Marie Codling is a doctoral candidate in the Department of Curriculum and Instruction at the University of Maryland. She is a former classroom teacher and research assistant at the National Reading Research Center. She currently teaches undergraduate courses in reading methods. Her research interests are in the areas of motivation and reading disability. Ms. Codling is a member of the International Reading Association. She has published in and serves on the editorial advisory board of Literacy: Issues and Practices, the journal of the State of Maryland International Reading Association Council.

Susan Anders Mazzoni is a graduate fellow in reading education at the University of Maryland College Park and a research assistant at the National Reading Research Center. She has worked with both adults and children as a literacy educator. Ms. Mazzoni is a member of the International Reading Association and Phi Delta Kappa.

Assessing Motivation To Read

Linda B. Gambrell
Barbara Martin Palmer
Rose Marie Codling
Susan Anders Mazzoni

University of Maryland College Park

National Reading Research Center
Universities of Georgia and Maryland

Instructional Resource No. 14
Summer 1995

Teachers have long recognized that motivation is at the heart of many of the pervasive problems we face in teaching young children to read. In a study conducted by Veenman (1984), teachers ranked motivating students as one of their primary and overriding concerns. A more recent national survey of teachers also revealed that "creating interest in reading" was rated as the most important area for future research (O'Flahavan, Gambrell, Guthrie, Stahl, & Alvermann, 1992). The value teachers place on motivation is supported by a robust research literature that documents the link between motivation and achievement (Elley, 1992; Gambrell & Morrow, in press; Guthrie, Schafer, Wang, & Afflerbach, 1993; Purves & Beach, 1972; Walberg & Tsai, 1985; Wixson & Lipson, 1991). The results of these studies clearly indicate the need to increase our understanding of how children acquire the motivation to develop into active, engaged readers.

Research supports the notion that literacy learning is influenced by a variety of motivational factors (Deci & Ryan, 1985; Eccles, 1983; Ford, 1992; Kuhl, 1986; Lepper, 1988; Maehr, 1976; McCombs, 1991; Wigfield, 1994). A number of current theories suggest that self-perceived competence and task value are major determinants of motivation and task engagement. For example, Eccles (1983) advanced an "expectancy-value" theory of motivation which stated that motivation is strongly influenced by one's expectation of success or failure at a task as well as the "value" or relative attractiveness the individual places on the task. The expectancy component of Eccles' theory is supported by a number of research studies which suggest that students who believe they are capable and competent readers are more likely to outperform those who do not hold such beliefs (Paris & Oka, 1986; Schunk, 1985). In addition, there is evidence which suggests that students who perceive reading as valuable and important and who have personally relevant reasons for reading will engage in reading in a more planful and effortful manner (Ames & Archer, 1988; Dweck & Elliott, 1983; Paris & Oka, 1986).

The work of other motivational theorists, such as Ford (1992) and Winne (1985), has been grounded in the expectancy-value theory. Ford's (1992) motivational systems theory maintained that people will attempt to attain goals they value and perceive as achievable. Similarly, Winne (1985) viewed the "idealized reader" as one who feels competent and perceives reading as being of personal value and practical importance. Given the emphasis on self-concept and task value in motivation theory, it seems important that teachers have resources for assessing both of these factors.

Table 1. Motivation to Read Profile

READING SURVEY                      CONVERSATIONAL INTERVIEW

Group administration                Individual administration
15-20 minutes to administer         15-20 minutes to administer
20 items                            14 scripted items
Likert-scale cued response          Open-ended free response
Subscales:                          Sections:
  Self-Concept As a Reader            Narrative Reading
  Value of Reading                    Informational Reading
                                      General Reading

A review of current instruments designed to assess reading motivation revealed a number of instruments for measuring students' general attitude toward reading (McKenna & Kear, 1990; Tunnell, Calder, Justen, & Phaup, 1988), as well as several that measure the specific dimension of self-concept (Harter, 1981; Pintrich & De Groot, 1990). However, none of these instruments specifically address the two fundamental components of motivation as suggested by motivational theory: self-concept and task value. In addition, none of the instruments combine the use of quantitative and qualitative approaches for assessing reading motivation. Our purpose was to develop a public-domain instrument that would provide teachers with an efficient and reliable way to quantitatively and qualitatively assess reading motivation by evaluating students' self-concept as a reader and the value they place on reading. This article presents the Motivation to Read Profile (MRP), along with a discussion of its development and suggestions for use with elementary students. The instruments, teacher directions, scoring sheet, and scoring instructions are provided in the Appendix.

Description of the Motivation to Read Profile

The MRP consists of two basic instruments: the Reading Survey and the Conversational Interview. The Reading Survey is a Likert-type, self-report, group-administered instrument, and the Conversational Interview is designed to be administered on an individual basis (see Table 1). The survey assesses two specific dimensions of reading motivation, self-concept as a reader and value of reading, while the interview provides information about the individual nature of students' reading motivation, such as what books and stories are most interesting, favorite authors, and where and how children locate reading materials that interest them most. Because the MRP combines information from a group-administered survey instrument with an individual interview, it is a useful tool for exploring more fully the personal dimensions of students' reading motivation. The MRP is highly individualized, making it particularly appropriate for inclusion in portfolio assessment.

The Reading Survey

This instrument (see Appendix) consists of 20 items and uses a 4-point Likert-type response scale. The survey assesses two specific dimensions of reading motivation: self-concept as a reader (10 items) and value of reading (10 items). The items that focus on self-concept as a reader are designed to elicit information about students' self-perceived competence in reading and self-perceived performance relative to peers. The value-of-reading items are designed to elicit information about the value students place on reading tasks and activities, particularly in terms of frequency of engagement and reading-related activities.

The Conversational Interview

The interview (see Appendix) is comprised of three sections. The first section probes motivational factors related to the reading of narrative text (3 questions); the second section elicits information about informational reading (3 questions); and the final section focuses on more general factors related to reading motivation (8 questions).

The interview is designed to initiate an informal, conversational exchange between the teacher and student. According to Burgess (1980), conversational interviews are social events that can provide greater depth of understanding than more rigid interview techniques. While conversational interviews are scripted, deviations from the script are anticipated and expected (Baker, 1984). The teacher is encouraged to deviate from the basic script in order to glean information that might otherwise be missed or omitted in a more formal, standardized interview approach. Teachers need to keep in mind that the primary purpose of the conversational interview is to generate information that will provide authentic insights into students' reading experiences. Participating in a conversational interview allows children to use their unique ways of describing their reading motivation and experiences, and it also allows them to raise ideas and issues related to personal motivation that may not be reflected in the scripted interview items (Denzin, 1970).

How was the MRP Developed?

Item selection for the MRP was based on a review of research and theories related to motivation and included an analysis of existing instruments designed to assess motivation and attitude toward reading. A number of instruments were examined in order to gather ideas for the development of an initial pool of MRP items (Gottfried, 1986; Harter, 1981; Johnson & Gaskins, 1991; McKenna & Kear, 1990; Pintrich & De Groot, 1990; Raynor & Nochajski, 1986; Schell, 1992; Tunnell et al., 1988).

An assessment instrument is useful only if it is valid and reliable. Validity refers to the instrument's ability to measure the trait it purports to measure, while reliability refers to the ability of the instrument to consistently measure that trait. To gain information about the validity and reliability of the MRP, the Reading Survey and the Conversational Interview were field tested.

Development and Field Testing of the Reading Survey

The criteria for item selection and development for the survey instrument included: (a) applicability to grades one through six; (b) applicability to all teaching approaches and materials; (c) suitability for group administration; and (d) accuracy in reflecting the appropriate dimension of motivation (i.e., self-concept or value). All survey items employ a Likert-type response scale. A 4-point scale was used to avoid neutral, central response patterns. A 4-point scale also seemed more appropriate for elementary students as there is some evidence to suggest that young children have difficulty simultaneously discriminating among more than five discrete categories (Case & Khanna, 1981; Nitko, 1983). In order to avoid repetition in the presentation of the response alternatives and to control for the threat of "response set" (i.e., children selecting the same responses for each item), some response alternatives proceed from most positive to least positive while others are ordered in the opposite way.

An initial pool of survey items was developed based on the criteria described above. Three experienced classroom teachers, who were also graduate students in reading, critiqued over 100 items for their construct validity in assessing students' self-concept or value of reading. The items that received 100% agreement by the teachers were then compiled. The agreed-upon items were then submitted to four classroom teachers who were asked to sort the items into three categories of function: (1) measures self-concept, (2) measures value of reading, and (3) not sure or questionable. Only those items that received 100% trait agreement were selected for inclusion on the Reading Survey instrument.

The final version of the Reading Survey instrument was field tested in the late fall with 330 third- and fifth-grade students in 27 classrooms in four schools from two school districts in an eastern state. To assess the internal consistency of the Reading Survey, Cronbach's (1951) alpha statistic was calculated, revealing a moderately high reliability for both third grade (.70) and fifth grade (.76).
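For teachers or researchers who want to run the same internal-consistency check on their own survey data, here is a minimal sketch, not part of the original MRP materials, of how Cronbach's (1951) alpha can be computed. The `cronbach_alpha` function name and the score matrix are our own illustrative assumptions, and the sketch assumes items have already been recoded so that higher values are always more positive.

```python
# Minimal sketch: Cronbach's (1951) alpha for Likert-type survey items.
# Rows are students, columns are items, cells are recoded 1-4 scores.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: array of shape (n_students, n_items)."""
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)        # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)    # variance of students' total scores
    return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical example: 8 students answering the 10 Self-Concept items.
example = np.array([
    [3, 4, 3, 3, 4, 3, 2, 3, 4, 3],
    [2, 2, 3, 2, 2, 3, 2, 2, 2, 3],
    [4, 4, 4, 3, 4, 4, 4, 3, 4, 4],
    [1, 2, 1, 2, 1, 2, 1, 1, 2, 1],
    [3, 3, 2, 3, 3, 3, 3, 2, 3, 3],
    [2, 3, 2, 2, 3, 2, 3, 2, 2, 2],
    [4, 3, 4, 4, 3, 4, 4, 4, 3, 4],
    [1, 1, 2, 1, 2, 1, 1, 2, 1, 2],
])
print(f"alpha = {cronbach_alpha(example):.2f}")
```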

Development and Field Testing of the Conversational Interview

Approximately 60 open-ended questions regarding narrative and informational reading, general and specific reading experiences, and home and school reading practices were developed for the initial pool of interview items. These items were field tested in the spring with a stratified random sample of 48 students (24 third-grade and 24 fifth-grade students). The classroom teachers were asked to identify these students according to three reading-ability levels: (1) at grade level, (2) above grade level, and (3) below grade level. The teachers were then asked to identify, within each of the three ability-level lists, the two most "highly motivated readers" and the two "least motivated readers." Twenty-four students from the list of most highly motivated readers and 24 students from the list of least motivated readers participated in the field testing of the 60 interview items. Two graduate students, who were former classroom teachers, analyzed the 48 student protocols and selected 14 questions that revealed the most useful information about students' motivation to read. These 14 questions were used for the final version of the Conversational Interview.

Validation of the MRP

An additional step was taken to validate the MRP. Responses to the survey and the interview were examined for consistency of information across the two instruments. The survey and interview responses of two highly motivated and two less motivated readers were randomly selected for analysis. Two independent raters compared the student responses on the survey instrument with their responses on the interview for each of the 4 students. For example, one item on the survey asks the students to indicate whether they think they are a "very good reader," "good reader," "OK reader," or "poor reader." Comments made during the conversational interview were then analyzed to determine if students provided any confirming evidence regarding their self-perceived competence in reading as they reported on the survey instrument.

Two raters independently compared each student's responses to items on the survey with information provided during the interview, with an interrater agreement of .87. There was consistent, supporting information in the interview responses for approximately 70% of the information tapped in the survey instrument. The results of these data analyses support the notion that the children responded consistently on both types of assessment instruments (survey, interview) and across time (fall, spring).
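The interrater agreement reported above is a simple proportion of matching judgments. As a hedged illustration only, the sketch below shows the arithmetic; the two raters' judgment lists are invented and do not come from the MRP field test.

```python
# Minimal sketch: percent agreement between two raters who each judged whether
# a student's interview comments confirmed ("yes") or did not confirm ("no")
# the corresponding survey response. Both judgment lists are hypothetical.
rater_1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
rater_2 = ["yes", "yes", "no", "yes", "yes", "yes", "yes", "no"]

matches = sum(a == b for a, b in zip(rater_1, rater_2))
agreement = matches / len(rater_1)
print(f"interrater agreement = {agreement:.2f}")  # 7 of 8 judgments match -> 0.88
```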

Administering the MRP

The MRP combines group and individual assessment procedures. The Reading Survey instrument can be administered to an entire class, a small group, or an individual, while the Conversational Interview is designed to be conducted on an individual basis.

Administration and Scoring of the Reading Survey

The administration of the Reading Survey instrument takes approximately 15-20 min. Teachers should take into consideration grade level and attention span when deciding how and when to administer the survey instrument. For example, teachers of young children may decide to administer the first 10 items in one session and the final 10 during a second session.

The survey is designed to be read aloud to students (see Appendix for Teacher Directions). One of the problems inherent in much of the motivational research is that reading ability often confounds the results, so that proficient, higher ability readers are typically identified as "motivated," while less proficient, lower ability readers are identified as "unmotivated." Research indicates that this characterization is inaccurate and that there are proficient readers who are not highly motivated to read, just as there are less proficient readers who are highly motivated to read (McCombs, 1991; Roettger, 1980). When students are instructed to read independently and respond to survey items, the results for the less proficient, lower-ability readers may not be reliable due to their frustration when reading the items. For these reasons, the survey instrument is designed to be read aloud by the teacher to help ensure the veracity of student responses.

It is also important that students understand that their responses to the survey items will not be "graded." They should be told that the results of the survey will provide information that the teacher can use to make reading more interesting for them and that the information will only be helpful if they provide their most honest responses.

Directions for scoring the Reading Survey as well as a scoring sheet are provided (see Appendix). When scoring the survey, the more positive response is assigned the highest number (i.e., 4) while the least positive response is assigned the lowest number (i.e., 1). For example, if a student reported that s/he is a "good" reader, a "3" would be recorded. A percentage score on the Reading Survey can be computed for each student as well as scores on the two subscales (Self-Concept As A Reader and Value of Reading). Space is also provided at the bottom of the Scoring Sheet for the teacher to note any interesting or unusual responses that might be probed later during the conversational interview.

Administration of the Conversational Interview

The Conversational Interview is designed to elicit information that will help the teacher gain a deeper understanding of a student's reading motivation in an informal, conversational manner (see Appendix for Teacher Directions). The entire interview takes approximately 15-20 min but can easily be conducted in three 5-7 min sessions, one for each of the three sections of the interview (narrative, informational, and general reading). Individual portfolio conferences are an ideal time to conduct the interview.

We suggest that teachers review student responses on the Reading Survey prior to conducting the Conversational Interview so that they may contemplate and anticipate possible topics to explore during the interview phase of the MRP. During a conversational interview, some children will talk enthusiastically without probing, while others may need support and encouragement. Children who are shy or who tend to reply in short, quick answers can be encouraged to elaborate upon their responses using nonthreatening phrases like "Tell me more about that . . .", "What else can you tell me . . .", and "Why do you think that . . . ." Probing of brief responses from children is often necessary in order to reveal important and relevant information.

Teachers are also encouraged to extend, modify, and adapt the 14 questions outlined in the Conversational Interview, especially during conversations with individual students. Follow-up questions based on comments made by the students often provide the most significant information in such an interview.

Using the Results of the MRP to Make Instructional Decisions

Information derived from an analysis of the results of the MRP can be used to plan instructional activities that will support students in their reading development. The following list provides some ideas for ways in which the results can be used to enhance literacy learning. First, specific recommendations are presented for using the results of the Reading Survey and the Conversational Interview. Then, general recommendations for using the MRP are provided.

Using the Results of the Reading Survey

Because of the highly individualized nature of motivation, perhaps the best use of the information derived from the Reading Survey is a careful examination of an individual's responses. Individual responses to this survey instrument may provide valuable insights which can be used to create more meaningful, motivational contexts for reading instruction. For example, if a child indicates on the survey form that "reading is very hard" and that "reading is boring," the teacher can suggest books which are of particular interest to the child and which the child can read with ease.

A total score and scores on the two subscales of the Reading Survey (Self-Concept As A Reader and Value of Reading) can be computed for each student. Teachers can then identify those children who have lower scores in these areas. These students may be the ones who are in need of additional support in developing motivation to read and may benefit from interventions to promote reading engagement.

Students who have lower subscores on the Self-Concept As A Reader scale may benefit from experiences that highlight successful reading. For example, to build feelings of competence, the teacher can arrange for the child to read books to children in lower grades.

Students who have lower subscores on the Value of Reading scale may benefit from experiences that emphasize meaningful purposes for reading. For example, the teacher can ask the child to read about how to care for a class pet or could involve the child in class plays or skits.

If the class, as a whole, scores low on the Value of Reading scale, the teacher can implement meaningful cooperative group activities where children teach one another about what they have read regarding a particular topic. The teacher can also involve the class in projects which require reading instructions (e.g., preparing a recipe, creating a crafts project, or performing a science experiment).

Class averages for the total score and subscores on the Reading Survey (Self-Concept As A Reader and Value of Reading) can be computed. This information may be helpful in obtaining an overview of the classroom level of motivation at various points throughout the school year.

Teachers may also analyze class responses to an individual item on the Reading Survey (a brief sketch of this kind of item-level analysis follows this list). For example, if many children indicate on the survey instrument that they seldom read at home, the teacher may decide to implement a home reading program, or the teacher might discuss the importance of home reading and parent involvement during Parent Night. Another survey item asks children to complete the following statement: "I think libraries are . . . ." If many students report a negative response toward libraries, the teacher can probe the class for further information in order to identify reasons which can then be addressed.
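The item-level class analysis mentioned above can be done by hand on the Scoring Sheets or, as in the minimal sketch below, with a few lines of code. The data, the item numbers, and the 2.5 cut-off used for flagging are illustrative assumptions rather than part of the MRP.

```python
# Minimal sketch: class-level averages for individual Reading Survey items.
# Each dict holds one student's recoded 1-4 scores, keyed by item number.
# Item 10 is the "I think libraries are . . ." item discussed above.
class_scores = [
    {2: 3, 10: 2, 16: 3},   # student A
    {2: 4, 10: 1, 16: 3},   # student B
    {2: 3, 10: 2, 16: 4},   # student C
    {2: 2, 10: 1, 16: 2},   # student D
]

for item in sorted(class_scores[0]):
    values = [student[item] for student in class_scores]
    mean = sum(values) / len(values)
    flag = "  <- consider follow-up" if mean < 2.5 else ""
    print(f"item {item:2d}: class mean = {mean:.2f}{flag}")
```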

Using the Results of the Conversational Interview

The primary purpose of the Conversational Interview is to gain insight into what motivates the student to engage in reading. Therefore, the interview questions focus on reading that students find "most interesting." This information can inform the teacher about specific topics, books, and authors that the individual student finds engaging and motivating.

The Conversational Interview might also reveal particular activities related to reading that the child enjoys. For example, one child in our field study mentioned his father several times during the interview: reading to his father, telling his father about something interesting he had read, and selecting and buying books with his father. In such a situation, a teacher can suggest home activities or even specific books that the father and child might enjoy reading at home.

Class responses to items on the Conversational Interview may also reveal useful information. For example, if many children express interest in a particular topic, teachers may find ways to include reading activities regarding the topic in their instructional programs. Many children may also express the same interest in a particular instructional activity which involves reading, such as inviting guest readers into the classroom or a "Young Authors' Night" where children present their stories to parents and guests. This information can then be taken into account for future planning.

General Recommendations for Using the MRP

The MRP can provide a means of assessing and monitoring student responses to innovations in the classroom that are designed to promote reading motivation. For example, the teacher might collect information using the MRP prior to and following the implementation of a reading motivational intervention, such as a sustained silent reading program or involvement in a classroom or a schoolwide reading motivational program. The information from the MRP can serve as a means of monitoring and documenting the effect of classroom innovations on student motivation.


The MRP can be given at the beginning of the year to provide the teacher with profiles of each child. This information can be placed in children's reading portfolios. Teachers may decide to administer the MRP several times throughout the school year so that changes in the child's attitudes and interests about reading can be documented and compared.

The MRP can be administered at each grade level and the assessment data retained so that teachers can compare changes in a child's self-concept as a reader and value of reading as s/he progresses from grade to grade.

These are only a sampling of ideas of the ways in which the MRP can be used in the classroom. Each teacher will have his/her own particular insights about ways in which the MRP information can best be applied to meet the needs of students.

Cautions About Interpreting Responses to the MRP

It is important to recognize that although there is support for the reliability and validity of the MRP, it is a self-report instrument, and it has limitations that are commonly associated with such instruments. For example, it is impossible to determine from self-report instruments alone whether or not students actually feel, believe, or do the things they report. Even though the elaborate, descriptive information gleaned from the interview can substantiate survey responses to some extent, it is only through careful observation that teachers can verify information derived from the MRP.

Also, one should be cautious when interpreting responses to individual items due to the contextual nature of reading motivation. For example, a student might feel highly competent as a reader when reading high-interest, self-selected narrative materials and yet feel far less competent when reading content area materials. It is more important to look across the survey and interview responses to determine patterns that reveal factors that are relevant to the student's reading motivation.

Finally, as with any assessment, the MRP should be used in conjunction with other assessment instruments, techniques, and procedures. Teachers should consider the MRP as one source of information about reading motivation.

Summary

Teachers today view motivation as an integral component of reading instruction. In addition, there are a number of studies that suggest a connection between motivation and achievement. Current motivational theory emphasizes the role of self-perceived competence and task value as determinants of motivation and task engagement. The Motivation to Read Profile was developed to provide teachers with an efficient and reliable instrument for assessing reading motivation by evaluating students' self-concept as a reader and the value they place on reading. In addition, the assessment instrument provides both quantitative and qualitative information by combining the use of a survey instrument and an individual interview.


There are a number of ways in which the MRP can be used to make instructional decisions, and teachers are in the best position to decide how they will apply the information gleaned from the MRP in their classrooms. Ideally, the MRP will help teachers acquire insights about individual students, particularly those students about whom teachers worry most in terms of their reading motivation and development. The individualized nature of the information derived from the MRP makes this instrument particularly appropriate for inclusion in portfolio assessment. Careful scrutiny of the responses to the Reading Survey and the Conversational Interview, coupled with teacher observations of student behaviors in various classroom reading contexts, can help teachers plan for meaningful instruction that will support students in becoming highly motivated readers.


References

Ames, C., & Archer, J. (1988). Achievement goals in the classroom: Students' learning strategies and motivation processes. Journal of Educational Psychology, 80, 260-267.

Baker, C. D. (1984). The search for adultness: Membership work in adolescent-adult talk. Human Studies, 7, 301-323.

Burgess, R. (1980). Field research: A sourcebook and field manual. London: Allen & Unwin.

Case, R., & Khanna, F. (1981). The missing links: Stages in children's progression from sensorimotor to logical thought. In K. W. Fischer (Ed.), Cognitive development: New directions for child development. San Francisco: Jossey-Bass.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297-334.

Deci, E., & Ryan, R. (1985). Intrinsic motivation and self-determination in human behavior. New York: Plenum.

Denzin, N. (1970). The research act in sociology. London: Butterworth.

Dweck, C., & Elliott, E. (1983). Achievement motivation. In E. M. Hetherington (Ed.), Handbook of child psychology: Vol. 4. Socialization, personality, and social development (pp. 643-691). New York: Wiley.

Eccles, J. (1983). Expectancies, values and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives (pp. 75-146). San Francisco: Freeman.

Elley, W. B. (1992). How in the world do students read? Hamburg, Germany: International Association for the Evaluation of Educational Achievement.

Ford, M. E. (1992). Motivating humans. Newbury Park: Sage.

Gambrell, L. B., & Morrow, L. M. (in press). Creating motivating contexts for literacy learning. In L. Baker, P. Afflerbach, & D. Reinking (Eds.), Developing engaged readers in home and school communities. Hillsdale, NJ: Erlbaum.

Gottfried, A. E. (1986). Children's academic intrinsic motivation inventory. Odessa, FL: Psychological Assessment Resources.

Guthrie, J. T., Schafer, W., Wang, Y., & Afflerbach, P. (1993). Influences of instruction on reading engagement: An empirical exploration of a social-cognitive framework of reading activity (Reading Research Report No. 3). Athens, GA: NRRC, Universities of Georgia and Maryland College Park.

Harter, S. (1981). A new self-report scale of intrinsic versus extrinsic orientation in the classroom: Motivational and informational components. Developmental Psychology, 17, 300-312.

Johnson, C. S., & Gaskins, J. (1991). Reading attitude: Types of materials and specific strategies. Reading Improvement, 28, 237-242.

Kuhl, J. (1986). Introduction. In J. Kuhl & J. W. Atkinson (Eds.), Motivation, thought, and action (pp. 1-16). New York: Praeger.

Lepper, M. R. (1988). Motivational considerations in the study of instruction. Cognition and Instruction, 5, 289-309.

Maehr, M. L. (1976). Continuing motivation: An analysis of a seldom considered educational outcome. Review of Educational Research, 46, 443-462.

McCombs, B. L. (1991). Unraveling motivation: New perspectives from research and practice. The Journal of Experimental Education, 60, 3-88.

McKenna, M. C., & Kear, D. J. (1990). Measuring attitude toward reading: A new tool for teachers. The Reading Teacher, 43, 626-639.

Nitko, A. J. (1983). Educational tests and measurement: An introduction. New York: Harcourt Brace Jovanovich.


O'Flahavan, J., Gambrell, L. B., Guthrie, J., Stahl, S., Baumann, J., & Alvermann, D. (1992). Poll results guide activities of research center. Reading Today, 10(1), 12.

Paris, S. G., & Oka, E. R. (1986). Self-regulated learning among exceptional children. Exceptional Children, 53, 103-108.

Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33-40.

Purves, A., & Beach, R. (1972). Literature and the reader: Research on response to literature, reading interests, and teaching of literature. Urbana, IL: National Council of Teachers of English.

Raynor, J. O., & Nochajski, T. H. (1986). Development of the motivation for particular activity scale. In D. R. Brown & J. Veroff (Eds.), Frontiers of motivational psychology (pp. 1-25). New York: Springer-Verlag.

Roettger, D. (1980). Elementary students' attitudes toward reading. The Reading Teacher, 33, 451-453.

Schell, L. M. (1992). Student perceptions of good and poor readers. Reading Improvement, 29, 50-55.

Schunk, D. (1985). Self-efficacy and school learning. Psychology in the Schools, 22, 208-223.

Tunnell, M. O., Calder, J. E., Justen, J. E., & Phaup, E. S. (1988). Attitudes of young readers. Reading Improvement, 25, 237-242.

Veenman, S. (1984). Perceived problems of beginning teachers. Review of Educational Research, 54, 143-178.

Walberg, H. J., & Tsai, S. (1985). Correlates of reading achievement and attitude: A national assessment study. Journal of Educational Research, 78, 159-167.

Wigfield, A. (1994). Expectancy-value theory of achievement motivation: A developmental perspective. Educational Psychology Review, 6(1), 48-78.

Winne, P. (1985). Steps toward promoting cognitive achievements. Elementary School Journal, 85, 673-693.

Wixson, K. K., & Lipson, M. Y. (1991). Reading diagnosis and remediation. Glenview, IL: Scott Foresman.


APPENDIX

Motivation to Read Profile

Reading Survey

Conversational Interview

Teacher Directions: MRP Reading Survey

Scoring Directions: MRP Reading Survey

Scoring Sheet

Teacher Directions: MRP Conversational Interview


TEACHER DIRECTIONS: MRP READING SURVEY

Distribute copies of the Reading Survey. Ask students to write their names on the space provided.

Say: I am going to read some sentences to you. I want to know how you feel about your reading. There are no right or wrong answers. I really want to know how you honestly feel about reading.

I will read each sentence twice. Do not mark your answer until I tell you to. The first time I read the sentence, I want you to think about the best answer for you. The second time I read the sentence, I want you to fill in the space beside your best answer. Mark only one answer. Remember: Do not mark your answer until I tell you to. Okay, let's begin.

Read the first sample item. Say: Sample #1: I am in (pause) 1st grade, (pause) 2nd grade, (pause) 3rd grade, (pause) 4th grade, (pause) 5th grade, (pause) 6th grade.

Read the first sample again. Say: This time as I read the sentence, mark the answer that is right for you. I am in (pause) 1st grade, (pause) 2nd grade, (pause) 3rd grade, (pause) 4th grade, (pause) 5th grade, (pause) 6th grade.

Read the second sample item. Say: Sample #2: I am a (pause) boy, (pause) girl.

Say: Now, get ready to mark your answer. I am a (pause) boy, (pause) girl.

Read the remaining items in the same way (e.g., number, sentence stem followed by a pause, each option followed by a pause, and then give specific directions for students to mark their answer while you repeat the entire item).

MOTIVATION TO READ PROFILE

READING SURVEY

Name Date

Sample #1: I am in

O 1st grade
O 2nd grade
O 3rd grade
O 4th grade
O 5th grade
O 6th grade

Sample #2: I am a

O boy
O girl

1. My friends think I am

O a very good reader
O a good reader
O an OK reader
O a poor reader

2. Reading a book is something I like to do.

O Never
O Not very often
O Sometimes
O Often

3. I read

O not as well as my friends
O about the same as my friends
O a little better than my friends
O a lot better than my friends

4. My best friends think reading is

O really fun
O fun
O OK to do
O no fun at all

5. When I come to a word I don't know, I can

O almost always figure it out
O sometimes figure it out
O almost never figure it out
O never figure it out

6. I tell my friends about good books I read.

O I never do this.
O I almost never do this.
O I do this some of the time.
O I do this a lot.

7. When I am reading by myself, I understand

O almost everything I read
O some of what I read
O almost none of what I read
O none of what I read

8. People who read a lot are

O very interesting
O interesting
O not very interesting
O boring

9. I am

O a poor reader
O an OK reader
O a good reader
O a very good reader

10. I think libraries are

O a great place to spend time
O an interesting place to spend time
O an OK place to spend time
O a boring place to spend time

11. I worry about what other kids think about my reading

O every day
O almost every day
O once in a while
O never

12. Knowing how to read well is

O not very important
O sort of important
O important
O very important

13. When my teacher asks me a question about what I have read, I

O can never think of an answer
O have trouble thinking of an answer
O sometimes think of an answer
O always think of an answer

14. I think reading is

O a boring way to spend time
O an OK way to spend time
O an interesting way to spend time
O a great way to spend time

15. Reading is

O very easy for me
O kind of easy for me
O kind of hard for me
O very hard for me

16. When I grow up I will spend

O none of my time reading
O very little of my time reading
O some of my time reading
O a lot of my time reading

17. When I am in a group talking about stories, I

O almost never talk about my ideas
O sometimes talk about my ideas
O almost always talk about my ideas
O always talk about my ideas

18. I would like for my teacher to read books out loud to the class

O every day
O almost every day
O once in a while
O never

19. When I read out loud I am a

O poor reader
O OK reader
O good reader
O very good reader

20. When someone gives me a book for a present, I feel

O very happy
O sort of happy
O sort of unhappy
O unhappy


SCORING DIRECTIONS: MRP READING SURVEY

The survey has 20 items based on a 4-point Likert scale. The highest total score possible is 80 points, which would be achieved if a student selects the most positive response for every item on the survey. On some items, the response options are ordered least positive to most positive (see item #2 below), with the least positive response option having a value of 1 point and the most positive option having a point value of 4. On other items, however, the response options are reversed (see item #1 below). In those cases, it will be necessary to recode the response options. Items where recoding is required are starred on the Scoring Sheet.

EXAMPLE: Here is how Maria completed items 1 and 2 on the Reading Survey.

1. My friends think I am

O a very good reader
● a good reader
O an OK reader
O a poor reader

2. Reading a book is something I like to do.

O Never
O Not very often
O Sometimes
● Often

To score item 1, it is first necessary to recode the response options so that

a poor reader equals 1 point,
an OK reader equals 2 points,
a good reader equals 3 points,
a very good reader equals 4 points.

Since Maria answered that she is a good reader, the point value for that item, 3, is entered on the first line of the Self-Concept column on the Scoring Sheet. See below.

The response options for item 2 are ordered least positive (1 point) to most positive (4 points), so scoring item 2 is an easy process. Simply enter the point value associated with the response that Maria chose. Because Maria selected the fourth option, a 4 is entered for item #2 under the Value of Reading column on the Scoring Sheet. See below.

Scoring Sheet

Self-Concept as Reader          Value of Reading

*recode 1. _3_                  2. _4_

To calculate the Self-Concept raw score and Value raw score, add all student responses in the respective column. The Full Survey raw score is obtained by combining the column raw scores. To convert the raw scores to percentage scores, it is necessary to divide student raw scores by the total possible score (40 for each subscale, 80 for the full survey).
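The arithmetic described above can also be expressed in code. The following is a minimal sketch, not part of the MRP materials, that applies the recoding scale, the two 10-item subscales, and the percentage conversion from these directions; the function and variable names are our own.

```python
# Minimal sketch of the MRP Reading Survey scoring procedure described above.
# Input: a dict mapping item number (1-20) to the position of the option the
# student marked (1-4, counted in the order the options are printed).
REVERSED_ITEMS = {1, 4, 5, 7, 8, 10, 11, 15, 18, 20}      # starred on the Scoring Sheet
SELF_CONCEPT_ITEMS = {1, 3, 5, 7, 9, 11, 13, 15, 17, 19}  # odd-numbered items
# The even-numbered items make up the Value of Reading subscale.

def score_reading_survey(responses):
    self_concept_raw = 0
    value_raw = 0
    for item, choice in responses.items():
        points = 5 - choice if item in REVERSED_ITEMS else choice  # recode 1<->4, 2<->3
        if item in SELF_CONCEPT_ITEMS:
            self_concept_raw += points
        else:
            value_raw += points
    full_raw = self_concept_raw + value_raw
    return {
        "self_concept_raw": self_concept_raw,          # out of 40
        "value_raw": value_raw,                        # out of 40
        "full_survey_raw": full_raw,                   # out of 80
        "self_concept_pct": 100.0 * self_concept_raw / 40,
        "value_pct": 100.0 * value_raw / 40,
        "full_survey_pct": 100.0 * full_raw / 80,
    }

# Maria's two answers from the example: item 1, second option ("a good reader",
# which recodes to 3 points) and item 2, fourth option ("Often", 4 points).
print(score_reading_survey({1: 2, 2: 4}))
```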

MRP READING SURVEY SCORING SHEET

Student Name

Grade Teacher

Administration Date

Recoding scale: 1 = 4, 2 = 3, 3 = 2, 4 = 1

Self-Concept as Reader              Value of Reading

*recode  1. ____                             2. ____
         3. ____                    *recode  4. ____
*recode  5. ____                             6. ____
*recode  7. ____                    *recode  8. ____
         9. ____                    *recode 10. ____
*recode 11. ____                            12. ____
        13. ____                            14. ____
*recode 15. ____                            16. ____
        17. ____                    *recode 18. ____
        19. ____                    *recode 20. ____

SC Raw Score: ____ /40              V Raw Score: ____ /40

Full survey raw score (Self-Concept & Value): ____ /80

Percentage Scores:  Self-Concept ____   Value ____   Full survey ____

Comments:

TEACHER DIRECTIONS: MRP CONVERSATIONAL INTERVIEW

1. Duplicate the Conversational Interview so that you have a form for each child.

2. Choose in advance the section(s) or specific questions you want to ask from the Conversational Interview. Reviewing the information on students' Reading Surveys may provide information about additional questions that could be added to the interview.

3. Familiarize yourself with the basic questions provided in the interview prior to the interview session in order to establish a more conversational setting.

4. Select a quiet corner of the room and a calm period of the day for the interview.

5. Allow ample time for conducting the conversational interview.

6. Follow up on interesting comments and responses to gain a fuller understanding of their reading experiences.

7. Record students' responses in as much detail as possible. If time and resources permit, you may want to audiotape answers to A1 and B1 to be transcribed after the interview for more in-depth analysis.

8. Enjoy this special time with each student!

MOTIVATION TO READ PROFILE

CONVERSATIONAL INTERVIEW

Student Name: Date:

A. Emphasis: Narrative Text

Suggested Prompt (designed to engage student in a natural conversation):

I have been reading a good book . . . I was talking with . . . about it last night. I enjoy talking about good stories and books that I've been reading. Today I'd like to hear about what you have been reading.

1. Tell me about the most interesting story or book you have read this week (or even last week). Take a few minutes to think about it. (Wait time.) Now, tell me about the book or story.

Probes: What else can you tell me? Is there anything else?

2. How did you know or find out about this story?

assigned        in school
chosen          out of school

3. Why was this story interesting to you?

B. Emphasis: Informational Text

Suggested Prompt (designed to engage student in a natural conversation):

Often we read to find out about something or to learn about something. We read for information. For example, I remember a student of mine . . . who read a lot of books about . . . to find out as much as he/she could about . . . . Now, I'd like to hear about some of the informational reading you have been doing.

1. Think about something important that you learned recently, not from your teacher and not from television, but from a book or some other reading material. What did you read about? (Wait time.) Tell me what you learned.

Probes: What else could you tell me? Is there anything else?

2. How did you know or find out about this book/article?

assigned        in school
chosen          out of school

3. Why was this book (or article) important to you?

C. Emphasis: General Reading

1. Did you read anything at home yesterday? What?

2. Do you have any books at school (in your desk/storage area/locker/bookbag) today that you are reading? Tell me about them.

3. Tell me about your favorite author.

4. What do you think you have to learn to be a better reader?

5. Do you know about any books right now that you'd like to read? Tell me about them.

6. How did you find out about these books?

7. What are some things that get you really excited about reading books?

Tell me about . . .

8. Who gets you really interested and excited about reading books?

Tell me more about what they do.

NRRC National Reading Research Center
318 Aderhold, University of Georgia, Athens, Georgia 30602-7125
3216 J. M. Patterson Building, University of Maryland, College Park, MD 20742