Monitoring Student Progress: Administrative Issues
Doug Marston
John Hintze
July 8, 2005

TRANSCRIPT

Page 1:

Monitoring Student Progress: Administrative Issues

Doug Marston

John Hintze

July 8, 2005

Page 2:

Monitoring Student Progress: Administrative Issues

Part I of this presentation is found in the accompanying History presentation

III. Administrative Leadership and Support for Success in Implementing Progress Monitoring
– Mike Schmoker, Results: The Key to Continuous School Improvement
– John Hintze, Professor, University of Massachusetts
– Bonnie Glazewski, Assistant Principal, Oak View Elementary
– Barriers to Implementation
– Concerns-Based Adoption Model: SoCQ
– Stan Deno, Erica Lembke, & Amy Reschly: Leadership for Developing a School-wide Progress Monitoring System

IV. Group Activity: Resources for Data Leaders
– Progress Monitoring (Deno, Lembke & Reschly)
– School Improvement Data Selection Tool (Heartland AEA, Iowa)

V. Questions & Answers

Page 3:

III. Administrative Leadership and Support for Success in Implementing Progress Monitoring

Page 4:

III. Administrative Leadership and Support for Success in Implementing Progress Monitoring

Mike Schmoker: Results: The Key to Continuous School Improvement

John Hintze, Professor, University of Massachusetts

Bonnie Glazewski, Assistant Principal, Oak View Elementary

Barriers to Implementation

Concerns-Based Adoption Model: SoCQ

Stan Deno, Erica Lembke, & Amy Reschly: Leadership for Developing a School-wide Progress Monitoring System

Page 5:

The Keys to Improving Schools*

Effective Teamwork

Measurable Goals

Performance Data

* Schmoker, M. (1999). Results: The Key to Continuous School Improvement.

Page 6:

Effective Teamwork (Schmoker, 1999)

“Collegiality among teachers, as measured by the frequency of communication, mutual support, help, etc., was a strong indicator of implementation success. Virtually every research study on the topic has found this to be the case” (Fullan, 1991, p. 132).

Warning: “Much of what we call teamwork or collegiality does not favor nor make explicit what should be its end: better results for children … the weaker, more common forms of collegiality ‘serve only to confirm present practice without evaluating its worth’” (Schmoker, p. 15).

Page 7:

Measurable Goals: Criteria for Effective Goals (Schmoker, 1999)

Measurable.

Annual: reflecting an increase over the previous year in the percentage of students achieving mastery.

Focused, with occasional exceptions, on student achievement.

Linked to a year-end assessment or other standards-based means of measuring an established level of performance.

Written in simple, direct language that can be understood by almost any audience.

Page 8:

Performance Data (Schmoker, 1999)

“Teachers can base teaching decisions on solid data rather than on assumptions, and they can make adjustments early on to avoid the downward spiral of remediation”

(Waters, Burger, and Burger, 1995, p. 39).

Page 9:

Stressing the connection between teamwork and the analysis of data, Fullan adds that “the crux of the matter is getting the right people together with the right information at their disposal” (1991, p. 87).

“Part of the reason we dismiss this call for data is the outworn mind-set that because schools are so different from other organizations, quality and learning will thrive spontaneously, without any formal effort to use data equivalent to what other organizations use routinely. Schools generally avoid goals and precise means of measuring progress toward them”

(Schmoker, 2001, p. 39).

Page 10:

Group Data vs. Conventional Data

“Lortie found that educators do not seek to identify and address patterns of success and failure, which can have broad and continuous benefits for greater numbers of children…the real power of data emerges when they enable us to see—and address—patterns of instructional program strengths or weaknesses, thus multiplying the number of individual students we can help”

(Schmoker, p. 43).

Page 11:

Ten Most Frequently Cited Barriers to Implementation of Curriculum-Based Measurement (Yell, Deno & Marston)

1. Need for a variety of instructional strategies when data indicate a change is necessary.
2. Collecting data but not using it for instructional decisions.
3. CBM represents change, which creates anxiety and resistance.
4. Ongoing training for general and special education staff.
5. CBM at the secondary level.
6. Logistics of monitoring and making changes.
7. Staff resistance to making instructional changes.
8. Support necessary for new users.
9. Adequate staffing.
10. Concern over the relationship between fluency and comprehension.

Page 12:

Why is fluency important?

Samuels (1979) notes that reading fluency and comprehension are intertwined: “…As less attention is required for decoding, more attention becomes available for comprehension.”

According to the Commission on Reading’s (1985) report, Becoming a Nation of Readers, “readers must be able to decode words quickly and accurately so that this process can coordinate fluidly with the process of constructing the meaning of the text.”

Page 13:

“…the National Assessment of Educational Progress conducted a large study of the status of fluency achievement in American education” (Pinnell et al., 1995). The study:

– Found 44% of students to be disfluent, even with grade-level stories that the students had read under supportive testing conditions.

– Found a close relationship between fluency and reading comprehension. Students who are low in fluency may have difficulty getting the meaning of what they read.

Page 14:

Ideas for Saving Time, Increasing Efficiency and Minimizing Disruption of Small Group Instruction

Create the expectation with students that “reading aloud” is part of instruction.

Monitor once a week rather than two or three times per week.

Use technology for creating charts and trend lines.

Establish progress monitoring as one of the learning stations.

Use educational assistants and/or tutors.

Measure during “independent level” instruction.

Use group-administered procedures when possible.

Page 15:

When is CBM administered? What is the frequency?

The frequency of assessment is determined by how often we want to decide whether a student needs an instructional change to increase achievement. The appropriate frequency differs for:

– A student above grade level

– A student at grade level

– A student below grade level

Page 16:

An Optimal Model Assessment Schedule

Above benchmarks (above the 65th percentile): 2–3 x/year
Below benchmarks (25th–65th percentile): 4–6 x/year
Significant help (5th–25th percentile): 2 x/month
Special education (below the 5th percentile): weekly
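A schedule like this is, in effect, a simple decision rule keyed on a student's screening percentile. A minimal sketch in Python, assuming the percentile is already known from screening (the function name and boundary handling are our own; the thresholds come from the slide above):

    def monitoring_frequency(percentile: float) -> str:
        """Map a screening percentile to a suggested CBM monitoring
        frequency, following the schedule above. Boundary cases
        (exactly the 25th or 65th percentile) are a local decision;
        here they fall into the more frequent tier."""
        if percentile > 65:
            return "2-3 times per year"   # above benchmarks
        if percentile > 25:
            return "4-6 times per year"   # below benchmarks
        if percentile >= 5:
            return "2 times per month"    # significant help
        return "weekly"                   # below the 5th percentile

    print(monitoring_frequency(72))  # 2-3 times per year
    print(monitoring_frequency(12))  # 2 times per month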

Page 17:

Advantages of Using CBM

Minimal cost

Time efficient

Widely used in the district

Highly correlated with state assessments

Rich research base

Page 18:

Concerns-Based Adoption Model (CBAM), Hall & Rutherford (1977)

[Figure: intensity of concern, from low to high, plotted across three stages: impact on self, management concerns, and system-level impact]

Page 19:

Concerns-Based Adoption Model (CBAM) (Hall & Rutherford)

Self concerns (“What will it mean for me?”)

Task concerns (“How do I do it?”)

Impact concerns (“How will it affect students/staff?” “Can we do it better?”)

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 20:

Leadership for Developing a School-wide Progress Monitoring System

Stan Deno, Erica Lembke, Amy Reschly

Leadership Team Activities
Leadership Team Content Module

Study Group Activities
Progress Monitoring Content Module

[email protected]
University of Minnesota

Page 21:

CBM Concerns

Self
– Time/resources
– Value/validity
– Accountability/consequences

Task
– Interpretation of data
– Intervention availability/feasibility/resources

Other
– (Remains to be seen)

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 22:

Time/Resource Concerns

Support from lead staff to develop efficient procedures for screening and progress monitoring:
– Recruiting volunteers/EAs
– Organizing materials
– Planning the process and schedule
– Collecting and organizing products

Assurance of required resources

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 23:

Value/Validity Concerns

Research-based effective practice

Linked to State Standards (Primary Level)
– Read, Listen, View: Literal Comprehension
• #3 “Pronouncing new words using phonic skills”
• #5 “Reading aloud fluently with expression”
– Correlates highly with MCAs & MBST (Reading)

Linked to curricula
– E.g., Houghton Mifflin’s Teacher Assessment Handbook

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 24:

Responding to Common Questions

How would you respond to the following commonly asked questions if asked by one of your staff?

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 25:

How can I do progress monitoring with all the other things I have to include in my literacy block?

There is a growing consensus that school improvement occurs when student performance outcomes are placed at the center of our attention. In this REA project we are going to have to order our priorities so that we view time spent monitoring student progress as just as important as time spent in instruction.

Results: The Key to Continuous School Improvement—Schmoker

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 26:

We already use the MCAs and another standardized achievement test to assess students. How are these measures different?

Standardized tests of achievement, like the MCAs, the Northwest Achievement Levels Tests, and the Iowa Tests of Basic Skills, are typically given once a year and provide an indication of student performance relative to peers at the state or national level. Conversely, curriculum-based measures are an efficient means of monitoring student performance on an ongoing basis. With CBM, we are able to detect whether students are, in fact, making progress toward an end goal and to monitor the effects of instructional modifications aimed at helping the student reach this goal.

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 27:

How is CBM different from running records? Or IRIs?

Running records and informal reading inventories (IRIs) focus on what might be taught in an effort to improve reading, whereas CBMs are outcome indicators that reflect the success of what is taught. A large body of research has shown that one-minute samples of the number of words read correctly from reading passages are sensitive, reliable, and valid measures of reading growth. If teachers find them useful, running records and IRIs can be used in conjunction with regular progress monitoring to help generate ideas for possible changes in students’ programs that can be evaluated using CBM.

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission
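For readers new to the metric, the words-correct score from a one-minute sample is simply words attempted minus errors, prorated to a minute if the timing ran short or long. A minimal sketch of that arithmetic (our own illustration, not the presenters' scoring procedure):

    def words_correct_per_minute(words_attempted: int, errors: int,
                                 seconds: float = 60.0) -> float:
        """Words read correctly per minute (WCPM) from a timed oral
        reading sample; prorates when the sample is not exactly 60 s."""
        correct = max(words_attempted - errors, 0)
        return correct * 60.0 / seconds

    # A student reads 104 words with 6 errors in a one-minute sample:
    print(words_correct_per_minute(104, 6))  # 98.0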

Page 28:

The measures are often called “curriculum-based.” Do we need to use our curriculum for progress measurement?

Research has shown that it isn’t necessary to use passages from the school’s curriculum to validly describe growth. What’s important is whether the passages used for monitoring are at a similar level of difficulty from one sample to the next. Using your own curriculum can be useful, but it isn’t necessary.

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 29:

My students’ oral reading scores bounce up and down from one passage to the next. Does this mean the data are unreliable?

There is no way to assure that all passages used are at the exact same level of difficulty. Passages (even taken from the same level) are going to vary. In addition to passage difficulty, student performance may vary from week to week for a number of reasons – lack of sleep, problems with friends, being hungry, etc. That’s why it is important to look at the overall trend of the data (it’s kind of like the stock market). Every data point that is collected adds stability to the measure of reading performance. This problem can be dealt with by measuring frequently (once a week) or by taking the median of 3 passages at each measurement period.

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission
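Both stabilizing tactics are easy to make concrete. In the sketch below (our own illustration, standard library only), the median of three passage scores is taken at each measurement period, and the series is summarized by a least-squares slope so that the overall trend, not any single bounce, drives the decision:

    import statistics

    def median_score(passage_scores):
        """Median of the 3 passage scores from one measurement
        period, damping passage-difficulty noise."""
        return statistics.median(passage_scores)

    def trend_slope(weekly_scores):
        """Least-squares slope (words correct gained per week)
        across the whole series of weekly scores."""
        n = len(weekly_scores)
        weeks = range(n)
        mean_x = statistics.mean(weeks)
        mean_y = statistics.mean(weekly_scores)
        num = sum((x - mean_x) * (y - mean_y)
                  for x, y in zip(weeks, weekly_scores))
        den = sum((x - mean_x) ** 2 for x in weeks)
        return num / den

    scores = [median_score(s) for s in
              [(41, 45, 38), (44, 40, 47), (43, 49, 46), (50, 46, 52)]]
    print(scores)               # [41, 44, 46, 50]
    print(trend_slope(scores))  # 2.9 words gained per week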

Page 30:

Should I have my students practice reading passages out loud for one minute?

No. Reading aloud is NOT the intervention—it is used as an indicator of growth in overall reading proficiency.

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 31:

Should I count words as wrong for ELL students, even if the student mispronounces a word due to an accent? Should I count words as wrong for students who speak with a different dialect?

We can decide whether to count pronunciations consistent with an accent or dialect as correct; however, counting rules must be consistent across students and teachers so that we can aggregate our data.

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 32:

Some of my students are making progress but they are still not meeting their goal. Should I lower their goal?

No, instead of lowering the goal, we might ask: is there anything I can do differently, or is there a need for an instructional change? And remember, there will be individual differences across students. Students will not always grow at the same rate.

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 33:

Supporting Teachers in Developing Their Progress Monitoring Procedures

Continuing to address task-level concerns

Page 34:

Goals of Teachers’ Study Group (Set-Up)

Identify and organize reading passages

Develop a plan for progress monitoring

Complete Fall screening

Set goals for individual students, establish classwide benchmarks, and begin progress monitoring

Implement a data-utilization rule for individual students and revise programs

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 35:

Goals (Follow-through)

Develop a plan for, schedule, and conduct the Winter Screening

Make data-based program evaluation and revision decisions about the classroom program

Complete Spring Screening and summarize outcomes

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 36:

Basic Plan

Teachers screen the entire class Fall–Winter–Spring (F-W-S) using the same 3 grade-level passages

Identify “At Risk” students (bottom 20–40%?); a sketch of one approach follows below

Monitor progress of At Risk students (weekly/biweekly)

Evaluate progress of individual At Risk students and revise programs as necessary

Evaluate class progress W-S and revise

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission
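One way to make the “bottom 20–40%” screen concrete is to rank the class on the median of the three grade-level screening passages and flag the lowest fraction. A minimal sketch under those assumptions (the cutoff fraction is a local decision, as the question mark on the slide suggests):

    import statistics

    def identify_at_risk(class_scores, fraction=0.20):
        """Flag the lowest-scoring fraction of the class as At Risk
        for weekly/biweekly monitoring. class_scores maps student
        name -> scores on the 3 grade-level screening passages."""
        medians = {name: statistics.median(scores)
                   for name, scores in class_scores.items()}
        cutoff = max(1, round(len(medians) * fraction))
        ranked = sorted(medians, key=medians.get)
        return ranked[:cutoff]

    fall = {"Ana": (52, 48, 55), "Ben": (31, 28, 35), "Cleo": (60, 66, 58),
            "Dev": (22, 25, 19), "Eli": (47, 45, 50)}
    print(identify_at_risk(fall, fraction=0.40))  # ['Dev', 'Ben']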

Page 37:

Timeline

July/August
– Decide on the level at which you will proceed (classroom, grade, or school-wide)
– Prepare materials
– Decide on a monitoring schedule
– Practice probe administration and scoring
– Develop a data-management system
– Develop background knowledge

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 38:

Timeline

September
– Conduct a Fall screening
– Identify students at risk
– Develop background knowledge

October
– Set classroom goals and establish benchmarks
– Prepare graphs for students who will be monitored
– Set short-term objectives and long-range goals for students who will be monitored
– Develop background knowledge

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 39:

Timeline

November
– Data utilization and decision making
– Implementing interventions
– Develop a plan and schedule the Winter screening
– Develop background knowledge

January/February
– Conduct a Winter screening
– Evaluate classroom progress relative to benchmarks
– Develop background knowledge

April/May
– Develop a plan and schedule the Spring screening
– Conduct a Spring screening
– Evaluate classroom progress relative to benchmarks

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 40:

Leadership Team Activities (Pre-Fall)

Review study group activities

Provide leadership in developing a plan for screening

Promote a discussion among the teachers about the role that data are going to play in school improvement

Find times for study groups

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 41:

Leadership Activities: Sep-Oct

Keep study groups moving forward

Assist teachers in completing the Fall screening

Participate in determining “At Risk” students

Collaborate in setting student goals and class-wide benchmarks

Secure assistance for teachers as they begin progress monitoring

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 42:

Leadership Activities: Nov.

Assist teachers in evaluating progress of At Risk students

Generate and select research-based interventions

Seek resources to support interventions

Schedule the Winter screening

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 43:

Leadership Activities: Jan-Feb

Complete the Winter screening

Review classroom and grade-level success in meeting benchmark standards

Consider class and grade program changes

Continue to meet with teachers to review individual student progress and seek research-based interventions

From Deno, Lembke, & Reschly—University of Minnesota
Do not reproduce without permission

Page 44:

Leadership Activities: Apr-May

Continue to support individual formative evaluation

Plan and implement the Spring screening

Assist teachers in summarizing outcomes

Aggregate school-wide data

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission

Page 45:

Data Aggregation System

Consider the types of questions you want to answer:
– How are the students growing F-W-S?
– How does growth compare across grades?
– (How does growth occur in classrooms?)
– How do different subgroups compare?

From Deno, Lembke, & Reschly—University of Minnesota

Do not reproduce without permission
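Each of these questions falls out of a tidy table of screening results with one row per student per season. A hypothetical sketch using pandas (the column names and data are ours, purely for illustration):

    import pandas as pd

    # One row per student per screening season.
    screens = pd.DataFrame({
        "student":  ["Ana", "Ana", "Ana", "Ben", "Ben", "Ben"],
        "grade":    [2, 2, 2, 3, 3, 3],
        "subgroup": ["ELL", "ELL", "ELL", "non-ELL", "non-ELL", "non-ELL"],
        "season":   ["Fall", "Winter", "Spring"] * 2,
        "wcpm":     [41, 52, 64, 70, 78, 91],
    })

    # How are the students growing F-W-S?
    print(screens.groupby("season")["wcpm"].mean())

    # Spring-minus-Fall growth per student, then compared across
    # grades and subgroups.
    wide = screens.pivot(index="student", columns="season", values="wcpm")
    wide["growth"] = wide["Spring"] - wide["Fall"]
    info = screens.drop_duplicates("student").set_index("student")
    by_student = wide.join(info[["grade", "subgroup"]])
    print(by_student.groupby("grade")["growth"].mean())
    print(by_student.groupby("subgroup")["growth"].mean())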

Page 46:

IV. Group Activity: Resources for Data Leaders

Page 47:

IV. Group Activity: Resources for Data Leaders

Leadership for Developing a School-wide Progress Monitoring System (Deno, Lembke & Reschly)

School Improvement Data Selection Tool (Heartland AEA, Iowa)

Page 48:

“In a survey of state education officials conducted for Technology Counts 2005 by the Education Week Research Center, 15 states reported that the 3-year-old No Child Left Behind Act had influenced their decisions to put in place bigger and better data-collection systems” (Education Week, May 5, 2005).

Page 49:

http://www.aea11.k12.ia.us/assessment/sidsst.pdf

Page 50:

V. Questions and Answers