
Chad Buckendahl (for Susan Davis), Alpine Testing Solutions

Chris Domaleski, Center for Assessment

Kris Ellington, Florida Department of Education

Tim Parshall, University of Missouri

Gay Pickner, South Dakota Department of Education

Teri Siskind, South Carolina Department of Education

National Conference on Student Assessment

June 2010, Detroit, Michigan

Chad W. Buckendahl, Alpine Testing Solutions

Context of the impact

Range of questions about impact that may inform programs

Range of stakeholders (e.g., educators, public, policymakers)

Defining goals and anticipated action plans as part of the design in advance

Mehrens (1998)

Curricular and instructional impact

Teacher motivation and stress

Student motivation and self-concept

Changes in student achievement

Public awareness of student achievement

Lane, Parks, & Stone (1998)

Multi-level stakeholder consideration (e.g., classroom, school, district, state, national)

Tim Parshall, Associate Director, Assessment Resource Center, College of Education, University of Missouri

Background

Peer Review

Technical Advisory Committee

Department of Elementary & Secondary Education

Assessment Resource Center

Outstanding Schools Act (1993) v. NCLB (2001)

Plan (2012) v. Reality (2009)

Focus Groups & Interviews

236 participants from 14 stakeholder groups, representing a variety of Missouri locales, grade levels, etc.

What are the consequences of MAP testing under NCLB?

Superintendents

Curriculum Coordinators

Data Professionals

Principals

Focus Groups & Interviews, cont.

Library/Media Specialists

Teachers

Students

Parents

School Board Members

Business Community Leaders

Focus Groups & Interviews, cont.

Representatives from Testing Industry

Regional Professional Development Center Staff

Teacher Educators

DESE Staff

Surveys

Seven surveys, with response rates ranging from 27% (parents) to 86% (superintendents)

Superintendents (86%)

School Board Members (44%)

Principals (68%)

Surveys, cont.

Teachers (36%)

Students (63%)

Parents (27%)

DESE/RPDC Staff (50%)

Reporting

Written reports to DESE June 2007

September 2008

June 2009

September 2009

Updates to and input from TAC

Reports to State Board of Education

AERA presentation in 2009

Major Findings

Student motivation

Teacher frustration/morale

Responsibility/accountability: learning/test scores

Increased emphasis on targeted content

Greater collaboration among teachers

Misunderstandings, including about intent vis-à-vis good/bad: unintended consequences are not necessarily “bad”

Major Applications

Overcoming the challenge of communication

Reducing misunderstandings about testing and accountability: leaders, teachers, students, parents

Discovering what all those groups want out of testing

Reducing “Content Drag” at the local level

Establishing End-of-Course exams

Teri Siskind, South Carolina Department of Education

Does this accountability system do what it is intended to do?

South Carolina Education Accountability Act

Push schools and students toward higher performance

Furnish clear and specific information about school and district academic performance and other performance to parents and the public

Stimulate quality teaching and learning practices and target assistance to low performing schools

Improve student performance and reduce gaps in performance

Does this accountability system do what it is intended to do?

No Child Left Behind

Has student achievement increased?

Have achievement gaps narrowed?

4th Grade Mathematics

[Figure: Percent At/Above Basic for 4th grade mathematics, 1996-2009, on NAEP, PACT, and PASS. Timeline annotations mark the EAA passage, first report card, NCLB becoming law, the AYP baseline, increase in rigor, EAA amendments, AYP steps, and the formative adoption list. Notes: in 2009 PACT was replaced with PASS; no accommodations permitted; final year off-grade-level testing counted.]

8th Grade Mathematics

[Figure: Percent At/Above Basic for 8th grade mathematics, 1996-2009, on NAEP, PACT, and PASS, with the same timeline annotations as the 4th grade chart. Notes: in 2009 PACT was replaced with PASS; no accommodations permitted; final year off-grade-level testing counted.]

Performance Category   Point Value   Number of Students   Points
Advanced               5             10                   50
Proficient             4             10                   40
Basic                  3             10                   30
Below Basic 2          2             10                   20
Below Basic 1          1             10                   10

In the example above: Absolute Index = (50 + 40 + 30 + 20 + 10) / 50 = 150 / 50 = 3.0 (a computational sketch follows the rating ranges below)

Rating           Range of Indexes
Excellent        3.9 and above
Good             3.5-3.8
Average          3.1-3.4
Below Average    2.7-3.0
Unsatisfactory   Below 2.7
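To make the index arithmetic concrete, here is a minimal sketch that uses the point values and rating cutoffs from the tables above with hypothetical category counts; the function and variable names are illustrative only, not part of the South Carolina system.

```python
# Sketch only (not the state's actual implementation): compute the absolute
# index as a weighted mean of point values, then map it to a rating band
# using the cutoffs from the tables above.

POINT_VALUES = {
    "Advanced": 5,
    "Proficient": 4,
    "Basic": 3,
    "Below Basic 2": 2,
    "Below Basic 1": 1,
}

def absolute_index(counts):
    """counts: dict of performance category -> number of students."""
    total_students = sum(counts.values())
    total_points = sum(POINT_VALUES[cat] * n for cat, n in counts.items())
    return total_points / total_students

def rating(index):
    """Map an index to the rating bands above (indexes reported to one decimal)."""
    if index >= 3.9:
        return "Excellent"
    if index >= 3.5:
        return "Good"
    if index >= 3.1:
        return "Average"
    if index >= 2.7:
        return "Below Average"
    return "Unsatisfactory"

# Example from the table above: 10 students in each of the five categories.
counts = {cat: 10 for cat in POINT_VALUES}
idx = absolute_index(counts)   # (50 + 40 + 30 + 20 + 10) / 50 = 3.0
print(idx, rating(idx))        # 3.0 Below Average
```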

Year   Minimum Average Index   Achieved Mean Index
2001   2.6                     2.89
2002   2.6                     2.93
2003   2.6                     2.96
2004   2.7                     3.08
2005   2.8                     3.09
2006   2.9                     3.05
2007   3.0                     3.04
2008   3.1                     3.10

In 2001, 168 schools were rated Excellent; in 2008, 89 schools were rated Excellent.

In 2001, 71 schools were rated Unsatisfactory; in 2008, 195 schools were rated Unsatisfactory.

Gay Pickner, South Dakota Department of Education

Part of Peer Review

What SD did

Hired a vendor

Carried out our plan

What we learned from first part

Conducted a dissemination meeting

What we learned from the dissemination meeting

Our plans before RRT and reauthorization

Our plans now

Chris Domaleski, Center for Assessment

Implemented in various forms starting in 1999-2000

Policy determines promotion for students in grades 4 and 8 based on performance on the Mathematics and English Language Arts components of the LEAP test

Legislative resolution called for the Board to study the negative and positive effects

Louisiana’s broad-based Accountability Commission took up this task throughout 2009-2010

Are selected subgroups (e.g. students with disabilities) disproportionately retained?

Is there evidence that remediation programs are effective?

Is there evidence that retaining students improves performance in subsequent grades?

Does the policy increase drop-outs?

Does the testing and retention policy create barriers that prevent students from progressing to middle school or high school?

What can be learned from extant research and from applications of similar policies?

2000-01: Kindergarten students

2001-02: Cohort: Grade K, Grade >=1; Unmatched: Dropouts, Other

2002-03: Cohort: Grade K, Grade 1, Grade >=2; Unmatched: Dropouts, Other

2003-04: Cohort: Grade <=2, Grade >=3; Unmatched: Dropouts, Other

2004-05: Cohort: Grade <=3, Grade >=4; Unmatched: Dropouts, Other

2005-06: Cohort: Grade <=3, Grade =4, Grade >=5; Unmatched: Dropouts, Other

2006-07: Cohort: Grade <=3, Grade =4, Grade =5, Grade >=6; Unmatched: Dropouts, Other

How many students are still in 4th grade 2 years later?

How many students were retained following Kindergarten or 1st grade?

Is there a spike in dropouts?
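To illustrate how the cohort questions above might be answered, here is a minimal sketch assuming hypothetical student-level records of (student ID, school year, grade placement); the record layout and function names are assumptions for illustration, not Louisiana’s actual data structure or analysis.

```python
# Sketch only: follow a kindergarten cohort forward through yearly enrollment
# records and tally grade placement plus students who can no longer be matched
# (possible dropouts or other exits). The record layout (student_id, school_year,
# grade, with grade 0 = kindergarten) is a hypothetical simplification.
from collections import Counter

def cohort_summary(records, cohort_start_year):
    """records: iterable of (student_id, school_year, grade) tuples."""
    # Cohort = students enrolled in kindergarten (grade 0) in the starting year.
    cohort = {sid for sid, year, grade in records
              if year == cohort_start_year and grade == 0}
    by_year = {}
    for sid, year, grade in records:
        if sid in cohort:
            by_year.setdefault(year, {})[sid] = grade
    summary = {}
    for year in sorted(by_year):
        placements = by_year[year]
        summary[year] = {
            "grade_distribution": Counter(placements.values()),
            "unmatched": len(cohort) - len(placements),  # dropouts or other
        }
    return summary

# Answering the questions above against the summary:
#   "How many students are still in 4th grade 2 years later?"
#       summary[year_cohort_first_reached_grade4 + 2]["grade_distribution"][4]
#   "Is there a spike in dropouts?"
#       compare the "unmatched" counts across consecutive years
```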

Each approach addressed broad claims about the goals/outcomes associated with the assessment program

The methods considered impact at different levels (e.g. student, school, teacher)

A variety of evidence was considered to inform decisions

Each state considered how to use the results to improve practice

It is important to specify a comprehensive Theory of Action

What are the intended outcomes?

How does the assessment support these outcomes?

What intermediate actions/supports are needed to achieve these outcomes?

Consider the evaluation plan as an important component that is built into the Theory of Action (not an afterthought). What evidence will bolster the credibility of the claims?

Distinguish between threats related to the precision of the measure versus those associated with policy or intended use. Each requires a distinct response.

Adverse impact could be related to a flaw in the assessment. For example, it is not well aligned to the content standards.

Alternatively, undesirable outcomes may be associated with claims in the TOA that lack support. For example, strategies to support struggling students are not effective.
