
Page 1: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NEBRASKA STATE ACCOUNTABILITY

Leadership in Using NeSA Data

Data Conference April 18-19, 2011

Pat Roschewski [email protected]
Jan Hoegh [email protected]
John Moon [email protected]

Page 2: Leadership in Using NeSA Data Data Conference  April 18-19, 2011


Data Sources

Nebraska schools have many sources of data:
EXPLORE/PLAN/ACT
AIMSweb
DIBELS
Gates MacGinitie
Reading Theme Tests
Chapter Tests
Measures of Academic Progress
Developmental Reading Assessments (DRAs)
Curriculum-Based Measurement
Teacher Observation
Running Reading Records
Student Projects

Each assessment tool has a purpose and a role in the big picture of Continuous Improvement!

Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

Page 3: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Commit to data analysis as a continuous process, not an event.

(Reeves, 2009)

Page 4: Leadership in Using NeSA Data Data Conference  April 18-19, 2011


AND NOW . . .

Page 5: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Nebraska schools should use NeSA data to . . .

Provide feedback to students, parents, and the community.

Inform instructional decisions.

Inform curriculum development and revision.

Measure program success and effectiveness.

Promote accountability to meet state and federal requirements.


Page 6: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Leadership in Using NeSA Data

Session I
What is NeSA?
How do we access and interpret NeSA data?

Session II
How do we use NeSA data?

Page 7: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

What is NeSA?

Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

Page 8: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NeSA . . .
Criterion-referenced summative tests.
Measurement of the revised Nebraska Academic Standards for reading, mathematics, and science.
Tools that include multiple-choice items.
Tests administered to students online OR paper/pencil.

Page 9: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

Page 10: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NeSA . . .
Administered during the spring of the year.
Based on Tables of Specification and Performance Level Descriptors.
Built upon the best thinking of Nebraska educators, national experts, and a worthy partner – Data Recognition Corporation.

Page 11: Leadership in Using NeSA Data Data Conference  April 18-19, 2011


What is the author’s purpose for writing the story?

A. to inform the reader about chores for children
B. to persuade the reader to increase chore rates
C. to entertain about two children visiting a farm
D. to describe the benefits of living on a farm

Page 12: Leadership in Using NeSA Data Data Conference  April 18-19, 2011


Goals for Instruction

Page 13: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Where do we find the content components of NeSA?

www.education.ne.gov/Assessment

Tables of Specification
Performance Level Descriptors
Accommodations Guides
Webb's DOK documents
Other important NeSA documents

Page 14: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NeSA . . .
Produces a raw score that converts to a scale score of 0-200.
Allows for students to be classified into one of three categories: Below the Standards, Meets the Standards, Exceeds the Standards.
Provides comparability across Nebraska school buildings and districts.

Page 15: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

How do we access and interpret NeSA data?

Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

Page 16: Leadership in Using NeSA Data Data Conference  April 18-19, 2011


Page 17: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

What is the Data Reporting System (DRS)?

Secure Site – through portal
Public Site – NDE website http://drs.education.ne.gov

Commit to data analysis as a continuous process, not an event.
(Reeves, 2009)

Page 18: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Interpreting NeSA-R Data Reports

District/building level information
Individual student level information
Subgroup information
Indicator information

Page 19: Leadership in Using NeSA Data Data Conference  April 18-19, 2011


Use the Reports Interpretive Guide!

http://www.education.ne.gov/Assessment/documents/NESA.Read.InterpretiveGuide.pdf

Page 20: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

PERFORMANCE LEVELS - three possible categories of student performance on NeSA

NeSA Performance Levels

Exceeds the Standards

Meets the Standards

Below the Standards


Page 21: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

How are performance levels determined?

Cut score processes:
Contrasting Group Method – 400+ teachers
Bookmark Method – 100+ teachers

State Board of Education Review:
Examined results of both processes
Examined NAEP and ACT results for Nebraska
Made decisions within recommended range at public meeting

Page 22: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

RAW SCORE – the number of items a student answers ‘right’ on NeSA-R

On NeSA Reports:

Content Area   Points Possible   Points Earned   Student's Scale Score
Reading        42                21                126
Mathematics    42                21                127

On Conversion Chart:

Raw Score   Scale Score   Performance Level
25          200           Exceeds
24          167           Exceeds
23          148           Exceeds
22          135           Exceeds
21          126           Meets
20          118           Meets
19          111           Meets
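Read as data, the conversion chart is a plain lookup table from raw score to scale score. Below is a minimal Python sketch of that idea, using only the rows excerpted above; the names RAW_TO_SCALE and scale_score are illustrative, and the full official tables are published by NDE and DRC.

```python
# Minimal sketch (not NDE's implementation): the conversion chart above
# is a lookup from raw score to scale score. Only the rows shown on
# this slide are included.
RAW_TO_SCALE = {
    25: 200, 24: 167, 23: 148, 22: 135,
    21: 126, 20: 118, 19: 111,
}

def scale_score(raw: int) -> int:
    """Look up the scale score for a raw score in the chart excerpt."""
    if raw not in RAW_TO_SCALE:
        raise ValueError(f"raw score {raw} is outside this chart excerpt")
    return RAW_TO_SCALE[raw]

print(scale_score(21))  # 126, matching the Reading row above (21 points earned -> 126)
```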

Page 23: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

SCALE SCORE – a transformed version of the raw score a student earned on NeSA

Performance Level       Reading Scale-Score Range
Exceeds the Standards   135-200
Meets the Standards     85-134
Below the Standards     84 and below
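Because these ranges partition the whole 0-200 scale, assigning a performance level reduces to two threshold checks. A minimal sketch, assuming the reading cut scores shown above (other content areas and grades have their own cuts; the function name is illustrative):

```python
# Minimal sketch, assuming the NeSA-R reading ranges on this slide;
# other content areas and years have their own cut scores.
def reading_performance_level(scale_score: int) -> str:
    """Map a reading scale score (0-200) to its performance level."""
    if scale_score >= 135:
        return "Exceeds the Standards"
    if scale_score >= 85:
        return "Meets the Standards"
    return "Below the Standards"

print(reading_performance_level(126))  # "Meets the Standards"
```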

Page 24: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

What is the difference between a raw score and a scale score?

What is a raw score?

A raw score is the number of correct items. Raw scores have typically been used in classrooms as percentages: 18/20 = 90% correct.


Page 25: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

What is a scale score?

A scale score is a “transformation” of the number of items answered correctly to a score that can be more easily interpreted between tests and over time. The scale score maintains the rank order of students (i.e., a student who answers more items correctly gets a higher scale score). For NeSA, we selected 0-200 and will use it for all NeSA tests, including writing.


Page 26: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

Why convert raw scores to scale scores?

Raw scores are converted to scale scores in order to compare scores from year to year. Raw scores should not be compared over time because items vary in difficulty level. Additionally, raw scores should not be compared across different content area tests. Scale scores add stability to data collected over time that raw scores do not provide.


Page 27: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

~NeSA Terminology~

On score reports, why is the . . . SCALE SCORE CONVERTED TO PERCENTILE RANK?

The percentile rank was placed on the score reports because our Technical Advisory Committee felt that parents would want to know their child’s position in relation to other test takers.

A percentile rank of 84 means the child scored better than 84% of the students who took the test that year.
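That definition translates directly into a count over the year's scores. A minimal sketch of this slide's definition (percentile-rank conventions vary; some give ties half weight, so treat this as one reasonable reading, with made-up scores):

```python
# Minimal sketch of the definition on this slide: the percentage of
# that year's test takers whom the student scored better than.
def percentile_rank(student_score: float, all_scores: list[float]) -> float:
    below = sum(1 for s in all_scores if s < student_score)
    return 100.0 * below / len(all_scores)

scores = [110, 95, 126, 140, 88, 126, 101, 133, 120, 97]  # made-up scale scores
print(round(percentile_rank(126, scores)))  # 60: better than 6 of the 10 scores
```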


Page 28: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NeSA (CRT) vs. NRT?


--Differences--

Purposes:
NeSA is intended to match and measure identified standards and instruction.
NRT is not intended to measure any state's standards. The intention is to compare students to each other.

Item Development:
NeSA – items with an exact match to the standards; NDE had to prove the match with an independent alignment study.
NRT – no standards to match; matches inherent and previous knowledge, enriched homes, pre-skills.

Page 29: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NeSA (CRT) vs. NRT?


--Similarities--
All the psychometric steps – standard setting (Bookmark, Angoff, Contrasting Group)
Reliabilities – KR-20/KR-21 / inter-rater reliabilities
Descriptive statistics (item p-values, DIF analysis)

Administration:
Both are standardized and are generally administered the same way.
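For illustration only, here is a minimal sketch of two statistics named above, item p-values and the KR-20 reliability, computed from a matrix of 0/1 scored responses. These are the textbook formulas, not the testing contractor's operational code, and the data below is invented:

```python
# Minimal sketch of two statistics named on this slide.
# `responses` is a matrix of scored answers: rows = students,
# columns = items, entries 1 (correct) or 0 (incorrect).

def item_p_values(responses: list[list[int]]) -> list[float]:
    """Item p-value = proportion of students answering the item correctly."""
    n = len(responses)
    return [sum(row[i] for row in responses) / n
            for i in range(len(responses[0]))]

def kr20(responses: list[list[int]]) -> float:
    """Kuder-Richardson formula 20 reliability for dichotomous items."""
    k = len(responses[0])                      # number of items
    totals = [sum(row) for row in responses]   # each student's raw score
    mean = sum(totals) / len(totals)
    var_total = sum((t - mean) ** 2 for t in totals) / len(totals)
    pq = sum(p * (1 - p) for p in item_p_values(responses))
    return (k / (k - 1)) * (1 - pq / var_total)

data = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(item_p_values(data))    # [0.8, 0.6, 0.4, 0.2]
print(round(kr20(data), 3))   # 0.8
```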

Page 30: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

Leadership in Using NeSA Data

Session I
What is NeSA?
How do we access and interpret NeSA data?

Session II
How do we use NeSA data?

Page 31: Leadership in Using NeSA Data Data Conference  April 18-19, 2011

NeSA results ARE an important data source!

When combined with other information, these data can support curricular, instructional, and learning support decision making.

--It's all about the Continuous Improvement Process!
