NZNSA conference: Michael Johnston, 29 August 2012

Achieving consistency of teachers’ judgements and measuring progress against national standards: Curriculum knowledge deployed within a psychometric framework
Michael Johnston, 29 August 2012


DESCRIPTION

Dr Michael Johnston's presentation to the NZ Normal Schools conference, 29 August 2012.

TRANSCRIPT

Page 1

Achieving consistency of teachers’ judgements and measuring progress against national standards:
Curriculum knowledge deployed within a psychometric framework

Michael Johnston, 29 August 2012

Page 2

National Standards design

• Validity of teacher judgements is emphasised:
– Wide range of evidence
– Over-reliance on formal testing discouraged
– No national testing

• Reliability will be a challenge:
– Overall judgement must be formed from diverse evidence on multiple criteria

Page 3

Achieving reliability

• Important to teacher acceptance of the standards:
– Common understanding of standards
– Confidence that they can make judgements with reasonable national consistency

• Exemplars and moderation help but are not sufficient:
– A given student does not usually resemble the exemplar
– Social moderation is not designed to produce consistency

Page 4

The Progress and Consistency Tool (PaCT)

• Designed to support teachers to make reliable judgements without sacrificing validity.
– A diverse evidence base and professional judgement remain at the forefront. The framework gives structure to the professional judgement.

• Will provide a mechanism to measure progress.

Page 5

Design of PaCT

• Each sequence of standards (reading, writing, mathematics) is:
– Conceptualised as a developmental continuum, underpinned by a quantitative scale
– Decomposed into a set of aspects, each comprising a series of sequential items (stages)

• Each item comprises annotated illustrations.
• Collectively, the aspects comprise a rubric.
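The nesting described on this slide (a rubric made up of aspects, each aspect a series of sequential items or stages, and each item carrying annotated illustrations) maps onto a simple data structure. The sketch below is purely illustrative; the class and field names are assumptions, not part of the PaCT design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Illustration:
    """An annotated example of student work for one item."""
    student_response: str   # what the child does or says (work sample, interview transcript)
    annotation: str         # the interpretation of the response and why it 'fits'

@dataclass
class Item:
    """One sequential stage within an aspect."""
    description: str
    illustrations: List[Illustration] = field(default_factory=list)

@dataclass
class Aspect:
    """One strand of cognitive development / curriculum progression."""
    name: str
    items: List[Item] = field(default_factory=list)   # ordered stages, earliest first

@dataclass
class Rubric:
    """Collectively, the aspects for one sequence of standards."""
    domain: str                                        # e.g. 'reading', 'writing', 'mathematics'
    aspects: List[Aspect] = field(default_factory=list)
```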

Page 6

The form of the rubrics

[Slide image: an example rubric, with callouts indicating an item and an illustration]

Page 7

Guidelines and principles for developing progressions

1. Aspects should represent salient stages of cognitive development and/or curriculum progression.

2. Aspects should cover the range of relevant cognitive functions important to the standard.

3. The number of aspects for each standard should be manageable for teachers.

4. Observational evidence for each item (stage) of each aspect must be readily available to, and recognisable by, teachers.

Page 8

Guidelines and principles for developing progressions (ctd.)

5. Aspects should be associated with cognitive development and learning rather than with year levels.

6. The aspects defined for a given standard need not all entail the same numbers of stages.

7. The extent of each stage on each progression in relation to the overall continuum will be determined empirically, not defined by the standards writers.

Page 9

Aspects for the mathematics rubric (Gill Thomas)

Strand: Number and algebra
• Additive thinking
• Multiplicative thinking
• Patterns and relationships
• Using symbols and expressions

Strand: Measurement and geometry
• Measurement sense
• Geometrical thinking

Strand: Statistics
• Statistical investigations
• Statistical literacy and probability

Page 10

Aspects for the reading rubric (Sue Douglas)

1. Making sense of text
2. Applying vocabulary knowledge (with Writing)
3. Using reading to organise for learning
4. Locating and using information and ideas in informational texts (continuous and non-continuous/in print)
5. Locating and using information and ideas in informational texts (continuous and non-continuous/on-line)
6. Interpreting and responding to ideas, information and experiences in literary texts (in print and on-line)
7. Identifying and reflecting on the way writers use ideas and language to influence their readers.

Page 11

 

Aspects for the writing rubric (Sue Douglas)

1. Encoding
2. Sentence structure (different types of sentences & sentence beginnings; grammar; punctuation)
3. Applying vocabulary knowledge (with Reading)
4. Using writing to think and organise for learning
5. Using writing (in print) to communicate knowledge and understanding
6. Using writing (online) to communicate knowledge and understanding
7. Creating texts for literary purposes
8. Creating texts to influence others

Page 12

The components of an illustration

• Student response: what the child does or says (work sample, interview transcript)
• Annotation: the interpretation of the response & why it ‘fits’

Prepared for TAG, 14 August 2012

Page 13

[Slide image: judgement response options ‘More expert’, ‘Less expert’, ‘About the same’, ‘Unsure’]

Page 14

Implementing the framework

• Defining the rubrics; describing the aspects and items

• Developing the illustrations
• Collection and analysis of sample data

– Determine quality of rubrics, items and illustrations

– Calibrate the measurement scale

• Standard setting

Page 15

Developing the framework

Analysis of sample data:
- Collect judgements on each aspect for a sample of students
- Sample should be representative of year levels, geographic regions, school deciles and demographics

- The scale for each sequence of standards will be constructed from these data using item-response analysis.
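The slides do not name a specific item-response model. For aspects made up of ordered stages, one common choice is the Rasch partial credit model; the formulation below is offered only as an illustrative assumption, with $\theta_n$ the location of student $n$ on the scale, $\delta_{ij}$ the difficulty of reaching stage $j$ of aspect $i$, and $m_i$ the highest stage of aspect $i$:

$$
P(X_{ni} = x) \;=\; \frac{\exp\!\Big(\sum_{j=1}^{x} (\theta_n - \delta_{ij})\Big)}{\sum_{k=0}^{m_i} \exp\!\Big(\sum_{j=1}^{k} (\theta_n - \delta_{ij})\Big)}, \qquad x = 0, 1, \ldots, m_i,
$$

where the empty sum (for $x = 0$ or $k = 0$) is taken to be zero. Fitting a model of this kind to the sample judgements places each stage of each aspect, and each student, on a common logit scale.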

Page 16

Utility of item response analysis

• Creates a scale with equal-interval properties.
– Ideal for measuring progress

• Locates each stage of each progression on this overall scale.

Page 17

Developing the framework

Standard setting
– When the overall continuum for each sequence of standards has been constructed, a standard-setting exercise will be required to define overall Year-level boundaries.

– Standard setting will require holistic judgements by expert judges made using the same evidence base as that supporting the scale construction exercise.

– Important to be clear that boundaries are stochastic, not deterministic
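One way to read ‘stochastic, not deterministic’ is that a year-level boundary at scale location $c$ does not split students cleanly into above and below; rather, the probability of performing at or above the standard rises smoothly with the student’s scale location $\theta$. A simple illustrative form (an assumption, not taken from the slides) is logistic:

$$
P(\text{at or above the standard} \mid \theta) \;=\; \frac{1}{1 + \exp\!\big(-(\theta - c)\big)},
$$

which is exactly 0.5 at the boundary and approaches 0 or 1 only well away from it.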

Page 18

Rubric design: Standard setting

[Slide image: rubric grid with ticked cells]

Page 19

Test information

• Teachers can also enter results for a selection of tests; e.g.,

– PAT
– GLOSS
– asTTle

• It is envisaged that test information will serve as a self-moderation process.
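The slides do not spell out how self-moderation with test information would work. One plausible reading, sketched here with hypothetical function and variable names, is that a test result mapped onto the same scale gives the teacher a cross-check on the location implied by their own judgements:

```python
def self_moderation_check(judgement_logit: float,
                          test_logit: float,
                          test_se: float,
                          z_threshold: float = 2.0) -> str:
    """Compare a teacher-judgement location with a test-based location on the
    same scale and flag large discrepancies for re-examination.
    The names and the threshold are illustrative assumptions."""
    z = (judgement_logit - test_logit) / test_se
    if abs(z) > z_threshold:
        return ("Judgement and test disagree by more than expected: "
                "revisit the evidence for this student.")
    return "Judgement and test information are broadly consistent."

# Example: judgements place the student at 1.3 logits; a test places them
# at 0.4 logits with a standard error of 0.35 logits.
print(self_moderation_check(1.3, 0.4, 0.35))
```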

Page 20

Bands showing probable location of a student

[Slide chart: 68% and 95% confidence bands showing the probable location of a student on the continuum, for observational evidence and for test evidence, against Year 1 to Year 4]
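Under the usual normal-error assumption, the 68% and 95% bands shown here correspond to roughly ±1 and ±1.96 standard errors around the estimated scale location. A minimal sketch (function and variable names, and the example numbers, are illustrative only):

```python
def confidence_bands(theta: float, se: float) -> dict:
    """Approximate 68% and 95% bands around a scale location,
    assuming normally distributed measurement error."""
    return {
        "68%": (theta - se, theta + se),
        "95%": (theta - 1.96 * se, theta + 1.96 * se),
    }

# Example: a student located at 0.8 logits with a standard error of 0.4 logits.
print(confidence_bands(0.8, 0.4))
```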

Page 21

Measuring and reporting progress

[Slide chart: performance in logits (vertical axis, -2 to 5) plotted against year and schooling year (2011, Year 2; 2012, Year 3; 2013, Year 4; 2014, Year 4; 2015, Year 5), with the standards for Years 2 to 5 shown for comparison]
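Because the scale is equal-interval (logits), progress is simply the difference between a student’s locations in successive years, and it can be compared with how far the standard itself moves between year levels. A toy example with invented numbers:

```python
# Illustrative only: scale locations (in logits) for one student and for the
# year-level standards; the numbers are made up, not read off the slide.
student = {"2012 (Year 3)": 0.1, "2013 (Year 4)": 0.9}
standard = {"Year 3": 0.3, "Year 4": 1.0}

progress = student["2013 (Year 4)"] - student["2012 (Year 3)"]   # 0.8 logits
expected = standard["Year 4"] - standard["Year 3"]                # 0.7 logits
print(f"Progress: {progress:.1f} logits; movement in the standard: {expected:.1f} logits")
```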

Page 22

Problematic issues

• Making the tool mandatory

• Housing and usage of data