Educator Evaluations, TSDL, Growth, VAMs
Office of Psychometrics, Accountability, Research and Evaluation

Upload: lemuel

Post on 23-Jan-2016


DESCRIPTION

Educator Evaluations, TSDL, Growth, VAMs. Office of Psychometrics, Accountability, Research and Evaluation. Important Dates - Overview. During school years 2011/12 and 2012/13, Educator Evaluation Systems are locally determined, but evaluations must be based on student growth measures.

TRANSCRIPT

Page 1: Educator Evaluations, TSDL, Growth, VAMs

Educator Evaluations, TSDL, Growth, VAMs

Office of Psychometrics, Accountability, Research and Evaluation

Page 2: Educator Evaluations, TSDL, Growth, VAMs

Important Dates - Overview

During school years 2011/12 and 2012/13, Educator Evaluation Systems are locally determined, but evaluations must be based on student growth measures.

Data from local, state, and nationally standardized assessments should be integrated if/where available along with other evidence of growth from portfolios, behavior rubrics, etc.

Page 3: Educator Evaluations, TSDL, Growth, VAMs

Requirements

Report one of four labels required by legislation in REP:
• Highly effective
• Effective
• Minimally effective
• Ineffective

The Governor’s Council* will develop a tool to be used by districts beginning in 2013-14.

• *Renamed the Michigan Council for Educator Effectiveness as part of DTMB as of April 30, 2012.

Page 4: Educator Evaluations, TSDL, Growth, VAMs

MCEE’s Interim Progress Report

Michigan Council for Educator Effectiveness (MCEE) issues its Interim Report.

MCEE is recommending that the state start with a pilot program that will be tested in 12 school districts during the 2012-13 school year.

Page 5: Educator Evaluations, TSDL, Growth, VAMs

Detroit Free Press – 4/27/2012

• MCEE concluded a pilot is imperative, saying rushing to develop a system "would be reckless, both fiscally and technically."

• During the pilot, districts would test three tools for observing teachers and several models for using student assessment to evaluate teachers.

• Results of the pilot will be used to develop MCEE’s final recommendations.

• http://www.freep.com/article/20120428/NEWS06/204280401/Council-says-state-should-start-small-on-educator-evaluation-system?odyssey=mod%7Cnewswell%7Ctext%7CFRONTPAGE%7Cs

Page 6: Educator Evaluations, TSDL, Growth, VAMs

Current Overview of Dates

School Year | Tool Type | % of evaluation based on student growth & achievement data | Reporting Requirement
2011-2012 | locally determined Educator Evaluation Systems | significant part* | effectiveness labels in June REP collection
2012-2013 | locally determined Educator Evaluation Systems & MCEE Pilot (if legislatively approved) | significant part* | effectiveness labels in June REP collection
2013-2014 | Governor’s Council Evaluation Tool | 25% | effectiveness labels in June REP collection
2014-2015 | Governor’s Council Evaluation Tool | 40% | effectiveness labels in June REP collection
2015-2016 | Governor’s Council Evaluation Tool | 50% | effectiveness labels in June REP collection

Page 7: Educator Evaluations, TSDL, Growth, VAMs

Who MUST be evaluated?

• Based on the code used to report the employee in the REP.
• Visit www.michigan.gov/CEPI.
  o Click on CEPI Applications on the left
  o Then, click on Registry of Educational Personnel on the left
  o Scroll down to EOY 2012 REP Preview
  o Click on EOY 2012 REP Data Descriptions and go to page 71.

Page 8: Educator Evaluations, TSDL, Growth, VAMs

Who MUST be evaluated?

• Required Reporting Codes

Assignment Code | Description
“000AX” through “000ZZ” (except “00SUB”, “00PAR”, and “00200” through “00413”) | Teachers
“00192” through “00197” | Teachers
“00501” through “00598” | Teachers
“Y*0AX” through “Y*0ZZ” (except “Y*014” or “Y*016”) | Teachers
“60300” and “60400” | Teachers
“70***”, “71***”, “72***”, “73***”, and “74***” (Example: 70100: ISD Superintendent) | Superintendents, Assistant Superintendents, Administrators, Principals, and Assistant Principals

Page 9: Educator Evaluations, TSDL, Growth, VAMs

Who is OPTIONAL to evaluate?

• Optional Reporting Codes

Assignment Code | Description
“Y*014” or “Y*016” | Paraprofessionals/Aides
“00SUB” and “00PAR” | Day-to-day substitute staff members
“00200” through “00407” | Additional Special Education Staff Members
“00410” through “00413” | Migrant Education Program Paraprofessionals/Aides
“60100” through “60700” (except “60300” and “60400”) | Early Childhood Staff Members
“75***” through “79*99” | Administrative Positions
“81500” through “99900” | Non-Instructional Staff Members

Page 10: Educator Evaluations, TSDL, Growth, VAMs

TSDL

The Teacher-Student Data Link: what it is and how it could be used as part of a district evaluation system

Page 11: Educator Evaluations, TSDL, Growth, VAMs

Teacher/Student Data Link

Data initiative to link each student to the courses he/she took and to the teachers who taught those courses

Required under State Fiscal Stabilization Fund as a deliverable

Will mature in the coming years to be able to provide measures and information over time

Page 12: Educator Evaluations, TSDL, Growth, VAMs

State-provided measures

• Extremely limited, so districts choose which “pieces” make sense in their local context

• Generated for each educator of students in tested grades, regardless of subject taught.

• BUT “growth”, or Performance Level Change (PLC), exists only for reading and mathematics for MEAP and MI-Access FI in grades 4-8

Page 13: Educator Evaluations, TSDL, Growth, VAMs

How does the TSDL Work?

• Teachers are linked to courses
• Students are linked to courses
• For each course taught, a teacher has a list of students who were reported as taking that course.

Spring 2011 assessment data and fall 2011 assessment data will be attributed to teachers from the 2010-2011 school year

“Feeder school” for fall assessment data

Page 14: Educator Evaluations, TSDL, Growth, VAMs

Linking assessment data to students

Once teachers are linked to students, the TSDL file provides:
• Performance level change (PLC) for MEAP and MI-Access FI in reading and mathematics for each teacher where available (regardless of subject taught) in grades 4-8.
• Performance level in writing, science, social studies, reading and mathematics for each teacher where available (regardless of subject taught) across all tested grades.

Page 15: Educator Evaluations, TSDL, Growth, VAMs

Performance Level Change (“growth”)

Rows: Year X Grade Y MEAP performance level. Columns: Year X+1 Grade Y+1 MEAP performance level.
(SD = Significant Decline, D = Decline, M = Maintain, I = Improvement, SI = Significant Improvement)

                        Not Proficient    Partially Prof.   Proficient        Adv
Year X level            Low   Mid   High  Low   High        Low   Mid   High  Mid
Not Proficient Low      M     I     I     SI    SI          SI    SI    SI    SI
Not Proficient Mid      D     M     I     I     SI          SI    SI    SI    SI
Not Proficient High     D     D     M     I     I           SI    SI    SI    SI
Partially Prof. Low     SD    D     D     M     I           I     SI    SI    SI
Partially Prof. High    SD    SD    D     D     M           I     I     SI    SI
Proficient Low          SD    SD    SD    D     D           M     I     I     SI
Proficient Mid          SD    SD    SD    SD    D           D     M     I     I
Proficient High         SD    SD    SD    SD    SD          D     D     M     I
Advanced Mid            SD    SD    SD    SD    SD          SD    D     D     M
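The matrix above follows a simple ordinal pattern: order the nine sub-levels from Not Proficient Low up to Advanced Mid, and the PLC code depends only on how many sub-levels the student moved from Year X to Year X+1. A minimal sketch of that rule (the short level names are our own labels, not official codes):

```python
# Nine ordered performance sub-levels from the PLC matrix.
# Assumption: the PLC code is determined by the number of sub-levels moved,
# a rule inferred from the matrix above.
LEVELS = [
    "NP-Low", "NP-Mid", "NP-High",   # Not Proficient
    "PP-Low", "PP-High",             # Partially Proficient
    "P-Low", "P-Mid", "P-High",      # Proficient
    "Adv-Mid",                       # Advanced
]

def plc(year_x: str, year_x1: str) -> str:
    """Return the Performance Level Change code for a pair of sub-levels."""
    diff = LEVELS.index(year_x1) - LEVELS.index(year_x)
    if diff >= 3:
        return "SI"   # Significant Improvement
    if diff >= 1:
        return "I"    # Improvement
    if diff == 0:
        return "M"    # Maintain
    if diff >= -2:
        return "D"    # Decline
    return "SD"       # Significant Decline
```

Spot-checking against the matrix: Not Proficient Low to Partially Proficient Low (three sub-levels up) gives SI, and Advanced Mid to Proficient High (one sub-level down) gives D.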

Page 16: Educator Evaluations, TSDL, Growth, VAMs

Access to TSDL data

A TSDL User role must be established in the Secure Site to access the data at the district or school level

Spring Assessments/High school link available through the Secure Site as of January.

Fall Assessments (Elementary and Middle) TSDL through the Secure Site as of March.

Page 17: Educator Evaluations, TSDL, Growth, VAMs

After downloading the TSDL File

District/school needs to adjust each list based on rules like:
• student attendance
• subject taught match
• grade taught
• other local factors

Page 18: Educator Evaluations, TSDL, Growth, VAMs

Sample Components of Evaluation

Page 19: Educator Evaluations, TSDL, Growth, VAMs

Using PLC Data with MDE Tool

• This year, the TSDL provides PLC data linked to teachers to districts for integration into local systems along with an optional tool.

• These are general guidelines/suggestions for reading and math in grades 4-8, NOT requirements

**Currently, MDE is working with districts in pilot programs to research the most valid way to use PLC and other assessment data in value-added models and educator evaluation systems.

Page 20: Educator Evaluations, TSDL, Growth, VAMs

One Possible Method Using MDE Tool

STEP #1
• Download TSDL file through BAA Secure Site
• Apply rules regarding which students “count” toward a teacher’s evaluation (i.e. attendance rules)
• Consider de-duplication of records
• Paste your modified TSDL data into the Weighted PLC Tool
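The filtering and de-duplication in STEP #1 can be sketched as follows. The field names (student_id, teacher_pic, course, days_enrolled) and the attendance rule are illustrative assumptions, not the actual TSDL column names or MDE business rules:

```python
# Hypothetical STEP #1 business rules applied to a downloaded TSDL extract:
# drop students below an attendance threshold, then de-duplicate
# student/teacher/course links.
def prepare_tsdl(records, min_days=90):
    """Apply an attendance rule and de-duplicate roster rows."""
    seen = set()
    kept = []
    for rec in records:
        # Rule: only students enrolled long enough "count" toward the teacher.
        if rec["days_enrolled"] < min_days:
            continue
        key = (rec["student_id"], rec["teacher_pic"], rec["course"])
        if key in seen:   # drop duplicate links for the same student/course
            continue
        seen.add(key)
        kept.append(rec)
    return kept
```

The cleaned list is what would then be pasted into the Weighted PLC Tool.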

Page 21: Educator Evaluations, TSDL, Growth, VAMs

One Possible Method Using MDE Tool

STEP #2: Determine/adjust the weights for the PLCs in the tool (calculations adjust automatically). Default weights in the MDE TSDL Weighted PLC Tool:

                 Sig. Improve   Improve   Maintain   Decline   Sig. Decline
Proficient            2            1          1        -1          -2
Not Proficient        2            1          0        -1          -2
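The default weights reduce to a small lookup table keyed by proficiency status and PLC code; a sketch of how a district might apply them to a roster:

```python
# Default weights from the MDE TSDL Weighted PLC Tool, keyed by whether the
# student was proficient and the Performance Level Change code.
DEFAULT_WEIGHTS = {
    (True,  "SI"): 2, (True,  "I"): 1, (True,  "M"): 1, (True,  "D"): -1, (True,  "SD"): -2,
    (False, "SI"): 2, (False, "I"): 1, (False, "M"): 0, (False, "D"): -1, (False, "SD"): -2,
}

def weighted_plc(students):
    """Sum the weight for each (proficient, plc) pair on a teacher's roster."""
    return sum(DEFAULT_WEIGHTS[(s["proficient"], s["plc"])] for s in students)
```

Note the one asymmetry: maintaining a proficient student earns +1, while maintaining a non-proficient student earns 0.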

Page 22: Educator Evaluations, TSDL, Growth, VAMs

One Possible Method Using MDE Tool

STEP #3: Look at the results at various levels: what is the Weighted PLC at the district, school, grade, and/or subject level? What is a reasonable Weighted PLC for teachers to show?

Note: The possible range using this Weighted PLC method is from -2 to 2. A value of 0 means that, on average, you are maintaining your proficient students. If using different weights, it is necessary to determine the range and meaning of the results.

Page 23: Educator Evaluations, TSDL, Growth, VAMs

Example: Determining Thresholds

• In Sunshine School: the weighted PLC is .643 for math at the school level

Considerations:
• Positive Weighted PLC = effective
• Negative Weighted PLC = minimally effective
• Determine thresholds for highly effective or ineffective
• Set the bar based on the school level: teachers should at least meet the school-level weighted PLC.
• For example, for a teacher to be considered effective for this portion of the evaluation, he/she must have a Weighted PLC of .60 or greater.

Page 24: Educator Evaluations, TSDL, Growth, VAMs

Student   Math PL   PLC   Weighted PLC
Johnny    3         SI     2
Tammy     3         I      1
Chloe     2         M      1
Jose      1         M      1
Frank     2         D     -1
Sally     2         D     -1
Carla     4         M      0
Martin    3         M      0

Number of students: 8
Total WPLC: 3

Page 25: Educator Evaluations, TSDL, Growth, VAMs

Using weighted PLC and thresholds

• To calculate the teacher’s percent of students demonstrating growth, divide Weighted PLC by number of students: 3/8 = .375

• If target for “effective” was .643, this teacher did not meet the “effective” threshold.

• BUT, if the target for effective was having a positive Weighted PLC (>0), this teacher would have met it.

• Use this as one “growth” component of a multi-measure evaluation system
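The Page 24/25 numbers can be reproduced end to end. One assumption, inferred from the weights applied in the example: MEAP performance levels 1-2 count as “proficient” and levels 3-4 as “not proficient”:

```python
# Default weights: (proficient?, PLC code) -> weight.
WEIGHTS = {
    (True,  "SI"): 2, (True,  "I"): 1, (True,  "M"): 1, (True,  "D"): -1, (True,  "SD"): -2,
    (False, "SI"): 2, (False, "I"): 1, (False, "M"): 0, (False, "D"): -1, (False, "SD"): -2,
}

# Roster from the Page 24 example: (student, math performance level, PLC).
roster = [
    ("Johnny", 3, "SI"), ("Tammy", 3, "I"), ("Chloe", 2, "M"),
    ("Jose", 1, "M"), ("Frank", 2, "D"), ("Sally", 2, "D"),
    ("Carla", 4, "M"), ("Martin", 3, "M"),
]

# Assumption: performance levels 1-2 are "proficient", 3-4 are not.
total = sum(WEIGHTS[(pl <= 2, plc)] for _, pl, plc in roster)  # Total WPLC: 3
average = total / len(roster)                                  # 3/8 = 0.375

print(average >= 0.643)  # misses an "effective" target of .643
print(average > 0)       # but meets a "positive Weighted PLC" target
```

As the slide notes, the same teacher passes or fails depending entirely on where the threshold is set.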

Page 26: Educator Evaluations, TSDL, Growth, VAMs

Paste the modified* TSDL data into the Weighted PLC tool.

Page 27: Educator Evaluations, TSDL, Growth, VAMs

School Level Weighted PLC = .643

Page 28: Educator Evaluations, TSDL, Growth, VAMs

PIC (teacher) Level Weighted PLC = 1.33

Page 29: Educator Evaluations, TSDL, Growth, VAMs

Cautions

• Must base targets on data; need to set targets that are attainable but also challenge educators to improve student learning

• Make decisions about the extent (if at all) reading and math PLC should count in subjects other than reading and math

• Make decisions about which students contribute; need firm business rules that apply to all!

• Use other measures and factors!

Page 30: Educator Evaluations, TSDL, Growth, VAMs

Integrating Growth Carefully

Use in conjunction with other measures

Use other types of growth too (e.g. portfolios, rubrics, performance-based assessments), particularly in non-tested subjects and grades, and for special populations.

Page 31: Educator Evaluations, TSDL, Growth, VAMs

Integrating Growth (again)

Can be used more qualitatively too—set general guidelines/targets, but use it to inform the decision

Consider the measures that may already be in place in your district that are meant to show growth, and develop rules around that data

Page 32: Educator Evaluations, TSDL, Growth, VAMs

Non-Tested Grades and Content Areas

• Caveat: No easy answer to this question!
• One answer: Develop more tests
  o But… tests in new content areas take time (and are difficult to do, particularly if a state has not yet adopted content standards in those areas)
• Another answer: set standards for “adequate growth” on common assessments that you already have
• One more answer: use instruments to measure growth that are not “assessments” (e.g. portfolios, progress toward goals, etc.)

Page 33: Educator Evaluations, TSDL, Growth, VAMs

Step #1: Take stock of your common assessments/local assessments

• Ask yourself the following questions:
  o What is our spectrum of locally administered and identified assessments?
  o What content areas do we currently assess?

For your common assessments:
• Do we currently have standards set for proficiency (i.e. how much is enough to be considered “proficient” on this assessment)?
• How often do we administer this test?
• How is the test scored? Scaled?

Page 34: Educator Evaluations, TSDL, Growth, VAMs

Step #1: Take stock of your common assessments/local assessments

For purchased assessments:
• Do they establish how much “more” is sufficient growth?
• How are these tests scaled and scored?
• Work with the company producing these tests to determine how they can or cannot be used to measure growth.

Page 35: Educator Evaluations, TSDL, Growth, VAMs

Step #2: Setting standards for GROWTH on common assessments

• Even if you have standards for proficiency, you may need to set standards for growth
• Several standard-setting methods can be used to help make these determinations

Page 36: Educator Evaluations, TSDL, Growth, VAMs

Step #3: Set the standards, implement, and EVALUATE

• Although legislation does not necessarily provide for this (yet?), all of these systems need continuous improvement
• If the standards for growth that you set using one of the above methods seem out of sync with actual student learning, re-evaluate!

Page 37: Educator Evaluations, TSDL, Growth, VAMs

Value-Added Modeling

• VAMs attempt to isolate the contributions, or “value add”, of teachers or schools to student learning.
• MDE is not, at present, running VAMs for every teacher in the state
  o Issue #1: We do not have sufficient data
  o Issue #2: We do not have sufficient systems
  o Issue #3: We do not necessarily believe we can specify these models in a way that allows us to get fair estimates
  o Issue #4: We are not currently given the direct authority/responsibility to do this

Page 38: Educator Evaluations, TSDL, Growth, VAMs

VAMs

• We do want to start to provide our best expert advice to districts
  o Non-binding
  o Take it or leave it
  o Tell us if you disagree
• The level at which the VAM is run is important for three reasons:
  1) available data
  2) coherence with local expectations
  3) ability to compare teachers

Page 39: Educator Evaluations, TSDL, Growth, VAMs

VAMs: Who should run these?

• VAM run by MDE for the whole state:
  o Pros: Standardized model; allows you to talk about the distribution of effectiveness across the state.
  o Cons: Less data available (i.e. only state assessments); not reflective of district expectations; only available for a limited number of educators (i.e. those who teach in a tested grade/subject)

Page 40: Educator Evaluations, TSDL, Growth, VAMs

VAMs: Who should run these?

• VAM run by a district:
  o Pros: Can utilize both state and local assessment data; can be more sensitive to local assumptions; can hypothetically include many more teachers.
  o Cons: Analytically difficult to do; may not follow a standard format; “lowest X% of teachers” is now district-specific

Page 41: Educator Evaluations, TSDL, Growth, VAMs

Value-Added Modeling: What to do?

• Involve educators in a roster verification process
  o Are these my students? Which ones should be included in the VAM? Need business rules around this; districts need to make rules
• Use prior achievement data as a predictor
  o More years of data are better
  o Can use data from all available subjects
  o Can use data from different types of tests
  o BUT: need complete data on every student
  o SO: there is a tradeoff between how many years of data you include and how many students you include in the model
    • More assessment scores = fewer students who will have all those scores available

Page 42: Educator Evaluations, TSDL, Growth, VAMs

Value-Added Modeling: What to do?

• Consider using aggregate peer effects at the teacher level as a covariate
  o For example: average economic disadvantage
  o McCaffrey: including student-level covariates is not helpful
  o However, if there is a correlation between the aggregate demographics and the value-added estimate, that sets up differential expectations depending on classroom demographics
• Teachers with small numbers of students: need firm business rules
  o McCaffrey recommends 10 or more students
  o Rules for what to do with teachers who have fewer than 10 students

Page 43: Educator Evaluations, TSDL, Growth, VAMs

Value-Added Modeling: What to do?

• There are many different types of VAMs, all keyed to a certain set of assumptions and, more importantly, the underlying data that you have.
  o Gain scores: need a vertical scale
  o Covariate adjustment
    • Which covariates you select is critical
    • Dropping students with incomplete data AND differential expectations
  o Random or fixed effects
    • Are the time-invariant teacher characteristics fixed?
    • Do we need to explicitly model the variation within and between teachers?
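As a concrete illustration of the covariate-adjustment family (a sketch with synthetic data, not MDE's model or any official specification): regress current scores on prior scores, then treat each teacher's mean residual as a crude value-added estimate.

```python
import numpy as np

# Synthetic data: two teachers with identical prior-score distributions.
prior   = np.array([1., 2., 3., 4., 5., 1., 2., 3., 4., 5.])
teacher = np.array(["A"] * 5 + ["B"] * 5)
# True model: score = 2 + 1.5*prior, plus +2 for teacher A, -2 for teacher B.
current = 2 + 1.5 * prior + np.where(teacher == "A", 2.0, -2.0)

# Covariate adjustment: OLS of current score on [intercept, prior score].
X = np.column_stack([np.ones_like(prior), prior])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
residuals = current - X @ beta

# Teacher "effect" = mean residual over that teacher's students.
effects = {t: residuals[teacher == t].mean() for t in ("A", "B")}
print(effects)  # approximately {'A': 2.0, 'B': -2.0}
```

The cautions on the slides apply directly here: the estimate is only fair if classrooms are comparable on the covariates used, and dropping students with incomplete prior data changes whose growth is being measured.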

Page 44: Educator Evaluations, TSDL, Growth, VAMs

Including growth = VAM?

• You can include student growth data in an evaluation system WITHOUT running a VAM
• Our TSDL tool does this
• Don’t forget to call your local university, your ISD, OR the Department if you want to run a VAM and need assistance
• AND: growth data does not always have to be obtained from assessments

Page 45: Educator Evaluations, TSDL, Growth, VAMs

Contact Information

Carla Howe Olivares
Evaluation Research and Accountability
Office of Psychometrics, Accountability, Research and Evaluation (OPARE)

Bureau of Assessment and Accountability (BAA)

[email protected]

[email protected]

Page 46: Educator Evaluations, TSDL, Growth, VAMs

MDE website for Ed Eval info

• www.michigan.gov/baa
  o Click on the Educator Evaluation tab on the left to access materials, resources, and links