
1

Value-Added Guidebook
FOR TEACHERS

Designed to support teachers using value-added data to impact teaching and student learning

© 2018, Battelle for Kids. All Rights Reserved.

2

TABLE OF CONTENTS

3 Introduction

4 Access to and Navigation of SAS® EVAAS® Reporting Site

5 Understanding Terminology and Anatomy of Reports

7 Interpreting eTPES and Teacher Value-Added Reports in EVAAS®

9 Reports-at-a-Glance

11 Interpreting and Analyzing Teacher Value-Added Reports Investigation #1: Did students reach the expected level of growth for the current year?

12 Apply Your Learning: Interpreting and Analyzing the Teacher Value-Added Reports

13 Check for Understanding: Interpreting and Analyzing Teacher Value-Added Reports

14 Interpreting and Analyzing Diagnostic Reports Investigation #2: Which students benefited most (and least) from your instruction?

17 Apply Your Learning: Interpreting and Analyzing the Diagnostic Reports

18 Check for Understanding: Interpreting and Analyzing Diagnostic Reports

19 Interpreting and Analyzing Custom Diagnostic Reports Investigation #3: How did the growth of the students in the after-school program compare to those who did not participate in that program?

21 Apply Your Learning: Interpreting and Analyzing the Custom Diagnostic Reports

22 Check for Understanding: Interpreting and Analyzing Custom Diagnostic Reports

23 Making Use of Student Information in EVAAS®

24 Interpreting Student History Reports Investigation #4: How has a particular student performed on the state tests over time? Does their performance differ from the average score at their school or district?

25 Apply Your Learning: Interpreting and Analyzing the Student History Reports

26 Check for Understanding: Interpreting and Analyzing Student History Reports

28 Interpreting Student Projection Reports Investigation #5: When are intervention or enrichment strategies appropriate for a particular student?

29 Apply Your Learning: Interpreting and Analyzing the Student Projection Reports

30 Check for Understanding: Interpreting and Analyzing Student Projection Reports

31 Identifying Contributing Factors Resource

32 Preparing for the Educator Growth Plan

Please note: This guide uses visual representations of copyrighted EVAAS® Web reporting software from SAS Institute Inc. for instructional purposes.

3

INTRODUCTION

This Value-Added Guidebook for Teachers is designed to help you use value-added data to impact your teaching and your students’ learning. Prior to using these guidebooks, it is recommended that you watch the Value-Added Video Series and complete the accompanying facilitation guides, designed to help you acquire a basic, conceptual understanding of Ohio’s value-added measures.

In the materials that follow, you will be taken through a scenario in which various SAS® EVAAS® reports are used to answer particular questions about a teacher’s value-added results, providing helpful information for accurate interpretation and sample analysis questioning. The reports used throughout this guidebook are generated from both value-added models:

• MRM: Used in grades 4–8 math and English language arts
• URM: Used in grades 5 and 8 science, grade 6 social studies, and high school end-of-course tests

While the reports look slightly different depending on the value-added model, interpretation and analysis of the reports are the same.

By using this guidebook, you will:

• Understand how to interpret key SAS® EVAAS® reports.
• Understand how to analyze data to prioritize strengths and opportunities for growth.
• Identify possible contributing factors that impacted value-added results.
• Develop capacity to use value-added data to inform professional practices.
• Use analysis of the reports to support the development of professional growth plans.

ADDITIONAL TOOLS AND RESOURCES:

Data Coaching Framework
To begin building your data literacy, it is essential to establish the right conditions and incorporate the right processes to effectively support data inquiry and use. Throughout this guidebook, the processes of building awareness, understanding, interpretation, analysis, and use are referenced. See the Data Coaching Framework on the Ohio Portal for more information about building data literacy in your school or district.

[Diagram: Building a data-literate team by establishing the right conditions and incorporating the right processes.]

4

Visit ohiova.sas.com to access value-added reports.

The public has access to a limited number of value-added reports at the district and school level through the Education Value-Added Assessment System (EVAAS®). Educators have access to additional reporting that requires a username and password. The homepage of the EVAAS® website offers resources including links to the technical guide, an FAQ document, and a step-by-step guide to access reports. The site also offers many different reports, and includes interactive features and help screens to assist you in analyzing data.

This guide will focus on the Teacher-Level Reports available on the SAS® EVAAS® reporting site. Once logged in, you can access your value-added report by selecting Teacher Reports from the reports menu; the Teacher Value-Added Summary will then be displayed. This is the landing page for teacher-level reporting, and multiple teacher-level reports can be accessed from the Teacher Value-Added Summary.

SAS® EVAAS® Help Feature
Within the SAS® EVAAS® reporting site, you can access the help screen by selecting the icon in the upper right corner. This feature provides additional support to promote awareness and understanding of the measures and interpretation of the reports. A navigation menu appears on the left-hand side of the help screen to allow you to easily access information about all available reports. You can also hover the mouse over terminology and features included on the reports to see pop-up boxes with additional information to assist in understanding and interpretation.

ACCESS TO AND NAVIGATION OF SAS® EVAAS® REPORTING SITE

5

Understanding the terminology used on value-added reports will help you interpret, analyze, and use the information to improve teaching and learning. The following includes explanations of terms found on the Teacher-Level Value-Added and Diagnostic Reports.

UNDERSTANDING TERMINOLOGY AND ANATOMY OF REPORTS

TERMINOLOGY ON VALUE-ADDED REPORTS

1 Year: Year associated with the report.

2 Growth Measure: Conservative estimate of a teacher’s influence on students’ academic progress, compared to the growth standard. A teacher whose students meet the growth standard would have a score near 0.0.
• MRM: Expressed in NCEs
• URM: Expressed in scale score points

3 Standard Error: Measure of certainty around the estimate; helps determine if the growth measure is significantly different than the growth standard.

4 Index: Basis for establishing an effectiveness level. Dividing the estimated growth measure by the associated standard error produces the index score. Associated with the performance level and color.

5 Level: Based on the index score, each teacher is assigned to a particular performance level. Descriptors of these levels are provided at the bottom of the page.

6 Distribution: Shows the distribution of teachers in each effectiveness level, statewide, for a particular grade and subject or course.

7 Effectiveness Level: Determined by the index; represents the amount of growth students made compared to the growth standard, coupled with the amount of evidence.


6

TERMINOLOGY ON DIAGNOSTIC REPORTS

1 Growth: Conservative estimate of a teacher’s influence on students’ academic progress, compared to the growth standard, and represented by the height of the bars in the graph. A teacher whose students meet the growth standard would have a growth measure near 0.0.
• MRM: Expressed in NCEs
• URM: Expressed in scale score points

2 Standard Error: Measure of certainty around the estimate; helps determine if the growth measure is significantly different than the growth standard.

3 Achievement Groups: Students are placed into one of three achievement groups, or tertiles, based on their achievement. The lowest achievement tertile represents the students in the lowest-achieving third of the state, while the highest achievement tertile represents the highest-achieving third of the state (see the sketch below).
• MRM: Students are assigned to an achievement group based on the average of the most recent year and prior year NCE.
• URM: Students are assigned to an achievement group based on their predicted score.

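The tertile definitions above lend themselves to a small worked example. The sketch below is illustrative only (not EVAAS code): it assumes each student's statewide percentile is already known, and it uses cut points at the 33.3rd and 66.7th percentiles to stand in for "thirds of the state." For MRM, that position would come from the average of the two most recent NCEs; for URM, from the predicted score.

```python
def achievement_group(state_percentile: float) -> str:
    """Return the statewide tertile label for a student's percentile."""
    if state_percentile < 100 / 3:
        return "1 (Lowest)"
    if state_percentile < 200 / 3:
        return "2 (Middle)"
    return "3 (Highest)"

print(achievement_group(25.0))   # 1 (Lowest)
print(achievement_group(50.0))   # 2 (Middle)
print(achievement_group(80.0))   # 3 (Highest)
```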

7

INTERPRETING eTPES AND TEACHER VALUE-ADDED REPORTS IN EVAAS®

CROSSWALKING TEACHER-LEVEL VALUE-ADDED REPORTS AND eTPES

1 View your Value-Added Composite result in eTPES (www.ohiotpes.com) both in the Student Growth Worksheet and the Ratings tab within the Summative Evaluation feature. Both are similar views of information and can be exported into a PDF.

1A If you are an A1 or A2 teacher, you will see your state value-added composite results on a 1–5 scale. Having a State Value-Added Composite result means that you have a value-added growth measure for state-tested areas.

A scale of 1–5 is used to represent the performance level associated with value-added composite measures.


Most Effective (5): Index of 2 or greater. Significant evidence of exceeding the growth standard (expectation).

Above Average (4): Index equal to or greater than 1, but less than 2. Moderate evidence of exceeding the growth standard (expectation).

Average (3): Index equal to or greater than -1, but less than 1. Evidence of meeting the growth standard (expectation).

Approaching Average (2): Index equal to or greater than -2, but less than -1. Moderate evidence of not meeting the growth standard (expectation).

Least Effective (1): Index less than -2. Significant evidence of not meeting the growth standard; growth is more than 2 standard errors below the growth standard (0).

Accessible to: District Leaders, School Leaders, Teachers, Public

1B If you are an applicable B teacher, you will have a Vendor Value-Added Composite on a 1–5 scale. Having a Vendor Value-Added Composite result means that you have a value-added growth measure for vendor-based assessments. This is an optional growth measure for districts and schools in Ohio.

Battelle for Kids is using visual representations of eTPES evaluation software from RANDA Solutions in this document for instructional purposes.
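To make the crosswalk concrete, here is a minimal sketch (not SAS EVAAS or eTPES code) that computes the index from a growth measure and standard error, as defined on page 5, and maps it to the 1–5 levels using the cut points in the chart above. The example numbers come from the Check for Understanding on page 13.

```python
def effectiveness_level(growth_measure: float, standard_error: float):
    """Index = growth measure / standard error, mapped to a 1-5 level."""
    index = growth_measure / standard_error
    if index >= 2:
        return index, 5, "Most Effective"
    elif index >= 1:
        return index, 4, "Above Average"
    elif index >= -1:
        return index, 3, "Average"
    elif index >= -2:
        return index, 2, "Approaching Average"
    return index, 1, "Least Effective"

# A growth measure of 7.9 NCEs with a standard error of 1.9 (the page 13
# example) yields an index of about 4.16: Most Effective.
print(effectiveness_level(7.9, 1.9))
```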

8

TEACHER VALUE-ADDED SUMMARY

2 Visit ohiova.sas.com to review your teacher value-added summary report, located on the landing page of the SAS® EVAAS® website. The summary report displays the growth measures, standard errors, index values, and levels for each grade, subject, or course for which a value-added report is available in the current year, as well as your composite index and effectiveness level.

2A The State Teacher Value-Added Composite is shown in both the chart and the table of the Teacher Value-Added Summary. It is the result that is also reflected in eTPES. In EVAAS®, this composite result is shown as an index value (4.06) and level (Most Effective). See the chart on page 7 for additional details on the value-added performance levels.

By clicking on the State Composite link in the table, you can see details on which growth measures from state tested areas are included in your composite result.

2B The Teacher Value-Added Summary also includes your growth measure results for each grade/subject area in the most recent year. These results are shown in both the chart and the table of the Teacher Value-Added Summary. The table includes the growth measure, standard error, and index value for each grade/subject area result. This example teacher report has a result for Ohio State Test (OST) Tested Math, Grade 6 and OST Tested Math, Grade 7 in the most recent year. The gain index allows you to make comparisons from year to year and subject to subject.

[Screenshot: Teacher Value-Added Summary chart and table, with callout 2A marking the OST Composite and callout 2B marking the OST Tested Math, Grade 6 and Grade 7 results.]

9

REPORTS-AT-A-GLANCE

The following reports are available on the SAS® EVAAS® website and support efficiency by summarizing large amounts of data at the teacher level. These reports reflect those most commonly used by teachers.

TEACHER VALUE-ADDED SUMMARY REPORT
Purpose: This summary report is the landing page when teachers access their teacher reports. It displays the growth measures, standard errors, index values, and levels for each grade/subject for which a teacher value-added report is available in the current year, as well as the teacher’s composite index and level.
Possible Questions:
• What trends emerge in my estimates/teacher value-added results across years for all applicable grades and subjects?

TEACHER VALUE-ADDED COMPOSITE REPORT
Purpose: The Composite report includes a teacher’s composite gain index and effectiveness level across assessments. The State Composite includes any applicable value-added growth measures from state assessments for up to three years. The Vendor Composite is provided to teachers in districts/schools receiving value-added from vendor-based assessments and includes value-added growth measures from vendor assessments for up to three years. The report shows all tests/subjects used in the analysis for this composite report. Only subjects for which the teacher has a value-added measure in the current year are included.
Possible Questions:
• What is my value-added effectiveness level in OTES based on?
• How does my effectiveness level from this composite report compare to the distribution of teachers statewide in each level?

TEACHER VALUE-ADDED REPORT
Purpose: This report displays an estimate of the gains achieved by a teacher’s students in a particular subject and grade. The report also indicates the teacher’s effectiveness level. There are five possible levels: Most Effective, Above Average, Average, Approaching Average, and Least Effective.
Possible Questions:
• Did my students exceed, meet, or not meet the growth standard?
• How does my effectiveness level compare to the distribution of teachers statewide in each level?
• What trends emerge in the estimate of the gains achieved by my students over time?

10

TEACHER CUSTOM DIAGNOSTIC REPORT
Purpose: This report allows the teacher to create a diagnostic report by choosing at least 15 students from a particular grade level and subject area. Student placement into subgroups of this report reflects their prior achievement levels compared to this specific group of students.
Possible Questions:
• How did the growth of the students in the after-school program compare to those who did not participate in that program?

TEACHER DIAGNOSTIC SUMMARY REPORT
Purpose: This report shows the progress of a teacher’s students, disaggregated by entering achievement level, for all grade/subjects for which the teacher received a value-added report in the current year. Pie charts at the top of the report offer a visual representation of the data presented in the table below.
Possible Questions:
• What trends emerge in the progress of my students, by achievement subgroup, across years for all applicable grades and subjects?

TEACHER DIAGNOSTIC REPORT
Purpose: This report indicates how well a teacher is doing with each tertile of students. Students are placed within a tertile based on where they fell in the state distribution for their subject and grade. The student list is on the right; it provides the list of students who fell into one of three prior achievement subgroups (tertiles).
Possible Questions:
• Which achievement subgroup is showing progress?
• Is there an achievement subgroup that I am not stretching?
• Is there a colleague in my school who can share strategies with me so that I can improve my instruction?

STUDENT/TEACHER LINKAGES REPORT
Purpose: This report provides a list of all students linked to a particular teacher during roster verification. The report also indicates whether a student’s data were included in the teacher value-added analysis. Student names are hyperlinked to their achievement history.
Possible Questions:
• What students were included in the teacher report?

11

INTERPRETING AND ANALYZING TEACHER VALUE-ADDED REPORTS

INVESTIGATION #1: DID STUDENTS REACH THE EXPECTED LEVEL OF GROWTH FOR THE CURRENT YEAR?

The Teacher Value-Added Report displays the growth made by students in a particular subject and grade or course for up to three consecutive school years. A multi-year trend is provided when sufficient data is available.

INTERPRETING THE TEACHER VALUE-ADDED REPORT

• The report can be viewed as an Index Graph (shown) or a Growth Measure Graph. A table, provided below the graph, includes the numerical data represented in the graphs.
• The Student List View allows you to see a list of students who were used in the analysis. This list represents the students who were verified on your rosters during the roster verification process.
• The Educator Distribution lists the number of teachers in this grade and subject or course in each effectiveness level, statewide.
• The Multi-Year Trend represents up to three years of data, when possible, in both the growth measure and standard error. This is often why the standard error is smaller in the multi-year trend than in a one-year measure: more data leads to more certainty in the measure. (See the sketch after this list.)

TIP FOR USING THE TEACHER-LEVEL VALUE-ADDED REPORT

A supplemental table is provided on the teacher value-added report for grades and subjects or courses that use the predicted-mean, or URM, approach. This table provides the average scale score, average predicted score, and average predicted percentile for a group of students.

Teacher growth measures are from SAS® EVAAS® multivariate, longitudinal analyses using multiple subjects, grades, and years of data for each student.

Ohio Department of Education

Copyright © 2017 SAS Institute Inc., Cary, NC, USA. All Rights Reserved.

Report: Teacher Value Added | Test: Ohio's State Tests (3-8)
School: Cuttlefish Middle School | Subject: Science
District: West Suburb School District | Type: Tested
Teacher: Teacher-LkM | Grade: 8th-Grade

Teacher Growth Measures and Standard Errors
Year | Growth Measure | Standard Error | Index | Level
2018 | 0.2 | 3.8 | 0.05 | Average

Supplemental Information
Year | Number of Students | Average Score | Average Percentile | Average Predicted Score | Average Predicted Percentile
2018 | 64 | 595.0 | 84.0 | 594.6 | 84.0

EXTEND YOUR LEARNING

With a better understanding of your individual teacher reports, you are more prepared to have team discussions. Using your grade-level reports, dig deeper into subject area reporting to discover grade-level areas of strength and opportunities for growth.

Reminder: You can access subject, grade, school, and district reports through the EVAAS® reporting system.

12

APPLY YOUR LEARNING: INTERPRETING AND ANALYZING TEACHER VALUE-ADDED REPORTS
Use the teacher value-added report on page 11 to answer the following questions. Check your understanding on the next page.

1 What is the 2013 growth measure?

2 Did students reach the expected level of growth for the most recent year?

3 What trends emerge in the growth measure achieved over time?

4 By analyzing value-added reports, educators can prioritize the areas of strength and opportunities for growth that can be addressed at the classroom level. What are some strengths that you notice on this value-added report?

5 What are some opportunities for growth?

6 What might be some contributing factors to their strengths and opportunities to grow? Use page 31 to view sample contributing factors.


13

CHECK FOR UNDERSTANDING: INTERPRETING AND ANALYZING TEACHER VALUE-ADDED REPORTS

1 What is the 2013 growth measure?

2 Did students reach the expected level of growth for the current year?

3 What trends emerge in the growth measure achieved over time?

4 By analyzing value-added reports, educators can prioritize the areas of strength and opportunities for growth that can be addressed at the classroom level. What are some strengths that you notice on this value-added report?

5 What are some opportunities for growth?

6 What might be some contributing factors to their strengths and opportunities to grow? Use page 31 to view sample contributing factors.

The growth measure for the 2013 year is 1.8 NCEs.

The students exceeded the growth expectation. This can be found two ways. The first is by calculating the growth index, which is 4.16. This is found by dividing the growth measure (7.9) by the standard error (1.9). Since the growth index, 4.16, is more than 2, this teacher received an effectiveness level of “Most Effective.” Another way to tell that students met the expectation is by determining how many times larger the growth measure is than the standard error. This growth measure, 7.9 NCEs, is more than 2 times the standard error. Because of this, this teacher received an effectiveness level of “Most Effective.”

Over time this teacher’s growth measure has steadily increased. In 2012, the growth measure was -2.7 NCEs below the growth standard. In 2013, it was 1.8 NCEs above the growth standard, and in 2014, the growth measure was 7.9 NCEs above the growth standard.

Over time, this teacher’s growth measure has steadily increased.

In 2014, this teacher’s index was 4.16, meaning there was significant evidence that this class of students grew more than the average class in the state. This teacher met expectations in 2013 and exceeded expectations in the following year.

This teacher’s multi-year trend index is 2.22, meaning there was significant evidence that overall, this teacher’s students have grown more than the average in the state.

While this teacher was “Most Effective” in the most recent year, this teacher can still continue to improve, refine, and reflect on instructional practices to make sure they continue to meet the needs of their students.

2012 was the first year that I taught this subject and I didn’t have a deep content expertise. In 2013, our school adopted new curriculum to align with new standards and as I became more familiar with the content and had opportunities to collaborate with other teachers in this subject, I became more confident in the implementation of the curriculum.


14

INTERPRETING AND ANALYZING DIAGNOSTIC REPORTS

INVESTIGATION #2: WHICH STUDENTS BENEFITED MOST (AND LEAST) FROM YOUR INSTRUCTION?

The Diagnostic Report allows you to identify patterns of progress in your classroom across different achievement levels, also called tertiles. Even though you may exceed expectations on your value-added report, it is possible that the classroom is not meeting the needs of all of your students.

Ohio Department of Education

Copyright © 2016 SAS Institute Inc., Cary, NC, USA. All Rights Reserved.

Report: Teacher Diagnostic | Test: OAA
School: Tamarin Middle School | Subject: Reading
District: Big City School District | Type: Tested
Teacher: KAITLIN COLEMAN (046535) | Grade: 7th-Grade

Achievement Groups | 1 (Lowest) | 2 (Middle) | 3 (Highest)
Standard for Academic Growth | 0.0 | 0.0 | 0.0
2014 Growth | 11.8 | 7.4 | -4.1
2014 Standard Error | 4.3 | 1.1 | 2.1
2014 No. of Students | 22 | 23 | 36
2014 Percent of Students | 27.2 | 28.4 | 44.4
2013 Growth | 7.5 | 3.9 | -6.4
2013 Standard Error | 2.7 | 1.0 | 2.4
2013 No. of Students | 12 | 23 | 31
2013 Percent of Students | 18.2 | 34.8 | 47.0

15

INTERPRETING THE DIAGNOSTIC REPORT

• A growth measure is represented by the vertical blue (most recent year) and gold (previous year) bars on the graph and is also available in the table.

• The growth standard is represented by a green, horizontal line.

• The standard error is represented by the solid and dotted vertical black lines on the graph and is also available in the table. This measure represents the certainty around the growth measure and takes two things into consideration: the quality and the quantity of the data.

• If the standard error line (solid black line) is entirely above the growth expectation (green horizontal line), then the group of students exceeded expectations. If the line crosses the growth standard, the group of students met expected growth, and if the line is entirely below the growth standard, the group of students did not meet expectations. (A sketch of this rule follows the list.)

• This report can also be viewed as a pie chart, which many educators say is their preferred view. This graph uses colors (green, yellow, and red) to indicate whether an achievement group exceeded, met, or did not meet the growth standard. The size of each piece represents the percentage of students in each achievement group.
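The solid-line rule in the fourth bullet can be written down directly. The sketch below is illustrative only (not EVAAS code) and assumes the solid line spans the growth measure plus or minus one standard error; the inputs are the 2014 values from the report on page 14.

```python
def compare_to_growth_standard(growth, se, standard=0.0):
    """Classify a group using the solid-line rule described above."""
    if growth - se > standard:      # whole line above the green line
        return "exceeded expected growth"
    if growth + se < standard:      # whole line below the green line
        return "did not meet expected growth"
    return "met expected growth"    # line crosses the green line

# 2014 values from the diagnostic report on page 14.
for group, (g, s) in {"1 (Lowest)": (11.8, 4.3),
                      "2 (Middle)": (7.4, 1.1),
                      "3 (Highest)": (-4.1, 2.1)}.items():
    print(group, compare_to_growth_standard(g, s))
```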


16

EXTEND YOUR LEARNING

With a better understanding of diagnostic reports, you are more prepared to have team discussions. Using your grade-level reports, dig deeper into subject area reporting to discover grade-level areas of strength and opportunities for growth.

Reminder: You can access subject, grade, school, and district reports through the EVAAS® reporting system.

TIPS FOR USING DISTRICT- AND SCHOOL-LEVEL DIAGNOSTIC REPORTS

The following patterns describe some trends you might find on diagnostic reports.

Learning experiences benefit all students similarly in a Flat Pattern.

Learning experiences benefit higher-achieving students more than their lower-achieving peers in the Upward Shed, while the Downward Shed shows lower-achieving students benefitting more than higher-achieving peers.

Learning experiences benefit middle-achieving students more than their lower- and higher-achieving peers in the Teepee Pattern, while the Reverse Teepee shows higher- and lower-achieving students benefiting more than their middle-achieving peers.

Identifying Gain Patterns in School Diagnostic Reports

The following gain patterns describe the trends shown on the diagnostic reports. Use the blue bars for this exercise.

Upward Shed Pattern: School learning experiences benefit higher-achieving students more than their lower-achieving peers.

Downward Shed Pattern: School learning experiences benefit lower-achieving students more than their higher-achieving peers.

Teepee Pattern: School learning experiences benefit middle-achieving students more than their lower- and higher-achieving peers.

Reverse Teepee Pattern: School learning experiences benefit middle-achieving students less than their lower- and higher-achieving peers.

Flat Pattern: School learning experiences benefit students of all achievement levels similarly.

Random Pattern: No discernible pattern exists that relates achievement levels to gains.

Copyright, 2009. Battelle for Kids. All rights reserved.

17

APPLY YOUR LEARNING: INTERPRETING AND ANALYZING DIAGNOSTIC REPORTS
Use the Diagnostic Report on page 14 to answer the following questions. Check your understanding on the next page.

1 What was the growth measure for the most recent year for the lowest tertile?

2 What percent of students are in the highest achieving tertile in each of the reported years?

3 Which students benefitted most (and least) from the instruction in the most recent year?

4 What trends emerge in the growth measures achieved over time?

5 What are the greatest areas of strength and opportunities for growth in terms of student groups?

6 What might be some contributing factors to their strengths and opportunities to grow? Use page 31 to view sample contributing factors.

18

CHECK FOR UNDERSTANDING: INTERPRETING AND ANALYZING DIAGNOSTIC REPORTS

1 What was the growth measure for the most recent year for the lowest tertile?

2 What percent of students are in the highest achieving tertile in each of the reported years?

3 Which students benefitted most (and least) from the instruction in the most recent year?

4 What trends emerge in the growth measures achieved over time?

5 What are the greatest areas of strength and opportunities for growth in terms of student groups?

6 What might be some contributing factors to their strengths and opportunities to grow? Use page 31 to view sample contributing factors.

The growth measure is 11.8 NCEs. If this were a URM report, the measure would be reported in a scale score.

44.4% of students were in the highest tertile in 2014 and 47% of students were in the highest tertile in 2013.

This teacher is producing more than expected growth in the first and second tertiles in 2014, which are lower- and middle-achieving students. Students in the third tertile, or highest achievement group, are not meeting expected growth. We can tell this by looking at both the bar chart and the pie chart. In the bar chart, the height of the blue bar on the first and second tertile is showing positive growth, coupled with a standard error line that is entirely above the growth standard. The height of the blue bar in the third tertile is negative and the standard error line is entirely below the growth standard. The colors used in the pie chart indicate whether each achievement group exceeded, met, or did not meet the growth standard.

Over time, this teacher has increased the growth of the lowest- and middle-achievement group and continued to exceed the growth standard for these groups of students.

The highest-achieving group of students have not met the growth standard in the most recent or previous school year.


The curriculum that was used for the past few years, while aligned to state standards, did not provide enough stretch for high-achieving students. Data was not monitored or used frequently throughout the school year to identify high-achieving students and provide appropriate instruction to meet their needs. Also, support staff who co-taught classes tended to spend a majority of time helping low-achieving students and little emphasis was placed on high-achieving students.

19

INTERPRETING AND ANALYZING CUSTOM DIAGNOSTIC REPORTS

INVESTIGATION #3: HOW DID THE GROWTH OF THE STUDENTS IN THE AFTER-SCHOOL PROGRAM COMPARE TO THOSE WHO DID NOT PARTICIPATE IN THAT PROGRAM?

The Custom Diagnostic Report disaggregates progress for a group of students that you choose.

INTERPRETING THE CUSTOM DIAGNOSTIC REPORT

• You can select from a number of pre-determined filters including race, gender, and demographic data to create a custom report.

• The tertiles are not relative to the state distribution. Students that are selected to be used in the report are distributed as evenly as possible among the three achievement groups (see the sketch below).

• This report can also be viewed as a pie chart.

Custom Diagnostic Report: After-School Remediation Program
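A minimal sketch of the even three-way split described above (illustrative only; how EVAAS assigns remainders is not documented here, so the extra students simply go to the lower groups, and the small roster and scores are hypothetical):

```python
def custom_tertiles(prior_achievement):
    """Split selected students, ordered by prior achievement, into three
    groups sized as evenly as possible (lowest to highest)."""
    ordered = sorted(prior_achievement, key=prior_achievement.get)
    base, extra = divmod(len(ordered), 3)
    groups, start = [], 0
    for i in range(3):
        size = base + (1 if i < extra else 0)  # remainder to lower groups (assumption)
        groups.append(ordered[start:start + size])
        start += size
    return groups

# Hypothetical after-school roster keyed by a prior-achievement score.
roster = {"S1": 31, "S2": 55, "S3": 12, "S4": 78, "S5": 64, "S6": 47, "S7": 90}
low, middle, high = custom_tertiles(roster)
print(low, middle, high)  # ['S3', 'S1', 'S6'] ['S2', 'S5'] ['S4', 'S7']
```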

20

TIPS FOR USING CUSTOM DIAGNOSTIC REPORT

Consider this report when evaluating the impact of enrichment or intervention programs offered to your students. You may also find the Custom Diagnostic Report useful if you have a Diagnostic Report that includes students in only one tertile of achievement when compared to the statewide distribution. By selecting all students and creating a Custom Diagnostic Report, you can see growth patterns that exist among the classroom’s three achievement levels. The additional pattern information may assist with goal setting.

EXTEND YOUR LEARNING

With a better understanding of Custom Diagnostic Reports, you are more prepared to have team discussions. Using your grade-level reports, dig deeper into subject area reporting to discover grade-level areas of strength and opportunities for growth.

Reminder: You can access subject, grade, school, and district reports through the EVAAS® reporting system.

21

APPLY YOUR LEARNING: INTERPRETING AND ANALYZING CUSTOM DIAGNOSTIC REPORTS
Use the Custom Diagnostic Report on page 19 to answer the following questions. Check your understanding on the next page.

1 Which students benefitted most (and least) from the after-school math instruction in the most recent year?

2 What trends do you notice in the report?

3 What were areas of strength and opportunities for growth?

4 What might be some contributing factors to their strengths and opportunities to grow? Use page 31 to view sample contributing factors.

22

CHECK FOR UNDERSTANDING: INTERPRETING AND ANALYZING CUSTOM DIAGNOSTIC REPORTS

1 Which students benefitted most (and least) from the after-school math instruction in the most recent year?

2 What trends do you notice in the report?

3 What were areas of strength and opportunities for growth?

4 What might be some contributing factors to their strengths and opportunities to grow? Use page 31 to view sample contributing factors.

The students in the highest tertile exceeded expectations, while students in the first tertile did not meet expectations. Of the students that participated in this after-school program, the top achieving third of students benefitted the most while the bottom third of students benefitted the least.

The higher-achieving students in this group seemed to benefit more from the after-school program than their peers.

The progress of the low-achieving tertile is an opportunity for growth.

The instructional materials used during the after-school remediation program met the needs of the middle- and high-achieving students within this group. However, instruction needed to be differentiated to meet the needs of the lowest third of the group. In the future, it might be beneficial to strategically use teachers and community volunteers to provide one-on-one tutoring or small group instruction to the lowest-achieving students.

23

MAKING USE OF STUDENT INFORMATION IN EVAAS®

Explore the individual student information in the SAS® EVAAS® website by drilling down from the diagnostic reports or by using the Student Search features. Student information provides transparency on the students who were used in the analysis, which should reflect the students linked during the teacher roster verification process as long as they had sufficient data. You can automatically access student information for students you taught last year that were used in your most recent teacher value-added report(s).

Investigation numbers 4 and 5 model the use of student information from the Student History Report and the Student Projection Reports in EVAAS®. Be aware that there are limitations to the Student History Report; be cautious not to overuse single data points from one student. The actual value-added analysis measures the gain of a group of students and uses multiple data points from students to minimize measurement error.

TIP FOR USING STUDENT LEVEL INFORMATION

By default, you have access to projection reports for students you taught in the previous school year. Principals can provide you with access to all students in the EVAAS® website, which gives you the ability to see Student History Reports and Student Projection Reports for students currently enrolled in your grade and subject or course.

24

INTERPRETING STUDENT HISTORY REPORTS

INVESTIGATION #4: HOW HAS A PARTICULAR STUDENT PERFORMED ON THE STATE TESTS OVER TIME? DOES THEIR PERFORMANCE DIFFER FROM THE AVERAGE SCORE AT THEIR SCHOOL OR DISTRICT?

The Student History Report displays the student’s historical performance on previous state assessments and includes the average score for the school and the district. This report can be helpful for you to review; however, too much emphasis should not be placed on comparing scores of one student from year to year.

INTERPRETING THE STUDENT HISTORY REPORT

• Percentiles are used to represent achievement data. These are indicated on the graph and provided in the accompanying table, along with the student’s NCE or scale score.

• State performance levels are provided for each available state assessment.

• Red triangles represent the student’s achievement scores, in percentiles, on previous assessments in a particular subject.

• Green diamonds represent the average achievement, in percentiles, for the school in which the student tested each year.

• Blue circles represent the average achievement, in percentiles, for the district in which the student tested each year. If the school and district averages are the same, only the school’s data point will be visible.

Ohio Department of Education
Copyright © 2017 SAS Institute Inc., Cary, NC, USA. All Rights Reserved.

Report: Student History Report | Test: Ohio's State Tests (3-8)
Student: KELCI NUNEZ OH7881952 | Subject: Mathematics
Year: 2018

Ohio's State Tests (3-8), Mathematics
Year (Grade Tested) | 2013 (3) | 2014 (4) | 2015 (5) | 2016 (6) | 2017 (7) | 2018 (8)
NCE \ Score | 54 | 58 | 51 | 58 | 50 | 65
%-ile | 57 | 65 | 51 | 66 | 49 | 76
Perf Level | P | ACC | P | ACC | P | ACC

Performance Levels: P = Proficient; ACC = Accelerated
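The table above reports both NCEs and percentiles. NCEs are a normal-curve scale (mean 50, standard deviation about 21.06), so an NCE maps to an approximate percentile through the normal distribution. The sketch below is an approximation only: the report's percentiles reflect each year's actual statewide distribution, so they can differ from this conversion by a point or so.

```python
from statistics import NormalDist

def approx_percentile(nce):
    """Approximate state percentile implied by an NCE score."""
    return 100 * NormalDist().cdf((nce - 50) / 21.06)

for nce in (54, 58, 51, 58, 50, 65):
    print(nce, round(approx_percentile(nce), 1))
# -> about 57.5, 64.8, 51.9, 64.8, 50.0, 76.2
#    (the report shows 57, 65, 51, 66, 49, 76)
```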

25

TIP FOR USING STUDENT HISTORY REPORTS

By selecting the Tests/Subjects tab in the navigation pane, you can use Student History Reports to review student achievement in other subject areas. Reviewing student achievement across subjects can inform collaborative conversations with other teachers in your district or school.

From the Student Reports tab in the Navigation Pane, you can access Student Projection Reports. Our previous example looked at a Student History Report for a student who most recently completed 8th grade math. The following includes a list of projection options that would be available for such a student. Next, we will look at that student’s Student Projection Report for Algebra I.

26

APPLY YOUR LEARNING: INTERPRETING STUDENT HISTORY REPORTS
Use the Student History Report on page 24 to answer the following questions. Check your understanding on the next page.

1 How has this particular student performed on the test over time?

2 How did the student typically perform in relationship to the average score at their school?

3 How might you use this report to plan instruction for this student?

4 How might you communicate this report to the student or the student’s family?

27

CHECK FOR UNDERSTANDING: INTERPRETING STUDENT HISTORY REPORTS

1 How has this particular student performed on the test over time?

2 How did the student typically perform in relationship to the average score at their school?

3 How might you use this report to plan instruction for this student?

4 How might you communicate this report to the student or the student’s family?

Over time this student has performed at either the Accelerated or Proficient level, typically scoring above the 50th percentile. This means they typically scored higher than 50 percent of all other students taking the same test in the state of Ohio. Looking at the Student History Report, we can compare this student’s performance to the average performance in their school and their district. This student performed above the average percentile of both their school and district for all math state-tested areas.

This student performed above the average achievement percentile for math at their school.

A teacher might use this report to reflect on educational opportunities this student has had in the past and how those might have affected the student’s strengths and opportunities for growth. For example, if this student received additional support in the most recent school year, the teacher may want to protect this practice since it could be a contributing factor to the student’s performance.

This report could be used at a parent teacher conference or in a student learning conference. When sharing this report it would be important to help the student and/or family understand how to interpret the report. Strengths should be highlighted first, which could include the performance in the most recent year. The teacher might begin to pose questions to the student or family to encourage reflection on the performance over time.


28

INTERPRETING STUDENT PROJECTION REPORTS

INVESTIGATION #5: WHEN ARE INTERVENTION OR ENRICHMENT STRATEGIES APPROPRIATE FOR A PARTICULAR STUDENT?

The Student Projection Report indicates a student’s projected percentile and the likelihood of reaching the various performance levels on a future state assessment. Projections to Ohio’s State Tests are available for all students in grades 5 through 8 as well as end-of-course subjects, provided students have a minimum of three historical state test data points.

INTERPRETING THE STUDENT PROJECTION REPORT

• Projection is based on a student’s prior test history and how students with similar testing histories have performed on the assessment.
• Projection percentile assumes students will have an average schooling experience.
• Reports are not about predicting the future; they should inform actions.
• If a Student Projection Report indicates a low probability of success, and a teacher responds by providing additional support and interventions, the student may outperform the projection. (A conceptual sketch follows the example report below.)

Ohio Department of Education
Copyright © 2017 SAS Institute Inc., Cary, NC, USA. All Rights Reserved.

Report: Student Projection Report
Student: KELCI NUNEZ OH7881952
Projection: OST EOC Algebra I

Projected State Percentile: 67
Probability of scoring at the indicated Performance Level or above:
Basic 99.2% | Proficient 90.4% | Accelerated 63.0% | Advanced 21.4%

Student's Testing History: Ohio's State Tests (3-8), Mathematics
Year (Grade Tested) | 2013 (3) | 2014 (4) | 2015 (5) | 2016 (6) | 2017 (7) | 2018 (8)
NCE \ Score | 54 | 58 | 51 | 58 | 50 | 65
%-ile | 57 | 65 | 51 | 66 | 49 | 76
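Conceptually, the probabilities on the report can be thought of as the chance that a score drawn around the projection clears a performance-level cut score. The sketch below illustrates that idea with a normal distribution; the cut score, projected score, and standard error are hypothetical, and the actual EVAAS projection model is more involved.

```python
from statistics import NormalDist

def prob_at_or_above(projected_score, projection_se, cut_score):
    """P(score >= cut_score) for a normally distributed projection."""
    return 1 - NormalDist(mu=projected_score, sigma=projection_se).cdf(cut_score)

# Hypothetical: projected 12 scale-score points above the Proficient cut,
# with a projection standard error of 9 points.
print(round(prob_at_or_above(712.0, 9.0, 700.0), 3))  # ~0.909
```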

TIPS FOR USING STUDENT PROJECTION REPORTS

• Understand each student’s achievement trajectory. A worthwhile end-of-the-year goal would be for each of your students to achieve at a level higher than his or her projection. If, for example, a student is projected to score just below the advanced level, your goal should be to get that student to the advanced level.
• Create intervention lists for students at risk of not reaching proficiency (a filtering sketch follows this list).
• Identify students who have done well academically in the past and are on the trajectory to continue to do so. Keep them engaged, encouraged, and offer enrichment opportunities.
• Compare curricular offerings to student readiness. For example, if many students have a 90 percent or better probability of reaching proficiency on the math test at the end of the year, consider offering enrichment opportunities or exploring acceleration options.
• Compare students expected to achieve at each achievement level to your Diagnostic Report to note how these student groups have grown in the past.
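The first two tips above amount to simple filtering. Here is a small sketch; the student names and probabilities are hypothetical, the 0.5 intervention threshold is an arbitrary illustration, and the 0.9 enrichment threshold echoes the 90 percent example in the tips.

```python
def triage(prob_proficient, at_risk_below=0.5, enrich_at=0.9):
    """Split students into intervention and enrichment lists by their
    probability of reaching proficiency."""
    intervention = [s for s, p in prob_proficient.items() if p < at_risk_below]
    enrichment = [s for s, p in prob_proficient.items() if p >= enrich_at]
    return intervention, enrichment

students = {"Student A": 0.35, "Student B": 0.62, "Student C": 0.93}
print(triage(students))  # (['Student A'], ['Student C'])
```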

29

APPLY YOUR LEARNING: INTERPRETING STUDENT PROJECTION REPORTS
Use the Student Projection Report on page 28 to answer the following questions. Check your understanding on the next page.

1 How is this student projected to perform on the Algebra I end-of-course test?

2 How might the information in the Projection Report and Student History Report inform your instructional decisions for this student?

30

CHECK FOR UNDERSTANDING: INTERPRETING STUDENT PROJECTION REPORTS

1 How is this student projected to perform on the Algebra I end-of-course test?

2 How might the information in the Projection Report and Student History Report inform your instructional decisions for this student?

This student is projected to perform at the 67th percentile in Algebra I. The likelihood of scoring at or above the Proficient performance level is 90.4%, assuming an average schooling experience. The probability of scoring at or above the Accelerated performance level is 63%.

This student, historically, has performed proficient or accelerated on state math tests and is projected to continue this trend. Challenging learning opportunities should be provided to this student within the Algebra I course content to push this student to outperform their projection.

31

PREPARING TO ANALYZE SAS® EVAAS® REPORTS: IDENTIFYING CONTRIBUTING FACTORS

It is important to uncover factors that produce particular strengths and areas for growth in academic outcomes. The primary reason for uncovering contributing factors is that they provide a significant lever for improving instructional practice and accelerating student learning. If something is working and educators know why, practices can be maintained and protected. If something is not working and educators know why, resources needed to discover a solution can be identified.

Because this guidebook is aimed at improving instructional practice that impacts student results in a classroom, it is important to prioritize the areas of strength and opportunities for growth that can be addressed by teachers at the classroom level.

When identifying factors, focus reflection on things like:

• What educational opportunities were available for students?
• Was data used to inform decisions?
• How was instructional time used?
• What professional learning opportunities were available?

IDENTIFYING CONTRIBUTING FACTORS RESOURCE

Sample contributing factors are provided below. This list is not all-inclusive; rather, it provides several examples that might represent assessment, instructional, professional learning, and curriculum factors.

Strengths
• Flexible learning groups used based on assessment data
• Effective conversations around instruction, assessment, curriculum, and classroom management
• Effective use of time
• Strong partnership with parents and community
• Instructional materials met students’ needs
• Balance of formative and summative assessment
• Use of self- and peer-assessment
• Regular use of descriptive feedback
• Systematic use of data to inform instruction
• Frequent peer observation
• High expectations for all students
• Use of intervention strategies
• School-wide intervention plan in place
• Curriculum and resources are aligned to standards
• Strong content area expertise

Opportunities for Growth
• Learning targets and goals were unclear
• Ineffective questioning strategies
• Infrequent use of descriptive feedback
• Gaps in content-area expertise
• High expectations for some students
• Most assessment is summative in nature
• Infrequent use of quality rubrics
• Congenial rather than collegial conversations
• No access to high quality professional learning
• Instruction varies little year to year
• Curriculum not aligned to state standards
• Unclear learning goals
• Most questions are at the basic-learning level
• Minimal use of data to inform ongoing instruction
• Static learning groups
• Instructional materials and strategies do not reach all students
• Ineffective use of classroom time

32

PREPARING FOR THE EDUCATOR GROWTH PLAN

Here are guiding questions you can use to reflect on and analyze your value-added reports.

• What trends exist?
• In which years was the growth standard met?
• What areas of strength and opportunity for growth can I identify in my Teacher-Level Value-Added Report?
• What additional data supports my strengths and opportunities for growth?
• How can I use my strengths to improve my instructional practices and accelerate student learning?

Once reports have been analyzed, resulting information can be used to help develop a professional growth plan. Use the following questions to guide your planning.

What are your areas of strength and opportunities for growth?

What are possible goals that address your strengths and opportunities for growth?

What strategies will be implemented that will address your goal(s)?

What evidence supports that the strategies were effective?