TRANSCRIPT
TEAM Evaluator Training
Summer 2014
TEAM Teacher Evaluation Process: Day 1
Instruction, Planning, Environment
Agenda
Day Components
Day One
• TEAM Overview
• Diving into the Rubric
• Collecting Evidence
• Pre-Conferences
Day Two
• Post-Conferences
• Professionalism
• Alternate Rubrics
• Quantitative Measures
• Closing out the Year
Expectations
To prevent distracting yourself or others, please put away all cellphones, iPads, and other electronic devices.
There will be time during breaks and lunch to use these devices as needed.
Overarching Training Objectives
Participants will:
• Be able to implement the TEAM process
• Successfully collect and apply evidence to the rubric
• Be prepared to provide meaningful feedback to teachers based on the rubric
• Be able to meaningfully articulate the purpose of evaluation and the role of feedback to teachers and parents
Norms
Keep your focus and decision-making centered on students and educators.
Be present and engaged.
• Limit distractions and sidebar conversations.
• If urgent matters come up, please step outside.
Challenge with respect, and respect all.
• Disagreement can be a healthy part of learning!
Be solutions-oriented.
• For the good of the group, look for the possible.
Risk productive struggle.
• This is a safe space to get out of your comfort zone.
Commissioner of Education (Video)
Chapter 1: TEAM Overview
Evaluation closely links with Common Core
Student Readiness for Postsecondary Education and the Workforce: WHY we teach
Common Core State Standards provide a vision of excellence for WHAT we teach
TEAM provides a vision of excellence for HOW we teach
We aim to be the fastest improving state in the nation by 2015
We will measure our success by our progress on NAEP, ACT, and PARCC
*Tennessee had the largest recorded growth in NAEP history!
And we will continue to close achievement gaps as we grow overall achievement
• Growth for all students, every year
• Faster growth for those students who are furthest behind
EXPLORE and PLAN results show Tennessee making substantial growth over the last three years
[Chart: Average composite scores, 2010-2013, for EXPLORE (8th grade, 0-25 scale) and PLAN (10th grade, 0-32 scale), showing Tennessee results against the national norm.]
These gains mean thousands of additional students are performing on grade level
Nearly 91,000 additional students are at or above grade level in all math subjects now, as compared to 2010.
Nearly 52,000 additional students are at or above grade level in all science subjects, as compared to 2010.
*2011 was the baseline year for the Algebra II EOC.
Grades 3-8 (students at or above grade level)
            2010       2013
Reading     197,035    223,947
Math        152,278    225,782
Science     227,997    278,178

Grades 9-12 (students at or above grade level)
            2010       2013
English I   43,887     49,679
Algebra I   33,056     40,862
Algebra II  17,228*    27,035
Biology I   41,185     42,832
Tennessee’s gains on TCAP are substantial when compared with other states
* In 2011-12, Delaware began providing students with a second opportunity to retake its state assessment, and included in its accountability data only the higher score for any student who took the test twice. http://www.doe.k12.de.us/dcas/files/StateSumOverviewReport2012.pdf; http://news.delaware.gov/2012/06/13/state-tests-show-student-gains/
Top states in percentage point gains, 2010-11 to 2011-12
      3-8 Math       3-8 ELA        3-8 Math + ELA
 1    DE*  10.6%     DE*  12.9%     DE*  23.5%
 2    TN    6.3%     AR    6.1%     TN    8.7%
 3    NE    4.9%     HI    4.3%     NE    8.1%
 4    WV    4.3%     NV    3.7%     HI    8.0%
 5    HI    3.7%     MS    3.5%     AR    7.1%
 6    NV    3.0%     MI    3.3%     NV    6.7%
 7    MS    2.8%     CA    3.3%     MS    6.3%
 8    WA    2.7%     NE    3.2%     WA    5.5%
 9    ME    2.6%     WA    2.8%     ME    4.8%
10    AL    2.3%     TN    2.4%     CA    4.7%
Origin of the TEAM rubric
TDOE partnered with NIET to adapt their rubric for use in Tennessee.
The NIET rubric is based on research and best practices from multiple sources. In addition to the research from Charlotte Danielson and others, NIET reviewed instructional guidelines and standards developed by numerous national and state teacher standards organizations. From this information they developed a comprehensive set of standards for teacher evaluation and development.
Work that informed the NIET rubric included:
• The Interstate New Teacher Assessment and Support Consortium (INTASC)
• The National Board for Professional Teaching Standards
• Massachusetts' Principles for Effective Teaching
• California's Standards for the Teaching Profession
• Connecticut's Beginning Educator Support Program, and
• The New Teacher Center's Developmental Continuum of Teacher Abilities.
Components of Evaluation: Tested Grades and Subjects
Qualitative includes:
• Observations in planning, environment, and instruction
• Professionalism rubric
Quantitative includes:
• Growth measure: TVAAS or comparable measure
• Achievement measure: goal set by teacher and evaluator
Qualitative: 50%
Growth measure: 35%
Achievement measure: 15%
Components of Evaluation: Non-tested Grades and Subjects
Qualitative includes:
• Observations in planning, environment, and instruction
• Professionalism rubric
Quantitative includes:
• Growth measure: TVAAS or comparable measure
• Achievement measure: goal set by teacher and evaluator
Qualitative: 60%
Growth measure: 25%
Achievement measure: 15%
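As an illustration of how these weights combine, here is a hypothetical sketch. It is not part of the official TEAM scoring tables, and it assumes all three components are reported on a common 1-5 scale; the `overall_score` function name and the sample scores are invented for this example.

```python
# Illustrative only: weighted composite of evaluation components,
# assuming each component score is on a common 1-5 scale.

def overall_score(qualitative, growth, achievement, tested=True):
    """Weighted composite: 50/35/15 for tested grades/subjects,
    60/25/15 for non-tested grades/subjects."""
    weights = (0.50, 0.35, 0.15) if tested else (0.60, 0.25, 0.15)
    components = (qualitative, growth, achievement)
    return sum(w * s for w, s in zip(weights, components))

# A hypothetical teacher scoring 4 on observations, 3 on growth,
# and 5 on the achievement measure:
print(overall_score(4, 3, 5, tested=True))   # 0.50*4 + 0.35*3 + 0.15*5
print(overall_score(4, 3, 5, tested=False))  # 0.60*4 + 0.25*3 + 0.15*5
```

Note how the same component scores yield a slightly different composite under the two weightings, because the non-tested configuration shifts weight from growth to the qualitative component.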
Rubrics
General Educator
Library Media Specialist
School Services Personnel
School Audiologist PreK-12
School Counselor PreK-12
School Social Worker PreK-12
School Psychologist PreK-12
Speech/Language Therapist
May be used at the discretion of the LEA for other educators who do not have direct instructional contact with students, such as instructional coaches who work with teachers.
Domains
Instruction
Environment
Planning
Professionalism
Planning Domain
Instructional Plans
Student Work
Assessment
Environment Domain
Managing Student Behavior
Expectations
Environment
Respectful Culture
Professionalism Domain
Professional Growth and Learning
Use of Data
School and Community Involvement
Leadership
Instruction Domain
Standards & Objectives
Motivating Students
Presenting Instructional Content
Lesson Structure & Pacing
Activities & Materials
Questioning
Academic Feedback
Grouping Students
Teacher Content Knowledge
Teacher Knowledge of Students
Thinking
Problem-Solving
Evaluation Process
Initial Coaching Conversation
• Required for teachers who received an overall effectiveness rating or individual growth score of 1 in the previous year
Pre-Conference
Classroom Visit
Post-Conference
Professionalism Scoring
Summative Conference
Repeat as needed depending on the number of required observations
Coaching Conversations (Video)
Suggested Pacing
Observation Guidance
Coaching Conversation
• A targeted conversation with teachers who scored a 1 on their overall evaluation or individual growth measure, covering the number of required observations and the supports they will receive this year to improve student achievement.
Observing Multiple Domains During One Classroom Visit
• Districts may choose to observe the instruction domain during the same classroom visit as either the planning domain or the environment domain.
Announced vs. Unannounced Visits
• At least half of domains observed must be unannounced, but it is at the district's discretion to have more than half of domains observed unannounced.
Growth Measure Overview
State law currently requires value-added (or a comparable growth measure) to count as 35 percent of the total evaluation score.
For teachers in state tested grades/subjects, the 35 percent growth component is their individual TVAAS score.
For teachers in districts that have opted-into the portfolio growth models, their portfolio score will count as 35 percent.
For teachers without an individual growth measure, this will be a school-, district-, or state-wide TVAAS score that comprises 25 percent.
Additional measures for non-tested grades/subjects are in development.
Growth vs. Achievement
Growth measures progress from a baseline
Ex. John grew faster than we would have expected this year based on his testing history.
Achievement measures proficiency
Ex. John scored a 98 percent on his test.
15 Percent Achievement Measure
The 15 percent measure is based on a yearly goal set by the educator and his/her evaluator that is measured by current year data.
To make the 15% meaningful, the evaluator and educator work together to identify a measure.
• If there is a disagreement between the educator and the evaluator, the educator’s decision stands.
The selection and goal-setting process involves determining which measure most closely aligns to the educator’s job responsibilities and the school’s goals.
Process of Selecting the 15 Percent Measure
Selection of the measure should not focus on:
• Measures that are not directly impacted by the subject area taught. (Example: a teacher using a social studies measure who does not teach social studies.)
• Measures that are unrelated to the role of the educator.
Framing Questions (Activity)
• Why do we believe that teacher evaluations are important?
• What should be accomplished by teacher evaluations?
• What beliefs provide a foundation for an effective evaluation?
Core Beliefs
We all have room to improve. Our work has a direct impact on the opportunities and future of our students. We must take seriously the importance of honestly assessing our effectiveness and challenging each other to get better.
The rubric is designed to present a rigorous vision of excellent instruction so every teacher can see areas where he/she can improve. The focus of observation should be on student and teacher actions because that interaction is where learning occurs.
Core Beliefs (Continued)
We score lessons, not people. As you use the rubric during an observation, remember it is not a checklist. Observers should look for the preponderance of evidence based on the interaction between students and teacher.
Every lesson has strengths and areas that can be improved. Each scored lesson is one factor in a multi-faceted evaluation model designed to provide a holistic view of teacher effectiveness.
As evaluators, we also have room to improve. Observing teachers provides specific evidence that should inform decisions about professional development. Connecting teachers for coaching in specific areas of instruction is often the most accessible and meaningful professional development we can offer.
Materials Walk
Chapter 2: Diving into the Rubric
Evaluator Expectations
Initially, evaluators aren’t expected to be perfectly fluent in the TEAM rubric.
The rubric is not a checklist of teacher behaviors. It is used holistically.
Just being exposed to the rubric is not sufficient for full fluency.
Fully fluent use of the rubric means using student actions and discussions to analyze the qualitative effects of teacher practice on student learning.
We’ll learn how to use it together through practice.
The Value of Practice
To use the rubric effectively, each person has to develop his/her skills to analyze and assess each indicator in practical application.
Understanding and expertise will increase through exposure and engagement in simulated or practice episodes.
This practice will refine the evaluator's understanding and strengthen his/her skills as an evaluator.
Placemat Consensus (Activity)
2 minutes to write individually
3 minutes to talk and reach consensus
5 minutes to debrief
Participant A
Participant B
Participant D
Participant C
Consensus Elements
QUESTION: When you walk out of a classroom lesson that you believe to be effective, what were the elements that led you to that decision?
Effective Elements Summary
Defined daily objective that is clearly communicated to students
Student engagement and interaction
Alignment of activities and materials throughout lesson
Rigorous student work, citing evidence and using complex texts
Student relevancy
Numerous checks for mastery
Differentiation
TEAM Rubric
TDOE has worked with NIET to define a set of professional indicators, known as the Instructional Rubrics, to measure teaching skills, knowledge, and responsibilities of the teachers in a school.
Instruction: Standards and Objectives

Significantly Above Expectations (5)
• All learning objectives are clearly and explicitly communicated, connected to state standards and referenced throughout lesson.
• Sub-objectives are aligned and logically sequenced to the lesson's major objective.
• Learning objectives are: (a) consistently connected to what students have previously learned, (b) know from life experiences, and (c) integrated with other disciplines.
• Expectations for student performance are clear, demanding, and high.
• There is evidence that most students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.

At Expectations (3)
• Most learning objectives are communicated, connected to state standards and referenced throughout lesson.
• Sub-objectives are mostly aligned to the lesson's major objective.
• Learning objectives are connected to what students have previously learned.
• Expectations for student performance are clear.
• There is evidence that most students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.

Significantly Below Expectations (1)
• Few learning objectives are communicated, connected to state standards and referenced throughout lesson.
• Sub-objectives are inconsistently aligned to the lesson's major objective.
• Learning objectives are rarely connected to what students have previously learned.
• Expectations for student performance are vague.
• There is evidence that few students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.
The Parts of the Rubric: Domains
The Parts of the Rubric: Indicators
The Parts of the Rubric: Descriptors
The Parts of the Rubric: Performance Levels
What Is the Process of Modeling Your Thinking? (Think-Aloud)
I do — Think Aloud: Teacher models thinking, revealing his/her metacognition.
We do — Scaffold & Cue: Students work in partners or groups applying thinking, with teacher monitoring and supporting.
You do — Students Explain Thinking: Students demonstrate mastery and explain their thinking.
Standards and Objectives
Instructional Domain (Activity)
Directions:
Highlight key words from the descriptors under the "At Expectations" column for the remaining indicators with your shoulder partner. You will have 15 minutes to complete this.
Reflection Questions (Activity)
How is the rubric interconnected? (What threads do you see throughout the indicators?)
Where do you see overlap?
Questioning and Academic Feedback (Activity)
The Questioning and Academic Feedback indicators are closely connected with each other.
With a partner, look closely at these two indicators and discuss how you think they are linked.
What does this mean for your observation of these two indicators?
Thinking and Problem-Solving (Activity)
The Thinking and Problem-Solving indicators are closely connected with each other.
With a partner, look closely at these two indicators and discuss how you think they are linked.
What does this new learning mean for your observation of these indicators?
58
The Thinking and Problem-Solving Link
59
Thinking Problem-Solving
Process Product
Instructional Rubric Activity
Directions:
Place 1 post-it on the t-chart with the indicator that you currently perceive as a strength for your teachers.
Place 1 post-it on the t-chart with the indicator that you perceive your teachers will find most challenging.
60
Common Core Connections
Standards and Objectives
• How does the evaluation rubric assess lessons that span multiple days?
Presenting Instructional Content
• Will student-led discussions hurt my evaluation score because I’m not talking very much? What if I don’t directly present content?
Academic Feedback
• Will staying neutral during student-led conversations hurt my score on Academic Feedback? Will I get marked down for not providing positive reinforcement?
61
RTI2
RTI2 Tier I instruction is synonymous with effective, differentiated instruction.
Effective observation of RTI2 Tier II and Tier III contexts requires a strong understanding of holistic scoring.
• Ex. Look at the grouping indicator. Which descriptors would you expect to see in the RTI context? Which descriptor may not be relevant?
Be intentional about using professional judgment to determine when it is appropriate to observe an educator in an intervention setting.
• Ex. Is a regular classroom teacher facilitating computer-based intervention rather than delivering instruction today? It may be appropriate to treat this similarly to walking in on an assessment.
Be intentional about using professional judgment to determine which rubric is the most appropriate for an educator.
• Ex. An interventionist whose sole responsibility is to facilitate computer-based intervention may be evaluated using the SSP rubric if they are consistently delivering services rather than instruction.
62
Chapter 3: Collecting Evidence
63
When do you collect evidence?
Pre-conference (Announced only)
Review of lesson plan as applicable
64
What the teacher says and does
What the students say and do
Ask clarifying questions if needed (before the post-conference)
Ex. What thought process did you use to group your students?
Collecting Evidence is Essential
Detailed Collection of Evidence:
Unbiased notes about what occurs during a classroom lesson.
Capture:
• What the students say
• What the students do
• What the teacher says
• What the teacher does
Copy wording from visuals used during the lesson.
Record time segments of lesson.
Remember that using the rubric as a checklist will not capture the quality of student learning.
65
The collection of detailed evidence is ESSENTIAL for the evaluation process to be implemented accurately, fairly, and for the intended purpose of the process.
Evidence Collecting Tips
During the lesson:
66
1. Monitor and record time
2. Use short-hand as appropriate for you
3. Pay special attention to questions and feedback
4. Record key evidence verbatim
5. Circulate without disrupting
6. Focus on what students are saying and doing, not just the teacher
Sample Evidence Collection Notes
67
Sample Evidence Collection Notes
68
Teacher
Observing Classroom Instruction
• We will view a lesson and gather evidence.
• After viewing the lesson, we will categorize evidence and assign scores in the Instruction domain.
• In order to categorize evidence and assign scores, what will you need to do as you watch the lesson?
• Capture what the students and teacher say and do.
• Remember that the rubric is NOT a checklist!
69
Questions to ask yourself to determine whether or not a lesson is effective:
What did the teacher teach?
What did the students and teacher do to work toward mastery?
What did the students learn, and how do we know?
70
Video #1
71
Evaluation of Classroom Instruction
Reflect on the lesson you just viewed and the evidence you collected.
Based on the evidence, do you view this teacher’s instruction as Above Expectations, At Expectations, or Below Expectations?
• Thumbs up: Above Expectations
• Thumbs down: Below Expectations
• In the middle: At Expectations
72
Categorizing Evidence and Assigning Scores (Activity)
• Using the template provided (pgs. 3-5), work with a partner to categorize evidence for the indicators of the Instruction domain.
• Participants will explain their thinking and share any follow-up questions they would ask the teacher.
• Next, assign scores together, using the evidence and the rubric to justify your scores.
73
Did you remember to ask yourself these questions?
What did the teacher teach?
What did the students and teacher do to work toward mastery?
What did the students learn, and how do we know?
74
Descriptors from the Planning Domain (Think-Aloud)
75
Trainer models using the Planning domain and the chart to identify key words for the first indicator.
• Indicator
• Key Words
Planning—Instructional Plans
76
Significantly Above Expectations (5)
Instructional plans include:
• measurable and explicit goals aligned to state content standards;
• activities, materials, and assessments that:
– are aligned to state standards.
– are sequenced from basic to complex.
– build on prior student knowledge, are relevant to students’ lives, and integrate other disciplines.
– provide appropriate time for student work, student reflection, and lesson and unit closure;
• evidence that the plan is appropriate for the age, knowledge, and interests of all learners; and
• evidence that the plan provides regular opportunities to accommodate individual student needs.
At Expectations (3)
Instructional plans include:
• goals aligned to state content standards;
• activities, materials, and assessments that:
– are aligned to state standards.
– are sequenced from basic to complex.
– build on prior student knowledge.
– provide appropriate time for student work, and lesson and unit closure;
• evidence that the plan is appropriate for the age, knowledge, and interests of most learners; and
• evidence that the plan provides some opportunities to accommodate individual student needs.
Significantly Below Expectations (1)
Instructional plans include:
• few goals aligned to state content standards;
• activities, materials, and assessments that:
– are rarely aligned to state standards.
– are rarely logically sequenced.
– rarely build on prior student knowledge.
– inconsistently provide time for student work, and lesson and unit closure;
• little evidence that the plan provides some opportunities to accommodate individual student needs.
Descriptors from the Planning Domain
81
Identify and highlight key words for the remaining two indicators in the Planning domain.
• Indicator
• Key Words
Planning—Student Work
82
Significantly Above Expectations (5)
Assignments require students to:
• organize, interpret, analyze, synthesize, and evaluate information rather than reproduce it;
• draw conclusions, make generalizations, and produce arguments that are supported through extended writing; and
• connect what they are learning to experiences, observations, feelings, or situations significant in their daily lives both inside and outside of school.
At Expectations (3)
Assignments require students to:
• interpret information rather than reproduce it;
• draw conclusions and support them through writing; and
• connect what they are learning to prior learning and some life experiences.
Significantly Below Expectations (1)
Assignments require students to:
• mostly reproduce information;
• rarely draw conclusions and support them through writing; and
• rarely connect what they are learning to prior learning or life experiences.
Planning—Assessment
83
Significantly Above Expectations (5)
Assessment plans:
• are aligned with state content standards;
• have clear measurement criteria;
• measure student performance in more than three ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test);
• require extended written tasks;
• are portfolio-based with clear illustrations of student progress toward state content standards; and
• include descriptions of how assessment results will be used to inform future instruction.
At Expectations (3)
Assessment plans:
• are aligned with state content standards;
• have measurement criteria;
• measure student performance in more than two ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test);
• require written tasks; and
• include performance checks throughout the school year.
Significantly Below Expectations (1)
Assessment plans:
• are rarely aligned with state content standards;
• have ambiguous measurement criteria;
• measure student performance in less than two ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); and
• include performance checks, although the purpose of these checks is not clear.
Guidance on Planning Observations
The spirit of the Planning domain is to assess how a teacher plans a lesson that results in effective classroom instruction for students.
Specific requirements for the lesson plan itself are entirely a district and/or school decision.
Unannounced planning observations
• Simply collect the lesson plan after the lesson.
Evaluators should not accept lesson plans that are excessive in length and/or that only serve an evaluative rather than an instructional purpose.
84
Making Connections: Instruction and Planning (Activity)
• Review indicators and descriptors from the Planning domain to identify connecting or overlapping descriptors from the Instruction domain.
• With a partner, discuss the connections between the Instruction domain and the Planning domain.
• With your table group, discuss how these connections will inform the scoring of the Planning domain and why.
• Be ready to share out.
85
Chapter 4: Pre-Conferences
86
Planning for a Pre-Conference (Activity)
Evaluators often rely too heavily on physical lesson plans to assess the Planning domain.
• This should not dissuade evaluators from reviewing physical lesson plans.
Use the following guiding questions:
• What do you want students to know and be able to do?
• What will the students and teacher be doing to show progress toward the objective?
• How do you know if they got there?
What are some additional questions you would need to ask to understand how a teacher planned to execute a lesson?
How would these questions impact the planning of a pre-conference with the teacher?
87
Viewing a Pre-Conference
When viewing the pre-conference:
• What are the questions the conference leader asks?
• How do our questions compare to the ones asked?
88
Pre-Conference Video
89
Pre-Conference Reflection (Activity)
• What questions did the conference leader ask?
• How did these compare to the ones you would have asked?
• What questions do you still have?
90
Environment Domain
91
Environment and Instruction Connections
92
Environment: Expectations indicator
• Teacher sets high and demanding expectations for every student.
• Teacher encourages students to learn from mistakes.
• Teacher creates learning opportunities where most students can experience success.
• Students complete work according to teacher expectations.
Connected Instruction descriptors
• S/O: Expectations for student performance are clear. There is evidence that most students demonstrate mastery of the objective.
• PIC: Presentation of content includes modeling by the teacher to demonstrate performance expectations.
• AM: Activities and materials are challenging.
• Q: Questions sometimes require active responses.
• AF: Feedback from students is used to monitor and adjust instruction.
• TKS: Teacher sometimes provides differentiated instructional methods and content to ensure children have the opportunity to master what is being taught.
Environment and Instruction Connections (Activity)
With a partner (5 min.)
• Make connections between the Instruction domain and the Managing Student Behavior indicator in the Environment domain.
Individually (10 min.)
• Make connections between the Instruction domain and the Environment and Respectful Culture indicators in the Environment domain.
93
Video #2
94
Evaluation of Classroom Instruction
• Reflect on the lesson you just viewed and the evidence you collected.
• Based on the evidence, do you view this teacher’s instruction as Above Expectations, At Expectations, or Below Expectations?
• Thumbs up: Above Expectations
• Thumbs down: Below Expectations
• In the middle: At Expectations
95
Next Steps
Hold on to your evidence and make sure you bring it with you tomorrow.
Optional Homework:
• Try labeling your evidence with the short-hand we discussed today.
• List any follow-up questions you would need to ask the teacher.
We will score this lesson tomorrow based on the evidence you collected today.
96
Wrap-up for Today
As we reflect on our work today, please use two post-it notes to record the following:
• One “Ah-ha!” moment
• One “Oh no!” moment
• Please post to the chart paper
Expectations for tomorrow:
• We will continue to collect and categorize evidence and have a post-conference conversation
97
This Concludes Day 1. Thank you for your participation!
98
Welcome to Day 2!
99
Expectations
To prevent distracting yourself or others, please put away all cellphones, iPads, and other electronic devices.
There will be time during breaks and lunch to use these devices as needed.
100
Day 2 Objectives
Participants will:
• Continue to build understanding of the importance of collecting evidence to accurately assess classroom instruction.
• Understand the quantitative portion of the evaluation.
• Identify the critical elements of summative conferences.
• Become familiar with data system and websites.
101
Norms
Keep your focus and decision-making centered on students and educators.
Be present and engaged.
• Limit distractions and sidebar conversations.
• If urgent matters come up, please step outside.
Challenge with respect, and respect all.
• Disagreement can be a healthy part of learning!
Be solutions-oriented.
• For the good of the group, look for the possible.
Risk productive struggle.
• This is a safe space to get out of your comfort zone.
Agenda: Day 2
103
Day Components
Day Two
• Post-Conferences
• Professionalism Rubric
• Alternate Rubrics
• Quantitative Measures
• Closing out the Year
Evidence and Scores
Remember:
In order to accurately score any of the indicators, you need to have sufficient and appropriate evidence captured and categorized.
Evidence is not simply restating the rubric.
Evidence is:
• What the students say
• What the students do
• What the teacher says
• What the teacher does
104
Evidence and Scores
Which of these is an example of evidence?
Activities and Materials
A. Students used the computer program Kidspiration to develop a Venn Diagram using the two read-alouds as the basis for their comparisons.
OR
B. The activities and materials incorporated multimedia and technology.
105
Categorizing Evidence and Assigning Scores
Using the template provided (pgs. 3-5), you will categorize evidence and assign scores for the Instruction domain.
Using the template provided, you will also categorize evidence collected and assign scores on the Environment domain.
106
Note: Please do this independently without talking.
Early Finishers
Read the page on Post-Conferences in your Supplemental Materials (pg. 9).
107
Consensus Scoring (Activity)
Work with your shoulder partner to come to consensus regarding all indicator scores.
Work with your table group to come to consensus regarding all indicator scores.
108
Chapter 5: Post-Conferences
109
Post-Conference Round Table (Activity)
What is the purpose of a post-conference?
As a classroom teacher, what do you want from a post-conference?
As a classroom teacher, what don’t you want from a post-conference?
As an evaluator, what do you want from a post-conference?
As an evaluator, what don’t you want from a post-conference?
110
Characteristics of an Ideal Post-Conference
Teacher did a lot of the talking
Teacher reflected on strengths and areas for improvement
Teacher actively sought help to improve
A professional dialogue about student-centered instruction
Collaboration centered on improvement
Discussion about student learning
More asking, less telling
111
Parts of the Post-Conference
Introduction
• Greeting, purpose, time, and general impression question
Reinforcement (area of relative strength)
• Ask self-analysis question
• Provide evidence from notes
• Identify potential opportunities for sharing this strength
– Ex. Peer partnership, sharing at a faculty meeting or PLC, etc.
Refinement (area of relative improvement)
• Ask self-analysis question
• Provide evidence from notes
• Give a recommendation for actionable next steps
• Give a definite follow-up timeline
Share Scores
112
Developing Coaching Questions
Questions should be open-ended.
Questions should ask teachers to reflect on practice and student learning.
Questions should align to rubric and be grounded in evidence.
Questions should model the type of questioning you would expect to see between teachers and students.
• i.e. open-ended, higher-order, reflective
113
Examples of Coaching Questions
114
Questions that clarify goals:
• What kind of background information did students need to have?
• What did you want students to learn or be able to do?
• How did you decide what you wanted to teach?
Questions that gauge success of the lesson:
• How were you assessing the students during the lesson?
• What were you looking for or listening for to determine if students were able to master the objective?
Examples of Coaching Questions
115
Questions that anticipate approaches:
• What problems did you anticipate students would have mastering this objective?
• Tell me about activities you planned and how they supported the objective.
Questions that reflect on the students:
• Who was successful with this lesson?
• What were you able to do to help them be successful?
• Who struggled with this lesson?
• Why do you think they struggled?
Examples of Coaching Questions
116
Questions that summarize and recall details:
• What do you think went well during the lesson?
• How do you know that?
• What evidence did you see that…?
• Why is that important?
Questions that analyze causal factors:
• What do you think caused…?
• What impact do you think that had on…?
• What was different between what you envisioned and what happened?
• Why do you think those differences occurred?
Examples of Coaching Questions
117
Questions that construct new learning/application:
• What do you want to be mindful of from now on?
• How might this affect student learning?
• How else might this look in your class?
Examples of Coaching Questions
118
Questions that commit to application:
• How do you plan to apply what we have talked about?
• What can you do to maintain this new focus?
Questions that reflect on the process:
• As you reflect on this conversation, how has it supported your learning?
• How might what we talked about impact your thinking on (a specific indicator)?
Selecting Areas of Reinforcement and Refinement
Remember:
Choose the areas that will give you the “biggest bang for your buck”.
Do not choose an area of refinement that would overlap your area of reinforcement, or vice-versa.
Choose areas for which you have specific and sufficient evidence.
119
Identify Examples: Reinforcement
Identify specific examples from your evidence notes of the area being reinforced. Examples should contain exact quotes from the lesson or vivid descriptions of actions taken.
For example, if your area of reinforcement is academic feedback, you might highlight the following:
• In your opening, you adjusted instruction by giving specific academic feedback.
• “You counted the sides to decide if this was a triangle. I think you missed a side when you were counting. Let’s try again,” instead of just saying “Try again”.
120
Identify Examples: Refinement
Identify specific examples from your evidence notes of the area being refined. Examples should contain exact quotes from the lesson or vivid descriptions of actions taken.
For example, if your area of refinement is questioning, you might highlight the following:
• Throughout your lesson you asked numerous questions, but they all remained at the ‘remember level’.
– Ex. “Is this a triangle?” instead of “How do you know this is a triangle?”
• Additionally, you only provided wait time for three of the six questions you asked.
121
Post-Conference Video
122
Post-Conference Debrief (Activity)
Discuss with your table group parts of the post-conference that were effective and the reasons why.
Discuss with your table group at least one way the evaluator could improve and why.
Be ready to share with the group.
123
Last Practice…
This is the third and final practice video during our training.
You will watch the lesson, collect evidence, categorize the evidence, and score the instructional indicators.
Requirements for certification:
• No indicator scored +/- 3 away
• No more than two indicators scored +/- 2 away
• Average of the twelve indicators must be within +/- .90
124
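The certification thresholds above can be expressed as a quick self-check. This is a hypothetical illustration, not an official TEAM scoring tool; it assumes each of the twelve Instruction indicator scores is compared with the corresponding national rater's score, and that an indicator scored 3 or more away also counts toward the "2 away" tally.

```python
def meets_certification(my_scores, rater_scores):
    """Check trainee scores against national raters' scores (12 indicators, 1-5 scale)."""
    diffs = [abs(mine - theirs) for mine, theirs in zip(my_scores, rater_scores)]
    # No indicator may be scored 3 or more points away.
    no_indicator_off_by_3 = all(d < 3 for d in diffs)
    # No more than two indicators may be scored 2 or more points away.
    at_most_two_off_by_2 = sum(d >= 2 for d in diffs) <= 2
    # The average of the twelve indicators must be within +/- 0.90.
    avg_within_tolerance = abs(sum(my_scores) / 12 - sum(rater_scores) / 12) <= 0.90
    return no_indicator_off_by_3 and at_most_two_off_by_2 and avg_within_tolerance
```

For example, matching the raters exactly passes, while scoring any single indicator 3 points away, or three indicators 2 points away, fails.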
Video #3
125
Categorizing Evidence and Assigning Scores (Activity)
Work independently to categorize evidence for all 12 Instruction indicators.
After you have categorized evidence, assign scores for each indicator. Are there clarifying questions you would ask the teacher prior to your post-conference?
When you have finished, you may check with a trainer to compare your scores with those of the national raters.
126
Writing Your Post-Conference Plan (Activity)
On the sheet provided (pg. 16), write your:
Area of reinforcement (relative strength)
Self-reflection question
Evidence from lesson
127
Writing Your Post-Conference Plan (Activity)
On the sheet provided (pg. 17), write your:
Area of refinement
Self-reflection question
Evidence from lesson
Recommendation to improve
128
Role Play (Activity)
With a shoulder partner, take turns leading the post-conference you’ve developed.
Participants should be cooperative the first time to practice developing and asking open-ended, reflective questions in the moment.
Participants can be more creative in the role they assume the second time to practice different feedback strategies.
129
Reflect and Debrief (Activity)
Discuss with your table
What were some strengths of these conferences?
What are some struggles with these conferences?
Why is questioning more effective than simply “telling” in a post-conference?
Tables share something discussed with whole group.
130
Whole Group Debrief (Activity)
• Share some examples of what can be said and done and what should be avoided in the post-conference.
• How did this experience help you as a learner?
• How and why is this powerful for student learning?
• Scores are shared at the end of the conference. Why is it appropriate to wait until the end of the conference to do this?
131
Chapter 6: Professionalism
132
Professionalism Form
Form applies to all teachers
Completed within last six weeks of school year
Based on activities from the full year
Discussed with the teacher in a conference
133
Professionalism Rubric
134
Professionalism Rubric (Continued)
135
Rubric Activity
With a partner (15 min.)
• Identify the main differences between the performance levels for each indicator.
• What would that look like in reality?
• List examples of evidence that could be used to score each indicator.
136
Chapter 7: Alternate Rubrics
137
Reflection on this Year
It is important to maintain high standards of excellence for all educator groups.
• School Services Personnel: Overall Average of 4.29
• Library Media Specialists: Overall Average of 4.06
• General Educators: Overall Average of 3.78
As you can see, scoring among these educator groups is somewhat higher than what we have seen overall. As evaluators, why do you think that is the case?
138
When to Use an Alternate Rubric
If there is a compelling reason not to use the general educator rubric, you should use one of the alternate rubrics.
• Ex. If the bulk of an educator’s time is spent on delivery of services rather than delivery of instruction, you should use an alternate rubric.
If it is unclear which rubric to use, consult with the teacher.
When evaluating interventionists, pay special attention to whether or not they are delivering services or instruction.
139
Pre-Conferences for Alternate Rubrics
140
For the Evaluator
Discuss targeted domain(s)
Evidence the educator is expected to provide and/or a description of the setting to be observed
Roles and responsibilities of the educator
Discuss job responsibilities
For the Educator
Provide the evaluator with additional context and information
Understand evaluator expectations and next steps
Library Media Specialist Rubric
Similar to General Educator Rubric
Professionalism: same at the descriptor level
Environment: same at the descriptor level
Instruction: similar indicators, some different descriptors
Planning: specific to duties (most different)
141
Educator groups using the SSP rubric
142
Audiologists
Counselors
Social Workers
School Psychologists
Speech/Language Pathologists
Additional educator groups, at district discretion, without primary responsibility of instruction
Ex. instructional and graduation coaches, case managers
SSP Observation Overview
143
All announced
Conversation and/or observation of delivery
Suggested observation
10-15 minute delivery of services (when possible)
20-30 minute meeting
Professional License:
Minimum 2 classroom visits
Minimum 60 total contact minutes
Apprentice License:
Minimum 4 classroom visits
Minimum 90 total contact minutes
SSP Planning
144
Planning indicators should be evaluated based on yearly plans
Scope of work
Analysis of work products
Evaluation of services/program – Assessment
When observing planning on two separate occasions:
the first visit is to review the plan
the second visit is to confirm the plan was implemented
SSP Delivery of Services
Keep in mind that the evidence collected may differ from the evidence collected under the General Educator Rubric.
Some examples might be:
Surveys of stakeholders
Evaluations by stakeholders
Interest inventories
Discipline/attendance reports or rates
Progress to IEP goals
145
SSP Environment
Indicators are the same
Descriptors are very similar to general educator rubric
Environment for SSP
May be applied to work space (as opposed to classroom) and interactions with students as well as parents, community and other stakeholders.
146
Observation Guidance Documents
Educator groups convened by TDOE to provide additional information for evaluators to inform evaluation using SSP rubric
Observation guidance documents were created for the following educator groups:
147
General Educator Rubric: Early Childhood; Special Education; Career and Technical Education (CTE); Online Teaching; Alternative Educators
School Services Personnel Rubric: School Counselors; School Audiologists; Speech/Language Pathologists (SLP); School Social Workers (SSW); Vision Specialists; School Psychologists
Key Takeaways
Evaluating educators using the alternate rubrics:
• Planning should be based on an annual plan, not a lesson plan.
• Data used may be different than classroom teacher data.
• The job description and role of the educator should be the basis for evaluation.
• Educators who spend the bulk of their time delivering services rather than instruction should be evaluated using an alternate rubric.
• It is important to maintain high standards for all educator groups.
148
Chapter 8: Quantitative Measures
149
Components of Evaluation: Tested Grades and Subjects
Qualitative includes:
Observations in planning, environment, and instruction
Professionalism rubric
Quantitative includes:
Growth measure
TVAAS or comparable measure
Achievement measure
Goal set by teacher and evaluator
150
Qualitative: 50%
Growth Measure: 35%
Achievement Measure: 15%
Components of Evaluation: Non-tested Grades and Subjects
Qualitative includes:
Observations in planning, environment, and instruction
Professionalism rubric
Quantitative includes:
Growth measure
TVAAS or comparable measure
Achievement measure
Goal set by teacher and evaluator
151
Qualitative: 60%
Growth Measure: 25%
Achievement Measure: 15%
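The weighted splits above can be sketched as a simple calculation. This is a minimal illustration, not the state's official computation: the 50/35/15 and 60/25/15 weights come from the slides, while the function name, the 1-5 score scale for each component, and the absence of any rounding rules are assumptions.

```python
# Hypothetical sketch of combining evaluation components into a composite.
# Weights are from the slides; the 1-5 component scale is an assumption.

def composite_score(qualitative, growth, achievement, tested=True):
    """Weighted average of component scores (each assumed to be on a 1-5 scale)."""
    if tested:
        wq, wg, wa = 0.50, 0.35, 0.15   # tested grades and subjects
    else:
        wq, wg, wa = 0.60, 0.25, 0.15   # non-tested grades and subjects
    return wq * qualitative + wg * growth + wa * achievement

# Example: observation average 4.0, growth score 3, achievement score 5
print(composite_score(4.0, 3, 5, tested=True))   # 0.50*4 + 0.35*3 + 0.15*5, about 3.8
```

For a non-tested educator with the same component scores, the heavier qualitative weight shifts the composite slightly (to about 3.9), which is why the tested/non-tested distinction matters at the beginning-of-year conference.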
Growth Overview
State law requires value-added (or a comparable growth measure) to count as 35 percent of the total evaluation score for teachers in tested grades and subjects.
State law requires value-added to count as 25 percent of the total evaluation score for teachers in non-tested grades and subjects.
Any additional changes in the requirement of 35 percent counting as value-added would require legislative action.
Additional measures for non-tested grades/subjects.
152
Growth vs. Achievement
Growth measures progress from a baseline.
Ex. John grew faster than we would have predicted based on his testing history.
Achievement measures proficiency.
Ex. John scored a 98 percent on his test.
A link to a video series about TVAAS as well as some additional guidance documents can be found here: http://team-tn.org/evaluation/tvaas/
153
Tested Grades/Areas
154
• Includes subjects currently taught
• 3 year trend scores, where available
• Any educator with an individual score has to use it
Individual Value-Added Score
• All individual value-added scores will be directly imported into the data system by the state.
• All educators, including those who anticipate earning an individual growth score, must select a school-wide option
Data System
• Scores are returned by June 15th
Timeline
Non-tested Grades/Areas
155
• 4 composite options: overall, literacy, numeracy, and literacy + numeracy
• 1 year score
• TCAP specific, SAT 10 specific and CTE Concentrator
School-Wide Value-Added
Score
• Evaluators must select which composite to use
• All educators, including those who anticipate earning an individual growth score, must select a school-wide option
• Scores will be imported into the data system by the state
Data System
• Scores will be returned in mid-late June
Timeline
Districts will determine which composite a non-tested educator will use
156
Subject: Recommended Composite
Academic Interventionists: Overall, Literacy, Math, or Math/Literacy
Computer Technology: Overall
CTE: CTE Concentrator/Student (where available)
ELL: Overall, Literacy
Fine Arts: Fine Arts Portfolio (in participating districts), Overall, Literacy
Health-Wellness and PE: Overall
HS Core Non-Tested: Overall, Literacy, Math, or Math/Literacy
Library Media Specialists: Overall, Literacy
SPED: Overall, Literacy, Math, or Math/Literacy
School Services Providers: Overall, Literacy, Math, or Math/Literacy
World Languages: Overall or Literacy
Early Grades: Overall or Math/Literacy (from feeder schools)
15 Percent Achievement Measure Overview
The 15 percent achievement measure is a yearly goal set by the educator and his/her evaluator that is based on current year data.
157
Spirit and Process of the 15 Percent Measure
Relationship to core beliefs
• If our focus is on improving the lives of students, then we have to approach the selection of the measure with that in mind.
To make the 15 percent selection meaningful, the evaluator and educator work together to identify a measure.
• If there is a disagreement between the educator and the evaluator, the educator’s decision stands.
The process should involve determining which measure most closely aligns to the educator’s job responsibilities and the school’s goals.
158
Process of Selecting the 15 Percent Measure
Things to watch for:
Measures that are unrelated to a teacher’s instructional responsibilities.
Ex. A 5th grade math teacher choosing 4th grade social studies.
Scales that are not rigorous or reflective of expectations.
Ex. If last year’s P/A% for Social Studies was 90 percent, the scale below would not be rigorous or reflective of expectations.
159
Score 1: 0-20%
Score 2: 20-40%
Score 3: 40-60%
Score 4: 60-80%
Score 5: 80-100%
Spirit of Scaling the 15% Measure
Scales should be determined with the following spirit in mind:
160
Score: Equivalent Scale
1: 0-½ years of growth
2: ½-1 years of growth
3: 1-1½ years of growth
4: 1½-2 years of growth
5: 2+ years of growth
Scales are not standardized across all teachers at a school: each teacher starts from a different baseline, and the set of students and context should inform the goal.
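The equivalent scale above amounts to a simple lookup from measured growth to a 1-5 score. The sketch below is hypothetical: the cut points come from the slide, but the function name and the handling of values exactly at a boundary are assumptions.

```python
# Hypothetical mapping from years of growth to the 1-5 equivalent scale.
# Cut points are from the slide; boundary handling is an assumption
# (a value exactly at a cut point is placed in the higher band).

def growth_to_score(years):
    """Return the 1-5 score for a measured number of years of growth."""
    if years < 0.5:
        return 1      # 0 to ½ year of growth
    elif years < 1.0:
        return 2      # ½ to 1 year
    elif years < 1.5:
        return 3      # 1 to 1½ years
    elif years < 2.0:
        return 4      # 1½ to 2 years
    else:
        return 5      # 2+ years

print(growth_to_score(1.2))  # 3: between 1 and 1½ years of growth
```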
Beginning of the Year Conference
Evaluator notifies teacher which 35 percent measure will apply.
This occurs even for teachers who anticipate receiving an individual growth score. If the teacher has enough students to generate an individual score, that score will be automatically mapped in and will override the selected school-wide measure.
Evaluator and teacher choose a 15 percent measure.
Evaluator and teacher scale the 15 percent measure.
161
Chapter 9: Closing out the Year
162
End of Year Conference
Time: 15-20 minutes
Required Components:
Discussion of Professionalism scores
Share final qualitative (observation) data scores
Share final 15 percent quantitative data (if measure is available)
Let the teacher know when the overall score will be calculated
Other Components:
Commend places of progress
Focus on the places of continued need for improvement
163
End of Year Conference
Saving Time
• Have teachers review their data in the data system prior to the meeting.
• Incorporate this meeting into end-of-year wrap-up meetings that already take place at the district/school.
164
Grievance Process
Areas that can be challenged:
Fidelity of the TEAM process, as required by law
Accuracy of the TVAAS or achievement data
Observation ratings cannot be challenged.
165
Relationship Between Individual Growth and Observation
We expect to see a logical relationship between individual growth scores and observation scores.
• This is measured by the percentage of teachers who have individual growth scores three or more levels away from their observation scores.
Sometimes there will be a gap between individual growth and observation for an individual teacher (but not for every educator in your building), and that’s okay!
When we see a relationship that is not logical for many teachers within the same building, we try to find out why and provide any needed support.
School-wide growth is not a factor in this relationship.
166
Data System Reports
• The CODE data system reports allow district- and school-level administrators to access a lot of valuable data, including, but not limited to, the following:
• Comparisons of observation scores by evaluator
• Summaries of average scores
• School-wide reinforcement and refinement indicators
• You can access more information about the CODE data system here: http://team-tn.org/evaluation/data-system/
167
CODE Reports
Potential Purpose / CODE Report
Which teachers are really strong in areas where we struggle as a district?
• Observation Summary by Teacher (export)
• Educator Averages by Rubric Domain
How should we target school or district PD?
• Overall Averages by Rubric Indicator
• Refinement Goals (pie chart)
Are there patterns in the way our administrators are scoring?
• Observer Averages by Rubric Domain
• Overall Averages by Observer
How are the teachers in key AMO subject areas performing on observations?
• Overall Averages by Subject
• Subject Averages by Rubric Domain
Which teachers should be recommended for leadership opportunities and recognition?
• Overall Averages by Educator
168
Important Reminders
• We must pay more attention than ever before to evidence of student learning, i.e. “How does the lesson affect the student?”
• You are the instructional leader, and you are responsible for using your expertise, knowledge of research base, guidance, and sound judgment in the evaluation process.
• As the instructional leader, it is your responsibility to continue learning about the most current and effective instructional practices.
• When appropriate, we must have difficult conversations for the sake of our students!
170
Resources
E-mail:
Questions: [email protected]
Training: [email protected]
Websites:
NIET Best Practices Portal: Portal with hours of video and professional development resources. www.nietbestpractices.org
TEAM website: www.team-tn.org
Weekly TEAM Updates
• Email [email protected] to be added to this listserv.
• Archived versions can also be found on our website here: http://team-tn.org/resources/team-update/
171
Expectations for the Year
Please continue to communicate the expectations of the rubrics to your teachers.
If you have questions about the rubrics, please ask your district personnel or send your question(s) to [email protected].
You must pass the certification test before you begin any teacher observations.
• Conducting observations without passing the certification test is a grievable offense and will invalidate observations.
• Violation of this policy will negatively impact administrator evaluation scores.
172
Immediate Next Steps
MAKE SURE YOU HAVE PUT AN ‘X’ BY YOUR NAME ON THE ELECTRONIC ROSTER!
• Please also make sure all information is correct.
• If you don’t sign in, you will not be able to take the certification test and will have to attend another training. There are NO exceptions!
Within the next week, you will receive an email inviting you to the portal.
• Email [email protected] with any problems or questions.
You will need to pass the certification test before you begin your observations.
Once you pass the certification test, print the certificate and submit it to your district HR representative.
173
Thanks for your participation! Have a great year!
174