Pennsylvania’s Student Learning Objective Process
Overview for School Leaders
Session Objectives
I. Review Teacher Effectiveness System
II. Define SLO Process
III. Explore SLO Templates (Assessment Literacy)
IV. Identify Key Points for School Leaders
V. Action Planning
I. Teacher Effectiveness System
(b) For professional employes and temporary professional employes who serve as classroom teachers, the following shall apply:
(1) Beginning in the 2013-2014 school year, the evaluation of the effectiveness of professional employes and temporary professional employes serving as classroom teachers shall give due consideration to the following:
(i) Classroom observation and practice models that are related to student achievement in each of the following areas:
(A) Planning and preparation.
(B) Classroom environment.
(C) Instruction.
(D) Professional responsibilities.
(ii) Student performance, which shall comprise fifty per centum (50%) of the overall rating of the professional employe or temporary professional employe serving as a classroom teacher and shall be based upon multiple measures of…
House Bill 1901
Race to the Top
Act 82
Building Level Data, 15%
Teacher Specific Data, 15%
Elective Data, 20%
Observation/ Practice, 50%
Teacher Observation & Practice (Effective 2013-2014 SY)
Danielson Framework Domains: Planning and Preparation, Classroom Environment, Instruction, Professional Responsibilities

Building Level Data/School Performance Profile (Effective 2013-2014 SY)
Indicators of Academic Achievement; Indicators of Closing the Achievement Gap, All Students; Indicators of Closing the Achievement Gap, Subgroups; Academic Growth PVAAS; Other Academic Indicators; Credit for Advanced Achievement

Teacher Specific Data
PVAAS / Growth 3-Year Rolling Average (2013-2014 SY, 2014-2015 SY, 2015-2016 SY); Other data as provided in Act 82

Elective Data/SLOs (Optional 2013-2014 SY; Effective 2014-2015 SY)
District-Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements
Teacher Effectiveness System in Act 82 of 2012
Building Level Data, 15%
Observation/ Practice, 50%
Teacher Observation & Practice (Effective 2013-2014 SY)
Danielson Framework Domains: Planning and Preparation, Classroom Environment, Instruction, Professional Responsibilities

Building Level Data/School Performance Profile (Effective 2013-2014 SY)
Indicators of Academic Achievement; Indicators of Closing the Achievement Gap, All Students; Indicators of Closing the Achievement Gap, Subgroups; Academic Growth PVAAS; Other Academic Indicators; Credit for Advanced Achievement

Elective Data/SLOs (Optional 2013-2014 SY; Effective 2014-2015 SY)
District-Designed Measures and Examinations; Nationally Recognized Standardized Tests; Industry Certification Examinations; Student Projects Pursuant to Local Requirements; Student Portfolios Pursuant to Local Requirements
Elective Data, 35%
Teacher Effectiveness System in Act 82 of 2012
Educator Effectiveness Prezi
• http://prezi.com/2wjeukgle6ja/?utm_campaign=share&utm_medium=copy&rc=ex0share
Observation/Evidence (50%)
4 Domains, 22 Components
Principal/Evaluator Observes
Charlotte Danielson’s Framework for Teaching
Multiple Measures of Student Achievement
1. Building Level Data (School Performance Profile)
Academic Achievement, Graduation/Promotion Rate, Attendance, AP-IB Courses offered, PSAT, Building Level PSSA and Keystone Assessment Data
2. Correlation Data Based on Teacher-Level Measures: PSSA, Keystone Data
3. Elective Data (SLOs)
II. SLO Process
SLO Process
A process to document a measure of educator effectiveness based on student achievement of content standards.
SLO Concepts
Student achievement can be measured in ways that reflect authentic learning of content standards.
Educator effectiveness can be measured through use of student achievement measures.
The SLO in PA is written to a specific teacher and a specific class/course/content area for which that teacher provides instruction in the area they are certified to teach.

Many factors can influence the size of an SLO, but the process remains the same.
Time Frame
Course Content
Important Learning Needs
SLO Process Design
Indicators
Performance Measures
Goal-Standards
SLO Goal
Indicator #1: Assessment #1a, Assessment #1b
Indicator #2: Assessment #2
SLO Process Criteria
SLOs should:
1. Represent the diversity of students and courses/content areas taught.
2. Align to a set of approved indicators/targets related to selected academic content standards.
3. Be based upon two time-bound events/data collection periods and/or performance-defined levels of “mastery”.
4. Be supported by verifiable data that can be collected and scored in a standardized manner.
5. Include a set of independent performance measures.
SLO Process Steps: Teacher
1. Identify subject and students
2. Select the “big idea” from the content standards
3. Establish a goal
4. Identify indicators associated with the goal
5. Select and/or create performance measures for each indicator
6. Create performance expectations across all indicators
III. SLO Template
SLO Template
A process tool used to identify goals, indicators, and performance measures for use in the greater Teacher Effectiveness System.

Handouts: SLO Template, Help Desk, & Performance Task Framework
SLO Template Design
Context
Goal
Measures
Indicators
Expectations
1. Goals are based upon the “big ideas” within the content standards.
2. Performance indicators are specific, measurable, attainable, and realistic.
3. Performance measures should be valid, reliable, and rigorous assessments.
4. Data should be collected, organized, and reported in a consistent manner.
5. Teacher expectations of student achievement should be demanding.
SLO Template Criteria
SLO Template Steps: Teacher
1. Classroom Context
1a. Name 1b. School 1c. District
1d. Class/ Course Title
1e. Grade Level
1f. Total # of Students
1g. Typical Class Size
1h. Class Frequency
1i. Typical Class Duration
2. SLO Goal
2a. Goal Statement
2b. PA Standards
2c. Rationale
Spanish 1: Students will be able to demonstrate effective communication in the target language by speaking and listening, writing, and reading.

8th Grade Art: Students will demonstrate the ability to manipulate visual art materials and tools to create works based on the ideas of other artists and to evaluate the processes and products of themselves and other artists.

Grade 5 Library: Students will demonstrate the ability to use online D.P.S. databases and search engines (Britannica Elementary, Culture Grams, and Nettrekker) to support real-world experiences and to determine which is the best source for specific information.
2a. The SLO Goal Statement: What’s the Important Learning?
Targeted content standards used in developing the SLO.
Arts and Humanities:
9.1, 9.2, 9.3, 9.4
http://pdesas.org/
2b. Standards Selection: What Standards Match the Goal Statement?
Explains why the SLO is important and how students will demonstrate learning of the standards through this objective.
Grade 8 Art: Developing the ability to manipulate visual art materials and tools is important to the artistic creation process, as is the ability to evaluate the process and product created by oneself and others.

Child Development (FCS): Understanding how children grow and develop will prepare individuals and families to meet challenges associated with raising children.
2c. Rationale Statement: Why Is This Learning Important?
SLO Template Steps: Teacher
3. Performance Measures (PM)
3a. Name
PM #1 PM #2 PM #3 PM #4 PM #5
3b. Type
____ District-Designed Measures and Examinations
____ Nationally Recognized Standardized Tests
____ Industry Certification Examinations
____ Student Projects
____ Student Portfolios
____ Other: ______________________________
3c. Purpose
PM #1 PM #2 PM #3 PM #4 PM #5
3d. Metric
Growth (change in student performance across two or more points in time)
Mastery (attainment of a defined level of achievement)
Growth and Mastery
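The three metric types in 3d can be sketched in a few lines of code (an illustration only; the 100-point scale, growth target, and mastery cut point below are hypothetical examples, not values from the PDE template):

```python
# Illustrative sketch of the 3d metric types; the score scale, growth
# target, and mastery cut below are hypothetical examples.

def met_growth(pre_score, post_score, growth_target):
    """Growth: change in student performance across two points in time."""
    return (post_score - pre_score) >= growth_target

def met_mastery(score, mastery_cut):
    """Mastery: attainment of a defined level of achievement."""
    return score >= mastery_cut

# "Growth and Mastery" requires a student to satisfy both conditions.
pre, post = 48, 71
print(met_growth(pre, post, growth_target=15))   # gain of 23 points -> True
print(met_mastery(post, mastery_cut=75))         # 71 < 75 -> False
```

A student can show strong growth without reaching mastery (as above), which is why the template lets the teacher choose the metric that fits the indicator.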
3e. Administration Frequency
PM #1 PM #2 PM #3 PM #4 PM #5
3f. Adaptations/Accommodations
IEP, ELL, Gifted IEP, Other
3g. Resources/Equipment
PM #1 PM #2 PM #3 PM #4 PM #5
3h. Scoring Tools
PM #1 PM #2 PM #3 PM #4 PM #5
3i. Administration & Scoring Personnel
PM #1 PM #2 PM #3 PM #4 PM #5
3j. Performance Reporting
PM #1 PM #2 PM #3 PM #4 PM #5
Many things must be considered when choosing or building quality assessments.
Choosing or Building Performance Measures and Tasks
What must a Student know and do to complete a performance measure?
What does a Teacher do to administer a performance measure?
How does a Teacher score a performance measure?
SLO Template Steps: Teacher
4. Performance Indicators (PI)
4a. PI Targets: All Student Group
PI Target #1 PI Target #2 PI Target #3 PI Target #4 PI Target #5
4b. PI Targets: Subset Student Group(optional)
PI Target #1 PI Target #2 PI Target #3 PI Target #4 PI Target #5
4c. PI Linked(optional)
4d. PI Weighting(optional)
Describes individual student performance expectations
4a. What performance measure(s) (tests, assessments) will be used to measure student achievement of the standards, and what is the expected student achievement level based on the scoring system for those measures?

4b. What is the expected achievement level for unique populations (IEP, students who did not do well on a pre-test, etc.)?
4: Performance Indicator: What Does Student Performance Look Like?
Performance Indicator Statement
HS Choral (Individual Vocal Assessment Task): Students will achieve proficient or advanced levels in 6 out of 8 criteria of the second scoring rubric.

5th Grade ELA (DRA text gradient chart): Students will demonstrate one year of reading growth.
A Temporary Detour…
Foundational Knowledge
Basic Assessment Literacy
Test Specifications
When developing test specifications, consider:
• Sufficient sampling of targeted content standards
  • Aim for a 3:1 items-per-standard ratio
• Developmental readiness of test-takers
• Type of items
  • Multiple Choice (MC)
  • Short Constructed Response (SCR)
  • Extended Constructed Response (ECR)/Complex Performance Tasks
• Time burden imposed on both educators and students
Test Specifications (cont.)
When developing test specifications, consider:
• Cognitive load
  • Aim for a balance of DoK levels
• Objectivity of scoring
  • Each constructed-response item/task will need a well-developed rubric
• Weight of items (point values)
  • Measures (tests) should consist of 25-35 total points; 35-50 points for high school
• Item cognitive demand level/DoK level
  • Measures should reflect a variety of DoK levels as represented in the targeted content standards
Test Specifications Example
Content Strand(s)                          MC  SCR  ECR  Total
Expressions & Equations                     4    0    0      4
Creating Equations                          5    0    0      5
Structure in Expressions                    3    0    0      3
Ratios & Proportions                        3    2    0      5
Reasoning with Equations & Inequalities     4    1    0      5
Interpreting Functions                      3    2    1      6
Real Number System                          5    1    1      7
Grand Totals                               27    6    2     35

*Performance measure contains 35 items/tasks.
Content Strand(s)                          MC (1 pt.)  SCR (2 pts.)  ECR (4 pts.)  Total
Expressions & Equations                         4            0             0          4
Creating Equations                              5            0             0          5
Structure in Expressions                        3            0             0          3
Ratios & Proportions                            3            4             0          7
Reasoning with Equations & Inequalities         4            2             0          6
Interpreting Functions                          3            4             4         11
Real Number System                              5            2             4         11
Grand Totals                                   27           12             8         47

*Performance measure score based upon 47 points.
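The point arithmetic behind the weighted table can be sketched as follows (item counts come from the example specification; the data structure itself is just an illustration):

```python
# Recomputes the example's totals (35 items, 47 points) from the item
# counts in the first specification table, using the stated weights
# MC = 1 point, SCR = 2 points, ECR = 4 points.

ITEM_WEIGHTS = {"MC": 1, "SCR": 2, "ECR": 4}

spec = {  # content strand -> item counts by item type
    "Expressions & Equations":                 {"MC": 4, "SCR": 0, "ECR": 0},
    "Creating Equations":                      {"MC": 5, "SCR": 0, "ECR": 0},
    "Structure in Expressions":                {"MC": 3, "SCR": 0, "ECR": 0},
    "Ratios & Proportions":                    {"MC": 3, "SCR": 2, "ECR": 0},
    "Reasoning with Equations & Inequalities": {"MC": 4, "SCR": 1, "ECR": 0},
    "Interpreting Functions":                  {"MC": 3, "SCR": 2, "ECR": 1},
    "Real Number System":                      {"MC": 5, "SCR": 1, "ECR": 1},
}

total_items = sum(sum(counts.values()) for counts in spec.values())
total_points = sum(n * ITEM_WEIGHTS[t]
                   for counts in spec.values()
                   for t, n in counts.items())
print(total_items, total_points)  # 35 47
```

Weighting the few constructed-response items this way is what lifts a 35-item measure into the 35-50 point range recommended for high school.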
Test Specifications Example (cont.)
Content Strand(s)                          DoK 1  DoK 2  DoK 3  Total
Expressions & Equations                       1      2      1      4
Creating Equations                            1      2      2      5
Structure in Expressions                      2      0      1      3
Ratios & Proportions                          0      5      0      5
Reasoning with Equations & Inequalities       3      2      0      5
Interpreting Functions                        0      2      4      6
Real Number System                            1      4      2      7
Grand Totals                                  8     17     10     35

*Performance measure contains items/tasks with the following Level/DoK distribution: DoK 1 = 23%; DoK 2 & 3 = 77%
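The percentages in the footnote follow directly from the Grand Totals row; a quick sketch of the arithmetic (counts taken from the table above):

```python
# DoK item counts from the Grand Totals row of the example table.
dok_counts = {1: 8, 2: 17, 3: 10}

total = sum(dok_counts.values())               # 35 items
pct_dok1 = round(100 * dok_counts[1] / total)  # 8/35 -> 23
pct_dok2_plus = round(100 * (dok_counts[2] + dok_counts[3]) / total)  # 27/35 -> 77
print(f"DoK 1 = {pct_dok1}%, DoK 2 & 3 = {pct_dok2_plus}%")
```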
Multiple Choice Items
• Stem (question) with four (4) answer choices
• Typically worth one (1) point toward the overall score
• On the new PSSA there are MC questions with two answers
• Generally require about one (1) minute to answer, depending on rigor/DoK
Pros
• Easy to administer
• Objective scoring
Cons
• Students can guess the correct answer
• No information can be gathered on the process the student used to reach the answer (error analysis)
Short Constructed Response Items
• Requires students to apply knowledge, skills, and critical thinking abilities to real-world performance tasks
• Entails students “constructing” or developing their own answers in the form of a few sentences or bullet points, a graphic organizer, or a drawing/diagram with explanation
• Worth 1-3 points
Pros
• Allows for partial credit
• Provides more detail about a student’s cognitive process
• Reduces the likelihood of guessing
Cons
• Greater scoring subjectivity
• Requires more time to administer and score
Extended Constructed Response Items
• Requires students to apply knowledge, skills, and critical thinking abilities to real-world performance tasks by developing their own answers in the form of narrative text with supporting graphic organizers and/or illustrations
• Worth 4 or more points
• Entails more in-depth explanations than SCR items
Pros
• Allows for partial credit
• Provides more detail about a student’s cognitive process
• Reduces the likelihood of guessing
Cons
• Greater scoring subjectivity
• Requires more time to administer and score
Depth of Knowledge is…
• The complexity of mental processing that must occur in order to construct an answer
• A critical factor in determining item/task rigor
Level         Example of Verb    Example of Task
DoK Level 1   Recall             List three characteristics of metamorphic rocks.
DoK Level 2   Compare/Contrast   Describe the difference between metamorphic and igneous rocks.
DoK Level 3   Create             Develop a model to represent the rock cycle.
DoK Level 4   Construct          Using multiple sources, develop an essay on the rise of the Industrial Revolution.
Depth of Knowledge Chart

DoK Level 1
Definition: Involves recall; the response is automatic. Activities require students to demonstrate a rote response, follow a set of procedures, or perform simple calculations.
Verbs: define, duplicate, list, memorize, recall, repeat, reproduce, state, classify, describe, discuss, explain, identify, locate, recognize, report, select, paraphrase
Examples: Identify the main character. Subtract the numbers. Label the rivers on the map. Measure the length of your desk. List the steps in the water cycle.

DoK Level 2
Definition: Activities are more complex and require students to engage in mental processing and reasoning beyond a habitual response. These activities make students decide how to approach a problem.
Verbs: choose, demonstrate, dramatize, employ, illustrate, interpret, operate, schedule, sketch, solve, use, write, appraise, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test
Examples: Summarize the events in the story. Describe the cause/effect of an event. Organize the data using a bar graph. Formulate a problem given data. Compare and contrast the main characters from the stories.

DoK Level 3
Definition: Activities necessitate higher cognitive demands. Students provide support and reasons for conclusions they draw. Typically, Level 3 activities have more than one correct response or approach to the problem.
Verbs: appraise, argue, defend, judge, select, support, value, evaluate, assemble, construct, create, design, develop, formulate, write
Examples: Support your ideas with details and examples. Design investigations for a scientific problem. Construct a model of the solar system. Using the graph, predict how many teeth would be lost by all the 2nd grade classes in the school and justify your answer.
Process Steps
1. Review content standards from completed Targeted Content Standards Template and insert content strand(s) into specification table.
2. Determine the number of items by item type (i.e., Multiple Choice, Short Constructed Response, Extended Constructed Response) for each content strand.
3. Ensure item type and cognitive level (I, II, III)/depth of knowledge (DoK) are assigned.
4. Assign item weights to each item type.
5. Assign the number of passages (by type) when using literary works.
QA Checklist
• There is a sufficient sampling of targeted standards.
• The specifications reflect a balance between developmental readiness and time constraints.
• Time is considered for both educators and students.
• The cognitive demands reflect those articulated in the targeted standards.
• The measure allows for both objective and subjective scoring procedures.
• The measure consists of 35-50 points, with Level I/DoK 1 limited to one-third of the items/tasks.
Blueprints
• Content ID #
• Content Statement
• Item Depth of Knowledge (DoK)
  • Performance measures should reflect a variety of DoK levels.
• Sufficient sampling of content standards
  • Aim for a 3:1 item-to-standard ratio (3 items for every standard).
• Cognitive load
  • Aim for a balance of DoK levels among standards.
  • Design measures with at least 50% DoK 2 or higher.
Blueprint Example
8.EE.1: Know and apply the properties of integer exponents to generate equivalent numerical expressions. Item Count: 2 (DoK 1: 1, DoK 2: 0, DoK 3: 1)

8.EE.2: Use square root and cube root symbols to represent solutions to equations of the form x² = p and x³ = p, where p is a positive rational number. Evaluate square roots of small perfect squares and cube roots of small perfect cubes. Item Count: 2 (DoK 1: 0, DoK 2: 2, DoK 3: 0)

A-CED.1: Create equations and inequalities in one variable and use them to solve problems. Item Count: 5 (DoK 1: 1, DoK 2: 2, DoK 3: 2)
Process Steps
1. List the standards by number and statement in the appropriate columns. Remember to aim for a 3:1 item-to-standard ratio.
2. Decide on the item count for each standard and fill in the appropriate column.
3. Determine the number of DoKs for each standard following the specified guidelines for “rigor.”
4. Repeat Steps 1-3, ensuring that item and DoK counts meet the specification requirements.
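The ratio and rigor guidelines in these steps lend themselves to a simple automated check. A hedged sketch (the field names and helper function are illustrative, not part of any PDE tool), using the counts from the blueprint example:

```python
# Checks two blueprint guidelines: the 3:1 item-to-standard ratio and
# "at least 50% of items at DoK 2 or higher." The data mirrors the
# example blueprint; the structure and names are illustrative only.

def check_blueprint(rows):
    items = sum(r["dok1"] + r["dok2"] + r["dok3"] for r in rows)
    ratio_ok = items >= 3 * len(rows)      # aim for 3 items per standard
    higher = sum(r["dok2"] + r["dok3"] for r in rows)
    rigor_ok = higher * 2 >= items         # at least 50% at DoK 2+
    return ratio_ok, rigor_ok

blueprint = [
    {"id": "8.EE.1",  "dok1": 1, "dok2": 0, "dok3": 1},
    {"id": "8.EE.2",  "dok1": 0, "dok2": 2, "dok3": 0},
    {"id": "A-CED.1", "dok1": 1, "dok2": 2, "dok3": 2},
]
print(check_blueprint(blueprint))  # (True, True)
```

Here 9 items cover 3 standards (exactly 3:1), and 7 of the 9 items sit at DoK 2 or higher, so both checks pass.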
QA Checklist
• The blueprint lists the content standard ID number.
• The blueprint lists or references the targeted content standards.
• The blueprint designates item counts for each standard.
• The blueprint reflects a range of DoK levels.
• The blueprint item/task distribution reflects that in the specification tables.
QA Checklist
• All items/tasks articulated on the blueprint are represented within the Scoring Key.
• MC items have been validated to ensure exactly one correct answer exists among the options provided.
• MC answers do not create a discernible pattern.
• MC answers are “balanced” among the possible options.
• Scoring Key answers are revalidated after the final operational form reviews are complete.
Scoring Rubrics
Holistic vs. Analytic Rubric Scoring

Holistic Scoring
• Provides a single score based on an overall determination of the student’s performance
• Assesses a student’s response as a whole for overall quality
• Most difficult to calibrate with different raters

Analytic Scoring
• Identifies and assesses specific aspects of a response
• Multiple dimension scores are assigned
• Combines subscores logically into the overall assigned score
Rubric Scoring Considerations
• Describe whether spelling and/or grammar will impact the final score.
• Avoid using words like “many,” “some,” and “few” without adding numeric descriptors to quantify these terms.
• Avoid using words that are subjective, such as “creativity” or “effort.”
• Avoid subjective adjectives such as “excellent” or “inadequate.”
SCR Rubric Example
General Scoring Rubric

2 points: The response gives evidence of a complete understanding of the problem. It is fully developed and clearly communicated. All parts of the problem are complete. There are no errors.

1 point: The response gives evidence of a reasonable approach but also indicates gaps in conceptual understanding. Parts of the problem may be missing. The explanation may be incomplete.

0 points: There is no response, or the work is completely incorrect or irrelevant.
SCR Rubric Example
Sample Response: “In two complete sentences, explain why people should help save the rainforests.”
2 points: The student’s response is written in complete sentences and contains two valid reasons for saving the rainforest. “People must save the rainforest to save the animals’ homes. People need to save the rainforest because we get ingredients for many medicines from there.”

1 point: The student’s response contains only one reason. “People should save the rainforest because it is important and because people and animals need it.”
Rubrics for ECR Tasks
• Create content-based descriptions of the expected answer for each level of performance on the rubric.
• Provide an example of a fully complete/correct response along with examples of partially correct responses.
• Reference the item expectations in the rubric.
• Make the rubric as clear and concise as possible so that other scorers would assign exact/adjacent scores to the performance/work under observation.
ECR Rubric Example
General Scoring Rubric

4 points: The response provides all aspects of a complete interpretation and/or a correct solution. The response thoroughly addresses the points relevant to the concept or task. It provides strong evidence that information, reasoning, and conclusions have a definite logical relationship. It is clearly focused and organized, showing relevance to the concept, task, or solution process.

3 points: The response provides the essential elements of an interpretation and/or a solution. It addresses the points relevant to the concept or task. It provides ample evidence that information, reasoning, and conclusions have a logical relationship. It is focused and organized, showing relevance to the concept, task, or solution process.

2 points: The response provides a partial interpretation and/or solution. It somewhat addresses the points relevant to the concept or task. It provides some evidence that information, reasoning, and conclusions have a relationship. It is relevant to the concept and/or task, but there are gaps in focus and organization.

1 point: The response provides an unclear, inaccurate interpretation and/or solution. It fails to address or omits significant aspects of the concept or task. It provides unrelated or unclear evidence that information, reasoning, and conclusions have a relationship. There is little evidence of focus or organization relevant to the concept, task, and/or solution process.

0 points: The response does not meet the criteria required to earn one point. The student may have written on a different topic or written “I don’t know.”
ECR Rubric Example
Sample Response: “List the steps of the Scientific Method. Briefly explain each one.”
4 points:
1. Ask a Question: Ask a question about something that you observe: How, What, When, Who, Which, Why, or Where?
2. Do Background Research: Use library and Internet research to help you find the best way to do things.
3. Construct a Hypothesis: Make an educated guess about how things work.
4. Test Your Hypothesis: Do an experiment.
5. Analyze Your Data and Draw a Conclusion: Collect your measurements and analyze them to see if your hypothesis is true or false.
6. Communicate Your Results: Publish a final report in a scientific journal or by presenting the results on a poster.

3 points:
1. Ask a Question
2. Do Background Research: Use library and Internet research.
3. Construct a Hypothesis: An educated guess about how things work.
4. Test Your Hypothesis: Do an experiment.
5. Analyze Your Data and Draw a Conclusion
6. Communicate Your Results

2 points:
1. Ask a Question
2. Do Background Research
3. Construct a Hypothesis
4. Test Your Hypothesis
5. Analyze Your Data and Draw a Conclusion
6. Communicate Your Results

1 point: Ask a Question, Hypothesis, Do an Experiment, Analyze Your Data

0 points: “I don’t know.”
QA Checklist
• CR items/tasks have scoring rubrics that reflect a performance continuum.
• CR items/tasks include sample responses for each level of performance.
• CR scoring rubrics are clear and concise.
• CR scoring rubrics include all dimensions (aspects) of the tasks presented to the students.
• CR scoring rubrics avoid including non-cognitive (motivation, effort, etc.) or content-irrelevant attributes.
• Don’t use a rubric if a checklist or simpler tool will do!
Back to the SLO template…
SLO Template Steps: Teacher
5. Teacher Expectations
5a. Level
Failing: 0% to ___% of students will meet the PI targets.
Needs Improvement: ___% to ___% of students will meet the PI targets.
Proficient: ___% to ___% of students will meet the PI targets.
Distinguished: ___% to 100% of students will meet the PI targets.
5b. Elective Rating
Distinguished (3) Proficient (2) Needs Improvement (1) Failing (0)
Notes/Explanation
Teacher Signature _________________________ Date ______   Evaluator Signature _____________________ Date ______
Describes the number of students expected to meet the performance indicator criteria.
5a: Proficient
85% to 94% of students meet the performance indicator.
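Mechanically, the 5a/5b conversion is a lookup against the band boundaries the teacher and evaluator agree on. A sketch with hypothetical cut points (the template leaves the boundaries blank; only the 85-94% Proficient band comes from the example here):

```python
# Maps "percent of students meeting the PI targets" to the section 5
# elective rating. Band boundaries are hypothetical examples, except
# that the 85-94% Proficient band matches the example above.

BANDS = [  # (lower bound %, label, 5b point value)
    (95, "Distinguished", 3),
    (85, "Proficient", 2),
    (70, "Needs Improvement", 1),
    (0,  "Failing", 0),
]

def elective_rating(pct_meeting_targets):
    for lower, label, points in BANDS:
        if pct_meeting_targets >= lower:
            return label, points

print(elective_rating(88))  # ('Proficient', 2)
```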
5a: Teacher Effectiveness Measure: Proficient!
85% to 94% of this audience can explain the SLO process to their stakeholders!
SLO Online Resources
• pdesas.org
• IU8 Curriculum Network wiki
• Available Templates
• Available Rubrics
• Homeroom
• IU8 Wiggio (online support group)
IV. Key Points for School Leaders
Key Process Points
The SLO process facilitates a conversation about expectations between educators (principals and teachers).
Key Points (3)
• What is the subject or content focus?
• Who does it encompass?
• How can it improve instruction and educator practice?
Goals-Standards
• Are they high quality measures?
• Who administers and scores the measures?
• What are the expectations for students?
Performance Measures
• What are indicators of success?
• How are they being measured?
• Upon which students are they based?
Indicators
1. The SLO is based upon small numbers of students/data points.
2. Goals and indicators are linked to standards.
3. Indicators are vague without specific performance criteria.
4. Growth and/or mastery is not clearly defined.
5. Performance measures are not well designed or lack rigor.
6. Overall student achievement expectations are extreme.
IV. Areas of Caution
IV. Generic Process Steps: Leader
1. Establish SLO template completion timeline ASAP
2. Review completed template
3. Conduct review meeting with teacher
4. Agree on any revisions; submit materials
5. Establish “mid-cycle” spot review
6. End-of-year review with supporting data
V. Action Planning
• Implementation timeline
• Roles for the administrative team
• Professional development for teachers
• Repurposing your schedule for job-embedded SLO activities
  - SLO creation (Design, Build, Review)
  - Resource investigations
  - Time to work in grade-level/content-area teams
• Support systems for teachers from administrators and peers
Contact Info
CURRICULUM
Janel Vancas, Acting Assistant Director
Laura J. Toki, [email protected]
ED PROGRAMS
Jennifer Anderson, Assistant Director
Frequently Asked Questions

What are the definitions of “tested” and “non-tested”?
Tested: Teachers with an Eligible PVAAS Score (20% Elective)
A PA-certified educator with full or partial responsibility for content-specific instruction of the assessed eligible content as measured by a Grade 4-8 PSSA or Keystone Exam.

Non-tested: Teachers without an Eligible PVAAS Score (35% Elective)
Teachers who do not teach courses assessed by Grade 4-8 PSSA or Keystone Exams.
Who develops the SLO? Is this an individual effort or a collaborative effort?
Each educator will be responsible for developing SLOs as required by the LEA. Collaborative development of SLOs is encouraged (e.g., similar content-area or grade-level teachers, interdisciplinary groups of educators, collaboration through professional organizations, etc.). A PDE-approved SLO Template is provided to help guide educators and administrators through the process.
How will the final SLO measure be translated into a “score” that can be applied to the 20% or 35% of a teacher’s evaluation?
This formula and computation process is currently under development by PDE and will be published in the PA Bulletin by June 30, 2013.
What is the SLO template and process designed to address in terms of instructional delivery time, number of students, or size of the objective?
SLOs can be written to address the entire length of a grade or course, but could be tailored to a focused time period. Student achievement for large or select groups of students can be described. The template is designed to address a grade or course plan but could be used to address a meaningful, focused instructional objective or focused teaching practice.
Will PDE recommend some performance measures and scoring tools?
Model SLOs for a variety of content areas will be provided, utilizing a variety of performance measures and scoring tools. These models can be used as is or can be modified.
How many SLOs per teacher/per year/per grade? What about “co-taught” classes, teachers who travel between schools, and other unique instructional scenarios?
Policy and guidelines on these issues are yet to be determined.
How will the SLO process be monitored?
A principal or LEA-assigned evaluator would monitor the SLO process, including (but not limited to) the timeline for development, approval for the SLO to be implemented, and verification of the measure of educator effectiveness based on the completion of the SLO. Tools are currently being developed to assist principals in efficiently and effectively monitoring this process.
How do “goals” and “performance indicators” differ?
The Goal Statement should address important learning content to be measured, and the performance indicators should describe expected levels of achievement.
If a school is already having conversations about SLO and is having success, is it necessary to fill out this template or can we continue what we are doing?
State regulations say that “LEAs shall use an SLO to document the process to determine and validate the weight assigned to the Elective Data measures that establish the Elective Rating.”
When will LEAs be expected to implement SLOs?
Models will be available for school year 2013-2014, and LEAs have the option to use SLOs as a component for measuring educator effectiveness in school year 2013-14. LEAs will be expected to implement SLOs in school year 2014-2015. First year teachers will not be expected to implement SLOs.
What supports will be available to teachers and districts to develop and implement rigorous SLOs?
An online training program and process/definitions manual will be provided, as will an up-to-date template and content-specific models. Anticipated availability of these supports is August 2013.
Your Questions?