Tools for Developing a Comprehensive Evaluation Template
Heather Peshak George, Ph.D. & Karen Elfner Childs, M.A. – University of South Florida
Cynthia Anderson, Ph.D. – University of Oregon
Skill-Building Workshop: March 27, 2010
7th International Conference on Positive Behavior Support: St. Louis, MO
Purpose
• Familiarize participants with the new Benchmarks for Advanced Tiers (BAT) and other tools to develop a comprehensive evaluation template addressing behavior support across tiers with application at state, district, and/or school levels.
Objectives
• Purpose of a comprehensive evaluation system
• Administration and completion – what is it?
• Using the results to boost implementation and validate outcomes – how do you use it?
  – School
  – District
  – State
• Future considerations
Agenda
8:30-9:00   Introduction and Rationale
9:00-10:00  Implementation Monitoring: TIC, PIC
10:00-10:30 Implementation Integrity: BoQ
10:30-10:45 BREAK
10:45-11:45 Implementation Integrity: BAT
11:45-12:15 Implementation Research: SET, ISSET
12:15-12:30 Wrap-up
Purpose of Evaluation
• To examine the extent to which teams are accurately selecting and implementing PBS systems and practices
• Allows teams to determine the extent to which target student outcomes are being and/or likely to be achieved
• To determine if teams are accurately and consistently implementing activities and practices as specified in their individualized action plan
(PBIS Blueprint, 2005)
PBIS Evaluation Blueprint: A Work in Progress…
• Context– What was provided, who provided, who received
• Input– Professional development, value, perspective
• Fidelity– Implemented as designed, w/fidelity, process evaluation
• Impact– Behavior change, other schooling changes
• Replication, Sustainability and Improvement
  – Capacity, practice, policy
  – Expanding implementation, allocating resources
(PBIS Blueprint, 2010)
Factors to Consider in Developing Comprehensive Evaluation Systems
1) Systems Preparation – Readiness activities
2) Service Provision – Training and technical assistance
3) Identification and Assessment of Behavior Problems – Possible data sources
4) Evaluation Process – Timelines, data systems
5) Evaluation Data (across all three tiers) – Implementation fidelity, impact on students, attrition, client satisfaction
6) Products and Dissemination – Reports, materials, presentations, etc.

(modified from Childs, Kincaid & George, in press)
Florida’s Evaluation Model

Systems Preparation
• District Action Plan
• District Readiness Checklist
• School Readiness Checklist
• New School Profile (includes ODR, ISS, OSS)

Service Provision
• Training and on-going technical assistance: FLPBS → Districts → Coaches → Schools

Identification/Assessment
• Discipline records, ESE referrals, surveys, walkthroughs, PIC, Classroom Assessment Tool, student rank/rating, teacher requests, lack of response, BAT, Behavior Rating Scale, Daily Progress Report charts

Evaluation Process
• Evaluation data collected at Mid-Year I, Mid-Year II, and End-Year

Evaluation Data
• Impact: outcome data (ODR, ISS, OSS), FL Comprehensive Assessment Test, Benchmarks of Quality, school demographic data, PBS Walkthrough, Daily Progress Reports, Behavior Rating Scales, climate surveys
• Implementation fidelity: PBS Implementation Checklist (PIC), Benchmarks of Quality (BoQ), Benchmarks for Advanced Tiers (BAT), school demographic data, School-wide Implementation Factors, Tier 3 plan fidelity checklist, BEP fidelity checklist
• Project impact: attrition survey/attrition rates, district action plans
• Client satisfaction: School-wide Implementation Factors, District Coordinator’s Survey, training evaluations

Products and Dissemination
• Annual reports, revisions to the training and technical assistance process, national/state/district/school dissemination activities, website, on-line training modules
Comprehensive Evaluation Blueprint

Implementation Monitoring
• TIC (Tier 1) – Team Implementation Checklist (Sugai, Horner & Lewis-Palmer, 2001)
• PIC (Tiers 1, 2, 3) – PBS Implementation Checklist for Schools (Childs, Kincaid & George, 2009)

Implementation Integrity
• BoQ (Tier 1) – Benchmarks of Quality (Kincaid, Childs & George, 2005)
• BAT (Tiers 2, 3) – Benchmarks for Advanced Tiers (Anderson, Childs, Kincaid, Horner, George, Todd, Sampson & Spaulding, 2009)

Implementation Research
• SET (Tier 1) – School-wide Evaluation Tool (Sugai, Lewis-Palmer, Todd & Horner, 2001)
• ISSET (Tiers 2, 3) – Individual Student Systems Evaluation Tool (Anderson, Lewis-Palmer, Todd, Horner, Sugai & Sampson, 2008)
Implementation Monitoring
Team Implementation Checklist (TIC)
PBS Implementation Checklist (PIC)
Progress Monitoring Measures
• Designed to assess the same core features as the research and annual self-assessment measures
• Used by school teams (typically with the support of their coach) on a frequent basis (e.g., monthly, every two months, or quarterly) to guide action planning during the implementation process
• Require 15-20 minutes to complete online; used by the team, coach, and trainer to tailor actions, supports, and training content to help the school implement with high fidelity
(PBIS Blueprint, 2010)
Team Implementation Checklist (TIC), Version 3.0
Team Implementation Checklist
• Utility
  – Initial planning for implementation
  – Progress monitoring early implementation
• Completed quarterly by Tier I team
• Checklist 1:
Components of the TIC
Checklist 1
• Commitment
• Team
• Self-assessment
• Expectations
• Information system
• Capacity for Tier III

Checklist 2
• Monitor ongoing activity
Use of the Team Checklist
• Who completes the Team Checklist?
  – The school team (individually or together)
• When is the Team Checklist completed?
  – At least quarterly; best if done monthly
  – Web-based data entry: www.pbssurveys.org
• Who looks at the data?
  – Team
  – Coach
  – Trainers/state evaluation
• Action planning
Action Planning with the Team Checklist
• Define items Not in Place or Partially in Place
• Identify the items that will make the biggest impact
• Define a task analysis of activities to achieve items
• Allocate tasks to people, time, and reporting events
[Chart: Iowa Checklist 01-05, PK-6 – percent of items fully and partially implemented at each administration (Sep. 2002 through Jun. 2005), grouped by school (schools 1-7).]
[Chart: Iowa Elementary Schools Team Checklists 02-04 – percent of items fully and partially implemented across repeated administrations for ten schools (Adams ES-D, Douds ES*, Iowa Valley ES*, Jackson ES-D, MLK ES-D, Monroe ES-D, Park Ave. ES-D, Prescott ES*, Stockport ES-P*, Stowe ES-D).]
[Chart: R. V. Traylor Elementary School Team Checklist 03-04 – percent implemented by area (Commitment, Team, Self-Assessment, Expectations Defined, Expectations Taught, Rewards System, Violations System, Information, Function), plus % items implemented and % total points, for Oct. ’03, Dec. ’03, and Mar. ’04.]
Putting your School in Perspective
• Use % of total items or % of total points
• Compare multiple schools
• Messages:
  – It is possible
  – You don’t need to be perfect immediately
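Either denominator yields a comparable 0-100 scale for putting schools side by side. A minimal sketch of the percent-of-possible-points calculation (the helper function and sample scores below are invented for illustration, not part of the Team Checklist materials):

```python
# Hypothetical helper: convert per-item checklist scores into a
# percent-of-possible-points figure so schools can be compared on a
# common 0-100 scale. The item scores below are invented.
def percent_implemented(scores, max_points):
    """scores: points earned per item; max_points: maximum points per item."""
    return 100.0 * sum(scores) / sum(max_points)

# A school earning 2, 1, and 0 points on items worth 3, 2, and 1 points:
print(percent_implemented([2, 1, 0], [3, 2, 1]))  # 50.0
```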
[Chart: Team Checklist total scores – % implemented and % partially implemented for schools A-N, 0-100 scale.]
PBS Implementation Checklist (PIC)
PIC Purpose
• Provide school teams a “snapshot” of where they are in implementation of PBS– Implementation of critical elements at Tier 1– Implementation of Tiers 2 and 3
• Completed 3 and 6 months into the school year
• Guides action planning and team activities
PBS Implementation Checklist
Tier 1 Critical Element Implementation Level chart
PBS Implementation Level chart
Using PIC Results
• Use the results of PIC to guide your PBS team towards implementation with fidelity at all three tiers.
Implementation Monitoring Tools
• Will you progress monitor your school(s)?
  – If so, how often?
  – Who is responsible to administer, collect and synthesize the data?
  – How will it be reported back to the team?
• Which tool will you utilize?
• How will you use the results?
  – At the school, district, or state/project level?
  – As it relates to fidelity? Outcomes? Other?
Implementation Integrity
Benchmarks of Quality (BoQ)
Benchmarks for Advanced Tiers (BAT)
Annual Self-Assessment Measures
• Designed to document the same content as the research measures but to do so more efficiently
• Most available online and provide a school team/coach with the ability to determine once a year if a school is implementing SWPBS practices at a level that would be expected to affect student outcomes
• Always guide development of action planning to assist in efficient and continuous improvement of systems used in the school
(PBIS Blueprint, 2010)
Benchmarks of Quality (BoQ)
Creation: Based on Needs
• Reliably assess a team’s implementation
• Distinguish model schools
• Easy for coaches to complete with little training
• Quick to complete
• Provide feedback to the team
• Clarify outcomes as related to implementation
Benchmarks of Quality
• Items identified to align with the SWPBS training process
• 53 items addressing areas of:
• Faculty commitment• Effective procedures for dealing with discipline• Data entry and analysis plan established• Expectations and rules developed• Reward/recognition program established• Lesson plans for teaching• Implementation plan• Crisis plan• Evaluation
BoQ Validation Process
• Expert review
• Pilot study
• Florida & Maryland schools
  – Elementary, middle, high, center/alternative
• Reliability: test-retest and inter-rater, both significant (p < .01)
• Concurrent validity: SET/ODRs
• For more details, see JPBI, Fall 2007
Use of the School-Wide Evaluation Tool (SET)
• SET is a validated research tool that combines multiple assessment approaches (interviews, observations, product reviews) to arrive at an implementation score
• Concerns:
  – Time
  – High scores
  – Potential for “practice effect”
  – May not reflect current activities
  – Not as useful for action planning
• Correlation with the BoQ: overall r = .51 (p < .01)
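The reported r is a Pearson correlation over paired school scores. A minimal sketch of that computation (the five score pairs below are invented for illustration, not the study’s data):

```python
# Hypothetical sketch: Pearson correlation between paired SET and BoQ
# school scores. The score pairs are invented for illustration.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

set_scores = [80, 65, 90, 70, 85]
boq_scores = [75, 60, 88, 72, 70]
print(round(pearson_r(set_scores, boq_scores), 2))  # 0.81
```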
[Scatterplot: SET scores (y-axis) vs. BoQ scores (x-axis), each on a 0-100 scale.]
BoQ Factor Analysis
• Exploratory and confirmatory analysis
  – Most items “hang together” within a critical element but fit better within a 5-factor structure
  – All but 4 of the 53 items were found to have internal consistency (strong items)
  – Item/total correlations indicated that 46 of the 53 items were highly correlated with the total score
    • The 4 items without strong internal consistency also lacked item/total correlation
    • These included all 3 crisis items
Utility of the BoQ
• BoQ is reliable, valid, efficient and useful
  – Moderate correlation with SET
  – Data regarding association with ODRs
• Ease of use
  – Little training
  – Little time from team and coach
  – Areas not unique to one training approach
  – Assists states that are rapidly expanding PBS efforts
• Specific team feedback: celebration/planning
Benchmarks Review
• Describe the Benchmarks of Quality (what is it?)
• Describe the psychometric properties of the Benchmarks of Quality (can we trust it?)
• Share your answers to these questions with your neighbor
Administration and Completion
3 Elements of the Benchmarks of Quality
• Team Member Rating Form• Completed by team members independently• Returned to coach/facilitator
• Scoring Form• Completed by coach/facilitator using Scoring Guide• Used for reporting back to team
• Scoring Guide• Describes administration process• Rubric for scoring each item
Method of Completion
• Coach/facilitator uses Scoring Guide to ascertain the appropriate score for each item, collects Team Member Rating forms, resolves any discrepancies, and reports back to team
• Alt. Option – Scoring Form is completed at a team meeting with all members reaching consensus on the appropriate score for each item using the Scoring Guide rubric. The team identifies areas of strength and need.
Completion of BoQ
Step 1 – Coach’s Scoring
• The Coach/facilitator will use his or her best judgment based on personal experience with the school and the descriptions and exemplars in the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form (p.1 & 2). Do not leave any items blank.
Benchmarks Practice: Scoring Form, Scoring Guide

| Critical Element | Benchmark | STEP 1 | STEP 2 (++, +, or –) | STEP 3 |
|---|---|---|---|---|
| PBS Team | 1. Team has broad representation | 1 0 | | |
| PBS Team | 2. Team has administrative support | 3 2 1 0 | | |
| PBS Team | 3. Team has regular meetings (at least monthly) | 2 1 0 | | |
| PBS Team | 4. Team has established a clear mission/purpose | 1 0 | | |
Completion of BoQ
Step 2 – Team Member Rating
• The coach/facilitator gives the Benchmarks of Quality Team Member Rating Form to each SWPBS team member, to be completed independently and returned to the coach. Members should be instructed to rate each of the 53 items as “In Place,” “Needs Improvement,” or “Not in Place.” Some items relate to product and process development, others to action items; to be rated “In Place,” an item must be both developed and implemented (where applicable). Coaches collect and tally the responses and record the team’s most frequent response for each item on the Benchmarks of Quality Scoring Form, using ++ for “In Place,” + for “Needs Improvement,” and – for “Not In Place.”
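The collect-and-tally step above can be sketched in a few lines; this is an illustrative helper, not part of the published BoQ materials:

```python
from collections import Counter

# Hypothetical sketch of the tally: map each team member's rating for an
# item to the team's most frequent response, expressed as ++, +, or -.
SYMBOL = {"In Place": "++", "Needs Improvement": "+", "Not in Place": "-"}

def team_response(ratings):
    """ratings: one rating string per team member for a single item."""
    most_frequent, _count = Counter(ratings).most_common(1)[0]
    return SYMBOL[most_frequent]

print(team_response(["In Place", "In Place", "Needs Improvement"]))  # ++
```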
Benchmarks Practice: Scoring Form, Team Member Rating Form

[Worked example: three team members (A, B, C) each rate item 1 (Team has broad representation) and item 2 (Team has administrative support) as In Place (++), Needs Improvement (+), or Not in Place (-); the most frequent response per item is then recorded in STEP 2 of the Scoring Form.]
Benchmarks Team Member Tally Form
Completion of BoQ
Step 3 – Team Report
• The coach then completes the Team Summary on p. 3 of the Benchmarks of Quality Scoring Form, recording areas of discrepancy, strength, and weakness.
• Discrepancies – If the team’s most frequent rating on any item varied from the coach’s rating based on the Scoring Guide, the descriptions and exemplars from the guide should be shared with the team. This can happen at a team meeting or informally. If, upon sharing areas of discrepancy, the coach learns new information that according to the Scoring Guide would result in a different score, the item and the adjusted final score should be recorded on the Scoring Form.
Completion of BoQ
Step 4 – Reporting Back to Team
• After completing the remainder of the Benchmarks of Quality Scoring Form, the coach will report back to the team using the Team Report page of the Benchmarks of Quality Scoring Form. If needed, address items of discrepancy and adjust the score. The coach will then lead the team through a discussion of the identified areas of strength (high ratings) and weakness (low ratings). This information should be conveyed as “constructive feedback” to assist with action planning.
Benchmarks Team Summary: Scoring Form

Areas of Discrepancy – item #, team response, coach’s score, Scoring Guide description
  (e.g., item 2: team responses ++, ++, +; coach’s score 0; “Administrator does not actively support the process”)

Areas of Strength – critical element, description of areas of strength

Areas in Need of Development – critical element, description of areas in need of development
Benchmarks Critical Element Maximum
Alternative Option* for Completion of BoQ
*statistically validated as an alternative option
Alternative Option
Step 1 – Team Member Scoring
• Each team member uses personal experience with PBS and the descriptions and exemplars in the Benchmarks of Quality Scoring Guide to score each of the 53 items on the Benchmarks of Quality Scoring Form (pp. 1-2). The team then meets and reaches consensus on the appropriate score for each item.
Alternative Option
Step 2 – Team Summary
• After completing the Benchmarks of Quality: Scoring Form, the team should use the Team Report page of the Benchmarks of Quality: Scoring Form to guide a discussion of the identified areas of strength (high ratings) and weakness (low ratings). This information should be used as “constructive feedback” to assist with action planning.
Submitting Your Evaluation
• Step 5 – Reporting/Entering Data
  – The coach/facilitator enters the data from the Benchmarks of Quality Scoring Form at www.pbssurveys.org
  – See the PBS Surveys Users Manual for specific instructions
  – District/state coordinators may establish due dates for completion of the BoQ annually, or more frequently as needed
PBS Surveys
www.pbssurveys.org
Using the BoQ Results to Boost Implementation and Validate Outcomes
Using the BoQ Results
• Action plan to increase fidelity of implementation
  – School
  – District
  – State/project
• Outcome reporting
• Model school identification
BoQ Max Scores per Critical Element
School
PBS Surveys - BoQ Report: Critical Elements
School
Jones Middle School
Are our Benchmarks scores above 70 and rising?
Scores have never been over 70 and dropped 15 points last year.
School
PBS Surveys - BoQ Report: Overall Scores
District
[Chart: District PBS Implementation Levels – average % of possible points earned per Benchmark category (Team, Faculty Commitment, Discipline Procedures, Data Entry, Expectations, Rewards, Teaching, Implementation Plan, Crisis, Evaluation), 2004-2005 (11 schools) vs. 2005-2006 (15 schools).]

Are our schools implementing PBS with fidelity?
Average BoQ scores are over 70% and increasing in all 10 domains.

District
![Page 70: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/70.jpg)
District Average Referrals by Implementation Level
[Bar chart: average # ODR per 100 students for Low Implementers* vs High Implementers, 2004-2005 and 2005-2006 school years]
*(Implementation level based upon score on School-Wide PBS Benchmarks of Quality; >70 or <70 of a possible 100 points)
Is there a difference in ODR outcomes for schools? Low implementers have many more ODRs, but the number is decreasing.
District
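The high/low split behind these district charts is a single threshold on the BoQ total score. A minimal sketch of that split, assuming the 70-of-100-points cutoff shown in the slides (the school names and scores below are invented for illustration):

```python
# Illustrative example of the implementation-level split used in the
# district charts: schools at or above 70 of the BoQ's 100 possible
# points are counted as high implementers, the rest as low.

def implementation_level(boq_score, threshold=70):
    return "high" if boq_score >= threshold else "low"

schools = {"School A": 82, "School B": 64, "School C": 70}
levels = {name: implementation_level(score) for name, score in schools.items()}
print(levels)  # {'School A': 'high', 'School B': 'low', 'School C': 'high'}
```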
![Page 71: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/71.jpg)
District Average ISS Days by Level of Implementation
[Bar chart: average # days ISS per 100 students for Low Implementers* vs High Implementers, 2004-2005 and 2005-2006 school years]
Is PBS impacting ISS in our schools? High-implementing schools have 70% fewer ISS days, and the number decreased by 50%.
District
![Page 72: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/72.jpg)
72
State
![Page 73: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/73.jpg)
73
State
![Page 74: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/74.jpg)
74
State
![Page 75: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/75.jpg)
75
State
Academic Achievement: Students at Level 3+ in Reading on Florida's Comprehensive Assessment Test
[Bar chart: average percentage of students scoring Level 3+ for All FL Schools, Low implementers (BoQ<70), and High implementers (BoQ>=70) across 2004-2005, 2005-2006, and 2006-2007; data labels: 53, 60, 67; 57, 59, 68; 57, 58, 67]
![Page 76: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/76.jpg)
76
Using Benchmarks Results
• How will you use the results of the Benchmarks?
• At the school, district, state/project level?
• As it relates to fidelity of implementation?
• As it relates to outcomes?
• As it relates to identifying model schools?
• Other?
![Page 77: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/77.jpg)
77
Benchmarks for Advanced Tiers (BAT)
![Page 78: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/78.jpg)
78
Benchmarks for Advanced Tiers (BAT)
• The Benchmarks for Advanced Tiers (BAT) allows school teams to self-assess the implementation status of Tier 2 (secondary, targeted) and Tier 3 (tertiary, intensive) behavior support systems within their school, and is designed to answer three questions:
1. Are the foundational (organizational) elements in place for implementing secondary and tertiary behavior support practices?
2. Is a Tier 2 support system in place?
3. Is a Tier 3 system in place?
![Page 79: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/79.jpg)
79
BAT Organization
Tier 1: Implementation of School-wide PBS
Tiers 2-3: Foundations
• Commitment
• Student Identification
• Monitoring and Evaluation
Tier 2: Support Systems
Main Tier 2
• Strategy Implementation
• Strategy Monitoring and Evaluation
Tier 3: Intensive Support Systems
Tier 3: Assessment and Plan Development
![Page 80: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/80.jpg)
80
Instructions for Completing
Who: The team(s) or individuals involved with Tiers 2 and 3 behavior support
How: As a group, or each member independently. If completed independently, the team reconvenes to review scores on each item. The team (or individuals involved with Tiers 2 and 3 behavior support) must reach consensus on the score for each item.
Scoring: After reviewing the rubric for each item, select the score that most closely matches the state of affairs at the school. Rate each item as "2" (fully in place), "1" (partially in place), or "0" (not yet started).
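The 0/1/2 rubric above reduces to simple arithmetic once consensus ratings are recorded. The sketch below is an illustrative (not official) way to summarize a section's ratings as a percent of possible points; the section name and ratings are made-up examples.

```python
# Illustrative sketch (not part of the BAT itself): each item is rated
# 2 = fully in place, 1 = partially in place, 0 = not yet started,
# so a section can be summarized as the percentage of possible points
# earned (each item is worth up to 2 points).

def section_percentage(item_scores):
    """Percent of possible points earned for one section of ratings."""
    possible = 2 * len(item_scores)
    return 100.0 * sum(item_scores) / possible

# Hypothetical consensus ratings for a five-item section:
tier2_foundations = [2, 1, 0, 2, 1]
print(round(section_percentage(tier2_foundations), 1))  # 60.0
```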
![Page 81: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/81.jpg)
81
Additional Tips
• Before starting the first administration, read through the items to determine who on campus is likely to have knowledge of the topic(s).
• Because the BAT covers several topic areas and usually requires input from multiple people, it is best to work from a paper copy until all items have been scored.
![Page 82: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/82.jpg)
82
Tier 1: (A) SWPBS
![Page 83: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/83.jpg)
83
Tiers 2-3: (B) Foundations
![Page 84: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/84.jpg)
84
Tiers 2-3: (D) Monitoring/Eval
![Page 85: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/85.jpg)
85
Tier 2: (E) Tier 2 Support System
![Page 86: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/86.jpg)
86
Tier 2: (F) Main Tier 2 Strategy Intervention
![Page 87: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/87.jpg)
87
Tier 2: (G) Main Tier 2 Strategy Monitoring/Evaluation
![Page 88: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/88.jpg)
88
Additional Tier 2 Interventions
• Items 18-31 may be repeated for other Tier 2 strategies in use at your school for evaluation purposes. However, only the scores associated with the most commonly used Tier 2 strategy will be counted in your Benchmarks for Advanced Tiers (BAT) score.
![Page 89: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/89.jpg)
89
Tier 3: (H) Intensive Support Systems
![Page 90: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/90.jpg)
90
Tier 3: (I) Assessment & Planning
![Page 91: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/91.jpg)
91
Tier 3: (J) Monitoring/Eval
![Page 92: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/92.jpg)
92
Using the BAT Results
• School teams should use the BAT to build an action plan to define next steps in the implementation process.
• The BAT can also assess progress over time, as scores on each area can be tracked on a year-to-year basis.
![Page 93: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/93.jpg)
Benchmarks for Advanced Tiers
![Page 94: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/94.jpg)
Benchmarks for Advanced Tiers
![Page 95: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/95.jpg)
95
Using the Data for Action Planning
![Page 96: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/96.jpg)
96
Using BAT Results
• How will you use the results of the Benchmarks for Advanced Tiers (BAT)?
• At the school, district, state/project level?
• As it relates to fidelity of implementation?
• As it relates to outcomes?
• As it relates to identifying model schools?
• Other?
![Page 97: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/97.jpg)
97
Implementation Integrity Tools
• Will you self-assess implementation fidelity for your school(s)?
– If so, who is responsible for administering, collecting, and synthesizing the data?
– How will it be reported back to the team?
• How will you use the results?
• At the school, district, or state/project level?
• As it relates to fidelity? Outcomes? Other?
![Page 98: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/98.jpg)
98
Implementation Research
School-wide Evaluation Tool (SET)
Individual Student Systems Evaluation Tool (ISSET)
![Page 99: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/99.jpg)
99
Research Measures
• Designed to have high validity and reliability; typically involve external observers assessing procedures during a multi-hour evaluation process
• Used in formal evaluation and research analyses to allow unequivocal documentation of the extent to which SWPBS Universal, Secondary, and Tertiary practices are being used as intended
(PBIS Blueprint, 2010)
![Page 100: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/100.jpg)
100
Comprehensive Evaluation Blueprint:
Implementation Monitoring
Implementation Integrity
Implementation Research
•TIC (1)
Team Implementation Checklist
Sugai, Horner & Lewis-Palmer (2001)
•PIC (1,2,3)
PBS Implementation Checklist for Schools
Childs, Kincaid & George (2009)
•BoQ (1)
Benchmarks of Quality
Kincaid, Childs & George (2005)
•BAT (2,3)
Benchmarks for Advanced Tiers
Anderson, Childs, Kincaid, Horner, George, Todd, Sampson & Spaulding (2009)
•SET (1)
School-wide Evaluation Tool
Sugai, Lewis-Palmer, Todd & Horner (2001)
•ISSET (2,3)
Individual Student Systems Evaluation Tool
Anderson, Lewis-Palmer, Todd, Horner, Sugai & Sampson (2008)
![Page 101: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/101.jpg)
SCHOOL-WIDE EVALUATION TOOL (SET)
Todd, Lewis-Palmer, Horner, Sugai, Sampson, & Phillips (2005)
![Page 102: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/102.jpg)
SET
• Developed as a research tool
• What the SET does
• Discriminates schools that are and are not implementing Tier I
• What the SET does NOT do
• Discern level/degree of implementation
• Give information about the extent of implementation
• Lead to action planning
![Page 103: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/103.jpg)
SET Subscales
1. Expectations defined (2 items)
2. Expectations taught (5 items)
3. Acknowledgement procedures (3 items)
4. Correction procedures (4 items)
5. Monitoring and evaluation (4 items)
6. Management (8 items)
7. District support (2 items)
![Page 104: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/104.jpg)
SET Activities
• Interviews
• Administrator
![Page 105: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/105.jpg)
Administrator Questions: Discipline System
1. Do you collect and summarize office discipline referral information? Yes / No. If no, skip to #4.
2. What system do you use for collecting and summarizing office discipline referrals? (E2)
a. What data do you collect? __________________
b. Who collects and enters the data? ____________________
3. What do you do with the office discipline referral information? (E3)
a. Who looks at the data? ____________________
b. How often do you share it with other staff? __________
4. What type of problems do you expect teachers to refer to the office rather than handling in the classroom/specific setting? (D2)
5. What is the procedure for handling extreme emergencies in the building (e.g., a stranger with a gun)? (D4)
![Page 106: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/106.jpg)
SET Activities
• Interviews
• Administrator
• 15 randomly selected students
• 15 randomly selected staff
• PBIS team members
• Observations
• School rules
• Crisis procedures
• Permanent product review
• SIP
• Action plan and implementation plan
• ODR form
![Page 108: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/108.jpg)
Feature: Evaluation Question
A. Expectations Defined
1. Is there documentation that staff has agreed to 5 or fewer positively stated school rules/behavioral expectations? (0=no; 1=too many/negatively focused; 2=yes)
2. Are the agreed-upon rules & expectations publicly posted in 8 of 10 locations? (See interview & observation form for selection of locations.) (0=0-4; 1=5-7; 2=8-10)
B. Behavioral Expectations Taught
1. Is there a documented system for teaching behavioral expectations to students on an annual basis? (0=no; 1=states that teaching will occur; 2=yes)
2. Do 90% of the staff asked state that teaching of behavioral expectations to students has occurred this year? (0=0-50%; 1=51-89%; 2=90-100%)
3. Do 90% of team members asked state that the school-wide program has been taught/reviewed with staff on an annual basis? (0=0-50%; 1=51-89%; 2=90-100%)
4. Can at least 70% of 15 or more students state 67% of the school rules? (0=0-50%; 1=51-69%; 2=70-100%)
5. Can 90% or more of the staff asked list 67% of the school rules? (0=0-50%; 1=51-89%; 2=90-100%)
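Several of these items convert an interview percentage into the 0/1/2 item score using fixed bands, e.g. 0 = 0-50%, 1 = 51-89%, 2 = 90-100% for the staff questions, with a 70% cutoff for the student-rules question. A minimal sketch of that banding, with the thresholds as parameters since they vary by item:

```python
# Illustrative banding for SET interview items: the percentage of
# respondents meeting a criterion is mapped to a 0/1/2 item score.
# Defaults follow the staff items (0=0-50%; 1=51-89%; 2=90-100%);
# the student-rules item would pass full=70 (0=0-50%; 1=51-69%; 2=70-100%).

def band_score(pct, partial=51, full=90):
    if pct >= full:
        return 2
    if pct >= partial:
        return 1
    return 0

print(band_score(95))           # 2
print(band_score(60))           # 1
print(band_score(72, full=70))  # 2 (student-rules bands)
```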
![Page 109: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/109.jpg)
Feature: Evaluation Question
G. District-Level Support
1. Does the school budget contain an allocated amount of money for building and maintaining school-wide behavioral support? (0=no; 2=yes)
2. Can the administrator identify an out-of-school liaison in the district or state? (0=no; 2=yes)
![Page 110: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/110.jpg)
Scoring the SET
1. Calculate percentage of points earned for each subscale
2. Graph scores on each subscale
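Step 1 above can be sketched as follows. The subscale item counts come from the earlier "SET Subscales" slide, each item is worth up to 2 points, and the earned points here are invented examples rather than real SET data.

```python
# Sketch of SET scoring step 1: percent of points earned per subscale.
# Item counts per subscale follow the "SET Subscales" slide; each item
# is worth up to 2 points. The earned points below are made up.

SUBSCALE_ITEMS = {
    "Expectations defined": 2,
    "Expectations taught": 5,
    "Acknowledgement procedures": 3,
    "Correction procedures": 4,
    "Monitoring and evaluation": 4,
    "Management": 8,
    "District support": 2,
}

def subscale_percentages(points_earned):
    """points_earned: dict mapping subscale name -> points earned."""
    return {
        name: 100.0 * points_earned[name] / (2 * n_items)
        for name, n_items in SUBSCALE_ITEMS.items()
    }

example = {
    "Expectations defined": 4, "Expectations taught": 7,
    "Acknowledgement procedures": 5, "Correction procedures": 6,
    "Monitoring and evaluation": 8, "Management": 12,
    "District support": 3,
}
pcts = subscale_percentages(example)
print(round(pcts["Expectations taught"], 1))  # 70.0
```

Each percentage then becomes one bar in the subscale graph (step 2).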
![Page 111: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/111.jpg)
SET: Elementary School K (pre/post)
[Bar chart: % of features implemented for each subscale (Expect. defined, Expect. taught, Acknowledgment, Corrections, Evaluation, Leadership, District Support) and the mean, fall 98 vs fall 99]
![Page 112: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/112.jpg)
SET: Middle School T (year 3 to 4)
[Bar chart: % of features implemented for each subscale (Expect. defined, Expect. taught, Acknowledgement, Corrections, Monitoring, Leadership, District Support) and the mean, fall 98 vs fall 99]
![Page 113: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/113.jpg)
Can Schools Adopt School-Wide PBS Systems?
[Bar chart: SET total scores (0-100) for 18 Oregon and Hawaii schools (a-r) at Pre, Post 1, and Post 2]
![Page 114: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/114.jpg)
Individual Student Systems Evaluation Tool (ISSET)
Anderson, Lewis-Palmer, Todd, Horner, Sugai, and Sampson, (2008)
![Page 115: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/115.jpg)
ISSET
• Developed as a research tool
• What the ISSET does
• Discriminates schools that are and are not implementing Tiers II and III
• Provides in-depth analysis of the extent to which tiers are in place
• What the ISSET does NOT do
• Lead to action planning
![Page 116: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/116.jpg)
ISSET
• 36 questions across 3 sub-scales
• Measurement
• Interview questions: staff, students
• Permanent product review (FBAs, BSPs, intervention manuals for Tier II)
• Use
• Administered by a trained ISSET evaluator (external to school)
• Takes about 2 hours to administer
• Scoring requires about 30 min
![Page 117: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/117.jpg)
What the ISSET Measures
Foundations
– Commitment
– Team-based Planning
– Student Identification
– Monitoring and Evaluation
Targeted Interventions
– Implementation
– Evaluation and Monitoring
Intensive Individualized Interventions
– Assessment
– Implementation
– Evaluation and Monitoring
[Diagram: Systems, Practices, Data]
![Page 118: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/118.jpg)
Components of the ISSET
Data collection protocol
Interview questions
– Administrator
– Behavior support team leader
– 5 randomly selected teachers
Permanent Product Review
ISSET scoring guide
– Organization of the scoring guide
Score summary page
![Page 119: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/119.jpg)
Sample Item from Commitment
![Page 120: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/120.jpg)
120
![Page 121: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/121.jpg)
![Page 122: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/122.jpg)
![Page 123: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/123.jpg)
Summary Score Page
![Page 124: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/124.jpg)
Overall Scores
![Page 125: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/125.jpg)
In-Depth Scores
![Page 126: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/126.jpg)
126
Implementation Research Tools
• Will you research implementation fidelity for your school(s)?
– If so, who is responsible for administering, collecting, and synthesizing the data?
– How will it be reported back to the team?
• How will you use the results?
• At the school, district, or state/project level?
• As it relates to fidelity? Outcomes? Other?
![Page 127: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/127.jpg)
127
Back to the Big Picture
[Flow diagram: FLPBS → Districts → Coaches → Schools, supported by training and on-going technical assistance; the evaluation process runs throughout systems preparation, identification/assessment, service provision, and products and dissemination, with evaluation data collected at readiness, mid-year, and end-of-year points]
Evaluation in Readiness (Systems Preparation):
• District Action Plan
• District Readiness Checklist
• School Readiness Checklist
• New School Profile (includes ODR, ISS, OSS)
Evaluation in Training and Identification (Mid-Year I and II):
• Discipline Records
• ESE Referrals
• Surveys
• Walkthroughs
• PIC
• Classroom Assessment Tool
• Student rank/rating
• Teacher requests
• Lack of response
• BAT
• Behavior Rating Scale
• Daily Progress Report Charts
End-Year Evaluation:
• Impact: Outcome data (ODR, ISS, OSS), FL Comprehensive Assessment Test, Benchmarks of Quality, School Demographic Data, PBS Walkthrough, Daily Progress Reports, Behavior Rating Scales, Climate Surveys
• Implementation Fidelity: PBS Implementation Checklist (PIC), Benchmarks of Quality (BoQ), Benchmarks for Advanced Tiers (BAT), School Demographic Data, School-wide Implementation Factors, Tier 3 plan fidelity checklist, BEP Fidelity checklist
• Project Impact: Attrition Survey/Attrition Rates, District Action Plans
• Client Satisfaction: School-Wide Implementation Factors, District Coordinator's Survey, Training Evaluations
Products and Dissemination:
• Annual Reports
• Revisions to training and technical assistance process
• National, state, district, and school dissemination activities
• Website
• On-line training modules
![Page 128: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/128.jpg)
128
How Do These Evaluation Tools Fit into Your Big Picture?
• How will you integrate the necessary tools into your overall evaluation system?
Implementation Monitoring: TIC, PIC
Implementation Integrity: BoQ, BAT
Implementation Research: SET, ISSET
![Page 129: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/129.jpg)
129
Data-Based Improvements Made
1. Increased emphasis on BoQ results for school and district-level action planning
2. Increased training to District Coordinators and Coaches; T.A. targeted areas of deficiency based upon data
3. Academic data used to increase visibility and political support
4. Specialized training for high schools
5. Identifying critical team variables impacted via training and T.A. activities
6. Revised Tier 1 PBS Training to include classroom strategies and the problem-solving process within an RtI framework
7. Enhanced monthly T.A. activities
![Page 130: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/130.jpg)
130
In Summary…
1. Know what you want to know
2. Compare fidelity of implementation with outcomes: this presents a strong case for implementing Tier 1 PBS with fidelity
3. Additional sources of data can help a state determine not only whether the Tier 1 PBS process is working, but also why it is or is not working
4. Address state, district, and school systems issues that may impact implementation success
![Page 131: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/131.jpg)
131
Some Resources
• Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2010). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved from www.pbis.org
• Childs, K., Kincaid, D., & George, H.P. (in press). A Model for Statewide Evaluation of a Universal Positive Behavior Support Initiative. Journal of Positive Behavior Interventions.
• George, H.P. & Kincaid, D. (2008). Building District-wide Capacity for Positive Behavior Support. Journal of Positive Behavioral Interventions, 10(1), 20-32.
• Cohen, R., Kincaid, D., & Childs, K. (2007). Measuring School-Wide Positive Behavior Support Implementation: Development and Validation of the Benchmarks of Quality (BoQ). Journal of Positive Behavior Interventions.
![Page 132: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/132.jpg)
132
Evaluation Instruments
• PBIS website:– http://www.pbis.org/evaluation/default.aspx
• FLPBS:RtIB Project Coach’s Corner:– http://flpbs.fmhi.usf.edu/coachescorner.asp
• PBS Surveys– http://www.pbssurveys.org/pages/Home.aspx
![Page 133: Tools for Developing a Comprehensive Evaluation Template](https://reader031.vdocuments.mx/reader031/viewer/2022020704/61fb57782e268c58cd5d0830/html5/thumbnails/133.jpg)
133
Contact
Heather George, Ph.D. & Karen Childs, M.A.
University of South Florida
Email: [email protected]
Website: http://flpbs.fmhi.usf.edu

Cynthia Anderson, Ph.D.
University of Oregon
Email: [email protected]