EVAAS and the NC Educator Evaluation System
October 23, 2012
Macon County Schools Principals and Assistant Principals
Joyce Gardner, Professional Development Consultant, DPI
EVAAS and the
NC Educator Evaluation System
Joyce Gardner, Professional Development Consultant, Region [email protected]
Resources and Materials
• http://region8wnc.ndpi.wikispaces.net
• http://wikicentral.ncdpi.wikispaces.net
• http://evaas.ncdpi.wikispaces.net/home
Agenda
• Welcome
• EVAAS – What is it? What’s in it for Teachers?
• System Overview
• Pre-Assessment
• Connections with Educator Evaluation
• Reflective Assessments
• Reports
• Exit Ticket
Outcomes:
• Make connections between EVAAS and Educator Evaluation Standards
• Define Value-Added
• Differentiate between Achievement and Growth
• Explore reflective assessments
• Identify the meaning of the “Whiskers”
• See various EVAAS reports
Resources
Wikicentral.ncdpi.wikispaces.net
http://evaas.ncdpi.wikispaces.net/home
Data Literacy Module
https://center.ncsu.edu/nc
Data Resource Guide
http://www.ncpublicschools.org/acre/improvement/resources/
Growing Data Literacy Skills
Pre-Assessment
Poll: I am very familiar with the Education Va...
Poll: I know how to login to the EVAAS website...
Poll: I know how to navigate the EVAAS website...
Poll: I understand EVAAS report names.
Poll: I know how to use the EVAAS website to g...
Poll: I know how to access EVAAS reports for i...
Poll: I am able to analyze the metrics in EVAA...
Poll: I know how to collect evidence from EVAA...
Poll: I know how to collect evidence from EVAA...
Poll: I know how to interpret the following re...
Poll: I am able to communicate the findings of...
Poll: I am able to use data analysis to initia...
A Data Literate Person Can…
A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short- and long-term decision-making.
Why should EVAAS Matter to You?
Teachers • Principals • District Leaders
NC Professional Teaching Standards
Standard I: Teachers demonstrate leadership.
Take responsibility for the progress of all students
Use data to organize, plan, and set goals
Use a variety of assessment data throughout the year to evaluate progress
Analyze data
Standard IV: Teachers facilitate learning for their students.
Use data for short and long range planning
Standard V: Teachers are reflective on their practice.
Collect and analyze student performance data to improve effectiveness
Standard 6 for Teachers
Teachers contribute to the academic success of students.
The work of the teacher results in acceptable, measurable progress for students based on established performance expectations using appropriate data to demonstrate growth.
NC Standards for School Executives
Standard 2: Instructional Leadership
• Focuses his or her own and others’ attention persistently and publicly on learning and teaching by initiating and guiding conversations about instruction and student learning that are oriented towards high expectations and concrete goals;
• Creates processes for collecting and using student test data and other formative data from other sources for the improvement of instruction
• Ensures that there is an appropriate and logical alignment between the curriculum of the school and the state’s accountability program
Standard 8 for School Executives
Academic Achievement Leadership
School executives will contribute to the academic success of students. The work of the school executive will result in acceptable, measurable progress for students based on established performance expectations using appropriate data to demonstrate growth.
Benefits and Considerations for Teachers
• Understand academic preparedness of students before they enter the classroom.
• Monitor student progress, ensuring growth opportunities for all students.
• Modify curriculum, student support, and instructional strategies to address the needs of all students.
Professional Development
is the Key
• Culture of School
• Sensitivity of Data
• Finger Pointing and Blame Game
• Window vs. Mirror
Benefits for Principals
• Gain a consolidated view of student progress and teacher effectiveness, as well as the impact of instruction and performance.
• Bring clarity to strategic planning and function as a catalyst for conversations that must take place to ensure that all students reach their potential.
• Understand and leverage the strengths of effective teachers.
• Use the valuable resource of effective teaching to benefit as many students as possible.
ACHIEVEMENT VS. GROWTH
Student Achievement
End of School Year
Proficient
• Is the student able to meet specific standards?
• Do students fall into a limited range or band of achievement?
Does not account for:
• change outside of that range
• student ability before they came to class
“I can do it”
Student Growth
End of School Year
Proficient
Start of School Year
Not Proficient
Change over time
Student growth:
• Accounts for student achievement within range or beyond that range
• Compares actual to predicted student achievement
• Discerns between teacher impact and student ability
• Accounts for student ability before they came to class
Improvement and progression
Achievement vs. Growth
Student Achievement: Where are we?
• Highly correlated with demographic factors
Student Growth: How far have we come?
• Highly dependent on what happens as a result of schooling rather than on demographic factors
The EVAAS Philosophy
• All students deserve opportunities to make appropriate academic progress every year.
• There is no “one size fits all” way of educating students who enter a class at different levels of academic achievement.
The EVAAS Philosophy
• Adjustments to instruction should be based on the students’ academic needs, not on socio-economic factors.
• "What teachers know and can do is the most important influence on what students learn." (National Commission on Teaching and America's Future, 1996)
Achievement and Poverty
How is this fair?
Academic Growth and Poverty
No one is doomed to failure.
Proficiency vs. Growth
Scenario 1: A 5th grader begins the year reading at a 1st grade level and ends the year reading at a 4th grade level. Proficient: NO. Growth: YES.
Scenario 2: A 5th grader begins the year reading at a 7th grade level and ends the year reading at the 7th grade level. Proficient: YES. Growth: NO.
EVAAS Overview
What is EVAAS?
E • Education
V • Value
A • Added
A • Assessment
S • System
So What Does It Do?
What is EVAAS?
SAS EVAAS Analyses
Writing
ACT
End of Course
End of Grade
LOOKING AHEAD
Planning for Students’ Needs:
Student Projections to Future Tests
LOOKING BACK
Evaluating Schooling Effectiveness:
Value Added & Diagnostic Reports
How can EVAAS help me?
Improve the Education Program
EVAAS: Looking Back
Past Program Effectiveness
Local Knowledge & Expertise
EVAAS: Looking Ahead
Incoming Student Needs
Education Value Added Assessment System
– Answers the question of how effective a schooling experience is for learners
– Produces reports that:
• Predict student success
• Show the effects of schooling at particular schools
• Reveal patterns in subgroup performance
EVAAS extracts data AFTER DPI collects it through the secure shell. DPI runs its processes and checks the data for validity. Once DPI has completed its processes, it presents the data to the SBE. At that point, the data is sent to EVAAS.
What is value-added assessment?
• An approach to analyzing student achievement data.
• Follows a student’s academic progress over time.
• Connects each student’s test records from grade to grade and across subjects, so that the influence of the district, school, and teacher on the rate of academic progress can be extracted through complex data analysis.
Changes in Reporting for 2012-13
2011-2012 → 2012-13
Above → Exceeds Expected Growth
Not Detectably Different → Meets Expected Growth
Below → Does Not Meet Expected Growth
Three Kinds of Reports
• Value-added
• Diagnostic
• Performance diagnostic
– District- and school-level reports have elements in common: if you can read one, you can read the other.
District Value-Added Report
• Used to evaluate the overall effectiveness of a district on student progress
• Compares each district to the average district in the state for each subject tested in the given year
• Indicates how a district influences student progress in the tested subjects
Value-Added Reporting
Overall effectiveness of a school on student progress: School Value Added Reports compare each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects.
Scores from the EOG tests are converted to State NCEs (Normal Curve Equivalent scores)
• NCE scores are on an equal-interval scale
• Allow for a comparison of students' academic attainment level across grades.
• Remain the same from year to year for students who make exactly one year of progress after one year of instruction, even though their raw scores would differ.
• NCE gain would be zero (0) for these students.
Normal Curve Equivalent Score: NCE
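As a rough sketch of the conversion described above, NCE scores map percentile ranks onto an equal-interval normal-curve scale. The function name below is illustrative, and 21.06 is the standard NCE scaling constant (chosen so that percentiles 1, 50, and 99 line up with NCEs of about 1, 50, and 99); this is not the exact EOG-to-NCE conversion EVAAS uses.

```python
from statistics import NormalDist

def percentile_to_nce(percentile: float) -> float:
    """Convert a percentile rank (0-100) to a Normal Curve Equivalent.

    NCEs rescale normal-distribution percentiles onto an equal-interval
    scale anchored at the NCE base of 50.
    """
    z = NormalDist().inv_cdf(percentile / 100.0)  # z-score for the percentile
    return 50.0 + 21.06 * z

# The 50th percentile sits exactly at the NCE base of 50.
print(round(percentile_to_nce(50.0), 1))  # 50.0
```

Because the scale is equal-interval, a one-point NCE gain means the same amount of movement whether a student starts near the bottom, middle, or top of the distribution.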
Value-Added Reporting
If the Mean NCE Gain is greater than or equal to zero, the average student in this school has achieved at least a year’s worth of academic growth in a year.
If the Mean NCE Gain is less than zero, the average student in this school has achieved less growth than expected.
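The mean-gain comparison above can be sketched in a few lines, assuming each student has a prior-year and current-year NCE score (the function and variable names are illustrative, not part of EVAAS):

```python
def mean_nce_gain(prior_nces: list[float], current_nces: list[float]) -> float:
    """Average NCE change for a group of students.

    Because a student's NCE stays flat when they make exactly one year
    of progress, a mean gain of 0 means the average student grew a year.
    """
    gains = [cur - pri for pri, cur in zip(prior_nces, current_nces)]
    return sum(gains) / len(gains)

# Gains of +3, 0, and -2 average to a small positive mean gain,
# so the average student here made at least a year's growth.
print(mean_nce_gain([40.0, 50.0, 60.0], [43.0, 50.0, 58.0]))
```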
Value-Added Reporting
The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide.
If the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state.
District Diagnostic Reports
• Identify patterns or trends of progress among students expected to score at different achievement levels
Diagnostic Report
District Performance Diagnostic Reports
• Used to identify patterns or trends of progress among students predicted to score at different performance levels as determined by their scores on NC tests
• Students assigned to Projected Performance Levels based on their predicted scores
• Shows the number (Nr) and percentage of students in the district that fall into each Projected Performance Level
District Performance Diagnostic Reports
The Reference Line in the table indicates the gain necessary for students in each Prior-Achievement Subgroup to make expected progress and reflects the growth standard.
When Gain is reported in NCEs, as it is here, the growth standard is 0.0.
District Performance Diagnostic Reports
The Gain is a measure of the relative progress of the school’s students in each Prior-Achievement Subgroup compared to the Growth Standard.
District Performance Diagnostic Reports
Standard errors appear beneath the Gain for each Prior-Achievement Subgroup. The standard error allows the user to establish a confidence band around the estimate. The smaller the number of students, the larger the standard error.
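EVAAS derives its standard errors from its underlying statistical model, but the way a standard error shrinks as the number of students grows, and the way it yields a confidence band, can be illustrated with the simple-sample version below (a sketch for intuition, not the EVAAS computation):

```python
import math

def gain_confidence_band(gains: list[float], width: float = 1.0):
    """Return (lower, mean, upper) for a list of student gains.

    SE = sample standard deviation / sqrt(n), so the more students
    in the subgroup, the smaller the standard error and the band.
    """
    n = len(gains)
    mean = sum(gains) / n
    var = sum((g - mean) ** 2 for g in gains) / (n - 1)  # sample variance
    se = math.sqrt(var / n)
    return mean - width * se, mean, mean + width * se

lo, mean, hi = gain_confidence_band([2.0, -1.0, 3.0, 0.0, 1.0])
print(lo < mean < hi)  # True: the mean sits inside its own band
```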
Interpreting the Pie Chart
• Yellow: students progressed at a rate similar to that of students in the average district in the state.
• Green: the progress of students was more than one standard error above that of students in the average district in the state.
• Light Red: students made more than one standard error less progress in this subject than students in the average district in the state.
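The one-standard-error rule behind these colors can be written as a small classifier (a sketch mirroring the slide's labels, not an EVAAS API):

```python
def color_for_gain(gain: float, std_error: float, reference: float = 0.0) -> str:
    """Classify a gain estimate against the reference line.

    Green: more than one standard error above the reference.
    Light Red: more than one standard error below the reference.
    Yellow: within one standard error (similar to the average district).
    """
    if gain - std_error > reference:
        return "Green"
    if gain + std_error < reference:
        return "Light Red"
    return "Yellow"

print(color_for_gain(2.0, 0.5))   # Green
print(color_for_gain(-2.0, 0.5))  # Light Red
print(color_for_gain(0.3, 0.5))   # Yellow
```

The same logic underlies the whisker displays later in the deck: an estimate whose whiskers contain the reference line cannot be confidently placed above or below it.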
BREAK – Return in 15 minutes.
Reflective Assessments
Value-Added Reports
Diagnostic Reports: Looking for Patterns
Green line = Reference line: the amount of progress students need to make to maintain their entering achievement level.
Bars above the line = students in that group made good progress.
Bars below the line = students left this grade at a lower achievement level than when they started.
Diagnostic Reports: Looking for Patterns
Blue bars = progress of students in the most recent year.
Gold bars = progress of students in up to three previous cohorts, when data are available.
No bar = groups with fewer than five students.
Diagnostic Reports: Looking for Patterns
The red vertical line that intersects each bar indicates one standard error above and below the progress measure. The standard error allows you to establish a confidence band around the estimate.
School Diagnostic: Shed Pattern
Lowest achieving students are making sufficient progress. Students at an average achievement level are making expected progress. Highest achieving students appear to be losing ground. Teachers and administrators will want to find ways to create more progress opportunities for high achieving students.
School Diagnostic: Reverse Shed Pattern
High achieving students are making excellent progress. Students who are average in achievement also are making sufficient progress. Lowest achieving students are not making as much progress as they should. A pattern like this widens the achievement gap. Teachers and administrators should consider how to help lower achieving students gain more ground.
School Diagnostic: Tent Pattern
Students in the middle of the achievement distribution are making sufficient progress. Lower- and higher-achieving students are falling behind their peers. Consider how to support low-achieving students and how to challenge high-achieving students.
School Diagnostic: V Pattern
Opposite of the Tent Pattern: only the lowest and the highest achieving students are making good progress. Students in-between have not had enough opportunities for academic growth.
School Diagnostic: Opportunity Gap Pattern
Students in every achievement group are making sufficient progress in the most recent year, except for the second group. Consider how to adjust classroom instruction to meet these students’ needs. Which approaches that were successful with the lowest achieving students could be expanded to include students in the second achievement group?
Draw an ideal pattern on a Diagnostic Report that would indicate a closing of the achievement gap.
Diagnostic Reports – Desirable Pattern
Diagnostic Report: Desirable Pattern
What does this say?
Diagnostic Report: Desirable Pattern
All bars above the green line = the district was highly effective with students in all achievement groups. Students in the lowest quintile made more progress than students in the other quintiles.
• These students are starting to catch up with their peers; the gap is closing; they are increasing their performance by more than a year’s worth of growth.
DIAGNOSTIC & PERFORMANCE DIAGNOSTIC REPORTS
(PART 2)
Diagnostic Reports – the Whiskers
The red Whisker represents the confidence interval due to the standard error of the mean. There is a high probability that the actual mean falls somewhere within the Whiskers. The size of the confidence interval is determined by the sample size.
Diagnostic Reports – the Whiskers
If the Whisker passes over the green line (reference line), expected growth is indicated. There is a chance the mean is actually on the other side of the green line. It is not certain that the teacher is exclusively above or below the reference line if the whisker crosses the green reference line.
Diagnostic Reports – the Whiskers
On the left, the Whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓).
On the right, the Whiskers contain the green line, so the group represented by the bar made average progress (-).
Diagnostic Reports – the Whiskers
Here, the Whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓).
Diagnostic Reports – the Whiskers
Here, the Whiskers contain the green line, so the group represented by the bar made average progress (---).
1. Go to the website: www.ncdpi.sas.com
Overview of School Effects (sample data)
BREAK – Until next time…
MACON County School Administrators
• PART 2
Overview of School Effects (sample data)
Overview of School Effects
On Your Own
• Finish the table. Do this by yourself.
• Use sample data.
• Complete your table.
Overview of School Effects
What did you find?
• Interesting Patterns
• Insights
• Areas of Concern
• Areas of Celebration
Overview of School Effects (sample data)
1. Go to the website: www.ncdpi.sas.com
2. BOOKMARK IT!
3. Secure & Convenient Online Login
Do you see this?
Then Sit Tight!
Overview of School Effects: It’s Your Turn!
• Find the blank table. Do this by yourself.
• Using your data, fill in your table.
Overview of School Effects
What did you find?
• Interesting Patterns
• Insights
• Areas of Concern
• Areas of Celebration
Lunch
Student Pattern Report
Student Pattern Report
Key points to remember:
• The report shows growth for the lowest, middle, and highest achieving students within the chosen group.
• The report can be used to explore the progress of students with similar educational opportunities.
• Like all diagnostic reports, this report is for diagnostic purposes only.
• A minimum of 15 students is needed to create a Student Pattern Report.
Student Pattern Report
Student Pattern Report – Key Questions
Different experience?
Different strategies?
Different needs?
Number of hours?
Student Pattern Report – Key Questions
Different experience?
Different strategies?
Different needs?
Number of hours?
Rerun the report with new criteria.
YES!
Student Pattern Report – Next Steps
16 Students who attended for 40+ hours
All 31 Students in the Program
Less Informed Conclusion: We need to change the selection criteria for this program.
More Informed Conclusion: We need to adjust the recommended hours for participants.
CUSTOM STUDENT REPORT
Custom Student Report HANDOUT
Today’s Presenters
Heather Mullins, Professional Development Consultant, Region [email protected]
Robert Sox, Professional Development Consultant, [email protected]
Joyce Gardner, Professional Development Consultant, Region [email protected]
Jason Rhodes, Professional Development Consultant, Region [email protected]