Quality Review 2010-11
September/October 2010
Academic Quality
Division of Performance & Accountability



TRANSCRIPT

Page 1: Where we were: 2009-10 Quality Review

Quality Review 2010-11

September/October 2010

Academic Quality

Division of Performance & Accountability

Page 2: Where we were: 2009-10 Quality Review

Where we were: 2009-10 Quality Review

• Shifted primary focus of the QR process: from a school’s data culture to its organizational and instructional coherence

• Stated that effective data and resource use, professional development, goal setting, and monitoring should be evident in the instructional core across classrooms

• The accountability is in the task

• If it’s not in the instructional core, it’s not there (question of impact)

• Conceptualized teacher teams as the engine of improvement: the vast majority of teachers need to be engaged in collaborative inquiry

2

Page 3: Where we were: 2009-10 Quality Review

Impact: The QR and Collaborative Inquiry

More teachers participated in collaborative inquiry teams in schools that experienced QRs in 09-10 than in those that did not:
• 74% of teachers in schools with a QR
• 63% of teachers in schools without a QR
• The difference is statistically significant

Within those reviewed, the relationship was linear:
• Well Developed schools: 80% of teachers
• Proficient schools: 69% of teachers
• UPF schools: 59% of teachers

Note data source: Inquiry Spaces

3

Page 4: Where we were: 2009-10 Quality Review

A challenge presented to us: Test Scores

After soaring for years, test scores appear to have dropped dramatically. A correction is occurring, which has led to confusion. Questions are being asked:
• What is really going on in our schools?
• How much do we trust the quantitative data?
• What does it mean?

This presents an opportunity for the work of Network Teams and Quality Reviewers.

A focus on classrooms and the instructional core is even more important.

4

Page 5: Where we were: 2009-10 Quality Review

Elmore agrees…

Elmore’s research on instructional coherence:

a. Variability of practice across classrooms within a school is higher than across schools

b. Strong cultural norms in schools, among teachers (i.e., teams) must raise the standard at the low end of practice, while variability at the high end of practice (i.e., innovation) should be encouraged

c. Instructional leadership shapes the school culture; school culture influences the collective efficacy of teachers; collective efficacy of teachers has the strongest correlation to student outcomes

5

Page 6: Where we were: 2009-10 Quality Review

Proportion of variance in student gain scores (reading, math) explained by level, from the Prospects study:
• Class: 60% reading, 52-72% math
• Students: 28% reading, 19% math
• Schools: 12% reading, 10-30% math

Rowan, et al., “. . . Prospects . . . ,” Teachers College Record (2002).

From Elmore presentation, June 6, 2010

Page 7: Where we were: 2009-10 Quality Review

ORGANIZATIONAL COHERENCE AND INSTRUCTIONAL CAPACITY – ELMORE RESEARCH

Instructional leadership:
• Instructional expertise
• Models learning
• Develops, distributes leadership
• Buffers, modulates external forces

Organizational structure/process:
• Team structure
• Protocols, norms
• Vertical, lateral accountability

Efficacy (individual and collective):
• Locus of control
• Agency
• Effort

Culture:
• Norms/values
• Commitments
• Artifacts

These factors combine to drive quality/performance (student outcomes).

From Elmore presentation, June 6, 2010

Page 8: Where we were: 2009-10 Quality Review

Data Reflection on Classroom Practices:

• The score on indicator 1.2 (classroom practices and pedagogy) has been significantly correlated with a school’s Progress Report score in the following year.

• Indicator 1.2 was the lowest rated of the 20 indicators in 09-10.

• While Well Developed and Proficient schools in 09-10 exhibited high-quality practices in numerous areas, the evidence of impact was not yet apparent in the instructional core across classrooms in a large number of schools:
  • 51% of Well Developed schools earned Proficient or below on indicator 1.2
  • 33% of Proficient schools earned UPF or below on indicator 1.2

Page 9: Where we were: 2009-10 Quality Review

Implication: Refinements to QR are Required

The importance of classroom practices in our evaluation must be heightened, and the evidence clarified (“Look Fors” and “Listen Fors”).

Evidence of excellent practice in all areas of the school must be tied to what is going on in classrooms, specifically what students are doing and producing (“accountability is in the task”).

Our standard of evaluation will be raised across the rubric to that of 1.2.

9

Page 10: Where we were: 2009-10 Quality Review

Changes to Quality Review in 2010-11

To address the areas of concern highlighted by data and critiques of the QR process throughout 2009-10, we have made changes to the:

1. QR Rubric
2. Scoring guidelines
3. Selection criteria
   A. QR-JIT
   B. NSQR
   C. Peer Reviews
4. Site visit protocols
5. Appeals procedure
6. QR Report

There is a memo on the QR webpage explaining most of these changes in greater detail.

10

Page 11: Where we were: 2009-10 Quality Review

1. Quality Review Rubric

> Articulated the Underdeveloped column and moved language down in indicators to more accurately capture the lowest level of practice observed
> UPF → Developing
> Inserted language regarding “Across classrooms…” in various areas of the rubric
> Indicator 2.2 now focuses more explicitly on assessment quality and coherence with curriculum
> Integrated language referring to the Common Core State Standards (4.3, 5.1, 5.2, 5.3)

Note: A color-coded version of the rubric on the QR webpage clearly depicts each change from 2009-10 to 2010-11.

11

Page 12: Where we were: 2009-10 Quality Review

2. Quality Review Scoring Guidelines

• The scoring guidelines are changing to a point-based system with cut scores between quality categories.

• A school will earn points on each of the 20 indicators, and these points will add up directly to the overall score.

• This shift solves a pressing concern regarding fairness. In the past, the scoring policy allowed two schools to earn the same array of indicator scores yet receive different overall scores, depending on how the indicator scores were distributed.
  • Example: a school with four Proficient and 16 Well Developed indicators was scored Proficient overall if pairs of Proficient indicators fell in two separate Quality Statements; another school with the same numbers of Proficient and Well Developed indicators was rated Well Developed overall when the four Proficient indicators fell in four separate Quality Statements.

12

Page 13: Where we were: 2009-10 Quality Review

2. Quality Review Scoring Guidelines (cont.)

• The point-based scoring guidelines also offer the opportunity to weight key indicators more heavily than others. The following indicators will be doubled in scoring weight:
  > 1.1: Rigorous and accessible curriculum
  > 1.2: Differentiated classroom practices and pedagogy
  > 1.3: Leveraging structures, technology, and resources to improve student outcomes
  > 2.2: Assessment quality
  > 4.1: Data-informed staff support and performance evaluation decisions

 

13

Page 14: Where we were: 2009-10 Quality Review

2. Quality Review Scoring Guidelines (cont.)

Using the following point scale:
• Well Developed: 4 points
• Proficient: 3 points
• Developing: 2 points
• Underdeveloped: 1 point

with a total of 20 indicators, five of which are weighted with double value:
• the highest score possible on a Quality Review is 100, and
• the lowest score possible on a Quality Review is 25.
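The arithmetic above can be sketched in code. This is an illustrative Python version, not the DOE’s “QR Scoring Calculator”: the rating codes, function name, and placeholder IDs for the 15 single-weight indicators are ours; the point scale and the five double-weighted indicators come from the preceding slides.

```python
# Sketch of the 2010-11 point-based QR scoring described above.
POINTS = {"WD": 4, "P": 3, "D": 2, "UD": 1}  # Well Developed ... Underdeveloped
DOUBLE_WEIGHTED = {"1.1", "1.2", "1.3", "2.2", "4.1"}  # from the previous slide

def qr_score(ratings):
    """ratings maps each of the 20 indicator IDs to a rating code."""
    total = 0
    for indicator, rating in ratings.items():
        weight = 2 if indicator in DOUBLE_WEIGHTED else 1
        total += weight * POINTS[rating]
    return total

# Sanity check against the slide: 15 single- plus 5 double-weighted
# indicators give a range of 25 (all Underdeveloped) to 100 (all Well Developed).
singles = [f"s{n}" for n in range(15)]  # placeholder IDs for the other 15 indicators
all_wd = {i: "WD" for i in singles + sorted(DOUBLE_WEIGHTED)}
all_ud = {i: "UD" for i in singles + sorted(DOUBLE_WEIGHTED)}
print(qr_score(all_wd), qr_score(all_ud))  # 100 25
```

Worked out by hand: 15 × 4 + 5 × (2 × 4) = 100 at the top, and 15 × 1 + 5 × (2 × 1) = 25 at the bottom, matching the stated range.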

  

14

Page 15: Where we were: 2009-10 Quality Review

2. Quality Review Scoring Guidelines (cont.)

• The chart below shows the cut scores and scoring ranges:

  Scoring Category: Range
  Well Developed: 92-100
  Proficient: 72-91
  Developing: 47-71
  Underdeveloped: 25-46

• The cut line between Well Developed and Proficient remains essentially the same as in 2009-10. The cut lines for Proficient and Developing return to levels similar to those required for Proficient and UPF in 2008-09.

• An Excel file, “QR Scoring Calculator,” has been created to aid score tallying; it is available for download on the Quality Review page of the DOE website.

15
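The cut scores amount to a simple threshold lookup. A minimal sketch, with the function name illustrative rather than part of any DOE tool:

```python
# Map a total QR score to its 2010-11 quality category,
# using the cut scores shown in the chart above.
CUTS = [(92, "Well Developed"), (72, "Proficient"), (47, "Developing")]

def qr_category(score):
    if not 25 <= score <= 100:
        raise ValueError("QR totals range from 25 to 100")
    for floor, name in CUTS:
        if score >= floor:
            return name
    return "Underdeveloped"

print(qr_category(91))  # Proficient
```

Note that the boundaries are inclusive at the bottom of each band, so a score of exactly 92 is Well Developed and 91 is Proficient.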

Page 16: Where we were: 2009-10 Quality Review

3. Quality Review Selection Criteria

• Given the results of State tests in the lower grades and the alterations to our system’s Progress Reports, we would be slated to review over 1,150 schools if we used the QR selection criteria from 2009-10.

• Therefore, we are changing the criteria to ensure that every school experiences a review within a four-year cycle. The following criteria will trigger a Quality Review during 2010-11:
  • 09-10 Progress Report of F, D, or third C in a row (07-08, 08-09, 09-10)
  • Schools in the lowest 10th percentile of PR scores
  • 09-10 Quality Review of UPF or U
  • Schools identified as Persistently Lowest Achieving by New York State
  • Schools with principals at risk of not receiving tenure
  • Schools in their second year (opened in September 2009)*
  • Schools chosen by lottery, within districts, that have not had a review since 2007-08; schools that do not receive a review this year will receive one next year.*

* See slide on Peer Reviews

16

Page 17: Where we were: 2009-10 Quality Review

A. QR-JIT Review

PLA schools, as identified by NYSED, that require a JIT visit this fall will have a QR at the same time. (See the QR-JIT memo online.)

Why?
> Extension of the “EQR” policy in 2009-10 to increase alignment of NYSED and NYCDOE processes
> Decreases disruption to schools (one visit instead of two)
> Increases the number of people contributing to both processes, for better reliability

What to expect:
> An early review (completed by mid-November; SED wanted these schools reviewed by last spring), except for 3 IA Ps
> A 2-day visit by the JIT team, overlapped with a 2- or 2.5-day visit by the QR reviewer (depending on school size)
> A joint NYSED-NYCDOE process, including the sharing of documents and evidence gathered during the school visit
> Separate outcomes:
  • QR report (provisional score shared with school)
  • Recommendation to the JIT (not shared with school)

Page 18: Where we were: 2009-10 Quality Review

B. New School Quality Reviews (NSQR)

• Schools opening in 2010-11 will have a one-day New School Quality Review (NSQR).

• As in 2009-10, these reviews will be conducted by the network team and the reports will be shared internally but not published or used for accountability purposes.

• For more information, see the NSQR documents on the QR webpage of the NYCDOE site.

18

Page 19: Where we were: 2009-10 Quality Review

C. Peer Reviews

• In the last year, DPA documented a number of networks and schools that piloted different models of peer visits and reviews, all with significant positive feedback (see the QR Promising Practices Library: https://www.arisnyc.org/connect/node/813911).

• Every school is encouraged to engage in these formative intervisitations. The options for a more formalized Peer Review process include:
  • Schools in their second year (opened in 09-10)
  • Schools in the selection lottery showing a sustained history of significant gains, i.e., a grade of “A” on the Progress Report in 07-08, 08-09, and 09-10.

• Peer Reviews will be organized and conducted by network teams and will occur in lieu of an external Quality Review. Reports will be shared internally but not published or used for accountability purposes.

• Training will be offered.

• The Peer Review policy memo is online at the DOE QR webpage.

19

Page 20: Where we were: 2009-10 Quality Review

4. Quality Review Site Visit Protocols

• Almost all of the site visit protocols will remain the same.

• At least one of the two teacher team meetings must exhibit an examination of student work in the presence of teacher work (curriculum, academic tasks, assessments/rubrics, etc.).

• Both teacher team meetings will provide an opportunity for the reviewer to triangulate information on, among other things, how the school is approaching the evolving nature of the New York State standards (i.e. implications of the Common Core State Standards).

20

Page 21: Where we were: 2009-10 Quality Review

5. Appeals Procedure

• The appeals procedure has historically had two levels; in 2010-11 it will be collapsed into one level of response.

• Data verification will be separated from the appeal of evidence.

• There will be an appeal form requesting a reference to the rubric, the issue of concern, and the evidence/impact to support the appeal.

• Concerns about the QR process during the site visit must be appealed through the evidence of practice (i.e., what evidence is missing or incorrect due to the process).

• The QR team will decide whether the appeal investigation warrants a school visit.

21

Page 22: Where we were: 2009-10 Quality Review

6. QR Report

• The overall evaluation (summary narrative) section of the QR report has been eliminated.

• The demographics section and the 10 bullet points that detail Strengths and Areas for Improvement remain.

• This change should expedite the writing and Quality Assurance Reading (QAR) process. The QR team expects to commit to a six-week turnaround.

22

Page 23: Where we were: 2009-10 Quality Review

Expectations of Schools and CCSS

• Begin Planning (see rubric)

• Exposure to teachers of CCSS – 2x this year

• A subset of teacher teams engaged in inquiry work related to CCSS (high end performance on QR)

• Schools have received $$, per capita, to engage in PD on the CCSS

Page 24: Where we were: 2009-10 Quality Review

The Rubric & Expectations of the “Evolving State Standards”

Indicator 4.3.a: professional learning opportunities

Indicator 5.1.a: structures to adjust curricular and instructional practices

Indicator 5.1.b: structures to adjust organizational resources

Indicator 5.1.c: structures to adjust capacity-building practices

Indicator 5.2.a: planning to revise assessments

Indicator 5.3.a: long-term planning

Page 25: Where we were: 2009-10 Quality Review

Teacher Teams Engaged in Collaborative Inquiry

• We will continue to meet with two teacher teams engaged in collaborative inquiry

• We understand that teachers may be at a variety of points in the inquiry cycle during the time of the review

• Now we are going to request that at least one of the two meetings involves looking at student work, teacher tasks, and state standards in order to move student outcomes

• Both teacher team meetings provide an opportunity for the reviewer to triangulate information about where the school is in preparing for evolving state standards

Page 26: Where we were: 2009-10 Quality Review

Quality Review Team

Nancy Gannon, Senior Director for School Quality

New Directors for School Quality:• Evelyn Terrell• Carolyn Yaffe• Eileen Waters• Beverly Ffolkes-Bryant

Esther Maluto, Administrative Support

Deborah Shiff, Administrative Support

Alex Thome, Project Manager, Academic Quality

Doug Knecht, Executive Director, Academic Quality

Contact: [email protected]

26

Page 27: Where we were: 2009-10 Quality Review

Appendix

• Activity: What constitutes good? (Weighted indicators)

• Activity: Using Evidence in the QR Report

• Activity: What’s new and what constitutes good?

• Discussion: Goals of American Education, NYC, and QR

27

Page 28: Where we were: 2009-10 Quality Review

28

Activity: What Constitutes Good?

• In pairs/trios you are assigned one of the weighted indicators.

• Respond to these questions:
  • What data should be examined before and during the QR regarding this statement/indicator?
  • What are some probing questions to help uncover information around this statement/indicator?
  • What are the key “Look Fors” and/or “Listen Fors” that would distinguish between Proficient and Developing? Between WD and Proficient?

• Share out

Page 29: Where we were: 2009-10 Quality Review

Using Evidence in the QR Report

Questions to Consider:
1. Is the bullet clear and related to the rubric?
2. What level of the rubric is the bullet describing?
3. Is this bullet supported with site-based evidence?
4. Does this bullet show cause and effect? Impact?
5. From a Quality Assurance Reading perspective, do the supporting evidence bullet(s) represent two of the sub-indicators (a, b, or c) of the indicator of focus?

29

Page 30: Where we were: 2009-10 Quality Review

Indicator 1.1 (Curriculum)

The school has created an exceptionally strong and coherent curriculum that connects across grades and subjects, supporting learning at high levels. 

> The school has implemented a set of key cognitive strategies that spans the three grades.  These strategies have been integrated into curriculum and into a “College Readiness” continuum that all teachers use to measure student growth in these key cognitive strategies over time.  Since this implementation, teachers and students have noted an increase in rigor of assignments across grades and subjects, and teachers have seen an increase in student achievement on major projects in English and math.

> Rubrics around scholarly writing are integrated into curriculum across three years, so that in addition to gaining specific content and strategies, students develop habits that support ongoing and independent learning.  Since the school has implemented these guidelines, student performance on writing-based exams has increased significantly.

30

Page 31: Where we were: 2009-10 Quality Review

Indicator 1.2 (Pedagogy)

Ensure higher levels of active student engagement across all classrooms so that teaching strategies provide support to all learners and continually build students’ cognitive capacity.

> Although many students produce work that reflects rigorous and precise curriculum and instructional goals, not all teachers plan lessons that thoroughly and actively engage all students in the classroom. In turn, some students participate passively by taking notes and reading required text, and a few teachers do not check in with these students for understanding. Consequently, we do not know what they have learned, how well they have learned it, or whether immediate intervention and clarification could benefit these students’ progress.

31

Page 32: Where we were: 2009-10 Quality Review

Indicator 1.3 (Structures & Resource-use)

The principal’s strategic leadership promotes organizational decisions that support school, teacher team, and classroom level goals well, consistently improving student outcomes.

> When the school relocated to its new site this past summer, at the faculty’s request the principal made it a priority to allocate space that offers each department opportunities to meet regularly, both formally and informally. In turn, departments discuss student progress and plans for future improvement, as well as provide regularly scheduled office time to tutor struggling students. Furthermore, the principal creatively manages the budget, resulting in lower class sizes and an effective student advisory program that focuses on each student’s requisites for success.

> The highly collaborative nature of all stakeholders led to redesigning the library space as a learning center where teachers regularly tutor small groups or individuals, and where students have open access throughout the day to work and do research, improving students’ coursework outcomes.

32

Page 33: Where we were: 2009-10 Quality Review

Indicator 2.2 (Assessment quality)

 Deepen expectations for sharing and analyzing current student work to celebrate learning and make public what is being studied. 

> Across classrooms, there is virtually no student work visibly posted, with notable exceptions in an English language arts class and a self-contained special education class.  This leaves most classroom environments bare, without important scaffolds for students to view what they are studying.

> Teachers do not sufficiently share and discuss student work products as a way to assess students’ learning and instructional consistency across classrooms.  Instead, grade and department team discussions center on students’ Regents and periodic assessment scores and performance.  Thus, teachers’ expectations for meaningful student work vary widely.

33

Page 34: Where we were: 2009-10 Quality Review

Indicator 4.1 (Staff Supervision & Support)

 The school effectively uses observations and other teacher data to improve teacher practice and student outcomes. 

> Each department leader works to support teachers in a variety of ways to move them to their next level.  Beyond mentoring they give to new teachers, they provide one-on-one support, arrange intervisitations, and look at student work as a group to help all teachers learn.  Because school leaders plan carefully around teacher learning opportunities, differentiation has increased and students’ learning needs are addressed at every level.

> The math department is exceptional in the way it organizes data to analyze teacher performance.  Because teachers give common assessments, the assistant principal can compare results across teachers and better document and target exemplary practices and teachers who need support.  Because teachers get concrete feedback connected to student performance results, they are better able to share best practices and hone their craft to support student progress.

34

Page 35: Where we were: 2009-10 Quality Review

How reviewers initially scored these indicators:

1.1 = WD

1.2 = P

1.3 = WD

2.2 = P

4.1 = P

Do these scores make sense to you? Should they have been revised?

35

Page 36: Where we were: 2009-10 Quality Review

36

Activity: What’s New and What Constitutes Good

• In pairs/trios, choose a Quality Statement or indicator.

• Identify what’s new in this statement/indicator.

• Respond to these questions:
  • What data should be examined before and during the QR regarding this statement/indicator?
  • What are some probing questions to help uncover information around this statement/indicator?
  • What are the key “Look Fors” and/or “Listen Fors” that would distinguish between each level (WD, P, D, UD) of the statement/indicator?

• Share out

Page 37: Where we were: 2009-10 Quality Review

Goals of American Education

Think Now: The following eight goals of the American education system have been distilled from over 225 years of discourse and public surveys. Rank the eight goals in order of importance (1 to 8).

___ Arts and literature
___ Basic academic skills (e.g., literacy)
___ Citizenship
___ Critical thinking
___ Emotional health
___ Physical health
___ Preparation for skilled work
___ Social skills and work ethic

37

Page 38: Where we were: 2009-10 Quality Review

Eight Goals of American Education
From Grading Education: Getting Accountability Right (Rothstein, 2008)

Goal: Relative Importance
1. Basic academic skills (e.g., literacy): 21%
2. Critical thinking: 16%
3-4. Citizenship: 13%
3-4. Social skills and work ethic: 13%
5-6. Arts and literature: 10%
5-6. Preparation for skilled work: 10%
7. Physical health: 9%
8. Emotional health: 8%

38

Page 39: Where we were: 2009-10 Quality Review

NYCDOE Accountability & the Eight Goals

TABLE TALK (7 min):

How well do our accountability system and tools support schools in meeting the eight goals?

1. Which goals, and how much?
2. Is it the right balance?
3. Which accountability tools offer the best leverage? Why?

39

Page 40: Where we were: 2009-10 Quality Review

Accountability & Our Citywide Work

• Since 2002 New York City has improved graduation rates and worked to close the achievement gap as measured by standardized testing. (To be addressed tomorrow.)

• We are now focusing on special education and ELL reform, and college and work readiness (CCSS work, future alterations to the Progress Report, etc.) to:• Increase the level of academic challenge for all students • Ensure school structures support all students and their teachers

• As a result, DPA, in partnership with DSSI and other central offices, is emphasizing more of the eight goals.

40

Page 41: Where we were: 2009-10 Quality Review

Role of the Quality Review

TABLE TALK (5 min):

What is the role of the Quality Review in achieving our agenda to improve New York City schools?