
Updated 2/5/2021

Grantee Performance Management System (GPMS)

Work Schedule and Timeline

Management and Coordination

1. Team Members

OHS GPMS Development Oversight, Leadership and SMEs

OHS Leadership: Ann Linehan, Adia Brown, Colleen Rathgeb, Amanda Bryans (ECE, PMQI), Sharon Yandian

Performance Measurement SMEs: Traci Padgett, Tamara White, Rick Fiene, Cynthia Romero, Thom Flottemesch, Jesse Escobar

Contractor GPMS Oversight: Bert Sorongon (Lead), Sarah Sargen (Manager), Marisa Russo (Oversight), Jaycee Jones (Coordinator)

Programmatic SMEs: Catherine Robin, Melissa Bandy, Tabitha Temple

Corporate Oversight: Cynthia Northington, Helene Fisher, Ro Franchi, LaToia Frayer

Specialty SMEs: Leta Chadwick, Randy Rosso

Data/Technology SMEs: Bert Sorongon, Ashwin Manne, Aarti Nashte, John Hufford

OHS Stakeholders: Marco Beltran (HEA), Sangeeta Parikshak (MH), Kiersten Beigel (FCE), Jamie Sheehan (ECE), Sarah Merrill (ECE), Dayana Garcia (ECE), David Jones (PMQI; ERSEA), Lindsey Hutchison (PMQI), Jennifer Amaya-Thompson (DLL), Belinda Rinker (FIS), OPRE POC(?), Health Johnson (FIS), Tanesha Canzater (ERSEA), Catherine Hildum (ERSEA), Regional Program Managers

2. Status Update Cadence

➢ The team will provide weekly updates on progress, submitted Fridays.

➢ The team will communicate regularly with OHS on the progress of EAS and GPMS development and will continue to monitor the tenability of our measure assumptions through pilot testing, discussions with content leads and other subject matter experts, and data analysis.

➢ As assumptions shift, the team will examine the potential impact on projected milestones and discuss that impact with OHS.


Objective

The GPMS will calculate grantee performance levels to help OHS more effectively direct monitoring, RO, and T/TA resources to the grantees most in need of support, and to share effective practices identified among the highest-performing grantees.

Exhibit 1. GPMS Framework

As Exhibit 1 presents, the components of the proposed GPMS include the following, described briefly here and expanded on in the subsequent sections of the work plan:

➢ Risk Assessment: Applies a set of risk indicators to identify flags suggesting that a grantee is at risk of decreased performance or quality. A flag triggers a deeper analysis during the FA1 or FA2 review, or potentially the need for a Special Review or for RO or T/TA support. Risk is not included in the grantee's performance measurement score.

➢ Performance Assessment: Identifies a set of performance indicators that collectively are used to rank grantee performance. The system will include monitoring data to start and can incorporate other data (e.g., QRIS, licensing). Note that we are developing the Evidence Assessment System (EAS) to allow for more granular and standardized distinctions in performance as assessed through FA1 and FA2 monitoring reviews. This will strengthen the PM data used in the GPMS.

➢ Weighted Scoring System: Applies weights to performance indicators and uses a scoring algorithm to calculate grantee performance scores.

➢ Grantee Performance Level (GPMS Reporting): Provides standardized performance ratings or scores based on available performance data. The GPMS will produce a GPMS report, updated on an agreed-upon reporting schedule.

The GPMS will be delivered to the client via a Tableau/IT-AMS dashboard in an intuitive and easily navigable display. The table below presents the overarching timeline for development, validation, and deployment of the GPMS.


GPMS Timeline – High-level snapshot

FY 2021

by May 2021 Develop FA2 EAS

• Work with identified OHS stakeholders to refine and develop PMs, QIs and quality markers

• Work with EAS/GPMS scoring development team to monitor types of changes to PMs (how they're operationalized and measured)

• Monitor for impact of changes to PMs and QIs on GPMS-related activity

• Monitor for impact of pandemic-related data collection methodology on GPMS-related activity

Apr – May 2021 Pilot test FA2 EAS

Focus on collecting meaningful data; QI/QM question clarity

Late Apr - May 2021 Refine the grantee review report template for FY2022 monitoring (as needed)

May – Aug 2021 Program the grantee review report and FA2 EAS into IT-AMS

By July 2021 Develop GPMS scoring algorithm

July– Sept 2021 Training for FY 2022 Monitoring

Oct 2021 (FY2022) Implement FA2 EAS in monitoring

• Use EAS PM / QI framework, with quality markers providing anchors for ratings at the QI level

• QI-level ratings aggregate to PM-level rating

• Information does not auto-populate the grantee review report
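The QI-to-PM roll-up described above can be sketched in a few lines of Python. This is only an illustration: the 1–3 (low/mid/high) rating scale and the simple-average aggregation rule are assumptions, not the actual EAS aggregation method, which is still being developed and calibrated.

```python
# Illustrative sketch only: the 1-3 scale and averaging rule are assumed,
# not the actual EAS aggregation method.

def aggregate_pm_rating(qi_ratings):
    """Roll QI-level ratings (1=low, 2=mid, 3=high) up to one PM-level rating."""
    if not qi_ratings:
        raise ValueError("A PM needs at least one QI rating")
    # Average the QI ratings and round to the nearest whole rating level.
    return round(sum(qi_ratings) / len(qi_ratings))

# Example: a PM whose four quality indicators were rated by reviewers
pm_rating = aggregate_pm_rating([3, 2, 3, 2])
```

In practice the aggregation rule (average, minimum, weighted, etc.) is exactly the kind of decision the calibration work with OHS would settle.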


FY 2022

Oct – Dec 2021 Continue pilot testing (assumes FY2022 Monitoring launches in Oct 2021)

• Review the quality of data collected – e.g., alignment between the data collected and the questions asked; look for contradictions within or across QIs and/or PMs

• Calibrate EAS thresholds with OHS leadership

• Test EAS scoring

• Test methodology for collecting data

Nov 2021 – Jan 2022 Program GPMS scoring algorithm into IT-AMS

Feb 2022-June 2022 GPMS scoring system validation for internal use

• When EAS are stable, feed FA2 PM ratings data into the GPMS scoring algorithm

• Re-run GPMS data analyses to see whether the new EAS-driven data supports the same PM rankings

• Iterative testing process for GPMS using EAS data, calibrating with OHS and refining the scoring algorithm

Late Apr - May 2022 Develop/refine the report templates

• Consider any report refinements (e.g., auto-populating the report with EAS-driven customized content at the QI and PM levels)

• Design GPMS performance reporting template (for internal use with OHS; not for grantee distribution)

June 2022 Pause to reassess

• Assess stability of GPMS scoring algorithm

• Make a recommendation for moving forward with broader-scale implementation in FY 2023 vs. continuing with internal use (behind the scenes with OHS leadership and RPMs) for another year, based on the results of the EAS pilot and GPMS testing to date, as well as a better understanding of the pandemic's impact on data collection methodologies.

Jun – Sept 2022 Training for FY 2023 Monitoring

Oct 2022 Launch FY2023 Monitoring


Risk Assessment Component

Conceptualization

• Risk Indicator = a particular circumstance or grantee characteristic (e.g., turnover at the management level) that suggests there might be increased risk for quality concerns in the program.

• A risk flag triggers a deeper analysis during the FA1 or FA2 review to determine whether performance is impacted.

• FY 2021 risk indicator data will serve as pilot data to further explore the relationship between risk and performance.

• If desired, the risk assessment component of the GPMS also can evolve to be used for grantees not on the National Schedule, to gauge risk between monitoring events and direct T/TA and RO support to higher-risk programs.

• Risk does not equal performance - grantees may have risks but be high performers.

• Risk is not included in the grantee's performance measurement score.
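The flag-then-analyze logic above can be sketched as a simple rule check. The indicator names and thresholds below are invented for illustration (the actual indicators are listed on the next page, and their thresholds are still being refined); the point is only that flags direct review attention and never enter the performance score.

```python
# Hypothetical sketch of risk flagging. Indicator names and thresholds
# are invented for illustration; they are not the system's actual rules.

RISK_RULES = {
    "management_turnover_12mo": lambda v: v >= 2,  # e.g., 2+ leadership departures
    "under_enrollment_pct": lambda v: v >= 10,     # chronic under-enrollment
    "audit_findings": lambda v: v > 0,
}

def flag_risks(grantee_data):
    """Return the list of risk indicators flagged for one grantee."""
    return [name for name, rule in RISK_RULES.items()
            if name in grantee_data and rule(grantee_data[name])]

flags = flag_risks({"management_turnover_12mo": 3, "under_enrollment_pct": 4})
# Flags trigger deeper FA1/FA2 analysis; they do NOT feed the performance score.
needs_deeper_analysis = bool(flags)
```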

Timeline

Expected Due Date Risk Assessment Component Activities

Completed Identify risk indicators

Completed Determine availability of risk indicator data

Completed (Oct ‘20) FA1 Review Handbook - Incorporate analysis prompts into handbook to support analysis based on any risk indicator flags identified during pre-review analysis

Completed (Nov ’20) Team to finalize recommendations for collecting risk indicator data during FA1 and FA2 reviews (postponed until after launch)

Completed (Nov ’20) Discuss options with OHS – finalize plans for risk assessment implementation

Completed (Dec ’20) Program risk indicators into IT-AMS; UAT conducted Dec 9–10, 2020

Completed (Dec ’20) Train Review Leads on risk indicator data collection and verification for FA1 and FA2

Feb 2021 Begin to analyze risk indicator (pre-site) data (frequencies, distributions).

Mar – May 2021 Explore relationships between risk and performance; will continue exploring relationships using FY2022 data. Discuss and develop Risk Profile template; consider options for risk dashboard and management / RO risk profile reports

Jun – July 2021 Refine risk indicators and methodology as needed; ensure we’re collecting meaningful and impactful risk indicators

July – Aug 2021 Update risk indicators (pre-site guides in IT-AMS) and program Risk Profile Summary into IT-AMS; test risk indicator collection and risk profile development (in IT-AMS); train Review Leads for Oct 2021 monitoring launch

Ongoing Repeated refinement and evaluation over lifetime of contract

Risk Assessment Team
Lead: Marisa
Core Team: Sarah, Tabitha, Jaycee, Catherine
Additional Support: LaToia, Bert, Thom, Rick, Ashwin, Aarti


Risk Assessment Component

Potential Risk Indicators

Grantee prior performance challenges

✓ Challenges reflected in experiences with CDI or DRS ✓ Prior findings ✓ Complaints brought against the grantee

Having waivers

✓ The grantee is addressing gaps that led to the waiver

Fiscal complexity and challenges

✓ General budget information such as total budget size, funds for facilities (major purchases), leases

✓ Having multiple funding streams ✓ Having audit findings ✓ Cuts in funding

✓ Late in submitting applications, amendments, drawdowns, etc.

✓ Repeated carryover balances or misuse of funds

Grantee experience and service complexity

✓ Lacking experience with specific types of services or program options, or with grants altogether

✓ Language complexity, reflecting a need to ensure quality service delivery in multiple languages

✓ Site complexity: Having a large number of sites or a wide geographical spread of sites to manage

Staff experience, structure and turnover ✓ Program staff inexperience ✓ Insufficient staffing structure ✓ Staff turnover

Enrollment challenges

✓ Significant fluctuations in enrollment ✓ Chronic under-enrollment


Performance Assessment Component

Conceptualization

• GPMS Performance Indicators internal to monitoring include data from FA1, FA2, RAN, Special Reviews, and CLASS®. These include Performance Measure ratings (from monitoring reviews) and scope and severity measures, such as repeat or uncorrected findings and the total number and severity of findings.

• The system also will include new measures; measures of sustainability and outcomes are in development.

• Performance Indicators external to the monitoring system include data such as QRIS, NAEYC, and licensing data.

• We note that compliance does not equal performance - grantees may be compliant but not high performing.

Timeline

Expected Due Date Performance Assessment Component Activities

Completed Develop, apply and test PI evaluation criteria based on data analysis and SME input

Completed Identify and Rank Performance Indicators (includes PMs and others from monitoring data [i.e., scope and severity]) – See Appendix B for details

Completed Conduct environmental scan to identify additional Performance Indicators (includes indicators from other systems [e.g., NAEYC, QRIS])

Completed Analyze monitoring data – case studies, statistical analysis, findings data analysis, relinquished/terminated grants analysis, repeat findings analysis – to make recommendations for final set of GPMS Performance Indicators

Dec 2020 – May 2021 Finalize set of draft Performance Indicators – PMs, CLASS, and scope and severity measures; consider approaches for measuring outcomes/results building off their measurement through EAS, where appropriate.

Feb 2021 – Jan 2022 Continue to monitor EAS development and test GPMS performance indicators to gauge potential impact on GPMS performance indicators, PM selection, EAS scoring, and the GPMS scoring algorithm.

Note that the Performance Assessment activities end with their testing and refinement, as they then fold into the scoring system activity, which uses the performance indicators to calculate performance scores.

Performance Indicator Team

Lead: Bert
Core Team: Traci, Thom, Rick, Marisa
Additional Support: Jaycee, Melissa, Tabitha, LaToia, Ashwin, Aarti


Proposed Performance Indicators:

FA2 Performance Measures

Performance Measures from FA2 Monitoring1

Program’s design (options, services, hours, languages, etc.) remains responsive to community needs. (PDM PM1)

1. The grantee’s program structure and design is informed by the community’s strengths and needs.

Management and staffing structure is effective (PMQI PM1)

2. The grantee establishes a management structure that consists of staff, consultants, or contractors who ensure high-quality service delivery; have sufficient knowledge, training, experience, and competencies to fulfill the roles and responsibilities of their positions; and provide regular supervision and support to staff.

Program uses data to track outcomes and progress and inform improvements. (PMQI PM2)

3. The grantee uses data to identify program strengths, needs, and areas needing improvement; to evaluate progress toward achieving program goals and compliance with program performance standards; and to assess the effectiveness of professional development.

Governing body is engaged and effective. (PMQI PM3)

4. The grantee maintains a formal structure of program governance to oversee the quality of services for children and families, and to make decisions related to program design and implementation.

Budget supports effective systems and quality services. (FIS PM1)

5. The grantee develops and implements its budget to sustain management, staffing structures, and the delivery of services that support the needs of enrolled children and families. This entails relating financial data to accomplishments of the grant award and an awareness of program progress, lessons learned, and needed improvements.

Fiscal management system is effective. (FIS PM2)

6. The grantee plans and implements a fiscal management system that supports the organization’s ongoing capacity to execute its budget over time and meet the needs of its organization.

Program has effective internal controls in place. (FIS PM3)

7. The grantee’s financial management system provides for effective control over and accountability for all funds, property, and other assets.

Program’s teaching practices promote school readiness – program achieves outcomes. (ECD PM2)

8. Teaching practices intentionally promote progress toward school readiness and provide high-quality learning experiences for children.

Teachers know how to implement curriculum to achieve SR outcomes. (ECD PM3)

9. The grantee ensures teachers are prepared to implement the curriculum and support children’s progress toward school readiness.

Parents are engaged in children’s learning and development. (FCE PM3)

10. The grantee’s education and child development services recognize parents’ role as children’s lifelong educators and encourage parents to engage in their child’s education.

Program supports children’s health and wellbeing through timely health information (HEA PM1)

11. The grantee effectively monitors and maintains timely information on children’s health status and care, including ongoing source of health care, preventive care and follow-up.

1 This is a table that lists the top rated/ranked FA2 PMs


Proposed Performance Indicators:

FA2 Performance Measures (Being explored or developed)

Additional Measures Under Development for Consideration in GPMS

Results/Outcomes The grantee achieves outcomes for children and families, and results for the program.

Measure being explored We’re exploring how to conceptualize outcomes and results, including integrating outcomes into Performance Measures and EAS.

Sustainability / Responsivity The program is able to remain responsive to staff, children and family needs and sustain comprehensive and quality services in the face of shifting community needs, adversity, and unanticipated circumstances that impact priorities.

Measure being explored We’re integrating Sustainability into the EAS criteria. We will explore with the OHS stakeholder group how best to measure and report on Sustainability.

Effective use of child assessment data The program uses aggregated and individual child-level assessment data and other program data to determine progress toward meeting program goals, direct continuous program improvement, and individualize instruction to support each child’s progress.

DRS regulation Working with the DRS SR workgroup to ensure the SR requirements are addressed through EAS development and the FY2022 monitoring protocol. Note that, in the FY2021 FA2 protocol, aggregated data use (1304.11(b)(2)(i)) is monitored under PMQI, and the individualization component (1304.11(b)(2)(ii)) is monitored under ECD PM2.

Scope and severity measures

We are exploring potential measures to reflect scope and severity of findings. We are exploring issues such as:

• Volume of findings within a prescribed period of time

• Severity of findings

• Measure related to challenges with addressing identified weaknesses (repeat findings, uncorrected findings)

NEW: CLASS Quality Measure The team will explore options for including the CLASS quality thresholds and competitive thresholds into GPMS.
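The scope-and-severity ideas being explored above (volume of findings within a period, severity, repeat findings) can be sketched from a findings list. The field names, severity labels, and citation values below are assumptions for illustration only; the actual measures are still under exploration.

```python
# Sketch of possible scope-and-severity measures. Field names, severity
# labels, and citations are illustrative assumptions, not actual data.
from collections import Counter
from datetime import date

findings = [
    {"citation": "1302.15", "severity": "deficiency",    "found": date(2021, 3, 1)},
    {"citation": "1302.15", "severity": "noncompliance", "found": date(2020, 5, 1)},
    {"citation": "1302.90", "severity": "noncompliance", "found": date(2021, 2, 1)},
]

def scope_severity(findings, start, end):
    """Summarize finding volume and severity in a window, plus repeat citations."""
    in_window = [f for f in findings if start <= f["found"] <= end]
    repeats = [c for c, n in Counter(f["citation"] for f in findings).items() if n > 1]
    return {
        "volume": len(in_window),                                   # findings in period
        "by_severity": dict(Counter(f["severity"] for f in in_window)),
        "repeat_citations": repeats,                                # cited more than once
    }

summary = scope_severity(findings, date(2021, 1, 1), date(2021, 12, 31))
```

A repeat citation here is any standard cited more than once across the grantee's history, which is one plausible way to operationalize "repeat or uncorrected findings."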


Evidence Assessment System (EAS)

Conceptualization

• The EAS is designed to standardize and strengthen the data assessment process on FA1 and FA2 monitoring reviews.

• The EAS provides anchors and a roadmap for what to consider when evaluating performance against each Performance Measure (PM), and what constitutes high, mid-level, and low performance.

• The goal of the EAS is to provide finer distinctions in grantee performance, enabling OHS to better distinguish various levels of performance.

• The EAS impacts Performance Measure ratings, which are an integral part of the GPMS.

Timeline

Expected Due Date EAS Development

Completed (Oct 2020) Finalize EAS framework, approach, and path forward for developing EAS

Completed (Dec 2020) Revised FY21 FA2 Performance Measures and associated criteria (Quality Indicators), refined the framework based on experience of working through PMQI and Health content areas. Arrived at an enhanced framework to drive PM and QI development in Dec 2020.

Completed (Jan 2021) Revised EAS development process based on experience to date. Defined process, received approval from OHS to implement new process.

Jan – Mar 2021 Draft FA2 PMs, QIs and generic quality markers by content area; engage OHS key stakeholders in iterative process of drafting, reviewing and refining PMs and QIs. Discuss role of FA1 in GPMS and need for EAS development.

Mar – Apr 2021 Develop EAS rating methodology

Mar 2021 Design EAS pilot study

Mar – May 2021 Develop quality markers, identifying lo/mid/hi quality levels for each QI and developing the distinctions between ratings/levels

Apr – May 2021 Begin pilot testing FA2 EAS; pilot each content area as it's ready (rolling basis). This phase of the pilot testing will focus on analyzing the quality of data collected through the new FA2 EAS for comprehensiveness, practical usefulness, and the ability to capture what we intend to capture through the measures.

May – Aug 2021 Program FA2 EAS PMs, QIs and quality markers into IT-AMS in preparation for larger scale FY 2022 EAS testing

July – Sept 2021 Develop and conduct training on FA2 EAS

Oct 2021 Launch FA2 EAS with FY 2022 monitoring

Oct 2021 – Jan 2022 Pilot the FA2 EAS – calibrate thresholds with OHS; test and refine the rating system as needed

Feb 2022 Finalize FA2 EAS based on pilot test results. Continue to validate and test GPMS weighting and scoring system using results of finalized EAS.

Feb – May 2022 Continue testing and refining GPMS scoring system


Expected Due Date EAS Development

Jun – Aug 2022 Refine EAS as needed in IT-AMS

Aug – Sept 2022 Train Review Leads and Reviewers on implementation of FA2 EAS

Oct 2022 Launch FA2 EAS and GPMS scoring system as part of FY 2023 monitoring

EAS Development Team
Leads: Sarah, Marisa
Core Team: Melissa, Traci, Jaycee, Abigail, Rebecca, Zipi (Traci, Bert, Rick, Thom – scoring)
Additional Support: Tabitha, Eman, Daniel, Zipi, Rebecca, LaToia

Overview of OHS Stakeholder Engagement in EAS Development Process

Phase I – Develop PMs, QIs and generic quality markers

• Contractor EAS development team develops draft PMs, QIs, generic quality markers to stimulate discussions with OHS stakeholders. Shares framework with team, develops and shares workbooks to facilitate stakeholder review and discussions.

• Iterative process of engagement with OHS content leads

o Facilitate discussions to solicit SME input from OHS stakeholders

o Adjust PMs and QIs based on stakeholder input

o Engage with Adia (and Colleen and Adia's POP team if they desire) to review refinements and adjust to align with OHS priorities

o Share updated content with stakeholders

• Share updated drafts with OHS leadership team and RPMs when a subset of content areas is complete (ECD, FCE, HEA, PMQI)

• Review FIS and ERSEA in subsequent sessions, bringing PMQI (especially governance) EAS back into the discussion as well.

Phase II – Develop quality markers with defined rating scale anchors

• Repeat the above process for development of quality markers within each QI

• Phase II Review can overlap with Phase I – some content areas may move into developing quality markers while others are still developing QIs and refining PMs. Each content area moves at its own pace.


Weighted Scoring System Component

Conceptualization

• The GPMS will include a scoring algorithm that calculates performance scores based on the performance indicators included in the system.

• Weights will be developed and applied to GPMS data to prioritize performance indicators and account for shifting priorities over time.

• In FY 2021, scores will be calculated based on monitoring data; future scoring can incorporate external data sources (e.g., QRIS, licensing, audits) if appropriate.
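The weighted scoring idea above can be sketched as a weighted average over normalized indicator values. The indicator names, the weights, and the 0–100 scale are illustrative assumptions; the actual algorithm and weighting strategy are what the Jul 2021 milestone below is to define.

```python
# Minimal sketch of weighted GPMS scoring. Indicators, weights, and the
# 0-100 scale are illustrative assumptions, not the actual algorithm.

WEIGHTS = {"fa2_pm_avg": 0.5, "class_score": 0.3, "scope_severity": 0.2}

def gpms_score(indicators):
    """Weighted average of normalized (0-100) indicator values.

    Weights are re-normalized over the indicators actually present, so a
    grantee missing an external data source can still be scored.
    """
    present = [k for k in WEIGHTS if k in indicators]
    if not present:
        raise ValueError("No scorable indicators provided")
    total_w = sum(WEIGHTS[k] for k in present)
    return sum(indicators[k] * WEIGHTS[k] for k in present) / total_w

score = gpms_score({"fa2_pm_avg": 80, "class_score": 90, "scope_severity": 70})
```

Re-normalizing over available indicators is one simple way to handle the plan's note that external data sources may be incorporated later for some grantees but not others.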

Timeline

Expected Due Date Weighted Scoring System Component Activities

Sept 2020 – Jul 2021 Define GPMS scoring algorithm, including weighting strategy – Note that EAS pilot test data will be available April–May 2021 and will be incorporated into the scoring and weighting development process.

Jan – Mar 2021 Develop GPMS testing and validation (i.e., face validity) plan to ensure the system operates as intended and appropriately identifies high and low performers.

Nov 2021 – Jan 2022 Program GPMS scoring algorithm into IT-AMS

Feb – Jun 2022 Test and validate GPMS

June 2022 Review GPMS test data and make refinements to the GPMS as necessary to determine whether the GPMS is ready for launch in FY 2023

Oct 2022 Launch scoring system in GPMS in FY 2023 monitoring for public (see reporting activity)

Scoring System Team

Lead: Bert
Core Team: Thom, Rick, Traci
Additional Support: Marisa, LaToia, Ro, Ashwin, Aarti


Grantee Performance Level/Report Component

Conceptualization

• We recommend that the Grantee Review Report and the GPMS Report be two separate reports.

• The Grantee Review Report describes review-level performance:

o FY 2021 is status quo with FY 2020; it does not include PM-level scores

o The FY 2022 grantee report does not include PM-level scores, but we share scores with OHS during the pilot period (available internally)

o We will discuss with OHS including PM-level scores as soon as EAS are stable

• The GPMS Report provides performance across multiple events/metrics:

o Contains standardized ratings that support comparison across grantees and across grant lifecycles

o We recommend providing it to each grantee annually (timeframe under discussion)

o The GPMS Report becomes available to OHS in spring FY 2022 for internal use during the GPMS validation phase

o The GPMS Report becomes available to grantees in FY 2023

Timeline

Expected Due Date

Grantee Performance Level/Report Component Activities

GRANTEE REVIEW REPORT

Apr –May 2021 Discuss options and finalize FY 2022 Grantee review report (ideally incorporates EAS criteria as customized content for grantees)

May-Jul 2021 Tech team programs FY 2022 Grantee Review Report template in IT-AMS

May – Jul 2021 Market new Grantee Review Report to grantees/NHSA/other stakeholders

July – Sept 2021 Train review team members and contract staff on new grantee review report’s development and production

Aug 2021 Test FY 2022 Grantee Review Report production in IT-AMS

GPMS REPORTING

Oct – Dec 2021 Conceptualize GPMS reporting options (dashboard, grantee reports, OHS management reports); discuss report cadence.

Jan – June 2022 Operationalize GPMS reporting for internal review and testing (draft reports using pilot data), finalize reporting elements and report cadence

Feb 2022 Launch GPMS in IT-AMS for internal use

Oct 2022 Launch GPMS Reports in FY 2023 for internal and external use

Reporting Team
Lead: TBD
Core Team: Bert, Tabitha, Catherine, Marisa
Additional Support: Traci, LaToia, Ro, Thom, Rick, Ashwin, Aarti