
About the National Evaluation and Technical Assistance Center for the Education of Children and Youth Who Are Neglected, Delinquent, or At-Risk

This document was developed by the National Evaluation and Technical Assistance Center for the Education of Children and Youth Who Are Neglected, Delinquent, or At-Risk (NDTAC), which is funded by a contract awarded by the U.S. Department of Education to the American Institutes for Research (AIR) in Washington, D.C. The mission of NDTAC is to improve educational programming for youth who are neglected, delinquent, or at risk of academic failure. NDTAC’s mandates are to provide information, resources, and direct technical assistance to States and those who support or provide education to youth who are neglected or delinquent; to develop a model and tools to assist States and providers with reporting data and evaluating their services; and to serve as a facilitator to increase information sharing and peer-to-peer learning at the State and local levels. For additional information on NDTAC, visit the Center’s Web site at http://www.neglected-delinquent.org

Suggested Citation:

Amos, Lauren. (2015). Data Dashboards To Support Title I, Part D Program Administration: A Step-by-Step Guide. Washington, DC: National Evaluation and Technical Assistance Center for Children and Youth Who Are Neglected, Delinquent, or At Risk (NDTAC).

The content of this document does not necessarily reflect the views or policies of the U.S. Department of Education. This document was produced by NDTAC at the American Institutes for Research with funding from the Office of Elementary and Secondary Education, U.S. Department of Education, under contract no. ED-ESE-10-O-0103. Permission is granted to reproduce this document.


Contents

Data Dashboards To Support Title I, Part D Program Administration: A Step-by-Step Guide

Introduction
    What Is a Data Dashboard?

How To Design a Dashboard
    Step 1: Define the purpose of your dashboard and program priorities
    Step 2: Explore existing data
    Step 3: Identify potential data sources
    Step 4: Select performance indicators
    Step 5: Group indicators conceptually
    Step 6: Set performance targets and threshold criteria
    Step 7: Design the dashboard interface
    Step 8: Develop your dashboard or purchase a dashboard solution

How To Use a Dashboard To Support Your Role as a State Coordinator
    Guiding Questions: CSPR Data Quality Review
    Guiding Questions: GPRAMA Measures
    Guiding Questions: Decisionmaking


Data Dashboards To Support Title I, Part D Program Administration: A Step-by-Step Guide

Introduction

Among a number of strategies implemented to improve academic outcomes for incarcerated youth, the U.S. Department of Education (ED) and the U.S. Department of Justice (DOJ) promote the use of data, such as ED’s Title I, Part D (Part D) data and the Civil Rights Data Collection,[1] to identify, document, and address disparities in educational services and academic performance. Furthermore, in 2014, ED and DOJ developed and released a guidance package[2] that included a document highlighting five principles for providing high-quality educational services in long-term secure care facilities.[3] The package reflected the Federal focus on correctional education reform to address considerable inequities in educational access and academic outcomes, and on the use of performance-based management and monitoring to achieve these goals. Seventeen core activities are associated with the five guiding principles. Three of these activities explicitly recommend the collection and use of student and teacher performance data, and the remaining activities necessitate the collection of administrative and program performance data to monitor the extent to which secure care settings are meeting the guiding principles.

With a wealth of State-, Agency-, and facility-level data and resources at its disposal,[4] the Part D program in particular has a significant amount of education-related data to draw upon, and States and facilities are well positioned to use these data in alignment with this Federal guidance. However, using these data to improve service delivery and outcomes for children and youth who are system involved requires proactive and actionable decision-support tools. A data dashboard can be such a tool.

This guide is an introduction to data dashboards. It will describe what a data dashboard is, outline the process of designing a dashboard, and give you the opportunity to practice using a sample dashboard to support Part D program administration.

[1] http://ocrdata.ed.gov/
[2] http://www2.ed.gov/policy/gen/guid/correctional-education/index.html
[3] http://www2.ed.gov/policy/gen/guid/correctional-education/guiding-principles.pdf
[4] That is, Title I, Part D program data collected for the Consolidated State Performance Report (CSPR) through the EDFacts Initiative and accessible through ED Data Express, as well as the National Evaluation and Technical Assistance Center’s (NDTAC’s) Fast Facts Web pages.


What Is a Data Dashboard?

A data dashboard, like the dashboard in your car, is a navigation system of sorts: it graphically represents current and/or long-term program performance—highlighting key areas of strength and weakness—and can forewarn you, at a glance, of programs that are not on track to meet performance goals.

Like a report card, which is also commonly used to represent program performance graphically and at a glance, a dashboard makes it easy to identify programs that are performing as expected, those exceeding expectations or performing above the norm, and those performing below expectations. A report card, however, is a concise presentation or snapshot of data and other information about a school or program that assesses or evaluates its performance by focusing on outcomes and drawing comparisons (e.g., across time, across sites, against benchmarks). Whereas report cards support accountability, data dashboards support decisionmaking.


Dashboard Resources: Books

To learn more about the power and design of dashboards, refer to some of the following publications on the topic:

Information Dashboard Design: Displaying Data for At-a-Glance Monitoring, by Stephen Few

The Accidental Analyst: Show Your Data Who’s Boss, by Eileen McDaniel and Stephen McDaniel


How To Design a Dashboard

An effective dashboard adheres to principles of good design. Among other characteristics, well-designed technological tools are intuitive, are aesthetically pleasing, and make the most important information prominent to support usability. Designing a dashboard that meets these criteria involves at least eight steps:

1. Define the purpose of your dashboard and program priorities.

2. Explore existing data.

3. Identify potential data sources.

4. Select performance indicators.

5. Group indicators conceptually.

6. Set performance targets and threshold criteria.

7. Design the dashboard interface.

8. Develop your dashboard or purchase a dashboard solution.

Step 1: Define the purpose of your dashboard and program priorities

As a first step, it is important to articulate why you want a dashboard (e.g., academic performance tracking, fiscal risk management, grant compliance monitoring) and clarify what information your data dashboard will convey. A dashboard can summarize what you wish to accomplish programmatically—consistent with the purpose of your dashboard—by representing the key areas in which you would like to see improvement, such as—

Facility-, staff-, or student-level outcomes (e.g., student performance on posttests, staff attrition)

Administrative challenges (e.g., poor annual count data quality)

Implementation challenges (e.g., poor transition planning)

Being thoughtful and clear about the goals of your dashboard from the outset of the design project will increase the likelihood that you will implement a dashboard that can support your decisionmaking effectively.


Step 2: Explore existing data

Once you have identified your program priorities, the next step is to identify and evaluate the data sources to which you currently have access to measure the changes you would like to see. It is important to consider the quality of these data sources. Good dashboards begin with good data. Good data are—

Accessible. Data used in a dashboard should be easy and inexpensive to collect from sources that you have the rights or permissions to access. In addition, you should be able to disaggregate the data to the degree you need to inform decisionmaking. For instance, if you need to determine whether a literacy program is satisfactorily serving the academic needs of students with limited English proficiency (LEP), then you need to collect student performance data that allow you to reliably distinguish the scores of students with LEP from native English speakers.

Clean. Data should be reviewed for accuracy before use in a dashboard to address erroneous, missing, or duplicate information. Data should not be used if it is unclear how, by whom, or from where they were collected. They should also be de-identified if they include personally identifiable information and you are required by Agency policy or law to protect the privacy and confidentiality of students and/or staff. (A brief scripted sketch of such checks appears after this list.)

Timely. Dashboard data should be opportune. They should be collected and accessible at the right time and with the appropriate frequency to ensure that the information reflected in the dashboard is current enough to inform decisionmaking.

Comprehensible. The specific data used and the ways in which those data are displayed (e.g., pie chart vs. bar chart) and labeled should be easy to understand. Using the data should not require specialized training, skill, or technical knowledge.

Actionable. Dashboard data should be informative enough to suggest ways in which you could respond to the data immediately.
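As noted under the Clean criterion above, a short script can screen a data file for duplicate records, missing values, and direct identifiers before it feeds a dashboard. The following is a minimal sketch only; the file name and column names (e.g., student_id, pretest_reading) are hypothetical placeholders that would need to match your own State or facility data system.

import pandas as pd

# Hypothetical export from a State or facility data system.
df = pd.read_csv("facility_records.csv")

# Flag possible duplicate records (the same student reported twice by the same facility).
duplicates = df[df.duplicated(subset=["student_id", "facility_id"], keep=False)]

# Flag records missing fields the dashboard will depend on.
required = ["facility_id", "enrollment_date", "pretest_reading", "posttest_reading"]
missing = df[df[required].isna().any(axis=1)]

# De-identify before the data reach the dashboard: drop direct identifiers.
deidentified = df.drop(columns=["student_name", "student_id"], errors="ignore")

print(f"{len(duplicates)} possible duplicates, {len(missing)} records with missing required fields")

Even if your dashboard ultimately lives in Excel, running a check like this before each data refresh helps keep erroneous or identifiable records out of the display.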

Step 3: Identify potential data sources

Once you have determined what data you have at hand, consider whether any additional data sources might help you answer the questions you have about your subgrantees’ performance that are not sufficiently answered by your existing data sources. You should evaluate any potential sources for their accessibility and quality. The following worksheet can be used to help you define your program priorities as well as identify, evaluate, and align your current and potential dashboard data sources.


Tool: Program Priorities Worksheet

Columns: Program Priorities | Existing Data Sources | Potential Data Sources

Example row:
    Program priority: Improve employment outcomes for youth who are N or D after exit from custody/care
    Existing data source: Statewide Education and Training Placement Information System
    Potential data source: Vocational certificate completion rates from the Department of Corrections Offender Tracking Database

Rows 1–4: left blank for your own priorities and their existing and potential data sources.


Directions

1. In the first column, identify and list up to three program priorities that you would like a data dashboard to address.

Things To Consider:

What do you want to accomplish programmatically?

In what areas do you want your subgrantees and facilities to improve their performance?

What key outcomes do you want to see improved for youth?

In what areas have your subgrantees experienced program administration challenges?

In what areas have your subgrantees experienced program implementation challenges?

What problems have been identified in Federal monitoring visits?

2. In the second column, identify existing data sources for each priority.

Things To Consider:

What data do you have at your disposal—including but also beyond what you collect for the annual count and Consolidated State Performance Report (CSPR)—that can help you monitor subgrantee performance in these priority areas?

Do your data tell you a story that is timely and actionable?

What are your data not telling you that you wish you knew?

3. In the third column, identify potential data sources for each priority.

Things To Consider:

Are the data you desire of high quality?

− Do you trust the data and believe the data reflect accurate information?

Are the data you desire accessible?

− Are they hand-collected data that have to be entered into a spreadsheet or database?

− Are they in an electronic file format you can use easily (e.g., Excel), or do they require special software and/or expertise to extract and transform them for use in your dashboard?

− Are the data currently in a structure you can use, or do you have to reorganize them before use?

− What human resource demands are associated with collecting and using these potential data sources?


Step 4: Select performance indicators

After you have identified data sources that align with your program priorities, the next step is to specify how you will measure changes in subgrantee performance with these data. Performance indicators are metrics that allow you to make inferences about the health, compliance, and/or effectiveness of a program or organization. They indicate or suggest the extent to which your subgrantees are performing above, below, or consistent with your expectations.

Because dashboards are intended to support decisionmaking at a glance, it is important to select the most informative and actionable indicators for your dashboard. Keep in mind a few considerations as you identify performance indicators:

1. Select a concise set of indicators. The number of indicators that you choose should not be exhaustive. Instead, collectively, the set of indicators you select should be succinct and provide a meaningful summary of the key administrative and implementation outcomes you want to represent graphically. A general rule of thumb is to use 3 to 10 indicators.

2. Select the most appropriate “grain size.” Select indicators that capture the level of detail you need most to adequately assess a program’s performance (e.g., student, teacher, classroom, school, facility, district, or State level).

3. Select the most appropriate type of quantitative indicator. There are primarily two types of quantitative, or numerical, indicators: single indicators and composite indicators. A single indicator represents one variable or measurement unit (e.g., numbers, means, medians, modes, percentages) such as the CSPR items “average length of stay” and “number of long-term students.” By comparison, a composite indicator merges two or more indicators into a single measure (e.g., ratios, indices, cumulative score), such as a student–teacher ratio. Composite indicators are useful for measuring longitudinal performance as well as complex or multidimensional concepts, such as teacher effectiveness, that cannot be reasonably or meaningfully measured by a single variable. They are also useful for making comparisons between or ranking programs on the basis of multiple contextual factors or performance criteria.

4. Select reliable and valid indicators. To support decisionmaking, a dashboard needs sound data—data that are reliable and valid. Reliability refers to the consistency of the data collected over time as well as across settings and populations. For an indicator to be reliable, reliable data collection instruments must be used. Regardless of who collected the data, how the data were collected, the point in time at which the data were collected, or where the data were collected, a reliable data collection instrument or process should yield comparable and consistent results. Validity refers to the accuracy with which an indicator measures the concept or outcome it is intended to measure.


Example 1: Identifying Inappropriate Uses of Funds

[Charts: Indicator A (Average Daily Enrollment), Indicator B (Part D Subgrant Allocation), and Indicator C (LEA Budget Allocation); not reproduced]

Suppose that on a recent Federal monitoring visit, a monitor found that a number of facilities under a single local education agency (LEA) were supplanting Part D funds. The State education agency (SEA) would like to add one indicator to its existing grant compliance dashboard that can identify subgrantees and/or facilities that may be misusing funds so that technical assistance could be provided. The State Part D coordinator explored several single indicator and composite indicator options to determine which indicator to include on the SEA’s risk assessment dashboard.

Here, the coordinator compared one of the facilities found to be supplanting funds (facility B) with three compliant facilities. Indicators A, B, and C are single indicators. Indicator A is the average number of students enrolled daily in each facility for 1 year. Indicator B is the amount of Federal Part D subgrant funds allocated to each facility by the LEA. Indicator C is the amount of local funds allocated to each facility by the LEA for the regular program of instruction.

[Charts: Indicator D (Part D per-Pupil Spending), Indicator E (LEA per-Pupil Spending), and Indicator F (Part D Spending as a Percentage of LEA Budget); not reproduced]

Indicators D, E, and F are composite indicators. Indicator D demonstrates that facility B is spending more Part D funds per student than the other facilities, but this may be because the students served by that facility are in greater academic need. Indicator E shows that facility B receives less funding per pupil from the school district than do the other facilities. It suggests that facility B may not have sufficient funding to deliver the regular program of instruction. Indicator F demonstrates that the amount of facility B’s Part D allocation relative to its LEA budget allocation is larger than that of the other facilities (Part D Subgrant Allocation/LEA Budget Allocation). The coordinator decides that Indicator F is the most informative and actionable among the six options because it can suggest when a facility may be compensating for insufficient local funds with Federal funds.
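To make the composite-indicator idea concrete, the sketch below computes Indicator F (Part D subgrant allocation divided by LEA budget allocation) for four facilities and flags any facility above a review threshold. The dollar figures and the 25 percent threshold are invented for illustration and are not taken from the example above.

# Hypothetical allocations, in dollars: (Part D subgrant allocation, LEA budget allocation).
facilities = {
    "A": (40_000, 400_000),
    "B": (90_000, 150_000),
    "C": (35_000, 380_000),
    "D": (45_000, 420_000),
}

FLAG_THRESHOLD = 0.25  # review any facility whose Part D share exceeds 25% of its LEA allocation

for name, (part_d, lea_budget) in facilities.items():
    indicator_f = part_d / lea_budget  # composite indicator: Part D allocation / LEA budget allocation
    status = "review for possible supplanting" if indicator_f > FLAG_THRESHOLD else "within expected range"
    print(f"Facility {name}: Indicator F = {indicator_f:.0%} ({status})")

With these mock figures, facility B stands out immediately at 60 percent, mirroring the pattern the coordinator identified in the charts.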


Example 2: Determining Reliability and Validity

A thermometer measures temperature. An outdoor thermometer would be considered an invalid or incorrect indicator of the temperature if every morning when you checked it during a week-long snowstorm, it consistently indicated that the outdoor air was hot (figure A).

A speedometer measures how fast you are driving. Imagine that every day you were driving to work on a stretch of road where a police officer stood with a laser speed gun, and your speedometer always indicated you were going below the speed limit. But on one of these days, the police officer pulled you over to give you a speeding ticket. Although, technically speaking, a speedometer is a valid measure of the speed at which a car is going, you would consider your speedometer unreliable and therefore in need of calibration (figure B).

If every evening you were to place frozen dinner rolls into your oven, set at 350 degrees for 20 minutes, but on some days the bread burned and on other days the bread baked perfectly, you would consider your oven thermometer both invalid and unreliable (figure C).

When selecting indicators, the goal is to choose indicators that you can rely on consistently to measure your target outcomes accurately (figure D).

When selecting indicators, ask yourself the following about each one:

From what kinds of setting will I need to collect data on this indicator (e.g., juvenile corrections, juvenile detention, day treatment facility, group home)?

From what populations will I need to collect data on this indicator (e.g., teachers, paraprofessionals, short-term students, long-term students, administrators)?

− Do I need disaggregated data (e.g., by gender, race/ethnicity, disability status)?

− Do I need data on special or specific populations (e.g., students with low literacy skills, students enrolled in career and technical education courses)?

− Are sufficient data available on my target populations and subgroups?

By whom will the data I need on this indicator be collected?

− Am I confident that this person has the knowledge and capacity to collect the data properly?

[Figures A–D not reproduced]

With what tools or instruments (e.g., survey, focus group questions, pre-posttest score data collection template) or through what mechanisms (e.g., Statewide online database) will the data on this indicator be collected?

− Am I confident that these tools are being used consistently across settings (e.g., were facility staff trained on how to report pre-posttest data in the online database? Do State Agency personnel review facility data for quality before submitting enrollment and performance data to the SEA)?

− Am I confident that these tools are appropriate for my target population (e.g., were Spanish-speaking LEP students administered a Spanish-language translation of the student survey?)?

Am I confident that this indicator accurately measures the outcome or outcomes it is intended to reflect?

− How do I know?

− Is the indicator commonly used in research or evaluation studies on my settings and/or target populations?

Before finalizing your indicators, consider consulting an education researcher or program evaluator, ideally one with some expertise in juvenile justice, to confirm that you’ve chosen reliable and valid measures.

5. Select leading indicators that align with lagging indicators to measure both short-term and long-term change.

Leading indicators are outputs and short-term outcomes. They are formative or process measures that—

Demonstrate signs of growth or change in a given direction, suggesting early wins and indicating areas of improvement

Provide an early read on progress toward long-term outcomes

Indicate the conditions prerequisite to the desired long-term outcomes (i.e., predict lagging indicators)

Comparatively, lagging indicators are long-term outcomes such as the number of youth who begin a technical trade while in aftercare (i.e., as a result of earning a career/technical education [CTE] certificate while in residence). They are summative measures that assess the activities that have occurred in terms of their—

Effectiveness, indicating whether long-term program goals have been accomplished (e.g., Did the literacy intervention successfully improve reading comprehension?)

Impact, indicating the degree to which desired long-term outcomes have been achieved—whether successful or unsuccessful (e.g., On average, by how many percentage points did reading comprehension scores improve? Did reading comprehension improve for both native English speakers and LEP students?)


Example 3: Selecting Leading and Lagging Indicators

Whether an indicator is a leading or lagging indicator depends on the context. How might you group the following measures into leading and lagging indicators if a program’s mission is to increase access to higher education?

Percentage of students taking the SAT/ACT
Average SAT/ACT score
College enrollment rate
Percentage of students enrolled in SAT/ACT prep
Percentage of students completing two or more college applications
Percentage of students applying for SAT/ACT exam fee waivers
Percentage of students passing college preparatory courses
Percentage of students participating in college campus visits
Percentage of students earning a high school diploma
Percentage of students accessing college counseling services

Here is one way to group these indicators:

Leading Indicators
    Percentage of students enrolled in SAT/ACT prep
    Percentage of students applying for SAT/ACT exam fee waivers
    Percentage of students accessing college counseling services
    Percentage of students passing college preparatory courses
    Percentage of students participating in college campus visits
    Percentage of students taking the SAT/ACT

Lagging Indicators
    Average SAT/ACT score
    Percentage of students earning a high school diploma
    Percentage of students completing two or more college applications
    College enrollment rate


Step 5: Group indicators conceptually

Having identified your indicators, the next step is to categorize them so that the information on your dashboard is easier to process cognitively. Information organized in a way that is logical to you and your colleagues is easier to interpret than information that is not. How you group your indicators conceptually should be consistent with—

The purpose of your dashboard (e.g., academic performance tracking, fiscal risk management, grant compliance monitoring)

Your program priorities (e.g., grouping all of your indicators that address common implementation challenges)

The relationship between your indicators (e.g., grouping your indicators in pairs of leading and associated lagging indicators)

Commonalities among indicators (e.g., grouping all of your indicators that address staffing)

Example 4: Grouping Indicators by Program Priority

How might you group the following indicators?

Graduation rate
GED enrollment rate
Number of CTE certificates awarded
Course completion rate
Average SAT/ACT score
Types of CTE courses offered
Number of youth served
Pretest score: Literacy
Pretest score: Math

There are a number of ways to group them. Here is one set of groupings:

Program Inputs
    Number of youth served
    Pretest score: Literacy
    Pretest score: Math

Priority 1: Improve College Readiness
    Graduation rate
    Course completion rate
    Average SAT/ACT score

Priority 2: Improve Career Readiness
    Types of CTE courses offered
    GED enrollment rate
    Number of CTE certificates awarded


Step 6: Set performance targets and threshold criteria

Once you have grouped your indicators conceptually, the next step is to set performance targets or benchmarks that specify in quantifiable terms within what time frame and/or to what degree, frequency, or magnitude you would like to see change occur (e.g., 100 percent posttesting for long-term students every 90 days within the next 3 years). It is also useful to establish threshold criteria that can indicate to you when a subgrantee or facility may be off track or at risk of not meeting your performance targets (e.g., no increase in posttesting percentages for 6 months). These thresholds can signal to you when a subgrantee or facility is in critical need of intervention in the form of training or technical assistance. As you set your performance targets and threshold criteria, ask yourself the following guiding questions:

In terms of your priorities, where do you want your subgrantees and facilities to be in 1 year? In 2 years? In 3 years?

What performance benchmarks might you establish to measure their progress along the way?

How will you know when to target a subgrantee or facility for technical assistance? At what point might you sound the alarm?
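One way to operationalize a threshold criterion such as “no increase in posttesting percentages for 6 months” is a small script that scans each subgrantee’s recent monthly rates and flags flat or declining trends. The sketch below is illustrative only; the subgrantee names, monthly figures, and six-month window are assumptions rather than recommended values.

# Hypothetical monthly posttesting rates (percent of long-term students posttested), most recent month last.
posttesting_rates = {
    "Subgrantee 1": [62, 64, 66, 70, 71, 74],
    "Subgrantee 2": [55, 55, 54, 54, 53, 55],
    "Subgrantee 3": [48, 50, 57, 60, 66, 69],
}

WINDOW = 6  # months with no net increase before technical assistance is triggered

for name, rates in posttesting_rates.items():
    recent = rates[-WINDOW:]
    if recent[-1] <= recent[0]:  # no net gain across the window
        print(f"{name}: no increase in posttesting for {WINDOW} months; target for technical assistance")
    else:
        print(f"{name}: on track ({recent[0]}% to {recent[-1]}%)")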

Step 7: Design the dashboard interface

Thoughtfully planning the design of your dashboard interface or screen ensures the usability of your dashboard as a decisionmaking tool. As you determine the layout of the data and labels on your dashboard, it is important to consider the following design principles:

Prioritize visibility. Ideally, all dashboard data should be visible on a single screen without scrolling. If you want to develop a dashboard that serves multiple purposes, create a separate dashboard interface for each purpose.

Keep it simple. A dashboard should help you process information at a glance.

− Only display high-level information that the user can understand.

− Exclude extraneous or irrelevant information.

− Do not use meaningless color coding, variety, or decorative elements.

An intuitive design is a usable design. The physical placement of your indicators on the interface should align with the conceptual grouping of your indicators.

Data without a context are trivia. Ensure that the order and structure of the data and labels on your dashboard tell a visual story without the need for a detailed, written narrative or complex analysis.

Highlight important data. Emphasize important data by its position on the dashboard and with visual attributes like color intensity, size, and line width.

Choose the right display. The nature of your data and your analytic needs ultimately determine which types of data displays are most appropriate for your dashboard. Some data displays are more comprehensible for lay audiences than others, and some facilitate interpretation or convey certain types of information better than others. Play with your data to determine whether you can best tell a visual story with your data using, for instance—

− Tabular (spreadsheet) displays, graphical displays, or a combination of the two

− Bar charts, pie charts, gauges, thermometers, maps, or time series graphs

Each of these formats has its advantages and drawbacks. Consider the following example.

Example 5: Choosing the Right Data Display

[Display A: Full-Time Equivalency (FTE) by Instructional Area; chart not reproduced]

Display A uses a bar chart, which is often promoted as a simple way to display frequencies and percentages. But not all bar charts are created equal. This chart highlights programmatic differences between facilities: Facility A devotes most of its staff to social studies, foreign language, and vocational education, and provides no staffing for science, music, art, and physical education. Facility B devotes most of its staff to science, math, physical education, and special education instruction. Facility C offers the most balanced curriculum, with consistent staffing in each of the core courses and foreign language. Facility D devotes most of its staff to English, music, and art instruction. None of the facilities devotes staff to LEP/English language learners, reading/literacy, or a library. However, this display requires the dashboard user to scrutinize the graph to gain this information because the display is cluttered with excessive detail.

[Display B: Ratio of Special Education Staff to Youth With Individualized Education Programs; Students Demonstrating Improvement in Math; charts not reproduced]

Comparatively, the graphs in display B easily communicate actionable information on a small scale. Namely, they indicate that facilities C and D are underperforming relative to facilities A and B, and therefore may benefit from targeted technical assistance. The graph on the left suggests that the academic needs of special education students in residence at facilities C and D are not being met because of high student-to-teacher ratios. This conclusion or visual story is reinforced in the graph on the right, which indicates that students in those facilities are struggling in mathematics—averaging well below the State average and even further below their peers in facilities A and B.

Step 8: Develop your dashboard or purchase a dashboard solution

Having drafted your interface, determine whether you will build your own dashboard, purchase an off-the-shelf product, or hire a vendor to develop a custom tool for you. Even if you elect to outsource the development of your dashboard, implementing the prior seven steps can help you select a dashboard solution that best aligns with your needs. Ask yourself the following key questions before making a final decision:

Are there any technical constraints or demands that might affect your ability to develop and/or implement your dashboard?

− Do you need the tool to be accessible online?

− Do you need it to be accessible by a mobile device (e.g., when conducting site visits)?

− Do you have reliable technical support (e.g., from your Agency’s IT department, an onsite computer programmer)?

Are there any resource constraints or demands that might affect your ability to develop and/or implement your dashboard?

− Do you have the time to build the dashboard?

− Will you be the sole user of the dashboard, or will training and/or user documentation be needed to ensure that colleagues can use the dashboard as well?

− Do you have sufficient funds to purchase the software and maintain the software license if it must be renewed annually?


If you do not have sufficient funds to purchase a dashboard solution, you can develop your own. Microsoft Excel is the software most commonly used by individuals with little to no technical training or experience to build simple, custom dashboards. A multitude of free resources are available online to help you build your own dashboard.
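For teams comfortable with scripting, free open-source tools offer another route to a simple dashboard screen. The sketch below uses Python’s matplotlib library to draft two of the small charts described in Example 5; all facility values are mock data, and this approach is offered as an alternative to, not a replacement for, the Excel resources listed next.

import matplotlib.pyplot as plt

# Mock facility data for illustration only.
facilities = ["A", "B", "C", "D"]
sped_students_per_teacher = [8, 9, 19, 22]   # youth with IEPs per special education teacher
math_improvement_pct = [71, 68, 44, 39]      # percent of students demonstrating improvement in math
state_average = 60                           # hypothetical State average, for context

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3.5))

ax1.bar(facilities, sped_students_per_teacher, color="steelblue")
ax1.set_title("Youth with IEPs per special education teacher")
ax1.set_xlabel("Facility")

ax2.bar(facilities, math_improvement_pct, color="seagreen")
ax2.axhline(state_average, color="gray", linestyle="--", label="State average")
ax2.set_title("Students demonstrating improvement in math (%)")
ax2.set_xlabel("Facility")
ax2.legend()

fig.suptitle("Sample Part D dashboard panel (mock data)")
fig.tight_layout()
fig.savefig("sample_dashboard.png", dpi=150)

Saving the figure to an image file makes it easy to drop the panel into a report or post it to a shared drive on a regular refresh schedule.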

Dashboard Resources: Microsoft Excel Tutorials and Templates

Learn more about designing and customizing Excel dashboards with online tutorials and templates.

Formulas and Functions (Howtogeek.com)

Excel Dashboard Toolbox (Chandoo.org)

How to Create an Excel Chart (ExcelCharts.org)

Interactive Dashboards using PowerPoint and Excel (office.microsoft.com)

Video: Create a Speedometer Chart in Excel

Video: Excel Dashboards For Beginners: Battery Chart In Excel

Video: Step-by-Step Instructions for Easy EXCEL Dashboards

Video: Excel Traffic Light Dashboard Tutorial

Video: How To Make a Good-Looking Excel Dashboard: Navigation Bar

Video: Business Dashboards in Excel for Beginners (Series)

Video: Making an Excel Dashboard: Copying Widgets

Dashboard Resources: Software

When an organization’s analytic needs outgrow or exceed Microsoft Excel’s capabilities, a host of simple but sophisticated software solutions are available.[5]

Tableau

SiSense

iDashboards

Dundas

Good Data

MicroStrategy

Domo

Antivia

[5] These vendors have not been endorsed by the U.S. Department of Education or NDTAC.


How To Use a Dashboard To Support Your Role as a State Coordinator

A data dashboard can support your role as a State Coordinator in a number of ways. It can help you improve data quality, identify struggling and high-performing subgrantees and facilities, determine subgrantee and facility technical assistance and staff development needs, evaluate programs, review applications, monitor subgrantees, and promote interagency data sharing and collaboration.

In this section of the guide, use the sample Part D dashboard that has been populated with mock State data to consider key questions you should ask yourself whenever you carry out three important tasks as a State Coordinator:

1. Conducting a CSPR data quality review

2. Reviewing your State GPRAMA (Government Performance and Results Act of 1993 [GPRA] Modernization Act of 2010) measures to determine how well your State program performs relative to the Nation

3. Driving decisionmaking with subgrantee performance data

The sample dashboard was developed with Microsoft Excel. The workbook contains six worksheets:

Data Quality (Subpart 1)

Data Quality (Subpart 2)

GPRAMA (Subpart 1)

GPRAMA (Subpart 2)

Decisionmaking (Subpart 1)

Decisionmaking (Subpart 2)

Guiding Questions: CSPR Data Quality Review

There are a number of common CSPR data quality concerns:

Academic and vocational outcomes are over the age-eligible ranges.

The sum of students by race, age, or gender does not equal the unduplicated count of students.

The number of long-term students testing below grade level is more than the number of long-term students for reading and math.


The number of students demonstrating results (sum of rows 3–7) does not equal the number of students with complete pre- and posttests (row 2) for reading and math.

Average length of stay is 0 or more than 365 days.

Use the “Data Quality (Subpart 1)” and “Data Quality (Subpart 2)” worksheets in the sample dashboard to check for these common errors. These worksheets use conditional formatting to auto-highlight potential data quality concerns. Items that are potentially problematic are color-coded in red for easy identification.
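If you prefer to pre-screen the CSPR file with a script before loading it into the dashboard, the sketch below runs two of the checks listed above on a hypothetical extract. The column names are placeholders chosen for illustration, not the actual CSPR field names.

import pandas as pd

# Hypothetical CSPR extract; column names below are placeholders, not actual CSPR field names.
cspr = pd.read_csv("cspr_subpart1_extract.csv")

checks = []

# Check: the sum of students by race should equal the unduplicated count of students.
race_columns = [c for c in cspr.columns if c.startswith("race_")]
race_mismatch = cspr[cspr[race_columns].sum(axis=1) != cspr["unduplicated_count"]]
checks.append(("Race counts do not sum to the unduplicated count", race_mismatch))

# Check: average length of stay should be greater than 0 and no more than 365 days.
bad_stay = cspr[(cspr["avg_length_of_stay"] <= 0) | (cspr["avg_length_of_stay"] > 365)]
checks.append(("Average length of stay is 0 or more than 365 days", bad_stay))

for label, rows in checks:
    if not rows.empty:
        print(f"{label}: {len(rows)} record(s) to review")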

Consider the following questions after you complete your review:

Is your current data collection tool sound?

Do your subgrantees know how to use your data collection tool?

− How do you know?

− What technical assistance do you provide to support their use of the tool?

− How effective is the technical assistance you provide?

− How might you strengthen the technical assistance you provide to address any data quality concerns you have identified?

Do your facilities have a data collection system, process, or tool that allows them to track student-level data accurately?

How do your current subgrantee monitoring activities support CSPR data quality?


Guiding Questions: GPRAMA Measures

The Government Performance and Results Act of 1993 (GPRA) requires all Federal agencies to manage their activities with attention to the consequences of those activities. Each Agency is to clearly state what it intends to accomplish, identify the resources required, and periodically report its progress to the Congress. In so doing, it is expected that the GPRA can contribute to improvements in accountability for the expenditures of public funds, improve Congressional decisionmaking through more objective information on the effectiveness of Federal programs, and promote a new government focus on results, service delivery, and customer satisfaction. The GPRA Modernization Act of 2010 (GPRAMA) adds enhanced performance planning, management, and reporting tools to the 1993 act to help inform Congressional and Executive Branch decisionmaking and better address significant challenges facing our Nation.

In the simplest terms, GPRAMA measures are a subset of indicators used to monitor performance and inform Federal decisionmaking for a given program. There are four education-specific indicators associated with the Part D program.[6] The GPRAMA data you submit, along with the other information you provide for the CSPR, can indicate how well your State is performing (e.g., as compared with the Nation, similar States, or previous years). ED often looks closely at GPRAMA indicators during Federal monitoring or risk assessments to assess grantee progress over time.

Use the “GPRAMA (Subpart 1)” and “GPRAMA (Subpart 2)” tabs in the sample dashboard to consider the following questions:

In what areas is your State underperforming relative to the Nation?

What contextual factors may explain areas of underperformance?

What questions do you have about your State’s performance in these areas?

Is this cause for concern?

If so, what are your next steps?

[6] The Part D GPRAMA indicators are as follows: earned high school course credits, earned a high school diploma or GED, improvement in reading on pre-posttests, and improvement in math on pre-posttests.


Guiding Questions: Decisionmaking

A data dashboard can help you prioritize to whom you provide technical assistance or direct discretionary funding, or even decide whether to discontinue funding after several years of poor performance. Use the “Decisionmaking (Subpart 1)” and “Decisionmaking (Subpart 2)” tabs in the sample dashboard to answer the following questions:

Are certain demographic groups overrepresented in a particular program, and if so, are those programs taking measures to serve the unique needs of those populations?

Are all students being served equally well, or are certain program types outperforming (or underperforming) their peers?

Which program type(s) appear to be effective? Why might they appear to outperform the other program types (e.g., higher student performance upon entry, more highly qualified teachers, smaller student-teacher ratios, lower teacher attrition)?

What questions do you have about your State’s performance in this area?

Do you have any causes for concern?

If so, what are your next steps?

Are your subgrantees meeting the needs of your State’s youth who are neglected or delinquent?

Are youth needs being consistently met across program types?

What does the dashboard suggest about how well Part D funds are being used in your State? Are the funds you have allocated to your subgrantees reflected in program outcomes?

What might you do differently, moving forward, to improve data quality and data-driven decisionmaking in your State?
