Education Data Doesn’t Have to Be a Needle in a Haystack

Blog post by Jessica Nadzam, SREB Research and Policy Analysis Associate

In the data-driven age, leaders constantly turn to data to help prevent or solve problems in education. As challenges such as teacher shortages and student learning loss persist, could a lack of data be hindering leaders from reaching successful outcomes?

Data-driven practices occur at every level of education — teachers collect and analyze data to inform their instructional practice. Schools and districts pore over multiple measures of student success — from attendance to test scores to demographics — to try to create the best environments for learning. At the state and federal levels, leaders evaluate every possible option to create exceptional educational opportunities for students.

Unfortunately, researchers — and thus leaders — don’t always have enough data, or the right data, to craft effective, sustainable solutions.

As researchers, we collect data constantly, hoping one more statistic will solve our problems. Unfortunately, we might as well be searching for the holy grail. The data we seek either doesn’t exist, isn’t publicly available or sits in such an obscure location that it will never make it into an analyst’s code.

It can be easy to believe that solving any problem is as simple as typing our question into a search engine: how to retain more teachers, how to address discipline, how to evaluate district leadership and so on. While an internet search will often surface anecdotal headlines, it cannot always direct researchers to concrete, objective solutions. At SREB, we look for answers to questions like these every day, and we do our best to create solutions based on facts and figures.

Education data can be difficult to acquire for many reasons. For one, our education system is siloed. Each state has the flexibility to do what is best for its own students; however, with this freedom comes variety in data collection and reporting.

For example, Louisiana has been collecting data on teacher attrition since 2008 through required exit surveys for every public school teacher leaving the profession. This data is published in yearly reports and helps the state better understand turnover. That isn’t the case in many other states, leaving a dearth of information about attrition.

Another issue is that even when states collect the same data, they may report it using different metrics, making comparisons difficult if not impossible. For example, all states report data on educator preparation programs and program completion under Title II.

However, states train and license their teachers differently, starting with their licensing examinations. With some states using the Praxis, some using the edTPA and some using assessments they created themselves, it is difficult to compare program quality and candidate performance across state lines.

Even when uniform data collection procedures are in place, not every institution has the capacity and skill to collect and analyze data properly. In districts already facing staffing shortages, teachers may not have time within their contract hours to collect and meaningfully analyze data beyond grades. Administrators may have other responsibilities that leave little room to plan for data collection or to reflect on what is gathered.

In some cases, staff may simply not know what constitutes good data collection and analysis. And while the state education agency may have the capacity for data analysis, that capacity is moot if districts are unable to submit quality data.

We don’t always have the data we need to answer our questions, and even when we do, analyzing it poses its own challenges. Because data analysis is complex, a professional analyst is often needed to fully evaluate and report on what has been collected. These analysts may work in-house within a district or department or be contracted externally. Either way, they don’t always have contextual knowledge of the K-12 education system from the classroom level up to the state and federal levels, which can create “blind spots” in how they interpret and report on the data they are tasked with analyzing.

For example, an analyst with experience working in an elementary school is likely to interpret data on student conduct differently from someone without professional experience in education. They are more likely to ask questions that could bolster a finding or to notice when important context is missing, whereas someone without education experience might simply run the data they have without knowing what questions to ask or what important information is unavailable.

In a way, it is like asking a neurologist to do a podiatrist’s job. They may share some of the same foundational training, but that doesn’t mean they have the skills and context necessary to do the job well.

SREB examined data availability in all 16 SREB-member states to produce our Teacher Compensation and Teacher Workforce dashboards, so we know from experience where the most inclusive, accessible and relevant teacher data is readily available. When we pulled data to create these dashboards, we looked for the numerous data points that the Every Student Succeeds Act requires states to include in annual report cards.

To better contextualize the current landscape of teacher workforce policy, SREB also collected additional factors such as teachers’ average years of experience and education levels. We also sought out teacher compensation data, such as average salaries at different years of experience, with and without advanced degrees. Some states share this additional data, but others do not report factors that aren’t required at all.
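For a sense of what this data wrangling looks like in practice, here is a minimal sketch in Python, using made-up district names, column labels and values rather than any state’s actual exports, of how an analyst might map two states’ differently labeled report card extracts onto one shared schema before comparing them.

```python
import pandas as pd

# Hypothetical, illustrative extracts from two states' report cards.
# Real exports differ in column names, units and coverage, which is
# exactly the comparability problem described above.
state_a = pd.DataFrame({
    "district": ["District A1", "District A2"],
    "avg_teacher_salary": [52000, 54500],       # dollars
    "avg_years_experience": [11.2, 9.8],
})
state_b = pd.DataFrame({
    "lea_name": ["District B1", "District B2"],
    "mean_salary_thousands": [51.0, 56.3],      # thousands of dollars
    "mean_yrs_exp": [12.1, 10.4],
})

# Map each state's columns onto one shared schema before combining.
tidy_a = state_a.rename(columns={
    "avg_teacher_salary": "avg_salary_usd",
    "avg_years_experience": "avg_experience_yrs",
}).assign(state="State A")

tidy_b = (
    state_b.rename(columns={
        "lea_name": "district",
        "mean_yrs_exp": "avg_experience_yrs",
    })
    .assign(
        state="State B",
        # Convert thousands of dollars to dollars so units match State A.
        avg_salary_usd=lambda df: df["mean_salary_thousands"] * 1000,
    )
    .drop(columns="mean_salary_thousands")
)

# One tidy table that supports an apples-to-apples comparison.
combined = pd.concat([tidy_a, tidy_b], ignore_index=True)
print(combined)
```

Even in this toy example, a unit mismatch — salaries reported in dollars versus thousands of dollars — has to be resolved before the two states’ figures can sit in the same column, which hints at how much harder the job becomes across 16 states with 16 reporting conventions.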

In our hunt for data, SREB analysts identified two states that produce detailed annual report cards and two that publish extra reports each year to highlight additional information. Kentucky’s Open House touts itself as a “one-stop-shop” for education data, and it lives up to the name with easy access to school report card information, research data and even parent portals where caregivers can see their children’s grades and attendance in real time.

Arkansas also makes its data easily accessible and reports a wide range of metrics. The ADE Data Center makes student and educator data relatively easy to locate and interpret through visualizations, and it offers clean data files that can be downloaded directly from the website.

Through partnerships, four more states have also committed to data transparency and availability. North Carolina, South Carolina, Kentucky and Tennessee have done so through additional reports on teacher retention, demographics, finances, student discipline and more. These states demonstrate a clear commitment to data availability, which has allowed SREB to generate resources to help solve problems that will ultimately impact students across the nation.

Without data, it is a struggle not only to design solutions but also to know whether they will be effective and sustainable. To solve problems effectively and equitably, let’s start with reliable, accessible, detailed data. Every state can review the examples highlighted above and improve the availability and transparency of its own data.