Creating School Data Simulations

“Using data to drive instruction” can be as difficult to pin down as determining who on your teaching staff is an “innovative educator.” Educational leaders understand the basic definition of each term, but when they try to clarify what it entails in everyday classroom practice, the definitions become slippery and harder to articulate. Every teacher believes they use data to drive instruction, but the real question is what data are they using? Classroom data, school-wide data, district-wide data, or state and nation-wide data? And who should make the decisions about which data to use?

I believe this knowledge gap can be conquered by creating simulations and trainings on the use of data in education. To do this, teachers must explicitly articulate hidden assumptions that they are reluctant to voice. A classic assumption in public schooling is that students need to be present in order to learn. While competency-based learning models are challenging this assumption, most school funding is predicated on average daily attendance, so educational leaders treat improving attendance as an essential element of improving student achievement. This may not be an accurate assumption. The rise of blended learning calls into question the reliability of the Carnegie unit, as “seat time” in a traditional brick-and-mortar school becomes increasingly irrelevant for self-motivated digital learners.

Creating a School-wide Data Simulation

[Figure: school-wide GPA distribution]

Boudett, City, and Murnane (2005) suggest creating a graphical data overview and sharing it with staff. This launches an inquiry process as educators endeavor, individually and collectively, to interpret graphs, tables, and statistics. Examining the graph of school-wide GPA data above reveals that grades at this school are approximately normally distributed, which suggests that the instructional program is relatively sound. A large cluster of 4.0 students or of failing students, by contrast, might suggest that grades are not standards-based.
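A graphical overview of this kind is easy to prototype before real data arrives. The sketch below is a minimal, illustrative simulation: it draws a hypothetical 259-student GPA sample from a normal distribution, buckets it into half-point bins, and prints a text histogram a data team could react to. The sample size, mean, and spread are assumptions, not figures from the article.

```python
import random
from collections import Counter

def gpa_distribution(gpas, bin_width=0.5):
    """Bucket GPAs into half-point bins for a quick distribution overview."""
    bins = Counter()
    for g in gpas:
        low = int(g / bin_width) * bin_width  # lower edge of this GPA's bin
        bins[round(low, 1)] += 1
    return dict(sorted(bins.items()))

# Hypothetical sample: 259 students, GPAs roughly normal around 2.5,
# clamped to the 0.0-4.0 scale
random.seed(1)
gpas = [min(4.0, max(0.0, random.gauss(2.5, 0.7))) for _ in range(259)]

for low, count in gpa_distribution(gpas).items():
    print(f"{low:.1f}-{low + 0.5:.1f}: {'#' * count}")
```

A roughly bell-shaped printout here mirrors the "relatively sound" pattern described above; a pile-up at either end would be the warning sign.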

[Figure: number of students by GPA]

A deeper analysis in the above figure shows exactly how many students need to improve and by how much. The school can use this information to develop a better understanding of which students need basic skills intervention and which students need additional motivation in completing school work.

[Figure: grade–absence correlation]

After analyzing the grades given by each teacher in a school, leadership can conduct a grade-to-attendance correlation study. This data, whether aggregated by class or by school-wide GPA, can offer powerful student achievement information and prompt staff to ask: how can a student who misses 20 days of school still have a 3.5 GPA? Or, even worse, how can a student attend school every day and still fail almost every class? Are the students who miss 80 percent of the school year on the path to dropping out, or are they homeless or caring for a terminally ill relative? Numbers have power, but we have to remember that our students are individuals. Sometimes this type of analysis lets us start a conversation that may be crucial in reaching a disaffected student.
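Those two questions can even be turned into a simple screening rule that hands staff a starting list for conversations. The sketch below uses hypothetical student records and illustrative cutoffs (none of the names or thresholds come from the article): it flags high-GPA students who are frequently absent and chronically present students who are failing.

```python
# Hypothetical records; in a 180-day year, 20 absences is about 0.89 attendance
students = [
    {"name": "Ava",  "attendance": 0.88, "gpa": 3.5},  # often absent, high GPA
    {"name": "Ben",  "attendance": 0.99, "gpa": 0.7},  # always present, failing
    {"name": "Cruz", "attendance": 0.96, "gpa": 3.2},
    {"name": "Dee",  "attendance": 0.75, "gpa": 1.9},
]

def flag_for_conversation(students, absent_cut=0.90, present_cut=0.95,
                          failing_gpa=1.0, high_gpa=3.0):
    """Flag students whose grades and attendance point in opposite directions.
    The cutoffs are illustrative assumptions, not research-based thresholds."""
    absent_but_thriving = [s["name"] for s in students
                           if s["attendance"] < absent_cut and s["gpa"] >= high_gpa]
    present_but_failing = [s["name"] for s in students
                           if s["attendance"] >= present_cut and s["gpa"] <= failing_gpa]
    return absent_but_thriving, present_but_failing

thriving, failing = flag_for_conversation(students)
print(thriving)  # → ['Ava']
print(failing)   # → ['Ben']
```

The point of the rule is not precision; it is to surface the individuals behind the outlier dots so someone actually talks to them.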

[Figure: attendance rate vs. GPA, school-wide (N = 259)]

A school should examine whether attendance predicts a student’s grade point average (GPA). The above figure shows the relationship between attendance rate and GPA for an N = 259 student sample. The correlation coefficient was r = .371, a weak relationship, suggesting that school attendance has only a small effect on these students’ academic achievement. Since several educational researchers (Bridgeland, DiJulio, & Morison, 2006; Fisher, Frey, & Lapp, 2011) have found that attendance correlates directly with student achievement, this instructional program may be inconsistent in measuring student achievement.
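For readers who want to reproduce a correlation study like this on their own rosters, a minimal Pearson's r computation is sketched below. The (attendance rate, GPA) pairs are made up for illustration; they are not the article's N = 259 sample.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical (attendance_rate, gpa) pairs; real data would come from the
# school's student information system
pairs = [(0.95, 3.2), (0.80, 2.1), (0.99, 3.8), (0.60, 2.9),
         (0.90, 1.5), (0.75, 2.4), (0.98, 3.5), (0.85, 2.0)]
r = pearson_r([a for a, _ in pairs], [g for _, g in pairs])
print(f"r = {r:.3f}")
```

A value near 0 means attendance tells the school little about grades; a value near 1 means the two move together, as the researchers cited above would predict.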

[Figure: attendance rate vs. grades for one classroom]

As teachers work to make sense of this data, it may be beneficial to zoom in on one classroom’s attendance/grade correlation. The graph above shows an individual classroom’s grades correlated with attendance rate. With only 44 dots on the graph instead of 259, the relationship is easier to spot. There are also fewer outliers, making those students easier to identify and support with intervention.

James-Ward et al. (2013) suggest asking participants in data analysis broad questions such as: What do we know from the data from our last school year? How does this information compare to prior years? Next, participants can generate more specific questions to discuss in breakout sessions, for example: Why did the 10th graders have the lowest ELA scores? What changed that increased our science scores so dramatically? How can we increase our attendance rate? Discussion of these questions can be used to create more specific goals and objectives for individual subjects and departments. A data team’s goal is to find changes in an instructional program, consider what caused them, develop an action plan to improve instruction, implement it, and monitor the results.

Simulations can provide powerful epiphanies about the need to build a school-wide culture of using data to drive instruction. If only one or two teachers on a campus show strong correlations between attendance and grades, this may suggest that the school does not have a meaningful picture of its actual student achievement. If most of the school’s teachers show a strong correlation between attendance and grades, then their instructional program may be more accurate in predicting actual student achievement.

References

Boudett, K., & Steele, J. (2007). Data wise in action: Stories of schools using data to improve teaching and learning. Cambridge, MA: Harvard Education Press.

James-Ward, C., Fisher, D., Frey, N., & Lapp, D. (2013). Using data to focus instructional improvement. Alexandria, VA: ASCD.

Spotlight Education Makes DDDM Easy

Many scholars believe that teaching and learning have not improved nationally because teachers as a group have not learned how to use data effectively to improve student learning. This is chiefly a leadership problem, because many school principals lack the necessary skills to make decisions based on data. Additional research has revealed that teachers also lack the clarity and tools to understand how their students, schools, districts, and states are performing. Thus, extensive training needs to take place so that educators and educational leaders can develop the skills to work with data and grow comfortable using data tools that increase student achievement.

Educational leaders need to make data more accessible and comprehensible for a workforce that has most likely avoided challenging coursework in statistics and mathematics. Getting these educators to embrace data-driven strategies may be a formidable challenge. For decades, businesses and government agencies have used cultural simulations to prepare executives for foreign travel, implement diversity programs, and ensure diplomacy at home and abroad. Few educational leaders have used these cultural simulations. Most will be surprised by the visceral epiphany they deliver. If similar simulations can be developed to ensure that all teachers become data-driven educational leaders, the U.S. education system could be poised to make a great leap forward.


Gary Shirts, creator of the culture shock simulation BaFa’ BaFa’, outlines ten steps to successful simulations: 1) don’t confuse replication with simulation; 2) choose the right subject to simulate; 3) develop a design plan; 4) design the simulation, so trainees take responsibility for their actions; 5) use symbols to deal with emotionally charged ideas; 6) don’t play games with trainees; 7) use non-trainees to add realism; 8) develop an appropriate performance assessment model; 9) alpha test your simulation in low-risk circumstances; and 10) set your own standards for success.

Most teachers are not trained as statisticians, and requiring them to work within the realm of data is analogous to requiring them to learn a foreign language. Simulations can ease this anxiety by providing safe places for teachers to deepen their understanding of student achievement data and of how to systematically collect and analyze that data for continuous instructional improvement. Solving someone else’s problem is often less worrisome than raising your own problem areas and having others comment on them. As teachers gain data literacy skills and grow more confident in their analysis, the tasks can become more sophisticated, and educational leaders will see teachers displaying class achievement data and clearly showing students where they have improved.

Spotlight Education provides a promising software service that simplifies educational data. Instead of extensive lists of statistics with difficult-to-read tables and charts, the service creates easy-to-read narratives in written or video form. These reports offer insightful analyses of education data, customized for each stakeholder: teachers, students, parents, principals, or superintendents.

References

Boudett, K., & Steele, J. (2007). Data wise in action: Stories of schools using data to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Boudett, K., City, A., & Murnane, R. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Corrigan, M., Grove, D., & Vincent, P. (2011). Multi-dimensional education: A common sense approach to data-driven thinking. Thousand Oaks, CA: Corwin Press.

Mandinach, E., & Jackson, S. (2012). Transforming teaching and learning through data-driven decision making. Thousand Oaks, CA: Corwin Press.

Mandinach, E., & Honey, M. (2008). Data-driven school improvement: Linking data and learning. New York: Teachers College Press.

Shirts, G. R. (1992). Ten secrets of successful simulations. Training, 29(10), 79-83.