Supporting the independent education community

Data walls: the state of the evidence

 

Lois Harris, Claire Wyatt-Smith, Lenore Adie

Institute for Learning Sciences and Teacher Education, Australian Catholic University

 

 

What is the evidence that shows the effectiveness of data walls in schools? Under what conditions might data walls be most effective in informing teachers’ decisions and actions to improve learning? This article explores these questions, providing guidance to school leaders and teachers who may be considering implementing data walls or reviewing their own current practices.
Data walls are “public or semi-public physical artefacts that visually represent student achievement data in a range of graphic and linguistic formats (e.g., student grades in letter, numeric score or alphabet forms; standardised test scores, reading levels, school-based assessment results displayed in tables or graphs of a range of types)” (see article in NM#4). First developed in the 1990s in the United States as a way for teachers to engage with student achievement data and track progress over time, data walls are one of a growing number of practices designed to help teachers in the ‘visualisation of data’ (Hardy & Lewis, 2018). According to Sharratt and Fullan (2012), “Data walls create visuals of all students’ progress and provide a forum for rich conversation among teachers” (p.78). While originally data walls were designed as a tool for teachers, they are now being used with students (Jimerson, Cho, & Wayman, 2016; Harris, 2018; Spina, 2017).
 

Key variables to consider when interrogating the evidence around data walls

A recently completed systematic review of research on data walls identified 49 relevant academic sources (i.e., books, journal articles, book chapters, doctoral theses, published reports). Twenty-two of these met the strict criteria for inclusion in our review (for more methodological details, see article in NM#5). This corpus of research made it clear that data walls are implemented in highly diverse ways, depending on the intended users, the location of the data walls, and their purposes. Identified users included teachers, students, parents, and members of the public. 
The research shows that there is a range of practices associated with implementing data walls. Typically, these emphasise teacher talk and interaction around data as teachers meet to infer meaning from the data, with discussions exploring how to interpret and use it. These social practices can motivate teacher inquiry, encourage student review and goal setting, demonstrate school accountability, and identify pedagogical solutions to problems. At the same time, the research shows the potential of data walls to negatively impact students and teachers. The clear message is that the walls themselves do not determine how teachers make sense of the displayed data, nor do they prescribe how teachers, students and others interact around the data.
The use of data walls also raises questions of privacy: where the data wall is located (i.e., is it in a public space or a teacher-only space?) and how data are identified or de-identified (i.e., are student names, ID numbers, and/or photos attached to the data?). Among the studies we reviewed, some schools placed data walls in staff-only areas and kept student-identifying information on the back of tiles, out of public view (Singh & Glasswell, 2013), while in other implementations data walls were located in public classroom spaces. In these cases, students’ identified achievement data were made visible to peers, parents, and school staff. 
 

What we know about data walls

Our review of the literature has led us to conclude that, at present, there is only limited evidence to show the impact that data walls of different types have on student learning, or their capacity to help teachers engage in better decision-making about next-step teaching strategies and goal setting. When examining impact on student achievement, one notable study (Singh, Märtsin, & Glasswell, 2015) included achievement measures (i.e., TORCH and NAPLAN test results). While learning growth was found, as data walls were just one component of a complex intervention undertaken in this study, the authors could not attribute growth specifically to the use of data walls.
Most reviewed studies were small-scale and qualitative in nature, making it difficult to generalise findings beyond the context of the study. Teachers in these studies reported both benefits and limitations from data wall use. In multiple studies, teachers claimed that using data walls helped stimulate productive professional conversations among teachers, where problems were diagnosed and pedagogical knowledge was shared (Goss et al., 2015; Renshaw et al., 2013). Some reported that the prominence of data, when displayed on data walls, forced teachers to act to improve results, as poor results could not be ignored (Singh & Glasswell, 2013). Multiple studies reported that colour coding was a helpful feature (Christoph, 2014; Goss et al., 2015; Renshaw et al., 2013), assisting teachers to group students for instructional purposes and to identify those in need of intervention. 
However, studies also identified unresolved challenges in relation to how data walls were implemented within their schools. Concerns were raised regarding the staff time needed to create and update data walls and attend meetings (Parkinson & Stooke, 2012; Spina, 2017). Further, the categorisation of students often led to a focus on supporting some groups of students over others (Carter, 2014; Kiro et al., 2016). Some teachers were troubled by the undue emphasis placed on results related to the subject domains and/or particular tests displayed on the wall, fearing this may encourage teachers and students to set goals around scores rather than standards (Hardy, 2014). One study highlighted the difficulty school leaders may experience when trying to keep data wall discussions focused on pedagogical solutions to achievement problems, rather than potential external causes of these (Earl, 2009).
Concerns were also raised in relation to how some forms of data wall implementation might compromise student privacy, as well as student and/or teacher psychological safety (Carter, 2014; Marsh et al., 2016). Multiple studies reported data wall implementations where identified student data were shared with the entire school community (Abrams et al., 2016; Carter, 2014) or class (Jimerson et al., 2016; Spina, 2017). These types of implementation clearly raise questions about who owns individual students’ data and the extent to which students and their families/carers should be allowed input into the sharing of their private results. Studies also identified challenges around mediating negative teacher and student reactions to data and associated social practices (Singh, Märtsin, & Glasswell, 2015; Singh, 2018), particularly those which invited comparison (Renshaw et al., 2013). These concerns led some teachers in Spina’s (2017) study to choose not to implement classroom data walls, despite a school-wide expectation to do so.
When data walls have been used with students, additional concerns are reported. For example, there are difficulties associated with teaching students to understand and use the data to establish and monitor their own learning goals (Marsh et al., 2016). Additionally, it was acknowledged that a frequent focus on student comparison and competition may inadvertently encourage students to adopt a performance orientation towards learning (Jimerson et al., 2016; Marsh et al., 2016; Thrupp & White, 2013). This is especially the case where priority is given to norm-referenced data, in which direct inter-student comparison is the basis for judgement. 
 

Implications

Currently, there is insufficient evidence at scale to show the benefits and limitations of using data walls. This observation holds for all phases of schooling. There are, however, some studies that indicate the desirable conditions under which data walls may have a positive impact. One of these is the need for a school climate which features strong collegial relationships and an atmosphere of trust (Singh, Märtsin, & Glasswell, 2015). It appears important that there is systemic as well as school administrative support (Potenziano, 2014); however, as Koyama’s (2013) study highlights, top-down implementations are unlikely to be effective. Additionally, it must be recognised that data walls are one tool among a range of possible tools. There is no consensus about the optimum ways to develop teachers’ data expertise, yet social practices around data walls are clearly an important mediating factor (i.e., the nature of staff and/or student interactions, discussions, rituals, and uses of these data). However, we are only beginning to understand these practices. 
Due to the current state of the evidence, data walls should be viewed as an emerging, but as yet largely unverified, strategy. This is particularly the case in relation to implementations where students are encouraged to view and use data walls. Schools should carefully consider how they implement data walls within their own site, taking into account student privacy, staff workload, and how data walls align with the school’s goals and existing data interpretation and use strategies. It is vital that the efficacy of data wall usage is also monitored. What benefits are being seen within the school as a result of this practice? How are staff and student privacy and psychological safety being considered? What social practices are developing around data wall use? Are the data wall constructions and associated social practices fit for purpose, and what is the impact on staff time? 
In addition to monitoring effectiveness within their own schools, we strongly encourage school leaders and teachers to become involved in formally researching data walls in their schools. Contact Dr Elizabeth Heck, Administrative Officer for the Schools Data Network Project (SDN) in the Institute for Learning Sciences and Teacher Education (Elizabeth.heck@acu.edu.au) if you are interested in joining our network.
 
References
Abrams, L., Varier, D., & Jackson, L. (2016). Unpacking instructional alignment: The influence of teachers’ use of assessment data on instruction. Perspectives in Education, 34(4), 15-28. doi:10.18820/2519593X/pie.v34i4.2
Carter, M. (2014). A multiple case study of NAPLAN numeracy testing of Year 9 students in three Queensland secondary schools. PhD thesis, Queensland University of Technology, Brisbane.
Christoph, R. M. (2014). Organizing data for instructional use in one elementary school: An action research study. Doctor of Education thesis, Washington State University, Pullman, Washington.
Earl, L. (2009). Leadership for evidence-informed conversations. In L. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 43-52). Dordrecht: Springer.
Goss, P., Hunter, J., Romanes, D., & Parsonage, H. (2015). Targeted teaching: How better use of data can improve student learning. Melbourne: Grattan Institute.
Hardy, I. (2014). A logic of appropriation: Enacting national testing (NAPLAN) in Australia. Journal of Education Policy, 29(1), 1-18. doi:10.1080/02680939.2013.782425
Hardy, I., & Lewis, S. (2018). Visibility, invisibility, and visualisation: The danger of school performance data. Pedagogy, Culture & Society, 26(2), 233-248. doi:10.1080/14681366.2017.1380073
Harris, L. M. (2018). Perceptions of teachers about using and analyzing data to inform instruction. Doctor of Education thesis, Walden University.
Hattie, J., Masters, D., & Birch, K. (2016). Visible learning into action: International case studies of impact. New York: Routledge.
Jimerson, J. B., Cho, V., & Wayman, J. C. (2016). Student-involved data use: Teacher practices and considerations for professional learning. Teaching and Teacher Education, 60(Supplement C), 413-424. doi:10.1016/j.tate.2016.07.008
Kiro, C., Hynds, A., Eaton, J., Irving, E., Wilson, A., Bendikson, L., . . . Rangi, M. (2016). The Starpath Project: Starpath phase 2 final summative evaluation report. Auckland: University of Auckland.
Koyama, J. (2013). Global scare tactics and the call for US schools to be held accountable. American Journal of Education, 120(1), 77-99. doi:10.1086/673122
Marsh, J. A., Farrell, C. C., & Bertrand, M. (2016). Trickle-down accountability: How middle school teachers engage students in data use. Educational Policy, 30(2), 243-280. doi:10.1177/0895904814531653
Parkinson, H. C., & Stooke, R. K. (2012). Other duties as assigned: The hidden work of reading and writing assessments in two primary classrooms. Language and Literacy, 14(1), n/a.
Potenziano, P. J. (2014). Opportunity to learn: The role of structures and routines in understanding and addressing educational inequities. Retrieved April 18, 2018 from https://search.proquest.com/docview/1526000657?accountid=8194
Renshaw, P., Baroutsis, A., van Kraayenoord, C., Goos, M., & Dole, S. (2013). Teachers using classroom data well: Identifying key features of effective practices. Brisbane: University of Queensland.
Sharratt, L., & Fullan, M. (2012). Putting FACES on the data: What great leaders do! Thousand Oaks, California: Corwin and the Ontario Principals' Council.
Singh, P. (2018). Performativity, affectivity and pedagogic identities. European Educational Research Journal, 17(4), 489-506. doi:10.1177/1474904117726181
Singh, P., & Glasswell, K. (2013). Chasing social change: Matters of concern and the mattering practice of educational research. International Journal of Innovation, Creativity and Change, 1(2), 1-15.
Singh, P., Märtsin, M., & Glasswell, K. (2015). Dilemmatic spaces: High-stakes testing and the possibilities of collaborative knowledge work to generate learning innovations. Teachers and Teaching, 21(4), 379-399.
Spina, N. (2017). The quantification of education and the reorganisation of teachers' work: An institutional ethnography. PhD thesis, Queensland University of Technology, Brisbane.
Thrupp, M., & White, M. (2013). Research, Analysis and Insight into National Standards (RAINS) Project Final Report: National Standards and the Damage Done. Hamilton: The New Zealand Educational Institute Te Riu Roa (NZEI).