A deeply held and widely shared belief in education is the “summer slide.” For nine months, teachers and students work tirelessly to build achievement, only to have it unravel over the summer. Upon returning to school, teacher conversations are laced with laments over the learning lost. This blog post is not an argument about whether the summer slide is real; rather, I am pondering why the idea of a summer slide makes me so uncomfortable.
The first question I ask is, “Who was sliding, and by how much?” The answer is easily accessible: compare achievement test results from the end of the school year with the same students’ results the following fall. Since the schools I worked for administered the Northwest Evaluation Association (NWEA) Measures of Academic Progress (MAP), as many schools do, I had reams of data to scan for summer slide patterns. At various times over the last twelve years, I compared spring and fall MAP math scores for students who tested in both seasons. This was done at a high-performing suburban middle school and at an urban “beat the odds” elementary school with a high number of English Language Learners. The results were astonishingly similar: the averaged cohort results showed only a little loss over the summer at most grade levels, and these minimal numbers closely aligned with NWEA’s own norming studies (NWEA, 2011). How could this be?
To shed some light on this, I sorted the students by their previous fall math score. The clear, consistent pattern was that the higher the achievement level, the greater the summer slide. High-level math students showed the largest slide, while the lowest performers actually made gains over the summer. This evidence flew in the face of what I believed. Perhaps we are not asking the right questions about summer slide. Has learning really stopped over the summer, or does it take on a different form not measurable by a standardized test? Are standardized tests measuring lower-level recall skills and not deeper concepts of learning? Are poorly performing students already doomed because of the lack of progress during the school year?
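The analysis described above can be sketched in a few lines of code. This is a hypothetical illustration, not the author’s actual method or data: it sorts students by a prior achievement score, splits them into quartiles, and averages each quartile’s spring-to-fall change. The `students` records and the function name are invented for demonstration.

```python
# Hypothetical sketch: average summer change (fall minus spring) in a
# test score, grouped by quartile of prior achievement. All data invented.

def summer_slide_by_quartile(records):
    """records: list of (prior_fall_score, spring_score, next_fall_score).
    Returns the average summer change per prior-achievement quartile,
    lowest quartile first. Positive = summer gain, negative = slide."""
    ranked = sorted(records, key=lambda r: r[0])   # sort by prior fall score
    n = len(ranked)
    quartiles = [ranked[i * n // 4:(i + 1) * n // 4] for i in range(4)]
    return [
        round(sum(fall - spring for _, spring, fall in q) / len(q), 2)
        for q in quartiles
    ]

# Invented example records: (prior fall, spring, following fall)
students = [
    (180, 195, 197), (185, 200, 201), (190, 204, 204), (195, 208, 207),
    (200, 212, 211), (205, 216, 214), (210, 220, 217), (215, 224, 220),
]

print(summer_slide_by_quartile(students))  # → [1.5, -0.5, -1.5, -3.5]
```

With these made-up numbers, the lowest quartile gains over the summer while the highest quartile slides the most, mirroring the pattern described above.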
Math, like many skills, is a “use-it-or-lose-it” proposition. A long break brings on rustiness that can usually be fixed with a bit of spiraling upon return. High-performing students lose the most, but they have the confidence and skills to rebound quickly from the slide. On the other hand, students with the biggest gaps in math still have to survive in a world where basic math skills are used daily. While those daily interactions may build some math skills, when these students return to school they may lack the confidence and the learning skills to succeed in a system that has not worked well for them in the past.
I do not offer a solution for the summer slide, just a call to action for a triumphant end-of-summer return. Schools must rekindle the fire and passion for learning in all students, no matter their achievement level. Instructional strategies in our classrooms must focus not on what works but on what works best (Hattie, 2009), both for student success and for developing a culture of learning built on clear learning targets and informed by actionable formative assessment (Wiliam, 2011). We have to reengage both our “summer slide” students and those chronically unengaged in a learning and assessment culture that lets students see the fruits of their labors pay dividends in the sweet success of learning, not simply an assessment culture of “remembering.”
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.
Northwest Evaluation Association. (2011). Measures of Academic Progress normative data. Portland, OR: Author.
Wiliam, D. (2011). Embedded formative assessment. Bloomington, IN: Solution Tree Press.