When educators think about state accountability testing, it is rarely connected with the process of fostering reflective learners – but maybe it should be. If a student develops meaningful relationships in a learning community while being guided by formative assessment feedback, then the state assessment becomes a simple exercise in “showing what you know.” As educators “build” a learning path with quality assessment, “pave” the path by providing students with the tools to reflect on their learning, and “illuminate” it with the “light” of understanding expectations for future success – and then push students beyond those expectations (Hattie, 2009) – the state test becomes simply a small part of a balanced assessment system.
As Data and Assessment Coordinator, I am tasked with supporting all things assessment, from informal in-the-moment formative evaluation to high-stakes tests. Not having a classroom presence pushes me to interact with students whenever possible; as a result, most students will freely converse with me. Several months before state testing begins, I engage in a shameless attempt to increase proficiency rates by motivating students who are near the bar to jump a little higher. Two groups of 3rd and 4th grade students are slated to visit with me about their learning progress. The first are the “fence sitters,” those who have climbed almost to the bar but need a nudge to get over. The second are the “bouncers,” who test like experts at one session but, days or weeks later, have forgotten all they knew.
Individual conferences last 20 minutes and focus on a data-rich discussion about learning progress. I offer my assurance, based on multiple sources of data, that they are on track to successfully “meet the standard” on the upcoming tests. The data we ponder come from a variety of sources. Visible, student-friendly learning targets reflecting standards are the norm; these are assessed in a variety of ways and documented in a local database, which allows us to identify students who are not meeting standards so they can receive academic interventions. These data are highly correlated with students’ final results on the state tests. As we conclude our discussion, I point out that the tests are simply an opportunity to “show what you know” about your learning.
In reality, what happens next is much more enriching and astonishing than simply raising a test score.
As a student enters my office, our conversation centers on math and reading, and I ask which is their favorite. This dialogue allows us to start the conversation from a place of strength (Dimich, 2015). As Dylan Wiliam (2011) points out, nothing motivates like success. We discuss strengths and challenges in both math and reading. Students talk about strategies they use when “stuck” in their learning and identify support people who can help. The conversations are easy because the student understands the targets and can explain their learning in relation to them (Sadler, 1989). This happens not by accident but by design: standards-based learning targets help students know what is to be learned, and rubric-based assessments help them clarify where they are in their learning and how to move to the next step. Intertwined in the discussion are references to the critical student/teacher relationship that is central to engaging learners (Dimich, 2015).
Even with these systems in place, there is still a gap that must be crossed when students are not invested in their learning. The structure bridging this gap is built with classroom assessments, which engage students and help them understand what success looks like. This leads to the biggest step a student can make in their education – investing as an active participant in their own learning (Dimich, 2015). If the system, the curriculum, and the instruction are the bricks used to build success, student investment is the mortar.
What were the outcomes of these conversations?
- Remember, my shameless purpose was to increase the number of “proficient” students on the state tests. Since I was targeting students who were on the verge of jumping the proficiency bar, I simply wanted to increase engagement, boost their confidence, and reduce stress. Did it work? In math, the answer is a resounding yes. I would have expected fewer than half of these students to meet the standard on the state test; in reality, almost all reached the standard, and the rest were very close. In reading, the results were much more mixed, but this is a tougher area for our school given that most of the students are English language learners. There is more work to be done.
- The sessions always started with the students talking about math or reading. There was a high degree of interest in math but not in reading – an unexpected problem. I shared this finding with our staff, and many were not surprised. They had told me this in the past, but I finally had the ears to hear. This has heightened our discussion of how we can improve reading instruction.
While I believe that these conferences might have given some students that final boost to help them over the bar, the reality is that it takes high-yield, research-proven classroom practices to build a rich culture of learning. Proven, powerful practices, like standards-based formative assessment coupled with descriptive feedback, move all students forward in their learning journey. It is not about working harder; doing the right work effectively offers the best hope of building strong learners.
Dimich, N. (2015). Design in five: Essential phases to create engaging assessment practice. Bloomington, IN: Solution Tree Press.

Hattie, J. A. C. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London: Routledge.

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.

Wiliam, D. (2011). Embedded formative assessment. Bloomington, IN: Solution Tree Press.