The question of “Why Assess?” is one that is posed in schools and districts everywhere. It’s important to challenge educators to think about their assessment practice and how they derive information about student progress. If the purpose of assessment is merely to rank and sort, then little needs to change from the assessment practices of previous generations. If, instead, the purpose is to focus on student learning, then educators need to examine whether their current practice is aligned with that outcome. In “Teacher As Assessment Leader” (2009) I suggested that the teacher’s role is to “make frequent environmental scans to collect formal evidence such as assessments, exams, or homework, and informal evidence such as the questions students may ask, their comments during group work, or even their confused expressions” in order to foster a productive exchange of information between teacher and student as part of an ongoing, seamless assessment and instruction plan. Analysis of this evidence allows educators to plan next steps and adjust instruction going forward.
Assessment data are only effective when they’re instructionally actionable. It’s no longer acceptable to limit assessment analysis to determining what’s wrong with students. Teachers must use the evidence of student learning to collaborate with colleagues to identify either teaching strengths to share or areas of concern for which to seek new instructional strategies. The purposes of assessment ought to be framed around diagnosing student learning difficulties and setting individual teacher and team goals for student improvement.
Professor John Hattie (2012) in “Visible Learning for Teachers: Maximizing Impact on Learning” offers the flip side to the question when he states, “There are certainly many things that inspired teachers do not do; they do not use grading as punishment; they do not conflate behavioral and academic performance; they do not elevate quiet compliance over academic work; they do not excessively use worksheets; they do not have low expectations and keep defending low quality learning as ‘doing your best’; they do not evaluate their impact by compliance, covering the curriculum, or conceiving explanations as to why they have little or no impact on their students; and they do not prefer perfection in homework over risk-taking that involves mistakes.” While this encompasses more than the assessment conversation, it equally serves as a compelling support for the broader discussion schools and districts ought to engage in.
In Starting A Movement, Ken Williams and I (2015) suggest,
“If you’ve been teaching for more than two years, then you have experienced the very humbling phenomenon of roller coaster results from one year to the next. In a culture of collective responsibility, instead of placing the burden of expected achievement on uncontrollable factors, teachers focus on the collective expertise of the people inside the school. They no longer analyze common assessment data out of compliance. They see the process of engaging in this type of dialogue as part of the promise their school has made to students, parents, and staff.”
The answer to the question of “Why Assess?” is rooted in these actions that committed educators take as they continue to work toward improving the life chances of every student. Regardless of the initiative you and your colleagues are engaged in, the quality of your assessment practice will be the linchpin of success.
As you read this and think about your next lesson, also think about a next first step in assessment. Wherever you currently are in your assessment practice is your starting point. That next first step could involve student analysis of their assessment (reviewing it for errors and then structuring a learning plan before being re-assessed), it could involve you changing the instructional design and re-teaching a content piece students were less successful with (an intervention plan that presents content with a new instructional strategy or more time), or it could involve a new format of assessment (do all learning outcomes need to be assessed with paper and pen?).
Gathering high-quality evidence, using that evidence to guide next steps, and then gathering more evidence of the efficacy of the strategy will provide educators and their students the opportunity to focus on reaching those broader goals. At the end of the day, assessment is too valuable to waste by leaving it as an end product and too significant as a daily routine to ignore the evidence that could strengthen the teaching-learning connection.
I agree with your views on assessment and that the analysis of the data must contain plans for actionable instruction. However, my school provides a curriculum calendar that must be followed. Re-teaching subjects not mastered changes our calendar. While I disagree with this, I would like your take on administrators creating calendars that must be followed with fidelity. Our kids are losing out because of this.
Hi Mary. I agree with your analysis. If we continue with the frantic attempt to cover everything, we will move the academic success needle marginally (if at all). Having a pacing calendar (defining the number of weeks for a unit with built-in buffer time between units) is a more practical approach than a firm pacing guide, which delineates what is happening on each day. With the latter approach, I (and many of my colleagues who talk about sound assessment practice) do not think it is possible to use assessment in an actionable way, intervene on behalf of students, or do anything but use the resulting data as a hammer (rank and sort). Thanks for your feedback!
Thank you for a multi-faceted summary of why to assess. This brief blog says so much in terms of helping someone beginning their assessment journey and giving them a few (certainly not overwhelming) critical points to consider. In addition, your blog also gives those who have been at it for a while some direction and a few points to consider, all based on strong research, to help them tweak their practice and thinking.
Both of these scenarios remind us all that this process of assessment is to take a quick reading and make decisions based on a broad base of evidence, ultimately moving students forward in their achievement. And finally, when necessary, to use this body of evidence to evaluate and, as accurately as possible, determine a grade for the outcomes assessed. Thank you for this… It will be useful with my staff.
Thanks Rick. I appreciate the feedback (and got a chuckle that both of our names were mangled) and the insights you shared. If we want to use the evidence appropriately, we need to stop using data as a hammer (rank and sort, which promotes little change in either teaching or learning) and start using data as a flashlight (convert to evidence that enlightens both teacher and learner).
The road to effective common assessment is a long one for many campuses, requiring intentional diagnosing, planning, training, monitoring, and coaching of a bevy of skill sets for educators. Yielding “instructionally actionable” data is not something that just happens, in part because assessment literacy is often not taught in teacher preparation programs. District and campus leadership can easily espouse the need for common formative assessment, but all too often bypass the crucial questions that accompany this necessary strategy: Are modes of assessment aligned to the learning objectives in outcome and rigor? Do teachers and team members trust one another enough to expose their students’ results to one another? Do they know how to analyze the data and create action plans? Has the time that this level of intentionality requires been allotted in the overall structure and schedule of the school day and year? Does the system offer competent personnel for the coaching and differentiation of professional development necessary to accomplish this goal?
Thanks Staci. You have surfaced a lot of critical pieces that must be part of sound assessment practice. It all begins with purpose, borne out of a collective commitment by all members of a school faculty. Effective schools don’t just take all the fruit; they also tend to the root. This is done through the strategies you’ve identified in your comments.