The rapid onset of COVID-19 that forced schools across North America to immediately pivot to a virtual learning model may have exposed some aspects of teaching and learning long overdue for reconsideration. For some teachers, what became most evident was that transitioning to a virtual learning model was not as seamless as it could have been.
There was, of course, no way anyone could have anticipated or avoided the steep learning curve that students, parents, and teachers experienced as educators swiftly reimagined the remainder of the school year. While some teachers may have had the benefit of a spring break (it wasn’t much of a “break”) to prepare, most had to immediately rethink nearly everything we had all come to expect from our day-to-day routines.
With no immediate end to this pandemic in sight, the collective rethinking of what evidence of learning truly matters and how to maintain academic integrity has to be prioritized going forward.
What some teachers (though not necessarily many or most) discovered about assessment is that some of what was planned for face-to-face instruction did not transfer to remote learning at all. Generally, the lower the cognitive complexity of a standard (and its subsequent assessment), the more teachers worried about academic dishonesty and about verifying that the evidence of learning was the student’s alone. The trauma of this global pandemic created uncertainty that has yet (with some exceptions) to subside, and uncertainty leads students to make poor decisions at the best of times, never mind during a once-in-a-lifetime event.
Students don’t have adult brains. The part of the brain responsible for sensation-seeking develops in puberty, while the part responsible for impulse control does not mature until young adulthood (Weisinger & Pawliw-Fry, 2015). That gap helps explain why teenagers engage in risky behaviors like cheating, and it can leave many of them feeling desperate under pressure.
Tom Guskey (2020) recently wrote: “Students cheat on assessments because of the consequences attached to their performance and uncertainty about results. In other words, they fear what might happen if they don’t do well and they’re unsure about how best to prepare.” The 2020-21 school year presents arguably the most uncertainty students have faced in generations, which means teachers and parents have to know that the majority of students aren’t acting with malice; it’s desperation.
So, what’s the answer? How do we ensure academic integrity within the summative purpose of assessment within a remote learning model?
The answer is to prioritize evidence of original thinking and creativity. First, recall is not thinking; as Dylan Wiliam (2020) recently pointed out, “Psychologists who research memory…point out that how easy it is to retrieve something from memory is different from how well something has been learned—what they call retrieval strength and storage strength, respectively.”
So even when students successfully retrieve from memory, that retrieval is often a poor indicator of how well they have learned. Second, assessments emphasizing recall are easy to copy. There is simply no way for teachers to control the assessment context in a virtual or even hybrid model; nor should they want to. Spending a disproportionate amount of time and energy trying to manage the assessment context is a certain path to burnout.
Eliciting evidence of original thinking means that students cannot lift their evidence of learning directly from any outside source, and teachers need not waste valuable energy wondering whose learning they’re assessing. Yes, students can use outside sources to shape their thinking, but a direct copy would likely be clear and obvious. When students are asked to critique, analyze, synthesize, interpret, or inquire, they must create a cohesive, original thought on the topic at hand. It’s their inquiry question, their analysis, or their critique. The good news is that so many standards are already at an appropriate level of cognitive complexity that this should not represent a tough ask for most teachers. Below is a small sampling of standards that illustrate this point:
- Write arguments to support claims with clear reasons and relevant evidence.
- Develop a probability model and use it to find probabilities of events. Compare probabilities from a model to observed frequencies; if the agreement is not good, explain possible sources of the discrepancy.
- Define the criteria and constraints of a design problem with sufficient precision to ensure a successful solution, taking into account relevant scientific principles and potential impacts on people and the natural environment that may limit possible solutions.
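The probability standard above asks students to build a model, gather observations, and explain any discrepancy between the two. As a minimal sketch of what that comparison involves (the function name, data, and model here are my own illustration, not part of any curriculum), a fair-coin model can be checked against a small sample of flips:

```python
from collections import Counter

def compare_model_to_observations(model, observations):
    """Compare a probability model (outcome -> probability) to observed
    frequencies, returning expected vs. observed values per outcome."""
    counts = Counter(observations)
    n = len(observations)
    return {
        outcome: {
            "expected": p,
            "observed": counts[outcome] / n,
            "discrepancy": counts[outcome] / n - p,
        }
        for outcome, p in model.items()
    }

# A fair-coin model checked against ten flips.
fair_coin = {"H": 0.5, "T": 0.5}
flips = ["H", "H", "T", "H", "T", "H", "H", "T", "H", "H"]
report = compare_model_to_observations(fair_coin, flips)
```

With 7 heads in 10 flips, the observed “H” frequency is 0.7, a discrepancy of +0.2 from the model. Explaining a plausible source of that gap (here, the small sample size) is the original-thinking part of the task, and it cannot be copied from an answer key.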
Creativity, if we follow Howard Gardner’s (2010) definition, means students will need to think beyond conventional wisdom and habitual practice; it is not simply producing something aesthetically pleasing. With that in mind, teachers would be wise to emphasize the formative acquisition of foundational knowledge and skills, focusing on feedback and the development of that knowledge and those skills in the absence of grades, scores, or levels.
Creativity does not occur in a vacuum. For creativity (or any critical 21st century competency) to authentically develop, students need a breadth and depth of knowledge; it’s a repurposing of knowledge as a means rather than an end (Erkens, Schimmer, & Vagle, 2019). As Gardner asserts, “…it’s not possible to think outside the box unless you have a box” (p. 17).
Of course, not every standard reaches the same depth of knowledge, which means we have two choices:
- We prioritize the standards that are more cognitively complex and allow for original thinking, or
- We use evidence of learning on the less sophisticated standards formatively and focus exclusively on feedback; in some cases, you might do both. Remember, using assessment evidence for a summative purpose does not necessarily require a summative event. Summative assessment is truly a moment when the teacher examines the preponderance of evidence to make an overall, holistic judgment about the level of proficiency achieved, whether on one standard, a unit, or an entire course.
The overarching point is to disproportionately favor the formative purpose of assessment by emphasizing descriptive feedback to maintain a positive growth trajectory. When the time comes for summative judgments, ensure that the assessment tasks or prompts elicit original and creative thinking. If students copying other students is a primary concern, then the relatively simple solution is to design assessments that aren’t copyable; students can’t copy what can’t be copied.
For our youngest learners this may mean a more synchronous assessment routine, which can be daunting. However, emphasizing quality over quantity will ensure that what evidence is elicited is reliable and valid. Teachers at every level are going to have to accept that they will have less evidence to use for summative judgments; however, less does not have to mean poor or invalid.
High-quality assessment in a remote or hybrid model is doable as long as the assessment tasks (and performance criteria) are deep, authentic, and appropriately sophisticated for the learners. This is a change that I, for one, hope becomes a permanent part of the new normal even when full-time, face-to-face learning returns.
Erkens, C., Schimmer, T., & Vagle, N. (2019). Growing tomorrow’s citizens in today’s classrooms: Assessing seven critical competencies. Bloomington, IN: Solution Tree.
Gardner, H. (2010). Five Minds for the Future. In J. Bellanca & R. Brandt (Eds.), 21st century skills: Rethinking how students learn (pp. 9-31). Bloomington, IN: Solution Tree.
Guskey, T. (2020). What to do about cheating on assessments in virtual learning? [Blog post]. Education Week. Retrieved September 2, 2020, from https://blogs.edweek.org/edweek/finding_common_ground/2020/08/what_to_do_about_cheating_on_assessments_in_virtual_learning.html
Weisinger, H. & Pawliw-Fry, J.P. (2015). Performing Under Pressure: The Science of Doing Your Best When It Matters Most. New York, NY: Crown Business Publishing.
Wiliam, D. (2020). COVID-19 learning loss: What we know and how to move forward. [Blog post]. Education Week. Retrieved September 2, 2020, from http://blogs.edweek.org/edweek/rick_hess_straight_up/2020/08/covid-19_learning_loss_what_we_know_and_how_to_move_forward.html