“And once I had recognized the taste of the crumb of madeleine (cake)...immediately the old grey house upon the street, where her room was, rose up like the scenery of a theater...so in that moment all the flowers in our garden...and the water lilies on the Vivonne and the good folk of the village and their little dwellings and the parish church and the whole of Combray and its surroundings...sprang into being, town and gardens alike, all from my cup of tea.”
- Marcel Proust, Remembrance of Things Past
Our desire to elucidate learning, memory, and mind, those strange bedfellows that inhabit the dynamic world between our ears, is as old as philosophy itself. While a research orientation approximating cognitive science has been around for more than a century, we are now in a golden age. In recent decades, an array of disciplines, most especially neurology, psychology, sociology, anthropology, and education, has sought to understand learning in a more precise way; that is, with less anecdote and more data. In particular, the last decade of research has given us remarkable insights into the nature of learning. Many of these insights are counterintuitive and challenge long-standing myths about teaching and learning. According to Peter Brown et al., authors of Make It Stick: The Science of Successful Learning (2014), “A great deal of what we think we know about how to learn is taken on faith and based on intuition but does not hold up under empirical research. Persistent illusions of knowing lead us to labor at unproductive strategies.” Educators are only beginning to come to terms with these findings.
All learning begins with memory. At first, this statement might seem obvious. If so, then it is a wonder that we ignore the profound implications of how memory actually works with regard to durable and transferable learning, and that we cling so desperately to myths about it. Cognitive psychologist Daniel Willingham, in Why Don't Students Like School? (2009), contends that “memory is the residue of thought”: not a perfect record of experience so much as a snapshot of what we understood and related to most in a particular moment. This process “...of converting sensory perceptions into meaningful representations in the brain is still not perfectly understood. We call the process encoding, and we call the new representations within the brain memory traces” (Brown et al.).
These initial memory traces require consolidation before they constitute learning. Consolidation is the brain stabilizing and reorganizing these traces into discernible and useful memories. Research suggests that “...the brain replays or rehearses the learning, giving it meaning, filling in blank spots, and making connections to past experiences and to other knowledge already stored in long term memory. Prior knowledge is a prerequisite for making sense of new learning, and forming those connections is an important task of consolidation” (Brown et al.). Benedict Carey, in How We Learn (2014), contends that “The brain does not store facts, ideas, and experiences like a computer does, as a file that is clicked open, always displaying the identical image. It embeds them in networks of perceptions, facts, and thoughts, slightly different combinations of which bubble up each time. And that just retrieved memory does not overwrite the previous one but intertwines and overlaps with it. Nothing is completely lost, but the memory trace is altered and for good.” Thus, our ability to recall information effectively and efficiently from long-term memory, the storehouse of our learning, is wholly dependent on encoding, recoding, consolidation, and association.
Cognitive science identifies a variety of memory types. For our purposes, we will focus on the interplay between experience, working memory, and long-term memory. Experience is the source material for our memories. Working memory, while relatively limited in duration and capacity, allows us to hold the object of our attention, e.g. steps in a process, directions, an immediate problem or concern, long enough to retrieve important information from our long-term memory. Long-term memory is home to our most permanent memories; specifically, our procedural memory (how to do things), our episodic memory (the autobiographical accounts of past experience), and our semantic memory (the ability to recall words, numbers, and concepts). Our long-term memory is constantly evolving. There is a “use it or lose it” phenomenon where long-term memory is concerned—a mental pruning that doesn’t eliminate past memory, but certainly shifts some of it to unmarked boxes in the back of our cerebral garages (see the above Proust quote). What is most remarkable about long-term memory is that a simple stimulus—a smell, a song lyric, a phrase—can bring even our dimmest recollections to the fore, and this has everything to do with how closely memory is tied to context, emotion, and conscious and subconscious stimuli.
So what does all of this have to do with learning? While we have some intuitive sense of how memory works, we get a lot about how memory shapes our learning very wrong. This is because, among other things, we are creatures of habit who prefer our learning to come easily. We also harbor a ready-made narrative about how learning occurs, one tied to our past experience and to persistent conventions in education about how material is presented and how learning is assessed. It also has a lot to do with our personal myths about our “learning style” and our adopted study habits. Cognitive science provides a counterintuitive view of the interplay of memory and learning, and challenges these narratives and myths.
Let’s begin with massed practice, a stalwart in education, coaching, and professional development. Massed practice—doing the same problem type or action over and over—yields immediate but short-lived and relatively superficial results. At a minimum, spaced practice produces a far better return for retrieval and learning. Interleaving, working on different problem types in close succession, results in far more durable learning. Think of this as mentally toggling between different concepts and skills, intermittently returning to each. We learn most successfully, and most deeply, when we do this, and we do it more often than we likely recognize (e.g. learning a new program, application, or skill at the same time that we are working on a specific work-related problem or problems). Nonetheless, this type of learning is more challenging because some confusion and forgetting occurs, and this leads to more frustration on the front end. Children and adults alike prefer immediate gratification, e.g. the successful completion of ten of the same problem types in a row, over interleaving. But interleaving has additional benefits—it helps us “learn better how to assess context and discriminate between problems, selecting and applying the correct solution from a range of possibilities” (Brown et al.). If one can do the latter, durable and transferable learning—conceptual understanding—has occurred.
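For readers who think in code, the contrast between massed and interleaved practice can be sketched in a few lines. This is only an illustrative toy, not anything from the cited authors, and the topic names are hypothetical:

```python
import random

def blocked_schedule(topics, reps):
    """Massed practice: all repetitions of one problem type, then the next."""
    return [topic for topic in topics for _ in range(reps)]

def interleaved_schedule(topics, reps, seed=0):
    """Interleaved practice: the same total repetitions, shuffled so that
    different problem types appear in close succession."""
    schedule = blocked_schedule(topics, reps)
    random.Random(seed).shuffle(schedule)  # seeded for a repeatable example
    return schedule

# Hypothetical topics for a week of math practice
topics = ["fractions", "ratios", "percentages"]
massed = blocked_schedule(topics, 3)        # fractions x3, ratios x3, ...
interleaved = interleaved_schedule(topics, 3)  # same work, mixed order
```

The point of the sketch is simply that the two schedules contain identical work; only the ordering—and therefore the amount of retrieval effort demanded—differs.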
And here we arrive at another counterintuitive fact: some confusion, some difficulty, some forgetting, is required to deeply learn a thing. This paradox “...is at the heart of the concept of desirable difficulties in learning: the more effort required to retrieve (or, in effect, relearn) something, the better you learn it. In other words, the more you’ve forgotten about a topic, the more effective relearning will be in shaping your permanent knowledge” (Brown et al.). Put another way, “Normal forgetting—that passive decay we so often bemoan—is also helpful for subsequent learning...Some ‘breakdown’ must occur for us to strengthen learning when we revisit the material. Without a little forgetting, you get no benefit from further study” (Carey).
Furthermore, a little discomfort goes a long way in overcoming the familiarity trap. If we think we know a thing, we are less likely to practice diligently, to quiz ourselves, to ensure our understanding, and more likely to take shortcuts, which undermines both memory and learning. Consider the use of flashcards. If we never alter the order of the stack, we can anticipate the next card, the next answer. However, if we reshuffle the deck continuously, we better measure both the accuracy of our retrieval and the depth of our knowing.
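The reshuffling idea is concrete enough to sketch. Here is a hypothetical study loop (mine, not the cited authors’): with a fixed order we can lean on sequence memory; with a reshuffled deck, each card forces genuine retrieval:

```python
import random

def review_order(deck, shuffle=True, seed=None):
    """Return the order in which flashcards will be shown this session.
    With shuffle=True, the next card can't be anticipated, so retrieval
    is tested rather than memory of the sequence."""
    order = list(deck)  # copy so the original deck is untouched
    if shuffle:
        random.Random(seed).shuffle(order)
    return order

cards = ["encoding", "consolidation", "retrieval", "interleaving"]
fixed = review_order(cards, shuffle=False)  # identical order every pass
varied = review_order(cards, seed=42)       # reshuffled for this session
```

Note the design choice: the deck itself never changes, only the presentation order, which is exactly what the paragraph above recommends.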
But something else is in play when we reach an intellectual impasse: incubation. Walking away when we struggle to solve a problem seems like simply abandoning the task. Not necessarily so. We often continue processing unconsciously, sorting through our memory and past knowledge, strategically forgetting, taking in other clues, even reconsidering the problem itself (Carey). This is why, after a coffee break or a run or some other unrelated task, we return with fresh insight or a novel angle of approach.
Similarly, interference actually supports learning, provided it’s done thoughtfully. Interference refers to discrepancies in how information is presented, e.g. the distinctions between a textual overview (a textbook or similar) and class instruction on the same material, or between a PowerPoint and the lecture that accompanies it. We have to work harder to make sense of material delivered in this way, to consolidate the variance between competing narratives, and to associate these with our prior knowledge. Consequently, this approach is more likely to stick in our memory and to result in more durable and transferable learning (I repeat this phrase intentionally).
One long-standing myth about learning, promoted by none other than the esteemed mid-century intellectual B.F. Skinner, is the notion that error should be avoided at all costs. This myth arises from the fear that if we commit an error to memory, whether it be factual, conceptual, or algorithmic, it is therefore learned, tenacious, and potentially permanent. Cognitive science disagrees, for “When learners commit errors and are given corrective feedback, the errors are not learned. Even strategies that are highly likely to result in errors, like asking someone to try to solve a problem before being shown how to do it, produce stronger learning and retention of the correct information than more passive learning strategies, provided there is corrective feedback” (Brown et al.). Skinner’s broadly adopted notion fails to hold up to scientific scrutiny. Worse, it can inhibit future learning and intellectual resilience. Consider, for example, the critical role of failure in innovation and the advancement of the sciences. Students who accept that they will inevitably commit errors in their learning are better prepared for more difficult problems in the future, and are more likely to seek them out. They also tend to learn from their mistakes.
While we still have much to discover about how learning occurs, and just how this complex process should shape our conversations about pedagogy and assessment, we know definitively that memory plays a critical role in our knowing and meaning making. Understanding how memory works, and confronting the myths associated with learning, are central to improving learner outcomes.
See you around campus.