Learning to Unlearn: Adapting to New Evidence in the Face of Experience
In medicine, experience is a source of wisdom, but it can also become a barrier to learning. Clinical habits, forged over years and reinforced by patient outcomes, often resist new evidence—even when that evidence is robust. For doctors committed to lifelong learning, the hardest lessons aren’t always new treatments or guidelines. Sometimes, the real challenge is letting go of what we thought we knew.
The Psychology of Unlearning
Changing a well-worn practice is not just an intellectual exercise—it’s a psychological one. Theories of adult learning, such as Mezirow’s Transformative Learning Theory, describe how adults revise meaning structures only when faced with a disorienting dilemma (Mezirow, 1997). In medicine, this might come in the form of a new guideline, an audit of practice, or a bad outcome. But recognition alone is not always enough.
Several cognitive and emotional factors can block this change:
Ego and Identity Threat
To admit that a long-standing practice is outdated can feel like admitting we were wrong—or worse, that we caused harm. This presents a direct challenge to the clinician’s self-concept. Self-affirmation theory suggests that when our core identity is threatened, we are more likely to dismiss conflicting evidence and to give additional weight to evidence that supports our view (Steele, 1988). In medicine, where competence is closely tied to professional worth, such threats can be powerful.
Confirmation Bias and Anecdotal Reinforcement
Clinicians are particularly susceptible to confirmation bias—the tendency to favour information that supports our beliefs and to ignore data that contradict them (Nickerson, 1998). A patient who improves despite inappropriate treatment reinforces the behaviour, even if the outcome was coincidental. This is the root of “anecdote-based medicine.”
Repeated success reinforces habit, even when the practice is misaligned with the evidence. It’s a classic case of operant conditioning: reward strengthens behaviour.
The “In My Hands" Fallacy
A common response to unfavourable evidence is the belief that one can achieve better outcomes than those in the studies—because of superior technique, patient selection, or clinical intuition. This “in my hands” fallacy reflects overconfidence bias, where individuals overestimate their abilities relative to others (Kruger & Dunning, 1999). While it’s true that expertise can influence outcomes, this reasoning is often used to justify ongoing use of discredited therapies.
When Evidence Shifts
Medicine is replete with examples where guidelines changed—often dramatically—in the face of evolving evidence:
Sepsis Management
The Surviving Sepsis Campaign once endorsed early goal-directed therapy (EGDT), a protocol built around central venous pressure monitoring and aggressive fluid resuscitation. Subsequent trials (ProCESS, ARISE, ProMISe) showed no mortality benefit over usual care, leading to revised guidelines (Mouncey et al., 2015; Peake et al., 2014). Many clinicians were slow to change, citing past successes or a preference for protocolised care.
Opioid Prescribing
Once considered humane and safe for chronic non-cancer pain, opioid prescribing surged in the 1990s based on limited evidence. The resulting opioid epidemic revealed the flaws in that logic, but change has been slow. Clinicians often reference “trusted patients who need it,” further illustrating selective reasoning.
COVID-19 Treatment
Early in the pandemic, hydroxychloroquine, convalescent plasma, and avoidance of steroids were widely adopted based on theoretical benefit. High-quality trials (RECOVERY, Solidarity) later overturned many of these assumptions, leading to new protocols. Even then, uptake was inconsistent, highlighting the challenge of de-implementation.
The Problem with Evidence
Doctors are scientists, but medicine is an art coloured by uncertainty. And evidence, while powerful, is rarely absolute.
Imperfection and Rejection
Randomised trials, systematic reviews, and guidelines often produce probabilities—not guarantees. Doctors rightly ask: Is this study applicable to my patient? Are the inclusion criteria too narrow? Too broad? Are the outcomes relevant?
Such questions are reasonable, but they can also be weaponised to avoid change. Cognitive dissonance theory (Festinger, 1957) explains how individuals reduce discomfort by downplaying the importance or validity of contradictory evidence.
Population vs. Individual
Most clinical evidence is population-based: the average effect in a defined group. But doctors treat individuals, not averages. This leads to the argument that “there may be responders” or “I know which patients benefit.” While subgroup effects are real, they are notoriously hard to identify prospectively, creating tension between population-level evidence and individual clinical judgment.
Strategies to Support Unlearning
Recognising outdated practice is difficult—especially when it’s invisible to us. Here are strategies that can help:
1. Deliberate Reflection
Regular self-audit, journaling, and structured reflection help surface entrenched patterns. Platforms like Osler's CPD tools allow clinicians to log reflections and recognise shifts in thinking.
2. Direct Challenge
Deliberately testing your current practice can unearth areas where it lags behind the evidence. Structured self-assessments and quizzes that challenge your assumptions are good options.
3. Peer Review and Feedback
Engaging in peer feedback—whether via case discussions, performance review, or morbidity and mortality (M&M) meetings—offers an external mirror to identify blind spots. It normalises questioning and supports accountability.
4. Active Monitoring for Change
Doctors should proactively seek out guideline updates, review journals, and evidence summaries such as BMJ Best Practice or UpToDate, and set a schedule for reviewing their own default practices annually.
5. Institutional Support
Hospitals and health services must foster environments that value humility, adaptation, and team-based reflection. Unlearning is safer and more sustainable when done collectively.
Conclusion: The Courage to Change
Medicine evolves, and so must we. The willingness to unlearn is not a mark of failure—it’s a hallmark of professionalism. It takes cognitive flexibility, emotional resilience, and intellectual humility to let go of practices that once felt right.
So ask yourself: What have I stopped doing this year? And what does that say about my growth?
References
Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford University Press.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.
Mezirow, J. (1997). Transformative learning: Theory to practice. New Directions for Adult and Continuing Education, 1997(74), 5–12.
Mouncey, P. R., et al. (2015). Trial of early, goal-directed resuscitation for septic shock. The New England Journal of Medicine, 372(14), 1301–1311.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
Peake, S. L., et al. (2014). Goal-directed resuscitation for patients with early septic shock. The New England Journal of Medicine, 371(16), 1496–1506.
Steele, C. M. (1988). The psychology of self-affirmation: Sustaining the integrity of the self. In Advances in Experimental Social Psychology (Vol. 21, pp. 261–302). Academic Press.