Review of Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, by Carol Tavris and Elliot Aronson (Mariner Books, New York, 2007, 2015)
The Netflix documentary Making a Murderer was all the rage earlier this year. The series, filmed over a period of ten years, follows the story of Wisconsinite Steven Avery, a two-time convict. Avery was exonerated by DNA evidence while serving time for his first conviction, only to be charged with a second violent crime shortly after his exoneration and release.
While the focus is on Avery, the story also includes his nephew, Brendan Dassey, who was convicted of collaborating with Avery in the murder of Teresa Halbach. The documentary drew attention to the means by which investigators elicited a confession from Dassey, who seemed clearly vulnerable to investigators, given both his young age and his intellectual disability.
However one lands on the question of Avery's (and Dassey's) culpability, the documentary does a fantastic job of raising hard questions about the role of biases within our criminal justice system and of showing how easily things can go awry. It presses the point that, no matter the good intentions of police and prosecutors, the cloud of subjectivity shapes judgment and, eventually, determines outcomes; this can have serious, life-altering consequences and, if the judgments are wrong and the outcomes unjust, can be tragic.
Some rightly argue that the filmmakers have their own set of biases, which cloud their judgment and, as a consequence, their viewers' judgment too.
The series closes with a moving reflection by Avery’s attorneys—both of whom come off as probably the most likable (and, for me anyway, the most rational) people involved in the whole affair. The remarks of Attorney Dean Strang caught my attention.
Reflecting on the jury’s verdict of “guilty,” Strang had this to say:
“I just can’t imagine he [Avery] did them…and I don’t believe he did them.
The forces that caused that I understand and I don’t think are driven by malice. I think [they] are just expressions of ordinary human failing. But the consequences are what are so sad and awful.
Most of what ails our criminal justice system lies in unwarranted certitude on the part of police officers and prosecutors and defense lawyers and judges and jurors, that they're getting it right. That they simply are right. Just a tragic lack of humility—of everyone who participates in our criminal justice system." (emphasis added)
Granted, Dean Strang has biases, too. And who's to say whether Avery, in the end, is really guilty or not? That debate, at least among the interested public, goes on still today.
But that phrase, "a tragic lack of humility," seems to me to describe not only glaring weaknesses in the criminal justice system (at least as portrayed in "Making a Murderer"); it also captures so much of what ails our society as a whole. So often we are driven by certainty, empowered by a relentless need to demonstrate, to confirm, that we (me, my tribe, my party, etc.) are absolutely, unequivocally in the right.
It’s so hard to admit that we might be wrong, especially when we have so much invested in being right. This is true on the public scale (criminal justice, politics, policy-making, religious conflict, etc.) and it’s true on the private scale (navigating conflicts in marriage or other relationships, purchasing more house than we can afford, or just generally admitting our own mistakes to ourselves).
In their book, Mistakes Were Made (but Not by Me), Carol Tavris and Elliot Aronson explain why we have such difficulty admitting that we are wrong—or even seriously considering that we might be wrong. Drawing on empirical research as well as historical and current examples, they demonstrate why it's so excruciatingly difficult to admit our mistakes, to acknowledge that we've read the evidence incorrectly, or to fess up that we've misremembered an event from the past. They show why it's so hard for us to say "I'm sorry."
Our difficulty with humility—with making and admitting mistakes—can be largely explained by a psychological theory called "cognitive dissonance." The theory was first developed by social psychologist Leon Festinger in the 1950s; the phrase has become part of popular lingo, but its profound implications probably elude most who use it.
Cognitive dissonance, Tavris and Aronson explain, is "the state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as 'Smoking is a dumb thing to do because it could kill me' and 'I smoke two packs a day'" (pp. 15-16). Wherever and whenever such dissonance occurs, the person experiencing it will naturally work to reduce or eliminate it, either by changing a behavior (e.g., quitting smoking) or through more "ingenious" ways, such as "by convincing herself that smoking isn't really so harmful" or that it is "worth the risk" because of some other benefit (such as weight loss).
The need to eliminate dissonance means that we find all sorts of creative ways to deal with "disconfirming evidence" in order to maintain our prior beliefs, convictions, or behaviors. This ability to contort our minds in order to maintain consonance is known as the "confirmation bias" (p. 22).
When we encounter conflicting information or competing beliefs, we naturally ferret out information and evidence that supports the beliefs we already hold, while discounting whatever runs against them. These can be beliefs about politics, economics, religion, or even about ourselves (who we are and what we stand for). Cognitive dissonance, whenever we experience it, cannot be sustained for long. We don't have the mental energy to live with it.
The implications of cognitive dissonance and confirmation bias are innumerable. They run from our culture wars, our political competitions, and our policy debates to the more personal and relational: marriage conflicts, family disagreements, religious beliefs and practices.
There is good news. By understanding cognitive dissonance and the impact of confirmation bias, we'll be better prepared to overcome their most pernicious effects.
This book just might change your life. It will teach you the art of making mistakes—and how to say “I’m sorry.”
And just a few days ago, on August 12, 2016, a federal judge in Milwaukee, Wisconsin declared that mistakes were made in the 2007 conviction of Brendan Dassey for the murder of Teresa Halbach. Ruling that the methods investigators used to obtain Dassey's confession were coercive, given his age and intellectual disability, the judge overturned Dassey's murder and sexual assault convictions and ordered his release from prison unless prosecutors schedule a new trial within 90 days.
In two subsequent posts I will draw on this book, Mistakes Were Made (but Not by Me), to explore the implications of cognitive dissonance and confirmation bias for (1) the current public discussion of prejudice and racism in policing and (2) the political season.
Note: Mistakes Were Made (but Not by Me) was originally published in 2007 and has been reissued in an edited version (2015). In the foreword to the new edition, the authors acknowledge that there may have been a few mistakes made (by them) in the first edition.
Kyle Roberts is Associate Professor of Public and Missional Theology at United Theological Seminary of the Twin Cities. Click here to read his Patheos blog, UnSystematic Theology.
The views, opinions, authors, and contributors represented in The Table do not necessarily represent the beliefs of Biola University or the Biola University Center for Christian Thought.