Being Wrong II (Summer Reading)
Wednesday, July 27th, 2011
Robert Wechsler
This is the second of two posts looking at Kathryn Schulz's excellent book, Being Wrong: Adventures in the Margin of Error (2010), as it applies to local government ethics. This post focuses on how to deal responsibly with one's mistakes and, to the extent possible, prevent them.
Dealing Responsibly with Mistakes
What is an official to do? Although the author does not deal with government per se, she gets right at its heart in one of the conclusions she arrives at late in her book, a conclusion about what should be done in an organization to prevent mistakes:
[W]e can foster the ability to listen to each other and the freedom to speak our minds. We can create open and transparent environments instead of cultures of secrecy and concealment. And we can permit and encourage everyone, not just a powerful inner circle, to speak up when they see the potential for error. These measures might be a prescription for identifying and eliminating mistakes, but they sound like something else: a prescription for democracy.

That’s not an accident. Although we don’t normally think of it in these terms, democratic governance represents another method ... for accepting the existence of error in trying to curtail its more dangerous incarnations.
She notes that the traditional response to mistakes was the same evasion, obfuscation, and denial we so often see in government officials. Most important, she notes that medical students learn from their teachers and supervisors how to conceal errors, and that this behavior is rewarded. Ms. Berlinger is quoted as saying, “They learn how to talk about unanticipated outcomes until a ‘mistake’ morphs into a ‘complication.’ Above all, they learn not to tell the patient anything.” They also learn how to justify their habit of nondisclosure.
This is just what happens in government.
Some hospitals now require explanations and apologies as soon as possible, even going so far as to e-mail the entire staff and send out a press release about incidents such as botched surgeries. Schulz sagely points out, “If you want to try to eradicate error, you have to start by assuming that it is inevitable.” And you have to learn not how to conceal or justify it, but how to apologize for it and turn it into a learning exercise.
It is important not only to admit you can make a mistake, but to figure out what caused you to make particular mistakes. Saying "I took my eye off the ball" or "I screwed up" is not enough to prevent the same mistake from happening again. It's important to talk to others, because one of the essential things about being wrong is that, while it is hard for us to believe we're wrong, it's very easy for others to see how wrong we are.
An important part of preventing mistakes is to create a culture of openness and honesty. Since we are unlikely to report our own mistakes, it is important to encourage others to do so, and to protect them from punishment when they do. In the government ethics context, this means not only open discussions of the ethical aspects of situations, with someone playing devil's advocate so that criticism is not taken personally, but also an independent system of ethics advice that higher officials actively encourage everyone to use, and an independent enforcement mechanism that is as open as any other part of the process.
Accountability
Another aspect of government ethics upon which Schulz sheds some light is the fact that negligence, doing something wrong without meaning to, is its touchstone. This is one of the principal differences between government ethics and criminal law. Schulz notes that, because our mistakes are unforeseeable to us, “we seldom feel that we should be held accountable for them.” If we didn't know we were making them, why should we have to pay for them? In effect, we want no-fault government ethics.
She also notes that denial too involves an ethical dilemma: “should we or should we not be held accountable for refusing to admit that we are wrong?” She looks at prosecutors who are wrong, and yet deny it. She notes that “people who have signed on to serve the cause of justice generally see themselves, not unreasonably, as being on the side of the angels.” They do not feel they should be held accountable for their mistakes, because they were done for a good cause.
All sorts of local government officials feel this way. They have sacrificed their careers for their communities, they feel, and should not be held accountable when they help others, and themselves.
Government ethics is all about accountability. Clearly, making a mistake is not as bad as doing something wrong intentionally. But because crimes such as bribery are so hard to prove, only a tiny percentage of the government officials who do take bribes are ever held accountable. Accountability is far greater in government ethics, but officials do not go to prison, and in few cases are their positions even jeopardized. They are, however, held accountable for conduct that might be a mistake or might be intentional. The difference from criminal law is that, in government ethics proceedings, intent does not have to be proved.
It's Really About Appearances
One of the reasons we are wrong so often in a government ethics context is what Schulz calls the ’Cuz It’s True Constraint. We can't believe that there are selfish reasons for our believing what we do. For example, we cannot believe that we vote for a grant because our brother is the director of the charity it is going to. We believe we vote for the grant because it's going to a cause that is good for the community. And yet most neutral people will believe we voted for the grant because of our brother.
In fact, we would believe the same thing about someone else. “[W]e impute biased and self-serving motives to other people’s beliefs all the time. And, significantly, we almost always do so pejoratively. … Psychologists refer to this asymmetry as ‘the bias blind spot.’”
This asymmetry is produced by the fact that we can look into our own minds, but not anyone else’s, so that we “draw conclusions about other people’s biases based on external appearances — on whether their beliefs seem to serve their interests — whereas we draw conclusions about our own biases based on introspection. … Our conclusions about our own biases are almost always exculpatory. At most, we might acknowledge the existence of factors that could have prejudiced us, while determining that, in the end, they did not. Unsurprisingly, this method of assessing bias is singularly unconvincing to anyone but ourselves.”
And yet we stick with it ’Cuz It's True, no matter what anyone thinks.
But government is heavily dependent on what other people think, how they perceive appearances. Appearances are, for the public, all there is.
Officials recognize this. For them, how they appear is extremely important, and they spend a great deal of time attempting to guide the public's perceptions. Why should this not be equally true of dealing responsibly with conflicts? But it's not. The public's perceptions come into play not in how officials deal with a conflict, but only when they try to justify their conduct, to look good, to save face.
Being Wrong Is a Learning Experience
Being wrong does not have to be seen as a negative emotional experience that undermines who you are and how you are perceived. As Schulz notes, being wrong is how a child learns. It's also how scientists learn. And it's not just that we learn from our mistakes. Those around us also learn from our mistakes. If our community's leaders stand strong on their rightness, even when they are wrong, especially about ethical conduct, no one learns anything but bad habits, and the public's trust is undermined. Don't local government officials have an obligation not only to lead, but to teach, to help their communities grow not just economically, but in other more abstract but equally important ways?
Some Good Quotations
Being Wrong deals with many, many more aspects of wrongness. It's one of the best books I've ever read on any topic. You would be wronging yourself not to read it. I will leave you with a few thought-provoking quotations from the book:
On denial in politics: “The arena of politics ... is to denial what a greenhouse is to an orchid: it grows uncommonly big and colorful there.”
On cutting our losses: “[W]e are quasi-rational actors, in whom reason is forever sharing the stage with ego and hope and stubbornness and loathing and loyalty. The upshot is that we are woefully bad at cutting our losses.”
On the emotional aspects of being wrong: “Our capacity to tolerate error depends on our capacity to tolerate emotion. ... if we can’t do the emotional work of accepting our mistakes, we can’t do the conceptual work of figuring out where, how, and why we make them.”
On certainty, something we usually value in our political leaders: "Our sense of certainty is kindled by the feeling of knowing — that inner sensation that something just is, with all the solidity and self-evidence suggested by that most basic of verbs. Viewed in some lights, in fact, the idea of knowledge and the idea of certainty seem indistinguishable. But to most of us, certainty suggests something bigger and more forceful than knowledge. ... [But] knowledge is a bankrupt category and ... the feeling of knowing is not a reliable indicator of accuracy. We have seen that our senses can fail us, our minds mislead us, our communities blind us. And we have seen, too, that certainty can be a moral catastrophe waiting to happen.
"Moreover, we often recoil from the certainty of others even when they aren’t using it to excuse injustice and violence. The certainty of those with whom we disagree … never looks justified to us, and frequently looks odious. … By contrast, we experience our own certainty as simply a side effect of our rightness, justifiable because our cause is just. … We cannot imagine, or do not care, that our own certainty, when seen from the outside, must look just as unbecoming and ill-grounded as the certainty we abhor in others.
"This is one of the most defining and dangerous characteristics of certainty: it is toxic to a shift in a perspective. If imagination is what enables us to conceive of and enjoy stories other than our own, and if empathy is the act of taking other people’s stories seriously, certainty deadens or destroys both qualities. On our relationship to evidence: “Ignorance isn’t necessarily a vacuum waiting to be filled; just as often, it is a wall, actively maintained. … The facts might contradict our own beliefs, not those of our adversaries. Alternatively, the facts might be sufficiently ambiguous to support multiple interpretations. ... We think the evidence is on our side. It is almost impossible to overstate the centrality of that conviction to everything this book is about.” and "Evidence is almost invariably a political, social, and moral issue … If we want to improve our relationship to evidence, we must take a more active role in how we think — must, in a sense, take the reins of our own minds. To do this, we must query and speak and investigate and open our eyes. Specifically, and crucially, we must learn to actively combat our inductive biases: to deliberately seek out evidence that challenges our beliefs, and to take seriously such evidence when we come across it.”
On our ability to justify our beliefs and actions: “[I]f we want to eat dinner rather than be dinner, we are well served by a process so rapid and automatic that we don’t need to waste time deliberately engaging it. As with our perceptual processes, this automatic theorizing generally careens into consciousness only when something goes wrong.” and “The creative ability to construct plausible-sounding responses and some ability to verify those responses seem to be separate in the human brain.”
François de La Rochefoucauld: “Everyone complains about their memories; no one complains about their judgment.”
Robert Wechsler
Director of Research-Retired, City Ethics
203-859-1959