Blind Spots I — Unconscious Unethical Conduct
Monday, April 4th, 2011
Robert Wechsler
Although it is not a book about government ethics, Blind Spots: Why We Fail to Do What's Right and What to Do about It by Max H. Bazerman and Ann E. Tenbrunsel (Princeton University Press) is a must-read book for government ethics practitioners. This new book (it came out just a couple of weeks ago) incorporates a great deal of research in behavioral ethics to look (1) at what is going on in the minds and actions of those who act unethically but do not intend to act unethically, and (2) at what can be done to change their behavior for the better.
As with any book that touches on a field you know, there are not a lot of new revelations here. I have already encountered much of the behavior, reasoning, and justifications the authors describe. What this book does is collect, organize, analyze, and give names to what is familiar, so that we can more easily and profitably discuss unintentional unethical behavior, teach officials better, and set up better ethics programs. Over the next week or so, I will be writing about some of the concepts in this book that are most useful for government ethics.
The authors start from the premise that most unethical behavior is unintentional, and that we have numerous psychological obstacles that stand in the path of our acting ethically.
The "blind spots" referred to in the book's title involve psychological processes that prevent us from recognizing the biases that lead us to act unethically. There are also blind spots in an organization, consisting of leaders' failure to appreciate the fact that their employees often act inconsistently with their beliefs. Instead, leaders tend to believe that their employees' integrity will protect them from making unethical decisions.
Does Our Integrity Protect Us?
Just before I picked up this book, I was told by a state ethics commission staff member that a member of the ethics commission who had been accused of acting unethically (in the government ethics sense) was a man "of unquestioned integrity." I nearly hit the ceiling. It is common to hear government officials saying this about themselves, their colleagues, and their employees. But when a government ethics professional says this, it shows how powerful our blind spots really are.
When I asked him, in a reply e-mail, "Do you truly believe that someone 'of unquestioned integrity' cannot put his self-interest before the public interest, cannot let his anger get the best of him?" I received no response.
Effectively, this book provides the response. This government ethics professional probably does believe that integrity (whatever that is; it is an ambiguous term) is a protection against unethical behavior.
Bounded Awareness
The mechanism that prevents us from seeing what we need to see to make ethical decisions is what the authors call "bounded awareness." We tend to exclude important, relevant information from our decision-making by placing bounds around our definition of a problem. In effect, we put on blinders, like a racehorse. We narrow our concept of responsibility (e.g., to our boss rather than to the public). We focus on the instructions we are given, or on supporting a decision our supervisor or local legislators favor. We do not ask for neutral external input, and we reject those who differ with us as partisan or self-interested. We focus on meeting a deadline rather than seeking out more information and opinions. We limit ourselves to our functional boundaries, such as engineering, law, or finance. We give in to groupthink, that is, we seek or accept unanimity rather than consider alternatives. We act out of fear: fear of rejection, of being seen as a goody-goody, of the consequences of whistle-blowing, of threatening our job. We focus on the law rather than the ethics.
The blinders of bounded awareness form the first set of obstacles to good people acting ethically. The first steps in dealing with these blinders are to recognize that we wear them, to talk about them openly, and to run ideas and possible decisions by people we trust to give us an honest response that is not biased toward us and our colleagues. Then we have to remove the obstacles as best we can, hopefully but not necessarily with the cooperation of our colleagues.
These steps are very demanding. In a poor or even neutral ethics environment, they require us to go against the grain in many ways, to practically tip the boat over. It's difficult to take on more responsibility than is asked of us, to openly question our bosses' and colleagues' assumptions, to welcome differing opinions (especially from our supposed enemies), to argue for extending a deadline in order to make a more responsible decision, to take on group (often party) unanimity, to talk about ethics when others are talking about laws and results, and to overcome our many reasonable fears.
I'll discuss some possible ways of doing this in a future blog post. It's important to recognize, to start with, that we have to get beyond the belief in the prophylactic powers of an individual's integrity before we can even consider taking steps to deal with our blind spots. We need to recognize that we do act unethically without realizing it, that intent is not all that matters, and that to handle conflicts responsibly and professionally, we need to stare down our blind spots, as painful as it may sometimes be.
Robert Wechsler
Director of Research-Retired, City Ethics