Max Bazerman
Tuesday, May 19th, 2015
TRENDS IN ETHICS TRAINING
MAX BAZERMAN
Max is the Straus Professor at the Harvard Business School and the Co-Director of the Center for Public Leadership at the Harvard Kennedy School. He is also formally affiliated with the Harvard Kennedy School of Government, the Psychology Department at Harvard, and the Program on Negotiation.
(Recommendation: read Max's book, Blind Spots.) The following excerpts were chosen from Max's information on the Ethical Systems website.
IDEAS TO APPLY (Based on research covered below)
- Be more humble. To overcome your blind spots, you first need to realize that you are not as ethical as you think you are and that you won’t behave as ethically as you think you will. Research has demonstrated that thinking about the motivations that will be present at the time of the decision will increase the accuracy of your predictions.
- Examine and correct the reward systems that lead to ethical fading and motivated blindness.
- What are you really rewarding your employees to do?
- What are you rewarding them to see, and to not see?
- Are you rewarding sales revenue without concern for how those sales are obtained?
- Are you promoting individuals who contribute to the bottom line but lack integrity?
- Use data.
- What data are you collecting that would prove, or disprove, how ethical you and your organization are?
- What’s being done with the data?
- Is it being tracked and reported on, or getting shuffled into a dark drawer, never to be seen again?
- Set the stage for "psychological safety" when it comes to ethics.
How often are you publicly describing your own growth as an ethical person? If you are not willing to discuss the process of learning about your blind spots, no one else will be either. Amy Edmondson describes psychological safety as a team member’s belief that it is safe to take risks, such as discussing failures or confessing uncertainty. Set the stage for psychological safety about ethics in your organization by making it "something we talk about," not just something about rules and laws.
- Examine the language and euphemisms that hide ethics from the decision maker.
- Do you call it "earnings management" or "earnings manipulation"?
- Do you call your customers “clients” or “muppets”?
- Is your factory producing pollution or run-off?
- Similarly, people are more comfortable being described as cheating than as cheaters, or as lying than as liars; the language of verbs lets us off the hook by not forcing us to define what we are doing as representative of who we are. Nouns are about who we are.
Much of the research on blind spots in ethical decision making is based on the concepts of bounded ethicality and ethical fading.
- How might bounded ethicality affect ethical decision-making?
Bounded ethicality is rooted in psychologist Herbert Simon’s groundbreaking concept of bounded rationality, a framework that describes the systematic, predictable, and biased psychological processes that contribute to the gap between our true preferences and our behavior. Bounded ethicality refers to the systematic and predictable ways in which people engage in unethical acts without their own awareness that they are doing anything wrong. Chugh, Bazerman, and Banaji (2005; Banaji, Bazerman, and Chugh, 2003) argue that all of us engage in behaviors that are inconsistent with our actual ethical preferences. The goal of the bounded ethicality approach is not to preach to people about how we should behave, but rather to help raise us to the ethical level we would endorse upon greater reflection about our own behavior.
- How might ethical fading impact ethical decision-making?
Ethical fading refers to the process whereby ethics are removed from the decisions we face, a process that contributes to bounded ethicality. When I don’t see the ethical implications of a decision, ethical considerations aren’t part of the decision criteria. As a result, I behave unethically and in a way that is against my values, and yet I am not aware of this inconsistency. Ethical fading can be caused by situational factors, which can change the type of decision that the decision maker believes they are making. For example, one study found that introducing a compliance system designed to reduce undesirable behavior reduced the likelihood that people saw a question as an ethical decision; it made it easier for them to see it purely as a business decision. It’s as though the introduction of the compliance system allowed decision makers to off-load their moral responsibility. Undesirable behaviors became more frequent, not less.
- Why don’t we recognize our unethical behavior?
One reason is that our unethical actions are hidden between predictions that we will behave ethically and recollections that we behaved more ethically than we really did. In other words, when we are predicting how we will behave, we believe that we will behave in line with our “should” self and act ethically; however, at the time of the decision our “want” self wins out and we behave unethically. Yet when we recall that behavior, we see it through our “should” self and believe we behaved more ethically than we actually did. As a result, our long record of unethical behavior remains hidden to us.
- One of the best examples of prediction errors comes from a study by Woodzicka and LaFrance, who asked college-aged women to predict how they would respond to the following questions if they were asked during an interview: “Do you have a boyfriend?” “Do people find you desirable?” “Do you think it is important for women to wear bras to work?” While a majority of women predicted that they would refuse to answer these inappropriate and sexually harassing questions, their actual behavior was quite different: everyone in that situation actually answered the questions.
- When recalling actions we engaged in that did not fit our standard, we engage in “revisionary” ethics and revise the behavior so that our inflated image of our own ethicality remains intact. Revisionary ethics can take the form of biased attributions (“blame the other guy”) or of changing our perception of how wrong the behavior actually was. It has been found, for example, that when people are in an environment that allows them to cheat, they change their perception of how morally wrong cheating is. Similarly, research has found that the more tempted people are to behave unethically, the more likely they are to rationalize that behavior by believing that “everyone else is doing it.”
- Research demonstrates that we are blind not only to our own unethical behavior but also to the unethical behavior of others when it is not in our best interest to see it. This is known as motivated blindness. So if I am an auditor and hope for future auditing and consulting business from you, I will be less likely to see your unethical behavior than if such future rewards were not possible. Motivated blindness is more likely to occur when:
- the unethical behavior is committed by a third party or intermediary.
- the unethical behavior occurs gradually, with actions representing only small deviations from one’s standard.
- the unethical behavior is associated with good rather than bad outcomes.
- How does hypocrisy figure into decision-making?
There is a long line of research from Daniel Batson on hypocrisy, showing that people will make unethical choices in part by applying standards flexibly, until they can find a way to justify the outcome that they want for themselves. (See similar findings on our Cheating and Honesty page; people cheat up to the point that they can continue to believe that they are honest.)
- Why do people discriminate against categories of people without any intention to discriminate?