Ethical blindness describes the risk that, over time and under the pressure of their context, individuals lose the ability to see that what they do is wrong. It matters because it is the driving force behind big scandals. We all know why bad people do bad things. But we will never understand large-scale systematic and systemic cases of immoral and illegal behaviour unless we understand why, and under what conditions, good people are vulnerable to it. The fascinating question is why you and I, under certain circumstances, might have done what managers did at Volkswagen, Boeing, Purdue or Wells Fargo.
The various cases of corporate fraud - Enron, WorldCom, the 2008 financial crisis, etc. - lead to a fundamental question: is dishonesty largely restricted to a few bad apples, or is it a more widespread problem? If the problem is not confined to a few outliers, then anyone could behave dishonestly at work and at home - you and I included. This is the question Dan Ariely seeks to answer in The Honest Truth about Dishonesty. He shows that the rational cost-benefit forces that are presumed to drive dishonest behaviour often do not, and the irrational forces that we think don't matter often do.
In rational economics, the prevailing notion of cheating comes from the economist Gary Becker, a Nobel laureate who suggested that people commit crimes based on a rational analysis of each situation. He noted that in weighing the costs versus the benefits, there was no place for consideration of right or wrong; it was simply about the comparison of possible positive and negative outcomes. According to this model, we all seek our own advantage as we make our way through the world. Whether we do this by robbing banks or writing books is inconsequential to our rational calculations of costs and benefits.
In The Honest Truth about Dishonesty, Dan Ariely writes that if this theory is correct, there are two obvious responses: a) increase the probability of being caught (by hiring more police officers and installing more surveillance cameras, for example); and b) increase the magnitude of punishment for people who get caught (for example, steeper prison sentences and fines). This is the model generally followed and accepted by policy-makers and the public.
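Becker's calculus reduces to a one-line expected-value comparison, which also makes the two policy levers above explicit. The sketch below is only an illustration of that logic; the function name and numbers are hypothetical, not from Becker or Ariely:

```python
def rational_choice_to_cheat(gain, p_caught, penalty):
    """Becker-style rational model of crime: cheat whenever the expected
    benefit exceeds the expected cost of getting caught. Note that the
    model has no term for conscience, right and wrong, or self-image."""
    expected_cost = p_caught * penalty
    return gain > expected_cost

# Raising either the detection probability or the penalty raises the
# expected cost - which is exactly what the two policy responses do.
print(rational_choice_to_cheat(gain=100, p_caught=0.05, penalty=500))   # True  (100 > 25)
print(rational_choice_to_cheat(gain=100, p_caught=0.05, penalty=5000))  # False (100 < 250)
```

Ariely's point in what follows is that real behaviour does not track this comparison: people cheat far less than the model predicts when the expected cost is low, and small situational forces outside the formula matter far more.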
Ariely’s experiments suggest that we don’t cheat and steal as much as we would if we were perfectly rational and acted only in our own self-interest. Cheating is not necessarily due to one guy doing a cost-benefit analysis and stealing a lot of money. Instead, it is more often an outcome of many people who quietly justify taking a little bit of cash or a little bit of merchandise over and over. Essentially, we cheat up to the level that allows us to retain our self-image as reasonably honest individuals.
Our behaviour is driven by two opposing motivations. On the one hand, we want to view ourselves as honest, honourable people. We want to be able to look at ourselves in the mirror and feel good about ourselves (psychologists call this ego motivation). On the other hand, we want to benefit from cheating and get as much money as possible (this is the standard financial motivation). Clearly these two motivations are in conflict. How can we secure the benefits of cheating and at the same time still view ourselves as honest, wonderful people?
This is where our amazing cognitive flexibility comes into play. Thanks to this human skill, as long as we cheat by only a little bit, we can benefit from cheating and still view ourselves as marvellous human beings. This balancing act is the process of rationalisation, and it is the basis of what Ariely calls the “fudge factor theory.” All of us continuously try to identify the line where we can benefit from dishonesty without damaging our own self-image. The question is: where is the line?
The fudge factor suggests that if we want to take a bite out of crime, we need to change the way in which we are able to rationalise our actions. When our ability to rationalise our selfish desires increases, so does our fudge factor, making us more comfortable with our own misbehaviour and cheating. The reverse is true as well: when our ability to rationalise our actions is reduced, our fudge factor shrinks, making us less comfortable with misbehaving and cheating. Various environmental forces increase and decrease honesty in our daily lives, including conflicts of interest, counterfeits, pledges, and simply being tired.
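The mechanism can be caricatured as a cap on cheating: the gain on offer sets the temptation, but self-image sets the ceiling, and rationalisation raises or lowers that ceiling. A toy sketch under that reading (the names and numbers are hypothetical, not Ariely's model):

```python
def cheating_level(temptation, fudge_factor):
    """Fudge-factor sketch: we cheat only up to the level that still lets
    us view ourselves as honest, no matter how large the potential gain.
    Contrast this with the Becker model, where cheating tracks the gain."""
    return min(temptation, fudge_factor)

# Same temptation, different room to rationalise: more rationalisation
# (a larger fudge factor) means more cheating, and vice versa.
print(cheating_level(temptation=100, fudge_factor=10))  # 10
print(cheating_level(temptation=100, fudge_factor=3))   # 3
```

The environmental forces the paragraph lists (conflicts of interest, counterfeits, pledges, fatigue) would act on the ceiling, not on the temptation.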
In Ariely's experiments, increasing the psychological distance between a dishonest act and its consequences increased the fudge factor, and participants cheated more. People are more apt to be dishonest in the presence of non-monetary objects than actual money - you are more likely to take paper or a pencil from the office than money from the petty cash box. When you take money, you can't help but think you're stealing. When you take a pencil, there are all kinds of stories you can tell yourself. You can say this is something everybody does. Or, if I take a pencil home, it's actually good for work because I can work more.
The situation changes our ability to rationalise. When rationalisation increases - for example, when we tell ourselves that everybody does it, or that we are doing it for a good cause - we cheat to a higher degree. Non-monetary exchanges give people greater psychological latitude to cheat, leading to crimes that go well beyond pilfered pens: backdated stock options, falsified financial reports, and crony deals. Ariely writes:
‘From all the research I have done over the years, the idea that worries me the most is that the more cashless our society becomes, the more our moral compass slips. If being just one step removed from money can increase cheating to such a degree, just imagine what can happen as we become an increasingly cashless society.
Could it be that stealing a credit card number is much less difficult from a moral perspective than stealing cash from someone’s wallet? Of course, digital money (such as a debit or credit card) has many advantages, but it might also separate us from the reality of our actions to some degree.
If being one step removed from money liberates people from their moral shackles, what will happen as more and more banking is done online? What will happen to our personal and social morality as financial products become more obscure and less recognisably related to money (think, for example, about stock options, derivatives, and credit default swaps)?'
Again, the idea is that once we are distanced from money, it is easier for us to feel that we are honest while nevertheless being dishonest. Moreover, you aren't dealing with real cash; you are only playing with numbers that are many steps removed from it. Their abstractness allows you to view your actions more as a game, not as something that actually affects people's homes, livelihoods, and retirement accounts. As Auden said in Letter to Lord Byron, ‘Today, thank God, we’ve got no snobbish feeling / Against the more efficient modes of stealing.’
The creation on Wall Street of mortgage-backed securities made it harder to be a good person. When not being such a nice guy suddenly becomes more profitable, it is harder to resist temptation. People didn't get greedier; the gains from dishonesty rose, so we got more dishonesty. And when you're surrounded by people who all think the same way, it's very hard to think differently. Every time you reward someone's dishonesty, you encourage others to do the same.