Many admire John Kennedy and his advisers’ deft handling of the Cuban missile crisis, generally regarded as the product of some of the best-judged decisions of the era. Yet a year earlier, much the same group of people had decided to support the Bay of Pigs invasion (a crackpot scheme for the invasion of Cuba in which the US pitted 1,600 men against 200,000), generally regarded as one of the most idiotic.
What accounts for the difference? It is simple. When reaching one decision, those involved argued endlessly about the various options. When reaching the other, there was only one option on the table and they all agreed on it. The point is this: they bickered when they made the good decision, and they all agreed when they made the bad one. Irving Janis, a research psychologist at Yale, famously dubbed the herd effect that led Kennedy to endorse the Bay of Pigs invasion ‘groupthink’. Call it what you like, lawyers seem particularly vulnerable to it. Whenever I lecture on the subject, someone usually approaches me afterwards still trying to come to terms with some tale of collective legal idiocy, typically involving a mediation, a piece of litigation or firm management.
See if you recognise the features Janis identified as fostering groupthink. You need a group of people who secretly want each other’s respect. Intellectually able people are fine: the phenomenon has nothing to do with intelligence. They just need to be interested in their status in the group. A socially polished leader whom nobody likes to challenge also helps. Often, all concerned take comfort in their safety in numbers and the decision is gently arrived at ‘in a curious atmosphere of consensus’ (as one of Kennedy’s advisers put it).
But the process is not always gentle. If necessary, you can add a few management dark arts to keep those lemmings cliff-bound. ‘Outgrouping’ is a common one: alienating those who rock the boat. Careful deployment of the word ‘we’ at meetings is important here, to make the dissenter feel out on a limb, eg: ‘I don’t think any of us are saying that, Sarah. I think what we are all saying is…’. A public vote or show of hands after the leader has expressed their view is also an excellent way of getting a group to go along with something its members would not otherwise agree to.
Generally speaking, decisions reached in this way tend to be more extreme than those the individual participants would make on their own: either absurdly cautious or – more usually – unjustifiably hazardous (the so-called ‘risky shift’). This comes about because the group feels it is invulnerable. Janis also noticed an assumption of collective moral invincibility. According to many accounts, Enron executives indulged in continual reinforcement of their own supposed intellectual superiority, a collective arrogance reflected in the title of the best-selling 2004 book on the company’s collapse, Enron: The Smartest Guys in the Room. The group will assume it is behaving ethically when those outside it would not agree. Conversely, it will imbue outsiders with negative attributes – stupidity, bias or obstructiveness.
Many frauds are the consequence of groupthink (which is how I came to be interested in it). Either the perpetrators believe they are behaving morally or, like the victims, they engage in collective denial of the risk they are taking. Well-known corporate misfortunes are said to have been the result of a similar collective overconfidence. Janis identified the American involvement in Vietnam and the Watergate scandal as consequences of the same effect.
When Kennedy came to his senses after the Bay of Pigs fiasco, he was unable to understand how he had agreed to something so profoundly wrong-headed. The best explanation that one of his advisers came up with was that his natural impulse to object to such an obviously hare-brained scheme was ‘simply undone by the circumstances of the decision’. He might as well have said that he was ‘with his mates’. We need to lose a little snobbery here. Neither education nor intellect offers the slightest protection against the phenomenon.
So what can be done to avoid groupthink? Two commonly suggested ways involve separating people from viewpoints. These are: (a) appointing devil’s advocates when making important decisions; or (b) encouraging people to express their thinking before they have come to a view. To prevent feelings of invulnerability and the so-called ‘planning fallacy’, research psychologist Gary Klein advocates encouraging insight by asking decision-makers to write ‘future histories’ – brief accounts of the decision they are going to make prefixed with the words: ‘It all went horribly wrong because…’
But these are all first aid for groupthink, not treatment. A famous series of experiments by Solomon Asch in the 1950s asked a group of people to judge which of three bars was the longest. As in many experiments of that era, all but one of the group were stooges, under instructions all to select the same bar, one that was clearly not the longest. As most people know, the experimental subjects went with the (obviously wrong) majority decision at least once. But there was an equally important finding. When one non-conformist stooge disagreed with the others – even when he chose another bar that also obviously wasn’t the longest – the experimental subjects were happy to give the right answer. Troublesome objectors, even when they are in the wrong, can perform a useful service.
Conformists might be comfortable company, but they are not team players. At best they dilute responsibility for a decision without improving it. And sometimes, as Kennedy discovered, they quietly help make disasters happen. Minority dissent can be an irritation. But it does at least show that disagreement is possible. A telling measure of an organisation is how respectfully it is dealt with. If we could engender the self-confidence to value constructive disagreement we would see better decision-making all round.
Robert Hunter is head of the trust, asset-tracing and fraud group at Herbert Smith Freehills