Counteracting our biases

In an earlier post (one of a series by me, and subsequently by Ken as well), I suggested that an important part of any professional education should be a kind of counter-narrative, in which those who learn a profession are also made familiar with that profession's cognitive biases, with a view to lessening them in practice.

Nice to see this kind of thing is beginning to be taken seriously in management books. Actually, it might have been taken seriously before now; I wouldn't know, because I don't read management books. But I occasionally browse them, and it hasn't seemed too prominent in my browsing. In any event, Daniel Kahneman, Dan Lovallo and Olivier Sibony have a long article in the HBR ("Before You Make That Big Decision…") on the behavioural economics of business decision making.

It includes a useful checklist for when an organisation is making big decisions. Viz:

1. Is there any reason to suspect motivated errors, or errors driven by the self-interest of the recommending team?

2. Have the people making the recommendation fallen in love with it?

3. Were there dissenting opinions within the recommending team?

4. Could the diagnosis of the situation be overly influenced by salient analogies?

5. Have credible alternatives been considered?

6. If you had to make this decision again in a year, what information would you want, and can you get more of it now?

7. Do you know where the numbers came from?

8. Can you see a halo effect?

9. Are the people making the recommendation overly attached to past decisions?

10. Is the base case overly optimistic?

11. Is the worst case bad enough?

12. Is the recommending team overly cautious?

Preliminary Questions: Ask yourself

1. Check for Self-interested Biases

Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?

Review the proposal with extra care, especially for overoptimism.

2. Check for the Affect Heuristic

Has the team fallen in love with its proposal?

Rigorously apply all the quality controls on the checklist.

3. Check for Groupthink

Were there dissenting opinions within the team?

Were they explored adequately?

Solicit dissenting views, discreetly if necessary.

Challenge Questions: Ask the recommenders

4. Check for Saliency Bias

Could the diagnosis be overly influenced by an analogy to a memorable success?

Ask for more analogies, and rigorously analyze their similarity to the current situation.

5. Check for Confirmation Bias

Are credible alternatives included along with the recommendation?

Request additional options.

6. Check for Availability Bias

If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now?

Use checklists of the data needed for each kind of decision.

7. Check for Anchoring Bias

Do you know where the numbers came from? Can there be unsubstantiated numbers, extrapolation from history, or a motivation to use a certain anchor?

Reanchor with figures generated by other models or benchmarks, and request new analysis.

8. Check for Halo Effect

Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?

Eliminate false inferences, and ask the team to seek additional comparable examples.

9. Check for Sunk-Cost Fallacy, Endowment Effect

Are the recommenders overly attached to a history of past decisions?

Consider the issue as if you were a new CEO.

Evaluation Questions: Ask about the proposal

10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect

Is the base case overly optimistic?

Have the team build a case taking an outside view; use war games.

11. Check for Disaster Neglect

Is the worst case bad enough?

Have the team conduct a premortem: Imagine that the worst has happened, and develop a story about the causes.

12. Check for Loss Aversion

Is the recommending team overly cautious?
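For anyone who wants to turn this into a working tool, here is a minimal sketch (in Python, purely my own illustration rather than anything from the HBR article) that captures the twelve questions as a data structure and prints a blank review template for a given decision. The grouping and wording follow the checklist above; the structure, function name and output format are just assumptions.

```python
# Illustrative sketch only: the twelve checklist questions as a data
# structure, so a blank review template can be printed for each big decision.
# The wording follows the checklist in the post; everything else is assumed.

CHECKLIST = {
    "Preliminary questions (ask yourself)": [
        ("Self-interested biases",
         "Is there any reason to suspect errors motivated by self-interest?"),
        ("Affect heuristic",
         "Has the team fallen in love with its proposal?"),
        ("Groupthink",
         "Were there dissenting opinions, and were they explored adequately?"),
    ],
    "Challenge questions (ask the recommenders)": [
        ("Saliency bias",
         "Could the diagnosis be overly influenced by a memorable analogy?"),
        ("Confirmation bias",
         "Are credible alternatives included along with the recommendation?"),
        ("Availability bias",
         "What information would you want in a year, and can you get it now?"),
        ("Anchoring bias",
         "Do you know where the numbers came from?"),
        ("Halo effect",
         "Is success in one area being assumed to carry over to another?"),
        ("Sunk-cost fallacy / endowment effect",
         "Are the recommenders overly attached to past decisions?"),
    ],
    "Evaluation questions (ask about the proposal)": [
        ("Overconfidence / planning fallacy",
         "Is the base case overly optimistic?"),
        ("Disaster neglect",
         "Is the worst case bad enough?"),
        ("Loss aversion",
         "Is the recommending team overly cautious?"),
    ],
}


def print_review_template() -> None:
    """Print the checklist as a blank review form for a specific decision."""
    number = 1
    for stage, items in CHECKLIST.items():
        print(f"\n{stage}")
        for bias, question in items:
            print(f"  {number}. [{bias}] {question}")
            print("     Notes: ____________________________________")
            number += 1


if __name__ == "__main__":
    print_review_template()
```

Running it simply prints the three stages with numbered questions and space for notes; the only point is that the checklist is mechanical enough to be embedded in whatever template an organisation already uses for decision papers.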

12 Comments
Senexx
13 years ago

That’s pretty good. It reminds me of the way I was taught Ethics via Kallman & Grillo.

Charles
13 years ago

In my experience, in both Government and private sectors, so much of this rings true – particularly over-optimism and the attachment to past decisions.

This sort of stuff was never raised when I did my MBA in the 1990s, and in all my time it has been resented by more senior staff when raised.

Government, however prone to fallacious decision-making, is forced to accept that a multiplicity of views prevails in the community, and that these therefore need to be at least acknowledged in decisions.

Business leaders are far more prone to tunnel vision predicated on commercial biases (understandably enough) and tend to be openly hostile to any attempt to add some intellectual rigour to their analyses.

Benjamin
13 years ago

I see this a lot in my office, where dissenting opinions are treated as a threat to management rather than explored.

Patrick
13 years ago

My work encouraged us to read about half-a-dozen articles by Kahneman and similar… but I don't think many people take it very seriously :(
