Mental Shortcuts - Good and Bad (Consumer Understanding #5)

Mar 14 2011

Homo Economicus or Homo Sapiens?

We have seen that the majority of human behaviour is controlled outside consciousness.  This contrasts with the classical models of economics, which assume a model of man as Homo Economicus: entirely rational, always knowing what (s)he wants, and capable of calculating the precise consequences of any action.  We all know this is nonsense, and recent economic events have once again highlighted the inadequacy of such models, both for economics and further afield.

Homo Sapiens is far more human, as psychologists such as Daniel Kahneman and Amos Tversky have shown repeatedly.  Emotion often overrides our self-control, and we often don’t know what we want.  We are unable to calculate consequences precisely, and in fact we are influenced much more heavily by losses than by gains.  We are sometimes irrational and incompetent, and often inconsistent in our decision making.  This applies to financial analysts watching multiple computer screens just as much as to you and me.

Can we have too many choices?

So we are not always good at using information to determine the consequences of our actions.  In fact, more information can make us less informed (and less happy) about the choices we make.  Barry Schwartz has written about how the modern consumer faces more choices than any other group of people in history.  But this greater freedom and autonomy has brought no psychological benefit, and in many cases we are less happy than before.  Moreover, too much choice sometimes leads the modern consumer to make the easiest decision of all: to choose nothing!

Imagine an intergalactic alien landing a spaceship outside Walmart (or equivalent).  (S)he is in a hurry and needs to quickly buy some toothpaste for his/her travel bag.  With no information on the different brands and varieties of toothpaste available, how would (s)he decide what to buy?  What advice would you or I give the alien?  Maybe (s)he could go for the most popular brand (most shelf space?), the one closest to the brand from planet Zog (same colour or name?), or the one nearest the entrance (time is short).

Is there a shortcut?

We all use mental shortcuts constantly, usually for very good reasons.  The upside of mental shortcuts is that they save us time and effort - remember that our brains are very energy conscious.  The downside is that they sometimes make us act “irrationally” (to borrow Dan Ariely’s term), and the field of behavioural economics has investigated many of the mental biases that influence our behaviour.  Looking across all this work, five key shortcuts (or biases, depending on your viewpoint) seem to be the most common.

One of the most common, and earliest studied, is loss aversion.  We all like to stick with the status quo, and losses can weigh twice as heavily on our decision making as gains.
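That “twice as heavily” intuition can be made concrete.  Here is a minimal sketch in Python, assuming the simple piecewise value function from Kahneman and Tversky’s prospect theory; the loss-aversion coefficient of 2.0 is an illustrative assumption matching the rule of thumb above, not a universal constant.

```python
def perceived_value(outcome, loss_aversion=2.0):
    """Perceived (psychological) value of a monetary gain or loss."""
    if outcome >= 0:
        return outcome                # gains count at face value
    return loss_aversion * outcome    # losses loom larger

# A fair coin flip: win $100 or lose $100.
gamble = 0.5 * perceived_value(100) + 0.5 * perceived_value(-100)
print(gamble)  # -50.0: an objectively fair bet feels like a losing one
```

This is why many people turn down even favourable gambles: the anticipated pain of the loss outweighs the anticipated pleasure of an equal gain.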

We often anchor on a single trait or piece of information - usually the first we encounter - and interpret later information relative to it.  Anchoring bias is closely related to framing bias, and also to priming, an implicit memory effect.  Framing is commonly used in setting price structures and menus (for example, in fast food restaurants).  It is also a common trick of estate agents to show you the best (or sometimes worst) house first, before the home that you finally choose.

We tend to be over-optimistic about our own abilities and about the future (generally a very good thing!).  Why else would we believe that house prices will keep going up forever?

We predict frequency based on how easily we can bring an example to mind, and availability bias is a common issue in branding and communication research.  Representativeness bias is similar, leading us to judge likelihood by how well something fits a mental stereotype rather than by its actual probability.

Lastly, we like to follow the crowd, and Herd mentality will be the subject of a later post in this series (#6).

Nudging in the right direction

Yesterday I watched The Adjustment Bureau at my local cinema, in which a group of bowler-hatted ‘angels’ (or perhaps devils) watch over humans, making small interventions when needed to nudge people in the right direction (and keep them to ‘the plan’).  In life we are often faced with difficult choices and incomplete information, sometimes leading us to make poor choices that are not in our long-term interests.

Some social scientists argue that governments and other institutions should leverage these mental shortcuts to encourage us to make better choices.  One of the best known examples is organ donation.  In some countries, organ donors must “opt in” and decide to carry the card, while in others those who do not wish to donate their organs must “opt out”.  Unsurprisingly, in opt-out countries more than 90% of the population become organ donors, whereas in opt-in countries the figure may not reach 10%.  Similarly, school canteens can place healthy meal options at eye level, and many corporations now simplify pension provision with default options for employees, which significantly increase uptake and help them choose an appropriate plan.

Good or bad?

Mental shortcuts can help us make good decisions faster and with less effort, which is generally a very good thing.  Organisations can also leverage these shortcuts to encourage particular choices, including brand choices (e.g. by framing price options).

In market research these shortcuts sometimes lead to bias, and we have to be particularly careful of anchoring and availability bias in designing question flows and wording (and herd mentality when we talk to participants in groups).

If you would like to learn more about this topic, contact us about our training courses here.


Judgment under Uncertainty: Heuristics and Biases by Daniel Kahneman, Paul Slovic and Amos Tversky (1982)

The Paradox of Choice: Why More is Less by Barry Schwartz (2005)

Nudge: Improving Decisions about Health, Wealth and Happiness by Richard Thaler and Cass Sunstein (2009)
