Stratecution Nugget: A single tactic increases success rates by 300%

IMAGE: “Drive Slow” sign

Did you know that in the vast arsenal of Behavioural Economics tactics to deal with our 200+ biases, one of them alone can increase your chance of success in execution by 300% – that is, by a factor of 4?

We are talking about small commitments as opposed to big bang implementation. We all have a tendency, once the whole strategy has been thought through, decided upon and written up in beautiful slides, to spring the whole thing on our organization.

And why should we not? After all, they deserve to know. We also need to get going. Plus, how can they execute correctly if they do not understand the full picture? All valid reasons. But if you want to succeed, research says it is not the way to go.

There are several pieces of research, but our favorite is a study from London, where families on two adjacent streets were asked to put up a big, ugly sign in their front yard reading “Drive Slow. Kids Playing”. On one street 20% of the houses put one up, but on the other street 80% agreed to do it. 20% vs 80% – a factor of 4!

Now, the streets were adjacent, and there were no timing or demographic differences. The difference was a little one: on the street with 80%, the families had been asked a week in advance to put up a small postcard with the same words in their window – who can say no to that? Once we have made even a small commitment, it is very difficult for us humans to back down when, a week later, we are asked to put up the bigger sign…

Did you like this blog article? Then contact us here, follow us or read some of the other articles on strategy and change at http://www.behaviouralstrategygroup.com.

 

Stratecution NEXT: Decision Strategy (5)


Before the holidays we started introducing core concepts from our book, Decision Strategy, which was named the 5th best management book of the year globally. This is part 5, so if you have missed the earlier parts, click the links below:

Part 5: If confidence is good, then overconfidence must be…

“All you need in life is ignorance and confidence, then success is sure”

Mark Twain

“The game is over”, said Mohammed al Douri, Iraq’s ambassador to the UN, in April 2003. After only three weeks of fighting, US forces occupied Baghdad, and Saddam Hussein was caught in a hole in the ground on the outskirts of Tikrit later that year. But while the invasion turned out to be a great success, the liberation of Iraq was anything but. In Washington, Saddam’s dictatorial regime was expected to crumble as soon as America set foot in the country, but the Shiites did not rise, the Sunnis fought fiercely, the Iraqi guerrilla war surprised the unprepared forces and there were no weapons of mass destruction. How could the US be so wrong at a time when the intelligence community was better equipped than ever?

It all started with a few neoconservatives – the most prominent among them Deputy Defense Secretary Paul Wolfowitz – who had long been convinced that expelling Saddam Hussein would pave the way for a grand reorganization of the Middle East, moving it away from tyranny and anti-Americanism and toward modernity and democracy. But maybe the most prevalent reason to attack Iraq was that it would be a courageous use of American power, mixing raw strength with idealism after the retreats of the 1990s and the terrorist attacks on September 11. A combination of ambiguous intelligence and a strong belief that one could easily ‘smoke the bad guys out’ underpinned rosy scenarios and poor planning – for example, the original game plan called for 500,000 soldiers but was cut down to 160,000 for 3 months in the final approach – and it took 8 years before the last soldiers were pulled out.

The reason we often enter hopeless wars lies in our tendency toward overconfidence. Although a more realistic assessment of the situation and the alternatives could lead to more peaceful solutions, our view is obscured by positive illusions, wishful thinking and overconfidence. Historical data show that before the start of a war, each side is typically convinced that it has more than a 50 percent chance of winning – which cannot be true for both sides at once, since the two probabilities can sum to no more than 100 percent!

Our tendency to overestimate our skills, the accuracy of our decisions or the value of our ideas is not limited to war. It is human and found across time, cultures and circumstances. Even the best laid business plans are often ruined by the annoying interference of reality. There is an abundance of examples of overconfidence in professional contexts, such as stock market bubbles, the number of new entrepreneurs setting up shop despite 90% failure rates and the many high profile acquisitions despite a high risk of failure.

But could overconfidence not be a good thing? Is it not our self-conceit that makes us rise to the challenge and pushes us to perform better than our perceived limits? The short answer is no. With the exception of some limited benefits in innovation processes, overconfidence is one of the biggest obstacles to good decision making.

Although self-conceit can be fatal to decision-making processes, it is almost ironic how easy it is to correct. Churchill once said that it is important to remember that no matter how certain you feel about victory, there would be no war if the other side did not also think it had a good chance. If only the Americans had remembered to ask themselves a few perspective-changing “what if” questions, they could have avoided some of the biggest challenges. For example, why did Saddam Hussein dare risk a war? With the loss of the Gulf War in 1991 and a widening gap between Iraqi and US forces, the outcome of the invasion was a given – but the outcome of the war was not.

When discussing overconfidence there are three different scenarios: (1) overestimation of our actual performance, (2) overplacement of our performance relative to others and (3) overprecision, that is, exaggerated confidence in the accuracy of our beliefs:

Overestimation – the reason projects are late and over budget

You are preparing slides for a last-minute presentation. It is the night before, but you do not worry much, because you are an experienced speaker. Although it is a new topic, you got slides from the previous lecturer and you usually get good feedback. When you take the stage, you realize that your lecture is completely off target: you were not prepared for the participants’ prior knowledge of the subject, you have not quite understood all the points in the slides you were given, and 45 agonizing minutes seem like a lifetime.

We tend to believe that we are better across a wide range of domains than we actually are, that we have more control over situations than we actually have and that we can plan things out in great detail. This planning fallacy is the reason projects come in late and over budget.

This is also yet another reason that bonuses can be difficult to use as a motivator: you may receive a bonus of USD 10,000, but you may have expected 5,000 or 15,000, and the same amount feels very different in the two cases. The easiest way to exceed our expectations is to reduce them, and we have therefore developed defensive pessimism: while we start a year or a project out optimistic, we shield ourselves from disappointment towards the end through pessimistic assessments of ourselves and our possibilities, because it hurts extra much when our inflated beliefs meet reality.

Overplacement – the reason we enter hopeless projects

93% of motorists believe that they are better than average and 25% of students consider themselves in the top 1%. Besides making performance appraisals difficult to agree on between managers and employees, this means that entrepreneurs will go into markets where their objective possibilities for success are limited, or that we will continue a hopeless lawsuit at high cost.

Interestingly, recent research also documents our tendency to believe that we are performing worse than others when it comes to very difficult tasks. Both extremes are problematic. While overplacement may lead us to throw money at bad projects, underplacement may prevent us from pursuing great ones – whether in business or in life.

Overplacement can create some nasty surprises when we suddenly face reality. For example, highly intelligent people joining Ivy League schools often drop out when they finally meet real competition. We simply tend to focus on ourselves instead of comparing ourselves with the particular group we belong to – or we think of our own team as more competent than average. That would be okay, except that the other talented teams affect our own team’s opportunities. In the business world this leads to a lack of understanding of the market and competitors.

Overprecision – never ask experts to predict the future

Throughout history, experts’ overly precise estimates have proven wrong again and again. For example, the neoclassical economist Irving Fisher, who became famous for saying, shortly before the Wall Street crash of 1929, that stock prices seemed to have reached a permanently high level. Or Harry Morris Warner, one of the founders of Warner Bros, who in 1927 rhetorically asked who on earth wanted to hear actors talk. Or Thomas Watson, the longtime head of IBM, who in 1943 predicted that there might be a world market for five computers.

Of course everyone can make mistakes in areas where they have limited knowledge, but the question is whether expertise buys us more precision. Studies show that in areas where we have prior knowledge or expertise, we do get closer to the correct answer. The problem is that as experts we also become more cocksure about our ability to predict the future and therefore define overly narrow confidence intervals (i.e. how sure we are of our estimate), so we still miss the target – our one lone estimate simply has a tendency to be wrong.
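To make overprecision concrete, here is a minimal simulation sketch – our own illustration with made-up numbers, not from the book. An expert whose “90% confidence interval” is only half as wide as a well-calibrated one captures the true value far less often than claimed:

```python
import random

# Hypothetical setup: estimation errors are normally distributed with a
# standard deviation of 10. A well-calibrated 90% interval is roughly
# +/- 1.64 standard deviations; the overprecise expert halves that width.
random.seed(42)

TRUE_SD = 10.0
CALIBRATED_HALF_WIDTH = 1.64 * TRUE_SD              # yields ~90% hit rate
OVERPRECISE_HALF_WIDTH = CALIBRATED_HALF_WIDTH / 2  # the cocksure expert

def hit_rate(half_width: float, trials: int = 100_000) -> float:
    """Share of trials where the true value falls inside the stated interval."""
    hits = 0
    for _ in range(trials):
        error = random.gauss(0, TRUE_SD)  # estimate minus true value
        if abs(error) <= half_width:
            hits += 1
    return hits / trials

print(f"Calibrated 90% interval hit rate:    {hit_rate(CALIBRATED_HALF_WIDTH):.0%}")
print(f"Overprecise '90%' interval hit rate: {hit_rate(OVERPRECISE_HALF_WIDTH):.0%}")
```

Running this shows the calibrated interval containing the truth about 90% of the time, while the overprecise “90%” interval manages only about 59% – the expert is not wrong to be knowledgeable, just wrong about how sure to be.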

The problem is further cemented by human interaction patterns. For example, few voters will choose a politician with the slogan: “I think it’s this way, but I’m not sure.” We find confident people more persuasive and competent, and we reward them with influential positions. When confidence and capability are positively correlated, that makes sense. But people quickly learn that to succeed, they must adopt a ‘fake it till you make it’ approach.

6S Model – bring in the devil’s advocate

Overconfidence is not the hardest bias to work with. Often we just need to consider worst-case scenarios and adverse information, or adjust our project estimates by a standard percentage, to minimize the effect. Here are a few examples from the 6S parameters of Strategy, Structure, Steps, Systems, Skills and Style:

In Strategy you might, towards the end of the process, deploy Gary Klein’s “premortem” technique, where you imagine that the project has failed and you are analyzing the reasons behind the failure. It is a clever way to invite in the devil’s advocate without strong opposition, and it is great at uncovering the biases and challenges that may threaten your strategy or project.

In Systems, overoptimistic organizations often overestimate their abilities and underestimate how long something takes. You should always start with detailed plans to increase realism, but make sure to build in flexibility around those very precise estimates, and consider whether the decisions in the system should be “opt in”, where you must actively choose a direction, rather than “opt out”, where you can just let system 1 press the accept button without reflecting on the consequences.

This was part 5 on our book, Decision Strategy. Next week we will look at the power of loss aversion and the crazy lengths we will go to in order to avoid it – stay tuned!

Did you like this blog article? Then contact us here, follow us or read some of the other articles on strategy and change at http://www.behaviouralstrategygroup.com.

Stratecution NEXT: Decision Strategy (4)


This is part 4, where we introduce core concepts from our book, Decision Strategy, which received 5 stars and was named the 5th best management book of the year globally. If you have missed the earlier parts, click the links below:

Decision Strategy part 1

Decision Strategy part 2

Decision Strategy part 3

Part 4: When confirmation bias locks you on to a dangerous path

“We don’t believe the world we see; we see the world we believe”

The Matrix

On Monday, September 23, 2013, CEO Thorsten Heins faced one of the biggest decisions of his professional life: should he end several years of financial struggles and sell BlackBerry to the Canadian holding company Fairfax Financial? A special task force had examined the strategic alternatives, and the best course of action seemed to be to sell BlackBerry at a price of USD 4.7 billion, about 3 percent above the closing share price on Friday. If the deal was accepted, BlackBerry would become a private company away from Wall Street pressure, but four days before the completion of the due diligence, Heins was fired and the new CEO, John S Chen, instead raised a USD 1 billion cash injection. For a company that only a few years before had been the world’s leading smartphone company, with 41% of the US market, the highest company valuation in Canada and a ranking as the fastest growing company in the world, how could this happen?

BlackBerry’s decline is of course a perfect case study in what happens when a technology giant fails to innovate in a market that is evolving at breathtaking speed. Amid the success, investors were warning about the increasing competition from iOS and Android, but BlackBerry maintained its strategy. It was only when the iPhone in 2007 began to gain popularity and challenge the BlackBerry that reality hit. But the challenge is not exclusively reserved for fast-paced technology markets – in fact other markets may be even more prone to it: When you put a frog into hot water it immediately jumps out, but when you submerge it in water at normal temperature, it will stay even when you slowly heat it up (actually, this is not scientifically proven, but it is a great analogy).

The challenge is called confirmation bias: we believe what we see – and we see what we believe. We seek information that confirms our expectations and play down the aspects that are inconsistent with them. It works against innovation, because creative thinkers use information to re-evaluate ideas and avoid status-quo scenarios. Innovative thinking is costly and difficult because confirmation bias is part of our inner mechanics; it is all too easy for us to stop innovating and instead reproduce earlier successful ideas. This is the reason both that giants fall and that 90% of new ventures fail – we are only human after all.

Biased information retrieval

We test our hypotheses by searching for confirmatory information – consistent with the limited attention and cognitive processing capacity that compel us to seek information selectively. This is also one of the reasons that newspaper readership is often split by political leaning – voters prefer to confirm their positions instead of undermining them.

Confirmation bias does not only apply to long-held beliefs. Often it can relate to an attitude only just formed. For example, if we are about to meet someone for the first time and just before the meeting we hear a colleague describe the person as dishonest, we will immediately start looking for confirming information – and, surprise surprise, we find it.

Biased interpretation

Biased interpretation occurs when two people with exactly the same information draw different conclusions, each consistent with their previous beliefs. A 1979 Stanford experiment asked students to evaluate research on the effectiveness of the US death penalty, and both prior proponents AND opponents maintained their previous positions – in fact, they left the experiment even more convinced than before.

Biased memory

Even if we collect and interpret information in a neutral way, we tend to remember it in a way that strengthens our beliefs. Just considering a hypothesis means that the information stored in our memory consistent with that hypothesis becomes more accessible. Although we often have the feeling that we remember past events correctly – especially highly emotional events such as when we first held our baby in our arms, the day we were married or the moment we were told we had been promoted – tons of research shows that we cannot give an accurate description of the events.

6S Model

To minimize confirmation bias it is critical that your organization is geared to be open and even proactive towards alternatives, surprises and disagreements across the 6S parameters of Strategy, Structure, Steps, Systems, Skills and Style.

In Strategy there is a myriad of tools, concepts and even schools, but not all are equally valuable or relevant to your business. It is not always obvious which is the strongest solution, but once you have bought into a particular tool, you automatically begin searching for affirmative arguments that exactly the strategy method you have chosen is the best hammer for all your different challenges. Playing to Win is a strong approach, because each step – from defining the winning ambition, through selecting where to play and how to win, to designing core capabilities and management systems – is designed to test the previous step.

In Steps (aka processes) we face repeated decisions, where the answer is often the same. This reinforces the confirmation bias – it takes a lot to answer no when you have just answered yes 1,000 times. The classic scientific and consulting problem-solving approach is to first look for ways to disprove your hypothesis. A simple way to ensure this is checklists that spur just enough conscious thought to avoid disasters.

In Style (aka culture) we cannot stress the importance of diversity enough, but there will be no positive results unless you insist on maintaining the constructive disagreement arising from diversity. Even with a healthy diversity approach, individuals may unconsciously sabotage the opportunity, e.g. when the manager starts out with his or her own opinion and then asks for alternative views – funnily enough, you do not get a good discussion going. In this big-data-focused age, it is important to remember that while facts are critical, organizations focused only on data are paving the way to their own personal confirmation bias hell. Data cannot replace common sense and good decision processes – not everything that can be measured is important, and not everything that is important can be measured.

This was part 4 of Decision Strategy. Next week we have something else for you, so the next part will only come after New Year’s!

Can’t wait? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution Nugget: Look for the white stroller

IMAGE: Black swan

One of our partners told the story of how his wife had desperately wanted a white stroller for their second child, and he had argued vehemently that there was no such thing. Within a day he had suddenly seen three of them!

Many are familiar with this aspect of our brain: we can basically set our “radar” to look for certain visual and even audio cues. What is less well known is that it has a devious angle to it. Once you have an idea that you like, you will start looking for evidence that supports it – and distance yourself from opposing views.

This is of course confirmation bias at play, and you can imagine what happens when the manager at the end of the table gets an idea – good or otherwise, it will get supported. Maybe less like a white stroller and more like a black swan.

This week’s Stratecution NEXT blog will be about confirmation bias, and in anticipation, here is a great video showing just how far we humans can take it – enjoy!

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution NEXT: Decision Strategy (3)


This is part 3, where we introduce core concepts from our book, Decision Strategy, which received 5 stars and was named the 5th best management book of the year globally. If you have not yet read part 1 or 2, click the links below:

Decision Strategy part 1

Decision Strategy part 2

 

Part 3: When bounded awareness narrows our vision

“Facts matter not at all. Perception is everything. It’s certainty”

Stephen Colbert

On 28 January 1986, all employees at the Kennedy Space Center in Florida were busy preparing the launch of the Challenger space shuttle. A thorough review of the shuttle’s flight readiness had been undertaken and it had been cleared. The lift-off had been postponed five times due to bad weather, but today was a clear day, albeit also the coldest on which NASA had ever launched. The event was highly televised, as a civilian was onboard for the first time ever to help NASA regain financial support. At 11:38 the Challenger left Pad 39B, and disaster struck almost immediately: 73 seconds into the flight, the Challenger exploded in a ball of fire, killing the entire crew. How could this happen to one of the most professional and highly regarded organisations in the world?

Highly focused on saving the space program and on using the Challenger as a key marketing vehicle for this purpose, management had under-analyzed the risk situation and ignored engineering concerns around o-ring temperature requirements. Now, you would think such an everyday failure could not happen in such a critical situation, but bounded awareness means that people even in the same organization, with similar skills and knowledge, can draw different conclusions. Because we have limited bandwidth, we can, in a chaotic mix of information, quickly come to see important communication as trivial and thus underestimate risks. At NASA it was not technical skills or knowledge that separated the leaders from the engineers, but rather the narrow vision that comes in the wake of strong focus – management focused on saving the space program and on previous successes, engineering on the recent o-ring reviews.

 

VIDEO: Colour changing card trick

If you are in doubt about whether this applies to you, then watch the colour changing card trick video and honestly answer yourself whether you are one of the few who get it…

Bounded awareness can occur in various stages of your decision making and we like to distinguish between seeing, seeking, selecting and sharing information – or 4SI for short:

Error in Seeing Information

Our ability to focus on one task is undoubtedly useful, but it can also limit your awareness of peripheral threats and opportunities in your business environment and thus your ability to craft a strategic response.

Now, if you do not see it often, you often do not see it, but we can learn to become more aware of changes in our environment: Military personnel can be trained to scan a crowd for suspicious behaviour, leaders can hone their awareness of critical information and organizations can set up early warning signals of key environmental change.

Error in Seeking Information

The Challenger disaster demonstrates what can happen when professional and well-meaning leaders limit their analysis and fail to seek out the most relevant information. It is not difficult to make the connection to the recent acquisition of Nokia, whose CEO exclaimed: “we did nothing wrong, yet we lost”.

That said, how can we be expected to seek information which by its nature is beyond our awareness? The most important thing is to be vigilant in your reflections on what information is actually relevant for the decision you must make. As a manager you often see recommendations reach your desk supported by a significant amount of data – a quick trick here is to be skeptical about the absence of contradictory evidence.

Error in Selecting Information

It may be hard to believe, but we ignore many valuable and accessible pieces of information about changes to customers, competitors and other stakeholders when making important decisions – particularly when we are successful. The Swiss watchmakers held 50 percent market share before 1960, and despite being the first to develop quartz technology, they were reduced by two thirds in less than 10 years by foreign competition built on that very quartz technology.

One way to determine whether the information you have at your disposal is useful is to think about how the other parties involved will act. If you are in a negotiation, how will the counterpart assess the business you are negotiating? One method is to understand the links between all the relevant information by not only focusing on cause-effect relationships, but also bringing other contextual factors into play.

Error in Sharing Information

If we succeed in seeing, seeking and selecting the right information in a non-biased way, research suggests that we still have a problem: our cognitive limitations prevent unrestricted exchange of information. When team members discuss the available information, they omit the unique pieces that can make the difference. Why? Because it is much easier to discuss common information, and doing so is often better rewarded.

There are many ways to integrate diverse knowledge in groups, but one of the simplest approaches is to set meeting agendas with specific points for soliciting individual views, or to make a person or department responsible for valuable knowledge sharing.

 


6S Model

There are many individual approaches to avoid or limit bounded awareness. We use the 6S model to apply the right type of method to the right type of problem. Are we dealing with an issue within Strategy, Structure, Steps (aka process), Systems, Skills or Style (aka culture)?

For example, in Strategy a strong approach is to counter bounded awareness with a highly focused megatrend analysis of the company and industry in question, showcasing the key issues and relevant scenarios facing the company over the next 5+ years, so the strategy is based on solid ground.

Another example is Structure, where companies may have a tendency toward proliferation, inequitable resource allocation or the absence of significant branches. The best thing to do is to run periodic due diligence on your structure: a reasonable number of organizational layers, a sensible average span of control, a relevant mandate in each role etc.

A final example is Systems: a system that is configured incorrectly has the special ability to get you very far away from your goal very quickly. This is because the system’s primary purpose is to solve tasks fast, but if “the GPS” is set incorrectly, you can quickly end up somewhere completely different than you expected. Your best option here is to use both lead and lag indicators, measuring both the end result and the process of getting there, similar to a Balanced Scorecard. This allows you to create an Early Warning System to correct course at the right time, but the trick is to focus on a handful of KPIs – if you focus on everything, you focus on nothing.
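As a toy illustration of such an early warning system – the KPIs, targets and tolerances below are entirely made up, not a prescription – a handful of lead and lag indicators with explicit thresholds is all it takes to flag when to correct course:

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    kind: str         # "lead" predicts the result, "lag" measures it
    target: float
    actual: float
    tolerance: float  # allowed shortfall before we raise a warning

    def warning(self) -> bool:
        return self.actual < self.target - self.tolerance

# A handful of KPIs only – if you focus on everything, you focus on nothing.
kpis = [
    KPI("qualified leads per week", "lead", target=50.0, actual=38.0, tolerance=5.0),
    KPI("milestones hit on time",   "lead", target=0.90, actual=0.75, tolerance=0.05),
    KPI("quarterly revenue (MUSD)", "lag",  target=12.0, actual=11.8, tolerance=0.5),
]

for kpi in kpis:
    status = "WARNING - correct course" if kpi.warning() else "on track"
    print(f"[{kpi.kind}] {kpi.name}: {kpi.actual} vs target {kpi.target} -> {status}")
```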

Bounded awareness can break organizations in industries undergoing significant change. The solution is, not surprisingly, to work both systematically and continuously to expand the awareness of the organization.

This was part 3 on our book, Decision Strategy. Next week we will look at how our confirmation bias keeps our bad ideas alive far beyond disaster and what to do about it – stay tuned!

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution Nugget: Look out for No.1


Why is it that candy is always at the end of the supermarket “snake”? Is it just the natural state of affairs for how our brain sorts food categories? Are they trying to make sure that we have bought all the important stuff first? Or are they well aware that, by the time we get to that aisle, we have spent all our willpower on small decisions like which bread type to choose, whether we need meat this week and what our Saturday dinner guests might enjoy, so we are easy targets for anything our 35,000-year-old brain might enjoy RIGHT now?

Our willpower can quickly go from full to empty as we go through decision after decision, minute by minute, hour by hour. How important is this? Take a look at this graph depicting the likelihood of parole on the y-axis and the time of day of the parole hearing on the x-axis:

GRAPH: Likelihood of parole vs. time of day of the parole hearing

Yes, you guessed it – right before lunch and the afternoon sandwich, the chances fall to almost zero, because when decision makers are in doubt on big decisions, they default to the safe option!

So for critical decisions this is pretty important. But maybe just as interesting, our ability to manage our willpower may be THE most important predictor of success in terms of our ability to make and stick to plans, concentrate for long periods of time, GPA, BMI etc.:

VIDEO: The most important predictor of success

So some kids are born lucky (ok, we actually do not know how much is from birth and how much from upbringing, but you get the picture) – so what can you do? Well, besides making sure to get your regular sleep, healthy food and exercise, think about what former US president Barack Obama and rock star scientist Albert Einstein have in common:

IMAGE: Albert Einstein and Barack Obama

Make sure to have as few simple decisions to make in a day as possible – have just ONE suit type, ONE shirt color, ONE shoe style, ONE… well, you get it. Eat the same breakfast, ask the waiter to pick your lunch, run a dinner program without questioning it. Do NOT use unnecessary willpower when you need it to decide more important things later in the day…

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution Nugget: Jumping to conclusions

IMAGE: “I confine my exercise to jumping to conclusions and stretching”

The power of our brain never ceases to amaze us.

We make snap decisions on very little real content and instead respond disproportionately to authoritativeness and approachability.

We jump to superficial yet quite stable conclusions: once we have started to favor one presidential candidate over another, no amount of evidence will change our opinion. This is the reason that televised political debates are pretty much a waste of time.

But how far can we actually take it? That was the question that spawned this fun video nugget – enjoy:

WATCH VIDEO: JUMPING TO CONCLUSIONS

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution NEXT: Decision Strategy (2)


This is part 2 of an introduction to core concepts from our book, Decision Strategy, which received 5 stars and was named the 5th best management book of the year globally. You can read the other parts here:

Decision Strategy part 1

Decision Strategy part 3

Part 2: How does our brain wreak havoc on strategy?

“When faced with a difficult question, we often answer an easier one instead”

Daniel Kahneman

Christine, the director of a mid-sized media company, is in doubt about whether to fire Anna, the marketing director. In recent years Anna has not delivered more than the minimum requirements. She is in every way talented and intelligent and has a knack for finding inexpensive, powerful marketing solutions, but she rarely takes the initiative and is often critical of other employees. The challenge is that Anna is difficult to replace and is the only one who can maintain the company’s critical partnerships. What would you advise Christine to do?

If you reflect a little on your mental activity while reading this introduction, it is remarkable how quickly you formed an opinion. Maybe you would advise her to fire Anna, or to have a chat with Anna and give her a final chance. But you are unlikely to be completely confused. Our consciousness is usually in a state where we have intuitive feelings about almost everything we experience. We may like or dislike people long before we know anything special about them; we trust strangers without knowing why; we have the feeling that a company will succeed without further analyzing the business. We simply put too much emphasis on the information that is readily available to us.

The ideal decision process has 6 steps starting with an assessment phase and ending with a decision phase, but we actually rarely use it:

  1. Define the problem
  2. Identify criteria
  3. Weigh criteria
  4. Generate alternatives
  5. Assess alternatives against each criterion
  6. Calculate the optimal decision
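To see how small the analytical core of steps 2–6 really is, here is a minimal sketch in code, with hypothetical criteria, weights and alternatives of our own invention:

```python
# Steps 2-3: identify and weigh the criteria (weights sum to 1).
criteria_weights = {"cost": 0.3, "quality": 0.5, "speed": 0.2}

# Step 4: generate alternatives, scored 1-10 on each criterion.
alternatives = {
    "vendor A": {"cost": 7, "quality": 8, "speed": 5},
    "vendor B": {"cost": 9, "quality": 6, "speed": 8},
    "vendor C": {"cost": 5, "quality": 9, "speed": 7},
}

def weighted_score(scores: dict) -> float:
    """Step 5: assess an alternative against each weighted criterion."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Step 6: calculate the optimal decision.
for name, scores in alternatives.items():
    print(f"{name}: {weighted_score(scores):.2f}")
best = max(alternatives, key=lambda name: weighted_score(alternatives[name]))
print(f"Optimal decision: {best}")
```

The point is not the arithmetic – it is that in practice we rarely slow down enough to run even this simple a calculation.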

Instead, 95% of our decisions are taken intuitively by fast thinking, or system 1, as opposed to slow thinking, or system 2, where we actually engage our minds in classic analysis. The brain is not literally divided this way, but it is a useful analogy. System 1 is our fast, automatic, intuitive and emotional thinking, often going under the name of the elephant, while system 2 is our slower, more costly, conscious thinking – the one often referred to as the rider. For example, our system 1 may tell us to buy Audi shares because we basically like Audi vehicles, but that does not mean the shares have an attractive price compared to their actual value and to alternative investments. That assessment requires system 2, but system 2 pulls significantly more energy and its resources are quickly used up – and then you are back to system 1 decisions.

We have in the book structured the more than 200 biases currently identified into 6 distinct groups, 5 of which fit closely with the key reasons for strategic failure according to McKinsey & Company:

CHART: Key reasons for strategic failure

  • Availability bias was introduced at the beginning of this article. We put too much emphasis on the limited information we have and do not look for new key pieces.
  • Confirmation bias is when you get that great business idea and start investigating further, only to find that every new piece of information seems to confirm your idea.
  • Overconfidence is our tendency to underestimate competitors, timelines and budgets, as well as to overestimate our control over them.
  • Emotional bias holds several different sub-biases, such as a strong preference for status quo, and is almost a category of its own, because it tends to reinforce the other biases.
  • Loss aversion is the finding that we are on average twice as averse to losses as we are attracted to gains (see the stylized value function after this list). Once invested in a strategy, abandoning it feels like a loss we should avoid.
  • Bounded ethics is not included in McKinsey’s study and focuses on how we slowly stretch our ethical code. The fake accounts at Wells Fargo are a recent example.
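To put the loss aversion finding in formula form, here is a stylized, linear version of the prospect-theory value function – the coefficient $\lambda \approx 2$ is the rough average reported in the research, not an exact constant:

$$ v(x) = \begin{cases} x & \text{if } x \ge 0 \\ \lambda x & \text{if } x < 0 \end{cases}, \qquad \lambda \approx 2 $$

In words: losing 100 feels about as bad as gaining 200 feels good, which is exactly why walking away from a strategy we have invested in feels like a loss to be avoided.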

As we can see, there are plenty of ways for top management decisions to go wrong. But are these biases inherently bad or irrational? Not really – they simply evolved for a different era, and now we need to adjust for the situations where they are not ideal:

  1. Identify the situations where it is worth the effort – the rare high-impact decisions such as strategy, and the many small decisions that over time have a large effect, such as in core processes
  2. Identify the biases most likely to affect the selected decisions and their general effect on the process
  3. Decide on a new practice that either removes the bias from the equation or counteracts it

This was part 2 on our book, Decision Strategy. Next week we will look at how our availability bias prevents us from bringing all key data to strategic decisions and what to do about it – stay tuned!

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution Nugget: Recruit the elephant

IMAGE: The elephant and the rider

Our products, marketing, pricing and channels are all viewed through our customers’ broken lens. By understanding their human biases we can build up our universe to make it as easy and convincing as possible to buy from us.

In behavioural economics we talk about the elephant and the rider of decision making. Most of our decisions are taken almost automatically, represented by the elephant just marching where it pleases (our intuition), while a few decisions are taken through real analysis, represented by the rider on top of the elephant (our analytical side).

When you want customers or any other stakeholders to buy into an idea, it rarely helps much to appeal to their analytical side. That is hard work, and the rider is lazy, so only the most important issues are dealt with there. Instead, you recruit the elephant by appealing to the emotional side of people, like in this world-class commercial:

Recruit the elephant

This commercial has been labelled the most impactful commercial in its space ever, and as parents it hits us right in the heart every time.

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.