Take your own medicine!

Growth Strategy for Consumer Products

Nobody told us that – but maybe they should have…

I have worked on commercial, operational, financial and organizational strategies in retail, logistics, aviation, pharma, energy and consumer products for almost 25 years – so forgive me for having had trouble limiting myself when we founded Behavioural Strategy Group :)

Our focus back then was purely on pursuing what seemed like a bloody obvious idea – why don't we use decision science on decisions? Crazy, right? Or in other words, why not apply the field of behavioural economics to the most important decision arena in the world – strategy. Great idea – easy win.

The idea is still great, but I like to think that I am now wiser and more mature – after all, that idea is just shy of its 5th birthday. And after many fun strategy projects, where we have repeatedly stressed that strategy is first and foremost about deselecting options (it even says so on www.behaviouralstrategygroup.com, so it must be true), it is time to take that awful medicine ourselves.

Except this medicine will be awesome, because the focus will be purely on the exciting area of growth strategy for consumer products, where Behavioural Strategy Group really has an edge – a tried and true way to generate deep insights and make solid strategy decisions that, I am proud to say, has resulted in the most powerful strategy processes I have seen. Hands down.

Now, my network in consumer products and retail is not bad – in fact I am working on a growth strategy for a retailer right now – but I would love your help putting out the word to senior commercial managers in midsized consumer products and retail companies in Denmark that the new kid on the block is now all grown up.

Thanks in advance friends! 
Brian

Cofounder of Behavioural Strategy Group

Innovative project approach recognized


Behavioural Strategy Group was just featured in the Comatch newsletter, which reaches more than 1,000 strategically minded businesses, with an article on our next practice strategy as an innovative project approach!

Of course you should not be cheated of the article, so here is the link – enjoy reading:

Next practice strategy

If you want to learn more, then contact us here or read more on strategy and change at Behavioural Strategy Group.

Going once, twice, sold… out!


… is our book, Beslutningsstrategi, in the original version, which was awarded 5th place among the global management books of 2016!

Readers also loved it, as this quote from Jakob Wedel, a partner at Monitor Deloitte, illustrates: “I have read many management books. But this one surprises. Beslutningsstrategi puts behavioural economics at the center of strategic decision processes. Brilliantly written – it should find its way to board and management rooms fast”

The book is the first ever to structure our more than 200 biases into a coherent whole and to treat each bias type with the relevant current strategic management tools, such as “Playing To Win”, “Scenario Planning” and many more.

The book is expected to sell out sometime this spring, but you can still buy the original version from our publisher here, if you are quick: https://www.djoef-forlag.dk/da/boeger/b/beslutningsstrategi#

If you are interested in hearing more or want to have an inspiration session for your management team, then contact us here or read some of the other articles on strategy and change at http://www.behaviouralstrategygroup.com.

Stratecution nugget: incentives work…


… but not in the way that you expect!

Firstly, as Daniel Kahneman points out in his landmark book, Thinking, Fast and Slow, money drives individualism over teamwork.

Secondly, as Dan Ariely, another key behavioural economics figure, has described again and again, money tends to pull us away from focusing on the overall purpose of an organization and towards maximizing our own benefits – and the two do not always align.

Thirdly, as shown by Daniel Pink, money as an incentive tends to reduce performance even in basic jobs.

So incentives work, but probably not in the way that you want – instead, think of rewarding at the team level with a profit-sharing approach.

Did you like this blog article? Then contact us here, follow us or read some of the other articles on strategy and change at http://www.behaviouralstrategygroup.com.

 

Stratecution Nugget: A single tactic increases success rate 300%


Did you know that in the vast arsenal of Behavioural Economics tactics for dealing with our 200+ biases, one tactic alone can increase your chance of success in execution by 300% – a factor of 4?

We are talking about small commitments as opposed to big bang implementation. We all have a tendency, once the whole strategy has been thought through, decided upon and written up in beautiful slides, to spring the whole thing on our organization.

And why should we not? After all, they deserve to know. We also need to get going. Plus, how can they execute correctly if they do not understand the full picture? All valid reasons. But if you want to succeed, research says it is not the way to go.

There are several pieces of research, but our favorite is this one from London, where families on two adjacent streets were asked to put up a big, ugly sign in their front yard reading “Drive Slow. Kids Playing”. On one street 20% of the houses put one up, but on the other street 80% agreed to do it. 20% vs 80%!

Now, the streets were adjacent, and there were no timing or demographic differences. It came down to one little difference: on the street with 80%, the families had been asked a week in advance to put up a small postcard with the same words in their window – who can say no to that? Well, once you have made even a small commitment, it is very difficult to back down when, a week later, you are asked to put up the bigger sign…

Did you like this blog article? Then contact us here, follow us or read some of the other articles on strategy and change at http://www.behaviouralstrategygroup.com.

 

Stratecution NEXT: Decision Strategy (5)


Before the holidays we started introducing core concepts from our book, Decision Strategy, which was ranked the 5th best management book globally of the year. This is part 5, so if you have missed the earlier parts, click the links below:

Decision Strategy part 1

Decision Strategy part 2

Decision Strategy part 3

Decision Strategy part 4

Part 5: If confidence is good, then overconfidence must be…

”All you need in life is ignorance and confidence, then success is sure”

Mark Twain

“The game is over”, said Mohammed al Douri, Iraq’s ambassador to the UN, in April 2003. After only three weeks of fighting, US forces occupied Baghdad, and Saddam Hussein was caught in a cellar on the outskirts of Tikrit later in the year. But while the invasion turned out to be a great success, the ensuing war was anything but. In Washington, Saddam’s dictatorial regime was expected to crumble as soon as America set foot in the country, but the Shiites did not rise, the Sunnis fought fiercely, the Iraqi guerrilla war surprised the unprepared forces and there were no weapons of mass destruction. How could the US be so wrong at a time when the intelligence community was better equipped than ever?

It all started with a few neoconservatives – the most prominent among them Deputy Defense Secretary Paul Wolfowitz – who had long been convinced that expelling Saddam Hussein would pave the way for a grand reorganization of the Middle East, moving it away from tyranny and anti-Americanism and toward modernity and democracy. But maybe the most prevalent reason to attack Iraq was that it would be a courageous use of American power, mixing raw strength with idealism after the retreats of the 1990s and the terrorist attacks of September 11. A combination of ambiguous intelligence and a strong belief that one could easily ‘smoke the bad guys out’ underpinned rosy scenarios and poor planning – for example, the original game plan called for 500,000 soldiers but was cut down to 160,000 for 3 months in the final approach – and only 8 years later were the last soldiers withdrawn.

The reason we often enter hopeless wars lies in our tendency toward overconfidence. Although a more realistic assessment of the situation and the alternatives could lead to more peaceful solutions, our view is obscured by positive illusions, wishful thinking and overconfidence. Historical data show that before the start of a war, each side is convinced that it has more than a 50 percent chance of winning!

Our tendency to overestimate our skills, the accuracy of our decisions or the value of our ideas is not limited to war. It is human and found across time, cultures and circumstances. Even the best laid business plans are often ruined by the annoying interference of reality. There is an abundance of examples of overconfidence in professional contexts, such as stock market bubbles, the number of new entrepreneurs setting up shop despite 90% failure rates and the many high profile acquisitions despite a high risk of failure.

But could overconfidence not be a good thing? Is it not our self-conceit that makes us rise to the challenge and pushes us to perform beyond our perceived limits? The short answer is no. With the exception of some limited benefits in innovation processes, overconfidence is one of the biggest obstacles to good decision making.

Although self-conceit can be fatal to decision-making processes, it is almost ironic how easy it is to correct. Churchill once said that it is important to remember that no matter how certain you feel about victory, there would be no war if the other side did not also think it had a good chance. If only the Americans had remembered to ask themselves a few perspective-changing “what if” questions, they could have avoided some of the biggest challenges. For example: why did Saddam Hussein dare risk a war? With the loss of the Gulf War in 1991 and a widening gap between Iraqi and US forces, the outcome of the invasion was given – but the outcome of the war was not.

When discussing overconfidence there are three different variants: (1) overestimation of our actual performance, (2) overplacement of our performance relative to others and (3) overprecision, that is, exaggerated confidence in the accuracy of our beliefs:

Overestimation – the reason projects come in late and over budget

You are preparing slides for a last minute presentation. It is the night before, but you do not worry much, because you are an experienced speaker. Although it is a new topic, you got slides from the former lecturer, and you usually get good feedback. When you take the stage, you realize that your lecture is completely off target: you were not prepared for the participants’ prior knowledge of the subject, you have not quite understood all the points in the slides you got, and 45 agonizing minutes seem like a lifetime.

We tend to believe that we are better across a wide range of domains than we actually are, that we have more control over situations than we actually have and that we can plan things out in great detail. This planning fallacy is the reason projects come in late and over budget.

This is also yet another reason that bonuses can be difficult to use as a motivator. You may receive a bonus of USD 10,000, but you had expected 5,000 – or 15,000. The easiest way to exceed our expectations is to reduce them, and we have therefore developed defensive pessimism: while we start out a year or a project optimistic, we shield ourselves from disappointment towards the end through pessimistic assessments of ourselves and our possibilities, because it feels extra hard when our inflated beliefs meet reality.

Overplacement – the reason we enter hopeless projects

93% of motorists believe that they are better than average, and 25% of students consider themselves in the top 1%. Besides making performance appraisals difficult for managers and employees to agree on, this means that entrepreneurs go into markets where their objective possibilities for success are limited, and that we continue hopeless lawsuits at high cost.

Interestingly, recent research also documents our tendency to believe that we are performing worse than others when it comes to very difficult tasks. Both extremes are problematic. While overplacement may lead us to throw money at bad projects, underplacement may prevent us from pursuing great projects – whether in business or in life.

Overplacement can create some nasty surprises when we suddenly face reality. For example, highly intelligent people joining Ivy League schools often drop out when they finally meet real competition. We simply tend to focus on ourselves instead of comparing ourselves with the particular group we belong to – or we think of our own team as more competent than average. That would be okay, except the other talented teams may impact our own team’s opportunities. In the business world this leads to a lack of understanding of the market and competitors.

Overprecision – never ask experts to predict the future

Throughout history, experts’ overly precise estimates have proven wrong again and again. For example the neoclassical economist Irving Fisher, who shortly before the Wall Street crash of 1929 famously said that stock prices seemed to have reached a permanently high plateau. Or Harry Morris Warner, one of the founders of Warner Bros, who in 1927 rhetorically asked who on earth wanted to hear actors talk. Or Thomas Watson, the legendary head of IBM, who in 1943 reportedly predicted that there might be a world market for five computers.

Of course everyone can make mistakes in areas where they have limited knowledge, but the question is whether expertise buys us more precision. Studies show that in areas where we have prior knowledge or expertise, we do get closer to the correct answer. The problem is that as experts we become ever more cocksure about our ability to predict the future and therefore define narrow confidence intervals (i.e. how sure we are of our estimate), so we still miss the target. A single lone estimate simply has a tendency to be wrong.
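To make the overprecision point concrete, here is a minimal simulation sketch in Python – the numbers are our own illustrations, not data from the book. An expert whose estimates carry noise with a standard deviation of 10 should quote an interval of roughly ±16.4 to be right 90% of the time; a cocksure ±8 misses the target far more often than the stated confidence suggests:

```python
import random

random.seed(42)

def hit_rate(interval_halfwidth, noise_sd, trials=10_000):
    """Fraction of trials where the true value falls inside the
    expert's stated interval around a noisy estimate."""
    hits = 0
    for _ in range(trials):
        true_value = 100.0
        estimate = random.gauss(true_value, noise_sd)
        if abs(estimate - true_value) <= interval_halfwidth:
            hits += 1
    return hits / trials

# A well-calibrated 90% interval for noise_sd = 10 needs a
# half-width of about 1.645 * 10 = 16.4; an overprecise expert
# might offer +/- 8 instead.
print(hit_rate(16.4, 10))  # close to 0.90
print(hit_rate(8.0, 10))   # roughly 0.58 - far below the stated 90%
```

The fix the chapter suggests is exactly this kind of calibration check: compare how often your past intervals actually contained the outcome with how confident you said you were.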

The problem is further cemented by human interaction patterns. For example, few voters will choose a politician with the slogan: “I think it is this way, but I am not sure.” We feel that confident people are more persuasive and competent, and we reward them with influential positions. When confidence and capability are positively correlated, this makes sense. But people quickly learn that to succeed, they must adopt a ‘fake it till you make it’ approach.

6S Model – bring in the devil’s advocate

Overconfidence is not the hardest bias to work with. Often we just need to consider worst-case scenarios and adverse information, or adjust our project estimates by a standard percentage, to minimize the effect. Here are a few examples from the 6S parameters of Strategy, Structure, Steps, Systems, Skills and Style:

In Strategy you might, towards the end of the process, deploy Gary Klein’s “premortem” technique, where you imagine that the project has failed and analyze the reasons behind the failure. It is a clever way to invite in the devil’s advocate without strong opposition, and it is great at uncovering the biases and challenges that may threaten your strategy or project.

In Systems, overoptimistic organizations often overestimate their abilities and underestimate how long something takes. Although you should always start with detailed plans to increase realism, make sure to build in flexibility around those very precise estimates, and consider whether the decisions in the system should be “opt in”, where you must actively choose a direction, rather than “opt out”, where you can just let System 1 press the accept button without reflecting on the consequences.
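The “adjust your estimates by a standard percentage” idea can be sketched as a tiny reference-class correction. The uplift figures below are purely illustrative assumptions of ours, not data from the book:

```python
# Hypothetical uplift table: by how much comparable past projects
# overran their original estimates (illustrative numbers only).
UPLIFT_BY_CLASS = {
    "it_system": 0.40,       # past IT projects overran by ~40%
    "construction": 0.20,
    "process_change": 0.30,
}

def adjusted_estimate(raw_estimate_days, reference_class):
    """Correct an inside-view estimate with the outside view:
    the average overrun of similar past projects."""
    uplift = UPLIFT_BY_CLASS[reference_class]
    return raw_estimate_days * (1 + uplift)

# The team's detailed plan says 100 days; the reference class
# says plan for about 140.
print(adjusted_estimate(100, "it_system"))
```

The point is not the exact percentages but the mechanism: the correction comes from what similar projects actually did, not from how confident the team feels.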

This was part 5 of our book, Decision Strategy. Next week we will look at the power of loss aversion and the crazy lengths we will go to in order to avoid it – stay tuned!

Did you like this blog article? Then contact us here, follow us or read some of the other articles on strategy and change at http://www.behaviouralstrategygroup.com.

Stratecution NEXT: Decision Strategy (4)


This is part 4 of our series introducing core concepts from our book, Decision Strategy, which received 5 stars and was ranked the 5th best management book globally of the year. If you have missed the earlier parts, click the links below:

Decision Strategy part 1

Decision Strategy part 2

Decision Strategy part 3

Part 4: When confirmation bias locks you on to a dangerous path

“We don’t believe the world we see; we see the world we believe”

The Matrix

On Monday, September 23, 2013, CEO Thorsten Heins faced one of the biggest decisions of his professional life: should he end several years of financial struggles and sell BlackBerry to the Canadian holding company Fairfax Financial? A special task force had examined strategic alternatives, and the best course of action seemed to be to sell BlackBerry at a price of USD 4.7 billion, about 3 percent above the Friday closing share price. If the deal was accepted, BlackBerry would become a private company, away from Wall Street pressure. But four days before the completion of the due diligence, Heins was fired, and the new CEO, John S. Chen, instead raised a USD 1 billion cash injection. For a company that only a few years before had been the world’s leading smartphone company, with 41% of the US market, the highest company valuation in Canada and a ranking as the fastest growing company in the world – how could this happen?

BlackBerry’s decline is of course a perfect case study in what happens when a technology giant fails to innovate in a market that is evolving at breathtaking speed. Amid the success, investors were warning about the increasing competition from iOS and Android, but BlackBerry maintained its strategy. It was only when the iPhone began to gain popularity in 2007 and challenge the BlackBerry that reality hit. But the challenge is not exclusively reserved for fast-paced technology markets – in fact other markets may be even more prone to it: when you put a frog into hot water it immediately jumps out, but when you submerge it in water at normal temperature, it will stay even as you slowly heat it up (actually, this is not scientifically proven, but it is a great analogy).

The challenge is called confirmation bias: we believe what we see – and we see what we believe. We seek information that confirms our expectations and play down the aspects that are inconsistent with them. It works against innovation, because creative thinkers use information to re-evaluate ideas and avoid status quo scenarios. Innovative thinking is costly and difficult because confirmation bias is part of our inner mechanics; it is all too easy for us to stop innovating and instead reproduce earlier successful ideas. This is the reason both that giants fall and that 90% of new ventures fail – we are only human after all.

Biased information retrieval

We test our hypotheses by searching for confirmatory information – consistent with the limited attention and cognitive processing capacity that compel us to seek information selectively. This is also one of the reasons that newspaper readership is often split along political leanings: voters prefer to confirm their positions rather than undermine them.

Confirmation bias does not only apply to long-held beliefs. Often it can relate to an attitude only just formed. For example, if we are about to meet someone for the first time and, just before the meeting, we hear a colleague describe the person as dishonest, then we will immediately start looking for confirming information – and surprise, surprise, we find it.

Biased interpretation

Biased interpretation occurs when two people with exactly the same information draw different conclusions, each consistent with their previous beliefs. A 1979 Stanford experiment asked students to evaluate the US death penalty based on research on the effectiveness of capital punishment, and both prior proponents AND opponents maintained their previous positions – in fact they left the experiment even more convinced.

Biased memory

Even if we collect and interpret information in a neutral way, we tend to remember it in a way that strengthens our beliefs. Merely considering a hypothesis makes the information stored in our memory that is consistent with the hypothesis more accessible. Although we often have the feeling that we remember past events correctly – especially highly emotional events, such as when we first held our baby in our arms, the day we were married or the day we were told we had been promoted – tons of research shows that we cannot give an accurate description of the events.

6S Model

To minimize confirmation bias, it is critical that your organization is geared to be open and even proactive towards alternatives, surprises and disagreements across the 6S parameters of Strategy, Structure, Steps, Systems, Skills and Style.

In Strategy there is a myriad of tools, concepts and even schools, but not all are equally valuable or relevant to your business. It is not always obvious which is the strongest solution, but once you have bought into a particular tool, you automatically begin searching for affirmative arguments that exactly the strategy method you have chosen is the best hammer for all your different challenges. Playing to Win is a strong approach, because each step – from defining the winning ambition, through selecting where to play and how to win, to designing core capabilities and management systems – is designed to test the previous step.

In Steps (aka processes) we face repeated decisions, where the answer is often the same. This reinforces confirmation bias – it takes a lot to answer no when you have just answered yes 1,000 times. The classic scientific and consultant problem-solving approach is to look first for how you can disprove your hypothesis. A simple way to ensure this is checklists, spurring just enough conscious thought to avoid disasters.

In Style (aka culture) we cannot stress the importance of diversity enough, but there will be no positive results unless you insist on maintaining the constructive disagreement arising from diversity. Even with a healthy diversity approach, individuals may unconsciously sabotage the opportunity, e.g. when the manager starts out with his or her own opinion and then asks for alternative views – funnily enough, you do not get a good discussion going. In this big data focused age, it is important to remember that while facts are critical, organizations focused only on data are paving the way to their own personal confirmation bias hell. Data cannot replace common sense and good decision processes – not everything that can be measured is important, and not everything that is important can be measured.

This was part 4 of Decision Strategy, and next week we have something else for you, so the next part will only arrive after New Year’s!

Can’t wait? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution Nugget: Look for the white stroller


One of our partners told the story of how his wife had desperately wanted a white stroller for their second child, and he had argued vehemently that there was no such thing. Within a day he had suddenly seen three of them!

Many are familiar with this aspect of our brain: we can basically set our “radar” to look for certain visual and even audio cues. What is less well known is that it has a devious angle to it. Once you have an idea that you like, you will start looking for evidence that supports it – and distance yourself from opposing views.

This is of course confirmation bias at play, and you can imagine what happens when the manager at the end of the table gets an idea – good or otherwise, it will get supported. Maybe less like a white stroller and more like a black swan.

This week’s Stratecution NEXT blog will be about confirmation bias, and in anticipation, here is a great video showing just how far we humans can take it – enjoy!

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.

Stratecution NEXT: Decision Strategy (3)


This is part 3 of our series introducing core concepts from our book, Decision Strategy, which received 5 stars and was ranked the 5th best management book globally of the year. If you have not yet read part 1 or 2, click the links below:

Decision Strategy part 1

Decision Strategy part 2

 

Part 3: When bounded awareness narrows our vision

“Facts matter not at all. Perception is everything. It’s certainty”

Stephen Colbert

On 28 January 1986, all employees at the Kennedy Space Center in Florida were busy preparing the launch of the Challenger space shuttle. A thorough flight readiness review had been undertaken, and the shuttle had been cleared. The lift-off had been postponed five times due to bad weather, but today was a clear day, albeit also the coldest on which NASA had ever launched a rocket. The event was highly televised, as a civilian was onboard for the first time ever to help NASA regain financial support. At 11:38 the Challenger left Pad 39B, and disaster struck almost immediately: 73 seconds into the flight, the Challenger exploded in a ball of fire, killing the entire crew. How could this happen to one of the most professional and highly regarded organisations?

Highly focused on saving the space program and on using the Challenger as a key marketing vehicle for that purpose, management had underanalyzed the risk situation and ignored engineering concerns around o-ring temperature requirements. Now, you would think such an everyday failure could not happen in such a critical situation, but bounded awareness means that people, even in the same organization and with similar skills and knowledge, can draw different conclusions. Because we have limited bandwidth, we can, in a mix of chaotic information, quickly come to see important communication as trivial and thus underestimate risks. At NASA it was not technical skills or knowledge that separated the leaders from the engineers, but rather the narrow vision that comes in the wake of strong focus – management focused on saving the space program and previous successes, engineering on recent o-ring reviews.

 


If you are in doubt about whether this applies to you, then watch the colour changing card trick video and honestly answer yourself whether you are one of the few who get it…

Bounded awareness can occur at various stages of your decision making, and we like to distinguish between seeing, seeking, selecting and sharing information – or 4SI for short:

Error in Seeing Information

Our ability to focus on one task is undoubtedly useful, but it can also limit our awareness of peripheral threats and opportunities in our business environment and thus our ability to craft a strategic response.

Now, if you do not see it often, you often do not see it – but we can learn to become more aware of changes in our environment: military personnel can be trained to scan a crowd for suspicious behaviour, leaders can hone their awareness of critical information and organizations can set up early warning signals of key environmental change.

Error in Seeking Information

The Challenger disaster demonstrates what can happen when professional and well-meaning leaders limit their analysis and fail to seek out the most relevant information. It is not difficult to make the connection to the recent acquisition of Nokia, whose CEO exclaimed: “we did nothing wrong, yet we lost”.

That said, how can we be expected to seek information which by its nature is beyond our awareness? The most important thing is to be vigilant in your reflections on what information is actually relevant for the decision you must make. As a manager you often see recommendations reach your desk supported by a significant amount of data – a quick trick here is to be skeptical about the absence of contradictory evidence.

Error in Selecting Information

It may be hard to believe, but we ignore many valuable and accessible pieces of information about changes to customers, competitors and other stakeholders when making important decisions – particularly when we are successful. The Swiss watchmakers held a 50 percent market share, and despite being the first to develop quartz technology, they saw that share reduced by two thirds in less than 10 years by foreign competition in quartz technology.

One way to determine whether the information at your disposal is useful is to think about how the other parties involved will act. If you are in a negotiation, how will the counterpart assess the business you are negotiating? Another method is to understand the links between all the relevant information, not only by focusing on cause-effect relationships but also by bringing other contextual factors into play.

Error in Sharing Information

If we succeed in seeing, seeking and selecting the right information in a non-biased way, research suggests that we still have a problem: our cognitive limitations prevent the unrestricted exchange of information. When team members discuss the available information, they omit the unique pieces that can make the difference. Why? Because it is much easier to discuss common information – and it is often better rewarded.

There are many ways to integrate diverse knowledge in groups, but one of the simplest approaches is to set meeting agendas with specific points for soliciting individual views, or to make a person or department responsible for valuable knowledge sharing.

 


6S Model

There are many individual approaches to avoiding or limiting bounded awareness. We use the 6S model to apply the right type of method to the right type of problem: are we dealing with an issue within Strategy, Structure, Steps (aka process), Systems, Skills or Style (aka culture)?

For example, in Strategy a strong approach is to counter bounded awareness with a highly focused megatrend analysis of the company and industry in question, showcasing the key issues and relevant scenarios facing the company over the next 5+ years, so the strategy is based on solid ground.

Another example is Structure, where companies may have a tendency toward proliferation, inequitable resource allocation or the absence of significant branches. The best thing to do is to run periodic due diligence on your structure: a reasonable number of organizational layers, sensible average spans of control, a relevant mandate in each role etc.

A final example is Systems: a system that is configured incorrectly has the special ability to get you very far away from your goal very quickly. This is because a system’s primary purpose is to solve tasks fast, but if “the GPS” is set incorrectly, you can quickly end up somewhere completely different than you expected. Your best option here is to use both lead and lag indicators, measuring both the end result and the process of getting there, similar to a Balanced Scorecard. This allows you to create an early warning system to correct course at the right time, but the trick is to focus on a handful of KPIs – if you focus on everything, you focus on nothing.
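As a sketch of that early warning idea – every KPI name, threshold and number below is invented for illustration – the point is to flag only the lead indicators that are off target, because those still leave time to correct course before the lag indicators turn red:

```python
# A handful of KPIs: (name, kind, actual, target, higher_is_better).
# All names and numbers are hypothetical.
KPIS = [
    ("sales pipeline value", "lead", 8.0, 10.0, True),
    ("customer complaints", "lead", 35, 30, False),
    ("quarterly revenue", "lag", 102, 100, True),
]

def early_warnings(kpis):
    """Return the lead indicators that are off target - the ones
    that still leave time to correct course."""
    warnings = []
    for name, kind, actual, target, higher_is_better in kpis:
        off_target = (actual < target) if higher_is_better else (actual > target)
        if kind == "lead" and off_target:
            warnings.append(name)
    return warnings

print(early_warnings(KPIS))  # both lead KPIs are off target here
```

Note how revenue, the lag indicator, still looks fine in this example – which is exactly why the lead indicators are the ones worth watching.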

Bounded awareness can break organizations in industries undergoing significant change. The solution, not surprisingly, is to work both systematically and continuously to expand the organization’s awareness.

This was part 3 of our book, Decision Strategy. Next week we will look at how confirmation bias keeps our bad ideas alive far beyond disaster – and what to do about it. Stay tuned!

Did you like this blog article on Behavioural Strategy Group? Then contact us here, follow us or read some of the other articles on strategy and change.