Decision Strategy – chapter 8

Once you cross the line, it escalates

It takes twenty years to build a reputation and five minutes to ruin it. If you think about that, you will do things differently.

Warren Buffett

The financial world was shaken to its core when the fourth-largest US investment bank, Lehman Brothers, filed for bankruptcy in September 2008. With $639 billion in assets and $619 billion in debt, Lehman's bankruptcy petition not only surpassed earlier bankruptcy giants like WorldCom and Enron by a factor of roughly ten – it also came after repeated assurances that finances were sound.

Lehman Brothers had a humble beginning as a grocery business in Alabama, founded in 1844 by German immigrant Henry Lehman, who was eventually joined by his two brothers. The business boomed and survived massive challenges like railway bankruptcies, the Great Depression, two world wars and capital suffocation under various owners, but it was the breakdown of the American housing market that finally killed it. The seed was sown when Washington in 1999 repealed the Glass-Steagall provisions that had separated commercial and investment banks, so that the former handled capital-intensive portfolios like real estate while the latter focused on liquid assets. In 2001 CEO Dick Fuld received an investment plan from a group of math PhDs showing how the bank would always profit if it invested in the housing market, and he was so impressed that he immediately borrowed billions to do exactly that.

Fuld was feared and admired as the Gorilla of Wall Street due to his aggressive style, and he lived for Lehman Brothers, where he had started at 23 and become the longest-serving CEO on Wall Street. In 2003 and 2004 Lehman bought five housing loan businesses, including two subprime lenders specialized in Alt-A loans for borrowers without documentation, which at first sight looked ingenious: over the following three years they grew by 56% annually, ending with a USD 4.2bn profit in 2007 and a market value of USD 60bn. But the assumptions in the PhDs' investment proposal were falling apart. The stock market had its largest one-day correction in five years, subprime loan defaults reached their highest share in a decade, and two Bear Stearns housing funds went bankrupt. Lehman's CFO assured the market that the risks were well insured, so the strategy was not changed, and Lehman geared its investments by a factor of 44, while the other large investment banks stayed around a factor of 20-30.

The warning signs kept pouring in: Bear Stearns had a near breakdown in early 2008, hedge fund managers questioned the value of Lehman's housing loan portfolio, and Lehman posted its first quarterly loss in more than a decade, leading to one double-digit drop in the stock price after another. The final blow came when Korea Development Bank withdrew from negotiations to take an active stake in Lehman. Hedge fund clients left, creditors cut off loan facilities, Lehman had a second quarterly loss, and Moody's Investors Service stated that Lehman had to sell the majority of its shares to avoid a rating downgrade. Lehman was now losing USD 8m every minute with little cash left and made one last desperate attempt to find a buyer in Barclays, to no avail. On September 15 Fuld declared bankruptcy, wiping out USD 46bn in market value – 93% – in just three days, and Barclays bought the scraps for USD 1bn.

Bounded ethics was rampant in Lehman Brothers

Bankruptcies are full of learning, but what was particularly interesting here was that despite Fuld's clear ethics in private, his business ethics were closer to "anything goes". As bankruptcy examiner Anton Valukas started unravelling the events, an enormous number of lies were discovered. Whenever danger was spotted, it was covered up. Maybe to avoid showing weakness, or maybe because Fuld believed that strongly in his strategy – but the many problems and the attempts to find buyers suggest rather that he felt the rope tightening around his neck. So he did what most people do in a high-pressure situation: he lied. Maybe on purpose to mislead, maybe because he was caught in his own story, maybe his pride could not handle it, or maybe he would not accept that his back was against the wall. The problem with the lies was that they created more problems, eroded trust and wasted precious time for dealing with the very real challenges. Stopping the bleeding early and giving potential buyers enough time to consider how to integrate the acquisition might have saved Lehman – because as we saw in chapter 7, when we face a high-risk situation with little time, we stick to the status quo. We all fear losing face, but that need to protect our self-image can become the very reason we fail.

The second ethical breakdown is a more gradual erosion of ethical conduct, where a small step away from a high ethical standard sets us on a downward trend towards faster and larger ethical lapses, which we may not even notice due to our blindness to change. Lehman used a growing number of accounting tricks to make its gearing and liquidity look favorable – for example by selling assets to shell companies and buying them back at a higher price, thus avoiding having to sell assets at a loss.

The third ethical breakdown happened when outside auditor Ernst & Young, who knew about the activities, failed to report them despite direct questioning. Lehman was a huge and profitable client for the firm, and motivated blindness means that when we are incentivized to avoid certain data, we tend to overlook it.

You see Bounded Ethics every day

Nobody is in doubt about the dramatically unethical behaviour of Enron's Jeff Skilling, Italy's Silvio Berlusconi or IT Factory's Stein Bagger. The business scandals of the last 20 years have led media, politicians and academics to look for the underlying reasons for ethical breaches, and the most popular approach is to identify a few evil people or systems – like Dick Fuld at Lehman, Arthur Andersen at Enron, or poor regulation. But the latest research shows that most unethical behaviour happens without a conscious choice. Of course there are people who act unethically on purpose, but the argument here is that our ability to understand and change unethical behaviour rests on appreciating that we are systematically biased in certain ways:

Everyday dishonesty: The human tendency to lie right up to the point where you feel your integrity is at risk – further enhanced by your self-serving bias. For example, high earners will find a reason to justify why they park their money in tax havens.

Indirect unethical behaviour: The tendency to avoid seeing – or even to hide – exploitative or unethical behaviour by having third parties do the job for you. For example, global companies like H&M and Apple have had questionable production facilities run by third parties.

Motivated blindness: The tendency to avoid noticing unethical behaviour when it is against your own interests to notice it. For example, classic conflict-of-interest cases where the researcher, politician or leader cannot see the problem in their close relations with certain interest organizations.

In-group favoritism: Also known as in-group bias, this describes your tendency to favor members of the group you belong to over members of groups you do not belong to. For example, we tend to offer board positions, jobs and other opportunities to people in our network, without considering whether they are also objectively the best suited for the task.

Implicit attitudes: When you meet someone, you immediately activate stereotypes about the person's race, gender and age. For example, when we meet a female negotiator, we are more surprised by aggressive negotiating behavior than we would be with a male negotiator.

Experience of fairness: Your tendency to value fairness to such an extent that you are willing to sacrifice your own financial well-being to enforce it. For example, if a business partner offers a very unfair deal in a negotiation, we may let the whole deal fall through to teach the other party a lesson, even when it has financial consequences for ourselves.

You cheat because you deserve it

Sometimes you lie or cheat because it is self-serving, but other times you might do it because you can – or because you cannot help it, having run out of the energy required for proper self-control. And our brains are flexible enough to change our minds after cheating, making us believe we in fact deserved it – worse still when there is some distance between the unethical behaviour and the reward. For example, 10% of professional golf players are willing to pick up the ball with their hand to place it more favourably, but 23% are willing to do the same with the club.

The creative class is special and is treated as such, which US studies show makes creatives feel more important and deserving. The trouble is that they are then more willing to steal or lie than other groups. And this is not limited to natural creatives – if you prime people to believe they belong to the creative class, they steal six times more than a control group. The creatives issue may be a US phenomenon, but priming people to feel more deserving, for whatever reason, is a universal problem. How important is that? The typical organization loses about 5% of annual revenue to cheating and fraud – equivalent to an estimated USD 2.9 trillion globally in 2010.

You are about to step onto a pedestrian crossing when an aggressive driver races past you, inches away. Shocked, you yell at the driver, but the car is already gone. Do you think the car was a brand-new Aston Martin or an old Toyota Corolla? Research shows that drivers of expensive cars are much more aggressive and selfish in traffic than those in cheaper cars. Investigating what has become known as the asshole effect, American psychologist Paul Piff researched whether assholes tend to become rich – or whether being rich in itself drives the asshole tendency. It turned out that the mere thought of feeling rich, or just being near symbols of money, makes you more antisocial, self-sufficient and unwilling to help others.

In a small classroom at the University of California, Berkeley, two students are playing Monopoly, and one of them has no earthly chance of winning: the game is rigged so that the other player is privileged throughout. Hundreds of Monopoly games with different players show the same thing – winning brings out the worst in the unfairly privileged player, who becomes arrogant, dominant and entitled to his victory, and even eats more of the shared candy. But winning has another unforeseen effect: when we win a competition, we subsequently have a greater tendency to cheat others in order to keep winning. In short, winning leads to more unethical behavior – even in an unimportant game. Now imagine what you would do if billions were at stake.

6S Model – ensuring transparency

It is easy to identify unethical behaviour in others but very difficult in ourselves. To combat this bias group, you might want to start by ensuring that individuals do not make decisions in cases where they have a personal interest, and then pick from these tricks:

Structure: You need transparency and simplicity everywhere – not least in your structure – to avoid ethical breaches. A flat organization with a stringent logic and a large span of control is preferable, possibly coupled with internal audit, whistleblower schemes and unbiased external support.

Steps: Just as with structure, transparency and simplicity are key. A strong process management approach can, by virtue of its rationality and simplicity, identify aberrant activities and prune them. But it cannot stand alone – Enron was well known for its standardized processes, where a customer claim was passed from department A to B to C and back to A with no way of communicating between them.

Style: Some of the best options for combating bounded ethics are ensuring diversity, broad compensation structures and a code of ethics that is constantly repeated and permeates the organization. Diversity means that we are not as easily talked into unfortunate actions as when we are surrounded by people similar to us. The compensation structure should reward groups instead of individuals, as cheating then requires a larger coalition to make sense. The code of ethics may seem weak at first glance, but Ariely's research shows that people take it very seriously if they are simply reminded that one exists.

This was the short version of the last chapter on the six major bias groups in the 5th best management book of 2016. Next up is the final conclusion. If you cannot wait, contact brian@behaviouralstrategygroup.com or +45-23103206.

Decision Strategy – chapter 7

Emotional biases enhance other bias groups and make your decisions for you

It is useless to attempt to reason a man out of a thing he was never reasoned into

Jonathan Swift

At the end of the 1970s a genius entrepreneur decided to beat IBM as the world's leading tech company within the next decade. You would be forgiven for thinking it was Steve Jobs, but it was actually Chinese-born inventor Dr. An Wang. Wang Labs was one of the first tech companies to advertise on TV, and its first ad cast Wang Labs as David and IBM as Goliath. From there a series of "Giant Killer" stories aired, culminating in a Wang helicopter shooting the smug IBM CEO.

Wang was as known for his bow ties as for his ingenious inventions, built on 40 patents and 23 honorary degrees. He was born in Shanghai in 1920 and entered the Chinese equivalent of MIT at 16. During his studies Japan invaded China and young Wang lost his entire family, but he graduated with top marks and immigrated, dirt poor, to the US, where he earned a PhD from Harvard. He opened a one-man company and invented magnetic core memory, which revolutionized pocket calculators and later minicomputers, including the word processors that replaced typewriters. His company grew 60% annually in the 1980s into a USD 2bn Fortune 500 company with 80% of the 2,000 largest US companies as clients – but disaster was lurking.

Until now Wang had capably foreseen technological developments, but the company was now, against all trends, pouring all its resources into the competitive word processing market – in fact, only IBM spent more on marketing. Meanwhile the personal computer market was booming, and Wang held several critical patents there, but he refused to change course: "personal computers are the dumbest thing ever!" Wang loved word processing. His feelings went far beyond pride and commercial interest – he was emotional and protective like a parent. When he finally saw the writing on the wall, it was his last shot, and he blew it on a proprietary operating system when everybody else used IBM-compatible systems. Sales nosedived, and Wang Labs went bankrupt in 1992.

Wang's problems were self-inflicted and more personal than business-related. Wang hated IBM after the company – according to his own biography – had cheated and humiliated him when he sold his core memory patent to IBM as a young man. He was determined to beat IBM, even if it would cost him everything. And so it did.

Emotional bias – your emotions decide for you

A leading neuroscientist conducted an experiment on the role of emotions in decision making – the Iowa Gambling Task. Participants sat down at a table with four decks of cards and were given USD 2,000 to try to grow. They were told that some cards would pay, say, USD 100 and others would cost, say, USD 100. What they did not know was that half the decks were stacked to create a surplus and the other half a deficit. As the players drew good and bad cards, their emotional responses were measured and they were asked about their feelings. In the beginning they drew cards at random and simply took notes, but as soon as they started drawing the costly cards, their emotions were activated, their heart rate quickening. After a while it was possible to observe higher emotional activity BEFORE picking from the bad decks. In fact, they started going after the good decks without being able to articulate why.
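To make the setup concrete, here is a minimal simulation sketch of such a stacked-deck task; the payoff numbers are illustrative assumptions in the spirit of the original task, not its exact values:

    import random

    # Two "bad" decks pay big per card but lose on average; two "good" decks
    # pay modestly and win on average (expected values of -25 and +25 per card).
    DECKS = {
        "A": lambda: 100 - (250 if random.random() < 0.5 else 0),    # bad
        "B": lambda: 100 - (1250 if random.random() < 0.1 else 0),   # bad
        "C": lambda: 50 - (50 if random.random() < 0.5 else 0),      # good
        "D": lambda: 50 - (250 if random.random() < 0.1 else 0),     # good
    }

    def play(n_draws=100):
        """Draw cards at random and return the final balance."""
        balance = 2000
        for _ in range(n_draws):
            deck = random.choice(list(DECKS))
            balance += DECKS[deck]()
        return balance

    print(play())

A purely random player breaks even on average; participants who learned to "listen" to their emotional tags drifted toward decks C and D and came out ahead.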

It might seem strange that emotions can make our decisions without us realizing it, but this is exactly what happens. Many scientists believe that our brains store memories of actions and associate emotions with them – known as emotional tagging. When we encounter similar situations, we recall our last action and our emotions, and the emotions then promote or warn us against those actions and thereby drive our decision – exactly as with Wang. Until recently behavioral economics focused only on cognitive decision processes, but researchers have now found that emotions not only drive our decisions directly through emotional tagging and enhance our existing cognitive biases like confirmation bias – we also have a whole host of individual emotional biases:

Status quo bias is our irrational preference for the current situation – any change from it feels like a loss – and it shows why some companies never innovate or change. In fact, according to McKinsey, the best way to predict a competitor's strategy is to look at what they have done recently. And as you run out of energy from a day's worth of decisions, your tendency to stay with the status quo increases.

Avoidance of regret describes how it is more painful to make a change and be wrong than to stick with current actions and be wrong. It can be seen when leaders stick with poor decisions, but a version of it also exists in competitive sports: everybody prefers the gold medal, yet silver medalists are actually more dissatisfied than bronze winners – the bronze winners are just happy they got something, while silver is agonizingly close to gold.

Insensitivity to numbers shows that we become less focused on, for example, saving human lives as the number of people in danger grows. Saving one life can seem huge, but saving 88 versus 87 lives seems relatively unimportant – which is why news outlets focus on stories rather than statistics: people relate to people, not to numbers.

Hyperbolic discounting means that we prefer to get a reward earlier rather than later, to the extent that even small delays cause us to discount the value of the reward substantially, whereas a much longer delay does not discount it much more. So if you are offered the choice between USD 100 now or USD 200 in one year, most people take the money now, although waiting would be the equivalent of earning 100% interest. But if you are offered USD 100 in one year or USD 200 in two years (also 100% interest), you would normally pick the USD 200.
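A minimal sketch of how a hyperbolic discount curve produces exactly this preference reversal, using the common one-parameter form V = A / (1 + kD) with an impatience parameter k chosen purely for illustration:

    def hyperbolic_value(amount, delay_years, k=1.5):
        """One-parameter hyperbolic discounting: V = A / (1 + k * D).
        k = 1.5 is an illustrative impatience level, not an empirical fit."""
        return amount / (1 + k * delay_years)

    # USD 100 now vs USD 200 in one year: the immediate option wins.
    print(hyperbolic_value(100, 0), hyperbolic_value(200, 1))   # 100.0 vs 80.0
    # USD 100 in one year vs USD 200 in two years: the same 100% "interest",
    # but now the larger, later reward wins.
    print(hyperbolic_value(100, 1), hyperbolic_value(200, 2))   # 40.0 vs 50.0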

Multiple selves is our tendency to hold two opposing sets of preferences, one for immediate satisfaction and one for future rewards. On the one hand you want to stay healthy and accelerate your career – on the other hand pizza and some easy work tasks sound great. On the one hand you want to watch that intelligent new French drama – on the other hand Rocky 5000 sounds so alluring.

Self-serving bias attributes positive events to our own personality or effort, while negative events are attributed to external factors. For example, if your project goes well, then of course it is because of you – but if it goes wrong, the reason is external, like poor partners or too few resources. If you are sued and win, 85% of you will expect the plaintiff to pay the trial costs – but if you sue someone and lose, only 44% of you believe you should pay.

6S model – building checks and balances

Management literature has recently gone through a theoretical discussion about whether employees should park their emotions at the office front door. Our view is that you cannot be split in two. Not only is it physically impossible, it is probably also incredibly difficult to make any decisions at all without emotions to guide you – and nothing great has ever come without passion. But we all have our strengths and weaknesses, and even strengths can backfire, so we need to keep them under control. Generally, you want to create checks and balances between different actors and activities in the organization – here are a few examples:

Strategy: When planning for an emotional organization, three things are particularly critical. First, you need to look broadly to minimize the chance of falling in love with a specific scenario. Second, the process must be fact-based, specifically on the critical areas that determine the direction of the company. Finally, your processes must be coherent, so that decisions in one part of the process are followed up with changes elsewhere – this way you avoid emotional decisions about sub-elements instead of the overall strategy. In practice, this can be done with qualitative and quantitative megatrend analyses, where the status quo is challenged through the assessment of alternatives. Coupled with the overall Playing to Win process, which excels precisely in its cohesiveness, this is a powerful solution.

Steps: Facts, facts, facts. It can be said that simply. And yet not quite, because you can use the Wisdom of Crowds to move away from an emotional estimate and towards a factual one. Many studies show that our biases can balance out in groups so that we get close to the right answer, and this is what the Wisdom of Crowds is about: when you want to learn about past facts, you ask an expert, but when you want to forecast the future, experts are about as good as a chimp throwing darts – instead, take the average opinion of 20 ordinary people. This does not mean that intuitive and emotional decisions have to be parked; they are actually very important, as long as they are connected to fact-based systems and you draw on both – a good rule of thumb is a 50-50 average of your intuitive estimate and what comes out of a model.
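As a sketch, the mechanics are as simple as they sound: average the crowd, then blend the result with your gut feel. The 50-50 weight is the rule of thumb from the text; the forecast numbers are made up:

    def crowd_estimate(estimates):
        """Wisdom of Crowds: the plain average of many independent estimates."""
        return sum(estimates) / len(estimates)

    def blended_forecast(intuition, model_output, weight=0.5):
        """A 50-50 blend of an intuitive estimate and a model or crowd output."""
        return weight * intuition + (1 - weight) * model_output

    # Hypothetical example: 20 people forecast next year's sales (USD m).
    forecasts = [95, 110, 102, 98, 120, 105, 99, 101, 97, 115,
                 108, 93, 104, 100, 112, 96, 103, 107, 94, 109]
    crowd = crowd_estimate(forecasts)                            # 103.4
    print(blended_forecast(intuition=120, model_output=crowd))   # 111.7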

Skills: Two competencies are critical here. On the one hand, key people must be trained in classical problem solving, with the ability to define the problem razor-sharply, break it down into smaller parts, and develop solution trees, priorities, analyses and pyramid-structured presentations. On the other hand, the people responsible for change must be well versed in behavioral change management and, not least, think in implementation solutions as soon as the analysis phase starts.

I hope you enjoyed this short version of chapter 7 of the 5th best management book of 2016. Next up is bounded ethics – a scary look at the dark side of business and people. Contact brian@behaviouralstrategygroup.com or +45-23103206 if you are too concerned to wait.

Decision Strategy – chapter 6

You are twice as excited about losing something as you are about gaining the same thing

If there is a 50-50 chance that something can go wrong, then nine times out of ten it will.

Paul Harvey

In the 1980s, Coca-Cola CEO Roberto Goizueta was deeply concerned about the future. Had time run out for the Coca-Cola that Dr. John Pemberton had brewed together in his three-legged brass pot back in 1886? In recent years, Coca-Cola had lost significant ground to arch-rival Pepsi, despite having much broader distribution and spending at least $100 million more annually on marketing. At this difficult time, Pepsi was rubbing salt in Coca-Cola's wounds with its national TV commercials, the Pepsi Challenge, where in blind tests dedicated cola drinkers consistently preferred Pepsi. Coca-Cola's immediate reaction to the Pepsi Challenge commercials was to blatantly reject the results in public, but the internal concern was growing.

Coca-Cola's mystique had always centered on the famous secret recipe, unchanged in the 99 years since Dr. Pemberton developed it. In a world where it was customary to relaunch popular products as "new and improved" versions, the unique thing about Coca-Cola was that it was never new. But Goizueta was not called the "President of Change" for nothing. Early in his tenure, he promised that there would be "no sacred cows, including the recipe for our products." He began to shake up the company's traditions, introducing Diet Coke, Cherry Coke and more. Now Coca-Cola embarked on systematic market research that confirmed the Pepsi blind tests, and Coca-Cola's scientists began fiddling with the legendary secret recipe, making it more like Pepsi. Instantly, Coca-Cola's market researchers saw improvements in the blind tests.

In September 1984, they tested what ended up being the final version of New Coke, followed by one of the most expensive market research studies in history, including 200,000 blind tests across North America. Here, New Coke beat Pepsi by 6-8 percentage points, with only 10-12 percent of the taste test participants strongly opposed to changing the Coca-Cola recipe, so Goizueta gave the green light. At the launch press conference, Goizueta called New Coke "the safest move the company had ever made." Yet New Coke became a disaster. Angry cola drinkers demonstrated throughout the United States and began hoarding boxes of the old cola. A black market for the old Coke emerged, where a box went for $30, and more and more people found ways to import it from abroad. Coca-Cola's customer service received over 60,000 angry calls from cola drinkers, while a group of them sued Coca-Cola, reasoning that "when [Coca-Cola] took old Coke off the market, they violated my freedom of choice. It is as basic as the Magna Carta, the Declaration of Independence. We went to war with Japan to defend this freedom." Just 79 days after launch, Goizueta was forced to withdraw New Coke and reintroduce the original recipe as "Coca-Cola Classic". Despite an inferior product in blind tests, Coca-Cola is still the dominant soft drink in the world, and this marketing blunder of the century shows how difficult it is to understand what people actually think.

Loss aversion – prefer to avoid loss over gain

From the stocks we invest in, over the projects we own, to the special variety of Coke we have always drunk: once we have something, we value it much more highly than before we had it, regardless of objective value. This is called the ownership effect (or endowment effect) and stems from loss aversion; Richard Thaler named the effect around 1980, it was later popularized in the bestseller "Nudge", and it has since been documented in hundreds of studies. In the most famous, Daniel Kahneman and colleagues ran a simple experiment testing students' willingness to swap two identically priced products. One group was given a coffee cup, the other a chocolate bar, and each group was then offered the chance to swap with the other. Before receiving either product, about half preferred the coffee cup and half the chocolate bar, but now only 10% of each group was willing to swap their newly acquired product. The ownership effect means that once we have received something – however temporarily – we attach ourselves to it and protest against changes that threaten to remove it. Marketers know that it takes oceans of work and money to get people to try something new, even if it is better – let alone change their habit. In that context, the whole idea of a Pepsi versus Coca-Cola blind test becomes a little silly, because nobody drinks their Coca-Cola blind.

But how could Coca-Cola's extensive research overlook the information they had right under their noses? It comes down to the way losses affect our judgment. Say you face a choice: you can get 1 million kroner now, or a 50-50 chance of winning 2 million kroner or nothing. Which do you choose? Most of us prefer a sure win to a bigger but more uncertain reward. Conversely, if you get the choice between definitely losing 10,000 kroner and a 50-50 chance of losing 20,000 kroner or nothing, most people will actually take the chance despite the potentially bigger loss. This phenomenon is known as loss aversion, originally described in Daniel Kahneman and Amos Tversky's landmark 1979 article on prospect theory. When Coca-Cola faced losing its market position, they bet big. But would they have shown the same willingness to take risks at the prospect of a gain? 99 years of dominating the soda market without a single change to the recipe suggests they would not. In fact, people are about twice as averse to a loss as they are attracted to the same gain, but that is not all – loss aversion is a group of several biases:

Ownership effects – my house is above market

Also known as divestiture aversion: we attribute more value to things simply because we own them, as seen above. The business world has long since seen the light of the ownership effect when it comes to enticing you to buy products. For example, when you buy a computer from Dell, you get a 90-day trial of an anti-virus program from Norton, which makes you more likely to purchase the software at the end of the free trial than if it had never been offered. The ownership effect is further strengthened if we also put energy into making the thing ourselves – known as the "IKEA effect". In a variety of experiments, participants were set to assemble IKEA products or build things from Lego, after which they felt their amateur projects had a value similar to products made by experts.

Framing effects – 90% fat free trumps 10% fat

People react to a particular choice in different ways depending on how it is presented: as a loss or as a gain. We tend to avoid risk when a choice is framed positively, but seek risk when it is framed negatively. For example, it feels mentally more dramatic if you, as the leader of a merger, communicate "we unfortunately had to fire 30% of the employees" instead of "we succeeded in saving 70% of the jobs". The math is exactly the same, but the first sounds like a loss, the second like a win.
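Kahneman and Tversky's prospect theory captures both halves of this pattern in a single value function that is concave for gains but convex and roughly twice as steep for losses. A minimal sketch using their commonly cited parameter estimates (alpha ≈ 0.88, lambda ≈ 2.25):

    def prospect_value(x, alpha=0.88, lam=2.25):
        """Prospect theory value function: v(x) = x**alpha for gains and
        v(x) = -lam * (-x)**alpha for losses, with the Tversky-Kahneman
        (1992) parameter estimates as defaults."""
        return x ** alpha if x >= 0 else -lam * (-x) ** alpha

    # A sure gain of 1.0m vs a 50-50 shot at 2.0m: the sure thing feels better.
    print(prospect_value(1.0), 0.5 * prospect_value(2.0))          # 1.00 vs ~0.92
    # A sure loss of 10,000 vs a 50-50 shot at losing 20,000: the gamble
    # now feels less bad, so we take the risk.
    print(prospect_value(-10_000), 0.5 * prospect_value(-20_000))  # ~-7450 vs ~-6855

The same function explains framing: describing an outcome as a loss pushes it onto the steep side of the curve, which is why "30% fired" stings more than "70% saved" reassures.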

Sunk cost & escalating commitment – say no more

We tend to invest money, time and resources based on past investment decisions whose costs cannot be recovered and which therefore should not be part of our decisions about the future. For example, once we have invested in a startup, we find it easier to invest a second, third and fourth time in the same company, because we do not want our previous investment to be "lost" – even though sunk investments cannot be regained and thus should not enter into considerations about future investments. We escalate commitment to our initial decision instead of changing course, even as the negative outcomes of staying the course pile up. This occurs, for example, when we have committed time and resources to hiring an employee who turns out to be incompetent, a new factory that does not deliver the results we hoped for, or a new product that fails to perform.
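The antidote is mechanical: compare options only on their future costs and benefits and deliberately leave out what is already spent. A small sketch with made-up numbers:

    def best_option(options):
        """Pick the option with the highest future net value. Money already
        sunk is the same no matter what we choose next, so it can never
        change the ranking and is deliberately left out."""
        return max(options, key=lambda o: o["future_value"] - o["future_cost"])

    # Hypothetical choice: the 4m already sunk into the project is irrelevant.
    options = [
        {"name": "continue project", "future_cost": 2.0, "future_value": 2.5},
        {"name": "switch to new product", "future_cost": 1.5, "future_value": 3.0},
    ]
    print(best_option(options)["name"])   # switch to new product (net 1.5 vs 0.5)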

Mental bookkeeping – easy come easy go

Also known as the two-pocket theory, mental bookkeeping occurs when we place our money in separate categories or mental accounts based on where the money comes from or what it is intended for. For example, every month we allocate our salary to a number of fixed expenses and take a sober approach to our money, but if we suddenly receive an unexpected windfall, we may just blow it, because it feels like a different kind of money. The adage "easy come, easy go" is exactly this kind of mental bookkeeping.

Scarcity trap – last shirt is most valuable

When we experience scarcity – of time, money, relationships, calories – we tend to concentrate our thoughts on the scarce good, which actually makes us better at assessing what that good is worth. But it also means we tend to become short-sighted and weak-willed and to develop tunnel vision, because we spend so much mental bandwidth on the scarce resource that we have less capacity left for other important decisions. Marketers normally work to match supply to demand, but by using the illusion of scarcity they can accelerate demand. Excellent examples are the launches of iPhones or Harry Potter books, where the pre-launch campaigns were designed not only to increase demand but also to create the illusion that supply would be limited.

In many situations, the mere thought of scarcity can measurably reduce our intelligence. In tests among shoppers in US shopping malls and farmers in the fields of India, participants were asked about their income and then discreetly classified as either poor or rich. Then they were given a scenario: "Your car needs a repair that will cost you $150 out of pocket. You can choose to take out a loan, pay in full or defer the service. What do you choose?" After answering, the participants had their fluid intelligence and self-control measured with a Raven's matrices test. When the repair cost only $150, the poor and the rich did equally well on the test. But when the researchers changed the $150 to $1,500, something significant happened: rich participants performed as well as before, but poor participants did not. The mere thought of facing a $1,500 expense put so much pressure on their bandwidth that their fluid intelligence score dropped 13-14 IQ points. To put that in perspective, a loss of 13 IQ points can move you from "normally intelligent" into the category of the 5-7 percent least intelligent in society.

6S model – think expected benefits

A simple but effective rule for optimized decisions is to always base the decision on the option with the highest expected benefit. In economics, game theory and decision theory, the expected utility hypothesis describes people's preferences when faced with choices with uncertain outcomes, i.e. bets. You arrive at the expected benefit of an uncertain choice by multiplying each possible outcome by its probability and summing. The expected benefit model says that 1 million kroner is worth twice as much, and provides twice as much benefit, as 500,000 kroner. On paper that seems obviously true, but people do not feel twice as much pleasure from a gain that is twice as great. This is due to the "declining marginal utility of gains": the more we get of something, the less joy each additional unit gives us.
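A minimal sketch of the difference, using a square-root curve as a stand-in for declining marginal utility (the curve and the numbers are illustrative assumptions, not canonical choices):

    from math import sqrt

    def expected_value(outcomes):
        """Expected benefit: sum of probability * outcome."""
        return sum(p * x for p, x in outcomes)

    def expected_utility(outcomes, u=sqrt):
        """Same, but each outcome passes through a concave utility curve,
        modeling declining marginal utility (sqrt is an illustrative choice)."""
        return sum(p * u(x) for p, x in outcomes)

    sure_thing = [(1.0, 1_000_000)]
    gamble = [(0.5, 2_000_000), (0.5, 0)]
    print(expected_value(sure_thing), expected_value(gamble))      # equal: 1,000,000
    print(expected_utility(sure_thing), expected_utility(gamble))  # 1000 vs ~707
    # Equal expected value, but the sure million carries more expected utility,
    # which is why most of us take the certain win in the choice above.

As this is unnatural to us, it should be incorporated in the organization's levers – here are some examples: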

Strategy: We might be tempted to believe that strategy in the risk-averse organization is simply the opposite of strategy in the over-optimistic one. But risk aversion is not a bias. It is a risk preference, and the task is not to "overcome" our propensity but to compare value and risk across options, so that we do not stumble into the accompanying biases such as framing, the ownership effect and so on. A good place to start the strategy analysis is to move beyond the expected scenario and include best- and worst-case scenarios and wild cards, so they get thorough processing and are not just "analyzed" superficially in the risk-averse leader's head. A later important step is to go into detail with the Playing to Win strategy, where you investigate how you can win on the game board you are betting on – and not least what that requires of you as an organization. It can also take the form of a series of high-risk/high-return investment projects versus low-risk/low-return ones, as long as the options are easy to compare on relatively objective criteria.

Systems: One of the things that can go really wrong here is the feeling of insecurity that leads us to constantly seek more information in order to be completely sure of the decision. The future is difficult to predict, so at some point the analysis must stop and the management must show why it is there. If this more-and-more-information approach is allowed to slip into system design, you are quickly faced with a very complex and heavy system that drains the energy from the organization. Therefore, it is important to think simplification: reduce information retrieval to the absolutely critical issues and maintain that discipline over the lifetime of the system.

Skills: In risk-averse organizations it can be difficult to attract people with the opposite approach, but that is exactly what you need: inspiration from creative thinkers, innovators and entrepreneurs, as well as over-optimistic people from, for example, the management of role-model and innovative companies in similar industries. It will be like a necessary vitamin injection, giving your organization a significant boost for a period. To make the effect last, start by bringing these people in for inspiration workshops, later for longer strategy projects, and finally into protected units – just as when you establish a business unit that is very different from the core business. When you are ready, merge the protected unit with the core business.

This was the abbreviated version of chapter 6 of our awarded book Decision Strategy. Next week we will focus on emotional biases – maybe the most challenging of all the bias groups. If you cannot wait… contact brian@behaviouralstrategygroup.com or +45-23103206.

Decision Strategy – chapter 5


We are biased to believe that we are better than everyone else.

If confidence is good, then overconfidence must be…

"All you need in life is ignorance and confidence, then success is sure"

Mark Twain

"The game is over", said Mohammed al-Douri, Iraq's ambassador to the UN, in April 2003. After only three weeks of fighting, US forces occupied Baghdad, and Saddam Hussein was caught in a cellar on the outskirts of Tikrit later that year. But while the invasion turned out to be a great success, the war that followed was anything but. Washington expected Saddam's dictatorial regime to crumble as soon as America set foot in the country, but the Shiites did not rise, the Sunnis fought fiercely, the Iraqi guerrilla war surprised the unprepared forces, and there were no weapons of mass destruction. How could the US be so wrong at a time when its intelligence community was better equipped than ever?

It all started with a few neoconservatives – the most prominent being Deputy Defense Secretary Paul Wolfowitz – who had long been convinced that expelling Saddam Hussein would pave the way for a grand reorganization of the Middle East, moving it away from tyranny and anti-Americanism and toward modernity and democracy. But maybe the most prevalent reason to attack Iraq was that it would be a courageous use of American power, mixing raw strength with idealism after the 1990s' years of retreat and the terrorist attacks of September 11. A combination of ambiguous intelligence and a strong belief that one could easily "smoke the bad guys out" underpinned rosy scenarios and poor planning – for example, the original game plan called for 500,000 soldiers but was cut to 160,000 for three months in the final approach – and it took eight years before the last soldiers were withdrawn.

The reason we often enter hopeless wars lies in our tendency to be overconfident. Although a more realistic assessment of the situation and the alternatives could lead to more peaceful solutions, our view is obscured by positive illusions, wishful thinking and overconfidence. Historical data show that before the start of a war, each side is typically convinced that it has more than a 50 percent chance of winning!

Our tendency to overestimate our skills, the accuracy of our judgments or the value of our ideas is not limited to war. It is human, and it is found across time, cultures and circumstances. Even the best-laid business plans are often ruined by the annoying interference of reality. There is an abundance of examples of overconfidence in professional contexts: stock market bubbles, the number of new entrepreneurs setting up shop despite 90% failure rates, and the many high-profile acquisitions undertaken despite a high risk of failure.

But could overconfidence not be a good thing? Is it not our self-belief that makes us rise to the challenge and push beyond our perceived limits? The short answer is no. With the exception of some limited benefits in innovation processes, overconfidence is one of the biggest obstacles to good decision making.

Although overconfidence can be fatal to decision-making processes, it is almost ironic how easy it is to correct. Churchill once said that no matter how certain you feel about victory, there would be no war if the other side did not also think it had a good chance. If only the Americans had remembered to ask themselves a few perspective-changing "what if" questions, they could have avoided some of the biggest challenges. For example: why did Saddam Hussein dare risk a war? Given the loss of the Gulf War in 1991 and a widening gap between Iraqi and US forces, the outcome of the invasion was a given – but the outcome of the war was not.

When discussing overconfidence, there are three different variants: (1) overestimation of our actual performance, (2) overplacement of our performance relative to others, and (3) overprecision, that is, exaggerated confidence in the accuracy of our beliefs:

Overestimation – the reason projects run late and over budget

You are preparing slides for a last-minute presentation. It is the night before, but you do not worry much, because you are an experienced speaker. Although it is a new topic, you got slides from the previous lecturer, and you usually get good feedback. When you take the stage, you realize that your lecture is completely off target: you were not prepared for the participants' prior knowledge of the subject, you have not quite understood all the points in the slides you inherited, and 45 agonizing minutes feel like a lifetime.

We tend to believe that we are better across a wide range of domains than we actually are, that we have more control over situations than we actually have, and that we can plan things out in great detail. This planning fallacy is the reason projects come in late and over budget.

This is also yet another reason that bonuses can be difficult to use as a motivator. You may receive a bonus of USD 10,000 – but had you expected 5,000 or 15,000? The easiest way to exceed our expectations is to reduce them, and we have therefore developed defensive pessimism: while we start a year or a project out optimistic, we shield ourselves from disappointment towards the end through pessimistic assessments of ourselves and our possibilities, because it feels extra hard when our inflated beliefs meet reality.

Overplacement – the reason we enter hopeless projects

93% of motorists believe they are better than average, and 25% of students consider themselves in the top 1%. Besides making performance appraisals difficult for managers and employees to agree on, this means that entrepreneurs enter markets where their objective chances of success are limited, and that we pursue hopeless lawsuits at great cost.

Interestingly, recent research also underlines our tendency to believe we are performing worse than others when it comes to very difficult tasks. Both extremes are problematic: while overplacement may lead us to throw money at bad projects, underplacement may prevent us from pursuing great ones – whether in business or in life.

Overplacement can create nasty surprises when we suddenly face reality. For example, highly intelligent people joining Ivy League schools often drop out when they finally meet real competition. We simply tend to focus on ourselves instead of comparing ourselves with the particular group we belong to – or we think of our own team as more competent than average. That would be fine, except that the other talented teams may impact our own team's opportunities. In the business world this leads to a lack of understanding of the market and competitors.

Overprecision – never ask experts to predict the future

Throughout history, experts' overly precise estimates have proven wrong again and again. Take the neoclassical economist Irving Fisher, who became famous for saying, shortly before the Wall Street crash of 1929, that stock prices seemed to have reached a permanently high plateau. Or Harry Warner, one of the founders of Warner Bros, who in 1927 rhetorically asked who on earth wanted to hear actors talk. Or Thomas Watson, the longtime head of IBM, who in 1943 predicted that there might be a world market for five computers.

Of course everyone can make mistakes in areas where they have limited knowledge, but the question is whether expertise buys us more precision. Studies show that in areas where we have prior knowledge or expertise, we do land closer to the correct answer. The problem is that as experts we also become more cocksure about our ability to predict the future and therefore define overly narrow confidence intervals (i.e. how sure we are of our estimate), so we still miss the target. A single point estimate, however expert, has a tendency to be wrong.
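A small simulation sketch of that failure mode: when reality varies more than the expert believes, a "90% confidence" interval built on the believed spread captures far fewer than 90% of outcomes. The spreads are illustrative assumptions:

    import random

    TRUE_STD = 20.0       # how much outcomes really vary
    BELIEVED_STD = 10.0   # the overprecise expert's belief

    def hit_rate(trials=100_000):
        """Share of outcomes that land inside the expert's '90%' interval."""
        hits = 0
        for _ in range(trials):
            outcome = random.gauss(0, TRUE_STD)
            # A 90% interval under the believed spread: +/- 1.645 std devs.
            if abs(outcome) <= 1.645 * BELIEVED_STD:
                hits += 1
        return hits / trials

    print(hit_rate())   # ~0.59: the "90%" interval covers only ~59% of reality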

The problem is further cemented by human interaction patterns. Few voters, for example, will choose a politician with the slogan: "I think it's this way, but I'm not sure." We feel that confident people are more persuasive and more competent, and we reward them with influential positions. When confidence and capability are positively correlated, that makes sense. But people quickly learn that to succeed, they must adopt a "fake it till you make it" approach.

6S Model – bring in the devil's advocate

Overconfidence is not the hardest bias to "fix". Often we just need to consider worst-case scenarios and adverse information, or adjust our project estimates by a standard percentage, to minimize the effect. Here are a few examples from the 6S parameters of Strategy, Structure, Steps, Systems, Skills and Style:

In Strategy you might, towards the end of the process, deploy Gary Klein's "premortem" technique, where you imagine that the project has failed and analyze the reasons behind the failure. It is a clever way to invite in the devil's advocate without strong opposition, and it is great at uncovering the biases and challenges that may threaten your strategy or project.

In Systems, over-optimistic organizations often overestimate their abilities and underestimate how long things take. While you should always start with relatively detailed plans to increase realism, make sure to build flexibility in around those very precise estimates, and consider whether decisions in the system should be "opt in", where you must actively choose a direction, rather than "opt out", where you can let System 1 press the accept button without reflecting on the consequences.
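As a sketch of the design difference (the flow and names are hypothetical, not from any specific system):

    # Opt-out: acceptance is the default, so System 1 can wave things through.
    def approve_opt_out(decision, override=None):
        return override if override is not None else True

    # Opt-in: there is no default; the call fails loudly until someone makes
    # an explicit choice, forcing a System 2 moment of reflection.
    def approve_opt_in(decision, explicit_choice=None):
        if explicit_choice is None:
            raise ValueError(f"'{decision}' requires an explicit yes or no")
        return explicit_choice

    print(approve_opt_out("extend project deadline"))        # silently True
    print(approve_opt_in("extend project deadline", True))   # deliberately True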

This was the short version of chapter 5 in our book, Decision Strategy. Next week we will look at the power of loss aversion and the crazy lengths we go to in order to avoid it – stay tuned!

… and as always when you cannot wait, contact brian@behaviouralstrategygroup.com or +45-23103206.

Decision Strategy – chapter 4


Confirmation bias is hard to see

When confirmation bias locks you on to a dangerous path

"We don't believe the world we see; we see the world we believe"

The Matrix

On Monday, September 23, 2013, CEO Thorsten Heins faced one of the biggest decisions of his professional life: should he end several years of financial struggle and sell BlackBerry to the Canadian holding company Fairfax Financial? A special task force had examined the strategic alternatives, and the best course of action seemed to be to sell BlackBerry for USD 4.7 billion, about 3 percent above Friday's closing share price. If the deal was accepted, BlackBerry would become a private company, away from Wall Street pressure, but four days before the completion of due diligence Heins was fired, and the new CEO, John S. Chen, instead raised a USD 1 billion cash injection. For a company that only a few years earlier had been the world's leading smartphone maker – with 41% of the US market, the highest company valuation in Canada and a ranking as the fastest-growing company in the world – how could this happen?

BlackBerry's decline is of course a perfect case study in what happens when a technology giant fails to innovate in a market evolving at breathtaking speed. Amid the success, investors warned about the increasing competition from iOS and Android, but BlackBerry maintained its strategy. Only when the iPhone began to gain popularity from 2007 and challenge the BlackBerry did reality hit. But the challenge is not exclusively reserved for fast-paced technology markets – in fact, other markets may be even more prone to it: put a frog into hot water and it immediately jumps out, but submerge it in water at normal temperature and it will stay put even as you slowly heat the water (this has actually not held up scientifically, but it is a great analogy).

The challenge is called confirmation bias: we believe what we see – and we see what we believe. We seek information that confirms our expectations and play down the aspects inconsistent with them. It works against innovation, because creative thinkers use information to re-evaluate ideas and avoid status quo scenarios. Innovative thinking is costly and difficult because confirmation bias is part of our inner mechanics; it is all too easy to stop innovating and instead reproduce earlier successful ideas. This is both the reason that giants fall and the reason that 90% of new ventures fail – we are only human after all.

Biased information retrieval

We test our hypotheses by searching for confirmatory information – consistent with our limited attention and cognitive processing capacity, which compel us to seek information selectively. This is also one of the reasons that newspaper readership often splits along political leanings – voters prefer to confirm their positions rather than undermine them.

Confirmation bias does not only apply to long-held beliefs. Often it attaches to an attitude we have only just formed. For example, if we are about to meet someone for the first time and, just before the meeting, hear a colleague describe the person as dishonest, we will immediately start looking for confirming information – and surprise, surprise, we find it.

Biased interpretation

Biased interpretation occurs when two people with exactly the same information draw different conclusions, each consistent with their prior beliefs. A 1979 Stanford study asked students to evaluate the US death penalty based on mixed research on the deterrent effect of capital punishment, and both prior proponents AND opponents maintained their previous positions – in fact, they left the experiment even more convinced.

Biased memory

Even if we collect and interpret information in a neutral way, we tend to remember it in a way that strengthens our beliefs. Merely considering a hypothesis makes the information stored in our memory that is consistent with the hypothesis more accessible. Although we often have the feeling that we remember past events correctly – especially highly emotional events such as first holding our baby in our arms, the day we were married or being told we had been promoted – tons of research shows we cannot give an accurate description of those events.

6S Model

To minimize confirmation bias, it is critical that your organization is geared to be open and even proactive towards alternatives, surprises and disagreements across the 6S parameters of Strategy, Structure, Steps, Systems, Skills and Style.

In Strategy there is a myriad of tools, concepts and even schools, but not all are equally valuable or relevant to your business. It is not always obvious which is the strongest solution, but once you have bought into a particular tool, you automatically begin searching for affirmative arguments that exactly the strategy method you have chosen is the best hammer for all your different challenges. Playing to Win is a strong approach because each step – from defining the winning ambition, through selecting where to play and how to win, to designing core capabilities and management systems – is designed to test the previous step.

In Steps (aka processes) we face repeated decisions where the answer is often the same. This reinforces confirmation bias – it takes a lot to answer no when you have just answered yes 1,000 times. The classic scientific and consulting problem-solving approach is to look first for how you can disprove your hypothesis. A simple way to ensure this is a checklist spurring just enough conscious thought to avoid disasters, as sketched below.
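A sketch of what such a checklist could look like when wired into a recurring approval step; the questions are illustrative:

    # A disconfirmation checklist: a recurring decision is approved only after
    # every "how could we be wrong?" question has received a considered answer.
    CHECKLIST = [
        "What evidence would prove this decision wrong?",
        "Did we actively look for that evidence?",
        "Has anyone outside the team argued against it?",
    ]

    def approve(decision, answers):
        unanswered = [q for q, a in zip(CHECKLIST, answers) if not a]
        if unanswered:
            raise ValueError(f"Cannot approve '{decision}': {unanswered}")
        return True

    approve("renew supplier contract",
            ["a cheaper benchmark exists", "yes, in the Q2 review", "procurement did"])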

In Style (aka culture) we cannot stress the importance of diversity enough, but there will be no positive results unless you insist on maintaining the constructive disagreement that arises from diversity. Even with a healthy diversity approach, individuals may unconsciously sabotage the opportunity – for example, when the manager opens with his or her own opinion and then asks for alternative views, funnily enough no good discussion gets going. In this big-data-focused age, it is also important to remember that while facts are critical, organizations focused only on data are paving the way to their own personal confirmation bias hell. Data cannot replace common sense and good decision processes – not everything that can be measured is important, and not everything that is important can be measured.

This was the abbreviated chapter 4 of our acclaimed book "Decision Strategy" – for the next two weeks we take a summer break. Can't wait? Then contact brian@behaviouralstrategygroup.com or +45-23103206.

Decision Strategy – chapter 3


Bounded awareness allows razor-sharp focus – but it also means tunnel vision, so you do not see what is just outside your field of focus.

When bounded awareness narrows our vision

"Facts matter not at all. Perception is everything. It's certainty"

Stephen Colbert

On 28 January 1986, all employees at the Kennedy Space Center in Florida were busy preparing the launch of the space shuttle Challenger. A thorough flight readiness review had been undertaken, and the spacecraft had been cleared. Lift-off had been postponed five times due to bad weather, but today was a clear day – albeit the coldest on which NASA had ever launched. The event was highly televised, as a civilian was on board for the first time ever to help NASA regain financial support. At 11:38 the Challenger left Pad 39B and almost immediately disaster struck: 73 seconds into the flight, the Challenger exploded in a ball of fire, killing the entire crew. How could this happen to one of the world's most professional and highly regarded organizations?

Highly focused on saving the space program and using the Challenger as a key marketing vehicle for that purpose, management under-analyzed the risks and ignored engineering concerns about the O-rings' temperature requirements. You would think such an everyday failure could not happen in such a critical situation, but bounded awareness means that people in the same organization, with similar skills and knowledge, can draw different conclusions. Because our bandwidth is limited, we can, amid a mix of chaotic information, quickly come to see important communication as trivial and thus underestimate risks. At NASA it was not technical skill or knowledge that separated the leaders from the engineers, but rather the narrow vision that comes in the wake of strong focus – management focused on saving the space program and on previous successes, engineering on the recent O-ring reviews.


If you are in doubt about whether this applies to you, watch the colour changing card trick video and honestly ask yourself whether you are one of the few who catch it…

Bounded awareness can occur at various stages of your decision making, and we like to distinguish between seeing, seeking, selecting and sharing information – or 4SI for short:

Error in Seeing Information

Our ability to focus on one task is undoubtedly useful, but it can also limit our awareness of peripheral threats and opportunities in the business environment, and thus our ability to craft a strategic response.

Now, if you do not see it often, you often do not see it – but we can learn to become more aware of changes in our environment: military personnel can be trained to scan a crowd for suspicious behavior, leaders can hone their awareness of critical information, and organizations can set up early-warning signals for key changes in their environment.

Error in Seeking Information

The Challenger disaster demonstrates what can happen when professional and well-meaning leaders limit their analysis and fail to seek out the most relevant information. It is not difficult to make the connection to the recent acquisition of Nokia, where the CEO exclaimed: “we did nothing wrong, yet we lost”.

That said, how can we be expected to seek out information which by its nature is beyond our awareness? The most important thing is to be vigilant in reflecting on what information is actually relevant for the decision you must make. As a manager you often see recommendations reach your desk supported by a significant amount of data – a quick trick here is to be skeptical about the absence of contradictory evidence.

Error in Selecting Information

It may be hard to believe, but we ignore many valuable and accessible pieces of information about changes among customers, competitors and other stakeholders when making important decisions – particularly when we are successful. Swiss watchmakers held a 50 percent market share before 1960, yet despite being the first to develop quartz technology, they saw their share cut by two thirds in less than 10 years by foreign competitors using that very technology.

One way to determine whether the information at your disposal is useful is to think about how the other parties involved will act. If you are in a negotiation, how will the counterpart assess the deal you are negotiating? One method is to understand the links between all the relevant information by not only focusing on cause-effect relationships but also bringing other contextual factors into play.

Error in Sharing Information

If we succeed in seeing, seeking and selecting the right information in an unbiased way, research suggests that we still have a problem: our cognitive limitations prevent the unrestricted exchange of information. When team members discuss available information, they omit the unique pieces that could make the difference. Why? Because it is much easier to discuss common information – and it is often better rewarded.

There are many ways to integrate diverse knowledge in groups, but one of the simplest approaches is to set meeting agendas with specific points for soliciting individual views, or to make a person or department responsible for sharing valuable knowledge.


6S Model

There are many individual approaches to avoid or limit bounded awareness. We use the 6S model to apply the right type of method to the right type of problem. Are we dealing with an issue within Strategy, Structure, Steps (aka process), Systems, Skills or Style (aka culture)?

For example, in Strategy a strong approach is to counter bounded awareness with a highly focused megatrend analysis of the company and industry in question, showcasing the key issues and relevant scenarios facing the company over the next 5+ years, so the strategy is based on solid ground.

Another example is Structure, where companies may have a tendency toward proliferation, inequitable resource allocation or the absence of significant branches. The best thing to do is run periodic due diligence on your structure: a reasonable number of organizational layers, the average span of control, a relevant mandate in each role, etc.

A final example is Systems: a system configured incorrectly has the special ability to get you very far away from your goal very quickly. This is because a system’s primary purpose is to solve tasks fast, but if “the GPS” is set incorrectly, you can quickly end up somewhere completely different than you expected. Your best option here is to track both lead and lag indicators, measuring both the end result and the process of getting there, similar to a Balanced Scorecard. That allows you to create an Early Warning System to correct course at the right time, but the trick is to focus on a handful of KPIs – if you focus on everything, you focus on nothing.
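As a minimal sketch of such an early warning system – all KPI names, thresholds and numbers below are hypothetical illustrations, not examples from the book – the idea is simply to track a handful of lead and lag indicators against thresholds and flag deviations early:

```python
# Minimal sketch of an early warning system: a handful of lead and lag
# KPIs, each with a threshold, checked periodically. All names and
# numbers here are hypothetical.

from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    kind: str         # "lead" (predicts the result) or "lag" (measures it)
    threshold: float  # minimum acceptable value
    actual: float     # latest measurement

kpis = [
    KPI("sales pipeline value (DKKm)", "lead", threshold=50, actual=42),
    KPI("customer satisfaction score", "lead", threshold=8.0, actual=8.3),
    KPI("quarterly revenue (DKKm)", "lag", threshold=120, actual=125),
]

# Flag every KPI below its threshold, so course corrections can happen
# while the lead indicators are red but the lag indicators still look fine.
for kpi in kpis:
    status = "OK" if kpi.actual >= kpi.threshold else "WARNING"
    print(f"[{status}] {kpi.kind:4} | {kpi.name}: {kpi.actual} (min {kpi.threshold})")
```

The design choice that matters is the short list: a handful of indicators you actually act on, rather than a dashboard of everything.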

Bounded awareness can break organizations in industries undergoing significant change. The solution is, not surprisingly, to work both systematically and continuously to expand the organization’s awareness.

This was the summary of chapter 3 of our book, Decision Strategy. Next week we will look at how your Confirmation Bias keeps your bad ideas alive far past the point of disaster and what to do about it – stay tuned!

As always if you prefer to wait for disaster to strike, then do not contact brian@behaviouralstrategygroup.com or +45-23103206:)

Decision Strategy – chapter 2


95% of our 35,000 daily decisions are taken subconsciously by our system 1

How does our brain wreak havoc on strategy?

”When faced with a difficult question, we often answer an easier one instead”

Daniel Kahneman

Christine, the director of a mid-sized media company, is in doubt about whether to fire Anna, the marketing director. In recent years Anna has not delivered more than the minimum requirements. She is in every way talented and intelligent and has a knack for finding inexpensive, powerful marketing solutions, but she rarely takes the initiative and is often critical of other employees. The challenge is that Anna is difficult to replace and is the only one who can maintain the company’s critical partnerships. What would you advise Christine to do?

If you reflect a little on your mental activity while reading this introduction, it is remarkable how quickly you formed an opinion. Maybe you would advise her to fire Anna, or to have a chat with Anna and give her a final chance. But you are unlikely to be completely undecided. Our consciousness is usually in a state where we have intuitive feelings about almost everything we experience. We may like or dislike people long before we know anything in particular about them; we trust strangers without knowing why; we feel that a company will succeed without further analysis of its business. We simply put too much emphasis on the information that is readily available to us.

The ideal decision process has 6 steps, starting with an assessment phase and ending with a decision phase, but we actually rarely use it (a minimal worked example follows the list):

  1. Define the problem
  2. Identify criteria
  3. Weigh criteria
  4. Generate alternatives
  5. Assess alternatives against each criterion
  6. Calculate the optimal decision
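
To make steps 2–6 concrete, here is a minimal sketch in Python of a weighted decision matrix applied to Christine’s dilemma. All criteria, weights and scores are hypothetical illustrations, not figures from the book:

```python
# Minimal sketch of steps 2-6: weigh the criteria, score each
# alternative against them, and compute the weighted totals.
# All criteria, weights and scores below are hypothetical.

criteria_weights = {
    "results delivered": 0.4,
    "team climate": 0.3,
    "partner relationships": 0.3,
}

# Scores from 1 (poor) to 5 (good) for each alternative on each criterion.
alternatives = {
    "keep Anna": {"results delivered": 2, "team climate": 2, "partner relationships": 5},
    "fire Anna": {"results delivered": 3, "team climate": 4, "partner relationships": 1},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Step 6: the sum of score * weight across all criteria."""
    return sum(scores[c] * w for c, w in weights.items())

for name, scores in alternatives.items():
    print(f"{name}: {weighted_score(scores, criteria_weights):.2f}")

best = max(alternatives, key=lambda a: weighted_score(alternatives[a], criteria_weights))
print(f"Optimal decision: {best}")
```

The point is not the particular numbers but the discipline: making criteria, weights and scores explicit forces the slow, analytical thinking that the next paragraph calls system 2.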

Instead, 95% of our decisions are taken intuitively by fast thinking, or system 1, as opposed to slow thinking, or system 2, where we actually engage our minds in classic analysis. The brain is not literally divided in this way, but it is a useful analogy. System 1 is our fast, automatic, intuitive and emotional thinking and often goes under the name of the elephant, while system 2 is slower, more costly, conscious thinking – the one often referred to as the rider. For example, our system 1 may tell us to buy Audi shares because we basically like Audi vehicles, but that does not mean the shares are attractively priced compared to their actual value and alternative investments. That assessment requires system 2, which consumes significantly more energy – and those resources are quickly used up, after which you are back to system 1 decisions.

In the book we have structured the more than 200 biases currently identified into 6 distinct groups, 5 of which fit closely with the key reasons for strategic failure according to McKinsey & Company:


  • Availability bias was introduced at the beginning of this article. We put too much emphasis on the limited information we have and do not see new key pieces.
  • Confirmation bias is when you get that great business idea and start investigating further, only to find that every new piece of information seems to confirm your idea.
  • Overconfidence is our tendency to underestimate competitors, timelines and budgets as well as overestimate our control over them.
  • Emotional bias holds several different sub-biases, such as a strong preference for status quo, and is almost a category in itself, because it tends to reinforce the other biases.
  • Loss aversion is the finding that we are on average twice as averse to losses as we are attracted to gains (see the sketch after this list). Once invested in a strategy, abandoning it feels like a loss we should avoid.
  • Bounded ethics is not included in McKinsey’s study and covers how we slowly stretch our ethical code. The fake accounts at Wells Fargo are a recent example.
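
To make the “twice as averse” finding concrete, here is a minimal sketch assuming a simple linear value function with a loss multiplier of 2 – an illustration of the principle only, not the full prospect theory formulation:

```python
# Minimal illustration of loss aversion: losses loom roughly twice as
# large as equivalent gains. The multiplier of 2 reflects the "on
# average twice as averse" finding; everything else is illustrative.

LOSS_MULTIPLIER = 2.0

def subjective_value(outcome: float) -> float:
    """Gains count at face value; losses are weighted twice as heavily."""
    return outcome if outcome >= 0 else LOSS_MULTIPLIER * outcome

# A fair coin flip: win $100 or lose $100. The objective expected value
# is zero, but subjectively it feels like a losing proposition - which
# is why most people decline the bet, and why abandoning a strategy we
# are invested in feels worse than the numbers alone suggest.
flip_value = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(f"Subjective value of the coin flip: {flip_value:+.0f}")  # prints -50
```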

As we can see, there are plenty of ways for top management decisions to go wrong. But are these biases inherently bad or irrational? Not really – they were simply shaped by a different era, and now we need to adjust for the situations where they are not ideal:

  1. Identify the situations where it is worth the effort – the rare high-impact decisions such as strategy, and the many small decisions that over time have a large effect, such as in core processes
  2. Identify the biases most likely to affect the selected decisions and their general effect on the process
  3. Decide on a new practice that either removes the bias from the equation or counters it

This was the short version of chapter 2 of our book, Decision Strategy. Next week we will look at how our availability bias prevents us from bringing all key data to strategic decisions and what to do about it – stay tuned!

If you cannot wait until next week and need support in strategic decisions or daily business decision making processes, please contact me at brian@behaviouralstrategygroup.com or +45-23103206

Decision Strategy – chapter 1


This is a series of core points from our book, Decision Strategy, which took 5th place among global management books in 2016.

Why do we need Behavioural Strategy?

”Take a simple idea, then take it seriously”

Warren Buffett

Over the last 50 years we have witnessed a revolution in strategic thinking. The ground-breaking work of great thinkers like Michael Porter and Henry Mintzberg has paved the way for a solid management tradition. Today most senior managers are trained in strategic management, and large companies often have their own strategy teams.

Yet business is filled with examples of bad strategy and strategy execution with disastrous consequences. 70% of all strategic initiatives fail according to McKinsey, and what is worse, many companies look at a promising future one moment and find themselves brought to their knees the next. Almost half of the 25 companies that originally passed through the needle’s eye of Tom Peters and Robert Waterman’s In Search of Excellence either no longer exist, are bankrupt or are faring poorly.

We all know the countless examples of poor strategic decisions: Kodak, which missed the opportunities in digital photography, a technology it had itself invented; the Swiss national airline, Swissair, which was so financially stable that it was known as “the flying bank” but succumbed to debt as a result of a failed M&A strategy; or Nokia, which chose market retention over innovation and dropped out of the competition with the words: “we did not do anything wrong, and yet we lost.”

The question is why talented executives back flawed strategies at a time when we have access to so much information. Misinterpreting opportunities, poor strategy design and lack of follow-through are some of the key reasons according to McKinsey research. But these are all fairly everyday reasons that should not get in the way of a solid strategy, right?

Wrong. We are simply not as rational as we like to think. 95% of our decisions are taken on pure intuition, based on a brain that developed thousands of years ago and is by no means equipped to handle the speed, pressure and diversity of today’s decisions. It developed shortcuts and simplifications that may have helped early humans survive on the savannas of Africa, like “if it looks like a deer, and everyone else chases it, it must be lunch.”

The strength of behavioral economics is that it brings the real world into the strategy process, and thus all strategists should take an interest in it. It is not about the next fad or adding a new organizational unit. As in the Buffett quote at the top, it is all about taking a simple idea – that we are systematically irrational – very seriously, and building systems and organizations flexible and robust enough to handle human excesses. It is this agility and robustness that will separate your organization from the competition.

The idea is simple yet not easy. A few first-mover investment funds have incorporated behavioral economics to counteract systematic biases, but strategy has yet to follow. And it is not hard to guess why: unlike marketing, where the biases and irrational behaviour of others are routinely used to increase sales, strategy requires us to recognize our own biases.

This was from chapter 1 of our book, Decision Strategy. Next week we will look into how our brain works, how that can wreak havoc on strategy and what to do about it – stay tuned!

If you cannot wait to learn more or want to avoid strategic disaster in your company, contact me at brian@behaviouralstrategygroup.com or +45-23103206.

5 year anniversary!!!

Our acclaimed book Decision Strategy (translated from the Danish Beslutningsstrategi)

Time flies when you are working 100 hours per week (or so I heard from my younger self – but that is a story for another time)! In 2016 I coauthored my first book and cofounded my first company – and never looked back (ok, actually I did, but I noticed all these cool entrepreneurs never seem to do so on LinkedIn, so sue me!).

Our book Decision Strategy received five stars in the most prestigious business paper in Denmark and went on to win global 5th place, just shy of McKinsey’s, Bain’s and MIT’s books that year (and some ridiculous re-re-retelling of the business disaster in EAC – you will be forgiven for not remembering). It was the only time someone managed not only to structure your 200+ biases into a comprehensible concept but also to match them up against the right business tools – absolutely critical in a time flush with rebottled old concepts and no way of telling the difference (at least if you ask us, and we are certainly not biased…:)

I also left the comfort of corporate life that year, boldly striking out on my own with many potential clients interested in our concept and exactly… zero signing up… OK, I was a bit scared there for a month or so! But then Behavioural Strategy Group took on our first client with a growth strategy for an IT company. And a second. And a third. We combined megatrend analysis to avoid the availability bias group with a twist on Playing To Win strategy to minimize especially the confirmation bias group, together with some deep dives and early warning systems to keep the overconfidence, emotional and loss aversion bias groups at bay. You never forget your first love!

It has been a blast despite the many ups and downs of entrepreneurship, but after a horrible 2020 I believe this year will be the best yet in terms of intriguing clients and challenging projects, so I can hardly wait to see the next 5 years. To celebrate the anniversary of our book and to give you all some exciting summer reading, I thought I would share the key learnings from the book, one chapter a week for the next 10 weeks – stay tuned!

And if you cannot wait that long to learn the secret sauce behind our strategies, contact me at brian@behaviouralstrategygroup.com or +45-23103206.

The upside of irrationality – Bounded Ethics bias

The towels were so thick there I could hardly close my suitcase.

Yogi Berra, Baseball athlete & accidental satirist

You lie, steal and cheat all the time. Well not you of course – just everybody you have ever met. Some more than others though. Or at least they are the ones who get caught:)

So bounded ethics is your tendency to act unethically whenever there is a conflict between doing what is right – and what is right for you. It can be small things like taking a pencil from work, crossing the street at a red light or dressing up your story a bit. Try googling and taking the online test “How unethical are you?” and share your results if you dare!

In fact most of this is small stuff, but that is also the problem, because once you cross that line, you are on a very slippery slope, where it is easy to lose your bearings and keep going a little further all the time – and before you know it, you have destroyed a 100+ year old financial institution and created the largest financial crisis in almost a century, just as a completely hypothetical example (or not – check out the case of Lehman Brothers).

Sometimes it is you hiring a board member from your trusted network, or – as a 40ish male Caucasian – simply preferring other 40ish male Caucasians for your team, because, well, they just seem very trustworthy and competent. And this is not to pick on 40ish male Caucasians – 20ish female Asians also trust 20ish female Asians the most. Nobody means any harm, but you need to find a debiased recruitment process.

Another interesting trick is to take a page out of Amazon’s playbook by assigning a chair in every meeting to a customer – in this case, assign it to the newspaper you are most concerned about ending up on the front page of. But it can also be as easy as putting your values on a poster – in fact you do not even have to have any values written down: research shows that if you just refer to the values of the organization, people will automatically behave better!

One of my favorite professors, Dan Ariely, is here to tell you the honest truth about dishonesty:

Now, if you know someone, who knows someone, who might have some “issues” they need to fix in this arena, contact me at brian@behaviouralstrategygroup.com or +45-23103206.

Have a wonderful week!