
Five routes to more innovative problem solving

Tricky problems must be shaped before they can be solved. To start that process, and stimulate novel thinking, leaders should look through multiple lenses.
April 2013 | by Olivier Leclerc and Mihnea Moldoveanu

Rob McEwen had a problem. The chairman and chief executive officer of Canadian mining group Goldcorp knew that its Red Lake site could be a money-spinner (a mine nearby was thriving), but no one could figure out where to find high-grade ore. The terrain was inaccessible, operating costs were high, and the unionized staff had already gone on strike. In short, McEwen was lumbered with a gold mine that wasn't a gold mine.

Then inspiration struck. Attending a conference about recent developments in IT, McEwen was smitten with the open-source revolution. Bucking fierce internal resistance, he created the Goldcorp Challenge: the company put Red Lake's closely guarded topographic data online and offered $575,000 in prize money to anyone who could identify rich drill sites. To the astonishment of players in the mining sector, upward of 1,400 technical experts based in 50-plus countries took up the problem. The result? Two Australian teams, working together, found locations that have made Red Lake one of the world's richest gold mines. "From a remote site, the winners were able to analyze a database and generate targets without ever visiting the property," McEwen said. "It's clear that this is part of the future."1

McEwen intuitively understood the value of taking a number of different approaches simultaneously to solving difficult problems. A decade later, we find that this mind-set is ever more critical: business leaders are operating in an era when forces such as technological change and the historic rebalancing of global economic activity from developed to emerging markets have made problems increasingly complex, the tempo faster, the markets more volatile, and the stakes higher. The number of variables at play can be enormous, and free-flowing information encourages competition, placing an ever-greater premium on developing innovative, unique solutions. This article presents an approach for doing just that. How?
By using what we call flexible objects for generating novel solutions, or flexons, which provide a way of shaping difficult problems to reveal innovative solutions that would otherwise remain hidden. This approach can be useful in a wide range of situations and at any level of analysis, from individuals to groups to organizations to industries. To be sure, this is not a silver bullet for solving any problem whatever. But it is a fresh mechanism for representing ambiguous, complex problems in a structured way to generate better and more innovative solutions.

The flexons approach

Finding innovative solutions is hard. Precedent and experience push us toward familiar ways of seeing things, which can be inadequate for the truly tough challenges that confront senior leaders. After all, if a problem can be solved before it escalates to the C-suite, it typically is. Yet we know that teams of smart people from different backgrounds are more likely to come up with fresh ideas more quickly than individuals or like-minded groups do.2 When a diverse range of experts (game theorists, economists, psychologists) interact, their approach to problems differs from the approaches individuals use. The solution space becomes broader, increasing the chance that a more innovative answer will be found.

Obviously, people do not always have think tanks of PhDs trained in various approaches at their disposal. Fortunately, generating diverse solutions to a problem does not require a diverse group of problem solvers. This is where flexons come into play. While traditional problem-solving frameworks address particular problems under particular conditions (creating a compensation system, for instance, or undertaking a value-chain analysis for a vertically integrated business), they have limited applicability. They are, if you like, specialized lenses. Flexons offer languages for shaping problems, and these languages can be adapted to a much broader array of challenges. In essence, flexons substitute for the wisdom and experience of a group of diverse, highly educated experts.

To accommodate the world of business problems, we have identified five flexons, or problem-solving languages. Derived from the social and natural sciences, they help users understand the behavior of individuals, teams, groups, firms, markets, institutions, and whole societies. We arrived at these five through a lengthy process of synthesizing both formal literatures and the private knowledge systems of experts, and trial and error on real problems informed our efforts.
We don't suggest that these five flexons are exhaustive, only that we have found them sufficient, in concert, to tackle very difficult problems. While serious mental work is required to tailor the flexons to a given situation, and each retains blind spots arising from its assumptions, multiple flexons can be applied to the same problem to generate richer insights and more innovative solutions.

Networks flexon
Imagine a map of all of the people you know, ranked by their influence over you. It would show close friends and vague acquaintances, colleagues at work and college roommates, people who could affect your career dramatically and people who have no bearing on it. All of them would be connected by relationships of trust, friendship, influence, and the probabilities that they will meet. Such a map is a network that can represent anything from groups of people to interacting product parts to traffic patterns within a city, and therefore can shape a whole range of business problems.

For example, certain physicians are opinion leaders who can influence colleagues about which drugs to prescribe. To reveal relationships among physicians and help identify those best able to influence drug usage, a pharmaceutical company launching a product could create a network map of doctors who have coauthored scientific articles. By targeting clusters of physicians who share the same ideas and (one presumes) have tight interactions, the company may improve its return on investments compared with what traditional mass-marketing approaches would achieve. The network flexon helps decompose a situation into a series of linked problems of prediction (how will ties evolve?) and optimization (how can we maximize the relational advantage of a given agent?) by presenting relationships among entities. These problems are not simple, to be sure.3 But they are well-defined and structured, a fundamental requirement of problem solving.
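As a rough sketch of this kind of network analysis, the following Python snippet builds a co-authorship network from invented physician and paper data and ranks doctors by how many colleagues they are tied to (degree, a crude proxy for influence); the names and papers are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical co-authorship records: each paper lists its physician authors.
papers = [
    ["Dr. A", "Dr. B", "Dr. C"],
    ["Dr. B", "Dr. C"],
    ["Dr. C", "Dr. D"],
    ["Dr. E", "Dr. F"],
]

# Build an undirected network: an edge for every pair who co-authored a paper.
ties = defaultdict(set)
for authors in papers:
    for a, b in combinations(authors, 2):
        ties[a].add(b)
        ties[b].add(a)

# Degree as a crude proxy for influence: who is tied to the most peers?
influence = sorted(ties, key=lambda doc: len(ties[doc]), reverse=True)
print(influence[0])
```

In practice a marketer would weight ties (number of joint papers, citation counts) and look for clusters rather than single hubs, but the decomposition into a well-defined graph problem is the same.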

Evolutionary flexon
Evolutionary algorithms have won games of chess and solved huge optimization problems that overwhelm most computational resources. Their success rests on the power of generating diversity by introducing randomness and parallelization into the search procedure and quickly filtering out suboptimal solutions. Representing entities as populations of parents and offspring subject to variation, selection, and retention is useful in situations where businesses have limited control over a large number of important variables and only a limited ability to calculate the effects of changing them, whether they're groups of people, products, project ideas, or technologies. Sometimes, you must make educated guesses, test, and learn. But even as you embrace randomness, you can harness it to produce better solutions to complex problems. That's because not all guessing strategies are created equal. We have crucial choices to make: generating more guesses (prototypes, ideas, or business models), spending more time developing each guess, or deciding which guesses will survive.

Consider a consumer-packaged-goods company trying to determine if a new brand of toothpaste will be a hit or an expensive failure. Myriad variables (everything from consumer habits and behavior to income, geography, and the availability of clean water) interact in multiple ways. The evolutionary flexon may suggest a series of low-cost, small-scale experiments involving product variants pitched to a few well-chosen market segments (for instance, a handful of representative customers high in influence and skeptical about new ideas). With every turn of the evolutionary-selection crank, the company's predictions will improve.
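The variation-selection-retention loop the flexon describes can be sketched in a few lines of Python. The feature encoding, fitness function, and population sizes below are invented for illustration; in a real market experiment, fitness would be measured by customer response, not computed:

```python
import random

random.seed(42)

# Hypothetical setup: a product variant is six binary feature choices, and the
# (unknown to the searcher) market ideal is all ones; fitness counts matches.
IDEAL = [1, 1, 1, 1, 1, 1]

def fitness(variant):
    return sum(1 for v, t in zip(variant, IDEAL) if v == t)

def mutate(parent):
    child = parent[:]
    child[random.randrange(len(child))] = random.choice([0, 1])  # variation
    return child

# Variation, selection, retention: each turn of the crank generates guesses
# and keeps only the fittest survivors (elitist selection).
population = [[random.choice([0, 1]) for _ in IDEAL] for _ in range(10)]
first_best = max(fitness(v) for v in population)
for generation in range(30):
    offspring = [mutate(random.choice(population)) for _ in range(10)]
    population = sorted(population + offspring, key=fitness, reverse=True)[:10]

print(max(fitness(v) for v in population))  # never worse than first_best
```

Because survivors are kept alongside offspring, the best solution found can only improve over generations, which is the point of harnessing randomness rather than merely tolerating it.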

Decision-agent flexon
To the economic theorist, social behavior is the outcome of interactions among individuals, each of whom tries to select the best possible means of achieving his or her ends. The decision-agent flexon takes this basic logic to its limit by providing a way of representing teams, firms, and industries as a series of competitive and cooperative interactions among agents. The basic approach is to determine the right level of analysis: firms, say. Then you ascribe to them beliefs and motives consistent with what you know (and think they know), consider how their payoffs change through the actions of others, determine the combinations of strategies they might collectively use, and seek an equilibrium where no agent can unilaterally deviate from the strategy without becoming worse off. Game theory is the classic example, but it's worth noting that a decision-agent flexon can also incorporate systematic departures from rationality: impulsiveness, cognitive shortcuts such as stereotypes, and systematic biases. Taken as a whole, this flexon can describe all kinds of behavior, rational and otherwise, in one self-contained problem-solving language whose most basic variables comprise agents (individuals, groups, organizations) and their beliefs, payoffs, and strategies.

For instance, financial models to optimize the manufacturing footprint of a large industrial company would typically focus on relatively easily quantifiable variables such as plant capacity and input costs. To take a decision-agent approach, you assess the payoffs and likely strategies of multiple stakeholders (including customers, unions, and governments) in the event of plant closures. Adding the incentives, beliefs, and strategies of all stakeholders to the analysis allows the company to balance the trade-offs inherent in a difficult decision more effectively.
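A minimal sketch of this equilibrium logic, with invented payoffs for a plant-closure standoff between a firm and a union (the numbers are illustrative only, not drawn from any real case):

```python
from itertools import product

# Hypothetical 2x2 game: the firm decides whether to close a plant; the union
# decides whether to strike. Payoffs are (firm, union), illustrative only.
payoffs = {
    ("close", "strike"): (-2, -3),
    ("close", "accept"): ( 3, -1),
    ("keep",  "strike"): (-1,  1),
    ("keep",  "accept"): ( 1,  2),
}
firm_moves, union_moves = ["close", "keep"], ["strike", "accept"]

def is_nash(f, u):
    # Equilibrium test: neither player gains by deviating unilaterally.
    firm_ok = all(payoffs[(f, u)][0] >= payoffs[(alt, u)][0] for alt in firm_moves)
    union_ok = all(payoffs[(f, u)][1] >= payoffs[(f, alt)][1] for alt in union_moves)
    return firm_ok and union_ok

equilibria = [(f, u) for f, u in product(firm_moves, union_moves) if is_nash(f, u)]
print(equilibria)
```

With these particular payoffs the only stable outcome is closure with the union accepting; changing any payoff (say, raising the union's strike payoff) can shift the equilibrium, which is exactly the sensitivity a decision-agent analysis is meant to expose.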

System-dynamics flexon
Assessing a decision's cascading effects on complex businesses is often a challenge. Making the relations between variables of a system, along with the causes and effects of decisions, more explicit allows you to understand their likely impact over time. A system-dynamics lens shows the world in terms of flows and accumulations of money, matter (for example, raw materials and products), energy (electrical current, heat, radio-frequency waves, and so forth), or information. It sheds light on a complex system by helping you develop a map of the causal relationships among key variables, whether they are internal or external to a team, a company, or an industry; subjectively or objectively measurable; or instantaneous or delayed in their effects.

Consider the case of a deep-sea oil spill, for example. A source (the well) emits a large volume of crude oil through a sequence of pipes (which throttle the flow and can be represented as inductors) and intermediate-containment vessels (which accumulate the flow and can be modeled as capacitors). Eventually, the oil flows into a sink (which, in this case, is unfortunately the ocean). A pressure gradient drives the flow rate of oil from the well into the ocean. Even an approximate model immediately identifies ways to mitigate the spill's effects short of capping the well. These efforts could include reducing the pressure gradient driving the flow of crude, decreasing the loss of oil along the pipe, increasing the capacity of the containment vessels, or increasing or decreasing the inductance of the flow lines. In this case, a loosely defined phenomenon such as an oil spill becomes a set of precisely posed problems addressable sequentially, with cumulative results.
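The stock-and-flow view can be captured in a toy simulation. All rates and capacities below are invented; the point is only that once the spill is posed this way, each mitigation lever (well rate, pipe loss, vessel capacity) becomes a parameter you can vary and test:

```python
# Minimal stock-and-flow sketch (illustrative numbers): oil flows from a well
# through a lossy pipe into a containment vessel; overflow spills to the ocean.
WELL_RATE = 10.0         # barrels per hour emitted by the well
PIPE_LOSS = 0.2          # fraction of flow lost along the pipe
VESSEL_CAPACITY = 500.0  # barrels the containment vessel can accumulate

vessel, ocean = 0.0, 0.0
for hour in range(100):
    ocean += WELL_RATE * PIPE_LOSS          # pipe losses go straight to the sink
    vessel += WELL_RATE * (1 - PIPE_LOSS)   # flow that reaches the vessel
    if vessel > VESSEL_CAPACITY:            # the vessel overflows into the ocean
        ocean += vessel - VESSEL_CAPACITY
        vessel = VESSEL_CAPACITY

print(vessel, ocean)  # every barrel emitted ends up in one stock or the other
```

Halving PIPE_LOSS or doubling VESSEL_CAPACITY and rerunning the loop immediately quantifies how much oil each intervention keeps out of the ocean, turning a loosely defined disaster into a comparison of precisely posed options.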

Information-processing flexon
When someone performs long division in her head, a CEO makes a strategic decision by aggregating imperfect information from an executive team, or Google servers crunch Web-site data, information is being transformed intelligently. This final flexon provides a lens for viewing various parts of a business as information-processing tasks, similar to the way such tasks are parceled out among different computers. It focuses attention on what information is used, the cost of computation, and how efficiently the computational device solves certain kinds of problems. In an organization, that device is a collection of people, whose processes for deliberating and deciding are the most important explanatory variable of decision making's effectiveness.4

Consider the case of a private-equity firm seeking to manage risk. A retrospective analysis of decisions by its investment committee shows that past bets have been much riskier than its principals assumed. To understand why, the firm examines what information was transmitted to the committee and how decisions by individuals would probably have differed from those of the committee, given its standard operating procedures. Interviews and analysis show that the company has a bias toward riskier investments and that it stems from a near-unanimity rule applied by the committee: two dissenting members are enough to prevent an investment. The insistence on near-unanimity is counterproductive because it stifles debate: the committee's members (only two of whom could kill any deal) are reluctant to speak first and be perceived as an enemy by the deal sponsor. And the more senior the sponsor, the more likely it is that risky deals will be approved. Raising the number of votes required to kill deals, while clearly counterintuitive, would stimulate a richer dialogue.
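The committee's voting rule can be modeled directly. Assuming (hypothetically) that each of eight members independently voices dissent with some probability, the binomial calculation below shows how the kill threshold interacts with members' willingness to speak up; all numbers are invented for illustration:

```python
from math import comb

# Hypothetical model: each of n committee members independently voices doubt
# about a risky deal with probability p; the deal dies if at least k dissent.
def kill_probability(n, p, k):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# If members self-censor (low p), even a low kill threshold rarely stops a
# risky deal; encouraging dissent (higher p) matters more than the threshold.
reluctant = kill_probability(8, 0.10, 2)    # members hesitant to speak first
open_debate = kill_probability(8, 0.40, 2)  # same rule, richer dialogue
print(reluctant, open_debate)
```

The model captures the article's counterintuitive point in miniature: the rule's effectiveness depends less on the threshold itself than on whether the process makes dissent safe to express.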

Putting flexons to work


We routinely use these five problem-solving lenses in workshops with executive teams and colleagues to analyze particularly ambiguous and complex challenges. Participants need only a basic familiarity with the different approaches to reframe problems and generate more innovative solutions. Here are two quite different examples of the kinds of insights that can emerge from using several flexons, whose real power lies in combination.

Reorganizing for innovation


A large biofuel manufacturer that wants to improve the productivity of its researchers can use flexons to illuminate the problem from very different angles.

Networks. It's possible to view the problem as a need to design a better innovation network by mapping the researchers' ties to one another through co-citation indices, counting the number of e-mails sent between researchers, and using a network survey to reveal the strength and density of interactions and collaborative ties. If coordinating different knowledge domains is important to a company's innovation productivity, and the current network isn't doing so effectively, the company may want to create an internal knowledge market in which financial and status rewards accrue to researchers who communicate their ideas to co-researchers. Or the company could encourage cross-pollination by setting up cross-discipline gatherings, information clearinghouses, or wiki-style problem-solving sites featuring rewards for solutions.

Evolution. By describing each lab as a self-contained population of ideas and techniques, a company can explore how frequently new ideas are generated and filtered and how stringent the selection process is. With this information, it can design interventions to generate more varied ideas and to change the selection mechanism. For instance, if a lot of research activity never seems to lead anywhere, the company might take steps to ensure that new ideas are presented more frequently to the business-development team, which can provide early feedback on their applicability.

Decision agents. We can examine in detail how well the interests of individual researchers and the organization are aligned. What financial and nonfinancial benefits accrue to individuals who initiate or terminate a search or continue a search that is already under way? What are the net benefits to the organization of starting, stopping, or continuing to search along a given trajectory? Search traps or failures may be either Type I (pursuing a development path unlikely to reach a profitable solution) or Type II (not pursuing a path likely to reach a profitable solution). To better understand the economics at play, it may be possible to use industry and internal data to multiply the probabilities of these errors by their costs. That economic understanding, in turn, permits a company to tailor incentives for individuals to minimize Type I errors (by motivating employees to reject apparent losers more quickly) or Type II errors (by motivating them to persist along paths of uncertain value slightly longer than they normally would).
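The suggested calculation (multiplying error probabilities by their costs) is straightforward; the probabilities and costs below are invented placeholders standing in for the industry and internal data the text mentions:

```python
# Invented placeholder numbers: per-project probabilities and costs of the two
# search errors, combined into an expected cost that incentives should target.
p_type1, cost_type1 = 0.30, 2_000_000  # pursuing a path unlikely to pay off
p_type2, cost_type2 = 0.10, 8_000_000  # dropping a path that would have paid off

expected_cost = p_type1 * cost_type1 + p_type2 * cost_type2
print(expected_cost)  # here the rarer Type II error contributes the larger share
```

With these numbers, Type II errors account for more expected loss despite being less frequent, so incentives should tilt toward persistence; flipping the inputs flips the recommendation, which is why the measurement step matters.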

Predicting the future


Now consider the case of a multinational telecommunications service provider that operates several major broadband, wireless, fixed, and mobile networks around the world, using a mix of technologies (such as 2G and 3G). It wants to develop a strategic outlook that takes into consideration shifting demographics, shifting technologies for connecting users with one another and with its core network (4G), and shifting alliances, to say nothing of rapidly evolving players from Apple to Qualcomm. This problem is complicated, with a range of variables and forces at work, and so broad that crafting a strategy with big blind spots is easy. Flexons can help. Each view of the world described below provides valuable food for thought, including potential strategic scenarios, technology road maps, and possibilities for killer apps. More hard work is needed to synthesize the findings into a coherent worldview, but the different perspectives provided by flexons illuminate potential solutions that might otherwise be missed.

Decision agents. Viewing the problem in this way emphasizes the incentives for different industry players to embrace new technologies and service levels. By enumerating a range of plausible scenarios from the perspective of customers and competitors, the network service provider can establish baseline assessments of future pricing, volume levels, and investment returns.

Networks. This lens allows a company or its managers to look at the industry as a pattern of exchange relationships between paying customers and providers of services, equipment, chips, operating systems, and applications, and then to examine the properties of each exchange network. The analysis may reveal that not all innovations and new end-user technologies are equal: some provide an opportunity for differentiation at critical nodes in the network; others do not.

System dynamics.
This flexon focuses attention on data-flow bottlenecks in applications ranging from e-mail and voice calls to video downloads, games, and social-networking interactions.5 The company can build a network-optimization map to predict and optimize capital expenditures for network equipment as a function of expected demand, information usage, and existing constraints. Because cost structures matter deeply to annuity businesses (such as those of service providers) facing demand fluctuations, the resulting analysis may radically affect which services a company believes it can and cannot offer in years to come.

Flexons help turn chaos into order by representing ambiguous situations and predicaments as well-defined, analyzable problems of prediction and optimization. They allow us to move up and down between different levels of detail to consider situations in all their complexity. And, perhaps most important, flexons allow us to bring diversity inside the head of the problem solver, offering more opportunities to discover counterintuitive insights, innovative options, and unexpected sources of competitive advantage.

The case for behavioral strategy


Left unchecked, subconscious biases will undermine strategic decision making. Here's how to counter them and improve corporate performance.
March 2010 | by Dan Lovallo and Olivier Sibony

Once heretical, behavioral economics is now mainstream. Money managers employ its insights about the limits of rationality in understanding investor behavior and exploiting stock-pricing anomalies. Policy makers use behavioral principles to boost participation in retirement-savings plans. Marketers now understand why some promotions entice consumers and others don't. Yet very few corporate strategists making important decisions consciously take into account the cognitive biases (systematic tendencies to deviate from rational calculations) revealed by behavioral economics. It's easy to see why: unlike in fields such as finance and marketing, where executives can use psychology to make the most of the biases residing in others, in strategic decision making leaders need to recognize their own biases. So despite growing awareness of behavioral economics and numerous efforts by management writers, including ourselves, to make the case for its application, most executives have a justifiably difficult time knowing how to harness its power.1

This is not to say that executives think their strategic decisions are perfect. In a recent McKinsey Quarterly survey of 2,207 executives, only 28 percent said that the quality of strategic decisions in their companies was generally good, 60 percent thought that bad decisions were about as frequent as good ones, and the remaining 12 percent thought good decisions were altogether infrequent.2 Our candid conversations with senior executives behind closed doors reveal a similar unease with the quality of decision making and confirm the significant body of research indicating that cognitive biases affect the most important strategic decisions made by the smartest managers in the best companies. Mergers routinely fail to deliver the expected synergies.3 Strategic plans often ignore competitive responses.4 And large investment projects are over budget and over time, over and over again.5

In this article, we share the results of new research quantifying the financial benefits of processes that debias strategic decisions. The size of this prize makes a strong case for practicing behavioral strategy: a style of strategic decision making that incorporates the lessons of psychology. It starts with the recognition that even if we try, like Baron Münchhausen, to escape the swamp of biases by pulling ourselves up by our own hair, we are unlikely to succeed. Instead, we need new norms for activities such as managing meetings (for more on running unbiased meetings, see "Taking the bias out of meetings"), gathering data, discussing analogies, and stimulating debate that together can diminish the impact of cognitive biases on critical decisions. To support those new norms, we also need a simple language for recognizing and discussing biases, one that is grounded in the reality of corporate life, as opposed to the sometimes-arcane language of academia. All this represents a significant commitment and, in some organizations, a profound cultural change.

The value of good decision processes


Think of a large business decision your company made recently: a major acquisition, a large capital expenditure, a key technological choice, or a new-product launch. Three things went into it. The decision almost certainly involved some fact gathering and analysis. It relied on the insights and judgment of a number of executives (a number sometimes as small as one). And it was reached after a process (sometimes very formal, sometimes completely informal) turned the data and judgment into a decision.

Our research indicates that, contrary to what one might assume, good analysis in the hands of managers who have good judgment won't naturally yield good decisions. The third ingredient, the process, is also crucial. We discovered this by asking managers to report on both the nature of an important decision and the process through which it was reached. In all, we studied 1,048 major decisions made over the past five years, including investments in new products, M&A decisions, and large capital expenditures (Exhibit 1).
Exhibit 1

About the research


We asked managers to report on the extent to which they had applied 17 practices in making that decision. Eight of these practices had to do with the quantity and detail of the analysis: did you, for example, build a detailed financial model or run sensitivity analyses? The others described the decision-making process: for instance, did you explicitly explore and discuss major uncertainties or discuss viewpoints that contradicted the senior leader's? We chose these process characteristics because in academic research and in our experience, they have proved effective at overcoming biases.6

After controlling for factors like industry, geography, and company size, we used regression analysis to calculate how much of the variance in decision outcomes7 was explained by the quality of the process and how much by the quantity and detail of the analysis. The answer: process mattered more than analysis, by a factor of six (Exhibit 2). This finding does not mean that analysis is unimportant, as a closer look at the data reveals: almost no decisions in our sample made through a very strong process were backed by very poor analysis. Why? Because one of the things an unbiased decision-making process will do is ferret out poor analysis. The reverse is not true; superb analysis is useless unless the decision process gives it a fair hearing.
Exhibit 2

Process carries weight


To get a sense of the value at stake, we also assessed the return on investment (ROI) of decisions characterized by a superior process.8 The analysis revealed that raising a company's game from the bottom to the top quartile on the decision-making process improved its ROI by 6.9 percentage points. The ROI advantage for top-quartile versus bottom-quartile analytics was 5.3 percentage points, further underscoring the tight relationship between process and analysis. Good process, in short, isn't just good hygiene; it's good business.

The building blocks of behavioral strategy


Any seasoned executive will of course recognize some biases and take them into account. That is what we do when we apply a discount factor to a plan from a direct report (correcting for that person's overoptimism). That is also what we do when we fear that one person's recommendation may be colored by self-interest and ask a neutral third party for an independent opinion.

However, academic research and empirical observation suggest that these corrections are too inexact and limited to be helpful. The prevalence of biases in corporate decisions is partly a function of habit, training, executive selection, and corporate culture. But most fundamentally, biases are pervasive because they are a product of human nature: hardwired and highly resistant to feedback, however brutal. For example, drivers laid up in hospitals for traffic accidents they themselves caused overestimate their driving abilities just as much as the rest of us do.9

Improving strategic decision making therefore requires not only trying to limit our own (and others') biases but also orchestrating a decision-making process that will confront different biases and limit their impact. To use a judicial analogy, we cannot trust the judges or the jurors to be infallible; they are, after all, human. But as citizens, we can expect verdicts to be rendered by juries and trials to follow the rules of due process. It is through teamwork, and the process that organizes it, that we seek a high-quality outcome.

Building such a process for strategic decision making requires an understanding of the biases the process needs to address. In the discussion that follows, we focus on the subset of biases we have found to be most relevant for executives and classify those biases into five simple, business-oriented groupings. (For more on these groupings, see the interactive, "How cognitive biases affect strategic decision making." You can also download a PDF of the groupings of biases that occur most frequently in business.) A familiarity with this classification is useful in itself because, as the psychologist and Nobel laureate in economics Daniel Kahneman has pointed out, the odds of defeating biases in a group setting rise when discussion of them is widespread.
But familiarity alone isn't enough to ensure unbiased decision making, so as we discuss each family of bias, we also provide some general principles and specific examples of practices that can help counteract it.

Interactive

How cognitive biases affect strategic decision making


Explore the biases most pertinent to business and the ways they can combine to create dysfunctional patterns in corporate cultures.


Counter pattern-recognition biases by changing the angle of vision


The ability to identify patterns helps set humans apart but also carries with it a risk of misinterpreting conceptual relationships. Common pattern-recognition biases include saliency biases (which lead us to overweight recent or highly memorable events) and the confirmation bias (the tendency, once a hypothesis has been formed, to ignore evidence that would disprove it). Particularly imperiled are senior executives, whose deep experience boosts the odds that they will rely on analogies, from their own experience, that may turn out to be misleading.10 Whenever analogies, comparisons, or salient examples are used to justify a decision, and whenever convincing champions use their powers of persuasion to tell a compelling story, pattern-recognition biases may be at work.

Pattern recognition is second nature to all of us, and often quite valuable, so fighting the biases associated with it is challenging. The best we can do is to change the angle of vision by encouraging participants to see facts in a different light and to test alternative hypotheses to explain those facts. This practice starts with things as simple as field and customer visits. It continues with meeting-management techniques such as reframing or role reversal, which encourage participants to formulate alternative explanations for the evidence with which they are presented. It can also leverage tools, such as competitive war games, that promote out-of-the-box thinking.

Sometimes, simply coaxing managers to articulate the experiences influencing them is valuable. According to Kleiner Perkins partner Randy Komisar, for example, a contentious discussion over manufacturing strategy at the start-up WebTV11 suddenly became much more manageable once it was clear that the preferences of executives about which strategy to pursue stemmed from their previous career experience. When that realization came, he told us, "there was immediately a sense of exhaling in the room."
Managers with software experience were frightened about building hardware; managers with hardware experience were afraid of ceding control to contract manufacturers.

Getting these experiences into the open helped WebTV's management team become aware of the pattern recognition they triggered and see more clearly the pros and cons of both options. Ultimately, WebTV's executives decided both to outsource hardware production to large electronics makers and, heeding the worries of executives with hardware experience, to establish a manufacturing line in Mexico as a backup, in case the contractors did not deliver in time for the Christmas season. That in fact happened, and the backup plan, which would not have existed without a decision process that changed the angle of vision, saved the company.

Another useful means of changing the angle of vision is to make it wider by creating a reasonably large (in our experience, at least six) set of similar endeavors for comparative analysis. For example, in an effort to improve US military effectiveness in Iraq in 2004, Colonel Kalev Sepp, by himself, in 36 hours, developed a reference class of 53 similar counterinsurgency conflicts, complete with strategies and outcomes. This effort informed subsequent policy changes.12

Counter action-oriented biases by recognizing uncertainty


Most executives rightly feel a need to take action. However, the actions we take are often prompted by excessive optimism about the future and especially about our own ability to influence it. Ask yourself how many plans you have reviewed that turned out to be based on overly optimistic forecasts of market potential or underestimated competitive responses. When you or your people feel, especially under pressure, an urge to take action and an attractive plan presents itself, chances are good that some elements of overconfidence have tainted it.

To make matters worse, the culture of many organizations suppresses uncertainty and rewards behavior that ignores it. For instance, in most organizations, an executive who projects great confidence in a plan is more likely to get it approved than one who lays out all the risks and uncertainties surrounding it. Seldom do we see confidence as a warning sign: a hint that overconfidence, overoptimism, and other action-oriented biases may be at work.

Superior decision-making processes counteract action-oriented biases by promoting the recognition of uncertainty. For example, it often helps to make a clear and explicit distinction between decision meetings, where leaders should embrace uncertainty while encouraging dissent, and implementation meetings, where it's time for executives to move forward together. Also valuable are tools that force consideration of many potential outcomes, such as scenario planning, decision trees, and the premortem championed by research psychologist Gary Klein (for more on the premortem, see "Strategic decisions: When can you trust your gut?"). And at the time of a major decision, it's critical to discuss which metrics need to be monitored to highlight necessary course corrections quickly.
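A decision tree of the kind mentioned above can be as simple as an expected-value comparison across explicitly enumerated scenarios. The sketch below is illustrative only; the two options, their probabilities, and their payoffs are all invented.

```python
# Minimal decision-tree sketch: compare two courses of action by expected
# value across explicitly enumerated scenarios. All probabilities and
# payoffs below are invented for illustration.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs; probabilities sum to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * v for p, v in outcomes)

# Hypothetical choice: launch a product now or wait a year (payoffs in $M).
launch_now = [(0.5, 80), (0.3, 20), (0.2, -40)]   # strong / weak / failed uptake
wait_a_year = [(0.6, 50), (0.4, 10)]              # competitor moves first, or not

for name, outcomes in (("launch now", launch_now), ("wait a year", wait_a_year)):
    print(f"{name}: expected value = {expected_value(outcomes):.1f}")
```

Even a toy tree like this forces the discussion to name the scenarios and put numbers on them, which is exactly how such tools make uncertainty explicit.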

Counter stability biases by shaking things up


In contrast to action biases, stability biases make us less prone to depart from the status quo than we should be. This category includes anchoring: the powerful impact an initial idea or number has on the subsequent strategic conversation. (For instance, last year's numbers are an implicit but extremely powerful anchor in any budget review.) Stability biases also include loss aversion (the well-documented tendency to feel losses more acutely than equivalent gains) and the sunk-cost fallacy, which can lead companies to hold on to businesses they should divest.13

One way of diagnosing your company's susceptibility to stability biases is to compare decisions over time. For example, try mapping the percentage of total new investment each division of the company receives year after year. If that percentage is stable but the divisions' growth opportunities are not, this finding is cause for concern, and quite a common one. Our research indicates, for example, that in multibusiness corporations over a 15-year time horizon, there is a near-perfect correlation between a business unit's current share of the capital expenditure budget and its budget share in the previous year. A similar inertia often bedevils advertising budgets and R&D project pipelines.

One way to help managers shake things up is to establish stretch targets that are impossible to achieve through business as usual. Zero-based (or clean-sheet) budgeting sounds promising, but in our experience companies use this approach only when they are in dire straits. An alternative is to start by reducing each reporting unit's budget by a fixed percentage (for instance, 10 percent). The resulting tough choices facilitate the redeployment of resources to more valuable opportunities. Finally, challenging budget allocations at a more granular level can help companies reprioritize their investments.14
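The budget-share mapping described above can be run in a few lines of code. A minimal sketch, with invented allocation figures; a year-over-year correlation of shares near 1.0 signals inertia:

```python
# Hypothetical inertia check: correlate each division's share of this year's
# capital budget with its share last year. Figures are invented; a
# correlation near 1.0 year after year suggests allocation on autopilot.

def shares(allocations):
    """Convert absolute allocations into shares of the total."""
    total = sum(allocations)
    return [a / total for a in allocations]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Capex by division (one row per year), in $ millions; purely illustrative.
capex_by_year = [
    [120, 80, 50, 30],   # year 1
    [125, 82, 51, 32],   # year 2
    [130, 85, 52, 33],   # year 3
]

share_history = [shares(year) for year in capex_by_year]
for prev, curr in zip(share_history, share_history[1:]):
    print(f"year-over-year share correlation: {pearson(prev, curr):.3f}")
```

If the printed correlations stay near 1.0 while the divisions' growth prospects diverge, the budget process is likely anchored on last year's numbers.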

Counter interest biases by making them explicit


Misaligned incentives are a major source of bias. Silo thinking, in which organizational units defend their own interests, is its most easily detectable manifestation. Furthermore, senior executives sometimes honestly view the goals of a company differently because of their different roles or functional expertise. Heated discussions in which participants seem to see issues from completely different perspectives often reflect the presence of different (and generally unspoken) interest biases. The truth is that adopting a sufficiently broad (and realistic) definition of interests, including reputation, career options, and individual preferences, leads to the inescapable conclusion that there will always be conflicts between one manager and another and between individual managers and the company as a whole.

Strong decision-making processes explicitly account for diverging interests. For example, if, before the time of a decision, strategists formulate precisely the criteria that will and won't be used to evaluate it, they make it more difficult for individual managers to change the terms of the debate to make their preferred actions seem more attractive. Similarly, populating meetings or teams with participants whose interests clash can reduce the likelihood that one set of interests will undermine thoughtful decision making.

Counter social biases by depersonalizing debate


Social biases are sometimes interpreted as corporate politics but in fact are deep-rooted human tendencies. Even when nothing is at stake, we tend to conform to the dominant views of the group we belong to (and of its leader).15 Many organizations compound these tendencies because of both strong corporate cultures and incentives to conform. An absence of dissent is a strong warning sign. Social biases also are likely to prevail in discussions where everyone in the room knows the views of the ultimate decision maker (and assumes that the leader is unlikely to change her mind).

Countless techniques exist to stimulate debate among executive teams, and many are simple to learn and practice. (For more on promoting debate, see suggestions from Kleiner Perkins' Randy Komisar and Xerox's Anne Mulcahy in "How we do it: Three executives reflect on strategic decision making.") But tools per se won't create debate: that is a matter of behavior. Genuine debate requires diversity in the backgrounds and personalities of the decision makers, a climate of trust, and a culture in which discussions are depersonalized.

Most crucially, debate calls for senior leaders who genuinely believe in the collective intelligence of a high-caliber management team. Such executives see themselves serving not only as the ultimate decision makers but also as the orchestrators of disciplined decision processes. They shape management teams with the humility to encourage dissent and the self-confidence and mutual trust to practice vigorous debate without damaging personal relationships. We do not suggest that CEOs should become humble listeners who rely solely on the consensus of their teams; that would substitute one simplistic stereotype for another. But we do believe that behavioral strategy will founder without their leadership and role modeling.

Four steps to adopting behavioral strategy


Our readers will probably recognize some of these ideas and tools as techniques they have used in the past. But techniques by themselves will not improve the quality of decisions. Nothing is easier, after all, than orchestrating a perfunctory debate to justify a decision already made (or thought to be made) by the CEO. Leaders who want to shape the decision-making style of their companies must commit themselves to a new path.

1. Decide which decisions warrant the effort


Some executives fear that applying the principles we describe here could be divisive, counterproductive, or simply too time consuming (for more on the dangers of decision paralysis, see the commentary by WPP's Sir Martin Sorrell in "How we do it: Three executives reflect on strategic decision making"). We share this concern and do not suggest applying these principles to all decisions. Here again, the judicial analogy is instructive: just as higher standards of process apply in a capital case than in a proceeding before a small-claims court, companies can and should pay special attention to two types of decisions.

The first set consists of rare, one-of-a-kind strategic decisions. Major mergers and acquisitions, bet-the-company investments, and crucial technological choices fall in this category. In most companies, these decisions are made by a small subgroup of the executive team, using an ad hoc, informal, and often iterative process.

The second set includes repetitive but high-stakes decisions that shape a company's strategy over time. In most companies, there are generally no more than one or two such crucial processes, such as R&D allocations in a pharmaceutical company, investment decisions in a private-equity firm, or capital expenditure decisions in a utility. Formal processes, often affected by biases, are typically in place to make these decisions.

2. Identify the biases most likely to affect critical decisions


Open discussion of the biases that may be undermining decision making is invaluable. It can be stimulated both by conducting postmortems of past decisions and by observing current decision processes. Are we at risk, in this meeting, of being too action oriented? Do I see someone who thinks he recognizes a pattern but whose choice of analogies seems misleading to me? Are we seeing biases combine to create dysfunctional patterns that, when repeated in an organization, can become cultural traits? For example, is the combination of social and status quo biases creating a culture of consensus-based inertia? This discussion will help surface the biases to which the decision process under review is particularly prone.

3. Select practices and tools to counter the most relevant biases


Companies should select mechanisms that are appropriate to the type of decision at hand, to their culture, and to the decision-making styles of their leaders. For instance, one company we know counters social biases by organizing, as part of its annual planning cycle, a systematic challenge by outsiders to its business units' plans. Another fights pattern-recognition biases by asking managers who present a recommendation to share the raw data supporting it, so other executives in this analytically minded company can try to discern alternative patterns.

If, as you read these lines, you have already thought of three reasons these techniques won't work in your own company's culture, you are probably right. The question is which ones will. Adopting behavioral strategy means not only embracing the broad principles set forth above but also selecting and tailoring specific debiasing practices to turn the principles into action.

4. Embed practices in formal processes


By embedding these practices in formal corporate operating procedures (such as capital-investment approval processes or R&D reviews), executives can ensure that such techniques are used with some regularity and not just when the ultimate decision maker feels unusually uncertain about which call to make. One reason it's important to embed these practices in recurring procedures is that everything we know about the tendency toward overconfidence suggests that it is unwise to rely on one's instincts to decide when to rely on one's instincts! Another is that good decision making requires practice as a management team: without regular opportunities, the team will agree in principle on the techniques it should use but lack the experience (and the mutual trust) to use them effectively.

The behavioral-strategy journey requires effort and the commitment of senior leadership, but the payoff (better decisions, not to mention more engaged managers) makes it one of the most valuable strategic investments organizations can make.

Taking the bias out of meetings

Managing bias effectively can help lessen the impact it has on your company's strategy.
April 2010 | by Dan Lovallo and Olivier Sibony

The biases that undermine strategic decision making often operate in meetings. Here is a menu of ideas for running them in a way that will mitigate the impact of those biases. Not every suggestion will be applicable to all types of decisions or organizations, but paying attention to the principles underlying these ideas should pay dividends for any executive trying to run meetings that lead to sounder decisions. Also included are related comments from executives and experts we spoke with while creating our special package: Seeing through biases in strategic decisions.


Make sure the right people are involved

Ensure diversity of backgrounds, roles, risk aversion profiles, and interests; cultivate critics within the top team:

"You need internal critics, people who have the courage to give you feedback," says Anne Mulcahy, chairman and former CEO of Xerox. "This requires a certain comfort with confrontation, so it's a skill that has to be developed. The decisions that come out of allowing people to have different views are often harder to implement than what comes out of consensus decision making, but they're also better." (See "Xerox's Anne Mulcahy: Timeliness trumps perfection.")

Invite contributions based on expertise, not rank. Don't hesitate to invite expert contributors to come and present a point of view without attending the entire meeting. For the portion of the meeting where a decision is going to be made, keep attendance to a minimum, preferably with a team that has experience making decisions together. This loads the dice in favor of depersonalized debate by eliminating executives' fear of exposing their subordinates to conflict and also creates, over time, an environment of trust among that small group of decision makers.

Assign homework

Make sure predecision due diligence is based on accurate, sufficient, and independent facts and on appropriate analytical techniques. Request alternatives and out-of-the-box plans, for instance, by soliciting input from outsiders to the decision-making process. Consider setting up competing fact-gathering teams charged with investigating opposing hypotheses.

Create the right atmosphere

As the final decision maker, ask others to speak up (starting with the most junior person); show you can change your mind based on their input; strive to create a peerlike atmosphere. Encourage admissions of individual experiences and interests that create possible biases.

According to Kleiner Perkins partner Randy Komisar, for example, a contentious debate over manufacturing strategy at the start-up WebTV suddenly became more manageable once it was clear that managers with software experience were frightened about building hardware and managers with hardware experience were afraid of ceding control to contract manufacturers. (See Dan Lovallo and Olivier Sibony, The case for behavioral strategy.)

Encourage expressions of doubt and create a climate that recognizes reasonable people may disagree when discussing difficult decisions. Encourage substantive disagreements on the issue at hand by clearly dissociating it from personal conflict, using humor to defuse tension.

Manage the debate

Before you get going, make sure everyone knows the meeting's purpose (making a decision) and the criteria you will be using to make that decision. For recurring decisions (such as R&D portfolio reviews), make it clear to everyone that those criteria include forcing devices (such as comparing projects against one another). Take the pulse of the room: ask participants to write down their initial positions, use voting devices, or ask participants for their balance sheets of pros and cons.

"Frankly, I'm surprised that when you have a reasonably well-informed group it isn't more common to begin by having everyone write their conclusions on a slip of paper," remarks Nobel laureate Daniel Kahneman. "If you don't do that, the discussion will create an enormous amount of conformity." (See "Strategic decisions: When can you trust your gut?")

Put together a simple balance sheet where everybody around the table is asked to list points on both sides: "Tell me what is good about this opportunity; tell me what is bad about it. Do not tell me your judgment yet. I don't want to know," says Randy Komisar, partner at Kleiner Perkins Caufield & Byers. The balance sheet process mitigates a lot of the friction that typically arises when people marshal the facts that support their case while ignoring those that don't. (See "Kleiner Perkins' Randy Komisar: Balance out biases.")

Use the premortem technique to expand the debate.

"The premortem technique is a sneaky way to get people to do contrarian, devil's advocate thinking," explains psychologist Gary Klein. "Before a project starts, say, 'We're looking in a crystal ball, and this project has failed; it's a fiasco. Now, everybody, take two minutes and write down all the reasons why you think the project has failed.'" (See "Strategic decisions: When can you trust your gut?")

Counter anchoring: postpone the introduction of numbers if possible; reframe alternative courses of action as they emerge by making explicit what you have to believe to support each of the alternatives.

"It's easy for people to lose track of how much they've explained away," notes Klein. "So one possibility is to try to surface this for them, to show them the list of things they've explained away." (See "Strategic decisions: When can you trust your gut?")

Pay attention to the use of comparisons and analogies: limit the use of inappropriate ones (inadmissible evidence) by asking for alternatives and suggesting or requesting additional analogies. Force the room to consider opposing views. For vital decisions, create an explicit role for one or two people: the decision challengers.

Follow up

Commit yourself to the decision. Debate should stop when the decision is made. Connect individually with initial dissenters and make sure implementation plans address their concerns to the extent possible. Monitor pre-agreed criteria and milestones to correct your course or move on to backup plans. Conduct a postmortem on the decision once its outcome is known. Periodically step back and review decision processes to improve meeting preparation and mechanics, using an outside observer to diagnose possible sources of bias.

How to test your decision-making instincts

Executives should trust their gut instincts, but only when four tests are met.
May 2010 | by Andrew Campbell and Jo Whitehead

One of the most important questions facing leaders is when they should trust their gut instincts, an issue explored in a dialogue between Nobel laureate Daniel Kahneman and psychologist Gary Klein titled "Strategic decisions: When can you trust your gut?" published by McKinsey Quarterly in March 2010. Our work on flawed decisions suggests that leaders cannot prevent gut instinct from influencing their judgments. What they can do is identify situations where it is likely to be biased and then strengthen the decision process to reduce the resulting risk.

Our gut intuition accesses our accumulated experiences in a synthesized way, so that we can form judgments and take action without any logical, conscious consideration. Think about how we react when we inadvertently drive across the center line in a road or see a car start to pull out of a side turn unexpectedly. Our bodies are jolted alert, and we turn the steering wheel well before we have had time to think about what the appropriate reaction should be.

The brain appears to work in a similar way when we make more leisurely decisions. In fact, the latest findings in decision neuroscience suggest that our judgments are initiated by the unconscious weighing of emotional tags associated with our memories rather than by the conscious weighing of rational pros and cons: we start to feel something, often even before we are conscious of having thought anything. As a highly cerebral academic colleague recently commented, "I can't see a logical flaw in what you are saying, but it gives me a queasy feeling in my stomach."

Given the powerful influence of positive and negative emotions on our unconscious, it is tempting to argue that leaders should never trust their gut: they should make decisions based solely on objective, logical analysis. But this advice overlooks the fact that we can't get away from the influence of our gut instincts. They influence the way we frame a situation.
They influence the options we choose to analyze. They cause us to consult some people and pay less attention to others. They encourage us to collect more data in one area but not in another. They influence the amount of time and effort we put into decisions. In other words, they infiltrate our decision making even when we are trying to be analytical and rational. This means that to protect decisions against bias, we first need to know when we can trust our gut feelings, confident that they are drawing on appropriate experiences and emotions. There are four tests.

1. The familiarity test: Have we frequently experienced identical or similar situations? Familiarity is important because our subconscious works on pattern recognition. If we have plenty of appropriate memories to scan, our judgment is likely to be sound; chess masters can make good chess moves in as few as six seconds. "Appropriate" is the key word here, because many disastrous decisions have been based on experiences that turned out to be misleading; for instance, the decision General Matthew Broderick, an official of the US Department of Homeland Security, made on August 29, 2005, to delay initiating the federal response following Hurricane Katrina. The way to judge appropriate familiarity is by examining the main uncertainties in a situation: do we have sufficient experience to make sound judgments about them? The main uncertainties facing Broderick were about whether the levees had been breached and how much danger people faced in New Orleans. Unfortunately, his previous experience with hurricanes was in cities above sea level. His learned response, of waiting for "ground truth," proved disastrous. Gary Klein's premortem technique, a way of identifying why a project could fail, helps surface these uncertainties. But we can also just develop a list of uncertainties and assess whether we have sufficient experience to judge them well.

2. The feedback test: Did we get reliable feedback in past situations? Previous experience is useful to us only if we learned the right lessons. At the time we make a decision, our brains tag it with a positive emotion, recording it as a good judgment. Hence, without reliable feedback, our emotional tags can tell us that our past judgments were good, even though an objective assessment would record them as bad. For example, if we change jobs before the impact of a judgment is clear or if we have people filtering the information we receive and protecting us from bad news, we may not get the feedback we need. It is for this reason that yes men around leaders are so pernicious: they often eliminate the feedback process so important to the development of appropriate emotional tags.

3. The measured-emotions test: Are the emotions we have experienced in similar or related situations measured? All memories come with emotional tags, but some are more highly charged than others. If a situation brings to mind highly charged emotions, these can unbalance our judgment.
Knowing from personal experience that dogs can bite is different from having a traumatic childhood experience with dogs. The first will help you interact with dogs. The second can make you afraid of even the friendliest dog. A board chairman, for example, had personally lost a significant amount of money with a previous company when doing business in Russia. This traumatic experience made him wary of a proposal for a major Russian expansion in his new company. But he also realized that the experience could be biasing his judgment. He felt obliged to share his concerns but then asked the rest of the board to make the final decision.

4. The independence test: Are we likely to be influenced by any inappropriate personal interests or attachments? If we are trying to decide between two office locations for an organization, one of which is much more personally convenient, we should be cautious. Our subconscious will have more positive emotional tags for the more convenient location. It is for this reason that it is standard practice to ask board members with personal interests in a particular decision to leave the meeting or to refrain from voting. Also for this reason, we enjoy the quip "turkeys will not vote for Christmas." A similar logic applies to personal attachments. When auditors, for example, were asked to demonstrate to a Harvard professor that their professional training enabled them to be objective in arriving at an audit opinion, regardless of the nature of the relationship they had with a company, they demonstrated the opposite.

If a situation fails even one of these four tests, we need to strengthen the decision process to reduce the risk of a bad outcome. There are usually three ways of doing this: stronger governance, additional experience and data, or more dialogue and challenge. Often, strong governance, in the form of a boss who can overrule a judgment, is the best safeguard. But a strong governance process can be hard to set up and expensive to maintain (think of the US Senate or a typical corporate board). So it is normally cheaper to look for safeguards based on experience and data or on dialogue and challenge.

In the 1990s, for example, Jack Welch knew he would face some tough decisions about how to exploit the Internet, so he chose experience as a solution to the biases he might have. He hired a personal Internet mentor who was more than 25 years his junior and encouraged his top managers to do the same. Warren Buffett recommends extra challenge as a solution to biases that arise during acquisitions. Whenever a company is paying part of the price with shares, he proposes using an adviser against the deal, who would be compensated well only if it did not go through.

There are no universal safeguards. Premortems help surface uncertainties, but they do not protect against self-interest. Additional data can challenge assumptions but will not help a decision maker who is influenced by a strong emotional experience.
If we are to make better decisions, we need to be thoughtful both about why our gut instincts might let us down and what the best safeguard is in each situation. We should never ignore our gut. But we should know when to rely on it and when to safeguard against it.

Are We Thinking Too Little, or Too Much?



In the course of making a decision, managers often err in one of two directionseither overanalyzing a situation or forgoing all the relevant information and simply going with their gut. HBS marketing professor Michael I. Norton discusses the potential pitfalls of thinking too much or thinking too little. Key concepts include:

When deciding among potential products or employees, managers often take too much time considering all the attributes of their choices, even attributes that have no bearing on the situation at hand. However, in trying to avoid overthinking a decision for fear of decision paralysis, managers often "over-correct" and end up not thinking enough. We know that sometimes people think too much, and sometimes they think too little. But we still don't know the right amount to think.

by Carmen Nobel

The most captivating item in Michael Norton's office is a Star Wars Force Trainer, a toy that allows would-be Jedi warriors to levitate a Ping-Pong ball within a tube using only the power of focused thinking. Norton, a marketing professor at Harvard Business School, plans to study whether inducing people into believing they can expertly control the ball will affect the way they perceive themselves as business influencers. In fact, Norton spends most of his time thinking about thinking. So it's somewhat ironic that his latest line of research explores the idea of thinking too much.

"Academics traditionally have taken two different approaches to decision-making," says Norton, who teaches in the Marketing Unit. "One view is that people often make decisions too hastily; they use shortcuts and heuristics, and therefore they're susceptible to biases and mistakes. The implication is that if maybe they thought more, they'd do better.

"And then there's this whole stream of research about ways in which you should think more carefully in more logical ways: creating decision trees that map out 'if you want to do this, then you should do this and not that,' making lists of the pros and cons and making a decision based on which list is longer, and so on."

However, there has been little research that considers the notion that overthinking a decision might actually lead to the wrong outcome. Nor have researchers come up with a model that explores how to determine when we're overthinking a decision, even though logic tells us that there certainly is such a thing.

"We all know that when we make lists, we often end up crumpling them and throwing them away because they're not really helping us make decisions," Norton says. "Bill Clinton was famous for becoming so involved with the intricacies of each policy that no decisions were made. Having a leader who considers every detail sounds great in theory, but it can be suboptimal for moving forward with a decision. There's a paralysis that can come with thinking too much."

Norton explores this idea in "From Thinking Too Little to Thinking Too Much: A Continuum of Decision Making," an article he co-wrote with Duke University's Dan Ariely for Wiley Interdisciplinary Reviews: Cognitive Science. "We set out not to tell people whether they're thinking the right way, but just to get them thinking, 'I'm supposed to be making a decision right now: am I thinking too little about this, or am I thinking too much?'" Norton says. "Both of those could lead to mistakes."

For example, in choosing laptop computers for a sales team, an IT executive might get caught up in comparing the graphics capabilities and audio quality of various options, when in fact the only factors of importance to users are the size, weight, and security features. Worse yet, even if they narrow down the list of attributes under consideration, executives can still be stymied if they try to consider every single laptop on the market. (In the article, Norton and Ariely cite a study by social psychologists Sheena Iyengar and Mark Lepper, who showed that grocery store shoppers who were offered free samples of 24 jam flavors were less likely to buy any jam at all than those shoppers who sampled only 6 flavors; considering too many options made it too hard to choose one.)

The underthinker
The problem is that time-crunched managers often swing too far to the other end of the decisionmaking thinking spectrumthat is, they don't think at all. "Very often managers find that there's not enough time to think through every single scenario or customer segment, which can take months," Norton says. "But too often the correction to 'We don't have time to do that' is an over-correction to one hundred percent 'We should go with our gut.' " While all good managers should be able to make snap decisions in high-pressure situations, they may miss out on good opportunitiesand fall into rutswhen they make quick decisions strictly out of habit. Too often, "We always do it that way" is the main reason for a decision. For instance, a manager might hire or disqualify job candidates based on whether they make good eye contact during an interview, just because past candidates who made good eye contact ended up performing well at the company. "So they just decide to use that criterion forever because it's worked out in the past," Norton explains. "But they don't think about what if they had hired people who don't make eye contact.

"Maybe they would have been better than the people who do. And so that's the idea we want people to consider. Sometimes when you make habitual decisions, things work out fine. But that doesn't mean they're the best decisions. And if you've done something the same way for 10 years, it might be time to reconsider, to think a little more."

Stale popcorn
More detrimentally, people may make downright bad decisions based on force of habit. In the article, Norton and Ariely describe a study in which participants watched a movie while eating popcorn. Some received fresh popcorn, while others were given week-old, stale popcorn. The researchers found that participants who always ate popcorn at the movies were just as likely to gobble down the stale popcorn as the fresh popcorn, strictly out of habit.

Lately, Norton has been studying the brain chemistry of decision makers, using functional magnetic resonance imaging (fMRI) to determine the neural signatures of decisions based on habit and those based on thoughtful analysis. He gives the example of choosing a favorite hangout because of the quality of the coffee and the ambience at a particular coffeehouse, as opposed to stumbling into a café on a very cold day when any hot drink would seem delicious, yet coming to believe in both cases that the establishment truly offers the best coffee in the whole world. "Ask yourself: Do I like this coffee because I really like this coffee, or do I like it because it was cold out?" Norton says. Still, there's a long way to go before science offers a clear-cut method for thinking through decisions perfectly. "We are hopeful that people will continue to conduct research in this area," Norton says. "What we know now is that people sometimes think too much, and sometimes they think too little. But we still don't know the right amount to think for any given decision, which is a fascinating question yet to be solved."

What Makes an Effective Executive


by Peter F. Drucker


An effective executive does not need to be a leader in the sense that the term is now most commonly used. Harry Truman did not have one ounce of charisma, for example, yet he was among the most effective chief executives in U.S. history. Similarly, some of the best business and nonprofit CEOs I've worked with over a 65-year consulting career were not stereotypical leaders. They were all over the map in terms of their personalities, attitudes, values, strengths, and weaknesses. They ranged from extroverted to nearly reclusive, from easygoing to controlling, from generous to parsimonious. What made them all effective is that they followed the same eight practices:

1. They asked, "What needs to be done?"
2. They asked, "What is right for the enterprise?"
3. They developed action plans.
4. They took responsibility for decisions.
5. They took responsibility for communicating.
6. They were focused on opportunities rather than problems.
7. They ran productive meetings.
8. They thought and said "we" rather than "I."
