Without well-developed decision systems and processes to address the complexities of capital allocation, organizations often resort to long, drawn-out debates; politicking and gaming the system; the gut instincts of a brave executive and staff; or deferring to quantitative analysis alone. With large amounts at stake, the opportunities for improved decision quality can be considerable.
The new CEO had a board mandate to improve the return on invested capital (ROIC) from the company’s capital planning program. The company was underperforming its industry peers, so the CEO established targets for improved performance. However, upon reviewing the portfolio of capital projects, he noted a lack of consistency and rigor across the many business cases. Some cases appeared overly optimistic, most provided little insight into how risks could hurt project performance, and they all seemed to use different forecast assumptions—sometimes even different forecasts for interest and foreign exchange rates. He was now being presented with a capital request for a facility expansion project in excess of $100 million, but the analysis appeared to be largely copied and pasted from the previous year’s infrastructure construction project, which had experienced delays and cost overruns.
Are you leaving money on the table?
The majority of finance executives “are not confident in their organization’s ability to optimize their portfolio of project investment opportunities.”1 Some of the top challenges cited include difficulty measuring benefits and comparing divergent projects, linking funding decisions to corporate strategy, inefficient and time-consuming processes, politicking or “gaming” the process, and quantifying risk.
Capital allocation is among the most strategically important decisions associated with an organization’s success and sustainability. Investment decisions—also known as capital allocation decisions—aim to create value for stakeholders and shape the performance of organizations. These decisions include pharmaceutical companies’ investment in research and development, natural resource companies’ investment in exploration and extraction, and telecommunication companies’ investment in next-generation network infrastructure.
These decisions can be enormously complex. Information and assumptions about capital investment alternatives are dispersed across the people and systems of the organization. Moreover, projects are hard to compare given their divergence on factors such as strategic benefits, stakeholder interests, risk levels, interdependencies, urgency, timing of returns and investment type. Without well-developed decision systems and processes to address this complexity, organizations often have few choices other than to fall back on “Plan B”: long, drawn-out debates; politicking and gaming the system; the gut instincts of a brave executive and staff who adhere to existing processes, though with distrust and skepticism; or deferring to quantitative analysis alone, letting the numbers make the call.
Improving the quality of capital allocation decisions—whether it is a “one-off” investment decision or the cyclical (e.g., annual) review of one or more investments in a portfolio setting—offers often sizeable opportunities for enhancing shareholder value and creating strategic advantage. Lessons can be drawn from the major improvements in process quality implemented over the past few decades: With a structured approach to decision quality, “costs go down and productivity goes up as improvement of quality is accomplished by better management of design, engineering, testing and by improvement of processes.”2
Applying management quality concepts to capital allocation decision-making may improve value and strategic alignment while reducing costs and inefficiencies. Here we focus on a two-pronged architecture designed to improve decision quality:
- Process quality: What are the framework and processes for a high-quality design for capital decision-making?
- People quality: What are the governance, skill level and learning processes required for improving the decision process?
Our proposed architecture has the following elements: decision framing, data management, analytics, decision-making and iterative learning. As we define these elements, we will also identify a range of capabilities that are associated with each element and suggest some steps for developing these capabilities within the organization. The return on investment for instilling a robust decision-making architecture for capital allocation3 can be significant, and can result in additional benefits.4
Figure 1. The architecture for decision quality
Framing the capital investment decision, or portfolio of decisions, is perhaps the most critical step in the decision process, as it lays the foundation for later steps. Structuring and framing the investment decision is as much an art as it is a science. A multistakeholder process with the objectives and preferences clearly laid out is recommended, as is the use of creative facilitation techniques to identify alternatives, risks and opportunities.
Figure 2. Architecture for improving decision quality: People quality
Stakeholders inside the organization can be grouped into a three-tiered governance structure: 1) decision-makers who set the strategic direction and decide what gets funded; 2) the decision advisors conducting capital project business cases and portfolio analysis to provide insights to the decision-makers; and 3) project proponents and team members as well as subject matter specialists throughout the organization. Decision-makers often rely on information from numerous teams and specialists. The recommended framing process is informed by the direction and criteria defined top-down by the decision-makers, is managed by the decision advisors, and is one that makes use of the knowledge of multiple team members and specialists.
For large strategic investment decisions, the outcome of a structured framing process might include a factor map, a graphical tool showing the key drivers of uncertainty and value, which serves as a blueprint for data and analysis and represents a robust synthesis of multiple stakeholders’ views on the investment decision. For portfolio prioritization decisions, which require culling from dozens, hundreds or thousands of projects, an important step in structuring the decisions is developing a valuation framework that accounts for the broad range of financial and strategic objectives of the organization to create a consistent way of comparing the various divergent projects and their benefits.
Yet decision framing is often minimized or overlooked. In developing capital project business cases, people tend to start gathering inputs right away and to fill out spreadsheets too soon. When we start building a financial model and collecting data without first framing the decision, we run the risk of collecting the wrong data and falling prey to common cognitive biases. The field of behavioral economics has examined the phenomenon of cognitive bias in decision-making for several decades, and a number of recent books and articles have applied the theory, developed in academia, to business management.5
Recommendations for improving investment decision framing include:
- Stakeholders: Pursue the broadest participation in the framing process that is practically feasible.
- Environment: Focus on creating an environment for the stakeholders that is conducive to unconstrained thinking.
- Questions: Be prepared to ask tough questions rather than reinforcing what is known and comfortable.
Broad stakeholder participation in the framing process would seem to be common sense, yet it is all too easy to bypass, given how frequently organizations end up structured in silos. The group of stakeholders needed for good framing often involves people outside your immediate department or business unit. For instance, combining the capital project team with the operating team during the framing sessions will contribute to a smoother transition from project implementation to operational readiness. Building the principle of broad stakeholder involvement into your decision architecture during framing can improve decision quality by helping develop a more robust business case, a more efficient data collection process, and clearer roles and responsibilities, and it can facilitate consensus building around the final decisions resulting from the business case analysis.
Creating an environment that promotes unconstrained thinking (as opposed to convergence or group think) is perhaps more art than science. One innovative approach that is gaining traction in some organizations is the use of improvisation games and methods from the field of theater performance. These techniques may be used to foster a more accepting context for creative brainstorming. The theory behind this approach can in part be traced to lessons from a psychologist, Lev Vygotsky, studying how children learn and develop during play. He observed what he called a zone of proximal development (ZPD) when children learn language from their parents or when they perform beyond themselves during play.6 A ZPD is an environment where children are both accepted as they are and encouraged to perform as though they were, in Vygotsky’s words, “a head taller” than they are. In framing complex decisions, improvisation games can help create a kind of ZPD for adults to draw a range of views from a group and can help group members go beyond their day-to-day behaviors to think more creatively about the investment decision.
Asking the right, sometimes difficult, questions is a key ingredient of framing. When structuring a large strategic investment decision, it is crucial to understand how risks and uncertainties may affect the investment decision. One useful question to ask stakeholders is: “How could we be wrong?” This pushes participants to examine the assumptions underlying the decision and to explore how the investment might turn out differently than expected. Using a premortem question can also help identify and frame risk factors: Assume it is three years in the future and the capital project has failed, and ask the stakeholder group to create stories that would result in this outcome.7 It is generally easier for team members to talk about risk factors in a hypothetical story than to speak critically of the project.
In addition, when framing an approach to analyzing investments, it is important to understand the breadth of objectives that contribute to value creation for the organization. What are the financial and strategic objectives that decision-makers care about when reviewing investment proposals? What would different stakeholder groups consider a good outcome from the capital budget? The net result of this framing process should be a comprehensive valuation framework that captures the breadth of criteria that add value to the organization. This valuation framework would then serve to assist in judging investment proposals on their merits.
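The valuation-framework idea described above can be sketched in code. The criteria, weights and project scores below are hypothetical, and a weighted sum is just one common way to combine financial and strategic objectives into a single comparable score:

```python
# Hypothetical multi-criteria valuation framework: combine normalized
# financial and strategic criterion scores into one comparable number.

CRITERIA_WEIGHTS = {          # weights must sum to 1.0
    "npv": 0.40,              # normalized financial value (0-1)
    "strategic_fit": 0.30,    # alignment with corporate strategy (0-1)
    "risk": 0.20,             # inverted: higher score = lower risk (0-1)
    "urgency": 0.10,          # timing pressure (0-1)
}

def composite_score(scores: dict) -> float:
    """Weighted sum of a project's criterion scores."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

projects = {
    "Plant expansion":   {"npv": 0.9, "strategic_fit": 0.6, "risk": 0.4, "urgency": 0.3},
    "Network upgrade":   {"npv": 0.5, "strategic_fit": 0.9, "risk": 0.7, "urgency": 0.8},
    "Pilot R&D program": {"npv": 0.3, "strategic_fit": 0.8, "risk": 0.5, "urgency": 0.2},
}

# Rank projects by composite score, highest first
ranked = sorted(projects, key=lambda p: composite_score(projects[p]), reverse=True)
for name in ranked:
    print(f"{name}: {composite_score(projects[name]):.2f}")
```

The weights themselves should come out of the framing process with decision-makers, not from the analysts alone.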
Data and assumptions are central to the decision process and often considered to be one of its weakest links. Common challenges include data overload (vast quantities of data make it hard to know where to start), irrelevant data and lack of data. Starting with a framing process to identify important questions enables the purposeful and targeted use of data. Data can be thought of as falling into three categories:
- Market data: Data for which there are observable market transactions, for example, stock prices, commodity prices, foreign exchange rates.
- Empirical data: This is where the emerging trend of data analytics fits into the process, using historical data to predict future outcomes using techniques ranging from basic trend analyses, e.g., descriptive statistics and correlations, to more complex approaches, e.g., regression analyses and predictive models.8
- Subjective assessments: Use when market data are not available and empirical data are insufficient and not appropriate, but assumptions nevertheless need to be made.
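To illustrate the empirical category, here is a minimal sketch of a basic trend analysis: fit a least-squares line to historical demand and extrapolate it. The data and the linear-trend assumption are illustrative only:

```python
# Minimal empirical trend analysis: ordinary least-squares line fitted to
# hypothetical historical demand, then extrapolated to a future year.
import statistics

years = [2018, 2019, 2020, 2021, 2022]
demand = [100.0, 108.0, 115.0, 124.0, 131.0]   # hypothetical units

# Closed-form OLS slope and intercept
mean_x, mean_y = statistics.mean(years), statistics.mean(demand)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, demand)) / \
        sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

forecast_2025 = intercept + slope * 2025
print(f"trend: {slope:.1f} units/year; 2025 forecast: {forecast_2025:.0f}")
```

More complex approaches (multivariate regression, predictive models) follow the same logic of letting historical data discipline the forward-looking assumption.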
Organizations can use all three kinds of data to determine the attractiveness of investment decisions; this article focuses on subjective data assessments, often perceived as both unreliable and the most challenging to formulate.
A thoughtful approach is required to capture subjective assessments from subject matter specialists and to minimize cognitive bias. Subjective assessments can include financial model assumptions, such as data points and probability ranges, as well as answers to survey questions and scoring scales. Research has shown that, when faced with complex questions, we tend to oversimplify and fall back on rules of thumb, known as heuristics, which can bias our judgment and lead us to underestimate risk. One approach is to inform the specialist about cognitive biases and to deconstruct the question into smaller “bite-size” assessments that are less likely to be subject to heuristics and biases. Conducting defensible subjective assessments of data rests on using “debiasing” techniques and questions. As an example of how this can improve insights, a power company evaluating its strategy and capital projects for a foreign subsidiary had initially assessed that the biggest value improvement would come from investments in operational efficiencies. After a careful assessment of inputs in a probabilistic business case analysis, it became clear that there were even larger value opportunities in investing in lobbying efforts to make the case for government tariff increases.
Another helpful process step in improving the quality of subjective assessments is for the investment review body to hold data quality review sessions. Using the same principle as above of deconstructing the process, the team can focus on one data element at a time (e.g., revenue growth rates, margin assumptions or scoring scale assessments) and line up the projects in rank order based on that one element. The key is to focus on projects that appear to be out of order relative to other projects for that one element. Repeat the process for each data element, one at a time. This tends to lead to a more reliable data quality review step than reviewing complete business cases one at a time to determine which data elements may be biased.
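This element-by-element review can be sketched as follows, with hypothetical project data; the point is simply to line projects up on one assumption at a time so values that look out of order stand out:

```python
# Data quality review sketch: rank all projects on a single data element
# at a time so outlying assumptions are easy to spot. Data are hypothetical.

projects = {
    "Alpha": {"revenue_growth": 0.04, "margin": 0.22},
    "Beta":  {"revenue_growth": 0.35, "margin": 0.21},  # growth looks optimistic
    "Gamma": {"revenue_growth": 0.06, "margin": 0.24},
    "Delta": {"revenue_growth": 0.05, "margin": 0.55},  # margin looks optimistic
}

def rank_by(element: str) -> list[str]:
    """Project names sorted high-to-low on one data element."""
    return sorted(projects, key=lambda p: projects[p][element], reverse=True)

for element in ("revenue_growth", "margin"):
    print(element, "->", rank_by(element))
```

The review team would then question why, say, one project’s growth assumption sits far above its peers, rather than rereading each business case end to end.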
Analytics is becoming accepted as a central component of better decision-making and a potential source of competitive advantage. But developing these organizational capabilities requires a roadmap and investments. Some organizations are effective at evaluating large projects but less skilled at making decisions about a high volume of smaller investments. Other organizations have the reverse set of capabilities. Rarely do we see organizations that are strong at both.
One of the key analytical challenges is addressing the uncertainty of an investment. In the absence of a crystal ball, “all models are wrong, but some are useful.”9 To develop a useful business case model that provides insights into the effect of uncertainty, those involved need to be able to capture possible future scenarios and to develop dynamic models for evaluating these scenarios. The framing step is critical to identifying and structuring the uncertainties and alternatives embedded in the investment decision, and it provides the blueprint for developing dynamic models to better measure those uncertainties and compare alternatives.
Building dynamic models—models that effectively represent the economics of the investment across a broad range of what-if scenarios—is a core skill for investment analysis. Dynamic models form the basis for simpler techniques, such as sensitivity analysis and scenario analysis (i.e., a model showing one cash flow scenario at a time), and for more complex models, such as simulation, decision trees and real options models (i.e., probabilistic models that consider and integrate numerous risk-based scenarios to calculate an expected value in light of all the modeled uncertainties and decisions). For example, a global developer was considering a large power plant project in an emerging market and whether to build it all at once and run the risk of building overcapacity or incur additional cost to build it in phases and better manage overcapacity risk. A real options model quantified and traded off the value and risks of these two scenarios and showed that in this case a phased approach appeared to be a superior strategy for that market. More robust models aim to provide greater insights about the impact of risk and potential mitigation strategies. However, these models require specific skills, in the absence of which they cause additional complexity and frustration and offer little value.
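At the simpler end of this spectrum, a one-way sensitivity (tornado-style) analysis can be sketched on a toy NPV model. The model structure, input ranges and figures below are hypothetical:

```python
# One-way sensitivity analysis on a simple dynamic NPV model: swing each
# input between a low and high case, holding the others at base, and
# record the resulting NPV range. All figures are hypothetical.

def npv(price, volume, capex, rate=0.10, years=10):
    """NPV of a level annual cash flow (price * volume) less upfront capex."""
    annual_cash = price * volume
    return sum(annual_cash / (1 + rate) ** t for t in range(1, years + 1)) - capex

base = {"price": 50.0, "volume": 1000.0, "capex": 250_000.0}
ranges = {
    "price":  (40.0, 60.0),
    "volume": (900.0, 1250.0),
    "capex":  (200_000.0, 320_000.0),
}

swings = {}
for var, (low, high) in ranges.items():
    swings[var] = abs(npv(**{**base, var: high}) - npv(**{**base, var: low}))

# Tornado ordering: largest NPV swing first
for var in sorted(swings, key=swings.get, reverse=True):
    print(f"{var}: NPV swing = {swings[var]:,.0f}")
```

More complex probabilistic models (simulation, decision trees, real options) extend the same dynamic model to many risk-weighted scenarios at once rather than one swing at a time.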
Case study: A global natural resources company enhances its investment decisions
Framing: The company’s CEO and leadership team embarked on a strategy of increasing return on invested capital (ROIC). The capital planning group implemented a decision quality approach for large strategic investments (above $100 million). For the first such investment, an expansion project, two dozen employees representing the full value chain participated in a two-day framing session. At first, there was resistance to brainstorming risk factors or discussing ways in which the project might yield disappointing results. On the second day, the team began with an improvisation game: they were asked to pretend they had all just been promoted to the board of directors, and their job over the next hour was to identify all the vulnerabilities of the project so it could be compared with other projects submitted to the board. In that hour, the team developed far more content on risk factors than it had during the entire first day. This was a subtle but effective change in mission: allowing participants to pretend to be someone else let them step outside their traditional behavior (e.g., not criticizing the project out of fear of repercussions). This provided the content for an effective framing workshop and yielded a blueprint of risk factors for a probabilistic risk model.
Data and analytics: Once the portfolio team had framed the project investment and its underlying risks and opportunities, it began developing a probabilistic decision tree model. An output of the model was a probabilistic risk profile showing the likelihood of the project achieving different future levels of performance, cash flows and value. Senior management knew it had a positive-NPV project; what it did not realize was that the project also had a 30 percent chance of not breaking even. The key risk factor during the construction phase was commodity price variance, and if the company hedged about 40 percent of the commodity price exposure, there would be just a 10 percent chance of not breaking even. That analytical insight created an “aha” moment for the board, which decided that future large projects would need to go through this kind of robust review.
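The mechanics behind such a risk profile can be sketched with a small Monte Carlo model. The figures below are hypothetical and deliberately do not reproduce the company’s numbers; the sketch only shows how partially hedging a volatile price reduces the simulated probability of not breaking even:

```python
# Hypothetical Monte Carlo risk profile: probability of a project not
# breaking even, with and without partially hedging the commodity price.
import random

random.seed(42)
N = 100_000
BASE_PRICE, PRICE_VOL = 100.0, 0.30   # expected commodity price, volatility
VOLUME, COST = 1_000, 85_000.0        # units sold, total project cost

def p_loss(hedge_fraction: float) -> float:
    """Simulated probability of a negative margin when hedge_fraction
    of the exposure is locked in at BASE_PRICE."""
    losses = 0
    for _ in range(N):
        spot = random.gauss(BASE_PRICE, BASE_PRICE * PRICE_VOL)
        realized = hedge_fraction * BASE_PRICE + (1 - hedge_fraction) * spot
        if realized * VOLUME - COST < 0:
            losses += 1
    return losses / N

p_unhedged, p_hedged = p_loss(0.0), p_loss(0.4)
print(f"unhedged: {p_unhedged:.1%} chance of not breaking even")
print(f"40% hedged: {p_hedged:.1%} chance of not breaking even")
```

In a real engagement the price process, hedge instrument and cost structure would be calibrated with subject matter specialists rather than assumed.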
Iterative learning: Leadership decided to implement the new decision quality process at the feasibility stage of each large-scale capital project. After conducting a couple of large projects using the new process, the decision-makers derived substantial value from improved business cases, better quantification and management of risks, and, in many cases, lower capital requirements. They also realized that some of the value improvement ideas could have created even more value had the improvements been designed into the project earlier. They therefore revised their process: in lieu of one in-depth project review during the feasibility stage, they began to conduct an initial, lighter robustness evaluation during the prefeasibility stage and a follow-on review during the feasibility stage. Not only did this enable early design improvements, it also allowed for a “double-loop” learning process in which the project team could revisit and challenge its original assumptions earlier and more frequently, have a second opportunity to practice difficult probabilistic data assessments with subject matter specialists, and observe and measure which risks changed (favorably or unfavorably) between the prefeasibility and feasibility stages.
Good models need good governance to have an impact. As more sophisticated mathematical models are deployed, the middle layer of the investment governance structure—the investment review body, for example—needs to fill the key role of translator or “interpreter” between the math people and business managers. Without this translation role, the analytical models run the risk of being perceived as black boxes that are not decision-relevant.
The level of analytical sophistication relevant to a particular organization should consider the current state of that organization, the dynamics within its industry, competitors’ practices, and the company’s desire or capacity for implementing change. Capital-intensive companies conducting large strategic capital projects (e.g., energy and natural resources, technology, pharmaceuticals, telecom, automotive) need greater precision and thoroughness, and it may be advantageous to focus on analytical tools that provide depth of insight across a broad range of future scenarios. For companies whose projects are smaller or less numerous (e.g., insurance companies or retail establishments), it may be better to develop analytical tools and processes that deliver speed and efficiency.
At this step in the process, decision-making should be a combination of facilitated analysis and experienced judgment. Bringing the power of robust analytics to a senior decision-making body requires both robust tools and effective facilitation techniques.
With a structured process for decision framing, data and analytics, the resulting analytical insights should be defensible and transparent, should address questions raised during the framing process, and should as a result lead to decisions that are superior to those based on gut instinct. More effective decision-making meetings rely on a facilitator playing an independent rather than an advocacy role: summarizing the results of a multistakeholder framing process, explaining the rationale behind key assumptions, and facilitating a discussion about the analytical results to enable decision-makers to better exercise judgment and evaluate the investment trade-offs. The investment review body, those responsible for assimilating business cases from throughout the organization and presenting the analytical results to decision-makers, also needs to understand the analytics behind the recommendations and be able to explain them intuitively to decision-makers. This is a critical role for the team, and one that many organizations could improve upon.
One useful tool for the decision-making team is data visualization. A challenge for decision-makers is capturing a holistic understanding of the many projects and the underlying data for those projects. Data visualization techniques address this challenge by enabling decision-makers to see the forest for the trees. However, intuitive graphic representations of a business case or a large number of capital projects need to strike a delicate balance between visual elegance and analytical rigor. When implemented effectively, data visualization can greatly assist decision-makers, especially during the decision-making process for many projects, by providing:
- A portfolio view of the inventory of projects, their values and their prioritization rankings.
- Trade-off analysis, enabling a strategic consideration and comparison of projects by type, benefit criteria, region, etc.
- “What-if” analysis, offering a view of the portfolio under different scenarios.
For example, a national company implemented a new portfolio approach to screening and prioritizing hundreds of projects. In prior years, decision-makers spent an entire week at an off-site meeting trying to reach agreement on what to fund, defer or reject for the coming year. By using visualization tools to view the portfolio of projects, project costs, sources of value, and relative levels of risk, they were able to reach agreement in a matter of hours.
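The “what-if” view described above can be sketched as follows. The projects, costs and value scores are hypothetical, and the greedy value-per-dollar ranking is only one simple prioritization rule a portfolio tool might display:

```python
# Hypothetical what-if portfolio view: which projects get funded under
# different budget scenarios when ranked by value per dollar of cost.

projects = [
    # (name, cost in $M, value score from the valuation framework)
    ("Refinery upgrade", 60, 90),
    ("Fleet renewal",    25, 45),
    ("IT platform",      15, 30),
    ("New depot",        40, 50),
]

def fund(budget: float) -> list[str]:
    """Greedy selection by value/cost ratio until the budget is exhausted."""
    funded, remaining = [], budget
    for name, cost, value in sorted(projects, key=lambda p: p[2] / p[1], reverse=True):
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded

for budget in (60, 100):
    print(f"${budget}M scenario -> fund: {fund(budget)}")
```

A visualization layer would render these scenario results graphically; the underlying data preparation is what makes the picture trustworthy.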
Iterative learning and continuous improvement are part of advancing the organization’s capital investment decision process. Some questions to consider for more effective organizational learning are:
- Have we created a working environment and set of performance metrics and incentives in which project teams can explore and learn from past experiences, positive and negative?
- Is the project team instructed to and willing to challenge its assumptions during and after the analysis process?
- Is there sufficient training and strategic communication to support new process steps, framing techniques, data assessment approaches or analytical tools in the capital investment process?
We observed iterative learning at an agency that adopted a new, structured approach to capital project business cases. The new tools captured leadership’s strategic direction, and through a combination of training, time and practice, the organization was able to improve its project proposals. Over the course of three funding cycles, it saw the average ROI of its portfolio of projects more than double.
Handsome, but not neat
For many organizations, the capital planning process can resemble a food fight or have the appearance of compliance under a cloud of confusion and mistrust. That is not a criticism, but rather a recognition that these are particularly complex decisions. Designing and implementing a consistent architecture for making capital allocation decisions can and should be the beginning of a process of continual improvement. This requires both hard skills and soft skills. Hard skills are required for a data management strategy, implementation of and training in insightful analytical techniques, design of visualization and reporting tools, and translation of mathematical insights. Soft skills are needed for framing the decisions, managing the governance process, assessing data from specialists, facilitating analytical discussions with decision-makers, and setting the tone for an iterative learning culture.
Implementing a structured decision-making process that embeds the notion of decision quality within all levels of the company can result in multiple benefits:
- The ability to prioritize resources more efficiently and enhance strategic alignment.
- Streamlined, often faster, decision-making supported by relevant and timely insights.
- Improved team collaboration and stakeholder buy-in.
- Greater ability to forecast investment requirements and expected future returns.
- Increased consistency of the level and quality of analysis for investment decisions.
- Improved transparency and greater understanding of investment uncertainties.
Investment decisions are often complex, often far more so than they might appear. Yet that complexity can be tamed. Revisiting the existing capital decision-making architecture—and developing a prioritized list of steps for improving your organization’s decision quality—is rarely a neat process. But it is an effort that can repay a company handsomely, and in many ways.
Endnotes
- Source: 1) Deloitte Dbriefs Webcast, “Capital Planning Trends: New Ways Organizations Are Adapting to Uncertain Times,” July 12, 2010, 1330 participants (Deloitte Development LLP). 2) Deloitte Dbriefs Webcast, “Capital Allocation: Increasing Your Odds for Placing the Right Bets,” March 27, 2007, 882 participants (Deloitte Development LLP). 3) Deloitte’s CFO Signals Survey, May 2010, 134 CFO participants (Deloitte Development LLP).
- Mary Walton, The Deming Management Method, Perigee, 1988, p. 88.
- We recognize there are other important related issues to achieving capital efficiency in addition to the quality of decision-making, such as project implementation, monitoring and evaluation, aligning the performance and incentives system, capital structure and financing decisions, among others. We have not attempted to address these additional important topics in this article.
- In his webinar, Carl Spetzler estimated, “The ROI on investing in decision quality [at a major oil & gas company] was astronomical—1000 to 1. Measurement is difficult but accuracy is not the issue.” Carl Spetzler, “Decision Quality: The Next Wave,” Society of Decision Professionals, webinar, December 14, 2011.
- Two recent and important books on the topic of bias and decision-making are Daniel Kahneman, Thinking, Fast and Slow, Farrar, Straus and Giroux, New York, 2011, and Dan Ariely, Predictably Irrational: The Hidden Forces That Shape Our Decisions, HarperCollins, New York, 2010.
- Cited in Fred Newman and Lois Holzman, Lev Vygotsky: Revolutionary Scientist (Critical Psychology Series), Routledge, New York, 1993, pp. 65-66.
- For a description of the premortem approach to addressing overconfidence, see Kahneman, pp. 264-265. “Imagine that we are a year in the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5-to-10 minutes to write a brief history of that disaster.”
- For detailed discussion on data analytics, see “Beyond the Numbers: Analytics as a Strategic Capability,” by James Guszcza and John Lucker. Deloitte Review, Issue 8, 2011.
- This quote is attributed to George Box, a statistician known for work in quality control, time-series analysis, design of experiments and Bayesian inference. See “Robustness in the Strategy of Scientific Model Building” (May 1979), in Robustness in Statistics: Proceedings of a Workshop (1979), edited by R.L. Launer and G.N. Wilkinson, p. 2.