Leaders frequently encourage teams to apply an experimental mindset to business initiatives and to reframe failures as learning opportunities. But in reality, few organizations systematically explore the reasons behind their wins and losses — which means they routinely miss out on valuable insights.
Taking a structured approach to learning helps top management teams dig below the surface to examine the factors that contributed to success or failure. Doing so is particularly important when navigating volatility and uncertainty, whether that pressure comes from external forces or from the risks inherent in product launches, new business models, or other growth efforts. When leaders treat outcomes as a source of strategic insight, they get better at recognizing and reinforcing successful behaviors and processes and at applying the most effective approaches across the company.
To help leaders discover and act on such insights systematically, we have used observations from a longitudinal study of corporate growth, resilience, and longevity to create three powerful tools. The Decompose, Interpret, Reward, and Scale (DIRS) framework guides managers in applying lessons from execution to future business development opportunities and repeatable growth strategies. Embedded in this framework is the Learning From Execution Matrix, which helps leaders categorize their company’s growth efforts according to the results and their usefulness for future efforts. To assist them with making decisions based on their findings, we offer the Stop, Improve, Intensify, Start (SIIS) assessment, which brings discipline to what can otherwise be an opportunistic and political process.1
By making sense of what works and what does not, leaders can turn scattered wins into an engine for growth and ground their strategic planning not in existing strengths and old success formulas but in insight gained from the current, and evolving, environment.
The Learning From Execution Matrix draws attention to results and their underlying drivers. Regularly using this tool in the planning process enables top management teams to build their growth plans around validated mechanisms of value creation uncovered through analysis rather than extrapolations from past results. The DIRS framework supports this shift by turning insights from execution into the foundation for planning.
Learning in Action: The DIRS Framework
Top management teams often assess growth initiatives simply: They either worked or did not. However, leaders may not understand the reasons for these outcomes, making successes harder to replicate and flops harder to avoid in the future.
Sometimes, when a promising initiative does not produce the expected results, dismissing it as a failure also closes off future exploration.2 What if the effort fell victim to unusual circumstances? A supply chain disruption at the moment a product launched might have prevented it from reaching key markets despite strong preorder demand. A regulatory change may have altered market conditions just as an initiative was gaining traction.
Conversely, some initiatives may appear successful due to temporary tailwinds. Consider a product launch that coincided with a competitor’s quality crisis, or a pricing strategy test that occurred during a supply shortage. If teams do not probe such nuances, they cannot take away strategic lessons about what drives sustainable success versus temporary results influenced by external factors.
Therefore, it is critical to extract the strategic lessons when evaluating growth initiatives — not just whether the goals were achieved but also what caused the results. Yet, knowing whether an initiative offers useful lessons is only the starting point. What comes next is more difficult: turning that understanding into action. The DIRS framework offers a four-step approach designed to embed structured learning into decision-making and help organizations uncover and implement repeatable growth strategies.
A European logistics company that we studied used the DIRS framework to help it grow from a local competitor of FedEx and UPS into an international air and ground shipping provider. Launched in the last decade, the company now operates in over 200 countries with strategic hubs across North America, Asia, and Europe. It employs over 300 people and has increased its revenue tenfold in the past eight years, exceeding $70 million.
Senior leaders in the company credit its sustained growth in large part to their decision to treat the outcome of every initiative as a learning opportunity. Beginning in 2024, when the company exceeded its profit target by 50%, the leadership team embedded the key elements of the DIRS framework into its planning cycles. In what follows, we will explain each of the four components and how the company applied them.
1. Decompose: Identify outcome drivers. To learn why a project hit its goals or fell short, leaders need to decompose its performance — that is, conduct a root-cause analysis that moves beyond general impressions to identify and analyze specific factors that determined the outcome. Although leaders can choose from a variety of approaches, a practical one is to focus the analysis on profit drivers and types of customers.
Decomposing performance by profit drivers involves disaggregating the company’s financial results into its key growth components. Those may include introducing new products and services, acquiring new customers, increasing share of wallet from existing customers, increasing prices, benefiting from overall market growth, and improving cost efficiency. Each driver may be influenced by specific actions that can be tested, tracked, and improved. For example, decomposing a successful new product launch might reveal that the top-line growth it generated stemmed more from aggressive discounting that hurt margins than from the acquisition of more high-lifetime-value customers. Similarly, an initiative that failed might still have validated a new product concept that has the potential to grow once pricing or cost structure is improved.3
Decomposition by types of customers involves breaking down results by customer segments based on the organization’s go-to-market strategy. An initiative that appears to be successful overall might show concerning patterns when examined by customer segment. If, for example, overall growth is attributable to new customers while existing customers are leaving, the outcome of the initiative may hold clues to weaknesses that could lead to failure in the near future.
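For teams that track these results in a spreadsheet or a short script, the arithmetic behind decomposition is simple to automate. The Python sketch below is a minimal illustration using hypothetical figures and segment names; it is not drawn from the logistics company's actual data, and the categories would need to match an organization's own profit drivers and go-to-market segments.

```python
# Minimal sketch of decomposing year-over-year revenue growth by customer segment.
# All figures (in $ thousands) and segment names are hypothetical, for illustration only.

revenue = {
    "existing_clients": {"prior_year": 4_200, "current_year": 4_050},
    "clients_in_acquisition": {"prior_year": 600, "current_year": 700},
    "new_clients": {"prior_year": 0, "current_year": 950},
}

total_prior = sum(seg["prior_year"] for seg in revenue.values())
total_current = sum(seg["current_year"] for seg in revenue.values())
total_growth = total_current - total_prior

print(f"Total growth: {total_growth:+,} ({total_growth / total_prior:+.1%})")

# Attribute the headline growth to each segment to see where it actually came from.
for segment, seg in revenue.items():
    change = seg["current_year"] - seg["prior_year"]
    share = change / total_growth if total_growth else 0.0
    print(f"  {segment:24s} {change:+7,}  ({share:+.0%} of total growth)")
```

With these made-up numbers, the headline figure is a healthy gain, yet all of the growth comes from new clients while revenue from existing clients shrinks, the kind of pattern the decomposition discipline is designed to surface.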
The European logistics company we mentioned decided to build its operational planning process around its customer base, which consists of three groups: existing clients, clients in the process of being acquired, and new clients. Each year, the company’s account managers not only set performance targets for each group but also decompose the prior year’s results. Decomposition helps them understand whether past gains came from new customers, more spending by existing customers, successful pricing strategies, or broader market momentum. The account managers use their findings as the foundation for all subsequent steps in the planning cycle.
One year, the account managers learned that revenue growth was due mainly to one large new client. Retention of existing clients and development of other new clients had fallen short of their goals. Without the decomposition process, this insight would have remained hidden in the positive overall growth results. Instead, the discipline revealed both strategic risks and growth opportunities.
Once leaders have made efforts to identify the drivers of their results, a simple 2×2 matrix can help them extract the strategic lessons. (See “The Learning From Execution Matrix.”) This matrix offers a conceptual model for characterizing the outcome of any business initiative on two dimensions: “Were the goals achieved?” and “Does the team understand the reasons for the results?” Plotted against each other, these two questions form four distinct quadrants. Each calls for a different leadership response.
Hit represents winning projects for which the drivers of success are well understood. Teams can apply the behaviors and processes that contributed to the results across the organization and turn them into standard practices.
Luck captures initiatives that met their goals but whose drivers of success remain unclear, making the results unreliable as directional indicators. Success may not be repeatable, and trying the same approach again could put the organization at risk. Leaders need to investigate further.
Learning covers failed initiatives that provide useful insights. These are not wasted efforts: Top management teams can use the insights to refine their strategic assumptions and mental models — that is, their beliefs about their industry, customers, and competition.
Defeat highlights initiatives that failed for reasons no one understands. If new business initiatives commonly fall into this quadrant, it signals that the organization is not learning to succeed. It is headed toward stagnation and failure.
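To make the quadrant logic concrete, the matrix's two questions can be read as a simple classification rule. The Python sketch below is our illustrative rendering of that logic, with hypothetical wording for each recommendation; it is a thought aid, not a tool the framework prescribes.

```python
# Minimal sketch of the Learning From Execution Matrix as a classification rule.
# The two inputs correspond to the matrix's two questions:
# "Were the goals achieved?" and "Does the team understand the reasons for the results?"

def classify_initiative(goals_achieved: bool, drivers_understood: bool) -> str:
    """Map an initiative onto one of the four quadrants."""
    if goals_achieved and drivers_understood:
        return "Hit: standardize and spread the practices that worked."
    if goals_achieved:
        return "Luck: investigate further before repeating the approach."
    if drivers_understood:
        return "Learning: update strategic assumptions with the insight gained."
    return "Defeat: a warning sign that the organization is not learning."

# Example: goals were met, but the team cannot yet explain why.
print(classify_initiative(goals_achieved=True, drivers_understood=False))
```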
The matrix helps teams move from simple scorekeeping to thoughtful reflection that improves decision quality and sets the stage for sustainable growth by repeating what works. The logistics company uses a classification system like the matrix to categorize its growth initiatives. Notably, when company leaders have clear evidence that an opportunity is viable (favorable market conditions, for example), they will take additional steps to understand the drivers of success and develop the necessary competencies, even if the results fall short of expectations. Without such evidence, an initiative is not worth the effort of further investigation.
2. Interpret: Find meaning in the results. Once leaders have mapped out where successes or failures fall on the matrix, they can begin to interrogate what the outcomes reveal about their assumptions, strategies, and tactics. They can probe, for example, whether they have identified the right customers and whether they possess the necessary expertise and processes to deliver what the market demands.
Further analysis can illuminate which processes and expertise consistently contribute to success and which must evolve with changing conditions. Particularly when performance falls short of expectations, it is crucial to understand whether the organization has failed to maintain its standards or lacks new competencies that are critical to achieving its goals.
This interpretation should lead to decisions about which strategies and tactics to pursue. In our research, we observed a common pattern among companies that prioritized learning: They consistently identified which of their practices to stop, improve, intensify, or start.
Leaders whom we have worked with at several dozen companies now use SIIS assessments to support shifting resources toward the growth mechanisms they have validated. They stop ineffective activities. If an activity seems promising but needs refining, they improve it. When a practice succeeds repeatedly, they intensify it by deploying it across the organization. And, because organizational learning may serendipitously reveal new areas for growth, teams can start to explore and validate promising ideas. (See “Assess Options Through the SIIS Lens.”)
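As a rough illustration, an SIIS assessment can be thought of as a decision rule applied to each practice. The Python sketch below uses simplified, assumed criteria (whether an activity is effective, whether its success has been repeated, and whether it is a new idea surfaced by learning); a real assessment rests on the richer evidence produced by the Decompose and Interpret steps.

```python
# Minimal sketch of an SIIS assessment as a decision rule.
# The three input signals are simplifying assumptions for illustration;
# they stand in for the richer evidence a real assessment would use.

def siis_recommendation(effective: bool, repeated_success: bool, new_idea: bool) -> str:
    """Return the SIIS action suggested by the (simplified) signals."""
    if new_idea:
        return "Start: explore and validate the idea on a small scale."
    if not effective:
        return "Stop: redirect resources toward validated growth mechanisms."
    if not repeated_success:
        return "Improve: refine the promising practice before scaling it."
    return "Intensify: deploy the proven practice across the organization."

# Example: an established activity that is not producing results.
print(siis_recommendation(effective=False, repeated_success=False, new_idea=False))
```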
When leaders in the European logistics company dug into why its efforts at acquiring new clients were falling short, they found that the business development team had focused on generating numerous quick offers that did not resonate with potential customers. Evaluating practices through the SIIS lens pointed them toward two decisions: They stopped the practice that did not work, and they started to deeply engage with fewer potential clients, cocreating a value proposition with them. These new efforts led to a noticeable improvement in achieving their sales targets.
3. Reward: Reinforce learning behaviors. Too often, organizations fall into the trap of celebrating only the visible wins — initiatives that meet or exceed their targets. However, if learning is to contribute to how the organization grows, people should be rewarded not just for results that matter but also for the thinking and behavior that led to them. Leaders should be evaluated not just on outcomes but on how they develop capabilities, share insights, and build organizational learning capacity.
Rather than rewarding only aggregate outcomes, recognition should acknowledge progress on specific drivers. For example, teams that successfully penetrate a new customer segment can be recognized for that achievement even if revenue falls short due to external factors.
Reward systems should provide incentives for teams to act on insights from the Interpret step. Teams that stop ineffective activities, improve promising approaches, intensify proven strategies, or start new initiatives based on learning can be recognized for their actions even if there is no immediate positive financial outcome. At the logistics company, the bonus that the business development team leader earns is based in part on whether their team practices these behaviors.
Reward and recognition can take other familiar forms, such as acknowledgement in leadership meetings, incentives that incorporate learning-based criteria alongside performance-based ones, and performance reviews that assess the rigor with which individuals apply and share what they have learned. Rewarding learning and its application sends a powerful cultural signal that curiosity, reflection, and intellectual honesty are core to how the organization grows.
4. Scale: Amplify what works. In many organizations, scaling an initiative means taking a pilot that succeeded and doing more of it. This approach misses the nuance and rigor that true strategic learning requires. Scale is not about copying tactics. It is about embedding validated mechanisms of value creation (the initiatives in the Hit and Learning quadrants of the Learning From Execution Matrix) into how the organization chooses and executes growth efforts. This disciplined amplification of what works is the final and most critical step of the DIRS framework.
Scaling effectively requires systematic learning processes, expanded competencies, and buy-in across the organization.
Systematic learning processes can take many forms, such as monthly sales reviews that map current results against the annual plan or regular problem-solving sessions. Companies can also hold organizationwide conversations to share information and insights about their successes and failures, both expected and unexpected.
Scaling also requires the development of capabilities that align with a company’s profit drivers and clients. It is not enough to copy successful approaches; teams need a foundation to succeed in different conditions. For example, if success with acquiring customers stems from a particular sales approach, scaling it will involve developing the market insight, relationship-building expertise, or value articulation skills that will make it successful in a different region or for a different product or service. Further, leaders can prioritize learning agility, analytical thinking, and collaborative behaviors that support the DIRS approach when they hire.
Companies will also need ways to disseminate the necessary skills, behaviors, and tools, such as through case studies and benchmarks. Beyond formal training, this can include repositories of successful practices, peer learning sessions, and knowledge transfer systems that help teams understand not just what to do but why, and how to adapt approaches to their specific contexts.
Finally, leaders, front-line managers, and their teams need to buy in fully and be open to learning. The DIRS framework is effective only to the extent that leaders create a culture in which uncovering the reasons behind results and applying what they learn becomes routine. Respecting the results of the DIRS process enables organizations to avoid the trap of active inertia, where leaders become so committed to past success formulas that they resist changing how they operate.
Embed Strategic Learning Into the Planning Cycle
For DIRS to have organizationwide benefits, leaders will need to integrate it into their planning process. Companies using the framework have incorporated it into their quarterly reviews and annual strategy development.
Leaders and managers in those companies use the insights from DIRS to set goals and decide how to achieve them. For example, the leadership team may demand a 10% revenue increase. Evidence about which initiatives get results, the drivers of those results, and the factors that contributed to success or failure takes the guesswork out of choosing the highest-value plans.
Front-line managers and teams should be engaged in uncovering insights and proposing growth opportunities. When people shape their goals, they are more likely to own them and more motivated to meet them. The result is a planning system that bridges learning and execution. It allows organizations to continually test, refine, and scale what works while shedding ineffective practices and avoiding repeating mistakes.
Learning on its own is not enough, however. Repeatable success is supported by a broad set of leadership principles that define how the organization succeeds. These might be operational rules like “how we prioritize projects” or strategic imperatives such as “prioritize quality before cutting costs.” Such principles matter because they create a foundation for decision-making.4
The DIRS framework supports an examination of these principles and can reveal when they have become outdated or counterproductive. Leaders can be more confident both in their approach to running the organization and in their ability to change course when the evidence shows that they should.
Resilient and dynamic organizations do not simply celebrate what worked or discard what did not. They analyze and reflect. They reward learning, not just outcomes. And they adapt based on what they discover.