Late one night in March 2018, a self-driving Uber vehicle fatally struck a pedestrian in Tempe, Arizona. The world had questions: Who was responsible? Was it the safety driver behind the wheel? The engineers who designed the algorithms? Uber’s leadership? Or the regulators who had allowed autonomous-vehicle testing? The inability to name a single culprit signaled a profound shift in how responsibility must be understood and attributed in the age of intelligent technologies.
As organizations deploy increasingly autonomous systems such as drones, trading bots, or algorithmic decision makers (like automated resume screeners or credit assessment tools), agency becomes distributed, emerging from the complex interplay of human and machine actions. Decisions, once linear and traceable, now unfold across networks of people and artificial intelligence systems, introducing new forms of influence and unpredictability.
For today’s leaders, this means that the old search for a culprit loses relevance. The real challenge is not to assign blame but instead to construct a shared narrative — to uncover not only what went wrong but how collective activities, assumptions, and technologies shaped the outcome. As our recent research, published in MIS Quarterly, shows, organizational learning and resilience depend on this collaborative revisiting of how decisions happen and how stories of responsibility are constructed. We call this process narrative responsibility.
Why Classic Models of Responsibility No Longer Work
Classic theories of responsibility have rested on three core assumptions: that the world is fundamentally linear, with events following clear cause-and-effect logic; that decision makers act in a shared space and time, making the link between actions and consequences traceable; and that responsibility can be precisely attributed backward to an individual whose intentions and choices drive outcomes.
Consistent with these assumptions, when something goes wrong, organizations often enact traditional models of accountability by holding a senior leader personally responsible. For instance, after two fatal crashes of Boeing’s 737 MAX aircraft killed 346 people in 2018 and 2019, CEO Dennis Muilenburg was swiftly dismissed as a visible response to the crisis. However, despite this action and promises of cultural change from his successor, the underlying quality and safety failures persisted — culminating in a door plug blowing off a 737 MAX midflight in 2024 and the departure of yet another CEO. Removing one individual rarely addresses the deeper, complex causes of organizational failure.
Such approaches to accountability have always faced limits, even before the rise of digital technologies. What’s new in the age of AI and automation is how much faster, more complex, and opaque decisions are becoming, making old models of accountability less tenable than ever.
Take the crash of Amazon’s Prime Air delivery drone in Oregon in 2022. While official reports focused on technical or operator errors, accountability for such incidents is inherently distributed — across coders, approval teams, and operations and project managers. Actions and consequences are dispersed in ways that old models of accountability simply cannot address.
This challenge demands a fresh approach, one that moves from assigning blame to practicing narrative responsibility.
Making Narrative Responsibility Real: Three Actionable Moves
Translating narrative responsibility from theory to practice requires that leaders reframe how accountability is constructed, sustained, and experienced so that every incident becomes a catalyst for collective learning and continual improvement. To make this shift, organizations must embed the practice at every level. Here’s how leaders can put its principles into action:
1. Map the real story — beyond the obvious. In the aftermath of an incident, organizational reviews — whether technical, legal, or managerial — often aim to converge toward a coherent causal account that enables closure and action. While such convergence is common and often necessary, it can also narrow the scope of responsibility by privileging stabilized explanations over contested or ambiguous ones. A narrative responsibility approach does not reject conventional audits but complements them by attending to how responsibility is constructed, anticipated, distributed, and gradually fixed through organizational storytelling, decision rationales, and silences over time.
Google’s response to its Gemini image-generation failure in early 2024 offers a partial model. When the tool generated historically inaccurate images, Google published a detailed public explanation tracing the root cause to flawed diversity tuning and misguided model behavior. Meanwhile, in an internal memo, CEO Sundar Pichai committed to structural changes, improved launch processes, and expanded red-teaming. This was genuine story mapping — naming what broke and why.
But a more comprehensive exercise might also have examined the competitive pressure to ship quickly, organizational incentives that discouraged cautious testing, and the gap between known risks and the decision to launch. Mapping the real story means going beyond the technical postmortem to surface the human and organizational dynamics that allowed the failure in the first place. It means going beyond individual errors or broken code to understand how assumptions, data, and organizational routines interact — and where ambiguity, blind spots, and misalignment take root.
2. Distribute ownership, not blame. In today’s complex AI-enabled organizations, decisions and outcomes emerge not from a single hand on the wheel but from dynamic interactions over time, which calls for a collective and distributed notion of responsibility. Real accountability depends on ongoing engagement and sensemaking across teams and functions. Too often, warnings or objections that were ignored or never voiced play as big a part as active missteps.
Forward-thinking organizations are creating formal structures, such as steering committees, incident review panels, traceability systems, and cross-functional advisory groups, to institutionalize narrative responsibility. These forums are designed as open, psychologically safe spaces where staff members at all levels can reflect on what happened, voice difficult truths, and collectively reconstruct how incidents unfolded. In health care, this shift is well underway: UCLA Health, for example, established a network of trained culture champions and incident review committees that examine adverse events to surface systemic patterns and drive improvement across the organization. The aviation sector offers a proven model of this collective-learning approach: After an automation-related failure, airlines like Air France and KLM, in line with European Union Aviation Safety Agency regulations, convene multidisciplinary panels as part of their safety management systems. These panels, aligned with the principles of “just culture,” focus not on blaming but on extracting lessons and adapting systemically. This approach has demonstrably strengthened airline safety and customer trust.
3. Embed reflection in everyday practice. For narrative responsibility to thrive, it must not be practiced only post-crisis; it must become organizational routine. Sustainable learning emerges when teams habitually review how stories of accountability are constructed — and reconstructed — across daily operations and the use of technologies like AI.
Some organizations add narrative review points to recurring meetings, asking, “What did we learn?” “Where did our assumptions or processes fail?” or “How did our actions contribute to the outcome?” (See, for instance, the chapter “Postmortem Culture: Learning From Failure” in Google’s book Site Reliability Engineering.) Others routinely include responsibility narratives in management reports, not only after incidents but as an ongoing practice — turning lessons learned into living documents that support continuous learning. ING Bank, for example, has built regular reviews and “retrospective learning sessions” directly into its agile routines. After each sprint, teams discuss what went well, what could be improved, and how lessons learned from critical events can inform future work, ensuring that key insights connect day-to-day operations to broader conversations about ethics and risk.
When the three principles are enacted, they reshape not just day-to-day operations but how organizations collectively respond to failure at all levels. Returning to the opening example of Uber’s tragic self-driving car incident, the official response centered on individual fault: The safety driver was prosecuted, and Uber halted its autonomous-vehicle program. Yet, as far as we know, organizational and systemic factors like design decisions, safety culture, and regulatory gaps, though extensively documented in the official investigation, received limited attention in subsequent public and judicial responses. A narrative responsibility approach — one that maps the real story with all stakeholders and technologies involved, distributes ownership beyond blame, and embeds ongoing reflection — would have invited all key actors to collectively examine what shaped the anticipated and realized outcomes. While this wouldn’t have reversed past harm, it could have surfaced deeper lessons, enabled more meaningful accountability, and driven more systemic change for the future.
From Blame to Shared Narrative
Sustaining narrative responsibility requires more than scattered initiatives. It must become part of an organization’s DNA.
As businesses adopt AI agents, they can no longer rely on compliance teams or retroactive audits to assign accountability. Instead, they must establish a shared practice of responsibility, constructing, questioning, and evolving the organizational narrative together. For all leaders and teams, this is a strategic, forward-looking imperative.
Embracing narrative responsibility is critical for today’s organizations, but it’s not a panacea. There are real risks, particularly if the process is used to diffuse or obscure accountability — especially when leaders control the story. It cannot substitute for legal or regulatory obligations: Frameworks like the European Union’s AI Act remain essential safeguards. And when responsibility is distributed across organizations, constructing shared accountability is complex and demands intentional openness and collaboration. For narrative responsibility to be transformative, it must complement — never replace — robust ethical and legal standards.