Why Groups of Experts Watch Disasters Unfold and Let It Happen

We like to believe that more expertise equals more safety: more brains, more foresight, fewer mistakes. Yet history whispers a different story. From boardrooms to control centers, disasters often unfold not because people didn’t know, but because they didn’t speak.

Inside organizations, warning signs flare like distant fires. Everyone sees them, yet everyone assumes someone else will act. Spaces designed for rational thought become echo chambers of hesitation, where doubt is stifled by hierarchy and a consensus that masquerades as confidence.
This is not incompetence; it is psychology.

The Bystander Effect

Imagine you are at home late at night when loud, suspicious noises echo through your hallway. You peek out and notice that your neighbors have also heard them. Do you step in, or assume someone else will? If it’s the latter, you are experiencing the bystander effect. The classic bystander effect shows that responsibility diffuses when others are present.

In the 1960s, psychologists John Darley and Bibb Latané conducted a now-famous experiment to understand how people react in emergencies when others are present. Participants were asked to fill out a questionnaire in a small room that, after a few minutes, began to fill with smoke. When participants were alone, nearly 75 percent quickly reported the smoke, recognizing it as a potential danger. When a participant was placed with two confederates who had been instructed to ignore the smoke, however, only 10 percent reported it, taking the others’ calm as a signal that nothing was wrong. In a third version, where three uninformed participants shared the room, just 38 percent acted. The results revealed a powerful psychological pattern known as the bystander effect: in emergencies, individuals tend to remain passive when others are present, looking to each other for cues about how to respond and assuming someone else will take responsibility.

The Organizational Bystander Effect

Inside corporations, this same dynamic plays out as the organizational bystander effect, except the silence unfolds around a conference table, not a smoke-filled room. Two psychological traps reinforce it.

1. Diffusion of Responsibility
In expert groups, personal duty dissolves across the collective. Each individual assumes that someone else, perhaps more senior or qualified, will raise the alarm. Responsibility scatters through the hierarchy until it belongs to no one. This is reinforced by social inhibition, where authority and expertise make people hesitant to speak, afraid of appearing ignorant or confrontational.

2. Pluralistic Ignorance
When everyone stays silent, each person assumes their private doubt is misplaced. They think, “If it were truly critical, everyone else would be reacting.” The group’s calmness becomes a false signal of safety. This blends into groupthink, where the desire for harmony overrides critical evaluation. Individuals stay quiet because the group seems certain, and the group seems certain because individuals stay quiet.

Research by Hussain and Tangirala (HBR, 2019) found a paradoxical pattern: as workplace issues became more widely known, individuals became less likely to report them. Employees who believed they were the only ones aware of a problem were 2.5 times more likely to raise it. In other words, the more “obvious” the warning, the more invisible it becomes.

The cost of this collective quiet is staggering. Studies show that when employees suppress concerns, burnout rises, innovation stalls, and turnover accelerates: all symptoms of an organization under emotional strain.

The Consequences

The consequences can be catastrophic. In 1986, engineers at NASA and its contractor Morton Thiokol already knew that the Challenger’s O-rings were unreliable in cold temperatures, yet the launch proceeded even though the temperature that morning was only 36°F. Strong political, national, and media pressure pushed NASA to continue, and the Challenger exploded shortly after liftoff. The Rogers Commission Report (1986) later identified deeply flawed decision-making processes and organizational culture, not just technical failure, as a direct cause of the disaster, showing how NASA’s emphasis on schedule and public enthusiasm over safety cost seven astronauts their lives.

During the 2008 financial crisis, analysts and regulators recognized the dangers of subprime mortgages and complex derivatives, yet many assumed others would intervene, allowing the system to collapse. This massive pluralistic ignorance allowed systemic risk to grow unchecked because no individual or regulatory body was willing to break the quiet consensus of the market. Even startups feel the effect: teams without clear operational leadership often let problems fester, slowing growth and creating inefficiencies.

A similar pattern appeared in the corporate collapse of Enron in 2001. Many employees, auditors, and executives noticed warning signs in the company’s complex accounting practices, but diffusion of responsibility played a major role. Employees assumed that if something were truly wrong, the auditors at Arthur Andersen or the lawyers at Vinson & Elkins would raise concerns. At the same time, pluralistic ignorance emerged as employees privately doubted the financial structures used to hide debt but remained silent because their colleagues appeared unconcerned. The company’s aggressive “Rank and Yank” performance review system also created evaluation apprehension, discouraging employees from speaking out for fear of losing their jobs. The bystander effect at Enron was eventually broken by Sherron Watkins, an Enron vice president. Unlike the thousands who remained silent, she famously wrote an anonymous memo to CEO Kenneth Lay warning him that the company might “implode in a wave of accounting scandals.” Her action is often cited as the antidote to the organizational bystander effect.

The Way Out

Overcoming this effect requires deliberate design. Organizations must concentrate accountability, assigning final decision authority to a single executive rather than diffusing it across committees. They should institutionalize dissent, creating roles that challenge plans and reward speaking up.

As McKinsey’s 2019 Bias Busters article explains, the premortem exercise legitimizes skepticism by asking teams to imagine a project’s failure before it begins. In doing so, it turns criticism into contribution, making it safe, even admirable, to raise difficult truths.
Bystander intervention training, anonymous feedback systems, and psychological safety metrics ensure employees feel empowered to raise concerns without fear of retaliation.

Managers must make it clear that every voice counts. According to Google’s Project Aristotle, the most successful teams aren’t the smartest ones but those built on empathy and psychological safety, where individuals feel safe to speak up even when others stay silent. As Hussain and Tangirala recommend, the corporate mantra needs to be adjusted to:

“If you see something, say something (even if others see the same thing).”

Expertise isn’t protection if it hesitates. The failure of expert groups isn’t a moral lapse but a design flaw. Intelligence is meaningless unless it translates into action.

What separates organizations that survive from those that fail catastrophically isn’t the talent in the room; it’s the structures they build to guarantee that when someone spots the fire, they sound the alarm, and the rest of the group actually listens.

If your systems can’t catch the warning signs, who will?

Citations

  • Hussain, I., & Tangirala, S. (2019, January 14). Why open secrets exist in organizations. Harvard Business Review. https://hbr.org/2019/01/why-open-secrets-exist-in-organizations
  • Smith, S. (2020, October 7). How groupthink played a role in the Challenger disaster. Applied Social Psychology (ASP), Penn State University. https://sites.psu.edu/aspsy/2020/10/07/how-groupthink-played-a-role-in-the-challenger-disaster/
  • Boogaard, K. (n.d.). Diffusion of responsibility: What it is, examples and how to fight it. Marlee. https://getmarlee.com/blog/diffusion-of-responsibility
  • Lerebulan, H. E. N., & Amalia, L. (2023). The effect of employee silence on turnover intention, with burnout as a mediation variable and coworker support as a moderating variable. Indonesian Journal of Business Analytics, 3(1), 41–56. https://doi.org/10.55927/ijba.v3i1.3071
  • Klein, G., Koller, T., & Lovallo, D. (2019). Bias busters: Premortems: Being smart at the start. McKinsey on Finance, No. 70. https://www.mckinsey.com/~/media/McKinsey/Business%20Functions/Strategy%20and%20Corporate%20Finance/Our%20Insights/Bias%20busters%20Premortems%20Being%20smart%20at%20the%20start/Bias-busters-Premortems-Being-smart-at-the-start.pdf