
Picture a room where ten highly intelligent people examine the same information and reach identical conclusions. Sounds reassuring, right? That level of agreement must mean the decision is solid. Except history tells a different story. Some of the most catastrophic organizational failures—from corporate bankruptcies to intelligence failures to disastrous military campaigns—happened precisely because everyone in the room agreed. The problem wasn’t lack of expertise or insufficient data. The problem was that nobody challenged the consensus. Enter the Tenth Man Rule, a counterintuitive protocol that deliberately engineers disagreement into decision-making processes. If nine people agree, the tenth person must assume they’re all wrong and build a case against the prevailing view, no matter how solid that consensus appears.
This isn’t about playing devil’s advocate for the sake of intellectual exercise or creating conflict where none exists. The rule addresses a fundamental vulnerability in human cognition: our tendency to seek harmony, reinforce existing beliefs, and mistake agreement for accuracy. When stakes are high—when decisions involve significant resources, human lives, or organizational survival—this vulnerability becomes dangerous. The rule emerged from one of the most painful lessons in modern military history, when an entire nation was caught completely unprepared despite having access to warning signals. Today, it’s being adopted by forward-thinking organizations that recognize an uncomfortable truth: sometimes the greatest risk isn’t being wrong, it’s everyone being wrong together while feeling completely confident about it. As someone who studies group dynamics and organizational psychology, I find this principle fascinating because it forces us to confront how poorly designed most of our decision-making systems actually are. This article will explore where this rule came from, why it works from a psychological standpoint, what it contributes to group reflection, and how organizations can implement it without descending into perpetual disagreement.
Origins and Historical Context
The rule gained widespread attention through the 2013 film World War Z, where an Israeli intelligence official explains how his country anticipated a zombie outbreak while others dismissed it as impossible. According to the fictional narrative, after the 1973 Yom Kippur War caught Israeli leadership completely by surprise despite available intelligence, the country instituted a radical policy. Whenever nine members of their ten-person security council reached the same conclusion, the tenth member was obligated to disagree and investigate the opposite scenario, no matter how improbable.
While the zombie scenario is obviously fiction, the underlying inspiration is historical fact. The Yom Kippur War truly did represent a catastrophic intelligence failure for Israel. Egyptian and Syrian forces launched a coordinated surprise attack on October 6, 1973, catching Israeli forces unprepared despite numerous warning signs in the preceding weeks. The Israeli intelligence community had fallen victim to what analysts later called “the concept”—a firmly held belief that Arab nations wouldn’t attack until certain preconditions were met. Evidence contradicting this belief was rationalized away or ignored entirely. Multiple intelligence officers saw the warning signs but assumed others knew better or that their concerns would disrupt consensus.
The actual Israeli intelligence reforms following this disaster didn’t establish a formal “Tenth Man Rule” exactly as portrayed in the film, but they did implement structural changes designed to prevent consensus-driven blind spots. They created dedicated units tasked with challenging prevailing assumptions, established clearer protocols for escalating dissenting views, and built systems to ensure contrary evidence received serious consideration rather than dismissal. The cinematic version dramatizes and simplifies these reforms into a clear principle that’s proven remarkably influential in business and organizational psychology circles.
What makes this principle resonate isn’t its literal historical accuracy but how it crystallizes a solution to a problem every organization faces. The film gave a name and narrative framework to something psychologists had been studying for decades: the tendency of cohesive groups to make terrible decisions while feeling confident about them.
The Psychology of Groupthink
To understand why the rule matters, we need to understand the phenomenon it’s designed to combat. In 1972, psychologist Irving Janis introduced the concept of groupthink—a mode of thinking that occurs when a group’s desire for harmony and consensus overrides its ability to realistically appraise alternative courses of action. Janis studied major policy disasters, including the Bay of Pigs invasion, the Pearl Harbor attack, and the escalation of the Vietnam War. In each case, intelligent, experienced decision-makers made choices that, in retrospect, seemed obviously flawed.
Janis identified eight symptoms of groupthink that eerily describe what happens in many boardrooms and leadership meetings. First, there’s an illusion of invulnerability that creates excessive optimism and encourages extreme risk-taking. The group starts believing they’re too smart, too experienced, or too well-informed to make serious mistakes. Second, there’s collective rationalization where members discount warnings and refuse to reconsider their assumptions. Any information contradicting the group’s preferred course gets explained away rather than genuinely evaluated.
Third, groups develop a belief in their inherent morality, convincing themselves that because their intentions are good, they don’t need to worry about ethical or moral consequences. Fourth, they create stereotyped views of opponents or alternatives, dismissing other perspectives as too flawed to warrant serious consideration. Fifth, direct pressure gets applied to dissenters. Anyone who questions the group consensus faces subtle or overt pressure to conform, often framed as being “not a team player” or “not understanding the bigger picture.”
Sixth, individuals engage in self-censorship, minimizing their own doubts and refraining from expressing concerns. This happens even when someone has serious misgivings because they assume everyone else is confident, so their own doubts must be unfounded. Seventh, there’s an illusion of unanimity where silence is interpreted as agreement. If nobody voices opposition, the group assumes everyone agrees, not recognizing that multiple people might be self-censoring simultaneously. Finally, some members become “mindguards” who actively shield the group from adverse information that might disrupt the consensus.
These symptoms create a perfect storm where groups of smart people make terrible decisions while feeling completely confident. The scariest part? Groupthink is more likely in highly cohesive groups with strong leadership—exactly the characteristics we typically celebrate in teams. The very things that make groups effective in many contexts become liabilities when facing complex, high-stakes decisions requiring careful analysis of multiple scenarios.
Cognitive Biases in Group Settings
Groupthink doesn’t operate in isolation. It’s amplified by numerous cognitive biases that affect how we process information and make decisions. Confirmation bias leads us to seek information that supports our existing beliefs while dismissing or minimizing contradictory evidence. In group settings, this becomes collective—everyone unconsciously filters information through the same lens, reinforcing the dominant narrative.
The availability heuristic makes us judge the likelihood of events based on how easily examples come to mind. If the group shares similar experiences and references, they’ll all have similar “available” examples, leading to convergent thinking that feels like validation but might actually represent shared blind spots. Authority bias leads people to defer to perceived experts or leaders within the group, even when those authorities might be wrong.
Social proof—our tendency to assume that if everyone else believes something, it must be true—becomes particularly powerful in group contexts. Each person’s confidence reinforces others’ confidence in a self-amplifying cycle. The more people who agree, the more pressure subsequent individuals feel to conform, even if they have private doubts. This creates cascades where early consensus locks in, making later dissent increasingly difficult and unlikely.
How the Tenth Man Rule Works
The rule establishes a systematic protocol for introducing dissent into group decision-making. Here’s how it operates in its ideal form. First, all ten members of the decision-making group receive the same information simultaneously. Nobody has an informational advantage, and there’s no formal hierarchy within the group—everyone is considered an intellectual equal for the purpose of this process.
Second, members independently analyze the information and form initial conclusions. This independent analysis phase is crucial because it prevents early voices from anchoring everyone else’s thinking. Third, members share their conclusions. If nine members arrive at the same conclusion, the tenth person is automatically designated as the contrarian and must develop a comprehensive case for why the consensus is wrong.
Fourth—and this is critical—the tenth person must genuinely assume the other nine are incorrect and investigate that scenario thoroughly. This isn’t about poking minor holes in the plan or raising token objections. It’s about building a complete alternative analysis based on the assumption that the consensus conclusion is fundamentally flawed. The tenth person must ask: “If this decision leads to disaster, what would the warning signs have been? What are we missing? What assumptions are we making that might be wrong?”
Fifth, the tenth person presents their contrarian analysis to the group, who must seriously consider it before finalizing their decision. The group doesn’t necessarily have to adopt the tenth person’s view, but they must be able to specifically address the concerns raised and explain why they remain confident in their original conclusion despite the counterargument.
Importantly, the role of tenth person rotates. It’s not always the same individual playing contrarian, which prevents the role from becoming marginalized or the person from being dismissed as “just the designated pessimist.” Everyone knows they might be the tenth person on the next decision, which creates incentive for everyone to take the role seriously and engage genuinely with contrarian arguments when they’re presented.
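The five-step protocol above can be expressed as a small decision procedure. The sketch below is purely illustrative—the member names, the consensus threshold, and the rotation-by-name tiebreaker are my own assumptions, not part of any documented protocol—but it captures the core mechanics: detect near-unanimous agreement, then formally designate a contrarian.

```python
from collections import Counter

def tenth_person_protocol(conclusions, threshold=0.9):
    """Illustrative sketch of the Tenth Man protocol.

    `conclusions` maps each member's name to the conclusion they
    formed independently (step 2). If at least `threshold` of the
    group converges on one view (step 3), a contrarian is designated
    (step 4): the dissenting member if one exists, otherwise an
    arbitrary member chosen by a simple rotation rule. All names and
    the 0.9 threshold are hypothetical illustrations.
    """
    tally = Counter(conclusions.values())
    majority_view, count = tally.most_common(1)[0]
    if count / len(conclusions) < threshold:
        # No strong consensus: ordinary deliberation continues.
        return {"consensus": False, "contrarian": None}
    # Strong consensus: someone must now build the case that the
    # majority is wrong before the group finalizes its decision.
    dissenters = [m for m, c in conclusions.items() if c != majority_view]
    contrarian = dissenters[0] if dissenters else sorted(conclusions)[0]
    return {"consensus": True,
            "majority_view": majority_view,
            "contrarian": contrarian}
```

With nine members concluding one thing and a tenth concluding another, the function flags consensus and assigns the dissenter the contrarian role; in a fully unanimous group, the rotation rule picks the designee instead. The point of the sketch is that designation is automatic and procedural—nobody volunteers, and nobody is exempt.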

Contributions to Group Reflection
The rule contributes to organizational learning and group effectiveness in several interconnected ways. Most fundamentally, it forces explicit consideration of alternative scenarios and failure modes. Groups naturally gravitate toward confirming their preferred course of action. The rule institutionalizes skepticism, ensuring that someone systematically asks “What could go wrong?” and “What are we assuming?” These questions often go unasked in normal group dynamics because raising them feels like obstructionism or negativity.
Second, the rule reduces diffusion of responsibility—the psychological phenomenon where individuals feel less personally accountable when part of a group. When everyone agrees, it’s easy for each person to think “Well, if I’m wrong, everyone else is wrong too, so it’s not really my fault.” The tenth person rule creates specific accountability. Someone must take ownership of the contrarian position and develop it thoroughly. This prevents situations where everyone loosely agrees but nobody has examined the decision critically.
Third, the rule legitimizes dissent and creates psychological safety for disagreement. In many organizational cultures, voicing disagreement with emerging consensus feels risky. You might be seen as difficult, not understanding the situation, or lacking team spirit. The rule reframes dissent from a social violation into a procedural requirement. When it’s your turn to be the tenth person, disagreeing isn’t risky or rude—it’s your job. This gives permission for others to voice concerns in different contexts too, because the culture has established that disagreement is valuable rather than threatening.
Fourth, it enhances cognitive diversity even in demographically homogeneous groups. You can have a room full of people with similar backgrounds, training, and perspectives, but by forcing one person to systematically develop an alternative view, you create cognitive diversity procedurally rather than relying on it emerging naturally. This is particularly valuable because executive teams and leadership groups in many organizations are relatively homogeneous.
Fifth, the process improves decision quality by stress-testing assumptions before implementation. It’s far better to have someone punch holes in your plan while it’s still a plan rather than having reality punch holes in it after you’ve committed resources. The tenth person functions as a pre-mortem analyst, imagining the decision failed and working backward to identify what went wrong. This perspective reveals vulnerabilities that forward-looking analysis often misses.
Differences from Traditional Devil’s Advocacy
Many organizations already use devil’s advocate roles, so what makes the tenth person rule different? Several key distinctions make it more effective at combating groupthink. Traditional devil’s advocacy is often optional and informal. Someone might volunteer to play devil’s advocate, or a leader might say “Let me play devil’s advocate for a moment.” The rule makes it mandatory and systematic—if consensus emerges, contrarian analysis must occur.
Devil’s advocacy is frequently superficial. The devil’s advocate might raise a few objections or poke at some assumptions, but there’s often an underlying sense that this is a formality before moving forward with what everyone knows is the right decision. The rule requires the tenth person to develop a comprehensive alternative analysis, not just raise token objections. They must assume the consensus is wrong and build a complete case for that assumption.
Traditional devil’s advocacy often has a designated person—someone known as the skeptic or contrarian who fills that role habitually. This leads to their input being discounted. “Oh, that’s just Sarah being the devil’s advocate again.” The rule rotates the role, meaning everyone must be willing to seriously challenge consensus when it’s their turn and seriously consider challenges when someone else is the tenth person.
Devil’s advocacy typically operates within a hierarchical structure where the devil’s advocate might be junior to the decision-makers. The rule requires intellectual equality among all ten participants. There’s no formal hierarchy within the group for purposes of the decision, meaning the tenth person’s view carries equal weight to everyone else’s. This prevents the contrarian position from being summarily dismissed by senior leadership.
Finally, devil’s advocacy is often applied inconsistently—used for some decisions but not others based on someone’s judgment about whether it’s needed. The rule triggers automatically when consensus emerges, regardless of how confident people feel. This is crucial because groupthink is most dangerous precisely when groups feel most confident. If you only use the procedure when you’re uncertain, you miss the situations where you need it most.
When to Apply the Tenth Man Rule
The rule isn’t appropriate for every decision. Applied indiscriminately, it would paralyze an organization with endless debate over trivial matters. So when should it be used? The rule works best for high-stakes, low-probability scenarios—what Nassim Taleb calls “black swan” events. These are situations where the consequences of being wrong are severe, even if the probability seems low.
It’s appropriate for strategic decisions with significant resource commitments. Before launching a new product line, entering a new market, making a major acquisition, or divesting a business unit, the systematic contrarian analysis the rule provides can reveal critical vulnerabilities. It’s valuable for crisis response and emergency planning. When everyone agrees on how to handle an emerging crisis, that’s exactly when you need someone systematically developing the scenario where everyone’s wrong and the crisis is actually different than it appears.
The rule makes sense for decisions where there’s potential for catastrophic failure. If failure means bankruptcy, massive casualties, environmental disaster, or other irreversible harm, the cost of the tenth person’s analysis is trivial compared to the potential cost of groupthink. It’s useful when facing unprecedented situations without clear historical analogies. When you’re in genuinely novel territory, your intuitions and pattern-matching might lead everyone to the same wrong conclusion. Having someone systematically challenge those intuitions provides a crucial check.
Conversely, the rule shouldn’t be used for routine operational decisions. Nobody needs to develop a comprehensive contrarian analysis about what brand of coffee to stock in the break room. It’s inappropriate for decisions with easy reversibility. If you can quickly and cheaply reverse course if something doesn’t work, the elaborate analytical process isn’t worth the time. It’s not suited for situations requiring rapid response where analysis paralysis would be worse than imperfect action.
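The applicability criteria described above can be summarized as a simple checklist. The sketch below encodes them as a predicate; the field names are illustrative labels I've chosen for the criteria in the text, not an established framework.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    """Illustrative checklist for whether a decision warrants the rule.

    Field names are hypothetical labels for the criteria discussed
    in the text, not a standard taxonomy.
    """
    catastrophic_failure_possible: bool   # bankruptcy, casualties, irreversible harm
    major_resource_commitment: bool       # acquisition, new market, divestiture
    unprecedented_situation: bool         # no clear historical analogy
    easily_reversible: bool               # cheap and quick to undo
    requires_rapid_response: bool         # delay worse than imperfect action
    routine_operational: bool             # trivial, everyday choice

def warrants_tenth_person(d: Decision) -> bool:
    """Exclusion criteria veto first; then any trigger criterion suffices."""
    if d.routine_operational or d.easily_reversible or d.requires_rapid_response:
        return False
    return (d.catastrophic_failure_possible
            or d.major_resource_commitment
            or d.unprecedented_situation)
```

A major acquisition (irreversible, large resource commitment) passes the test; stocking coffee in the break room (routine, trivially reversible) does not. The ordering matters: the exclusions come first precisely because the rule should not be applied indiscriminately.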
Implementation Challenges
Despite its elegance in theory, implementing the rule faces several practical challenges. First, it requires a high-trust organizational culture. People need to believe that playing the tenth person won’t hurt their careers or relationships. If someone thinks they’ll be seen as difficult, negative, or not a team player for thoroughly challenging consensus, they’ll go through the motions without genuine engagement. Building this trust takes time and consistent leadership modeling.
Second, it demands significant expertise and intellectual capacity from all participants. Everyone needs to be capable of both building sound arguments for a position and systematically deconstructing those same arguments when it’s their turn to be the tenth person. This limits who can effectively participate in the process. You can’t simply apply this rule to any arbitrary group of ten people and expect good results.
Third, the rule is time-consuming and cognitively demanding. Developing a comprehensive contrarian analysis takes real effort. Groups facing time pressure may feel they can’t afford this process, even for important decisions. Organizations need to build in adequate time for the analysis and accept that good decisions sometimes take longer than convenient.
Fourth, there’s risk of the process becoming performative. People might go through the motions of contrarian analysis without genuine intellectual engagement, especially if the culture doesn’t truly value dissent despite the formal procedure. The tenth person might raise some token objections, the group might address them superficially, and everyone moves forward with what they wanted to do anyway. Preventing this requires constant vigilance and cultural reinforcement.
Fifth, it can be difficult to determine when to conclude the analysis. The group needs shared criteria for when the tenth person’s concerns have been adequately addressed versus when those concerns should change the decision. Without clear resolution criteria, the process can devolve into endless debate or, alternatively, premature closure where legitimate concerns get dismissed for the sake of moving forward.
The Neuroscience of Dissent
Recent neuroscience research illuminates why going against group consensus feels so difficult and why the rule needs to be formalized rather than relying on people naturally speaking up. Studies using brain imaging show that when individuals hold opinions differing from the group, they experience activation in brain regions associated with pain and anxiety, particularly the amygdala and anterior cingulate cortex. Conforming to group opinion, even when you privately disagree, reduces this neurological distress.
Our brains are wired for social cohesion because for most of human evolutionary history, being excluded from the group meant death. This deep-rooted tendency to seek agreement and avoid conflict served us well in small tribal contexts but becomes maladaptive in modern organizational decision-making. The neurological discomfort of dissent is real and automatic—you can’t simply decide to not feel it. This is why cultural permission and procedural requirement are necessary. The rule doesn’t eliminate the discomfort of dissent, but it reframes that discomfort as appropriate and valued rather than as a signal that you’re doing something wrong.
Research on reward processing shows that agreeing with others activates the brain’s reward centers, releasing dopamine and creating positive feelings. This means that reaching consensus literally feels good at a neurological level. The feeling of agreement doesn’t indicate the agreement is correct—it just indicates your brain’s reward system is doing what evolution designed it to do. Understanding this helps group members recognize that feeling great about unanimous agreement might be a warning sign rather than confirmation they’re making the right choice.
Real-World Applications Beyond Intelligence
While the rule originated in security and intelligence contexts, its applications extend across many domains. In corporate governance, some boards have adopted modified versions for major strategic decisions. Before approving large acquisitions or fundamental strategy shifts, they designate a board member to develop a comprehensive case against the proposal, regardless of their personal view. This has helped identify deal-breakers in proposed acquisitions that would have proceeded based on management enthusiasm and board consensus.
In medical settings, some hospitals use similar protocols for unusual or high-risk treatment decisions. When a treatment team reaches consensus on a course of action for a complex case, they assign someone to systematically develop the case for why that approach might be wrong and what alternative diagnoses or treatments should be considered. This has caught diagnostic errors and identified treatment complications that unanimous confidence had obscured.
Venture capital firms have implemented versions of the rule for investment decisions. When partners reach consensus that a deal is attractive, one partner must develop the comprehensive case for why the startup will fail and the investment will be lost. This challenges the pattern-matching and enthusiasm that can lead to poor investment decisions despite all partners feeling confident.
Military planning incorporates similar principles through red teaming—having a dedicated team play the role of adversary and systematically identify vulnerabilities in planned operations. While not identical to the tenth person rule, it shares the core insight that consensus among planners might reflect groupthink rather than sound strategy, and someone needs to be formally tasked with finding the flaws.
The Limits of Contrarianism
While the rule provides valuable protection against groupthink, it’s important to recognize what it doesn’t do. First, it doesn’t guarantee correct decisions. Sometimes the consensus is actually right, and the tenth person’s contrarian analysis, no matter how thorough, is addressing unlikely scenarios that won’t materialize. The rule improves decision quality on average, but individual decisions can still fail despite its application.
Second, it doesn’t substitute for genuine diversity of thought, experience, and perspective. Having ten people with identical backgrounds, training, and worldviews apply the rule is less effective than having a genuinely diverse group, even without the formal procedure. The rule is a tool to create procedural diversity when demographic and cognitive diversity might be limited, but it’s not a replacement for the real thing.
Third, the rule can’t overcome fundamental uncertainty. For some decisions, the information simply doesn’t exist to confidently predict outcomes regardless of how thoroughly you analyze alternatives. The tenth person might identify real vulnerabilities without that changing the fact that the decision involves irreducible uncertainty. Sometimes you have to act despite ambiguity, and no amount of contrarian analysis eliminates that reality.
Fourth, the rule assumes rational actors engaging in good faith. If the nine people in consensus are being driven by ulterior motives, political considerations, or personal interests rather than genuine belief in their conclusion, the tenth person’s analytical efforts won’t change those underlying dynamics. The rule works when the problem is cognitive bias and groupthink, not when the problem is deliberate deception or misaligned incentives.
Building a Culture That Values Dissent
The rule only works within an organizational culture that genuinely values intellectual humility and constructive disagreement. Building that culture requires consistent leadership behavior that models the desired norms. Leaders need to publicly acknowledge when they’ve changed their minds based on contrarian input, demonstrating that dissent leads to better outcomes rather than to punishment or marginalization.
Organizations need to separate disagreement about ideas from personal attacks or disloyalty. This distinction is harder than it sounds. When someone thoroughly challenges your proposal, it can feel like they’re attacking you personally or not supporting the team. Creating clear norms that vigorous debate about ideas strengthens rather than weakens professional relationships takes deliberate effort and reinforcement.
Performance evaluation and promotion decisions must reward thoughtful dissent, not just consensus-building and harmony. If people observe that those who go along get along while those who challenge get sidelined, they’ll quickly learn that the formal procedure doesn’t reflect real organizational values. The culture trumps the procedure every time.
Training helps people develop the skills to disagree constructively and to receive disagreement non-defensively. These aren’t natural talents for most people. Learning to attack ideas without attacking people, to find flaws without being dismissive, to acknowledge valid concerns while explaining why you reach different conclusions—these are learned skills that improve with practice and feedback.
FAQs About The Tenth Man Rule
Is the Tenth Man Rule based on a real Israeli intelligence policy?
The rule as depicted in World War Z is a dramatized and simplified version of real Israeli intelligence reforms following the 1973 Yom Kippur War. Israel did experience a catastrophic intelligence failure when Egyptian and Syrian forces launched a surprise attack despite available warning signs. The subsequent reforms included creating dedicated units to challenge prevailing assumptions and better systems for escalating dissenting views. However, there wasn’t a literal “tenth man” protocol exactly as portrayed in the film. The cinematic version captures the spirit of those reforms in a more dramatic and accessible format that’s proven influential in organizational psychology.
How is the Tenth Man Rule different from playing devil’s advocate?
Several key differences make the rule more effective than traditional devil’s advocacy. The rule is mandatory and triggered automatically when consensus emerges, while devil’s advocacy is typically optional. The rule requires comprehensive alternative analysis assuming the consensus is wrong, while devil’s advocacy often involves raising token objections. The tenth person role rotates among all members, preventing marginalization, while devil’s advocates are often designated individuals whose input gets discounted. Finally, the rule operates among intellectual equals without hierarchy, while devil’s advocates often have less formal authority than decision-makers.
When should organizations use the Tenth Man Rule versus when should they not?
The rule works best for high-stakes decisions where the consequences of error are severe, strategic decisions with significant resource commitments, crisis response scenarios, and unprecedented situations without clear historical analogies. It’s appropriate when potential for catastrophic failure exists. However, it shouldn’t be used for routine operational decisions, easily reversible choices, time-critical situations requiring rapid response, or trivial matters where the analytical investment exceeds the decision’s importance. Applying it indiscriminately would paralyze an organization with excessive debate.
What psychological phenomena does the Tenth Man Rule address?
The rule primarily combats groupthink—the tendency of cohesive groups to prioritize consensus over realistic appraisal of alternatives. It also addresses confirmation bias, where groups collectively filter information to support preferred conclusions, and social proof, where agreement among group members is mistaken for validation of correctness. Additionally, it reduces diffusion of responsibility by creating specific accountability for contrarian analysis, and it counteracts self-censorship by legitimizing dissent as a procedural requirement rather than social violation.
Can the Tenth Man Rule work in groups smaller or larger than ten people?
The specific number ten isn’t magical—the principle can adapt to different group sizes. The key is that when strong consensus emerges, at least one person must be formally designated to develop comprehensive contrarian analysis. For smaller groups, you might use a “fifth person rule” or similar. For larger groups, you might designate multiple contrarians or divide into subgroups. However, very small groups (three or four people) may not have sufficient diversity for the procedure to add value, and very large groups may need different decision-making structures entirely.
What skills do participants need for the Tenth Man Rule to work effectively?
Participants need high-level analytical abilities to build sound arguments and systematically deconstruct them. They require intellectual humility to genuinely consider they might be wrong and to change positions based on evidence. Strong communication skills help articulate contrarian positions persuasively and challenge others’ views constructively. Emotional intelligence enables separating disagreement about ideas from personal attacks. Finally, everyone needs domain expertise relevant to the decisions being made—the rule doesn’t overcome fundamental knowledge gaps among participants.
How do you prevent the Tenth Man Rule from becoming performative or superficial?
Preventing superficial implementation requires genuine cultural commitment to valuing dissent. Leaders must model the behavior by changing their minds when contrarian analysis reveals flaws and publicly acknowledging those changes. Performance evaluations should reward thoughtful dissent. Adequate time must be allocated for thorough contrarian analysis rather than rushing through the formality. The group needs clear criteria for when contrarian concerns have been adequately addressed versus when they should change the decision. Regular reflection on past decisions helps identify whether the process is genuinely improving outcomes or just adding procedural overhead.
What happens if the tenth person actually convinces the group they were right?
That’s the ideal outcome—the process revealed that the consensus was flawed before resources were committed. When this happens, the group should adopt the alternative approach or develop a third option that addresses the concerns raised. Importantly, the organization should celebrate this result rather than treating it as embarrassment. If people feel foolish for having been convinced by contrarian analysis, they’ll resist genuinely engaging with it in the future. The goal isn’t to defend original positions but to reach the best possible decision, and sometimes that means the minority view was correct all along.
PsychologyFor. (2025). The Tenth Man Rule: What it is and What it Contributes to Group Reflection. https://psychologyfor.com/the-tenth-man-rule-what-it-is-and-what-it-contributes-to-group-reflection/

