The South Indian Monkey Trap: How Leadership Biases Corrupt Decision-Making
The Hidden Traps of Dysfunctional Organizations
The story goes that in South India, a simple but cruel trick was used to trap monkeys. Villagers would carve a small hole in a coconut, place a tempting piece of food inside, and chain it to a tree. The hole was just big enough for the monkey to slip its hand in, but not big enough to pull out a fist gripping the prize. This classic example, referenced in Zen and the Art of Motorcycle Maintenance, illustrates that the real trap isn't physical; it's mental. Held fast by its own desire, the monkey remained stuck, unable to free itself even as its captors approached, a short-term craving eclipsing a longer-term existential danger.
Leaders fall into the same trap all the time. They cling to what are now clearly bad ideas, dysfunctional cultures, and failed strategies, even when the escape route is obvious. The result? A few rounds of “the emperor has no clothes” followed by collapse, scandal, or a slow, grinding decline to irrelevance.
Consider Theranos, the biotech darling that promised a medical revolution. Despite red flags, investors poured in hundreds of millions of dollars, valuing the company at roughly $9 billion at its peak. The culture? Obviously dysfunctional. Information was controlled, dissent was crushed, and leadership surrounded itself with loyalists. Employees who questioned the science were dismissed. The result? Mass deception, fraud charges, and total collapse.
Or take WeWork, where a charismatic CEO manipulated his board, siphoned millions in self-serving deals, and drove the company into near bankruptcy—all while insiders ignored red flags in favor of hype.
These aren’t outliers; these are patterns.
Organizations don’t fail because of one bad decision—they fail because of institutionalized dysfunction. For your refrigerator (yes, we’ll make magnets…), here are the warning signs:
Five Warning Signs that Your Organization has Institutionalized Dysfunction
Information is controlled – Key insights are restricted, creating information bottlenecks.
Selective loyalty reflects gatekeeping – An inner circle forms, shutting out dissenters.
Resistance to oversight is institutionalized – Audits, evaluations, and restructuring efforts are undermined.
Resources are monopolized – A small group controls access to power and opportunities.
Dissent is discouraged – Intimidation, discrediting, or subtle punishments silence critics.
If you've witnessed these patterns, you know the danger they pose. The question is: why do these failures keep happening? Surprise! It's about how we interpret risk, as individuals and in groups, and how we're built to pursue short-term risk avoidance/safety rather than long-term wellbeing, which requires more rationality and a more enlightened self-interest. More to come on that. But first: why do organizations made up of many decent people end up in such a terrible place?
The Psychology of Dysfunction: Our Biases Lead To Bad Decisions
Humans lead with our egos, prioritizing risk reduction and self-interest, which makes our minds fertile territory for self-deception and irrationality. Long-term cooperation is best for us, but we aren’t built to ensure it. What we have are our instincts and desires. We all want to be safe; some of us seek safety in control, most in submission (fight, flight, fawn). In the absence of positive controls, shared values adopted first and foremost by leaders (transparency and accountability for starters) that make it safe to speak up, dictators or manipulators with superficial charm will soon arrive to control increasingly dysfunctional organizations. Poor decision-making will become the norm. Soon it will be counter-cultural to raise common-sense objections. This should sound familiar. We have all seen it.
Why does this happen? Because left to our own devices, people make poor decisions by design. It is natural for humans to act out of ego/fear/desire. Sadly, our own mindsets keep us from acting on higher/bigger/better long-term goals.
Long-term rational cooperation depends on embracing the discipline of higher-level rules, like shared values, rather than responding instinctively to desire and fear. Our greatest successes come from thoughtful approaches to alignment that allow us to collaborate, disagree, and ultimately build together. Our failures arise from ego shaping our environments into fear-based hellscapes characterized by insularity, groupthink, and, finally, cult-like control.
The Biases that Shape Poor Decision-Making
Behavioral economics has shown that as individuals, we are each prone to cognitive biases—glitches in our thinking that distort reality and lead to bad choices. Common examples of cognitive biases:
Confirmation Bias – We seek out information that supports our existing beliefs.
Anchoring Bias – The first piece of information we receive has an outsized impact.
Availability Heuristic – What comes to mind most easily feels more likely and more relevant than what doesn’t.
Dunning-Kruger Effect – The less competent we are in an area, the more confident we feel.
Fundamental Attribution Error – We blame others' failures on character, not circumstances.
Hindsight Bias – We assume past events were more predictable than they were.
Loss Aversion & Sunk Cost Fallacy – Losses loom larger than gains, so we cling to what we’ve already invested in.
Self-Serving Bias – We attribute success to ourselves but blame failure on others.
Negativity Bias – We weigh bad experiences more heavily than good ones.
Group Biases Make Bad Decisions Even Worse
When we operate in groups, our biases don’t disappear; they intensify. Poor group decision-making is predictable, driven by these common collective biases:
Groupthink – Consensus is prioritized over critical thinking.
Authority Bias – People accept decisions from authority figures without question.
Conformity Bias – Individuals conform to the group even when they disagree.
Escalation of Commitment – Groups double down on bad decisions due to past investments.
Common Information Effect – Groups rehash what everyone already knows instead of surfacing insights held by only a few.
Polarization Effect – Groups push each other toward more extreme positions.
Diffusion of Responsibility – No one takes ownership because “someone else will.”
The bigger the group, the worse the decision-making, as individuals make little effort to participate authentically. These failures are predictable patterns, and they explain why, in the absence of positive approaches, companies become toxic, risk-averse or intoxicated by risk, and prone to collapse. But this can be prevented, and even turned around.
The Sting in the Tail: If You Think This Doesn’t Apply to You, You’re Already in Trouble
Every failed company, every toxic workplace, every scandal-ridden institution once believed it was the exception, until reality proved otherwise. Dysfunction isn’t something that only happens to other people; it happens to every organization that ignores the warning signs.
So ask yourself:
Are dissenting voices truly welcome in your organization, or just tolerated until they become inconvenient?
Are you tracking decisions and their outcomes, or just assuming the right choices will emerge?
Do you and your leadership team regularly challenge your own assumptions, or do you expect your perspective to be self-evidently correct?
If you’re not actively fighting against dysfunction, you’re passively enabling it. And dysfunction, left unchecked, always wins.
Like the monkey clinging to its prize, most organizations, having declined into this “natural” state, won’t let go of their dysfunction until it’s too late.
In our next piece, we’ll examine how organizations can institutionalize good decision-making rather than dysfunction.