When smart people fail together

“In every financial crisis, many billions are lost not by crooks but by smart people doing honest work, often as part of a committee.” Jason Zweig wrote that in 2009. It still holds.

Committees can amplify wisdom or destroy it. Their output depends on how they’re tuned—by “tuning” I mean the group’s shared mental models, processes, and balance of skills and domain expertise.

So: how can a group improve the filters that govern collective decision-making?

First, understand bias. The human brain is an unreliable instrument under uncertainty, and naming its flaws helps tame them. If it has a name, it has a face.

Second, anchor on process. In investing, results emerge from repeatable systems. Good venture funds operate from a thesis—a directional map. Quant hedge funds live at the opposite pole: they shorten the feedback loop between input and output to near real-time. But both depend on alignment around the basic truths of their field. Without that, collaboration becomes noise.

Noise is the enemy. Groups add it easily, especially through groupthink. Diversity of perspective is the antidote, though not to be confused with diversity of principle. A good team contains different pairs of eyes on the same truth, not different truths.

Third, invest in a shared epistemology. How much evidence is “enough”? What logic governs the tie-break: empiricism, first principles, or precedent? What is the agreed method for evaluating new ideas? In any domain of repeated judgment under high uncertainty, eliminating systematic error is the only path to durable success.

And today, that path runs through data. Ten years ago, heuristic judgment could suffice; the infrastructure wasn’t ready. Now, ignoring data is like entering a boxing match one-armed. If you’re Tyson, maybe you’ll win. Most aren’t.

Consensus should exist only at the level of foundational truth. Beyond that, it breeds mediocrity. In fact, committees could use a “scrum master” of sorts—a behavioral-science referee tasked with exposing flaws in individual and group logic.

Below are a few cognitive biases worth keeping on the radar. Think of them as recurring bugs in the operating system of judgment:

  • False equivalence – mistaking resemblance for parity

  • Cherry picking – privileging confirming data

  • Representativeness heuristic – assuming similar appearances imply similar odds

  • Anchoring – clinging to the first data point

  • Scarcity bias – equating rarity with value

  • Social proof – mistaking consensus for truth

  • Sunk cost fallacy – persisting because of prior investment

  • Clustering illusion – seeing patterns in noise

  • Endowment effect – overvaluing what we own

  • Procrastination and inertia – deferring hard calls under uncertainty

The list is long because bias is persistent. But awareness converts it from fatal flaw to manageable friction. The goal isn’t perfect rationality—it’s consistent calibration.

In the end, good group decisions come from shared truth, diverse perception, and tight feedback loops. Everything else is variance.
