Siddharth Ramakrishnan

Writing

When Efficiency Backfires

July 9, 2025

Open your inbox, a dating app, or an applicant tracking dashboard and you’ll notice the same paradox: technology has never made it easier to reach people, yet everyone complains about being buried in junk, swipes, or low-effort resumes. Matt Levine jokes that everything is getting more efficient and everyone hates it.

Economists (unsurprisingly) have a name for this: congestion externalities. When a market gets flooded with supply and demand, match quality drops. After weeks of reading papers and poking around real-world data, I think the real story of congestion externalities has two overlapping plots: congestion itself (everyone competes for the same finite pool of attention) and adverse selection (when entry is nearly free, low-quality participants flood in).

The trick is figuring out where the efficient frontier starts to turn south and designing just enough friction (or smarter filters) to stay on the happy side.

A quick tour of the research

| Mechanism | Canonical papers | Plain-English takeaway |
| --- | --- | --- |
| Congestion externality | Diamond-Mortensen-Pissarides; Hosios (1990) | Each extra job seeker helps employers (more choice) but hurts fellow seekers by soaking up attention. After a sweet spot, more volume = worse matches. |
| Adverse selection via low friction | Moen (1997); Menzio-Shi (2011); Wu (2024) | When applying / swiping is costless, marginal (low-quality) types flood in. High-quality folks stop playing, which lowers payoffs for everyone else. |

There are more papers, but these were the most salient. The shared result: beyond a threshold, markets left to themselves will overshoot, because each sender ignores the extra screening burden they dump on everyone else.
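To make that externality concrete, here's a toy simulation (my own construction, not a model from any of the papers above): reviewers split a fixed attention budget across every applicant, so each extra application makes screening noisier for everyone, and past a point the marginal applicant makes matches worse.

```python
import random

def match_quality(n_applicants, n_slots=100, attention_budget=500.0, seed=0):
    """Toy congestion model: per-applicant scrutiny shrinks as volume grows,
    so the reviewer's signal gets noisier. Returns the average *true*
    quality of the n_slots hires. All parameters are illustrative."""
    rng = random.Random(seed)
    # Fixed attention spread over more applicants -> noisier screening.
    noise_scale = 0.1 * n_applicants / attention_budget
    pool = []
    for _ in range(n_applicants):
        quality = rng.random()                        # latent quality in [0, 1]
        signal = quality + rng.gauss(0, noise_scale)  # what the reviewer sees
        pool.append((quality, signal))
    # Hire the top n_slots by the *noisy* signal.
    hires = sorted(pool, key=lambda p: p[1], reverse=True)[:n_slots]
    return sum(q for q, _ in hires) / n_slots

# Volume helps until screening noise swamps the signal, then quality falls.
for n in (200, 500, 2000, 10000):
    print(n, round(match_quality(n), 3))
```

Running this shows average hire quality rising from 200 to 500 applicants, then falling as the pool grows past the attention budget, which is exactly the "sweet spot" shape in the table above.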

How this shows up in the real world

Whether it's dating apps capping daily swipes, job boards charging per posting, or marketplaces gating who can sell, the common thread is the same: platforms re-introduce scarcity (caps, fees, or high-cost signals) once the pool gets too noisy.

What is the right amount of friction?

Retention suffers when users are consistently shown low-quality or irrelevant matches. In practical terms, once the false positive rate climbs above 10 to 15 bad matches for every good one, users begin to churn. This threshold functions as a kind of "Hosios point" for consumer attention, and beyond it, engagement rapidly declines.
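A back-of-envelope version of that threshold check, with the cutoff set to an assumed midpoint of the 10-to-15 range (the function name and default are mine, purely for illustration):

```python
def past_hosios_point(good_matches, bad_matches, max_bad_per_good=12):
    """Rough churn-risk check: is the platform showing users more than
    max_bad_per_good bad matches for every good one? The default of 12
    is an arbitrary midpoint of the 10-15 range, not an estimated value."""
    if good_matches == 0:
        return bad_matches > 0  # all noise, no signal
    return bad_matches / good_matches > max_bad_per_good

print(past_hosios_point(good_matches=1, bad_matches=20))  # over the line
print(past_hosios_point(good_matches=1, bad_matches=5))   # still fine
```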

Mechanisms like daily swipe limits, application quotas, or even fees help reduce both platform congestion and the incentive for users to adopt a spray and pray strategy. When every action has a cost, users tend to behave more deliberately, which improves overall signal quality.
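Mechanically, a daily cap is just a counter per user per day. A minimal sketch (the limit value and class name are illustrative, not any platform's actual numbers):

```python
from collections import defaultdict
from datetime import date

class DailyQuota:
    """Minimal sketch of a daily swipe/application cap."""

    def __init__(self, limit_per_day=10):
        self.limit = limit_per_day
        self.used = defaultdict(int)  # (user_id, day) -> actions taken

    def try_send(self, user_id, today=None):
        key = (user_id, today or date.today())
        if self.used[key] >= self.limit:
            return False  # friction kicks in: spray-and-pray blocked
        self.used[key] += 1
        return True

quota = DailyQuota(limit_per_day=3)
results = [quota.try_send("alice") for _ in range(5)]
print(results)  # → [True, True, True, False, False]
```

Each rejected action is the "cost" that nudges users toward spending their three shots deliberately.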

High quality signals are central to the system's integrity, so they need to be both costly and verifiable. Inputs like GitHub history, onchain credentials, and peer endorsements work well because they’re hard to fake but relatively easy for algorithms to evaluate. This makes them ideal for sorting genuine intent from noise at scale.
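One way such signals could be combined into a single sortable score, purely illustrative (the inputs, weights, and caps are my assumptions, not any platform's real formula):

```python
def credibility_score(commits_last_year, verified_credentials, endorsements):
    """Illustrative scoring of costly-but-verifiable signals.
    Each channel is capped so no single signal can be farmed to dominate."""
    score = min(commits_last_year / 100, 1.0)       # e.g. GitHub history
    score += min(verified_credentials * 0.5, 1.0)   # e.g. onchain credentials
    score += min(endorsements * 0.25, 1.0)          # peer endorsements
    return score / 3  # normalized to [0, 1]

print(credibility_score(50, 1, 2))  # → 0.5
```

The per-channel caps are the point: a signal is only useful for sorting if it stays expensive to fake at scale.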

Finally, while AI filters like LLM classifiers are valuable tools for identifying spam or low-effort submissions, they can’t carry the full weight alone. If the entry gate is too wide open, even the best models are stuck playing a perpetual game of Whack-A-Mole. Instead, AI should be used in tandem with structural incentives and credible signaling to keep a marketplace clean and trustworthy.
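Putting the last two ideas together, here's a minimal admission sketch, assuming a hypothetical classifier score and a refundable deposit as the costly signal (all names and thresholds are my assumptions):

```python
def admit(spam_score, deposit_paid, endorsements):
    """Sketch: structural friction first, AI filter second.

    spam_score   -- output of some classifier in [0, 1] (stubbed here)
    deposit_paid -- did the sender pay a refundable application deposit?
    endorsements -- count of verifiable peer endorsements
    """
    # Structural gate: costless senders never even reach the model,
    # so the classifier isn't stuck playing Whack-A-Mole alone.
    if not deposit_paid and endorsements == 0:
        return False
    # The model handles what friction alone can't catch.
    return spam_score < 0.5

print(admit(spam_score=0.1, deposit_paid=True, endorsements=0))   # → True
print(admit(spam_score=0.1, deposit_paid=False, endorsements=0))  # → False
```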

Open questions

A playbook for builders (or investors)

Here’s what I learned, distilled into a table for reference. Hopefully it’s helpful to someone!

| Goal | Friction lever | Current tool-set |
| --- | --- | --- |
| Keep noise < signal | Entry caps, pricing tiers, queue delays | Usage-based pricing; “slow mode” throttles |
| Preserve quality mix | Costly, verifiable credentials | Skill passports, verified selfies, brand registries |
| Reduce cognitive load | AI summarisers + “why you’re seeing this” blurbs | LLM-powered match rationales |
| Stay adaptive | Active-learning filters + human review | Continuous fine-tuning pipelines |

If your product lives in any matching market (jobs, dating, marketplaces, B2B lead gen), the winning strategy is thoughtful friction plus smart filters: strip away just enough sludge to keep high-quality players engaged, and let algorithms sweep up the rest.