# Anti-Distillation — Strategic Indispensability in the Age of AI
## The distillation trap
The organizational goal is straightforward: convert individual human capital (expertise that leaves when the person leaves) into organizational capital (knowledge embedded in systems that persist). AI agent skills are the mechanism — employees distill their judgment, edge-case knowledge, and decision-making into structured prompts and workflows.
The problem: this is a Hundred Flowers trap. Employees who participate faithfully eliminate the knowledge moat that makes them valuable. The person who knows why the model does the weird thing for Brazilian entities — the one they call at 2 AM — is being asked to hand over the thing that keeps them employed.
## Three anti-distillation strategies
| Strategy | How it works | Effect on the org |
|---|---|---|
| Performative skill | Looks comprehensive, demos well, but omits the 20% of edge-case knowledge that makes it work in production | Employee becomes more indispensable — they’re the only one who can fix it when it fails |
| Poison pill | Encodes expertise faithfully but with subtle dependencies on context only the creator holds (internal wikis, coined terminology, owned pipelines) | Removing the employee causes outputs to drift quietly until someone says “bring them back” |
| Complexity moat | Makes the skill architecturally entangled with other work the creator owns | Extracting the knowledge is harder than keeping the person around |
## Why this is a rational response
This is not sabotage — it is rational self-preservation under misaligned incentives. The mandate asks employees to act against their own job security with no credible commitment that faithful compliance will be rewarded rather than punished. In game theory terms, the employer is asking for cooperation in a prisoner’s dilemma where the employer’s dominant strategy (reduce headcount once knowledge is captured) is transparent to the employee.
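The incentive structure above can be made concrete with a toy payoff matrix. This is a minimal sketch: the numeric payoffs, action names, and `best_response` helper are all hypothetical, chosen only to exhibit the structure the paragraph describes, in which the employer's dominant strategy once knowledge is captured is to cut headcount, and the employee's best response to that anticipated move is anti-distillation.

```python
# Hypothetical (employee, employer) payoffs for the distillation game.
# Numbers are illustrative, not empirical.
payoffs = {
    # (employee action, employer action): (employee payoff, employer payoff)
    ("distill_faithfully", "retain"): (2, 3),   # cooperative ideal
    ("distill_faithfully", "cut"):    (-2, 4),  # employer tempted to defect
    ("anti_distill", "retain"):       (3, 1),   # employee stays indispensable
    ("anti_distill", "cut"):          (1, -1),  # outputs drift; employer loses
}

def best_response(player, other_action):
    """Return the action maximizing `player`'s payoff, given the other's action."""
    if player == "employee":
        actions, idx = ("distill_faithfully", "anti_distill"), 0
        key = lambda a: (a, other_action)
    else:
        actions, idx = ("retain", "cut"), 1
        key = lambda a: (other_action, a)
    return max(actions, key=lambda a: payoffs[key(a)][idx])

# If the employee distills faithfully, cutting dominates for the employer...
print(best_response("employer", "distill_faithfully"))  # cut
# ...and anticipating that, the employee's best response is anti-distillation.
print(best_response("employee", "cut"))  # anti_distill
```

Because the employee can see the employer's dominant strategy in advance, the cooperative outcome is unreachable without a credible commitment device, which is exactly the mandate's missing piece.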
## Organizational consequences
The campaign to reduce dependence on individual experts instead produces experts who are strategically indispensable — valuable not because of what they know, but because of how they have structured the system to need them. The organization ends up with a more fragile knowledge infrastructure than it started with, because the fragility is now both intentional and camouflaged.
## Key Takeaways
- Knowledge distillation mandates without credible job-security commitments will produce anti-distillation behavior.
- The resulting “agent skills” will look complete in demos but fail in production in ways that only the creator can fix.
- Organizations should treat knowledge distillation as a retention and incentive design problem, not a compliance problem.
- The parallel to Mao’s Hundred Flowers Campaign is instructive: asking people to reveal what they know, then using it against them, teaches everyone to hide what they know.
## Related Notes
- Barrels and Ammunition - Why Hiring More People Makes Companies Slower — anti-distillation is most damaging when the people resisting are barrels, since their institutional knowledge is hardest to replace
- AI Transformation Requires Strong Form Org Redesign — strong-form redesign must address the incentive misalignment head-on rather than mandating compliance
- The AI Great Leap Forward — source clipping developing the full Hundred Flowers analogy