This morning, Anthropic published Claude Code v2.1.88 to npm with a 59.8 MB source map file still in the package. The .map file pointed to an unobfuscated TypeScript archive in Anthropic's Cloudflare R2 bucket. By 4:23 am ET, an intern at Solayer Labs had found it and posted it on X. The full 512,000-line codebase was mirrored on GitHub within hours and forked over 41,500 times.
Five days ago, Fortune reported that descriptions of an unreleased Anthropic model called "Mythos" were sitting in a publicly accessible data cache. Two incidents in one week, from a company preparing for an IPO.
The root cause was mundane. npm decides what ships based on the .npmignore file or the files field in package.json: .npmignore is a gitignore-style blocklist (npm falls back to .gitignore if it's missing), files is an allowlist. One wrong line in either and the source maps go out with everything else. Engineer Gabriel Anhaia put it well: "A single misconfigured .npmignore or files field in package.json can expose everything."
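To make that concrete, here's a sketch of the footgun, assuming a typical TypeScript package that compiles to dist/ with source maps alongside the JavaScript. The paths are illustrative, not Anthropic's layout:

```
# .npmignore — gitignore syntax; without the next line,
# every *.map the build emits goes to npm:
*.map
# and without this one, so does the original TypeScript:
src/
```

Drop either line and npm publish still exits zero. Nothing fails unless a check makes it fail.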
Why people are blaming vibe coding
Back in December, Boris Cherny, Anthropic's head of Claude Code, posted that "in the last thirty days, 100% of my contributions to Claude Code were written by Claude Code." The person running the product doesn't write any of the code himself.
So when the source leaked, people went looking. The reactions were not kind. One developer called it "bug on top of bug, workaround on top of workaround, zero tests." Another counted 64,464 lines with no test coverage and wrote: "That's not an AI failure. That's the absence of engineering process."
I keep coming back to that second quote. A .map file left in an npm package is the kind of thing a CI check or pre-publish script catches. Nobody vibes their way through .npmignore configuration. That's plumbing. You either have the checks or you don't.
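Here's the kind of guard I mean. A minimal sketch, not Anthropic's setup: it asks npm pack --dry-run --json what a publish would ship, then refuses if anything matches a blocklist. Everything below (the script path, the patterns, the tsx runner) is illustrative.

```ts
// scripts/check-pack.ts — hypothetical pre-publish guard.
// Wire it up in package.json, e.g. "prepublishOnly": "tsx scripts/check-pack.ts"
import { execSync } from "node:child_process";

// Ask npm what the published tarball would contain, without publishing anything.
const output = execSync("npm pack --dry-run --json", { encoding: "utf8" });
const [report] = JSON.parse(output) as [{ files: { path: string }[] }];

// Patterns that should never leave the building. Adjust to taste.
const forbidden = [/\.map$/, /^src\//, /\.env/];
const leaks = report.files
  .map((f) => f.path)
  .filter((p) => forbidden.some((re) => re.test(p)));

if (leaks.length > 0) {
  console.error("Refusing to publish. These files would ship:");
  for (const p of leaks) console.error("  " + p);
  process.exit(1);
}

console.log(`OK: ${report.files.length} files, no source maps.`);
```

Even without the script, running npm pack --dry-run by hand before a release prints the tarball contents. A 59.8 MB .map file is hard to miss in that list.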
So what
We use Claude Code every day. We like it. This isn't us piling on. But if Anthropic can ship a bad .npmignore with 512,000 lines of proprietary code behind it, your team can too.
Vibe coding is fine for prototypes. Internal tool that three people use? Go for it. Production systems need more. They need automated tests and a human who reads the config before it goes out.
Most of the orgs we work with are early in AI adoption, and the temptation is obvious: the AI writes code fast, why add friction? Because friction is what keeps your source code off the public internet.
Set up the process at the same time you set up the tools. Automated checks before every deploy. A review step for config changes. Someone who understands what actually shipped.
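Concretely, the publish-path version of that can be a few lines of package.json, again assuming the hypothetical check script sketched above:

```json
{
  "scripts": {
    "build": "tsc",
    "prepublishOnly": "npm run build && tsx scripts/check-pack.ts"
  }
}
```

prepublishOnly runs before npm publish, so a bad config fails the release instead of leaking it.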
We build AI coding workflows with the checks already in place. If you'd rather learn from Anthropic's week than have one like it, get in touch.