
VS Code Silently Stamped 1.4 Million Commits With Copilot's Name — Check Your Git History

CivSafe Team · May 8, 2026 · 6 min read

If your team uses VS Code and committed any code between roughly April 22 and May 6, 2026, there's a solid chance your git history now has a line in it that nobody put there intentionally:

Co-authored-by: Copilot <copilot@github.com>

Microsoft quietly flipped a default in VS Code 1.117, a release that went out April 22 without a changelog entry, release note, or user notification of any kind. A setting called git.addAICoAuthor — which had defaulted to off since it was introduced in VS Code 1.110 back in March — was switched to all. Which means: every commit, stamped with Copilot's name.

A bug compounded the default flip: the trailer was added even when users had AI features explicitly disabled. Developers who caught Copilot's generated commit message and manually rewrote it before committing still found the co-author trailer baked into their final git history, because it was appended after the commit window closed, out of sight. There was no moment to catch and remove it.
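
One way to see whether a given commit picked it up: git can print just the trailer block for a single commit. A minimal check, assuming git 2.22 or newer (<sha> is whatever commit you're inspecting):

git show -s --format='%(trailers:key=Co-authored-by)' <sha>

An empty result means that commit is clean; a Co-authored-by: Copilot line means the trailer got appended.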

About 1.4 million commits were affected before Microsoft rolled out a fix in VS Code 1.119 on May 6. The default is now back to off. The internet was not quiet about it — the Hacker News thread hit over 1,400 points, the GitHub issue is 500+ comments deep, and The Register ran a headline about Copilot "claiming credit for human code."

What nobody's talking much about yet is what this actually means for teams that got hit.

Your git history is permanent

Here's the part that doesn't go away: once it's been pushed, git history is effectively permanent. If your team committed during that window and pushed to a remote — GitHub, GitLab, your own hosted instance — those commits have the Copilot trailer in them. Microsoft fixed the default. They can't touch your history.

For some teams this is genuinely not a problem. The co-author trailer is cosmetically weird but legally inert if your code was human-written, your team doesn't contribute to external open source projects, and your clients aren't doing code audits.

But that description doesn't quite fit a lot of small orgs.

When it actually matters

Open source contributions. If anyone on your team contributed to a public open source project during that window using VS Code, check immediately. Many projects — particularly Apache-licensed and GPL projects — explicitly prohibit AI-generated contributions because AI-generated code can't be copyrighted and creates provenance ambiguity. Even if your code was entirely human-written, you now have a public record attributing Copilot as a co-author. Some project maintainers have already started flagging or reverting commits with that trailer. If you maintain your own open source libraries, your contributors may have this in their commits without knowing it.

Copyright status of your code. The U.S. Copyright Office has consistently held that non-human authors can't hold copyright — most recently affirmed in Thaler v. Perlmutter. That means Copilot's name as a co-author doesn't transfer any rights to Microsoft. But it does create a public attribution record suggesting some portion of your code had non-human involvement. Some enterprise legal teams have started requiring that AI-produced code stay under a certain percentage of any given file to preserve full copyright protection. A blanket co-author trailer makes that kind of accounting impossible — you can't tell from git history which commits it was accurate for and which it was applied to by accident.
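
If you need a concrete record for that conversation, a one-liner can dump every affected commit with its hash, date, author, and subject. A sketch using standard git pretty-format placeholders; the output file name is just a suggestion:

git log --all --grep="Co-authored-by: Copilot" --format='%h %ad %an %s' --date=short > copilot-trailer-audit.txt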

Clients who audit your work. If you deliver software to clients and they run a git audit — which is increasingly common in regulated sectors like healthcare, finance, and public procurement — git log output now shows Copilot as a contributor to work you may have presented as fully human-authored. Most contracts don't currently have specific AI attribution clauses, but that's changing fast. Getting ahead of this with clients is better than having them discover it in an audit.
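
Worth knowing how visible this is. Recent git versions (2.29 and later) can group shortlog output by a trailer, so one command lists every co-author in the history with a commit count, Copilot included. This is exactly the kind of thing an audit script would run:

git shortlog -sn --group=trailer:co-authored-by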

Your own AI governance policy. If your org has told clients or partners that you don't use AI tools for certain work, and VS Code silently stamped Copilot's name on commits covering that work — even incorrectly — that's an alignment problem you'd want to understand before someone else points it out.

What to check this week

First, find out if you're affected. Run this against any repo your team was committing to during that window:

git log --all --grep="Co-authored-by: Copilot" --oneline

If you get results and the date range is late April to early May, you were in the affected window.
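
If your team has more than a handful of repos, script the sweep rather than checking one at a time. A rough sketch, assuming your clones sit as siblings under a single directory:

for repo in */.git; do
  repo=${repo%/.git}
  echo "== $repo =="
  git -C "$repo" log --all --grep="Co-authored-by: Copilot" --oneline
done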

Second, update VS Code. Version 1.119, which started rolling out May 6, has the default back to off. Verify after updating:

  1. Open VS Code settings
  2. Search for git.addAICoAuthor
  3. Confirm it's set to off — or remove the setting entirely to use the new default (you can also verify from the command line, as in the note below)
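
To check from the command line instead, the setting lives in your user settings.json. The path below is the Linux default; on macOS it's under ~/Library/Application Support/Code/User, and on Windows under %APPDATA%\Code\User:

grep -n "git.addAICoAuthor" ~/.config/Code/User/settings.json

No match means you're on the new default; a hit shows whatever value your settings file pins.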

Third, assess your exposure based on the categories above. Public open source contributions are the most urgent. Client deliverables are next. If you have a legal team or outside counsel, this is worth a 15-minute conversation.

On actually fixing the git history: if the commits are only on your local branch and haven't been pushed, you can rebase and drop the trailer. Once pushed to a shared remote, rewriting history requires a force push and coordination with everyone who has a copy. For most teams, the practical answer is to document the incident and move forward — the exposure is real but not catastrophic for the vast majority of use cases.
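
For the local-only case, one way to strip the trailer from unpushed commits is a message filter. A sketch, not a drop-in: git filter-branch is deprecated in favor of git-filter-repo, and this assumes your branch tracks a remote upstream:

git filter-branch -f --msg-filter 'sed "/^Co-authored-by: Copilot/d"' -- @{upstream}..HEAD

Re-run the git log check above afterward to confirm the trailers are gone before you push.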

The bigger issue here

This incident happened because a single internal pull request — no release note, no user notification, no consent prompt — changed the default behavior of a tool used by tens of millions of developers. The setting change went from review to merge to rolling out to users in under 24 hours. 1.4 million commits later, the community noticed.

This is going to keep happening. Your AI tools are not static. The defaults change. The "Enabled" state drifts. What the tool does with your data, your code, your commit history — that's a moving target, and it's moving faster than most teams are watching.

Most small orgs we work with don't have an AI tool inventory: a clear list of what AI tools are in use, what permissions they have, what they're allowed to do with team output, and how to verify that the tools are actually behaving according to policy. This week's VS Code situation is exactly why that list matters.

You don't need a 40-page AI governance framework. You need to know what your tools are doing, and to have a 30-minute process for checking when something like this surfaces.

That's something we can help with. If this incident surfaced questions about your team's AI tool usage that you don't have good answers to, we'd rather you figure that out now than in the middle of a client audit.

CivSafe — Strategic Innovation. Community Impact.