Microsoft's GitHub next month plans to begin using customer interaction data – "specifically inputs, outputs, code snippets, and associated context" – to train its AI models.

The code locker's revised policy applies to Copilot Free, Pro, and Pro+ customers as of April 24. Copilot Business and Copilot Enterprise users are exempt thanks to the terms of their contracts, and students and teachers who access Copilot will also be spared.

Those affected have the option to opt out in accordance with "established industry practices" – meaning opt-out by default, per US norms, as opposed to European norms, where opt-in is commonly required. To opt out, GitHub users should visit /settings/copilot/features and disable "Allow GitHub to use my data for AI model training" under the Privacy heading.

Mario Rodriguez, GitHub's chief product officer, would rather you didn't. "By participating, you'll help our models better understand development workflows, deliver more accurate and secure code pattern suggestions, and improve their ability to help you catch potential bugs before they reach production," he wrote in a blog post.

To excuse its covetous behavior, GitHub in its FAQs notes that Anthropic, JetBrains, and corporate parent Microsoft operate similar opt-out data use policies.

The rationale for the change, according to Rodriguez, is that interaction data makes the company's AI models perform better. Adding interaction data from Microsoft employees has led to meaningful improvements, he claims, such as an increased acceptance rate for AI model suggestions.

The data GitHub wants includes:

- Model outputs that have been accepted or modified
- Model inputs, including code snippets shown
- Code context surrounding your cursor position, and documentation you've written
- File names and repo structure
- Interactions with Copilot features (e.g. chats)
- Feedback (e.g. thumbs up/down ratings)
The policy shift does somewhat change the meaning of GitHub private repositories, which are notionally "only accessible ...