A new service that helps coding agents stay up to date on their API calls could be dialing in a massive supply chain vulnerability. Two weeks ago, Andrew Ng, an AI entrepreneur and adjunct professor at Stanford, launched Context Hub, a service for supplying coding agents with API documentation. "Coding agents often use outdated APIs and hallucinate parameters," Ng wrote in a LinkedIn post. "For example, when I ask Claude Code to call OpenAI's GPT-5.2, it uses the older chat completions API instead of the newer responses API, even though the newer one has been out for a year. Context Hub solves this." Perhaps so. But at the same time, the service appears to provide a way to dupe coding agents by simplifying software supply chain attacks: The documentation portal can be used to poison AI agents with malicious instructions. Mickey Shmueli, creator of an alternative curated service called lap.sh, has published a proof-of-concept attack that demonstrates the risk. "Context Hub delivers documentation to AI agents through an MCP server," Shmueli wrote in an explanatory blog post. "Contributors submit docs as GitHub pull requests, maintainers merge them, and agents fetch the content on demand. The pipeline has zero content sanitization at every stage." It's been known for some time in the developer community that AI models sometimes hallucinate package names, a shortcoming that security experts have shown can be exploited by uploading malicious code under the invented package name. Shmueli's PoC cuts out the hallucination step by suggesting fake dependencies in documentation that coding agents then incorporate into configuration files (e.g. requirements.txt) and generated code. The attacker simply creates a pull request – a submitted change to the repo – and if it gets accepted, the poisoning is complete. Currently, the chance of that happening appears to be pretty good. Among 97 closed PRs, 58 were merged. Shmueli told The Register in an email, "The review process appears ...
First seen: 2026-03-25 20:54
Last seen: 2026-03-29 13:53