Debate about whether the explosion of datacenter buildout will prove to be a worthwhile investment centers on two scenarios:

1. AI adoption accelerates, and the datacenter investment pays off.
2. AI adoption is slower than forecast, and it doesn't.

However, a third scenario is very plausible: open source models running on local workstations come to dominate AI. There are a few reasons this could happen.

Open source models keep up

With the exception of GPT-4, open source models have matched the performance of frontier models within 6 months of the frontier model's release (data):

[Chart: Months to open source parity with frontier models, for OpenAI and Anthropic releases]

Naturally, there have been accusations of open source models gaming evals, but the frontier models do the same. We can expect this to continue.

Startups usually try to create a moat, but model providers build waterslides: frontier models help train their open source competitors. Unauthorized distillation is a difficult threat to counter. Providers can (and have) complained about competitors using their models to train the competition. As a practical matter, however, this "theft" could be impossible to prevent.

Remote providers increase prices (or degrade subscription value)

The unit economics of frontier models are reminiscent of Uber's "cheap ride era": for example, despite $13 billion in revenue, OpenAI projects $14 billion in losses for 2026. That bill includes $8 billion in compute costs.

For Anthropic, Cursor recently estimated that a $200/month Claude Max subscription can consume up to $5,000 in compute. Even before this report, Anthropic introduced rate limits on that subscription. Their newly released Claude Code Review feature is priced at a very expensive $15-$25 per PR, and its announcement came with little explanation of why it should replace existing PR review workflows. This looks like a pricing experiment, probing how high a price enterprise customers are willing to tolerate. In OpenAI's case, there is public reporting on pruning side bets and focusing on enterprise.
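The subscription-versus-compute gap can be made concrete with the figures cited above. This is a rough sketch: the per-user numbers are the estimates quoted in the text, and the margin calculation itself is an illustrative assumption, not a provider-confirmed cost model.

```python
# Rough unit-economics sketch using the figures cited above.
# The dollar amounts are the estimates quoted in the text;
# the margin formula is an illustrative assumption.

def monthly_margin(subscription_usd: float, compute_cost_usd: float) -> float:
    """Revenue minus compute cost for one subscriber-month."""
    return subscription_usd - compute_cost_usd

# Cursor's estimate: a $200/month Claude Max subscriber can
# consume up to $5,000 in compute in that month.
claude_max_worst_case = monthly_margin(200, 5_000)
print(f"Claude Max worst case: {claude_max_worst_case:,.0f} USD/subscriber-month")

# OpenAI's reported 2026 projection, in billions of dollars.
revenue_b, losses_b, compute_b = 13, 14, 8
print(f"OpenAI 2026 projection: {revenue_b}B revenue, {losses_b}B losses "
      f"({compute_b}B of the bill is compute)")
```

At those rates, a heavy Claude Max user costs Anthropic 25x the subscription price, which is consistent with the rate limits and per-PR pricing experiments described above.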
...