OpenAI's memo to shareholders criticizes rival Anthropic for lacking sufficient compute capacity, framing scale as a key competitive differentiator in AI.
Compute (the hardware, data centers, and energy needed to train and run AI models) is central to the dispute: OpenAI plans 30GW of capacity by 2030, versus Anthropic's roughly 7-8GW by the end of next year.
OpenAI claims its compute ramp is "materially ahead and widening," arguing that scale now drives differentiation and product growth.
OpenAI forecasts $600 billion in compute spending over the next five years, even as it faces questions about profitability and plans to go public as soon as this year.
Anthropic takes a more conservative approach: CEO Dario Amodei has criticized the "YOLO approach" to data center build-out and emphasized responsible capital allocation.
OpenAI cites analyst Ben Thompson to argue that compute is a product constraint for Anthropic, potentially putting a ceiling on its growth and suggesting Anthropic underestimated demand.
Per the memo, OpenAI attributes Anthropic's slow rollout of a new cybersecurity model, available to only 40 companies, to compute limitations rather than security concerns.
The strategic disagreement highlights a fundamental tension in AI: aggressive infrastructure investment versus disciplined capital allocation.
Market implication: compute investment levels may determine growth ceilings and competitive positioning, with scale becoming a critical barrier to entry in the AI sector.