OpenAI just told investors it won't turn a profit until 2030, and the numbers explain why the entire AI infrastructure stack is about to get a multi-hundred-billion-dollar stress test.
The Summary
- OpenAI projects $121 billion in AI training costs for 2028 alone, more than quadruple this year's $25 billion, with no profit expected until 2030 even as revenue is projected to reach $275 billion that year
- Anthropic's path is faster but smaller, projecting profitability in 2028 with revenues approaching $150 billion by 2029, mostly from enterprise sales through cloud partners
- OpenAI is subsidizing free users today to build a consumer habit, betting it can convert that base into $150 billion in consumer revenue by 2030 through channels it has yet to define
The Signal
The AI foundation model race just revealed its price tag, and it's staggering enough to reshape the entire compute supply chain. OpenAI's 2028 training budget of $121 billion is larger than the entire 2024 global semiconductor capital equipment market. For context, TSMC, the world's largest contract chipmaker, spent about $30 billion on capex last year. OpenAI alone will spend four times that on compute in 2028, and Anthropic will add another $30 billion by 2029.
This is not a normal business trajectory. These are infrastructure-scale numbers dressed up as software companies. The fact that OpenAI expects revenue to hit $275 billion by 2030 while Anthropic projects $150 billion suggests they're not competing for the same pie; they're betting on fundamentally different distribution models. Anthropic is going enterprise-first through cloud partnerships (read: AWS, which led its funding). OpenAI is playing a consumer land-grab game, subsidizing ChatGPT's free tier to lock in users before figuring out how to monetize them.
The gap between revenue and profitability tells you everything about the compute arms race. You can't slow training spend without falling behind, but you also can't charge enough to cover costs without killing adoption. So both companies are running the same playbook: raise enough capital to survive the race, bet that scale advantages compound faster than costs, and pray that inference gets cheap enough to stop the bleeding before the money runs out.
What's quietly wild here is the implied infrastructure build. If OpenAI is spending $121 billion on training in 2028, someone has to sell them that compute. Nvidia's data center revenue was $47 billion in fiscal 2024. OpenAI alone could nearly triple that market in four years. This isn't just an AI story. It's a signal that the entire cloud hyperscale model is about to get remade around foundation model training, and whoever controls that stack controls the next decade of software.
The Implication
Watch the chip and power sectors. If these projections hold, we're about to see unprecedented demand for GPUs, data center capacity, and energy infrastructure. For builders in the agent space, this is your window: the foundation model players are burning capital to give you cheap inference. Use it. For enterprise buyers, Anthropic's bet on cloud partnerships means pricing will get competitive fast as AWS, GCP, and Azure fight to bundle AI into existing deals. The consumer revenue question matters too: if OpenAI can't convert free users to paid at scale, someone else will crack the consumer AI business model first.
Source: Fast Company Tech