OpenAI just killed its video model after seven months because burning compute to make AI-generated videos of Will Smith eating spaghetti doesn't pay the bills.

The Summary

  • OpenAI scrapped Sora, its video-generation app, reversed ChatGPT video plans, and wound down a $1 billion Disney deal in a single day
  • The company raised another $10 billion at a valuation above $120 billion while racing to profitability
  • Sora consumed massive compute resources without financial return to justify the expense

The Signal

The Sora shutdown is not about product quality. It's about the brutal economics of inference-heavy AI models colliding with reality. Video generation is computationally expensive. Every clip a user generates costs real money in GPU time. Unlike text models, where subscription fees roughly cover the compute cost of casual use, video models burn through resources at a rate that makes sustainable pricing nearly impossible without either gouging users or subsidizing losses indefinitely.
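The unit-economics argument can be sketched as back-of-envelope arithmetic. All the numbers below are hypothetical illustrations, not figures from OpenAI or the source article; the point is only that per-request margin flips sign once compute cost per request exceeds the slice of a flat subscription fee that each request represents.

```python
def margin_per_request(cost_per_request: float,
                       monthly_fee: float,
                       requests_per_month: int) -> float:
    """Revenue attributed to one request minus its compute cost."""
    revenue_per_request = monthly_fee / requests_per_month
    return revenue_per_request - cost_per_request

# Text chat: fractions of a cent of compute per request (hypothetical)
text = margin_per_request(cost_per_request=0.002,
                          monthly_fee=20.0,
                          requests_per_month=600)

# Video clip: dollars of compute per generation (hypothetical)
video = margin_per_request(cost_per_request=1.50,
                           monthly_fee=20.0,
                           requests_per_month=100)

print(f"text margin/request:  ${text:+.3f}")   # positive
print(f"video margin/request: ${video:+.3f}")  # negative
```

Under these illustrative inputs, text comes out at roughly +$0.031 per request and video at about -$1.30 per clip: the subscription model that works for cheap inferences subsidizes losses on expensive ones.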

OpenAI's move signals something deeper: the foundation model gold rush is hitting its first real profitability filter. Companies that raised billions on the promise of AGI now have to explain why their balance sheets look like those of a Series B startup that got lost on the way to product-market fit. Sora was the kind of feature that looks great in a keynote demo and terrible in a quarterly investor update.

The timing matters. OpenAI is raising $10 billion at a $120 billion-plus valuation while simultaneously cutting a flagship product and unwinding a Disney partnership that was supposed to prove enterprise viability. That's not confidence. That's triage. The company is prioritizing compute allocation toward models that can actually generate revenue per inference cycle. Text and code generation clear that bar. Video doesn't.

The Disney deal collapse is particularly telling. If you can't make video generation work with a partner who has infinite content needs and deep pockets, you probably can't make it work at all. This wasn't a technical failure. It was a unit economics failure.

The Implication

Watch which models survive the next 12 months. The ones that stick are the ones where each inference earns more than it costs to serve. Text, code, maybe audio. High-compute, low-monetization models like video and 3D generation are getting cut. If you're building in the agent economy, choose problems where inference costs don't eat your margin. The era of "cool demo, figure out the business model later" just ended.


Source: The Verge AI