The biggest cloud provider just showed us the real price tag of the AI boom, and it's not slowing down.

The Signal

AWS revenue acceleration tells you everything about where AI workloads actually run. New data center capacity helped drive the growth surge, but the customer mix matters more. Anthropic and OpenAI aren't just using AWS—they're anchoring massive compute contracts that define the next decade of cloud economics.

This marks AWS's fastest growth since 2022, back when pandemic-era digital transformation was still feeding the machine. The difference now: AI inference and training workloads are stickier, more compute-intensive, and less price-sensitive than any cloud service that came before.

"The e-commerce giant is making more money than expected from AWS but it's also spending a lot, and will continue to do so in the near term."

Capital spending exceeded analyst expectations, and Amazon's CEO made clear this isn't a one-quarter blip. The company is in build mode. Data centers don't spin up overnight, and the AI customers driving this growth need guarantees of future capacity. Amazon is buying market position with billions in infrastructure before competitors can catch up.

Here's what the spending spike means:

  • Long-term capacity contracts are already signed with AI labs
  • Amazon sees sustained AI workload growth, not a hype cycle
  • The gap between cloud revenue and capex is the cost of staying dominant

The timing matters. AWS competitors are making similar bets, but Amazon's scale advantage compounds when everyone's racing to build. Procurement leverage, power contracts, cooling systems, chip volume—all favor the incumbent with the deepest pockets. The revenue growth and capacity expansion are feeding each other, creating a moat that gets wider the more they spend.

The Implication

Watch the capex-to-revenue ratio in Amazon's next few quarters. If spending keeps climbing while growth holds, that's confirmation AI infrastructure demand is real and structural. If spending moderates, the capacity build has run ahead of demand, and someone overestimated how much compute AI customers would actually buy.

For founders building on AI: this growth validates betting on cloud-native AI infrastructure, but it also means pricing power shifts back to providers. AWS, Azure, and Google now control the choke point for LLM scaling. If you're building agents or AI tooling, your margin is capped by what Amazon decides to charge for compute.

Sources

TechCrunch AI | Bloomberg Tech