OpenAI struck a $38 billion, seven-year cloud partnership with Amazon Web Services, sending Amazon shares up about 5% on November 3, 2025, and signaling the ChatGPT maker’s push to diversify where it buys massive computing power.
The multiyear agreement gives OpenAI access to AWS’s U.S.-based infrastructure, including hundreds of thousands of Nvidia AI chips, to train and run its models. OpenAI will begin using AWS immediately, with all planned capacity targeted to be online by the end of 2026 and room to expand into 2027 and beyond, according to the companies. The commitment marks a marquee customer win for AWS at a time when cloud providers are racing to supply compute for frontier AI systems.
The partnership follows OpenAI’s recent restructuring and updated terms with Microsoft that ended exclusive cloud rights, giving the startup greater latitude to source compute from multiple vendors. Amazon, which already backs rival AI lab Anthropic, deepens its bid to remain a primary supplier of AI infrastructure as demand for generative systems surges. OpenAI CEO Sam Altman has described ambitions to build out unprecedented amounts of compute—plans that he has said could require spending on the order of trillions of dollars over time—underscoring why diversified cloud capacity has become strategic.
Investors cheered the announcement. Amazon’s stock rose roughly 5% on November 3, 2025, touching an all-time high during the session, as the company touted the scale and reliability of AWS for AI workloads; Reuters reported the move briefly pressured some competitors. For Amazon, the agreement is seen as validation of AWS’s competitiveness against Microsoft and Google in the AI arms race, and a sign of how marquee AI demand can sway perceptions of cloud leadership.
Initial workloads begin immediately. AWS plans to cluster next-generation Nvidia accelerators for OpenAI’s training and inference needs, while OpenAI continues scaling services like ChatGPT. The broader question—how the economics of building and operating such vast AI infrastructure will play out—will loom over Big Tech and investors as these commitments ramp.