OpenAI struck a $38 billion, seven-year partnership with Amazon Web Services, a move that immediately broadened the ChatGPT maker’s cloud footprint and sent Amazon shares higher on November 3, 2025.
The agreement lets OpenAI begin running core AI workloads on AWS right away, with all targeted capacity slated to be online by the end of 2026 and room to expand into 2027. AWS said the buildout will include clusters of Nvidia’s latest GB200 and GB300 accelerators to support model training and inference at scale. OpenAI will gain access to hundreds of thousands of Nvidia GPUs under the arrangement, marking one of the industry’s largest single-cloud compute commitments.
The deal underscores both the arms race for AI computing power and OpenAI’s shift to a multi-cloud strategy after changes to its long-standing Microsoft arrangement. By tapping AWS’s global infrastructure and security posture—while continuing ties elsewhere—OpenAI is hedging against supply bottlenecks and positioning to sustain rapid growth in products like ChatGPT. For Amazon, landing OpenAI is a marquee win that helps counter perceptions it was trailing Microsoft and Google in AI cloud momentum. Amazon shares hit a record during the rally, reflecting investor confidence that the partnership can bolster AWS’s leadership in high-performance AI infrastructure.
Amazon’s stock rose as much as 5% on November 3, 2025, after the announcement, reaching an all-time high. The move added tens of billions of dollars to Amazon’s market value and highlighted how marquee AI infrastructure wins can quickly shift sentiment toward cloud providers.
Frontier AI development is voraciously compute-hungry. OpenAI has discussed plans to dramatically scale capacity in coming years, and executives have emphasized that “massive, reliable compute” is essential to advance next-generation systems. Under this agreement, AWS will concentrate cutting-edge chips in large, low-latency clusters meant to handle both training runs and the always-on inference traffic behind widely used applications. The companies say the initial capacity comes online immediately, with the bulk arriving through 2026 and options to extend thereafter—signaling a long runway for collaboration.
In the near term, expect OpenAI to start shifting select workloads to AWS while maintaining a diversified provider mix. For Amazon, operational execution—delivering the promised capacity on time—will be the key to turning the headline into durable revenue. Analysts will watch whether this marquee customer draws additional AI players to AWS and whether multi-cloud becomes the norm for the largest model developers.