In Brief: Google Cloud has signed a multibillion-dollar infrastructure agreement with Thinking Machines Lab — the AI startup founded by former OpenAI CTO Mira Murati — granting it access to Nvidia GB300-powered compute to train and deploy frontier models.
Google Cloud announced the deal on April 22 at its Cloud Next conference, extending an existing relationship with Murati's startup that began in 2025. Valued in the single-digit billions, the agreement gives Thinking Machines Lab dedicated access to A4X Max virtual machines — Google's latest Nvidia GB300-based instances — which Google says deliver roughly twice the throughput of its previous-generation GPU offerings. Google's Jupiter high-bandwidth interconnect will handle the weight transfers required by the startup's reinforcement-learning workloads, a demanding use case that depends on low-latency links between GPU nodes at scale.
Murati departed OpenAI in late 2024 after serving as the company's chief technology officer and founded Thinking Machines Lab in February 2025. The startup followed up with a $2 billion seed round at a $12 billion valuation — among the largest seed financings ever recorded in the technology industry. Its debut product, Tinker, automates the creation of custom frontier AI models, targeting enterprises that want purpose-built capabilities without building model infrastructure from scratch.
The agreement is non-exclusive, meaning Thinking Machines Lab can continue running workloads on other cloud providers alongside Google. That flexibility is consistent with how most frontier labs manage infrastructure risk, but the financial scale of the deal makes Google Cloud the startup's de facto primary compute partner for the foreseeable future.
Thinking Machines Lab becomes the third frontier AI developer — after Anthropic and Meta — to lock in Google's Blackwell and TPU capacity within a single month. The clustering of these announcements reflects a broader pattern: hyperscalers are racing to sign long-term compute commitments with the top AI labs before the next wave of large training runs begins, converting infrastructure access into a durable competitive moat. For Google Cloud, the deal with Murati's lab reinforces its positioning as the preferred home for independent AI developers that want high-end GPU access without defaulting to AWS.