
Murati's Thinking Machines Lab Signs Multi-Billion Google Cloud Deal for GB300 Chips

Michael Ouroumis · 2 min read

Mira Murati's Thinking Machines Lab has signed a multi-billion-dollar agreement with Google Cloud, deepening a compute stack that already leans heavily on Nvidia. TechCrunch first reported the deal on April 22, 2026, pegging its value in the single-digit billions — the startup's first partnership with a cloud services provider.

The agreement gives Thinking Machines access to Google Cloud infrastructure powered by Nvidia's new GB300 chips, along with services for model training and deployment. Google says GB300-based systems offer roughly a 2x improvement in training and serving speed over the previous generation of GPUs, and Thinking Machines is among the first customers to get them in production.

A cloud anchor for the Tinker stack

Thinking Machines launched its first product, Tinker, in October 2025 as a tool for automating the creation of custom frontier models. That workflow leans heavily on reinforcement learning, the same training regime behind recent breakthroughs at DeepMind and OpenAI — and one of the most compute-hungry approaches in modern AI.

"Google Cloud got us running at record speed with the reliability we demand," Myle Ott, a founding researcher at Thinking Machines, said in a statement accompanying the announcement. Google, for its part, emphasized its ability to support the startup's reinforcement-learning workloads at scale.

Murati's compute strategy

Murati, OpenAI's former chief technology officer, founded Thinking Machines Lab in February 2025. The company closed a record $2 billion seed round at a $12 billion valuation in July 2025, then locked in a significant Nvidia investment and compute deal in March 2026. Today's Google agreement layers a hyperscaler on top of those direct chip relationships.

It also slots Thinking Machines into a growing club of well-funded AI labs signing vendor-diverse infrastructure deals rather than committing to a single cloud. Anthropic separately confirmed earlier this month that it will tap roughly 3.5 gigawatts of Alphabet and Broadcom TPU capacity starting in 2027, while continuing to run on AWS Trainium and Nvidia GPUs.

Implications

For Google, landing Thinking Machines is a strategic win in the contest for frontier-lab workloads, where Microsoft Azure and AWS have historically dominated. Getting a GB300-native tenant of Murati's profile helps Google position its cloud as a credible home for reinforcement-learning-heavy research, not just inference or TPU-exclusive buildouts.

For Thinking Machines, the deal answers a question the company has faced since its outsized seed round: where will all that compute actually run? The answer, increasingly, is "everywhere it can" — a mix of direct Nvidia supply, Google Cloud capacity, and whatever in-house optimization its research team can squeeze from GB300-class silicon.

With Tinker traction still early and a suite of larger products presumed in the pipeline, the Google deal is less a headline moment than a structural one. It signals that even the most capital-rich AI startups now treat compute sourcing as a multi-vendor exercise — and that cloud providers are willing to write checks measured in billions to stay in the conversation.

