
OpenAI Commits $20B+ to Cerebras Chips, Takes Equity Stake as IPO Looms

Michael Ouroumis · 3 min read

OpenAI has agreed to spend more than $20 billion over the next three years on servers powered by Cerebras Systems chips and will receive an equity stake in the startup as part of the deal, according to reporting by The Information and follow-up coverage on April 16, 2026. The agreement lands as Cerebras prepares to refile paperwork for its long-delayed initial public offering, potentially repricing one of the highest-profile non-Nvidia AI chip companies.

A second, much larger commitment

The new contract builds on a January 2026 agreement in which OpenAI committed to buying up to 750 megawatts of compute capacity from Cerebras in a deal valued at over $10 billion. The latest expansion roughly doubles that commitment and pairs it with an equity component. According to the reports, OpenAI will also pay approximately $1 billion to help fund the construction of data centers that will run its AI models on Cerebras hardware.

The equity structure takes the form of warrants rather than immediate shares. OpenAI is said to receive warrants for a minority portion of Cerebras stock, potentially growing to as much as 10% of the company over the three-year term, contingent on additional spending milestones. That structure mirrors the stake-for-compute arrangements that have become common across the AI industry as hyperscalers tie chip and cloud suppliers more tightly to their capacity roadmaps.

IPO refile at a $35 billion valuation

Perhaps the most immediate consequence is for the Cerebras IPO itself. The company, which first filed to go public in 2024 and saw its process stall amid regulatory review, is reportedly preparing to refile paperwork as soon as this week. The Information says Cerebras is targeting a raise of roughly $3 billion at a valuation of about $35 billion — a dramatic step up from prior reported valuations.

An anchor compute contract of this size from OpenAI gives Cerebras a forward revenue story that public-market investors can price against Nvidia's dominant data-center business, and it provides a counterweight to concerns about concentration risk and wafer-scale manufacturing costs.

Why OpenAI keeps stacking chip deals

For OpenAI, the Cerebras commitment is the latest in a string of multibillion-dollar infrastructure arrangements — stretching across Nvidia, AMD, Broadcom, Google TPUs, and custom silicon efforts — aimed at securing enough compute to train and serve frontier models through the back half of the decade. By taking equity alongside the compute, OpenAI reduces the risk that a key supplier gets bought out from under it or redirects capacity elsewhere.

The move also reflects a broader diversification push away from sole reliance on the Nvidia-TSMC supply chain, which has faced capacity constraints and geopolitical pressure. Cerebras' wafer-scale architecture is particularly attractive for large-context inference workloads, an area where OpenAI has been investing as agentic and long-running tasks grow in importance.

Implications

If the IPO prices near the reported $35 billion target, Cerebras would become one of the largest pure-play AI chip companies on public markets. It would also test investor appetite for AI-infrastructure IPOs at a moment when public sentiment toward AI and data-center spending is mixed. For OpenAI, the deal further entrenches a capital structure in which the company's compute suppliers are also its shareholders — and vice versa.

