CEO Andrew Feldman has described Cerebras as the builder of "the fastest AI hardware for training and inference," and if the filing is anything to go by, the market seems to agree. Cerebras reported net income of $87.9 million on revenue of $510 million for 2025, compared with a net loss of $484.8 million on revenue of $290.3 million the year before — a near-complete financial turnaround in twelve months. Revenue grew roughly 76% year-over-year, and the company posted a profit of $1.38 per share, compared with a $9.90-per-share loss just a year ago. That trajectory is exactly the kind of story public market investors have been waiting to hear from an AI infrastructure company.
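The turnaround figures above are easy to sanity-check. A minimal sketch, using only the revenue and per-share numbers cited in this article (not pulled from the S-1 itself):

```python
# Back-of-the-envelope check of the filing figures cited above.
revenue_2024 = 290.3  # $M, as reported for the prior year
revenue_2025 = 510.0  # $M, as reported for 2025

growth = (revenue_2025 - revenue_2024) / revenue_2024
print(f"YoY revenue growth: {growth:.1%}")  # ≈ 75.7%, i.e. "roughly 76%"

# Swing from a $9.90-per-share loss to a $1.38-per-share profit
eps_swing = 1.38 - (-9.90)
print(f"Per-share swing: ${eps_swing:.2f}")  # $11.28 per share
```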
This is actually Cerebras' second attempt at going public. The company first filed paperwork with the SEC in 2024, before postponing and ultimately withdrawing its IPO — a delay that followed a US national security review of UAE-based tech conglomerate G42's minority investment in the chipmaker. G42, which had been both an investor and one of Cerebras' largest customers, drew increased scrutiny from US authorities amid concerns that Middle Eastern companies could provide China access to advanced American AI technology. The situation put the entire listing on ice for months. Cerebras announced in 2025 that it had obtained clearance from the Committee on Foreign Investment in the United States, clearing the path for this second run.
The customer concentration story has also shifted meaningfully. When Cerebras first filed in 2024, a single customer — G42 — accounted for 87% of its revenue. By 2025, G42's share had dropped to 24%, while Mohamed bin Zayed University of Artificial Intelligence, a public institution in the UAE, contributed 62% of revenue. That's still a concentrated customer base, and risk-conscious investors will flag it. But the nature of the risk is different now — and the arrival of much larger names on the client roster changes the conversation considerably.
The headline relationship is with OpenAI. Cerebras has tied much of its growth to OpenAI, including a $20 billion multi-year deal under which the ChatGPT creator will deploy 750 megawatts of Cerebras chips. The deal calls for Cerebras to make 250 megawatts of capacity available per year between 2026 and 2028, with OpenAI holding an option to purchase an additional 1.25 gigawatts of computing power through 2030. To put that in context, this is the largest non-Nvidia AI infrastructure contract on record. Feldman put it bluntly in a recent Wall Street Journal interview: "Obviously, Nvidia didn't want to lose the fast inference business at OpenAI, and we took that from them."
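The deal's structure is internally consistent: 250 megawatts a year over three years accounts for the full 750-megawatt headline figure. A quick sketch of the arithmetic, using only the numbers reported above:

```python
# Sanity-check of the OpenAI deal structure as described: 250 MW/year
# delivered 2026-2028, plus an option for a further 1.25 GW through 2030.
annual_mw = 250
delivery_years = [2026, 2027, 2028]

base_mw = annual_mw * len(delivery_years)
print(base_mw)  # 750 MW — matches the headline deployment figure

option_mw = 1250  # the 1.25 GW option, expressed in MW
total_potential_mw = base_mw + option_mw
print(total_potential_mw)  # 2000 MW if the option is fully exercised
```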
Beyond OpenAI, Cerebras also announced an agreement with Amazon Web Services to use Cerebras chips in Amazon data centers, and in March 2026, it signed a deal with Amazon that will enable cloud services on top of Cerebras chips and allow Amazon to purchase about $270 million in Cerebras' Class N stock. Oracle has also publicly acknowledged running Cerebras hardware for customer workloads on its cloud infrastructure. The company that once had a single-client revenue problem is now sitting inside the infrastructure stacks of some of the largest cloud providers in the world.
What makes Cerebras architecturally interesting — and genuinely different from Nvidia — comes down to how its chips are physically built. Its Wafer Scale Engine 3 is 56 times larger than Nvidia's H100, keeping compute cores and memory on the same piece of silicon connected by an on-die mesh fabric rather than external networking. In practical terms, this eliminates most of the latency that occurs when data has to travel between chips, across circuit boards, through cables, and between racks. Cerebras claims the CS-3 can train models up to 24 trillion parameters — more than ten times the size of GPT-4 — without the complex parallelization software that GPU clusters require. Fewer engineers debugging distributed systems, faster iteration cycles, and a dramatically simpler operational footprint. For inference specifically — the process of actually responding to a user query in real time — that speed advantage is where Cerebras has been most aggressively winning business.
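The "56 times larger" claim can be roughly checked against publicly cited die areas. A minimal sketch — the area figures below are approximate public specifications, not numbers from the filing:

```python
# Rough check of the size comparison using publicly cited die areas
# (approximate figures, assumed here for illustration):
wse3_area_mm2 = 46_225  # Cerebras WSE-3, roughly a full 300mm wafer's usable square
h100_area_mm2 = 814     # Nvidia H100 die

ratio = wse3_area_mm2 / h100_area_mm2
print(f"{ratio:.0f}x")  # ≈ 57x — consistent with the ~56x figure cited above
```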
The valuation picture is shaping up fast. Cerebras raised a $1.1 billion Series G last year, followed by a $1 billion Series H in February at a $23 billion valuation. Analysts currently peg the IPO target in the $22–28 billion range, with the offering planned for mid-May. The company has not disclosed a target fundraise figure. It also disclosed $24.6 billion in remaining performance obligations as of December 31, 2025, with an expectation to recognize 15% of that in 2026 and 2027 — a backlog number that will reassure institutional investors looking for revenue visibility in an inherently lumpy infrastructure market.
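The backlog disclosure implies a concrete near-term revenue floor. A quick sketch of the arithmetic, using the figures as reported above:

```python
# Implied near-term revenue from the disclosed backlog: $24.6B in
# remaining performance obligations, 15% of which the company expects
# to recognize across 2026 and 2027 (figures as reported in the article).
rpo_billion = 24.6
near_term_share = 0.15

near_term_revenue = rpo_billion * near_term_share
print(f"${near_term_revenue:.2f}B")  # ≈ $3.69B recognized over 2026-2027
```

For context, that implied two-year figure alone is several times the company's entire 2025 revenue of $510 million.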
The risks are real and worth naming. Customer concentration remains the central concern, with the OpenAI relationship described in the filing as representing a substantial portion of projected revenues over the next several years — and OpenAI retains the right to exit part or all of the agreement if Cerebras misses service levels. Nvidia's software moat, built over a decade around the CUDA ecosystem, is years ahead of anything Cerebras can offer today. Manufacturing dependency on TSMC adds geopolitical exposure. And the company's non-GAAP net loss of $75.7 million in 2025 suggests the headline profitability number needs context.
But zoom out and the moment makes sense. Retail investors have been hungry for IPOs from large, growing technology companies after a relative drought that began in 2022, and AI companies including Anthropic and OpenAI are also considering going public as soon as this year. Cerebras is positioned to be the first pure-play AI chip infrastructure company to reach public markets — not a software wrapper, not a platform play, but a company that makes the actual silicon the AI revolution runs on. For a market that has watched Nvidia's stock become one of the defining trades of the decade, the prospect of a credible challenger finally going public is the kind of narrative that moves institutional money fast.
The roadshow begins soon. The AI chip war just moved to Wall Street.