Grok is losing users. Colossus sits half-empty. So xAI leased 220,000 GPUs to Anthropic and called it a pivot. Welcome to the most revealing deal in AI infrastructure this year.
The company Elon Musk built to beat OpenAI just handed compute capacity to the company he's been calling "Misanthropic" — and Wall Street is treating it as a win. That tension is worth sitting with, because it tells you something important about where the real money in AI is flowing in 2026, and it's not where the press releases say it is.
On May 6, xAI — now folded into SpaceX following January's merger — announced that Anthropic would take all available capacity at Colossus 1, the Memphis, Tennessee data center that houses over 220,000 Nvidia GPUs across H100, H200, and next-generation GB200 accelerators. That's 300 megawatts of compute, available to Anthropic within the month. Musk framed the arrangement on X as a simple capacity optimization: xAI had already migrated its training workloads to the newer Colossus 2 facility, so Colossus 1 was sitting underutilized. Rent it out. Sensible.
What that framing obscures is more interesting than what it reveals.
The Model Company That Isn't Really a Model Company
xAI's current valuation: $230 billion. CoreWeave's, with comparable compute infrastructure: under $75 billion. That gap either reflects a massive premium on Grok's consumer AI ambitions, or it reflects something investors haven't yet fully priced — that xAI's long-term bet isn't really on beating Claude or ChatGPT in a chatbot race, but on becoming the GPU landlord to whoever wins.
Look at the behavior, not the branding. The SpaceX-xAI merger created a company that is simultaneously building orbital data centers, planning its own chip fabrication at the Terafab facility, holding a reported $60 billion option to acquire Cursor, and now renting surplus capacity to a direct model competitor. These are not the moves of a company that believes its core differentiation is Grok. These are the moves of a company building infrastructure-layer leverage across every tier of the AI stack.
Meanwhile, Grok — xAI's flagship consumer product — has been shedding momentum. After a sharp climb to 17.8% of U.S. chatbot market share in January 2026 (up from 1.9% a year prior), traffic to grok.com dipped nearly 5% month-over-month by February, a slide widely attributed to the image generation controversies that dogged the platform in early 2026. At roughly $350 million in 2025 revenue against losses north of $1.4 billion per quarter, Grok is not yet a business — it's a product in search of a business model. Colossus, by contrast, is a revenue-generating asset the moment someone signs a lease.
The Anthropic deal is likely worth billions, and it is real, immediate revenue. Grok, for now, offers neither.
What Anthropic Bought — And What It Had to Swallow
For Anthropic, the logic is equally unsentimental. The company is currently in talks to raise capital at a reported $900 billion valuation and has been aggressively stacking compute partnerships: an up-to-5-gigawatt agreement with Amazon, a 5-gigawatt deal with Google and Broadcom coming online in 2027, a $30 billion Microsoft-Nvidia Azure capacity arrangement, and a $50 billion U.S. AI infrastructure commitment with Fluidstack. The common thread across all of them: they're mostly future capacity.
Anthropic needed GPUs now. And the only player not constrained was SpaceX.
As Anthropic head of product Ami Vora put it at the company's developer conference in San Francisco: "We're partnering with SpaceX to use all the capacity of their Colossus One data center." The deal immediately unlocked tangible product improvements — Claude Code's five-hour rate limits were doubled for Pro, Max, Team, and Enterprise users; Opus API rate limits were sharply raised; peak-hour usage caps for Pro and Max accounts were removed. This wasn't a strategic announcement. It was triage.
The concession Anthropic made was geopolitical as much as financial. Dario Amodei's team had to negotiate with a man who called the company "doomed to become the opposite of its name" in February, who told followers "Anthropic hates Western Civilization," and who has spent months as the plaintiff in an active lawsuit against OpenAI — a suit his legal team argued was partly motivated by competitive grievances that benefit xAI. Musk, for his part, reversed tone just as cleanly: after spending a week with senior Anthropic leaders during the Oakland trial, he posted that "everyone I met was highly competent and cared a great deal about doing the right thing."
The realpolitik here is that both companies needed each other badly enough to absorb the optics. Anthropic, still locked out of Pentagon contracts after the Defense Department blacklisted it as a supply chain risk in March, is looking for compute partners who don't carry government-relations baggage. SpaceX, accelerating toward what may be the largest IPO in corporate history this fall, needed a marquee AI customer to anchor its infrastructure pitch to public market investors.
Expert Perspective
"This is a signal flare, not a business strategy. When a company with $230 billion in private valuation starts renting out its training cluster to a competitor, it's admitting something critical: the model layer isn't where the defensible margin lives. Infrastructure is. That's a rational conclusion — it's just not the conclusion xAI was supposed to represent."
— Infrastructure analyst perspective consistent with commentary from SemiAnalysis, which has tracked xAI's data center buildout as among the most aggressive in the industry
Who Wins, Who Loses, and Who Should Be Paying Attention
SpaceX/xAI wins on balance sheet optics ahead of IPO, on credibility as an AI infrastructure provider, and on the orbital compute narrative — Anthropic's stated interest in "multiple gigawatts of orbital AI compute capacity" gives SpaceX a named partner for its most ambitious and capital-intensive future bet. For IPO roadshow purposes, that's invaluable.
Anthropic wins operationally, at least in the short term. The deal is live within a month — faster than any of its hyperscaler partnerships — and it addresses a genuine user experience problem that had been generating complaints across developer communities.
CoreWeave and Nebius absorb an unwelcome negotiating data point. The neocloud sector has been capitalizing on compute scarcity to maintain pricing power. If one of the largest private AI companies in the world begins functioning as a neocloud provider at the margins, the implicit argument — that specialized GPU providers command premium contracts — gets more complicated. CoreWeave's current market cap sits around $59 billion; its debt-to-equity ratio exceeds 7x. The sector was already showing early signs of commoditization, with H100 rental rates falling 60–75% from peak. Another well-capitalized player leaning into the same playbook is not welcome news.
OpenAI is the oblique loser — not because of anything this deal does directly, but because of what it signals. Musk's lawsuit against OpenAI has centered on the argument that Altman's organization betrayed its nonprofit mission by cozying up to commercial interests. Yet here is Musk, within days of testifying in that trial, signing a bilateral compute deal with OpenAI's most formidable frontier model competitor. The message, intentional or not: this was always about positioning, not principle.
Skeptic's Corner
The Colossus 1 deal is elegant on paper and uncertain in practice. Anthropic is not a stable, long-term anchor tenant — it's a company currently in fundraising discussions, operationally constrained by demand it is still working out how to monetize. If Anthropic's growth trajectory flattens, or if Amazon and Google capacity comes online ahead of schedule and Anthropic reduces its dependence on SpaceX, xAI could find itself with a stranded asset and a relationship that solved a short-term problem for both sides but created no durable competitive moat for either. Neocloud economics are brutal. CoreWeave carries $21 billion in debt and is still burning cash. Musk's version of this business — with proprietary chip ambitions, orbital data centers, and a branded AI product competing for the same GPUs it's renting out — is structurally more complex, not less.
Key Takeaways
300MW of Colossus 1 compute, representing over 220,000 Nvidia GPUs, now runs Anthropic workloads — capacity that became available because xAI's training migrated to the newer Colossus 2 cluster.
The deal underscores a structural truth that hyperscalers like Google have already been forced to confront: when you're building both the model and the infrastructure, every GPU you allocate internally is a GPU you can't rent. Google Cloud acknowledged being "capacity constrained" last month. Meta built an entirely new cloud apparatus to insulate its AI ambitions from that tradeoff. xAI, by contrast, chose to monetize the surplus.
Anthropic's compute stack now spans SpaceX, Amazon, Google, Microsoft, and Fluidstack — a deliberate multi-supplier strategy that mirrors how sophisticated enterprises de-risk cloud dependencies. The irony is that the company most vocal about AI safety is also the one most aggressively acquiring optionality on compute, globally.
The global dimension matters here. Anthropic has explicitly cited geographic expansion and data residency requirements in regulated industries — Europe's AI Act enforcement apparatus, India's emerging data localization frameworks, and Middle East sovereign AI initiatives are all creating demand for regionally distributed compute. SpaceX's orbital data center ambitions, whatever their 2035 timeline, speak directly to jurisdictional arbitrage at scale.
What to Watch Next
SpaceX IPO filing language — specifically how the company characterizes AI infrastructure revenue versus launch and Starlink revenue. The Anthropic deal's billing structure will tell you whether SpaceX is treating Colossus as a core business line or a one-time monetization.
Colossus 2 utilization — if xAI's training workloads genuinely fill Colossus 2, the company's self-portrait as a frontier model developer holds. If Colossus 2 capacity also becomes available to external customers within 12 months, xAI's pivot to infrastructure-first is confirmed, not suspected.
Grok's response to Anthropic's capacity unlock — Anthropic removing peak-hour limits and doubling Claude Code quotas puts direct competitive pressure on Grok's developer adoption ambitions. Watch whether xAI responds with product improvements or leans further into the enterprise infrastructure play.
Pentagon and Anthropic litigation outcomes — Anthropic's lawsuit against the Trump administration's Defense Department blacklist is active. If Anthropic regains access to government contracts, its urgency to secure non-governmental compute relationships diminishes. That changes the negotiating dynamic for any Colossus 2 renewal.
Terafab timeline — xAI's in-house chip fabrication ambitions are the wildcard that could either transform the neocloud economics of this arrangement or remain vaporware. If Musk's chips arrive on schedule, the Nvidia dependency — and the pricing leverage Nvidia holds over every neocloud — begins to erode.
The most important thing the xAI-Anthropic deal tells you is not that former rivals can become partners. It's that the AI infrastructure layer is consolidating faster than anyone expected, and the companies willing to rent out their compute may end up better positioned than those that treat GPUs like secrets. That's a lesson investors in CoreWeave, Nebius, and every neocloud-adjacent play should be considering right now.