Is xAI a neocloud now?
xAI’s surprise partnership with Anthropic suggests the company may be shifting its focus from building AI models to selling compute capacity. By renting out its Colossus 1 data center, xAI is positioning itself more like a neocloud provider than a traditional AI developer.

On Wednesday, xAI and Anthropic announced a surprise partnership that has the Claude maker buying out “all of the compute capacity at [xAI’s] Colossus 1 data center,” roughly 300MW, enough for Anthropic to immediately raise its usage limits. It’s a huge deal for xAI, likely worth billions of dollars. More importantly, it immediately monetized one of the company’s most impressive accomplishments, turning xAI from a consumer of compute into a provider of it.
It’s tempting to see the arrangement as a shot at OpenAI amid the ongoing lawsuit. But Musk’s explanation on X was simpler: xAI had already moved training to a newer data center, Colossus 2, and simply didn’t need both.
Short-term logic
In the short term, there’s an obvious logic at work. xAI’s existing products are mostly focused on Grok, which has seen plummeting usage since the image generation debacles earlier this year. If xAI’s data center buildout exceeds what Grok needs to operate, partnering with Anthropic adds significant revenue to the top line. That’s especially useful as the company, now combined with SpaceX, speeds toward an IPO.
More broadly, having Anthropic lined up as a customer makes it easier to believe that SpaceX’s orbital data center play might actually work.
A different strategic signal
Beyond the short-term benefit, the Anthropic partnership sends an unusual message about where Elon Musk’s priorities lie. It suggests the company’s real business may be more about building data centers than training AI models.
It’s rare to see a major tech company treat compute resources this way. Companies like Google and Meta, which are also training models, are building more data centers—but when forced to choose between selling available compute to customers and preserving it to build their own tools, they reliably choose the latter.
Just last month, Sundar Pichai admitted on a call that Google Cloud revenue was lower than it could have been because the company was “capacity constrained.” When given the choice of renting out GPUs or using them to develop AI products, Google chose the AI products.
Meta has faced a more extreme version of the same constraint, spinning up an entirely new cloud apparatus to ensure it would have enough GPU power to chase Mark Zuckerberg’s AI ambitions. As he put it when announcing Meta Compute in January, “How we engineer, invest, and partner to build this infrastructure will become a strategic advantage.”
The key word is “strategic.” Both Zuckerberg and Pichai are looking toward a future where AI powers the most popular and lucrative systems in the world. Computing power isn’t just a way to satisfy today’s inference demand—it’s a foundation for tomorrow’s products. Running short on compute means missing that opportunity.
Positioning as a neocloud
By focusing on data centers—earthbound and otherwise—xAI is positioning itself more like a neocloud business: buying GPUs from Nvidia and renting them out to model developers like Anthropic. It’s a more difficult business, squeezed by chip suppliers on one side and shifting cycles of demand on the other.
Valuations reflect that reality. xAI was valued at $230 billion in its January funding round. CoreWeave, which oversees a comparable quantity of computing power, is worth less than a third of that.
Musk’s version of a neocloud is more ambitious. Some of the data centers might be in space—at least by 2035, if plans hold. xAI will also be making its own chips at the Terafab, reducing some of Nvidia’s pricing power. But none of that changes the fundamental economics of the neocloud business.
What happens to software ambitions?
As recently as the February all-hands, xAI outlined substantial software ambitions. That presentation unveiled the orbital data center project, but it also teased major goals in coding—since reinforced by the Cursor partnership—and initiatives like the Macrohard project, which aims to extend computer-use agents into full-scale digital twins.
These long-horizon projects require committed computing resources to succeed. As long as xAI is selling large quantities of compute to competitors, it’s difficult to see how those ambitions can be fully realized.