Arm has long run its business as an architect behind the scenes, designing the chips that power almost all the world's smartphones and making money from licensing those designs and collecting royalties from customers.
Now, Arm is changing it up by announcing its own AI chip, the Arm AGI CPU.
Arm CEO Rene Haas said Tuesday at a company conference that this massive pivot wasn’t just an internal strategy shift—it was a direct plea from the world’s most powerful AI giants. The company name-dropped OpenAI and Meta as major partners for this chip.
“The biggest reason we’re doing this is that our partners have asked for it,” Haas said Tuesday.
Amid energy constraints and memory shortages, the AI boom has created a massive bottleneck in data centers. Faced with this demand, Arm stepped up with an AI chip that it says is more energy-efficient. Arm says it sees a $1.5 trillion market opportunity as it moves into AI chips for cloud, edge, and physical AI.
Arm stock was up by more than 18% on Wednesday. Mizuho analysts wrote that they see "strong growth opportunities" for Arm in AI infrastructure and the automotive industry. Bank of America research analyst Vivek Arya, however, wrote in a note to investors that the company's outlook could be "too ambitious."
Meta and OpenAI partner with Arm
Meta has been building out data centers at a massive scale to power its apps and its latest superintelligence ventures. Santosh Janardhan, head of infrastructure at Meta, said Tuesday onstage that its coming “Hyperion” cluster could draw 5 gigawatts of power, enough to power 50 towns the size of Palo Alto.
“If we met the performance, we couldn’t get the power. If we got the power, we wouldn’t get the performance,” Janardhan said.
This sparked an engineering project within Meta, where engineers were “working ’round the clock” to port its systems to Arm in three months, said Paul Saab, a Meta engineer.
“I didn’t even ask my boss here for permission to buy these machines or even start the project,” Saab said onstage.
While Saab says he saw major performance benefits, there wasn't an Arm chip available to buy at the time.
OpenAI faced a similar problem. Its compute demand has grown massively as it trains and runs its ChatGPT models, its AI coding tool Codex, and more.
"That is one of the most common things I hear inside OpenAI: I need more compute," Kevin Weil, OpenAI's vice president for science, said onstage, adding that the company needed chips that were energy-efficient.
Arm said it expects this chip to generate $15 billion in revenue by fiscal 2031.
The chip market is ‘getting very crowded’
Arm faces the risk that the CPU market is “getting very crowded,” Arya wrote in his analyst note. Other competitors, such as AMD, Nvidia, and Intel, have more CPU products and more established customers. Notably, both Meta and OpenAI also work with AMD and Nvidia, which could leave “limited” opportunity for Arm’s new CPU, Arya wrote.
“Moreover, the bigger AI grows, the more pressure ARM’s smartphone/consumer markets would have from limited memory supplies,” Arya wrote.
That said, the increasing demand has led many customers to turn to chip companies beyond Nvidia for their computing needs. Both Meta and OpenAI also work with Broadcom to build AI chips.
The rise of AI agents has also led to greater demand for inference, or how AI models draw conclusions and make predictions. While Nvidia's core AI chips, GPUs, dominate AI training, CPUs like Arm's AGI CPU can also help with inference. Nvidia has recently made moves into this market as well.
Have a tip? Contact this reporter via email at rmchan@businessinsider.com, or Signal at rosal.13. Use a personal email address, a nonwork WiFi network, and a nonwork device; here’s our guide to sharing information securely.
