Microsoft this week shipped its first crop of AI chips into one of its data centers, and it plans to roll out more in the coming months.
The chip, called the Maia 200, is designed to be what Microsoft calls an “AI inference powerhouse,” meaning it’s built to work hard at running AI models in production. The company released the Maia’s impressive specs, saying it outperforms Amazon’s latest in-house chips and Google’s latest Tensor Processing Units (TPUs).
All of the cloud giants are turning to their own AI chip designs in part because of the pressures, and costs, of acquiring the latest and greatest from Nvidia, a shortfall that shows no signs of easing.
But despite this new, high-performance chip, Microsoft CEO Satya Nadella said the company will still buy chips made by others.
“We have a great partnership with Nvidia, with AMD. They are innovating. We are innovating,” he explained. “I think a lot of people talk about who’s ahead. Just remember, you always have to be ahead.”
He added: “Just because we can integrate directly doesn’t mean we only integrate directly,” meaning building its own machines from top to bottom rather than relying solely on products from other vendors.
That said, the Maia 200 will be used by Microsoft’s team known as Superintelligence, AI experts developing the company’s own frontier models. That’s according to Mustafa Suleyman, the Google DeepMind co-founder who now leads the team. Microsoft is working on its own models to perhaps one day reduce its reliance on OpenAI, Anthropic, and other model developers.
The Maia 200 chip will also support OpenAI models running on Microsoft’s Azure cloud platform, the company says. But getting access to the most advanced AI chips remains difficult for everyone, paying customers and internal teams alike.
So in a post on X, Suleyman was clearly delighted to share the news that his team gets first dibs. “It’s a big day,” he wrote of the chip’s launch. “Our Superintelligence team will be the first to use the Maia 200 to develop our own frontier AI models.”