Scott Guthrie, executive vice president of cloud and enterprise at Microsoft, speaks at the Microsoft Build developer conference in Seattle on May 7, 2018. The Build conference, held in Seattle for the second year in a row, is expected to focus on the company’s cloud technologies and the artificial intelligence capabilities within those services.
Grant Hinsley | Bloomberg | Getty Images
Microsoft announced the next generation of its artificial intelligence chips, which could challenge processors from industry leader Nvidia, as well as products from cloud rivals Amazon and Google.
The Maia 200 arrives two years after Microsoft announced it had developed its first AI chip, the Maia 100, which was never made available for rent to cloud customers. Scott Guthrie, Microsoft’s executive vice president of cloud and AI, said in a blog post Monday that the new chip “will be available to a broader range of customers in the future.”
Guthrie called Maia 200 “the most efficient inference system Microsoft has ever deployed.” Developers, academics, AI labs, and people contributing to open source AI models can apply for a preview of the software development kit.
Microsoft said its superintelligence team, led by Mustafa Suleyman, will use the new chip. It also powers the Microsoft 365 Copilot add-on for commercial productivity software bundles and Microsoft Foundry services for building on AI models.
Cloud providers are facing a surge in demand from generative AI model developers like Anthropic and OpenAI, as well as companies building AI agents and other products based on popular models. Data center operators and infrastructure providers are looking to increase computing power while reducing power consumption.
Microsoft is installing Maia 200 chips in data centers in the Central US region, and will later roll them out in three West US regions, with additional locations planned.
The chips are manufactured on Taiwan Semiconductor Manufacturing Co.’s 3-nanometer process. Four chips are interconnected within each server, and the system relies on Ethernet cables rather than the InfiniBand standard. Nvidia has sold InfiniBand switches since acquiring Mellanox in 2020.
The chip offers 30% higher performance than similarly priced alternatives, Guthrie wrote. Microsoft says each Maia 200 has more high-bandwidth memory than Amazon Web Services’ third-generation Trainium AI chips or Google’s seventh-generation tensor processing units.
Microsoft can wire up to 6,144 Maia 200 chips together to deliver high performance and reduce energy usage and total cost of ownership, Guthrie wrote.
In 2023, Microsoft demonstrated that the GitHub Copilot coding assistant could run on Maia 100 processors.