
Microsoft will use custom-designed chips to bolster its AI services

They'll reach its data centers early next year.


Microsoft has announced a project it has been "refining in secret for years": its own custom silicon in the form of two new server chips. The company unveiled the fruits of its labor at Microsoft Ignite, showing off the Azure Maia AI Accelerator and the Azure Cobalt CPU. The latter, at least, the company is happy to admit is ARM-based, which can still feel unthinkable to eyes so used to Microsoft and Intel's hand-in-glove dominance of the computing market.

Microsoft turned to OpenAI for feedback on Azure Maia and used OpenAI's models for testing. OpenAI CEO Sam Altman said Microsoft's updated Azure infrastructure will also make it possible to train better models and to make them more affordable for customers.

The custom-designed chips let Microsoft further optimize its infrastructure rather than relying on third-party options. "Much like building a house lets you control every design choice and detail, Microsoft sees the addition of homegrown chips as a way to ensure every element is tailored for Microsoft cloud and AI workloads," a blog post from the company explained. "The chips will nestle onto custom server boards, placed within tailor-made racks that fit easily inside existing Microsoft datacenters. The hardware will work hand in hand with software — co-designed together to unlock new capabilities and opportunities."

The company plans to use the new Maia 100 AI Accelerator to power some of Microsoft Azure's biggest internal AI workloads. Microsoft claims both the accelerator and the Azure Cobalt CPU will improve efficiency and performance. The chips will arrive in Microsoft's data centers early next year to power services like Microsoft Copilot (which now encompasses Bing Chat) and Azure OpenAI Service.
