Microsoft, on November 15, 2023, announced two custom-designed computing chips, joining other big tech firms that – faced with the high cost of delivering artificial intelligence services – are bringing key technologies in-house. Microsoft said it does not plan to sell the chips but instead will use them to power its own subscription software offerings and as part of its Azure cloud computing service.
At its Ignite developer conference in Seattle, Microsoft introduced a new chip, called Maia, to speed up AI computing tasks and provide a foundation for its $30-a-month ‘Copilot’ service for business software users, as well as for developers who want to build custom AI services. The Maia chip was designed to run large language models, the type of AI software that underpins Microsoft’s Azure OpenAI service, a product of the company’s collaboration with ChatGPT creator OpenAI.
Microsoft and other tech giants such as Alphabet are grappling with the high cost of delivering AI services, which can be 10 times greater than for traditional services such as search engines. Microsoft executives have said they plan to tackle those costs by routing nearly all of the company’s sprawling efforts to put AI in its products through a common set of foundational AI models. The Maia chip, they said, is optimized for that work.

The second chip, named Cobalt, is a central processing unit (CPU) built with technology from Arm Holdings. Microsoft disclosed on November 15, 2023, that it had already been testing Cobalt to power Teams, its business messaging tool. But Microsoft executive Scott Guthrie said the company also wants to sell direct access to Cobalt to compete with the ‘Graviton’ series of in-house chips offered by Amazon Web Services (AWS). https://tinyurl.com/2vpx7c72
Source: IBP