BERLIN (DPA)

Microsoft has presented its new specialised chip, which is intended to make large AI applications faster and cheaper in the future.

The Maia 200 chip has been developed specifically for running AI models, not for training them. It will initially be deployed in Microsoft's data centres in the central US.

The company said on Monday that the Maia 200 operates particularly energy-efficiently as an “AI accelerator” and offers a better price-performance ratio than other systems.

Microsoft presented a self-developed AI chip called Maia 100 in November 2023, which was primarily intended to make the creation of AI content more efficient.

The software giant relies heavily on artificial intelligence in its core business, and has entered into a multi-billion-dollar partnership with ChatGPT maker OpenAI to incorporate its technology into its own products.

The self-designed chips are intended to help Microsoft reduce its strategic dependence on chipmaker Nvidia and keep operating costs under control.

The company also hopes that designing its own chips will allow software and hardware to be closely matched to each other, which should yield significantly more performance per watt.

In the fiercely competitive AI market, Microsoft's rivals Google, Amazon, and Meta are also pursuing ambitious chip projects of their own to lessen their reliance on Nvidia.