Microsoft Plans to Launch Own AI Chip to Rival Nvidia’s GPUs—Report

Microsoft is planning to launch what’s been called its “first” AI chip next month, according to a report by The Information. The move is seen as a way for the company to reduce its dependence on Nvidia-made GPUs, which are in short supply due to high demand.

Codenamed ‘Athena,’ the Microsoft AI chip is “designed for data center servers that train and run large language models (LLMs),” the kind of technology that underpins the viral generative AI chatbot ChatGPT and Google’s Bard.

Microsoft aims to secure stable chip supply

Citing sources with direct knowledge of the matter, The Information reported that Microsoft’s AI processor is “a culmination of years of work” and could help the Redmond-based company “lessen its reliance on Nvidia-designed AI chips,” which have become scarce.

The chip, which will likely be unveiled at Microsoft’s Ignite conference between Nov. 14 and 17, is expected to compete with Nvidia’s powerful A100 graphics processing unit (GPU). Nvidia’s chips are being used to run LLMs and other AI apps by several companies, including OpenAI.

Nvidia’s A100 GPUs cost up to $15,000 each. Microsoft’s data centers currently use them to power state-of-the-art large language models for cloud customers such as Intuit, as well as AI features in the company’s own productivity apps such as MS Word.

With the Athena AI chip, Microsoft would gain more control over its own hardware, helping the company secure a stable supply of chips for its AI-related projects at lower cost and with greater efficiency.

The development of Athena comes at a time when demand for artificial intelligence chips is soaring. Large language models, in particular, require massive amounts of computing power to train and run. This has led to a shortage of AI chips, causing prices to rise.

For example, a report released by research firm TrendForce in March revealed that ChatGPT may need more than 30,000 Nvidia GPUs to process training data, potentially squeezing the supply of graphics cards available to PC gamers.

OpenAI’s GPT model needed about 20,000 graphics processing units to process training data in 2020, when the model had roughly 180 billion parameters. As ChatGPT continues to grow and evolve, TrendForce expects that figure to climb sharply.

AI chip war

Nvidia designs GPUs for gaming and graphics cards, chips for video game consoles, and processors for the AI industry. Its GPUs have also been deployed to mine crypto assets such as ethereum (ETH), monero (XMR), and zcash (ZEC).

In recent months, the chips have been widely used in the AI sector, causing supply shortages. Microsoft’s latest venture into chip-making reflects a growing trend within the tech industry: after the race to build AI chatbots, firms may now be entering an AI chip ‘war’.

As MetaNews reported on Tuesday, OpenAI, the company behind ChatGPT, is exploring building its own artificial intelligence chips to address the market shortage. The San Francisco-based firm is reportedly evaluating a potential acquisition target.

Meta is also developing its own custom silicon, the Meta Training and Inference Accelerator, to drive its AI ambitions. Amazon, the world’s biggest e-commerce marketplace, has been designing its own chips since 2013 and is now speeding up those plans amid the generative AI race.

Image credits: Shutterstock, CC images, Midjourney, Unsplash.
