Microsoft will reportedly soon offer AMD's artificial intelligence processors to its cloud computing customers as an alternative to Nvidia's H100 family of graphics processing units (GPUs), which are currently hard to come by.

Because the data and computation involved cannot fit on a single chip, businesses usually need to connect, or cluster, many GPUs to run applications or build AI models.
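
For readers who want a concrete picture of what "clustering" GPUs looks like in software, here is a minimal PyTorch sketch of data-parallel training across several GPUs on a single machine. It is illustrative only: the model, batch shape, and addresses are placeholders, and it assumes accelerators exposed through a standard PyTorch build (CUDA or ROCm), not any Microsoft- or AMD-specific API.

```python
# Minimal sketch of multi-GPU training: each process drives one GPU, data is
# split across them, and gradients are synchronized each step. Illustrative
# values throughout; assumes a machine with multiple GPUs and PyTorch installed.
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP


def train(rank: int, world_size: int) -> None:
    # One process per GPU; together they form the "cluster" on this node.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    model = torch.nn.Linear(1024, 1024).cuda(rank)   # stand-in for a large model
    model = DDP(model, device_ids=[rank])            # keeps gradients in sync across GPUs
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for _ in range(10):
        x = torch.randn(32, 1024, device=rank)       # each rank trains on its own data shard
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()                               # gradient all-reduce happens here
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = torch.cuda.device_count()            # number of GPUs on this node
    mp.spawn(train, args=(world_size,), nprocs=world_size)
```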

AMD has said the chips are powerful enough to run and train massive AI models, and the company anticipates $4 billion in AI chip sales this year.

Microsoft will sell clusters of Advanced Micro Devices' flagship MI300X AI chips through its Azure cloud computing service, with further details to be disclosed at the tech giant's Build developer conference.

AMD's AI chips are positioned as an alternative to Nvidia's powerful H100 family of graphics processing units (GPUs), which are in high demand and dominate the data center AI chip market.

Microsoft will also preview its upcoming Cobalt 100 custom CPUs at the conference. In addition to Nvidia's premium AI chips, Microsoft's cloud computing division offers access to Maia, an AI chip it developed in-house.

Read Also: Samsung Profits Increase After Riding AI Demand on Chips 

AI Constricts Memory Chip Supply

The AI wave continues to strain parts of the supply chain, with recent reports indicating that high-bandwidth memory chips from SK Hynix and Micron are nearly sold out for this year and next. SK Hynix currently supplies these chips to Nvidia, which is also reportedly evaluating Samsung as an additional source.

AI adoption has surged, and training large language models (LLMs) such as OpenAI's ChatGPT requires high-performance memory chips. These chips allow LLMs to retain details from past interactions and user preferences so they can respond to queries in a humanlike way.

SK Hynix intends to increase output to keep up with the surge in demand. To achieve this, it plans to invest in the Yongin semiconductor cluster in South Korea, the M15X fab in Cheongju, and state-of-the-art packaging facilities in Indiana.

According to sources, the need for AI chips is being driven by large tech companies such as Microsoft and Amazon, which are spending billions on training their LLMs to remain competitive. 

Apple's Own AI Chip

Apple is also reportedly developing its own AI chip to run artificial intelligence software in its data centers. The initiative would build on Apple's earlier work designing in-house chips for its Macs, iPhones, and other devices.

Apple shares rose 1.2% in late trading after the news; the stock had fallen 5.6% for the year to date. The Cupertino, California-based company did not immediately respond to a request for comment.

Apple has been seen as lagging behind its tech rivals in generative AI, the technology that powers chatbots and other cutting-edge tools. However, the company is preparing to present a novel take on AI at its Worldwide Developers Conference in June.

Related Article: Honda, IBM in Building AI-Powered Chips for Next-Gen Cars 

Written by Aldohn Domingo

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.