AMD launches AI chip MI325X to compete with Nvidia's Blackwell

AMD launched a new artificial intelligence chip on Thursday that directly targets Nvidia's data center graphics processors, known as GPUs.

The Instinct MI325X, as the chip is called, will begin production before the end of 2024, AMD said Thursday during an event announcing the new product. If developers and cloud giants come to view AMD's AI chips as a close substitute for Nvidia's products, it could put pricing pressure on Nvidia, which posted a gross margin of about 75% over the past year while its GPUs were in high demand.

Advanced generative AI such as OpenAI's ChatGPT requires massive data centers full of GPUs to perform the necessary processing, which has fueled demand for more companies to supply AI chips.

In recent years, Nvidia has dominated the data center GPU market, with AMD historically in second place. Now AMD wants to take market share from its Silicon Valley rival, or at least capture a large part of a market that is expected to be worth $500 billion by 2028.

“AI demand has actually continued to rise and has even exceeded expectations. It is clear that the rate of investment continues to increase everywhere,” said AMD CEO Lisa Su at the event.

AMD didn't announce any new major cloud or internet customers for its Instinct GPUs at the event, but the company has previously said that both Meta and Microsoft buy its AI GPUs and that OpenAI uses them for some applications. The company also didn't announce pricing for the Instinct MI325X, which is typically sold as part of a complete server.

With the introduction of the MI325X, AMD is accelerating its product plan to release new chips annually to better compete with Nvidia and capitalize on the boom in AI chips. The new AI chip is the successor to the MI300X, which began shipping late last year. AMD's 2025 chip will be called MI350, and its 2026 chip will be called MI400, the company said.

The launch of the MI325X will pit it against Nvidia's upcoming Blackwell chips, which Nvidia says will ship in significant quantities early next year.

A successful launch of AMD's latest data center GPU could spark interest from investors looking for more companies to capitalize on the AI boom. AMD is up just 20% so far in 2024, while Nvidia's stock is up over 175%. Most industry estimates have Nvidia holding more than 90% of the data center AI chip market.

AMD shares fell 3% during trading on Thursday.

AMD's biggest obstacle to gaining market share is that Nvidia's chips use the company's own programming language, CUDA, which has become the standard among AI developers. That effectively locks developers into the Nvidia ecosystem.

In response, AMD said this week that it has improved its competing software, called ROCm, so that AI developers can more easily move their AI models over to AMD's chips, which the company calls accelerators.
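One reason portability is plausible in practice: frameworks such as PyTorch ship ROCm builds that expose AMD accelerators through the same CUDA-style device API used on Nvidia hardware, so much existing GPU code can run without source changes. The snippet below is a minimal sketch of that idea, assuming a PyTorch build with ROCm (or CUDA) support is installed; it is illustrative, not AMD's own tooling.

```python
# Minimal sketch: the same PyTorch code can target an Nvidia GPU (CUDA build)
# or an AMD accelerator (ROCm build), because PyTorch's ROCm build reuses the
# torch.cuda device API. Assumes a GPU-enabled PyTorch installation.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
name = torch.cuda.get_device_name(0) if device.type == "cuda" else "CPU"
print(f"Running on: {name}")

# A tiny model and forward pass; on a ROCm build, "cuda" maps to the AMD GPU.
model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(8, 1024, device=device)
with torch.no_grad():
    y = model(x)
print(y.shape)
```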

AMD has positioned its AI accelerators as most competitive for use cases in which AI models create content or make predictions, rather than for training, when an AI model processes terabytes of data to improve. That is partly due to the advanced memory AMD uses on its chip, which allows it to serve Meta's Llama AI model faster than some Nvidia chips.

“What you see is that the MI325 platform delivers up to 40% more inference performance than the H200 on Llama 3.1,” Su said, referring to Meta's large language model.

Competing with Intel, too

While AI accelerators and GPUs have become the most closely watched part of the semiconductor industry, AMD's core business is central processors, or CPUs, which sit at the heart of nearly every server in the world.

AMD's data center revenue more than doubled to $2.8 billion in the June quarter compared to last year, with AI chips accounting for only about $1 billion, the company said in July.

According to the company, AMD accounts for about 34% of total spending on data center CPUs. That is still less than Intel, which remains the market leader with its Xeon chip line. AMD is looking to change that with a new CPU lineup, called EPYC 5th Gen, also announced on Thursday.

These chips come in a range of configurations, from a low-cost, low-power 8-core chip that costs $527 to 192-core, 500-watt processors aimed at supercomputers that cost $14,813 per chip.

According to AMD, the new CPUs are particularly well suited for feeding data to AI workloads. Nearly all GPUs require a CPU in the same system to boot the computer.

“Today’s AI is all about CPU capability, and you see that in data analysis and many other applications of this kind,” Su said.
