Українська правда

AMD and OpenAI Introduce New MI400 Chips for AI Processing

- 13 June, 01:21 PM

AMD has announced its Instinct MI400 series of artificial intelligence chips, which will be available next year. AMD CEO Lisa Su made the announcement at a presentation in San Jose, saying the new GPUs will be able to work as part of Helios, a rack-scale system: a server rack that can scale to data-center level. CNBC reports this.

OpenAI CEO Sam Altman also appeared on stage and confirmed that his company would use the MI400, calling the chip's specifications "incredible".

The MI400 is designed to compete directly with NVIDIA's Blackwell chips (B100, B200), which are already used in high-performance AI clusters. AMD notes that its chips have more high-speed memory, which allows them to run large language models on a single GPU, improving the efficiency of inference workloads.

AMD says its current MI355X chip, which is already shipping to cloud providers, is 7x more efficient than its predecessor and delivers 40% more compute (in tokens) per dollar thanks to lower power consumption. The chip will form the foundation of Oracle's cloud services, with the company planning to deploy more than 131,000 of them.

Despite these advances, AMD still trails NVIDIA significantly; analysts estimate NVIDIA controls over 90% of the AI-GPU market. However, AMD expects the overall AI chip market to exceed $500 billion by 2028 and aims to capture a significant share through competitive pricing and open solutions.

AMD is actively investing in the ecosystem, having acquired or invested in 25 companies over the past year, including server manufacturer ZT Systems, and is developing an open infrastructure with the UALink networking technology. This allows CPU, GPU, and networking solutions to be integrated into a single stack, in contrast to NVIDIA's proprietary technologies such as NVLink and CUDA.

AMD's customers include not only OpenAI but also Tesla, xAI, Cohere, Meta (which uses the chips for Llama), and Microsoft (for Copilot). Su noted that the company plans to update its AI chips annually to keep pace with the market.