AMD accelerates pace of data centre AI innovation and leadership with expanded AMD Instinct GPU roadmap

At Computex 2024, AMD showcased the growing momentum of the AMD Instinct accelerator family during the opening keynote by Chair and CEO Dr. Lisa Su. AMD unveiled a multiyear, expanded AMD Instinct accelerator roadmap that will bring an annual cadence of leadership AI performance and memory capabilities at every generation.

The updated roadmap starts with the new AMD Instinct MI325X accelerator, which will be available in Q4 2024. Following that, the AMD Instinct MI350 series, powered by the new AMD CDNA 4 architecture, is expected to be available in 2025, bringing up to a 35x increase in AI inference performance compared to the AMD Instinct MI300 Series with AMD CDNA 3 architecture [1]. Expected to arrive in 2026, the AMD Instinct MI400 series is based on the AMD CDNA “Next” architecture.

“The AMD Instinct MI300X accelerators continue their strong adoption from numerous partners and customers including Microsoft Azure, Meta, Dell Technologies, HPE, Lenovo and others, a direct result of the AMD Instinct MI300X accelerator’s exceptional performance and value proposition,” said Brad McCredie, corporate vice president of Data Center Accelerated Compute, AMD. “With our updated annual cadence of products, we are relentless in our pace of innovation, providing the leadership capabilities and performance the AI industry and our customers expect to drive the next evolution of data center AI training and inference.”

AMD AI software ecosystem matures

The AMD ROCm™ 6 open software stack continues to mature, enabling AMD Instinct MI300X accelerators to drive impressive performance for some of the most popular LLMs. On a server using eight AMD Instinct MI300X accelerators and ROCm 6 running Meta Llama-3 70B, customers can get 1.3x better inference performance and token generation compared to the competition [2]. On a single AMD Instinct MI300X accelerator with ROCm 6, customers can get 1.2x better inference performance and token generation throughput than the competition on Mistral-7B [3]. AMD also highlighted that Hugging Face, the largest and most popular repository for AI models, is now testing 700,000 of its most popular models nightly to ensure they work out of the box on AMD Instinct MI300X accelerators. In addition, AMD is continuing its upstream work on popular AI frameworks such as PyTorch, TensorFlow and JAX.

AMD previews new accelerators and reveals annual cadence roadmap

During the keynote, AMD revealed an updated annual cadence for the AMD Instinct accelerator roadmap to meet the growing demand for more AI compute. This will help ensure that AMD Instinct accelerators propel the development of next-generation frontier AI models.

The updated AMD Instinct annual roadmap highlighted:
- The new AMD Instinct MI325X accelerator, which will bring 288GB of HBM3E memory and 6 terabytes per second of memory bandwidth, uses the same industry-standard Universal Baseboard server design as the AMD Instinct MI300 series and will be generally available in Q4 2024. The accelerator will have industry-leading memory capacity and bandwidth, 2x and 1.3x better than the competition respectively [4], and 1.3x better compute performance than the competition [5].

- The first product in the AMD Instinct MI350 Series, the AMD Instinct MI350X accelerator, is based on the AMD CDNA 4 architecture and is expected to be available in 2025. It will use the same industry-standard Universal Baseboard server design as other MI300 Series accelerators and will be built using advanced 3nm process technology, support the FP4 and FP6 AI datatypes, and have up to 288 GB of HBM3E memory.
- The AMD CDNA “Next” architecture, which will power the AMD Instinct MI400 Series accelerators, is expected to be available in 2026, providing the latest features and capabilities that will help unlock additional performance and efficiency for inference and large-scale AI training.
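As a back-of-the-envelope illustration, the MI325X bullet's stated multipliers can be inverted to see what competitor specifications they imply. The MI325X figures (288 GB, 6 TB/s) come from the announcement above; the competitor values computed below are only what the 2x and 1.3x claims imply, not published specifications of any product.

```python
# Sanity-check of the MI325X memory-advantage claims (illustrative only).
MI325X_MEMORY_GB = 288        # HBM3E capacity stated in the announcement
MI325X_BANDWIDTH_TBPS = 6.0   # memory bandwidth stated in the announcement

CAPACITY_ADVANTAGE = 2.0      # "2x better" memory capacity claim
BANDWIDTH_ADVANTAGE = 1.3     # "1.3x better" memory bandwidth claim

# Dividing the MI325X figures by the claimed advantages yields the
# competitor specs the claims imply.
implied_competitor_memory_gb = MI325X_MEMORY_GB / CAPACITY_ADVANTAGE
implied_competitor_bandwidth_tbps = MI325X_BANDWIDTH_TBPS / BANDWIDTH_ADVANTAGE

print(f"Implied competitor memory capacity: {implied_competitor_memory_gb:.0f} GB")
print(f"Implied competitor memory bandwidth: {implied_competitor_bandwidth_tbps:.1f} TB/s")
```

The calculation works out to roughly 144 GB of capacity and about 4.6 TB/s of bandwidth for the unnamed competitor part.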

Finally, AMD highlighted that demand for AMD Instinct MI300X accelerators continues to grow, with numerous partners and customers using the accelerators to power their demanding AI workloads, including:
- Microsoft Azure using the accelerators for Azure OpenAI services and the new Azure ND MI300X V5 virtual machines.
- Dell Technologies using MI300X accelerators in the PowerEdge XE9680 for enterprise AI workloads.
- Supermicro providing multiple solutions with AMD Instinct accelerators.
- Lenovo powering Hybrid AI innovation with the ThinkSystem SR685a V3.
- HPE using them to accelerate AI workloads in the HPE Cray XD675.
