At its recent Advancing AI event in San Francisco, AMD introduced the Instinct MI325X AI accelerator. Although the chip falls short of the 288 GB of HBM3e memory AMD originally promised, it ships with 256 GB, a considerable upgrade over the 192 GB of the MI300X. That puts AMD well ahead of Nvidia's H200 in both memory capacity and bandwidth, potentially making the MI325X an attractive option for cloud providers. Power consumption has risen to 1,000 watts, but AMD claims substantial performance gains in real-world AI workloads. The company also previewed the upcoming MI355X, which is expected to bring even larger improvements. Together, the MI325X and AMD's forthcoming networking products reflect the company's intent to compete aggressively in the AI accelerator market.