Thursday, June 15th 2023
AMD Confirms that Instinct MI300X GPU Can Consume 750 W
AMD recently revealed its Instinct MI300X GPU at its Data Center and AI Technology Premiere event on Tuesday (June 13). The keynote presentation did not provide any details about the new accelerator model's power consumption, but that did not stop one tipster - Hoang Anh Phu - from obtaining this information from Team Red's post-event footnotes. A comparative observation was made: "MI300X (192 GB HBM3, OAM Module) TBP is 750 W, compared to last gen, MI250X TBP is only 500-560 W." A leaked Giga Computing roadmap from last month anticipated server-grade GPUs hitting the 700 W mark.
NVIDIA's Hopper H100 - with its demand for a maximum of 700 W - had held the crown as the most power-hungry data center enterprise GPU until now. The MI300X's OCP Accelerator Module-based design surpasses Team Green's flagship with a slightly greater rating. AMD's new "leadership generative AI accelerator" sports 304 CDNA 3 compute units, a clear upgrade over the MI250X's 220 (CDNA 2) CUs. Engineers have also introduced new 24 GB HBM3 stacks, so the MI300X can be specced with a maximum of 192 GB of memory; the MI250X is limited to a 128 GB memory capacity with its slower HBM2E stacks. We hope to see sample units producing benchmark results very soon, with the MI300X pitted against the H100.
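A quick sketch of the arithmetic behind those figures. Note the eight-stack count is inferred from 192 GB ÷ 24 GB per stack, not stated by AMD, and the percentage uses the top of the MI250X's quoted 500-560 W range:

```python
# Back-of-the-envelope comparison using only the figures quoted above.
STACK_GB = 24          # capacity of one HBM3 stack on the MI300X
TOTAL_GB = 192         # maximum MI300X memory capacity
stacks = TOTAL_GB // STACK_GB
print(f"Inferred MI300X HBM3 stack count: {stacks}")

mi300x_tbp = 750       # W, per the post-event footnotes
mi250x_tbp_max = 560   # W, top of the MI250X's quoted range
increase = (mi300x_tbp - mi250x_tbp_max) / mi250x_tbp_max
print(f"TBP increase vs. MI250X (560 W): {increase:.0%}")
```

Against the 560 W figure, the MI300X's 750 W rating works out to roughly a one-third generational increase in board power.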
Sources:
VideoCardz, AnhPhuH Tweet
news.microsoft.com/source/features/sustainability/project-natick-underwater-datacenter/
1 - HBM3 is costly and may negate the cost advantage AMD might have with the MI300. Nvidia is likely to ship HBM3 products in the same timeframe or earlier
2 - There is no apparent equivalent to the transformer engine, which can triple performance in LLM scenarios in Nvidia counterparts
3 - Nvidia's H100 is shipping in full volume today, with more researcher and technical support in its superior ecosystem
4 - AMD has yet to disclose benchmarks
I hope AMD can resolve or alleviate some of these issues because it seems like an excellent product overall.
And I think the 570-watt peak I have seen on my overclocked RTX 4090 is bad enough. Yes, I am aware this is meant for other things than gaming, but 700 watts or more is still a lot of power for one GPU.
NVIDIA is a software company.
By then NVIDIA will have built its 1:1 virtual/real-world model in the Omniverse, which every major manufacturer has already signed on to, just as with CUDA's dominance for the past decade.
This image is from 15 months ago (3/22).
Current ecosystem: www.nvidia.com/en-us/omniverse/ecosystem/
Where is AMD? I hope they show up - a monopoly isn't a great situation - but the product needs to deliver. Zen did, but that was a hardware success, not a software success. The software (AGESA) for Zen has been buggy in each iteration for years now, which they hope to fix eventually with the open-source openSIL.
"Narrative" aside, we'll see how they do in 2023 for enterprise GPU; in 2021 they couldn't breach 9%, and the trend isn't changing - that percentage went down in 2022.
www.nasdaq.com/articles/better-buy:-nvidia-vs.-amd-2
AMD drivers again? I have no issues, and I haven't seen the masses abnormally arrayed against AMD drivers - just the odd hyperbolic statement.
If you really want to make them almost completely eco friendly just launch them into space!
If they're going to be run, they may as well be run more efficiently.
Average ocean temperature is 0-20 °C; even if that doubled (projections are a couple °C of increase over several centuries), it would still be an effective cooling medium.
I wouldn't be surprised to see local ecology find some way to benefit from the heat source, as with the coral/bacteria ecosystems near underwater volcanoes, vents, etc.