Friday, February 2nd 2024
Financial Analyst Outs AMD Instinct MI300X "Projected" Pricing
AMD's December 2023 launch of its new Instinct series accelerators has generated plenty of tech news buzz and excitement within the financial world, but not many folks are privy to Team Red's MSRP for the CDNA 3.0-powered MI300X and MI300A models. A Citi report has pulled back the curtain, albeit with "projected" figures—an inside source claims that Microsoft has purchased the Instinct MI300X 192 GB model for roughly $10,000 apiece. North American enterprise customers appear to have taken delivery of the latest MI300 products around mid-January—inevitably, closely guarded information has leaked out to reporters. Seeking Alpha's article (based on Citi's findings) alleges that Microsoft's data center division is AMD's top buyer of MI300X hardware—GPT-4 is reportedly up and running on these brand new accelerators.
The leakers claim that businesses further down the (AI and HPC) food chain are having to shell out $15,000 per MI300X unit, but this is a bargain when compared to NVIDIA's closest competing package—the venerable H100 SXM5 80 GB professional card. Team Green, similarly, does not reveal its enterprise pricing to the wider public—Tom's Hardware has kept tabs on H100 insider info and market leaks: "over the recent quarters, we have seen NVIDIA's H100 80 GB HBM2E add-in-card available for $30,000, $40,000, and even much more at eBay. Meanwhile, the more powerful H100 80 GB SXM with 80 GB of HBM3 memory tends to cost more than an H100 80 GB AIB." Citi's projection has Team Green charging up to four times more for its H100 product than Team Red's MI300X pricing. NVIDIA's dominant AI GPU market position could be challenged by cheaper yet still very performant alternatives—additionally, chip shortages have pushed Jensen & Co. outside their comfort zone. Tom's Hardware reached out to AMD for comment on the Citi pricing claims—a company representative declined this invitation.
Sources:
Seeking Alpha, Tom's Hardware, MSN Money
25 Comments on Financial Analyst Outs AMD Instinct MI300X "Projected" Pricing
Let's not forget that Intel probably still sells more Xeons while EPYC is the superior option.
But more seriously, even if the price of a myriad (=ten thousand) dollars is accurate, there may be more to this story. This is not a retail market. Microsoft may have paid much of it in advance to help finance the development - they wouldn't be the first to do that. Maybe they wanted the product early and accepted the role of the customer as a beta tester.
In a way it's probably also slowing down everyone's plans to create their own AI silicon. Lmao, Nvidia was driving the industry to go its own way and stop feeding a supplier's 75% margin, smh.
And then there is the problem of capacity for AMD - or maybe a lack of will to take chances and pay TSMC extra for extra capacity, for fear they could end up with a huge unsellable inventory in the end. Meaning AMD can't flood the world with EPYC CPUs even if there is huge demand.
AMD, on the other hand, depends more on open source architectures and software, which defers more of the cost to the community rather than AMD itself. This lowers AMD's expenditures, so they can charge less for their hardware.
Both strategies have merit but I always hope for the open source route in all things.
I recently swapped out Xeons for EPYCs and geezus christ; EPYCs are blowing Xeons out of the water. More cores for less, superior performance per watt, and more importantly, you can cut down on the number of servers for certain applications, since EPYC has single-socket chips with up to 128 cores and 256 threads.
One EPYC is replacing 2 Xeon servers here now.
Intel is getting hit twice this way: being replaced by both EPYCs and compute GPUs.
The big money is obviously in AI - with the MI300X they again have a serious competing product that is priced far lower than Nvidia's part and performs better.
People who worry about AMD shifting its focus to compute rather than gaming GPUs are wrong. CDNA and RDNA will simply continue; they're technically the same architecture, with CDNA stripped of anything VGA-related.
If anyone bought AMD stock a few years ago, watch it skyrocket. Telling you.
And this is only the beginning. They've abandoned the gaming/consumer markets. They don't even care. The recent 8000G STAPM issue is just more proof of that.
It's also sort of silly to say AMD is deprioritizing gamers. When you look at all the consoles using AMD, or the Steam Deck and the array of clones that followed it, it's quite clear that AMD is fully interested in pushing solutions for gaming.
The kicker is that GPU advances aren't cheap, which means the constant screaming for faster dedicated GPUs has put PC gaming on a fast track toward a future where dedicated GPUs are mainly for cloud gaming providers. Yet SoCs, APUs, and IGPs are getting better and better all the time.
For example, RDNA is now integrated into servers, PCs, consoles, handhelds, laptops, and even smartphones.