Friday, February 2nd 2024

Financial Analyst Outs AMD Instinct MI300X "Projected" Pricing

AMD's December 2023 launch of its new Instinct series accelerators has generated plenty of tech news buzz and excitement within the financial world, but few folks are privy to Team Red's MSRP for the CDNA 3.0-powered MI300X and MI300A models. A Citi report has pulled back the curtain, albeit with "projected" figures—an inside source claims that Microsoft has purchased the Instinct MI300X 192 GB model for ~$10,000 apiece. North American enterprise customers appear to have taken delivery of the latest MI300 products around mid-January—inevitably, closely guarded information has leaked out to reporters. Seeking Alpha's article (based on Citi's findings) alleges that Microsoft's data center division is AMD's top buyer of MI300X hardware—GPT-4 is reportedly up and running on these brand-new accelerators.

The leakers claim that businesses further down the (AI and HPC) food chain are having to shell out $15,000 per MI300X unit, but even this is a bargain when compared to NVIDIA's closest competing package—the venerable H100 SXM5 80 GB professional card. Team Green, similarly, does not reveal its enterprise pricing to the wider public—Tom's Hardware has kept tabs on H100 insider info and market leaks: "over the recent quarters, we have seen NVIDIA's H100 80 GB HBM2E add-in-card available for $30,000, $40,000, and even much more at eBay. Meanwhile, the more powerful H100 80 GB SXM with 80 GB of HBM3 memory tends to cost more than an H100 80 GB AIB." Citi's projection has Team Green charging up to four times more for its H100 product when compared to Team Red's MI300X pricing. NVIDIA's dominant AI GPU market position could be challenged by cheaper yet still very performant alternatives—additionally, chip shortages have caused Jensen & Co. to step outside their comfort zone. Tom's Hardware reached out to AMD for comment on the Citi pricing claims—a company representative declined the invitation.
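As a quick sanity check of Citi's "up to four times" figure, here is a back-of-the-envelope sketch in Python that uses only the numbers quoted above (all of them projections or reported street prices, not confirmed MSRPs):

# Rough price-ratio check using only the figures reported in this article.
# These are projections/street prices, not confirmed MSRPs.
mi300x_prices = {"Microsoft (volume)": 10_000, "smaller customers": 15_000}
h100_street_range = (30_000, 40_000)  # H100 80 GB AIC street prices per Tom's Hardware

for buyer, mi300x_price in mi300x_prices.items():
    low = h100_street_range[0] / mi300x_price
    high = h100_street_range[1] / mi300x_price
    print(f"H100 vs. MI300X ({buyer}): {low:.1f}x to {high:.1f}x")

# Prints:
# H100 vs. MI300X (Microsoft (volume)): 3.0x to 4.0x
# H100 vs. MI300X (smaller customers): 2.0x to 2.7x

Note that the 4x ceiling only holds against Microsoft's reported volume price; for customers paying $15,000, the gap narrows to roughly 2-2.7x.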
Sources: Seeking Alpha, Tom's Hardware, MSN Money

25 Comments on Financial Analyst Outs AMD Instinct MI300X "Projected" Pricing

#1
Guwapo77
I had expected the price to be double what they are charging for it, tbh. AMD is really trying to kick the door down with this pricing. I wonder how long GPT-4 has been up and running on these accelerators. This is a program I would like to see run on both AMD and Nvidia; I'm curious to see what the differences are in real-world scenarios.
#2
PerfectWave
Wow, really excited about this news :roll:
#3
Vya Domus
They're seriously undercutting Nvidia; getting an almost 200 GB GPU for $10K is nuts.
#4
Event Horizon
Once AMD gets a taste of that sweet AI money they'll deprioritize gamers even further just like NVIDIA has.
#5
john_
AMD needs to create a success story for its product, and that means sales, not profits. Profits will come later, when more and more people are posting projects based on the MI300 series and others start looking at AMD's products as a true alternative to Nvidia's.
Let's not forget that Intel probably still sells more Xeons while EPYC is the superior option.
#6
Vya Domus
john_: Let's not forget that Intel probably still sells more Xeons while EPYC is the superior option.
I doubt that's true anymore; Xeons have been far worse products for far too long for that to be the case. I'm sure they still have way more market share, but as far as currently sold products go, I'd be amazed if they still outsold AMD.
#7
Daven
Vya Domus: I doubt that's true anymore; Xeons have been far worse products for far too long for that to be the case. I'm sure they still have way more market share, but as far as currently sold products go, I'd be amazed if they still outsold AMD.
The product mix is also important. Intel sells Gold and Platinum Xeons that connect together in 4- and 8-socket systems at a major premium. AMD only sells to the 1- and 2-socket markets, which have the lion's share of servers.
#8
ncrs
Vya Domus: I doubt that's true anymore; Xeons have been far worse products for far too long for that to be the case. I'm sure they still have way more market share, but as far as currently sold products go, I'd be amazed if they still outsold AMD.
AMD is limited by TSMC's capacity, which is shared among basically every major chip vendor, while Intel can produce its own chips. This problem is going to get worse, since Intel is also moving some of its products to TSMC, including CPUs (at least partially, as seen in Meteor Lake).
#9
Wirko
PerfectWave: Wow, really excited about this news :roll:
Hah, you'll soon be able to buy your biggest APU ever, for a lower price than you could ever imagine!

But more seriously, even if the price of a myriad (= ten thousand) dollars is accurate, there may be more to this story. This is not a retail market. Microsoft may have paid much in advance to help finance the development - they wouldn't be the first to do that. Maybe they wanted the product early and accepted the role of customer-as-beta-tester (THEY ARE MICROSOFT, AFTER ALL!).
#10
thesmokingman
Wirko: Hah, you'll soon be able to buy your biggest APU ever, for a lower price than you could ever imagine!

But more seriously, even if the price of a myriad (= ten thousand) dollars is accurate, there may be more to this story. This is not a retail market. Microsoft may have paid much in advance to help finance the development - they wouldn't be the first to do that. Maybe they wanted the product early and accepted the role of customer-as-beta-tester (THEY ARE MICROSOFT, AFTER ALL!).
They're half the high cost of the H100. In volume, institutions can get that down further. But make no freaking mistake, these are faster and half the cost. This news is actually old; it just takes this long for the MSM to wrap their brains around it.

In a way, it's probably also slowing down everyone's plans to create their own AI silicon. Lmao, Nvidia was driving the industry to go its own way and not feed a supplier's 75% margin, smh.
#11
john_
Vya Domus: I doubt that's true anymore; Xeons have been far worse products for far too long for that to be the case. I'm sure they still have way more market share, but as far as currently sold products go, I'd be amazed if they still outsold AMD.
Even if most supercomputers are built around EPYC, there are plenty of IT departments out there that probably still prefer to take the "safe route" of Xeons. I call it the "safe route" not because I believe it is one, but because they do, having worked so many years with Xeons before. While they are professionals, they probably aren't any less vulnerable than anyone in here to fearing change. I am used to AMD hardware, so I keep buying AMD hardware. Others are used to Intel hardware, so they keep buying Intel hardware. Many IT departments that fear change, or think that changing platforms will mean extra work and learning for them, will keep buying Xeons. Also, Nvidia choosing Sapphire Rapids for the DGX H100 was proof that Xeons can still be considered competitive and valid options (OK, Nvidia probably had other reasons too, like not promoting the products of its main competitor in the GPU business). So, I think small servers are still built mostly around Xeons.
And then there is AMD's capacity problem, or maybe a lack of will to take chances and pay TSMC for extra capacity, out of fear that they could end up with a huge unsellable inventory. Meaning AMD can't flood the world with EPYC CPUs even if there is huge demand.
#12
Daven
It seems Nvidia does a lot of software development in-house to create a proprietary ecosystem. That contributes to its R&D expenditures, so it makes sense that they are charging a lot more.

AMD, on the other hand, depends more on open-source architectures and software, which shifts more of the cost to the community rather than to AMD itself. This lowers AMD's expenditures, so it can charge less for its hardware.

Both strategies have merit, but I always hope for the open-source route in all things.
#13
mb194dc
They'll do anything to keep the hype going to support the stock price...
#14
Jism
Vya Domus: I doubt that's true anymore; Xeons have been far worse products for far too long for that to be the case. I'm sure they still have way more market share, but as far as currently sold products go, I'd be amazed if they still outsold AMD.
If you have built your infrastructure on Xeons, you can't just swap to a different vendor like that. It requires testing and validating, which costs more than just replacing the whole thing.

I recently swapped out Xeons for EPYCs and, geezus christ, EPYCs are blowing Xeons out of the water. More cores for less, superior performance per watt, doing more for less, and, more importantly, you can cut down on the number of servers for certain applications, since EPYC has single-socket chips with up to 128 cores and 256 threads.

One EPYC server is now replacing two Xeon servers here.
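To put rough numbers on that consolidation, here is a toy estimate; the Xeon configuration is hypothetical, purely for illustration:

# Toy server-consolidation estimate. The old Xeon config is an assumption
# for illustration, not the actual hardware being replaced.
old_servers = 2
cores_per_old_server = 2 * 28   # e.g., dual-socket 28-core Xeons (assumed)
new_cores = 128                 # single-socket EPYC (Bergamo-class), 256 threads

total_old_cores = old_servers * cores_per_old_server
print(f"{total_old_cores} cores across {old_servers} old servers")
print(f"{new_cores} cores in one socket: {new_cores / total_old_cores:.2f}x the cores, half the boxes")

# Prints:
# 112 cores across 2 old servers
# 128 cores in one socket: 1.14x the cores, half the boxes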
#15
Daven
Jism: If you have built your infrastructure on Xeons, you can't just swap to a different vendor like that. It requires testing and validating, which costs more than just replacing the whole thing.

I recently swapped out Xeons for EPYCs and, geezus christ, EPYCs are blowing Xeons out of the water. More cores for less, superior performance per watt, doing more for less, and, more importantly, you can cut down on the number of servers for certain applications, since EPYC has single-socket chips with up to 128 cores and 256 threads.

One EPYC server is now replacing two Xeon servers here.
This is one of the most insightful comments I have read lately. A similar thing is also happening in some instances, with compute GPUs replacing the need for many EPYCs and Xeons.

Intel is getting hit twice in this way: being replaced by both EPYCs and compute GPUs.
#16
Jism
Daven: This is one of the most insightful comments I have read lately. A similar thing is also happening in some instances, with compute GPUs replacing the need for many EPYCs and Xeons.

Intel is getting hit twice in this way: being replaced by both EPYCs and compute GPUs.
AMD is a serious competitor whether folks like it or not. I've always wondered about EPYC performance, and yeah, at this point I'm thoroughly convinced by its performance per watt. It's something completely different from its previous lines of chips, like the Bulldozer series.

The big money is obviously in AI - by bringing out the MI300X, they again have a serious competing product that is priced far lower than Nvidia's part and performs better.

People who worry about AMD shifting its focus solely to compute rather than gaming GPUs are wrong. CDNA and RDNA will simply continue; they're technically both the same, except CDNA has been stripped of anything VGA-related.

If anyone bought AMD stock a few years ago, watch it skyrocket. Telling you.
#17
thesmokingman
Vya Domus: I doubt that's true anymore; Xeons have been far worse products for far too long for that to be the case. I'm sure they still have way more market share, but as far as currently sold products go, I'd be amazed if they still outsold AMD.
Continuing to stay on Xeons in this day and age is a great way to get fired.
#18
Random_User
And now, folks, behold the actual hero of the occasion, the one that feeds your Copilot datamining advertiser :p.
Vya Domus: They're seriously undercutting Nvidia; getting an almost 200 GB GPU for $10K is nuts.
They'd better do the same for consumer products as well... But who am I fooling?
Event Horizon: Once AMD gets a taste of that sweet AI money they'll deprioritize gamers even further just like NVIDIA has.
They already did it... years ago. Products like this are not made in one day. It was probably in development for a while, which means that AMD planned its current financial strategy a long time ago.
And this is only the beginning. They've abandoned the gaming/consumer markets. They don't even care. The recent 8000G STAPM issue is just more proof of that.
They just occasionally drop some stuff for the client/consumer market, so as not to upset the fanbase completely. Some crumbs for the plebs.
Daven: It seems Nvidia does a lot of software development in-house to create a proprietary ecosystem. That contributes to its R&D expenditures, so it makes sense that they are charging a lot more.

AMD, on the other hand, depends more on open-source architectures and software, which shifts more of the cost to the community rather than to AMD itself. This lowers AMD's expenditures, so it can charge less for its hardware.

Both strategies have merit, but I always hope for the open-source route in all things.
Still, there's nothing stopping AMD from contributing to the community and creating the tools and software required for their ecosystem, even if it's open source. Right now, it feels like AMD uses the open-source facade just to cheap out on R&D. It's seemed that way for the last decade or so; nothing new.
Jism: If you have built your infrastructure on Xeons, you can't just swap to a different vendor like that. It requires testing and validating, which costs more than just replacing the whole thing.

I recently swapped out Xeons for EPYCs and, geezus christ, EPYCs are blowing Xeons out of the water. More cores for less, superior performance per watt, doing more for less, and, more importantly, you can cut down on the number of servers for certain applications, since EPYC has single-socket chips with up to 128 cores and 256 threads.

One EPYC server is now replacing two Xeon servers here.
They have to keep up the momentum. At this pace, AMD would really benefit from a foundry or two of its own.
#19
Nordic
Event Horizon: Once AMD gets a taste of that sweet AI money they'll deprioritize gamers even further just like NVIDIA has.
AMD would make more money, invest more in R&D, and gamers would benefit from further advancements.
#20
Minus Infinity
Nordic: AMD would make more money, invest more in R&D, and gamers would benefit from further advancements.
And the more revenue they have, the less pressure they are under to charge extortionate prices: they can afford to compete on price. Huang doesn't give a fig though; he'll gouge us no matter how many gazillions he makes.
#21
kondamin
Vya Domus: They're seriously undercutting Nvidia; getting an almost 200 GB GPU for $10K is nuts.
Isn't it more that Nvidia is charging whatever they want and AMD is coming to market with a more realistic price? $10K isn't anything to sneeze at.
#22
SOAREVERSOR
Event Horizon: Once AMD gets a taste of that sweet AI money they'll deprioritize gamers even further just like NVIDIA has.
mUh g4m1ng PC!!!! is not some super special class of item that companies should make less money on, or even lose money on, just to support its fanbois.

It's also sort of silly to say AMD is deprioritizing gamers. When you look at all the consoles using AMD, or the Steam Deck and the array of clones that followed it, it's quite clear that AMD is fully interested in pushing solutions for gaming.

The kicker is that GPU advances aren't cheap, which means that the constant screaming for faster dedicated GPUs has put PC gaming on a fast track to a world where dedicated GPUs are going to be for cloud gaming providers. Yet SoCs, APUs, and iGPUs are getting better and better all the time.
#23
Denver
Nordic: AMD would make more money, invest more in R&D, and gamers would benefit from further advancements.
Yeah, AMD tends to synergistically integrate advances across its different products:

For example, RDNA is now integrated into servers, PCs, consoles, handhelds, laptops, and even smartphones.
#24
watzupken
SOAREVERSOR: mUh g4m1ng PC!!!! is not some super special class of item that companies should make less money on, or even lose money on, just to support its fanbois.

It's also sort of silly to say AMD is deprioritizing gamers. When you look at all the consoles using AMD, or the Steam Deck and the array of clones that followed it, it's quite clear that AMD is fully interested in pushing solutions for gaming.

The kicker is that GPU advances aren't cheap, which means that the constant screaming for faster dedicated GPUs has put PC gaming on a fast track to a world where dedicated GPUs are going to be for cloud gaming providers. Yet SoCs, APUs, and iGPUs are getting better and better all the time.
For-profit companies will always tilt towards the areas that give them the maximum profit margin. AMD has to find ways to sell its products, and consoles, for example, are one avenue for steady income. But if AI chips become their priority, it will surely impact their focus on gaming. For example, since their EPYC data center chips are the most lucrative, it actually impacts their priorities when it comes to fab allocation. Hence, I feel this is why they tend to struggle to produce enough consumer-class CPUs.
#25
Leiesoldat
lazy gamer & woodworker
Minus Infinity: And the more revenue they have, the less pressure they are under to charge extortionate prices: they can afford to compete on price. Huang doesn't give a fig though; he'll gouge us no matter how many gazillions he makes.
Nvidia is an equal-opportunity gouger: the government, the corporate world, and regular consumers all get screwed over. Nvidia either didn't bid on or lost out on the three exascale supercomputers in the USA (Frontier [ORNL], Aurora [ANL], and El Capitan [LLNL]).