Wednesday, February 28th 2024
NVIDIA Accused of Acting as "GPU Cartel" and Controlling Supply
NVIDIA, the single most important supplier fueling the AI frenzy, is facing accusations of acting as a "GPU cartel" and controlling supply in the data center market, according to statements made by executives at rival chipmaker Groq and former AMD executive Scott Herkelman. In an interview with the Wall Street Journal, Groq CEO Jonathan Ross alleged that some of NVIDIA's data center customers are afraid even to meet with rival AI chipmakers, fearing that NVIDIA will retaliate by delaying shipments of GPUs they have already ordered. This is despite NVIDIA's claims that it tries to allocate supply fairly during global shortages.

"This happens more than you expect, NVIDIA does this with DC customers, OEMs, AIBs, press, and resellers. They learned from GPP to not put it into writing. They just don't ship after a customer has ordered. They are the GPU cartel, and they control all supply," said Scott Herkelman, former Senior Vice President and General Manager of AMD Radeon, in response to the accusations on X/Twitter. The comments reference the NVIDIA GeForce Partner Program (GPP) from 2018, which was abandoned following backlash over its exclusivity requirements; Herkelman suggests NVIDIA has continued similar practices but now avoids written agreements.

The Wall Street Journal report also noted that major tech companies such as Microsoft, Google, and Amazon are developing their own AI accelerators while downplaying them as NVIDIA competitors, further pointing to an environment in which NVIDIA is seen as controlling access to key technology for AI development. NVIDIA currently powers around 80% of AI development worldwide, giving it enormous influence over a strategic technology, and the accusations from Groq and Herkelman suggest the company is willing to leverage that market position aggressively to protect its dominance.

NVIDIA has not officially responded to the latest accusations, but the reports have fueled speculation about anticompetitive practices just as regulatory scrutiny of tech giants' market power grows. NVIDIA will likely face pressure to address transparently whether its supply allocation favors some customers over others based on their relationships with rival chipmakers.
Sources:
The Wall Street Journal, Scott Herkelman (X/Twitter), via VideoCardz
130 Comments on NVIDIA Accused of Acting as "GPU Cartel" and Controlling Supply
Power draw is a moot point depending on the user, and here's why: those who drop 5k on a system generally care very little about efficiency. Also, those power figures are under load, not daily use. Nvidia is much more efficient, sure, but don't think you're saving 150 W by going with Nvidia. Again, you are cherry-picking your point. Are you also one of those folks who think a 4090 is a good value for gaming? Lol
Power draw is not a moot point. Players don't want an additional 100-150 watts slammed in their face while gaming, or the additional noise, whether louder fans or coil whine, since many use open headsets, at least at home, and mostly closed ones at tournaments. Also, Nvidia has better minimums and higher average fps, and just fewer issues in general.
Once again, you have zero knowledge about esports. I've worked closely with many top players and teams, been at 100+ tournaments all over the world, and built tournament rigs. I know for a fact that pretty much none of them use AMD GPUs. Most want Reflex + LDAT support, and AMD has nothing that even comes close.
You can act like AMD is "just as good" all you want; in reality, they don't even have 2% of the pro base when talking GPUs, and the Steam HW Survey also tells me that 8 out of 10 players buy Nvidia - store.steampowered.com/hwsurvey/videocard/
Let's stop dreaming now.
:roll:
Awesome conversation. Feel free to go back to your mom's basement and come out next time there's an Nvidia article.
Power draw is a moot point: the 7900 XTX uses on average 50-60 watts more than an RTX 4080, not 150, and sponsored players aren't going to care about another 50-60 watts during gaming (rough cost math below).
tpucdn.com/review/amd-radeon-rx-7900-xtx/images/power-gaming.png
www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html
As for minimum FPS, the 7900 XTX is higher; average FPS is as well.
tpucdn.com/review/asrock-radeon-rx-7900-xtx-taichi-white/images/minimum-fps-1920_1080.png
tpucdn.com/review/asrock-radeon-rx-7900-xtx-taichi-white/images/average-fps-1920-1080.png
www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi-white/34.html
So who have you built PCs for? You're claiming you know about e-sports, yet you're bringing up nonsense points.
Also, the Steam hardware survey isn't as accurate as Nvidia users want to insist it is: the polling is random and Valve doesn't publish its methodology. And of course Nvidia has very effective marketing, convincing people they need proprietary Nvidia features to play games while sponsoring people to use Nvidia GPUs, so the average PC gamer will only want to buy from Nvidia.
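For scale, here's a minimal back-of-the-envelope sketch of what that 50-60 W delta works out to over a year; the three hours of gaming per day and the $0.15/kWh electricity price are assumptions for illustration, not figures from the thread.

```python
# Rough sketch: what an extra ~60 W during gaming costs over a year.
# Assumed inputs (not from the thread): 3 hours of gaming per day and
# $0.15 per kWh -- adjust both to your own situation.

EXTRA_WATTS = 60        # approximate 7900 XTX vs RTX 4080 gaming delta cited above
HOURS_PER_DAY = 3       # assumed daily gaming time
PRICE_PER_KWH = 0.15    # assumed electricity price in USD

extra_kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 365
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")   # ~66 kWh
print(f"Extra cost:   ${extra_cost_per_year:.2f}/year")     # ~$10
```

Whether roughly $10 a year (or the extra noise and heat) matters is exactly the judgment call the posts above disagree on.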
Stop talking about the FPS charts.
We're just discussing the details now ;)
These companies are direct competitors of NVIDIA, and they will do whatever it takes to harm NVIDIA's business!
Are there any complaints from Meta (which recently ordered 350,000 NVIDIA GPUs), Amazon, or X?
Groq and AMD did not publish results for the well-known MLPerf benchmarks for the accelerators they design and sell.
Try to find any MLPerf benchmarks for the accelerators (not CPUs) of either company at:
mlcommons.org/benchmarks/training
You were warned to do so in an earlier post by a moderator.
In addition, it's a chiplet-based architecture, so costs don't blow up with die size the way they do for a monolithic design (rough yield math after this comment). The actual GPU die (GCD) on Navi 31 is 304 mm².
It's akin to saying AMD is losing in the enterprise and consumer CPU markets because their total die size is large, which completely ignores the differences in architecture. As for mistakes, I have some for you:
1) 3000 series feeding noise back and causing certain PSUs to trip
2) 12VHPWR, enough said
3) High DPC latency bug that's been around for over a decade now (only fixed on the 3000 series as of right now)
4) 970 3.5GB
You act as if AMD is the only one that makes mistakes. You do realize AMD was almost bankrupt back when they pulled out of the high-end GPU market, right? The 8700 XT? You're basing that one entirely on a rumor. In the context of this article, both of those are irrelevant; we are talking about AI, not consumer GPUs.
Mind you, both AMD and Intel do offer ray tracing, so saying other companies don't offer it is patently false. Or take any of Nvidia's mobile GPUs from the last few generations, or their low-end GPUs that do a bait-and-switch with the memory configuration. 50-70% seems extremely reasonable compared to the 3-5x claims Nvidia was making. I can't wait to see your proportionally greater outrage at Nvidia because, as we are all aware, you treat both equally.
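To put the die-size argument in numbers: below is a minimal sketch using the common Poisson yield approximation Y = exp(-D*A). The defect density, wafer cost, and the hypothetical ~530 mm² monolithic comparison die are illustrative assumptions, not figures from the article or this thread; the only point is that cost per good die grows faster than die area.

```python
import math

# Minimal sketch of why cost rises faster than linearly with die area,
# using the simple Poisson yield model Y = exp(-D * A). Defect density,
# wafer cost, and the 530 mm^2 "monolithic" die are illustrative assumptions.

D = 0.07                              # assumed defects per cm^2
WAFER_COST = 17000                    # assumed 300 mm wafer cost, USD
WAFER_AREA_CM2 = math.pi * 15 ** 2    # ~707 cm^2, ignoring edge losses

def cost_per_good_die(area_mm2: float) -> float:
    """Approximate cost of one defect-free die of the given area."""
    area_cm2 = area_mm2 / 100
    yield_rate = math.exp(-D * area_cm2)        # Poisson yield model
    dies_per_wafer = WAFER_AREA_CM2 / area_cm2  # crude, ignores scribe lines
    return WAFER_COST / (dies_per_wafer * yield_rate)

# 304 mm^2 GCD (Navi 31) vs a hypothetical ~530 mm^2 monolithic die
for area in (304, 530):
    print(f"{area} mm^2: ~${cost_per_good_die(area):.0f} per good die")
# The larger die ends up costing about 2x, even though it is only ~1.7x the
# area, because yield drops exponentially as area grows.
```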
AMD has a long way to go in the marketing department, it seems. Even longer than in the engineering one.
If people want change, they need to start supporting the other side; I have said it a million times and will keep saying it. AMD is currently very competitive with Nvidia except at the very top (the 4090, which is a very high-priced card).
No excuses for nVidia, but I gotta wonder what prompted bringing this up?
From my observations of industries, if you're the market leader, it's de facto 'normal' to act as a cartel.
So, whose Cheerios did nVidia piss in to get this into the news circuit?
Yields on the N5 node are around 90%, so the roughly 10% of dies with defects are not discarded but are collected for months in order to create a new product line (rough numbers below).
Plus, Nvidia has not yet dared to release a chiplet-based GPU, which is a more complex task to accomplish.
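A rough sketch of the "collected for months" point above, assuming the 90% yield stated; the monthly wafer volume, the 304 mm² die size borrowed from earlier in the thread, and the salvage fraction are illustrative assumptions only.

```python
import math

# Rough sketch of how many defective-but-salvageable dies pile up when
# yield is ~90%. Wafer volume, die size, and salvage fraction are
# assumptions for illustration, not known production figures.

YIELD = 0.90                          # figure cited in the comment above
DIE_AREA_MM2 = 304                    # Navi 31 GCD area mentioned earlier
WAFER_AREA_MM2 = math.pi * 150 ** 2   # 300 mm wafer, ignoring edge losses
WAFERS_PER_MONTH = 10_000             # assumed production volume
SALVAGE_FRACTION = 0.5                # assume half of defective dies can be cut down

dies_per_wafer = WAFER_AREA_MM2 / DIE_AREA_MM2
defective_per_month = WAFERS_PER_MONTH * dies_per_wafer * (1 - YIELD)
salvageable_per_month = defective_per_month * SALVAGE_FRACTION

print(f"Dies per wafer:        ~{dies_per_wafer:.0f}")          # ~230
print(f"Defective per month:   ~{defective_per_month:,.0f}")    # ~230,000
print(f"Salvageable per month: ~{salvageable_per_month:,.0f}")  # ~115,000
# Under these assumptions, a cut-down SKU accumulates well over 100k usable
# dies in a single month -- more than enough to justify its own product line.
```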
I'm pretty sure EVGA would silently agree with the cartel language, plus add a few more choice words :laugh:
"Efficiency? Who cares about efficiency? I have 8 more cores than you (E cores but let's not talk about it) and I am 1.5% faster than you(at 100W+). You slooooooOOOOOOOOOooooooooow!"