
NVIDIA GeForce RTX 50 Cards Spotted with Missing ROPs, NVIDIA Confirms the Issue, Multiple Vendors Affected

Nvidia has been treating users like fools since the TNT2 era. We have seen plenty over the years with the cards it has produced, so I am not surprised at all that these problems are happening. They produce cards with the same logic as American fast food: produce them immediately, release them to the market immediately, make lots of money immediately.

Lots of bugs and problems. I guess quality has taken a back seat.
And yet it didn't stop you from buying 4070 Ti Super.
 
I think it's obvious to everyone that "0.5%" is a blatant fabrication - but who can prove otherwise? They can claim it's only 0.01%, that out of thousands of cards sold there's only a handful of cases, and not all of them are confirmed - in fact, all of them are more or less anecdotal.

So move along, nothing to see here, we congratulate ourselves on the industry's best quality control - but you might not receive the whole card, your cable might melt, your capacitors might burn up, your PC might black-screen for no apparent reason...

All of that is of course avoidable through our "Verified Priority Un-Access" - where by signing up you will be put on a list that has no connection with card availability whatsoever, thus ensuring your complete safety from faulty Blackwell products!
 
No - they just need to do some quality management.

Funny, that statement... "identified"... that some graphics cards...
* Only after public reports of out-of-spec graphics cards from multiple sources, and too much publicity, did NVIDIA "identify" that some graphics cards have issues. *
We will update this to calm the minds of our loyal customers (sheep), as we see fit. We will not go public beforehand; we update our statements only when our fraudulent products are found in the field.

I do not see any action being taken with regard to the faulty hardware. This is a long-term problem spanning many years, not a recent issue that fits within the usual 7 calendar days of quality management.
But QM costs money... so no. The customer can handle that ;)
 
Anyone who still believes Nvidia's lies deserves special shock treatment to wake them up.
you don't even really know what you're talking about /s


none of you really know what you're talking about /s


GET WITH THE PROGRAM. IT'S THE DAWN OF A NEW ERA.
 
The company at the forefront of AI couldn’t predict the disastrous “launch” of Blackwell.

Now it suddenly knows the exact percentage of affected cards. Playing it down much, eh?
 
The company at the forefront of AI couldn’t predict the disastrous “launch” of Blackwell.

Now it suddenly knows the exact percentage of affected cards. Playing it down much, eh?
Maybe Nvidia should use AI to figure out how many chips they shipped with missing ROPs.
They are downplaying it in the same way Intel did by staying quiet about it, which I think is much worse than admitting to the problem.
 
I'm sorry, it's just a copied Reddit post:

MSRP = Missing Some ROPs Possibly.
 
Rumor has it the AI lost some ROPs and 32-bit PhysX while trying to spell raspberry successfully. The AI raspberry war of attrition. Relax folks, it was only 5% at one point.
 
On the 5070 (Ti) this might be more than a 10% performance impact.

I think hardwareluxx.de member Zeitgeist (German for "spirit of the times") claimed his faulty card had around an 11% penalty. Even the site itself quoted that user.
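That ballpark is consistent with simple arithmetic. If one 8-ROP partition is fused off, the deficit as a share of each SKU's nominal ROP count can be worked out directly. A minimal sketch - the ROP counts below are my assumption from public spec listings, so verify them against the TechPowerUp GPU database:

```python
# Hypothetical sketch: percentage of ROPs lost if one 8-ROP partition is
# disabled. SKU ROP counts are assumptions from public spec listings;
# verify against the TechPowerUp GPU database before relying on them.
NOMINAL_ROPS = {
    "RTX 5090": 176,
    "RTX 5080": 112,
    "RTX 5070 Ti": 96,
    "RTX 5070": 80,
}
MISSING = 8  # one ROP partition

for sku, rops in NOMINAL_ROPS.items():
    deficit = MISSING / rops * 100
    print(f"{sku}: {rops} -> {rops - MISSING} ROPs ({deficit:.1f}% fewer)")
```

The smaller the SKU, the bigger the relative hit (10% on an 80-ROP part vs ~4.5% on a 176-ROP part). A ROP deficit does not translate one-to-one into frame rate, but an observed ~11% penalty in ROP-bound scenarios is in the same ballpark.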
 
Maybe Nvidia should use AI to figure out how many chips they shipped with missing ROPs.
Maybe AI-generated content in the project was the issue - just like AI still has issues with hands.
 
Missing ROPs is not a problem; AI can rebuild the missing ones at the software level.
 
this generation is an absolute mess all around.
 
"We've identified"? Still taking credit for other people's work and reporting. Keep going Nvidia!


I don't need to believe, I know they're liars:
[image attachment]
No "*", no small print, just a lie
Wow, just wow - the word "duplicitous" comes to mind... The leather-jacket CEO is emulating POTUS 45 & 47... 20% right is not good. Just because GDDR7 is faster than GDDR6X does not mean a 5070 has the same performance as a 4090...
 
That's a similar pitfall to the argument that the 5090 with missing ROPs is still much faster than the competition, so "it's fast enough".
Swapping out features (and very common, well-tested ones at that) for "a better experience" lacks innovation. And it probably costs the same - instead of going to R&D, the funds were given to the marketing department.
Imagine getting a car that comes without a radio and being told "it's for a better driving experience". It works for some niches (Ferrari F40), but iPhones, although not cheap, still find their way to a bigger, mainstream audience.


So, kinda like DLSS...

[image attachment]
I wouldn't say it's comparable to the iPhone (the first one was widely agreed to be bad, but the form factor decided the future, even if the likes of Ballmer/BlackBerry didn't think so). Many of the advantages that are/were associated with Android are often quality-of-life niceties rather than absolutely mandatory features. There's parity when it comes to the most used apps. Entry-level Android phones were the biggest beneficiaries of upgradable storage because they shipped with an unusable amount of base storage (512 MB in 2010 vs 8 GB for a flagship - those became borderline unusable after 3 months). For many regular users, the iPhone does exactly what it needs to do to serve its function. And you get a bonus if you happen to be a Mac user or use other Apple products. They are also pretty fashionable, with tons of third-party accessories, which can't always be said of the competition. Phones quickly went beyond the utilitarian aspect to become an everyday accessory.

iOS is/was odd in some aspects compared to Android, but not in a way that makes the product unusable. Unlike entry-level Android phones of the 2010s - those things were absolute crap. Windows phones were so much better in that sector but suffered from the lack of apps. Yes, bendgate and antennagate happened, but Samsung's battery-gate was probably the biggest fuck-up that industry ever saw, with people getting injured. But they recovered from that.

It's not about whether you ever fuck up, but rather how often, how dangerous it is, and whether you take the necessary steps to fix it.

Now, for photography/video, every phone is software-enhanced; they don't have a choice: the sensor is small, and most people don't want to fiddle with settings or touch up the pictures themselves to get something that looks great. The Pixel is so good because of the software. Sony makes phones that are superior in that aspect (bigger sensor), but the automatic mode is so basic you don't really see the benefits in a side-by-side comparison.

Nowadays the iPhone cameras feel a bit basic - it's your regular image-processing stuff focusing on exposure, noise reduction, and dynamic range. Android makers have been going bananas with AI-enhanced photography:
How Samsung Galaxy Cameras Combine Super Resolution Technologies With AI Technology to Produce High-Quality Images of the Moon – Samsung Mobile Press
Improve Photos with Pixel’s AI Camera Technology - Google Store


It's the first GPU generation in a while where every SKU seems to have some kind of issue; before, you were safe if you didn't buy the high-end from Nvidia.
 
TechPowerUp

Can TPU do tests comparing the AI performance of different hardware from different manufacturers (Nvidia, AMD, Intel, etc.)?
It would be great to compare the AI performance of Nvidia hardware with that of other manufacturers.
There is a lot of dedicated hardware for AI calculations, but we have no idea how each one performs in practice.
 
Can TPU do tests comparing the AI performance of different hardware from different manufacturers (Nvidia, AMD, Intel, etc.)?
It would be great to compare the AI performance of Nvidia hardware with that of other manufacturers.
There is a lot of dedicated hardware for AI calculations, but we have no idea how each one performs in practice.
I'm sure there's a Puget Systems equivalent for AI benchmarking; it's a search away. Benchmarking all these games and cards is work enough, and this platform is predominantly for gaming.
 
We need some kind of test that would catch a driver lying about the card's actual resources.
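In practice that is roughly how the missing ROPs were caught: a reporting tool (GPU-Z) read back the hardware's advertised unit counts and compared them to the SKU spec. A minimal sketch of such a check - the `EXPECTED_ROPS` values and the passed-in `reported_rops` are assumptions for illustration; a real tool would read the count via a query API (e.g. NVAPI) rather than take it as an argument:

```python
# Hypothetical spec-conformance check. EXPECTED_ROPS values are assumptions
# from public spec listings; reported_rops would come from a real query tool
# (GPU-Z, NVAPI, etc.), not from this stub.
EXPECTED_ROPS = {"RTX 5090": 176, "RTX 5070 Ti": 96}

def check_rops(sku: str, reported_rops: int) -> str:
    expected = EXPECTED_ROPS[sku]
    if reported_rops == expected:
        return "OK"
    deficit = (expected - reported_rops) / expected * 100
    return f"MISMATCH: {reported_rops}/{expected} ROPs ({deficit:.1f}% missing)"

print(check_rops("RTX 5090", 168))  # an affected card
print(check_rops("RTX 5090", 176))  # a healthy card
```

The limitation is obvious: this only catches a driver that reports the truth. Catching a driver that lies would need an independent measurement, e.g. a fillrate microbenchmark compared against the theoretical figure.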
 
I wouldn't say it's comparable to the iPhone (the first one was widely agreed to be bad, but the form factor decided the future, even if the likes of Ballmer/BlackBerry didn't think so). Many of the advantages that are/were associated with Android are often quality-of-life niceties rather than absolutely mandatory features. There's parity when it comes to the most used apps. Entry-level Android phones were the biggest beneficiaries of upgradable storage because they shipped with an unusable amount of base storage (512 MB in 2010 vs 8 GB for a flagship - those became borderline unusable after 3 months). For many regular users, the iPhone does exactly what it needs to do to serve its function. And you get a bonus if you happen to be a Mac user or use other Apple products. They are also pretty fashionable, with tons of third-party accessories, which can't always be said of the competition. Phones quickly went beyond the utilitarian aspect to become an everyday accessory.

iOS is/was odd in some aspects compared to Android, but not in a way that makes the product unusable. Unlike entry-level Android phones of the 2010s - those things were absolute crap. Windows phones were so much better in that sector but suffered from the lack of apps. Yes, bendgate and antennagate happened, but Samsung's battery-gate was probably the biggest fuck-up that industry ever saw, with people getting injured. But they recovered from that.

It's not about whether you ever fuck up, but rather how often, how dangerous it is, and whether you take the necessary steps to fix it.

Now, for photography/video, every phone is software-enhanced; they don't have a choice: the sensor is small, and most people don't want to fiddle with settings or touch up the pictures themselves to get something that looks great. The Pixel is so good because of the software. Sony makes phones that are superior in that aspect (bigger sensor), but the automatic mode is so basic you don't really see the benefits in a side-by-side comparison.

Nowadays the iPhone cameras feel a bit basic - it's your regular image-processing stuff focusing on exposure, noise reduction, and dynamic range. Android makers have been going bananas with AI-enhanced photography:
How Samsung Galaxy Cameras Combine Super Resolution Technologies With AI Technology to Produce High-Quality Images of the Moon – Samsung Mobile Press
Improve Photos with Pixel’s AI Camera Technology - Google Store


It's the first GPU generation in a while where every SKU seems to have some kind of issue; before, you were safe if you didn't buy the high-end from Nvidia.
The post that you quoted was a follow-up to another that was about how "bad decisions" kill customer bases. It wasn't meant to go on an ecosystem "deep dive".
I'm going to try and make this abundantly clear, this sort of stupid kills customer bases.
And I tried to show that Apple is kind of bulletproof, and so far NVIDIA seems to be as well. To do that, I gave examples of "bad decisions" Apple made in its iPhone lineup (over several generations). As you pointed out, the first one was bad, but they still managed to thrive. It goes against the saying "you only have one chance to make a good first impression".

And let's not forget: bad coverage if you "held the phone wrong" (iPhone 4).
"Gripping any mobile phone will result in some attenuation of its antenna performance, with certain places being worse than others depending on the placement of the antennas. This is a fact of life for every wireless phone. If you ever experience this on your iPhone 4, avoid gripping it in the lower left corner in a way that covers both sides of the black strip in the metal band, or simply use one of many available cases."

Reportedly, Steve Jobs was even kind enough to reply to some user emails - briefly, but straight to the point:

"Just avoid holding it that way."
The hubris in these statements did not receive the backlash it deserved.

But going after Apple in general: Louis Rossmann built a business (and YouTube channel) fixing Apple devices and showing people how Apple cuts corners and overcharges users who try to get their devices fixed.
Writing this, I just remembered another time NVIDIA screwed up big time and managed to get away with it:
 
Can TPU do tests comparing the AI performance of different hardware from different manufacturers (Nvidia, AMD, Intel, etc.)?
It would be great to compare the AI performance of Nvidia hardware with that of other manufacturers.
There is a lot of dedicated hardware for AI calculations, but we have no idea how each one performs in practice.
 
Anyone recommending 50-series at this point should be ashamed of themselves.
Do I get credit for all the times I said "don't buy 5090!"...LoL
 
Can TPU do tests comparing the AI performance of different hardware from different manufacturers (Nvidia, AMD, Intel, etc.)?
It would be great to compare the AI performance of Nvidia hardware with that of other manufacturers.
There is a lot of dedicated hardware for AI calculations, but we have no idea how each one performs in practice.
TPU already does AI tests in the GPU compute tests as seen in https://www.techpowerup.com/review/msi-geforce-rtx-5090-suprim/36.html , for example.
 
Jensen and the scalpers can keep the entire 50 series GPU's :roll:
They can all enjoy the Multi Flame Gen.
 
The cynic in me reckons that this debacle, even combined with all the other ones around the launch of this series, will hardly even make a dent.
 
This is either a very weird driver glitch, or NVidia has a very serious problem.
This must be much lower-level than the driver.
It comes down to the binning process, where defective ROP partitions are "fused off". I can only come up with two possible explanations:
1) The affected GPUs are a lower bin that was somehow combined into the same SKU, and the missing ROPs really are defective.
2) The affected GPUs have 8 fully working ROPs fused off unintentionally, which makes the GPU avoid using them. An Nvidia engineer would be able to explain exactly how this works at the hardware level, but it's one of two things:
a) Somehow "burned" into the chip, so no firmware update can change it.
b) Controlled by firmware - but in that case they should have fixed it instead of taking returns (and now with multiple models affected…).
Either way, it's not a driver issue.
Is there anything I've missed?

Also, every finalized graphics card is run through extensive validation by the AIB partners; it surprises me that none of those checks validates that the reported hardware matches the spec.
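Such a check need not even trust the reported unit counts. A hypothetical end-of-line sanity test could compare a measured pixel fillrate against the theoretical value (ROPs × boost clock) and flag boards that fall short. All numbers below are assumptions for illustration; a real test would read the clock and fillrate from the board under test:

```python
# Hypothetical AIB sanity check: flag a board whose measured pixel fillrate
# falls short of the theoretical figure (ROPs x boost clock) by more than a
# tolerance. The 2407 MHz boost clock and 176-ROP count are assumptions for
# illustration, not measured values.
def fillrate_ok(measured_gpix_s: float, rops: int, boost_mhz: float,
                tolerance: float = 0.02) -> bool:
    theoretical = rops * boost_mhz / 1000.0  # GPixel/s
    return measured_gpix_s >= theoretical * (1 - tolerance)

full = 176 * 2407 / 1000.0      # healthy card on a purely ROP-bound test
short = full * (168 / 176)      # card missing 8 of 176 ROPs (~4.5% low)
print(fillrate_ok(short, 176, 2407))  # -> False (flagged)
print(fillrate_ok(full, 176, 2407))   # -> True
```

The tolerance has to be tight: with a loose 5% margin, a 5090's ~4.5% deficit would slip through, which may be part of why routine validation never caught it.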

This is grounds for a class-action lawsuit, right? How soon can we expect one? Nvidia needs to get their shit together.
That would have to be done per country (or EU-wide), and after years of deliberation, once a settlement is reached, owners will get their ~$2 after lawyer fees.

I'll get a 9070XT in pure retaliation.
And most will quickly return when they get burned there too…

I think it's obvious to everyone that "0.5%" is a blatant fabrication - but who can prove otherwise?
Firstly, they do know the exact number, as this is a binning issue. But I have my doubts whether the reported figures are correct, considering very few units are in use and users have a very low probability of detecting this, so my expectation would be that the real figure is in the ~10% range. (That's just a qualified guess, so don't quote me on it.)

It is, however, always hard for the public to gauge how widespread an issue may be, especially with problems that may be tied to specific production batches and a few people shouting loudly in forums. A couple of generations ago, Nvidia got a tremendous amount of flak for the "space invaders" defect on certain RTX 2080s, which in the end turned out to be largely an EVGA issue (apart from the random occurrences that are normal with mass-produced graphics cards). Outright failure rates for graphics cards are still very low compared to e.g. motherboards, and even lower for CPUs. So we have every reason to expect a graphics card to be fully working, and we should continue to hold vendors to that standard.
 