
NVIDIA GeForce RTX 50 Cards Spotted with Missing ROPs, NVIDIA Confirms the Issue, Multiple Vendors Affected

Nvidia has been treating users like fools since the TNT2 era. We've seen plenty over the years from the cards they've produced, so I'm not surprised at all that these problems are happening. They make cards with the same American fast-food logic: produce them immediately, release them to the market immediately, make lots of money immediately.

Lots of bugs and problems. I guess quality has taken a back seat.
 
This is grounds for a class action lawsuit, right? How soon can we expect one? Nvidia needs to get their shit together. I'll get a 9070 XT in pure retaliation.
 
Nvidia made a statement only about the 5090(D) and 5070 Ti. Now the 5080 is affected, too. This proves they absolutely don't know the real numbers. That previously stated 0.5% of 5090(D)/5070 Ti units affected by the "anomaly" was bullshit, no doubt about it now.
 
I'm so disappointed by the Blackwell launch that I'm absolutely considering a 9070 XT, as long as it's not priced dumb. My 3080 Ti is... fine, but the 12 GB has been problematic at 4K in some games.
 
$6,000? Where did those sell so high?
I sold mine for under $5k and bought a 4090 Liquid X.
 
Wasn't that the one game from Amazon, the MMO (New World, thanks Google)? Because they didn't cap the frame rate in the main menu, which is a rookie mistake?

I know it doesn't comfort the casualties, but who trusts Amazon to make a game properly anyway...
The only rookie mistake was the design choices made for the GPU. It has never been, and will never be, the game developer's responsibility to limit frames to protect a GPU from doing more work than it can handle. I think they should limit FPS in menus (except the graphics settings page) and pause screens to save energy, but not to protect cards from self-destructing.
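Not that it excuses the hardware design, but the energy-saving cap is cheap to implement. A minimal sketch in Python (hypothetical helper names, just to show the idea; a real engine would hook this into its render loop):

```python
import time

def sleep_budget(frame_start: float, now: float, target_fps: float) -> float:
    # Time left in this frame's budget; zero if rendering already overran it.
    return max(0.0, 1.0 / target_fps - (now - frame_start))

def run_menu(render_frame, target_fps: float = 60.0, frames: int = 3) -> None:
    # Cap an otherwise uncapped menu loop by sleeping away unused frame budget.
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # draw the menu; usually takes far less than 1/60 s
        time.sleep(sleep_budget(start, time.perf_counter(), target_fps))
```

A menu drawing at thousands of FPS would instead spend most of each 16.7 ms frame asleep, which is exactly the power-saving behavior being argued for.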
 
I was wondering about graphics settings options once neural rendering etc. becomes more prevalent.

 
Paper launch, fake prices, fake frames, fake resolution, terrible generational performance uplift, bricked cards, flawed board design, defective chips, and no PhysX. But hey, at least we got real flames for our $2,000 (make that $6,000) card.
Thanks for repeating that for the few that haven't read it before, the other thousand times it's been said by someone thinking they are clever. :slap:
 
Heh, now even an RTX 5080 (Founders Edition) has been found with 104 ROPs instead of 112 (videocardz.com). Crazy. I'm sure it affects below 0.5% of cards. :shadedshu:
 
So I see we have Super models, GTS models, and RTX D models now. Too many cards in one generation... too much crap.
 
Imagine if the AI server Blackwell parts are also affected too, but there's no proper GPU-Z equivalent for them. Companies spending billions on defective GPUs, LUL. They also have problems with HBM3...
Those have very few ROPs and don't really use them for rendering, but either way, if something were disabled on them, I imagine customers would notice very quickly because the performance numbers wouldn't match up.

This proves they absolutely don't know the real numbers.
They're just lying, trying to downplay it. Every chip maker knows exactly what they're making. It's not like some guy looks at trays of chips, gives them a kick like he's checking a car, and goes "Ugh, looks fine to me, send them out".
 
No counterargument is required. Your statement was so lacking in merit and logic that any counterargument would be completely superfluous. Your statement argues itself into the realm of fanciful delusion, not worthy of any consideration. It is worthy only of being mocked.

No argument because you have none.

Selling something at $25,000 rather than at $2,000 is what all companies on the planet will do 100% of the time.

You seem to be under the impression that companies are stupidly greedy, and that they can charge anything for anything. That's wrong. In that economic view, the retort is that consumers can tell companies to go pound sand... because virtually everything on the planet has something that will replace it. Nvidia charges $2,000 for a card; people decide 40% of the performance at 20% of the price is good enough. Crypto goes insane... the model switches from proof of work to proof of stake, and suddenly the crypto mining card boom dies. Every market action lasts only until the imbalance is righted.

I'm going to try and make this abundantly clear: this sort of stupid kills customer bases. They will accept that you can get away with it now, but in two to three years they'll be looking for more hardware. They'll see your product and immediately associate you with price gouging. Whenever the AI market dives and you're back to selling GPUs to consumers at what has to be a reasonable price, you'll discover that they will not buy into your crap when given the choice. That's how companies post record profits, then three to four years later get bought out by competitors who won simply by not opening their mouths.


Let me also suggest that "peanuts" for consumer GPUs is misleading. Yes, it's not their primary business segment... but it's also how they can sell their silicon to virtually anybody. Yes, AI accelerators are legally restricted. At the same time, you can sell a boatload of "consumer" grade GPUs to countries adjacent to the ones you legally can't sell to, and watch the middlemen grow fat and rich sending silicon across the border at truly silly smuggling prices. Funny how countries with no huge gaming market have had silly demand for high-end GPUs in the past couple of years... no? All you have to do is inflate trade values a little to absorb the costs legally, then print money everywhere while on paper meeting your requirements not to ship to those countries. Seems... legit?



Let me end on an anecdote. My family used to be involved in selling bakery goods. Think Pillsbury's expensive brother. Think "fresh baked, hand made" muffins that were called what they were: scoop and bake. They sold a product that curb-stomped the competition, because consumers could not tell the difference between 30 seconds of labor and 45 minutes of baking antics. The money was good, because they sold flour and components, and about 4-6 months in they'd introduce bakeries to the scoop-and-bake product. A quarter of the labor cost, for a product that was superior and consistent. Yes please.

The thing is, for decades the management team knew they had the market on lock. A 5% increase in flour meant passing on 5% to customers, and nobody had an issue because the bakery could eat 20% and still make a profit. That was, until the Dutch came. A Dutch consortium bought the company... and decided to change how the business worked. Butter became margarine, then decreased; then they decided to decrease ingredient quantity, and finally quality. This won't mean anything to you, but an IQF (individually quick frozen) blueberry is why you can buy a blueberry muffin that isn't green on the inside. It's also 20-30% more expensive, and it's the most expensive part of that muffin.

So, customers noticed the quality drop. And drop. And drop. The company made huge amounts of money... at the expense of a decades-long reputation. Eventually the premium product performed as badly as the cheap stuff... and customers jumped ship. The company saw their profit margins tank, and first increased prices. When that drove off more customers, they tried to compete on price. With 40% of their customer base gone, telling them to shove their scoops where the sun doesn't shine, they had pissed away a market advantage that took decades to build, in exchange for about four years of record profits... followed by none.

Tell me that doesn't sound like Nvidia right now. Tell me that doesn't sound like hedonistic DILLIGAF that will eventually bury them in the market, until AMD or Intel do the same stupid thing after taking market leadership. Nvidia is currently uncontested, and instead of setting new standards for quality that would permanently relegate their competition to second string, they've decided to point the gun directly at their feet and pop off a few rounds. Don't worry, the consumers don't currently have a choice, so they have to go with us. So we are clear, this is why I prefer AMD having the lead in the CPU market over Intel... because there is literally no love left between me and Intel after socket 2011, where they decided to simply not have a good chipset... because who would ever need more than a pair of SATA III ports? Yes, my first enthusiast platform killed my interest in ever paying for another, because Intel decided that since AMD wasn't competing, they could release whatever and it'd be good enough. That's why I buy whatever is best, and enjoy it when my AMD CPU option is best.
 
Really enjoyed your post, examples, and anecdote.
Let's not forget the DeepSeek vs OpenAI situation: more for less, and open source.
I'm going to try and make this abundantly clear, this sort of stupid kills customer bases.
Regarding this statement, I believe, there are examples that go along this and in the opposite direction.
Still to this day I'm baffled by the success of the iPhone. The first one didn't have MMS or 3G, followed by paid apps that were free (or had free equivalents) on other platforms, and exorbitant prices for storage upgrades (and never expandable storage). And let's not forget: bad coverage if you "held the phone wrong" (iPhone 4).
 

You are looking at iPhone incorrectly. It sounds backwards, but the thing they are selling is not the hardware. The thing they are selling is the experience, with things that just work.

That sounds backwards, but iPhone was about the walled garden. They didn't release the first conferencing application, they released it whenever the networks allowed it to work most of the time. They didn't have the biggest and best camera, they had one whose software made your bad pictures turn out good...even if you had no control over advanced features. To this day people still love them because they just work organically...even if their features are years behind competitors and they constantly require expensive updating. It sounds backwards, but you love Apple not for their products, but because their experience is (or was) king.


If you frame most other things that way, you see why things succeed. Nvidia is best because they've got creators whose primary tools "just work" with their software. Cars are a commodity, so the luxury brands offer you an experience. I buy national brands despite knowing they are produced on the same lines as store brands, because their quality control is better... even if any one individual item may be better in some metric (think a bag with 50% marshmallows from the Lucky Charms knock-off, but still preferring General Mills because I don't also risk a bag with 5% marshmallows). It's funny how our lives have things in them that are no longer things... but this is the world we live in. Hopefully AMD has a competent launch, and by virtue of not being a miserable crap show, it's a fantastic cleanser for Blackwell's turd sandwich. Isn't it funny to say that, even knowing they've got most of the market on lock?
 
That's a similar pitfall to the argument that the 5090 missing ROPs is still much faster than the competition, so "it's fast enough."
Swapping features (very common and well-tested ones at that) for "a better experience" lacks innovation. And it probably costs the same; instead of going to R&D, the funds were given to the marketing department.
Imagine getting a car that comes without a radio and being told "it's for a better driving experience." It works for some niches (Ferrari F40), but iPhones, although not cheap, still find their way to a bigger, mainstream audience.

They didn't have the biggest and best camera, they had one whose software made your bad pictures turn out good
So, kinda like DLSS...

 
Gamers Nexus has been quoting Techpowerup...
 
Citation please.



Compared to what? You’re mixing up the brands again - slow and hot is AMD’s calling card.

Citation not necessary, you should read the update.

Update Feb 22nd, 6:30 UTC:
NVIDIA's global PR director Ben Berraondo confirmed this issue. He told The Verge:
NVIDIA said:
We have identified a rare issue affecting less than 0.5% (half a percent) of GeForce RTX 5090 / 5090D and 5070 Ti GPUs which have one fewer ROP than specified. The average graphical performance impact is 4%, with no impact on AI and Compute workloads. Affected consumers can contact the board manufacturer for a replacement. The production anomaly has been corrected.


Note that the time between identifying the issue and coming up with a number has one of two logical outcomes: either they're BSing the 0.5% figure because they think it's fine, or they were already aware of the issue and knew about the 0.5%. Do you want to believe they are stupid, liars, or both?
 
Added update with new statement from NVIDIA:

Upon further investigation, we’ve identified that an early production build of GeForce RTX 5080 GPUs were also affected by the same issue. Affected consumers can contact the board manufacturer for a replacement.
 
Anyone recommending 50-series at this point should be ashamed of themselves.
 
Added update with new statement from NVIDIA:
"We've identified"? Still taking credit for other people's work and reporting. Keep going, Nvidia!

Do you want to believe they are stupid, liars, or both?
I don't need to believe, I know they're liars:
[attached image]

No "*", no small print, just a lie
 
Both NVIDIA and their OEMs need a beta-testing program to help spot these issues before the consumer does.
 
No - they just need to do some quality management.

Funny, that statement: "...identified... that some graphics cards..."
*Only after public reports of out-of-spec graphics cards from multiple sources, and too much publicity, did NVIDIA "identify" that some graphics cards have issues.*
"We will update this to calm the minds of our loyal customers (sheep), as we see fit. We will not go public beforehand; we update our statements only when our fraudulent products are found in the field."

I don't see any action being taken on the faulty hardware. This is a long-term problem spanning many years already, not a recent issue handled within the usual seven calendar days of quality management.
 
WTF, they admitted the RTX 5080 is affected too, while still saying the defect rate is just 0.5%. So is that a per-SKU defect rate or an overall defect rate?

And just 4% performance impact? Maybe on the 5090, but not on the 5080. Hell, on the 5070 (Ti) this might be more than a 10% performance impact.
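A quick back-of-the-envelope check, assuming the spec-sheet ROP counts for these SKUs and the 8-ROP (one partition) deficit seen in the reported GPU-Z screenshots:

```python
# Spec ROP counts per SKU (from public spec sheets) vs. an 8-ROP deficit,
# i.e. one disabled ROP partition, as seen in the reported cases.
SPEC_ROPS = {"RTX 5090": 176, "RTX 5080": 112, "RTX 5070 Ti": 96}
MISSING = 8

for card, rops in SPEC_ROPS.items():
    deficit_pct = MISSING / rops * 100
    print(f"{card}: {rops} -> {rops - MISSING} ROPs ({deficit_pct:.1f}% fewer)")
# RTX 5090: 4.5% fewer; RTX 5080: 7.1% fewer; RTX 5070 Ti: 8.3% fewer
```

The smaller the chip, the bigger the relative hit, so a flat "4% average impact" can really only describe the 5090 (and actual performance loss still depends on how ROP-bound each workload is).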

A three-trillion-dollar company needs pressure from the community and reviewers to admit things. Shady practices as hell. Shame on you, Nvidia.
 