Tuesday, February 25th 2025

NVIDIA GeForce RTX 50 Cards Spotted with Missing ROPs, NVIDIA Confirms the Issue, Multiple Vendors Affected

TechPowerUp has discovered NVIDIA GeForce RTX 5090 graphics cards in retail circulation that come with too few render units, which lowers performance. Zotac's GeForce RTX 5090 Solid comes with fewer ROPs than it should: 168 are enabled, instead of the 176 that are part of the RTX 5090 specifications. This loss of 8 ROPs has a small but noticeable impact on performance. During recent testing, we noticed that our Zotac RTX 5090 Solid sample underperformed slightly, falling behind even the NVIDIA RTX 5090 Founders Edition card. At the time we didn't pay attention to the ROP count that TechPowerUp GPU-Z was reporting, and instead looked for other explanations, like clocks, power, cooling, etc.

Two days ago, a reader who goes by "Wuxi Gamer" posted this thread on the TechPowerUp Forums, reporting that his retail Zotac RTX 5090 Solid was showing fewer ROPs in GPU-Z than the RTX 5090 should have. He tried everything from driver and software reinstalls to switching between the two video BIOSes the card ships with, all to no avail. By coincidence, we already had this card in our labs, so we dug out our sample. Lo and behold: our sample is missing ROPs, too! GPU-Z is able to read and report these unit counts, in this case through NVIDIA's NVAPI driver interface. The 8 missing ROPs constitute a 4.54% loss in the GPU's raster hardware capability, and to illustrate what this means for performance, we've run a couple of tests.

In the first test, "Elden Ring" at 4K UHD with maxed-out settings and native resolution (no DLSS), you can see how the Zotac RTX 5090 Solid falls behind every other RTX 5090 we tested, including the NVIDIA Founders Edition, the de facto reference design that establishes the performance baseline for the RTX 5090. The Zotac card is 5.6% slower than the FE, and 8.4% slower than the ASUS ROG Astral RTX 5090 OC, the fastest custom-design card in this test. Officially, the Solid is rated for a 2407 MHz boost frequency, which matches the Founders Edition, so it shouldn't be significantly slower in real life. The interesting thing is that the loss of performance is not visible when monitoring clock frequencies, because they are as high as expected; there are simply fewer units available to handle the rendering workload.

A ROP (Raster Operations Pipeline) unit in the GPU processes pixel data, handling tasks like blending, antialiasing, render-to-texture, and writing final pixel values to the frame buffer. In contrast, a shading unit, aka "GPU core," is responsible for computing the color, lighting, and material properties of pixels or vertices during the rendering process, without directly interacting with the frame buffer, so the performance hit of the eight missing ROPs depends on how ROP-intensive a game is.
For example, in Starfield, the performance loss is much smaller, and in DOOM Eternal with ray tracing, the card actually ends up close to its expected performance levels.
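To put the 8 missing ROPs in numbers: peak pixel fill rate scales linearly with ROP count at a given clock, so the theoretical raster ceiling drops by 8/176. A quick sketch using the rated 2407 MHz boost clock (real clocks vary, and actual in-game losses depend on how ROP-bound the workload is, as the Starfield and DOOM Eternal results show):

```python
# Theoretical pixel fill rate scales with ROP count x clock frequency.
SPEC_ROPS = 176      # RTX 5090 specification
FOUND_ROPS = 168     # what GPU-Z reports on the affected Zotac card
BOOST_MHZ = 2407     # rated boost clock, same for FE and Zotac Solid

def fill_rate_gpps(rops: int, clock_mhz: float) -> float:
    """Peak fill rate in gigapixels/second: one pixel per ROP per clock."""
    return rops * clock_mhz * 1e6 / 1e9

spec = fill_rate_gpps(SPEC_ROPS, BOOST_MHZ)    # ~423.6 GP/s
found = fill_rate_gpps(FOUND_ROPS, BOOST_MHZ)  # ~404.4 GP/s
loss_pct = (1 - FOUND_ROPS / SPEC_ROPS) * 100  # ~4.55%, the figure quoted above

print(f"{spec:.1f} GP/s -> {found:.1f} GP/s ({loss_pct:.2f}% raster ceiling lost)")
```

This is only the raster ceiling; a game that is shader- or memory-bound will see a smaller real-world drop than 4.5%.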

We've also put the card through a quick 3DMark Time Spy Extreme graphics score run.
  • NVIDIA Founders Edition: 25439
  • Zotac Solid: 22621
  • Gigabyte Gaming OC: 26220
This is a number you can easily test for yourself, if you're one of the lucky RTX 5090 owners. The quickest way is definitely to just fire up GPU-Z and check the ROP count; it should read "176."
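For context, the relative standings follow directly from those scores; a quick sketch (scores vary a little run to run, so treat the percentages as approximate):

```python
# 3DMark Time Spy Extreme graphics scores from our runs
scores = {
    "NVIDIA Founders Edition": 25439,
    "Zotac Solid": 22621,
    "Gigabyte Gaming OC": 26220,
}

baseline = scores["NVIDIA Founders Edition"]
for card, score in scores.items():
    delta = (score / baseline - 1) * 100
    print(f"{card}: {score} ({delta:+.1f}% vs FE)")
```

The affected Zotac sample trails the FE by about 11% in this synthetic test, more than the 4.5% unit deficit alone would predict, so run-to-run variance and other factors may also be contributing here.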

So far, only Zotac RTX 5090 Solid cards are known to be affected; none of our review samples from ASUS, Gigabyte, MSI, Palit, or NVIDIA exhibit this issue. All RTX 5090 owners should definitely check their cards and report back.

This is an issue with quality assurance at both NVIDIA and Zotac. NVIDIA's add-in card partners (AICs) do not have the ability to configure ROP counts, either physically on the silicon, or in the video BIOS, and yet the GPU, its video BIOS, and the final product, cleared QA testing at both NVIDIA and Zotac.

We are working with Zotac to return the affected card, so they can forward it to NVIDIA for investigation. At this time, Zotac was unable to provide a statement, citing the fluidity of the situation. As for possible fixes: we hope the issue is localized to a bug in the driver or the video BIOS, so NVIDIA could release a user-friendly BIOS update tool that runs from within Windows and updates the BIOS of affected cards. If, however, the ROPs were disabled at the hardware level, then there's little that end-users or even AIC partners can do, short of a limited product recall for replacements or refunds. If the ROPs really are disabled through fuses, it seems unlikely that NVIDIA has a way to re-enable those units in the field, because that would potentially reveal how such units can be reactivated on other cards and SKUs from the company.

Update 14:22 UTC:
Apparently the issue isn't specific to Zotac: HXL posted a screenshot of an MSI RTX 5090D, the China-specific variant of the RTX 5090 with reduced compute performance, which is nevertheless supposed to have 176 ROPs. Much like the Zotac RTX 5090 Solid, it is missing 8 ROPs.

Update 16:38 UTC:
Another card has been found, this time from Manli.

Update 17:30 UTC:
ComputerBase reports that their Zotac RTX 5090 Solid sample is not affected and shows the correct ROP count of 176. This confirms that the issue isn't affecting all cards of this SKU and probably not even all cards in a batch/production run.

Update 17:36 UTC:
Just to clarify, since it has been asked a couple of times: when no driver is installed, GPU-Z uses an internal database as a fallback to show a hardcoded ROP count of 176, instead of "Unknown." This is a reasonable approximation, because all previous cards had a fixed, immutable ROP count. As soon as a driver is installed, GPU-Z reports the "live" ROP count active on the GPU; this data is read via the NVIDIA drivers.
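The described behavior amounts to a simple fallback. A hypothetical sketch of the logic (the function name, the spec table, and its entries are ours for illustration, not GPU-Z's actual internals):

```python
from typing import Optional

# Hardcoded spec database used when no driver is present (illustrative subset)
SPEC_DB = {"RTX 5090": 176, "RTX 5090 D": 176, "RTX 5080": 112}

def report_rops(gpu_name: str, live_rops: Optional[int]) -> int:
    """Return the live ROP count from the driver if available, else the spec value.

    live_rops stands in for what a driver query would return with a driver
    installed; without a driver it is None and the spec table answers instead.
    """
    if live_rops is not None:
        return live_rops          # "live" count: exposes missing units
    return SPEC_DB[gpu_name]      # fallback: always the full spec count

print(report_rops("RTX 5090", None))  # no driver: 176 regardless of defect
print(report_rops("RTX 5090", 168))   # driver installed: defect visible
```

This is why an affected card looks normal in GPU-Z until a driver is installed: the fallback path can only ever answer with the specification value.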

Update 19:18 UTC:
A card from Gigabyte is affected, too.

Update Feb 22nd, 6:00 UTC:
Palit, Inno3D and MSI found to be affected as well

Update Feb 22nd, 6:30 UTC:
NVIDIA's global PR director Ben Berraondo confirmed this issue. He told The Verge:
NVIDIA: We have identified a rare issue affecting less than 0.5% (half a percent) of GeForce RTX 5090 / 5090D and 5070 Ti GPUs which have one fewer ROP than specified. The average graphical performance impact is 4%, with no impact on AI and Compute workloads. Affected consumers can contact the board manufacturer for a replacement. The production anomaly has been corrected.
Very interesting—NVIDIA confirms that RTX 5070 Ti is affected, too.

While NVIDIA talks about "one ROP unit," this really means "8 ROPs" in our context. Many years ago, marketing decided that higher numbers = better, so vendors started reporting the number of pixels that can be processed per clock, instead of the actual unit counts. In this case, one hardware unit is disabled, which means eight fewer pixels per clock can be processed, resulting in a loss of "8 ROPs."
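In other words, the advertised "ROP count" is the number of hardware units multiplied by the pixels each unit writes per clock (eight on current NVIDIA GPUs). A minimal illustration:

```python
PIXELS_PER_ROP_UNIT = 8  # each hardware ROP partition writes 8 pixels/clock

def marketing_rops(hw_units: int) -> int:
    """Convert hardware ROP units to the advertised 'ROPs' figure."""
    return hw_units * PIXELS_PER_ROP_UNIT

# RTX 5090: 22 hardware units advertised as 176 "ROPs";
# one fused-off unit shows up as "8 missing ROPs" in GPU-Z
print(marketing_rops(22))      # 176
print(marketing_rops(22 - 1))  # 168
```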

Update Feb 25th:
In the meantime, some RTX 5080 GPUs with missing ROPs have been found, too. NVIDIA provided the following statement to TechPowerUp:
NVIDIA: Upon further investigation, we've identified that an early production build of GeForce RTX 5080 GPUs were also affected by the same issue. Affected consumers can contact the board manufacturer for a replacement.

491 Comments on NVIDIA GeForce RTX 50 Cards Spotted with Missing ROPs, NVIDIA Confirms the Issue, Multiple Vendors Affected

#451
Sir Beregond
MxPhenom 216: This is grounds for a class action lawsuit, right? How soon can we expect one? Nvidia needs to get their shit together. I'll get a 9070 XT in pure retaliation.
I'm so disappointed by the Blackwell launch, I am absolutely considering a 9070 XT as long as it's not priced dumb. My 3080 Ti is...fine, but the 12 GB has been problematic at 4K in some games.
#453
OkieDan
FierceRed: Wasn't that the one game from Amazon, the MMO (New World, thanks Google)? Because they didn't cap the performance of the main menu, which is a rookie mistake?

I know it doesn't comfort the casualties but who trusts AMAZON to make a game properly anyway...
The only rookie mistake was the design choices made for the GPU. It has never and will never be the game developer's responsibility to limit frames to protect a GPU from doing more work than it can handle. I think they should limit FPS in menu (except graphics settings page) or pause screens to save energy, but not to protect cards from self deleting.
#454
Jtuck9
OkieDan: The only rookie mistake was the design choices made for the GPU. It has never and will never be the game developer's responsibility to limit frames to protect a GPU from doing more work than it can handle. I think they should limit FPS in menu (except graphics settings page) or pause screens to save energy, but not to protect cards from self deleting.
I was wondering about graphical setting options when neural rendering etc becomes more prevalent.

www.pcgamer.com/hardware/graphics-cards/theres-a-future-where-you-and-your-graphics-card-decide-what-a-game-looks-like-not-the-developers/
#455
Caring1
The Plague: Paper launch, fake prices, fake frames, fake resolution, terrible generational performance uplift, bricked cards, flawed board design, defective chips and no PhysX. But hey, at least we got real flames for our $2,000 $6,000 card.
Thanks for repeating that for the few that haven't read it before, the other thousand times it's been said by someone thinking they are clever. :slap:
#456
pavle
Heh, now even an RTX 5080 has been found (Founders Edition) with 104 ROPs instead of 112 (videocardz.com). Crazy. I'm sure it affects below 0.5% of cards. :shadedshu:
#457
TommyT
so i see we have super models and gts models and rtx d models now hhhh too many cards in 1 gen... too many crap
#458
Vya Domus
PerfectWave: imaging >>>>> AI server blackwell are also affected too but they dont have proper gpuz program. company spend billion for defective gpu LUL. Also they have problem with hbm3...
They have very few ROPs and aren't really used for rendering, but either way, if something is disabled on them, I imagine customers would realize it very quickly as the performance numbers wouldn't match up.
LittleBro: This proves they absolutely don't know real numbers for sure.
They're just lying, trying to downplay it; every chip maker knows exactly what they're making. It's not like some guy looks at trays of chips, then gives them a proper kick like he's checking a car and goes "Ugh, looks fine to me, send them out".
#459
lilhasselhoffer
lexluthermiester: No counterargument is required. Your statement was so lacking in merit and logic that any counterargument would be completely superfluous. Your statement argues itself into the realm of fanciful delusion, not worthy of any consideration. It is worthy only of being mocked.
MacZ: No argument because you have none.

Selling something at $25,000 rather than at $2,000 is what all companies on the planet will do 100% of the time.
You seem to be under the impression that companies are stupid greedy, and that they can charge anything for anything. That's stupid. In that economic view the retort is that consumers can then tell companies to go pound sand...because virtually everything on the planet has something that will replace it. Nvidia charges $2000 for a card, people decide 40% of the performance at 20% of the price is good enough. Crypto is insane...the model switches from proof of work to proof of stake, and suddenly the crypto mining card boom dies. Every single market action will last for some time, until that imbalance is righted.

I'm going to try and make this abundantly clear, this sort of stupid kills customer bases. They will accept that you can get away with it now, but in two to three years they'll be looking for more hardware. They'll see your product, and immediately associate you with price gouging. Whenever the AI market dives, and you're back to selling GPUs to consumers for what has to be a reasonable price, you'll discover that they will not buy into your crap whenever given the choice. That's how companies record record profits, then three to four years later get bought out by their competitors who won by simply not opening their mouths.


Let me also suggest that "peanuts" for consumer GPUs is misleading. Yes, it's not their primary business segment...but it's also how they can sell their silicon to virtually anybody. Yes, AI accelerators are legally limited. At the same time, you can sell a boatload of "consumer" grade GPUs to countries adjacent to countries that are legally not able to be sold to, and watch as the middlemen grow fat and rich off of sending silicon across the border at truly silly smuggling prices. Funny how countries with no huge gaming market have had silly demand for those high end GPUs in the past couple of years...no? All you have to do is have a little inflated trade values to absorb the costs legally, then print money everywhere, while on paper meeting your requirements not to ship to countries. Seems...legit?



Let me end on an anecdote. My family used to be involved in selling bakery goods. Think Pillsbury's expensive brother. Think "fresh baked, hand made muffins" that were called what they were. Scoop and Bake. They sold a product that curb stomped the competition, because consumers could not tell the difference between 30 seconds of labor and 45 minutes of baking antics. The money was good, because they sold flour and components, and about 4-6 months in they'd introduce bakeries to scoop and bake product. 1/4 the labor cost, for a product that was superior and consistent. Yes please.
The thing is, for decades the management team knew they had the market on lock. 5% increase in flour, pass on 5% to customers, and nobody had an issue because the bakery could eat 20% and still be making a profit. That was, until the Dutch came. It was a Dutch consortium who bought the company...and decided to change how the business worked. Butter became margarine, then decreased, then they decided to decrease ingredient quantity, and finally quality. This won't mean anything to you, but an IQF blueberry (individually quick frozen) is why you buy a blueberry muffin and it isn't green on the inside. It's also 20-30% more expensive, and it's the most expensive part of that muffin. So, customers noticed quality drop. And drop. And drop. The company made huge amounts of money...at the expense of a decades long reputation. Eventually the premium product performed as bad as the cheap stuff...and customers jumped ship. The company saw their profit margins tank, and first increased price. When that drove off more customers, they tried to compete on price. With 40% of their customer base gone, and telling them to shove their scoops where the sun doesn't shine, they had pissed away a market advantage that took decades to make for about four total years of record profits...followed by none.

Tell me that doesn't sound like Nvidia right now. That doesn't sound like hedonistic DILLIGAF, that will eventually bury them in the market until AMD or Intel decide to do the same stupid thing after getting the market leadership. Nvidia is currently uncontested in the market, and instead of setting new standards for quality that would permanently relegate their competition to second string, they've decided to point the gun directly at their feet and pop off a few rounds. Don't worry, the consumers don't currently have a choice, so they have to go with us. So we are clear, this is why I prefer when AMD has a lead in the CPU market over Intel...because there is literally no love left between me and Intel after socket 2011, where they decided to simply not have a good chipset...because who would ever need more than a pair of SATA III ports? Yes, my first enthusiast platform killed my interest in ever paying for another, because Intel decided that since AMD wasn't competing they could release whatever and it'd be good enough. That's why I buy whatever is best, and enjoy when my AMD CPU option is best.
#460
tvshacker
lilhasselhoffer: You seem to be under the impression that companies are stupid greedy, and that they can charge anything for anything. That's stupid. In that economic view the retort is that consumers can then tell companies to go pound sand...because virtually everything on the planet has something that will replace it. Nvidia charges $2000 for a card, people decide 40% of the performance at 20% of the price is good enough. Crypto is insane...the model switches from stakeholder to proof of work, and suddenly the crypto mining card boom dies. Every single market action will last for some time, until that imbalance is righted.

I'm going to try and make this abundantly clear, this sort of stupid kills customer bases. They will accept that you can get away with it now, but in two to three years they'll be looking for more hardware. They'll see your product, and immediately associate you with price gouging. Whenever the AI market dives, and you're back to selling GPUs to consumers for what has to be a reasonable price, you'll discover that they will not buy into your crap whenever given the choice. That's how companies record record profits, then three to four years later get bought out by their competitors who won by simply not opening their mouths.


Let me also suggest that "peanuts" for consumer GPUs is misleading. Yes, it's not their primary business segment...but it's also how they can sell their silicon to virtually anybody. Yes, AI accelerators are legally limited. At the same time, you can sell a boatload of "consumer" grade GPUs to countries adjacent to countries that are legally not able to be sold to, and watch as the middlemen grow fat and rich off of sending silicon across the border at truly silly smuggling prices. Funny how countries with no huge gaming market have had silly demand for those high end GPUs in the past couple of years...no? All you have to do is have a little inflated trade values to absorb the costs legally, then print money everywhere, while on paper meeting your requirements not to ship to countries. Seems...legit?



Let me end on an anecdote. My family used to be involved in selling bakery goods. Think Pillsbury's expensive brother. Think "fresh baked, hand made muffins" that were called what they were. Scoop and Bake. They sold a product that curb stomped the competition, because consumers could not tell the difference between 30 seconds of labor and 45 minutes of baking antics. The money was good, because they sold flour and components, and about 4-6 months in they'd introduce bakeries to scoop and bake product. 1/4 the labor cost, for a product that was superior and consistent. Yes please.
The thing is, for decades the management team knew they had the market on lock. 5% increase in flour, pass on 5% to customers, and nobody had an issue because the bakery could eat 20% and still be making a profit. That was, until the Dutch came. It was a Dutch consortium who bought the company...and decided to change how the business worked. Butter became margarine, then decreased, then they decided to decrease ingredient quantity, and finally quality. This won't mean anything to you, but an IQF blueberry (individually quick frozen) is why you buy a blueberry muffin and it isn't green on the inside. It's also 20-30% more expensive, and it's the most expensive part of that muffin. So, customers noticed quality drop. And drop. And drop. The company made huge amounts of money...at the expense of a decades long reputation. Eventually the premium product performed as bad as the cheap stuff...and customers jumped ship. The company saw their profit margins tank, and first increased price. When that drove off more customers, they tried to compete on price. With 40% of their customer base gone, and telling them to shove their scoops where the sun doesn't shine, they had pissed away a market advantage that took decades to make for about four total years of record profits...followed by none.

Tell me that doesn't sound like Nvidia right now. That doesn't sound like hedonistic DILLIGAF, that will eventually bury them in the market until AMD or Intel decide to do the same stupid thing after getting the market leadership. Nvidia is currently uncontested in the market, and instead of setting new standards for quality that would permanently relegate their competition to second string they've decided to point the gun directly at their feet and pop off a few rounds. Don't worry, the consumers don't currently have a choice so they have to go with us. So we are clear, this is why I prefer when AMD has a lead in the CPU market over Intel...because there is literally no love left for me an Intel after the socket 2011 where they decided to simply not have a good chipset...because who would ever need more than a pair of SATA III sockets? Yes, my first enthusiast platform killed my interest in every paying for another, because Intel decided since AMD wasn't competing they could release whatever and it'd be good enough. That's why I buy whatever is best, and enjoy when my AMD CPU option is best.
Really enjoyed your post, examples and anecdote.
Let's not forget Deepseek vs OpenAI situation: more for less and open source.
lilhasselhoffer: I'm going to try and make this abundantly clear, this sort of stupid kills customer bases.
Regarding this statement, I believe, there are examples that go along this and in the opposite direction.
Still to this day I'm baffled with the success of the iPhone, the first one didn't have MMS or 3G, followed by paid apps that were free (or had free equivalent) on other platforms and exorbitant prices for storages upgrades (and never offered expandable storage) And let's not forget: bad coverage if you "held the phone wrong" (iPhone 4).
#461
lilhasselhoffer
tvshacker: Really enjoyed your post, examples and anecdote.
Let's not forget Deepseek vs OpenAI situation: more for less and open source.

Regarding this statement, I believe, there are examples that go along this and in the opposite direction.
Still to this day I'm baffled with the success of the iPhone, the first one didn't have MMS or 3G, followed by paid apps that were free (or had free equivalent) on other platforms and exorbitant prices for storages upgrades (and never offered expandable storage) And let's not forget: bad coverage if you "held the phone wrong" (iPhone 4).
You are looking at iPhone incorrectly. It sounds backwards, but the thing they are selling is not the hardware. The thing they are selling is the experience, with things that just work.

That sounds backwards, but iPhone was about the walled garden. They didn't release the first conferencing application, they released it whenever the networks allowed it to work most of the time. They didn't have the biggest and best camera, they had one whose software made your bad pictures turn out good...even if you had no control over advanced features. To this day people still love them because they just work organically...even if their features are years behind competitors and they constantly require expensive updating. It sounds backwards, but you love Apple not for their products, but because their experience is (or was) king.


If you frame most other things that way, you see why things succeed. Nvidia is best, because they've got creators whose primary tools "just work" with their software. Cars are a commodity, so the luxury brands offer you an experience. I buy national brands, despite knowing they are produced on the same lines as store brands, because their quality control is better...even if any one individual thing may be better in some metric (think a bag with 50% marshmallows of the Lucky Charms knock-off, but still preferring General Mills because I don't also get a bag with 5% marshmallows). It's funny how our lives have things in it that are no longer things...but this is the world we live in. Hopefully AMD has a competent launch, and by virtue of not being a miserable crap show it's a fantastic cleanser for Blackwell's turd sandwich. Isn't it funny to say that, even knowing they've got most of the market on lock?
#462
tvshacker
lilhasselhoffer: You are looking at iPhone incorrectly. It sounds backwards, but the thing they are selling is not the hardware. The thing they are selling is the experience, with things that just work.

That sounds backwards, but iPhone was about the walled garden. They didn't release the first conferencing application, they released it whenever the networks allowed it to work most of the time. They didn't have the biggest and best camera, they had one whose software made your bad pictures turn out good...even if you had no control over advanced features. To this day people still love them because they just work organically...even if their features are years behind competitors and they constantly require expensive updating. It sounds backwards, but you love Apple not for their products, but because their experience is (or was) king.


If you frame most other things that way, you see why things succeed. Nvidia is best, because they've got creators whose primary tools "just work" with their software. Cars are a commodity, so the luxury brands offer you an experience. I buy national brands, despite knowing they are produced on the same lines as store brands, because their quality control is better...even if any one individual thing may be better in some metric (think a bag with 50% marshmallows of the Lucky Charms knock-off, but still preferring General Mills because I don't also get a bag with 5% marshmallows). It's funny how our lives have things in it that are no longer things...but this is the world we live in. Hopefully AMD has a competent launch, and by virtue of not being a miserable crap show it's a fantastic cleanser for Blackwell's turd sandwich. Isn't it funny to say that, even knowing they've got most of the market on lock?
That's similar to the pitfall in the argument that the 5090 missing ROPs is still much faster than the competition, so "it's fast enough".
Swapping features (and very common and tested ones at that) for "a better experience" lacks innovation. And it probably costs the same; instead of going to R&D, the funds were given to the marketing department.
Imagine getting a car that comes without a radio and being told "it's for a better driving experience". It works for some niches (Ferrari F40), but the iPhones, although not cheap, still find their way to a bigger, mainstream audience.
lilhasselhoffer: They didn't have the biggest and best camera, they had one whose software made your bad pictures turn out good
So, kinda like DLSS...

#463
SmookinJoe
Gamers Nexus has been quoting Techpowerup...
#464
Visible Noise
Tek-Check: Nvidia knew about defective dies
Citation please.
Krit: Poor performance
Compared to what? You're mixing up the brands again - slow and hot is AMD's calling card.
#465
lilhasselhoffer
Visible Noise: Citation please.



Compared to what? You’re mixing up the brands again - slow and hot is AMD’s calling card.
Citation not necessary, you should read the update.

Update Feb 22nd, 6:30 UTC:
NVIDIA's global PR director Ben Berraondo confirmed this issue. He told The Verge:
NVIDIA said:
We have identified a rare issue affecting less than 0.5% (half a percent) of GeForce RTX 5090 / 5090D and 5070 Ti GPUs which have one fewer ROP than specified. The average graphical performance impact is 4%, with no impact on AI and Compute workloads. Affected consumers can contact the board manufacturer for a replacement. The production anomaly has been corrected.
Note that the time between identifying it and coming up with a number has one of two logical outcomes. Either they are BSing a number at 0.5% because they think it's fine, or they are aware of the issue and knew about the 0.5%. Do you want to believe they are stupid, liars, or both?
#466
W1zzard
Added update with new statement from NVIDIA:
Upon further investigation, we’ve identified that an early production build of GeForce RTX 5080 GPUs were also affected by the same issue. Affected consumers can contact the board manufacturer for a replacement.
#467
Chomiq
Anyone recommending 50-series at this point should be ashamed of themselves.
#468
tvshacker
W1zzard: Added update with new statement from NVIDIA:
"We've identified"? Still taking credit for other people's work and reporting. Keep going Nvidia!
lilhasselhoffer: Do you want to believe they are stupid, liars, or both?
I don't need to believe, I know they're liars:

No "*", no small print, just a lie
#469
robert3892
Both NVIDIA and their OEMs need a beta testing program to help spot these issues before the consumer does.
#470
_roman_
No - they just need to do some quality management.

Funny the statement ... identified ... that some graphic cards ....
* After your public notification of not to specification graphic cards by different sources and too much publicity - NVIDIA identified that some graphic cards have issues. *
We will update this to calm the minds of our loyal customers (sheeps), as we see fit. We will not go public before, we update our statements when our fraudulent products are found in the open field.

I do not see any actions to be done in regards of faulty hardware. It's a long term problem already over many years. Not a recent issue of less than the usual 7 calendar days in quality management.
#471
LittleBro
WTF, they admitted the RTX 5080 is affected, too, while still saying the defect rate is just 0.5%. So is this a per-SKU defect rate or an overall defect rate?

And just 4% performance impact? Maybe on the 5090, but not on the 5080. Hell, on the 5070 (Ti) this might be more than a 10% performance impact.

A 3-trillion-dollar company needs pressure from the community and reviewers to admit things. Shady practices as hell. Shame on you, Nvidia.
#472
Denver
Anyone who still believes Nvidia's lies deserves special shock treatment to wake them up.
#473
Chomiq
Orodruin: Nvidia has been a company that has been treating users like fools since the TNT2 era. We have seen a lot of things in the past regarding the cards it has produced. I am not surprised at all that these problems are happening. They produce cards with the same American fast food logic. Produce them immediately. Release them to the market immediately. Make lots of money immediately.

Lots of bugs and problems. I guess the quality part is now in the background.
And yet it didn't stop you from buying 4070 Ti Super.
#474
Bwaze
I think it's obvious to everyone that "0.5%" is a blatant fabrication - but who can prove otherwise? They can claim it's only 0.01%, that from thousands of sold cards there's only a handful of cases, and not all of them are confirmed - in fact, all of them are more or less anecdotal.

So move along, nothing to see here, we congratulate ourselves on the industry's best quality control, but you might not receive the whole card, your cable might melt, your capacitors might burn up, your PC might end up with a black screen for no apparent reason...

All of that is of course avoidable through our "Verified Priority Un-Access" - where by signing up you will be put on a list that has no connection with card availability whatsoever, thus ensuring your complete safety from faulty Blackwell products!
#475
Dragokar
_roman_: No - they just need to do some quality management.

Funny the statement ... identified ... that some graphic cards ....
* After your public notification of not to specification graphic cards by different sources and too much publicity - NVIDIA identified that some graphic cards have issues. *
We will update this to calm the minds of our loyal customers (sheeps), as we see fit. We will not go public before, we update our statements when our fraudulent products are found in the open field.

I do not see any actions to be done in regards of faulty hardware. It's a long term problem already over many years. Not a recent issue of less than the usual 7 calendar days in quality management.
But QM costs money.......so no. The customer can handle that ;)