Tuesday, February 25th 2025

NVIDIA GeForce RTX 50 Cards Spotted with Missing ROPs, NVIDIA Confirms the Issue, Multiple Vendors Affected
TechPowerUp has discovered that there are NVIDIA GeForce RTX 5090 graphics cards in retail circulation that come with too few render units, which lowers performance. Zotac's GeForce RTX 5090 Solid comes with fewer ROPs than it should—168 are enabled instead of the 176 that are part of the RTX 5090 specifications. This loss of 8 ROPs has a small but noticeable impact on performance. During recent testing, we noticed our Zotac RTX 5090 Solid sample underperformed slightly, falling behind even the NVIDIA RTX 5090 Founders Edition card. At the time, we didn't pay attention to the ROP count that TechPowerUp GPU-Z was reporting, and instead spent time looking at other possible causes, like clocks, power, and cooling.
Two days ago, one of our readers who goes by "Wuxi Gamer" posted this thread on the TechPowerUp Forums, reporting that his retail Zotac RTX 5090 Solid was showing fewer ROPs in GPU-Z than the RTX 5090 should have. The user tried everything from driver and software re-installs to switching between the two video BIOSes the card comes with, all to no avail. By coincidence, we already had this card in our labs, so we dug out our sample. Lo and behold—our sample is missing ROPs, too! GPU-Z is able to read and report these unit counts, in this case through NVIDIA's NVAPI driver interface. The 8 missing ROPs constitute a 4.54% loss in the GPU's raster hardware capability, and to illustrate what this means for performance, we've run a couple of tests.
In the first test, "Elden Ring" at 4K UHD with maxed-out settings and native resolution (no DLSS), you can see how the Zotac RTX 5090 Solid falls behind every other RTX 5090 we tested, including the NVIDIA Founders Edition, a de facto reference design that establishes a performance baseline for the RTX 5090. The Zotac card is 5.6% slower than the FE, and 8.4% slower than the ASUS ROG Astral RTX 5090 OC, the fastest custom-design card in this test. Officially, the Solid is clocked at a 2407 MHz rated boost frequency, which matches the Founders Edition clocks—it shouldn't be significantly slower in real life. The interesting thing is that the loss of performance is not visible when monitoring clock frequencies, because they are as high as expected—there are simply fewer units available to take care of the rendering workload.
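As a sanity check on that figure, here is a quick back-of-the-envelope calculation; it is just the arithmetic from the numbers above, nothing measured:

```python
# Sanity check: what share of the RTX 5090's raster hardware is missing?
SPEC_ROPS = 176    # RTX 5090 specification
ACTUAL_ROPS = 168  # count reported by GPU-Z on the affected Zotac sample

hw_loss = 1 - ACTUAL_ROPS / SPEC_ROPS
print(f"{hw_loss:.2%} of ROPs disabled")  # 4.55%, i.e. the ~4.54% quoted above

# The observed 5.6% Elden Ring gap versus the Founders Edition slightly
# exceeds this share, consistent with the game being ROP-bound at 4K.
```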
A ROP (Raster Operations Pipeline) unit in the GPU processes pixel data, handling tasks like blending, antialiasing, render-to-texture, and writing final pixel values to the frame buffer. In contrast, a shading unit, aka "GPU core," is responsible for computing the color, lighting, and material properties of pixels or vertices during the rendering process, without directly interacting with the frame buffer, so the performance hit of the eight missing ROPs depends on how ROP-intensive a game is. For example, in Starfield the performance loss is much smaller, and in DOOM Eternal with ray tracing, the card actually ends up close to its expected performance levels.
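To put those unit counts in throughput terms, here is a minimal sketch of theoretical peak pixel fill rate, assuming one pixel written per ROP per clock (a rule-of-thumb model, not a measured figure):

```python
BOOST_MHZ = 2407  # rated boost clock of both the Zotac Solid and the FE

def peak_fill_gpixels(rops: int, clock_mhz: int = BOOST_MHZ) -> float:
    """Theoretical peak fill rate in Gpixels/s: one pixel per ROP per clock."""
    return rops * clock_mhz / 1000

print(peak_fill_gpixels(176))  # 423.6 Gpixels/s at full specification
print(peak_fill_gpixels(168))  # 404.4 Gpixels/s with 8 ROPs missing
```

Real workloads only approach this bound when they are ROP-limited, which is why the deficit shows up clearly in some games and barely at all in others.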
We've also put the card through a quick 3DMark Time Spy Extreme graphics score run:
- NVIDIA Founders Edition: 25439
- Zotac Solid: 22621
- Gigabyte Gaming OC: 26220
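The relative gaps in those scores make the outlier obvious; a trivial calculation for reference:

```python
# Time Spy Extreme graphics scores from the runs listed above
scores = {
    "NVIDIA Founders Edition": 25439,
    "Zotac Solid": 22621,
    "Gigabyte Gaming OC": 26220,
}

baseline = scores["NVIDIA Founders Edition"]
for card, score in scores.items():
    print(f"{card}: {score} ({score / baseline - 1:+.1%} vs. FE)")
# Zotac Solid lands at -11.1%, a far larger gap than the 4.5% unit deficit
# alone would suggest, hinting that this synthetic test leans hard on the ROPs.
```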
So far, we know only of Zotac RTX 5090 Solid cards that are affected; none of our review samples from ASUS, Gigabyte, MSI, Palit, and NVIDIA exhibit this issue. All RTX 5090 owners should definitely check their cards and report back.
This is an issue with quality assurance at both NVIDIA and Zotac. NVIDIA's add-in card partners (AICs) do not have the ability to configure ROP counts, either physically on the silicon or in the video BIOS, and yet the GPU, its video BIOS, and the final product cleared QA testing at both NVIDIA and Zotac.
We are working with Zotac to return the affected card, so they can forward it to NVIDIA for investigation. At this time, Zotac was unable to provide a statement, citing the fluidity of the situation. As for possible fixes: we hope the issue is localized to a bug in the driver or the video BIOS, so NVIDIA could release a user-friendly BIOS update tool that runs from within Windows and updates the BIOS of the affected cards. If, however, the ROPs were disabled at the hardware level, then there's little that end-users or even AIC partners can do, except initiate a limited product recall for replacements or refunds. If the ROPs really are disabled through fuses, it seems unlikely that NVIDIA has a way to re-enable those units in the field, because that would potentially reveal how such units can be reactivated on other cards and SKUs from the company.
Update 14:22 UTC: Apparently the issue isn't specific to Zotac. HXL posted a screenshot of an MSI RTX 5090D, the China-specific variant of the RTX 5090 with nerfed compute performance, which is nonetheless supposed to have 176 ROPs. Much like the Zotac RTX 5090 Solid, it is missing 8 ROPs.
Update 16:38 UTC: Another card has been found, this time from Manli.
Update 17:30 UTC:
ComputerBase reports that their Zotac RTX 5090 Solid sample is not affected and shows the correct ROP count of 176. This confirms that the issue doesn't affect all cards of this SKU, and probably not even all cards in a given batch/production run.
Update 17:36 UTC:
Just to clarify, because it has been asked a couple of times: when no driver is installed, GPU-Z uses an internal database as a fallback and shows a hardcoded ROP count of 176 instead of "Unknown." This is a reasonable approximation, because all previous cards had a fixed, immutable ROP count. As soon as the driver is installed, GPU-Z reports the "live" ROP count active on the GPU—this data is read via the NVIDIA drivers.
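The reporting behavior described above boils down to a driver-first lookup with a database fallback. Here's a minimal sketch of that logic; the names (SPEC_DB, query_live_rops) are illustrative stand-ins, not GPU-Z's actual internals:

```python
from typing import Optional

# Hypothetical hardcoded database: one spec ROP count per known SKU.
SPEC_DB = {"GeForce RTX 5090": 176}

def query_live_rops(gpu_name: str) -> Optional[int]:
    """Stand-in for a live query through the NVIDIA driver (NVAPI).

    Returns None when no driver is installed and the query cannot be made.
    """
    return None  # stubbed out for this sketch

def report_rops(gpu_name: str) -> int:
    live = query_live_rops(gpu_name)
    if live is not None:
        return live           # "live" count actually enabled on this GPU
    return SPEC_DB[gpu_name]  # fallback: assume the specification value

print(report_rops("GeForce RTX 5090"))  # 176 here, since the query is stubbed
```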
Update 19:18 UTC: A card from Gigabyte is affected, too.
Update Feb 22nd, 6:00 UTC: Palit, Inno3D, and MSI found to be affected as well.
Update Feb 22nd, 6:30 UTC:
NVIDIA's global PR director Ben Berraondo confirmed this issue. He told The Verge:
NVIDIA: "We have identified a rare issue affecting less than 0.5% (half a percent) of GeForce RTX 5090 / 5090D and 5070 Ti GPUs which have one fewer ROP than specified. The average graphical performance impact is 4%, with no impact on AI and Compute workloads. Affected consumers can contact the board manufacturer for a replacement. The production anomaly has been corrected."
Very interesting—NVIDIA confirms that the RTX 5070 Ti is affected, too.
While NVIDIA talks about "one ROP unit," this really means "8 ROPs" in our context. Many years ago, marketing decided that higher numbers = better, so they started to report the number of pixels that can be processed per clock, instead of the actual unit counts. So in this case, one hardware unit is disabled, which means eight fewer pixels can be processed per clock, resulting in a loss of "8 ROPs".
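In other words, the marketed figure is simply hardware units multiplied by pixels per clock per unit. A quick sanity check (the 22-unit count is inferred from 176 / 8, not an official number):

```python
PIXELS_PER_CLOCK = 8  # each hardware ROP unit processes 8 pixels per clock

units_spec = 176 // PIXELS_PER_CLOCK  # 22 hardware units on a full RTX 5090
units_affected = units_spec - 1       # one unit disabled on affected cards

print(units_spec * PIXELS_PER_CLOCK)      # 176 "ROPs" as marketed
print(units_affected * PIXELS_PER_CLOCK)  # 168 "ROPs" as seen in GPU-Z
```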
Update Feb 25th:
In the meantime, some RTX 5080 GPUs with missing ROPs have been found, too. NVIDIA provided the following statement to TechPowerUp:
NVIDIA: "Upon further investigation, we've identified that an early production build of GeForce RTX 5080 GPUs were also affected by the same issue. Affected consumers can contact the board manufacturer for a replacement."
491 Comments on NVIDIA GeForce RTX 50 Cards Spotted with Missing ROPs, NVIDIA Confirms the Issue, Multiple Vendors Affected
www.pcgamer.com/hardware/graphics-cards/theres-a-future-where-you-and-your-graphics-card-decide-what-a-game-looks-like-not-the-developers/
I'm going to try and make this abundantly clear: this sort of stupid kills customer bases. They will accept that you can get away with it now, but in two to three years they'll be looking for more hardware. They'll see your product, and immediately associate you with price gouging. Whenever the AI market dives, and you're back to selling GPUs to consumers for what has to be a reasonable price, you'll discover that they will not buy into your crap whenever given the choice. That's how companies record record profits, then three to four years later get bought out by the competitors who won by simply not opening their mouths.
Let me also suggest that "peanuts" for consumer GPUs is misleading. Yes, it's not their primary business segment...but it's also how they can sell their silicon to virtually anybody. Yes, AI accelerators are legally limited. At the same time, you can sell a boatload of "consumer" grade GPUs to countries adjacent to the ones that legally cannot be sold to, and watch as the middlemen grow fat and rich off of sending silicon across the border at truly silly smuggling prices. Funny how countries with no huge gaming market have had silly demand for those high-end GPUs in the past couple of years...no? All you have to do is inflate trade values a little to absorb the costs legally, then print money everywhere while on paper meeting your requirements not to ship to those countries. Seems...legit?
Let me end on an anecdote. My family used to be involved in selling bakery goods. Think Pillsbury's expensive brother. Think "fresh baked, handmade muffins" that were called what they were: scoop and bake. They sold a product that curb-stomped the competition, because consumers could not tell the difference between 30 seconds of labor and 45 minutes of baking antics. The money was good, because they sold flour and components, and about 4-6 months in they'd introduce bakeries to the scoop-and-bake product. 1/4 the labor cost, for a product that was superior and consistent. Yes please.
The thing is, for decades the management team knew they had the market on lock. 5% increase in flour, pass on 5% to customers, and nobody had an issue, because the bakery could eat 20% and still be making a profit. That was, until the Dutch came. It was a Dutch consortium who bought the company...and decided to change how the business worked. Butter became margarine, then decreased, then they decided to decrease ingredient quantity, and finally quality. This won't mean anything to you, but an IQF (individually quick frozen) blueberry is why you buy a blueberry muffin and it isn't green on the inside. It's also 20-30% more expensive, and it's the most expensive part of that muffin. So, customers noticed quality drop. And drop. And drop. The company made huge amounts of money...at the expense of a decades-long reputation. Eventually the premium product performed as badly as the cheap stuff...and customers jumped ship. The company saw their profit margins tank, and first increased prices. When that drove off more customers, they tried to compete on price. With 40% of their customer base gone, and telling them to shove their scoops where the sun doesn't shine, they had pissed away a market advantage that took decades to build for about four total years of record profits...followed by none.
Tell me that doesn't sound like Nvidia right now. That kind of hedonistic DILLIGAF will eventually bury them in the market, until AMD or Intel decide to do the same stupid thing after getting the market leadership. Nvidia is currently uncontested in the market, and instead of setting new standards for quality that would permanently relegate their competition to second string, they've decided to point the gun directly at their feet and pop off a few rounds. Don't worry, the consumers don't currently have a choice, so they have to go with us. So we are clear, this is why I prefer when AMD has a lead in the CPU market over Intel...because there is literally no love left in me for Intel after socket 2011, where they decided to simply not have a good chipset...because who would ever need more than a pair of SATA III ports? Yes, my first enthusiast platform killed my interest in ever paying for another, because Intel decided that since AMD wasn't competing, they could release whatever and it'd be good enough. That's why I buy whatever is best, and enjoy when my AMD CPU option is best.
Let's not forget the DeepSeek vs. OpenAI situation: more for less, and open source. Regarding this statement, I believe there are examples that go both along with it and in the opposite direction.
Still to this day, I'm baffled by the success of the iPhone. The first one didn't have MMS or 3G, followed by paid apps that were free (or had free equivalents) on other platforms, and exorbitant prices for storage upgrades (and never offered expandable storage). And let's not forget: bad coverage if you "held the phone wrong" (iPhone 4).
That sounds backwards, but the iPhone was about the walled garden. They didn't release the first conferencing application; they released it when the networks allowed it to work most of the time. They didn't have the biggest and best camera; they had one whose software made your bad pictures turn out good...even if you had no control over advanced features. To this day people still love them because they just work organically...even if their features are years behind competitors and they constantly require expensive updating. It sounds backwards, but you love Apple not for their products, but because their experience is (or was) king.
If you frame most other things that way, you see why things succeed. Nvidia is best, because they've got creators whose primary tools "just work" with their software. Cars are a commodity, so the luxury brands offer you an experience. I buy national brands, despite knowing they are produced on the same lines as store brands, because their quality control is better...even if any one individual thing may be better in some metric (think a Lucky Charms knock-off bag with 50% marshmallows, but still preferring General Mills because I don't also get a bag with 5% marshmallows). It's funny how our lives have things in them that are no longer things...but this is the world we live in. Hopefully AMD has a competent launch, and by virtue of not being a miserable crap show it's a fantastic cleanser for Blackwell's turd sandwich. Isn't it funny to say that, even knowing they've got most of the market on lock?
Swapping features (and very common and tested ones at that) for "a better experience" lacks innovation. And it probably costs the same; instead of going to R&D, the funds were given to the marketing department.
Imagine getting a car that comes without a radio and being told "it's for a better driving experience." It works for some niches (Ferrari F40), but iPhones, although not cheap, still find their way to a bigger, mainstream audience. So, kinda like DLSS...
Update Feb 22nd, 6:30 UTC:
NVIDIA's global PR director Ben Berraondo confirmed this issue. He told The Verge:
Note that the time between identifying the issue and coming up with a number has one of two logical outcomes: either they are BSing the 0.5% figure because they think it's fine, or they were aware of the issue all along and knew about the 0.5%. Do you want to believe they are stupid, liars, or both?
No "*", no small print, just a lie
Funny the statement ... identified ... that some graphic cards ....
* After your public notification of not to specification graphic cards by different sources and too much publicity - NVIDIA identified that some graphic cards have issues. *
We will update this to calm the minds of our loyal customers (sheeps), as we see fit. We will not go public before, we update our statements when our fraudulent products are found in the open field.
I do not see any actions to be done in regards of faulty hardware. It's a long term problem already over many years. Not a recent issue of less than the usual 7 calendar days in quality management.
And just 4% performance impact? Maybe on 5090 but no on 5080. Hell, on 5070 (Ti) this might be more than 10% performance impact.
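For scale, here's a quick calculation of what 8 missing ROPs represent per SKU, assuming the commonly listed spec counts of 112 for the RTX 5080 and 96 for the RTX 5070 Ti (those two are an assumption here; only the 5090's 176 appears above):

```python
# Assumed spec ROP counts; only the RTX 5090 figure comes from the article.
SPEC_ROPS = {"RTX 5090": 176, "RTX 5080": 112, "RTX 5070 Ti": 96}
MISSING = 8  # one fused-off hardware unit = 8 "ROPs"

for sku, rops in SPEC_ROPS.items():
    print(f"{sku}: {MISSING / rops:.1%} of raster hardware missing")
# RTX 5090: 4.5% / RTX 5080: 7.1% / RTX 5070 Ti: 8.3%
```

The real-world frame-rate hit should stay below the raw unit share unless a game is fully ROP-bound, so more than 10% would be an extreme case.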
A 3-trillion-dollar company needs pressure from the community and reviewers to admit things. Shady practices as hell. Shame on you, Nvidia.
So move along, nothing to see here; we congratulate ourselves on the industry's best quality control, but you might not receive the whole card, your cable might melt, your capacitors might burn up, your PC might end up with a black screen for no apparent reason...
All of that is of course avoidable through our "Verified Priority Un-Access," where by signing up you will be put on a list that has no connection with card availability whatsoever, thus ensuring your complete safety from faulty Blackwell products!