Saturday, February 22nd 2025

NVIDIA Investigates GeForce RTX 50 Series "Blackwell" Black Screen and BSOD Issues

NVIDIA's problems with its latest flagship RTX 50 series "Blackwell" GPUs continue. First it was melting power cables, then stability issues, and most recently the case of the missing ROPs. Today, we got confirmation that NVIDIA is investigating reports of significant stability problems, including widespread black screens and system crashes, since the launch of the dedicated 572 driver branch. Unlike owners of previous-generation cards, who can roll back to stable drivers, RTX 50 series users are particularly affected because no alternative drivers are available for their hardware. The problems span the entire RTX 50 lineup, including the 5090, 5080, and newly announced 5070 Ti models. Users have reported issues ranging from display flickering to complete system failures, with some experiencing blue screen of death (BSOD) errors during normal operation.

The situation is especially problematic when using advanced features like DLSS 4 frame generation. NVIDIA staff member Manuel recently addressed these concerns on the GeForce Forums, confirming that the company is actively investigating the problems. Preliminary investigation suggests the issues might extend beyond driver software, potentially requiring VBIOS updates to fully resolve the stability problems. Some users have found temporary relief by reducing PCIe speeds below Gen 5 or lowering monitor refresh rates to 60 Hz, suggesting potential firmware-level compatibility issues. However, these workarounds are not guaranteed solutions for all affected users. The latest driver update (572.47), which added support for the RTX 5070 Ti, failed to address these critical stability issues, shipping only a single bug fix related to monitors waking from sleep mode. This has left many early adopters of the RTX 50 series frustrated with their premium hardware purchases.
Sources: GeForce Forums, via Tom's Hardware, VideoCardz

244 Comments on NVIDIA Investigates GeForce RTX 50 Series "Blackwell" Black Screen and BSOD Issues

#226
Zyalon
This AI BS needs to stop.
I will keep my 4090 for now. The next generation on a new node will probably get better results; this AI race is just dragging the market and money for now.
Posted on Reply
#227
umeng2002
I've been wanting to replay Mirror's Edge for a while now. It's still being sold. nVidia should provide a way to enable real GPU PhysX for the 50 series cards.
Posted on Reply
#228
Legacy-ZA
JustBenchingI didn't say it's a good thing. You asked me what's the difference between turning off PhysX and turning off RT. I explained the difference.
Just take the "L"

Even Digital Foundry, the greatest of all nGreedia shills, are extremely annoyed by this decision and "will continue to bang the drum" until something is done.
Posted on Reply
#229
Orodruin
I think Nvidia hired software developers from AMD/ATI. There is no other explanation for this.
Posted on Reply
#230
JustBenching
TheinsanegamerNAnd turning RT off makes a game look just as bad as any other card not running RT. Your own argument doesnt make any sense.
Are you following the "discussion"? He asked me why I'm okay with turning PhysX off and not okay with turning RT off on AMD cards. The difference is that neither AMD nor Nvidia can run PhysX now; Nvidia can run RT, though, and AMD can't (it's too slow, is the point).
Legacy-ZAJust take the "L"

Even Digital Foundry, the greatest of all nGreedia shills, are extremely annoyed by this decision and "will continue to bang the drum" until something is done.
Ok man, I won't buy an Nvidia card cause it can't run PhysX. I'll buy a different brand cause... oh, they can't run PhysX either. You do get how it's not a thing that will stop you from buying one, since nobody else has it either, right?
Posted on Reply
#231
AusWolf
JustBenchingAre you following the "discussion"? He asked me why I'm okay with turning PhysX off and not okay with turning RT off on AMD cards. The difference is that neither AMD nor Nvidia can run PhysX now; Nvidia can run RT, though, and AMD can't (it's too slow, is the point).

Ok man, I won't buy an Nvidia card cause it can't run PhysX. I'll buy a different brand cause... oh, they can't run PhysX either. You do get how it's not a thing that will stop you from buying one, since nobody else has it either, right?
That's not the point. The point is that PhysX has been with Nvidia for about 20 years now, so you would rightfully expect it to run on the newest cards as well. You wouldn't expect them to drop RT support unannounced, would you? So why is it okay if they do it with (32-bit) PhysX?
Posted on Reply
#232
JustBenching
AusWolfThat's not the point. The point is that PhysX has been with Nvidia for about 20 years now, so you would rightfully expect it to run on the newest cards as well. You wouldn't expect them to drop RT support unannounced, would you? So why is it okay if they do it with (32-bit) PhysX?
Oh, I agree with that; if it doesn't cost a lot to have it supported, it's a silly thing to remove. But I'm pointing out that the removal of PhysX doesn't give a competitive advantage to anyone else, since no one else can run PhysX. If they drop RT support, other brands that support RT have a competitive advantage.
Posted on Reply
#233
AusWolf
JustBenchingOh, I agree with that; if it doesn't cost a lot to have it supported, it's a silly thing to remove
And if they do remove it for whatever reason, they could at least face up to it and announce it properly. We get news of new games supporting DLSS left and right, even though it's pretty much a given these days, but not a word about this? This is what I'm disgusted about.
JustBenchingbut I'm pointing out that the removal of PhysX doesn't give a competitive advantage to anyone else, since no one else can run PhysX. If they drop RT support, other brands that support RT have a competitive advantage.
Sure, but no one is arguing about a competitive edge. The argument is that they're taking away something that has been with you for 20 years (or however many years you've been playing on Nvidia).
Posted on Reply
#234
rattlehead99
Guwapo77I understand your mindset, but I would never want developers to stop pushing the graphical envelope. If these devs didn't strive to separate themselves, would we get hardware advancements?
Yes, we'd get them at the same pace, while graphical improvements would come more slowly. But in general, high graphical fidelity is why AAA games cost hundreds of millions of dollars, which is why most of them suck.
Posted on Reply
#235
Caring1
JustBenchingTurning PhysX off makes Nvidia cards run the game at an equal level to every other card, since no other card supports PhysX.
So first Nvidia buys PhysX so the competition can't use it natively, then they remove it from their own GPUs.
Brilliant, that will show them. :kookoo:
All they are doing is forcing people to hang onto their old Nvidia GPUs to use them for their PhysX capabilities, alongside any brand graphics card.
Posted on Reply
#236
Waldorf
while not cool to remove it, 32-bit is getting old, and at some point stuff gets outperformed or replaced by other things. i would be more unhappy losing "support" on a 10-year-old GPU
than a 20-year-old feature used by one game i played (UT), which even with my 2080S was stuttering when enabled with cranked-up settings at 2160p.
Posted on Reply
#237
Random_User
Shou MikoIf this had been in the AI and Server market there would have been bigger consequences than in the consumer market where it is....
This is all just speculation... But something tells me these might be exactly the faulty B200 chips the Enterprise/AI market was hit with. Instead of dumping them, nVidia just did some "magic": slicing, disabling/fusing some stuff, writing "Gaming" SKU numbers on them, and trying to push them to consumers. Or maybe these are indeed separate gaming chips that had the same issues as the early B200, but instead of throwing away the expensive silicon, they just tried to get away with it. Their reasoning would be: with this scarcity and exorbitant scalping, every buyer will be "grateful" for even defective chips and hold on to them forever.
Posted on Reply
#238
Waldorf
@Shou Miko
@Random_User
fact is, they've (always) done some kind of binning/selecting, as you will never get 100% of samples to be perfect; it just depends on what they decide to do about it.

they could have easily made things like a 5080 Ti, a 5085 OC, or a "basic" 5090 model, instead of trying to sprinkle their defective chips in between working ones to hide it.
Posted on Reply
#239
thesmokingman
They reserved the good silicon for the GPUs they sell for 40,000 bucks...
Posted on Reply
#242
Tartaros
Dr. DroHave an update for you: someone on Reddit who already has their card went out and bought an RTX 3050 6GB low-profile GPU for it; such a configuration will work

nvidia/comments/1iv2a4c
Looks like my RTX A2000 has a life after the 5090 arrives
Suddenly, the GT 710, which has been around since forever and was already obsolete when it came out, becomes somewhat valuable xD
Posted on Reply
#243
Dr. Dro
TartarosSuddenly, the GT 710, which has been around since forever and was already obsolete when it came out, becomes somewhat valuable xD
The 710 cannot run alongside a 50 series card because of driver version mismatches, and updates were discontinued a long time ago. But you can use a GT 1030, at least until Pascal is dropped from the driver, which should be soon. Unsure about the fate of the GTX 16 series; the 1630/1650 could follow soon, so it's RTX 2060/3050 and up.
Posted on Reply
#244
Scattergrunt
Dr. DroThe 710 cannot run alongside a 50 series card because of driver version mismatches, and updates were discontinued a long time ago. But you can use a GT 1030, at least until Pascal is dropped from the driver, which should be soon. Unsure about the fate of the GTX 16 series; the 1630/1650 could follow soon, so it's RTX 2060/3050 and up.
The 16 and 20 series will probably be dropped at the same time, I figure, since they use the same architecture. I haven't really paid attention to how NVIDIA is dropping driver support, though, so I could be very much wrong.
Posted on Reply