
NVIDIA Investigates GeForce RTX 50 Series "Blackwell" Black Screen and BSOD Issues

Every single driver release from Nvidia is a bad driver. They've not had a stable driver for probably the past 5 years, and almost all of their GPUs have had some sort of issue, whether it's black screens, BSODs, green video, crashing, etc... Nvidia truly doesn't care about gamers anymore; they have garbage drivers with no QA!

Surely they can afford 20 people to test their drivers and run various configurations, or are all of their developers and testers working on AI now?
 
I'm guessing you get between 40-60 FPS. I want to play at 90-120+ fps at max settings, hence why I wanted a 5090. The monitor I have rocks 240Hz; I mean, I don't think I'm asking for too much in OLED glory. I thought 2025 was going to be that year as I'm doing a massive rebuild... WRONG!
I get 40-60fps when playing on high, but as I have a 4K120 screen, dropping the details a little is the wisest thing to do for now. At least FF VII Rebirth runs flawlessly at 100+ fps at 4K medium, and it's a 2025 game (well, port).

Even if I had the cash, I would personally never get a 3K EUR graphics card and send them that "this is fine, next time we can pay 4K EUR" message.
 
AMD could have made a killing if they had stayed in the flagship competition; it wouldn't even matter if it was 10% slower at this point.
 
Every single driver release from Nvidia is a bad driver. They've not had a stable driver for probably the past 5 years, and almost all of their GPUs have had some sort of issue, whether it's black screens, BSODs, green video, crashing, etc... Nvidia truly doesn't care about gamers anymore; they have garbage drivers with no QA!

Surely they can afford 20 people to test their drivers and run various configurations, or are all of their developers and testers working on AI now?
566.36 seems stable at least.
 
AMD could have made a killing if they had stayed in the flagship competition; it wouldn't even matter if it was 10% slower at this point.
Literally all they have to do is deliver on the 9070/XT, price them reasonably, and have sufficient stock. That's it.

Don't shoot yourself in the leg with a bazooka AGAIN, AMD, c'mon!
 
Literally all they have to do is deliver on the 9070/XT, price them reasonably, and have sufficient stock. That's it.

Don't shoot yourself in the leg with a bazooka AGAIN, AMD, c'mon!

Me right now.... :laugh:

Jon Stewart Popcorn GIF
 
I've read comments saying PhysX is getting dropped. So, are they going to do a wrapper to let it execute on the CPU or something? If I have a PhysX-compatible GPU lying around, I could use it as a dedicated PhysX card like back in the day with the GT9300 and GT9400, but in that case, when Nvidia drops support for the old card, will it break compatibility with it?

Have an update for you: someone on Reddit who already has their card went out and bought an RTX 3050 6GB low-profile GPU for it, and that configuration will work.


Looks like my RTX A2000 has a life after the 5090 arrives
 
I'm surprised there aren't more fire issues... I checked my card's hot spot... 255° wtf NVIDIA
 
I'm surprised there aren't more fire issues... I checked my card's hot spot... 255° wtf NVIDIA

It is obviously not running at 255. The Hot Spot temperature readout was removed in Blackwell GPUs, and GPU-Z should not be reporting it on 50 series cards anymore. 255 happens to be the highest value expected by the software (equivalent to $FF in hex). It's a bugged value.
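For the curious, here's a minimal sketch of how an unexposed sensor can surface as 255, assuming the readout is packed into a single unsigned byte; the names and parsing below are hypothetical for illustration, not GPU-Z's actual code:

# 0xFF == 255, the largest value a single unsigned byte can hold
SENSOR_NOT_AVAILABLE = 0xFF

def parse_hotspot(raw_byte: int):
    """Return the hot-spot temperature in °C, or None if the sensor isn't exposed."""
    if raw_byte == SENSOR_NOT_AVAILABLE:
        # Blackwell no longer exposes the reading, so the field stays at its maximum
        return None
    return raw_byte

print(parse_hotspot(0xFF))  # None -- software that skips this check would show "255°"
print(parse_hotspot(72))    # 72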

Jensen, finding ways to sell off all those 3050s. Tremendous businessman!

The more you buy, the more you save, I guess
 
Man, this launch is the sloppiest I have ever seen from Nvidia LOL.

Yeah, I find it very unprofessional from a 3-trillion-dollar company, or should I say the second most valuable company in the world...
 
I get 40-60fps when playing on high, but as I have a 4K120 screen, dropping the details a little is the wisest thing to do for now. At least FF VII Rebirth runs flawlessly at 100+ fps at 4K medium, and it's a 2025 game (well, port).

Even if I had the cash, I would personally never get a 3K EUR graphics card and send them that "this is fine, next time we can pay 4K EUR" message.
NGL, I was in the market for the MSI Supreme Liquid when it was $2500. But then it went to $2600, and now I think it's $2800. Naw son, I'm probably going to get this 9070XT and pass this 6900XT on to my son's 1440p rig.
 
It is obviously not running at 255. The Hot Spot temperature readout was removed in Blackwell GPUs, and GPU-Z should not be reporting it on 50 series cards anymore. 255 happens to be the highest value expected by the software (equivalent to $FF in hex). It's a bugged value.



The more you buy, the more you save, I guess
I wasn't using GPU-Z to check temps. But I'll do a bit more research before I fully come to a conclusion on what to do.
 
If this had been in the AI and server market, there would have been bigger consequences than in the consumer market where it is...

I guess this shows once again that NGreedia doesn't care much for consumers, only investors and such. If they really cared, they wouldn't have launched any of these cards with these issues. Furthermore, they created the whole 12VHPWR problem themselves because "oh, our GPUs need 500W+++ to be this fast", instead of innovating on bringing power consumption down to a maximum of two 8-pin PCIe power connectors; that would improve a lot.

I've said it a couple of times already: stop chasing the 5-ish % performance gains and the fake upscaled frames, and stand still while improving power consumption, instead of users needing 1000W+ to run a crappy card at the end of the day.

I wish both AMD and NV would stop the performance race and start innovating on delivering the performance we see now at 200-250W; that would be innovation, but I guess none of them have the brains, b**** and p**** to do so...
Finally, someone with common sense. Desktop GPUs had an engineering standard for almost two decades: a GPU should NOT be over 250W TDP, and usually that meant 210-230W power draw.
Also, I think game developers should stop with the super-high-fidelity graphics; it makes for worse games that are also broken, unfinished, and super expensive to develop.
 
We are complaining about the problems of a product as per the topic of the article. And they keep rolling in:



Your encoding needs don’t justify the bad QC we are seeing. Some people are spending their hard earned money as you said and getting lower specs than advertised, black screens, burned components, etc. It’s not really fair to them as they put their trust in Nvidia, spent way over MSRP and then got burned…literally!

Never known a cap to blow like that, so I don't believe it is.
 
Never known a cap to blow like that, so I don't believe it is.

Probably just a fluke, but with everything else going on, still not a good look.

VRM failure still sucks.
 
NGL, I was in the market for the MSI Supreme Liquid when it was $2500. But then it went to $2600, and now I think it's $2800. Naw son, I'm probably going to get this 9070XT and pass this 6900XT on to my son's 1440p rig.

Yeah, the price gen on gen just seemingly continues to jump without any regard to the actual cost to produce these cards. It's gotten to the point where it doesn't even make sense unless you are making money off the card. Very few people have that kind of money to spend on a single PC component, let alone just for gaming.

Aside from all the issues (connector, defective cards, driver bugs, etc.), there's also a seemingly growing list of nitpicks that makes the 5000 series worse as well. They don't support 32-bit PhysX, and they no longer show you the hotspot data (which is just stupid; some people will have an uneven paste application from the factory, and it will inevitably cause some additional failures out of all the cards Nvidia sells).

Finally, someone with common sense. Desktop GPUs had an engineering standard for almost two decades: a GPU should NOT be over 250W TDP, and usually that meant 210-230W power draw.
Also, I think game developers should stop with the super-high-fidelity graphics; it makes for worse games that are also broken, unfinished, and super expensive to develop.

High-fidelity graphics are not the problem, lack of optimization is. You should watch Threat Interactive; he demonstrates how modern games could be optimized but aren't. He did a video on Silent Hill demonstrating just how many resources they waste on trees the player can hardly see in the fog. UE5 in general requires optimization for high-fidelity games; it's designed for games like Fortnite out of the box.

In addition, you have games like Star Wars Outlaws that use ray tracing like a crutch. They didn't want to go through the game and do the lighting properly, so they just require software/hardware RT on. This is the problem we may be seeing more of: games with no art direction and no clue how to enhance scenes with proper lighting, so they just lazily slap RT in there to make it look high fidelity so they don't have to hire senior devs who know what they are doing. Just look how good Half-Life: Alyx looks and how well it performs, and couple that with its extreme level of interactivity with the environment. That's what a competent dev looks like, none of this garbage we are seeing from dying AAA studios that bled all their talent years ago to reduce costs.
 
Probably just a fluke, but with everything else going on, still not a good look.

VRM failure still sucks.

VRM failure as you know is not a capacitor failure, and is most likely what happened.

Shit happens, don't see how it's nVidia's fault either.
 
Anyone buying the best, most expensive gaming card on the market is not concerned about value for money. That doesn't make them idiots; it just means they have the disposable income for it, and they didn't deserve this.
Fix it
 
Have an update for you: someone on Reddit who already has their card went out and bought an RTX 3050 6GB low-profile GPU for it, and that configuration will work.


Looks like my RTX A2000 has a life after the 5090 arrives
There is no point unless you wanna play almost two-decade-old games. The latest games that made use of 32-bit PhysX are from 2013, but most of them are from 2007-2010.
 
Let's see now if they're actually going to address these issues properly, or whether they'll blunder it like Intel did with their scummy handling of the 13th and 14th gen instability issues.
 
There is no point unless you wanna play almost two-decade-old games. The latest games that made use of 32-bit PhysX are from 2013, but most of them are from 2007-2010.

The reason it has become a major issue is that people still play those games; it's that simple.

Just because they are old shouldn't mean you remove the functionality; it's easy enough to keep it in the driver forevermore "as is/was", even if you don't want to support it.
 
The reason it has become a major issue is that people still play those games; it's that simple.

Just because they are old shouldn't mean you remove the functionality; it's easy enough to keep it in the driver forevermore "as is/was", even if you don't want to support it.
You realize they can still play those games?
 