
NVIDIA Investigates GeForce RTX 50 Series "Blackwell" Black Screen and BSOD Issues

This problem existed on the 40 series with third-party cables and was diagnosed as a sense-pin issue, so I'm curious whether the cause is the same here.
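For reference, the 16-pin connector's sideband "sense" pins are what tell the card how much power the cable/PSU combo is rated to deliver, which is why a badly wired third-party cable matters. Here's a rough illustrative mapping in Python (my own simplification of the ATX 3.0-style sense table, not anything from NVIDIA):

Code:
# Rough sketch of the 12VHPWR SENSE0/SENSE1 sideband encoding: the pin states
# advertise the initial power budget the GPU is allowed to assume.
# Simplified illustration only; pin names and handling are not vendor code.
SENSE_TABLE = {
    # (SENSE0, SENSE1): advertised power budget in watts
    ("ground", "ground"): 600,
    ("ground", "open"):   450,
    ("open",   "ground"): 300,
    ("open",   "open"):   150,
}

def allowed_power(sense0: str, sense1: str) -> int:
    # A cable that mis-reports both pins as grounded advertises the full 600 W
    # budget even if the cable or PSU can't actually deliver it safely.
    return SENSE_TABLE.get((sense0, sense1), 150)  # unknown state -> safest limit

print(allowed_power("ground", "open"))  # 450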
 
Anyway Win 12 will drop the full x86 archi, lol.
Extremely unlikely, as doing so would obliterate Windows' backwards compatibility and the majority of their reason to exist. There's nothing wrong with 32 bit libraries, they do not hurt anyone.

The more recent push to eliminate older software and APIs, which has driven more software to web-based or multi-platform applications, has significantly hurt MS; Windows is now below three quarters of the total market, down from a peak of 92%. The more they push to eliminate backwards compatibility, the more market share they will lose.

Of course, this is the same company that strangled their console market to the point of being bedridden, so who knows, maybe MS will be the first multi-trillion-dollar company to go bankrupt from their own stupidity.

Finally someone with common sense. Desktop GPUs had an engineering standard for almost two decades: a GPU should NOT be over 250 W TDP, and usually that meant 210-230 W power draw.
Also, I think game developers should stop with the super-high-fidelity graphics; it makes for worse games that are also broken, unfinished, and extremely expensive to develop.
What standard was this? Do you have documentation for it?

The "200w standard" didnt exist. 20 years ago, GPUs were limited to 100-150w, not because of some ancient wisdom, but because they were pushing the limit of process nodes at the time. The moment nodes allowed higher limits, we pushed higher. This has always happened.

I remember people clutching their pearls and fetching their fainting couches over the GTX 280's power draw. You know, a card that was sufficiently cooled by a basic two-slot blower. Saying anything over 150 W was just too much. Hell, there was hand-wringing over the Voodoo cards needing external power connectors WAY back when, and how they just used too much power and what was happening to our hobby, etc., etc. And dear god, the fire and brimstone when CPUs needed coolers with FANS! THE HORROR!
If this had happened in the AI and server market, there would have been bigger consequences than in the consumer market where it is...

I guess this shows once again that NGreedia doesn't care much for consumers, only investors and such. If they really cared, they wouldn't have launched any of these cards with these issues. Furthermore, they created the whole 12VHPWR problem themselves because "oh, our GPUs need 500 W+ to be this fast", instead of innovating to bring power consumption down to a maximum of two 8-pin PCIe power connectors; that would improve a lot.

I've said it a couple of times already: stop chasing 5-ish % performance gains and fake upscaling and frames, and stand still on performance while improving power consumption instead, so users don't need a 1000 W+ PSU to run a crappy card at the end of the day.

I wish both AMD and NV would stop the performance race and innovate on the performance we already have, bringing it down to 200-250 W. That would be innovation, but I guess neither of them has the brains, b**** and p**** to do so...
Nothing is stopping you from buying 200 W GPUs, or undervolting higher-end parts. Sounds like the ones lacking brains here aren't the developers of GPUs...
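For anyone who does want a 200 W-class card, here's a minimal sketch of capping the board power limit through NVML (using the pynvml bindings; not quite the same as a curve undervolt, and the 200 W target is just an example value the driver will clamp to the card's own limits):

Code:
# Minimal sketch: cap an NVIDIA GPU's board power limit via NVML.
# Assumes the nvidia-ml-py (pynvml) package and admin/root rights.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports power in milliwatts
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"current limit: {current_mw / 1000:.0f} W "
      f"(card allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

# Request ~200 W, clamped to what the card actually supports
target_mw = max(min_mw, min(200_000, max_mw))
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"new limit: {target_mw / 1000:.0f} W")

pynvml.nvmlShutdown()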
There is no point unless you want to play almost two-decade-old games. The latest games that made use of 32-bit PhysX are from 2013, but most of them are from 2007-2010.
If I didn't care about backwards compatibility, I'd buy a Mac or a console.
 
Hold them accountable with your wallets, not your words :)
 
Do I need to repeat that the games run just fine without PhysX?
You can repeat it until you are blue in the face; it doesn't change the fact that a feature of the game has now been made inaccessible on newer hardware, for the sole reason of greed.

Unless you have access to a patch that re-enables these effects without older Nvidia hardware, there is no argument against this.
 
You can repeat it until you are blue in the face; it doesn't change the fact that a feature of the game has now been made inaccessible on newer hardware, for the sole reason of greed.

Unless you have access to a patch that re-enables these effects without older Nvidia hardware, there is no argument against this.
Yes, now Nvidia's new-gen cards are on par with every other vendor's in terms of playing 15-year-old games with PhysX.
 
Yes, now Nvidia's new-gen cards are on par with every other vendor's in terms of playing 15-year-old games with PhysX.
So Nvidia lost a unique selling point in terms of broad and ubiquitous support. It's really that simple. How you value that is another, and personal, matter.
 
While they're at it, they can figure out where the missing ROPs went
 
Living on the edge of tomorrow... Sorry, got Sonic the Hedgehog on the mind.


With framerates equivalent to high-end GPUs from six generations ago. Then again, it isn't hard to just go out and buy a second-hand GPU that supports PhysX. I hoard hardware for a reason, man...
Sonic Adventure 2, Live and Learn by Crush 40.

You should hear this, by Noisy Neighbors


Hold them accountable with your wallets, not your words :)
Do both: get on their social media, report them to the FTC and comparable orgs in other parts of the world, heck, send Louis Rossmann a message.
 
Hopefully people will buy AMD GPUs, or even Intel on the low end, instead of Nvidia, but I expect not.
Well....that would require cards.

AMD is too busy playing coy with their cards, and Intel seems to manufacture theirs in batches of 5 and hasn't bothered making anything higher than a 3060 in performance.
 
You asked me why it isn't the same as disabling RT on AMD. I explained why.
You didn't really, but I'll leave it at that. In any case, your reasoning wasn't something I could agree with. A game being 15 years old doesn't make it less valuable in my eyes.
 
Finally someone with common sense. Desktop GPUs had an engineering standard for almost two decades: a GPU should NOT be over 250 W TDP, and usually that meant 210-230 W power draw.
Also, I think game developers should stop with the super-high-fidelity graphics; it makes for worse games that are also broken, unfinished, and extremely expensive to develop.
I understand your mindset, but I would never want developers to stop pushing the graphical envelope. If these devs didn't strive to separate themselves, would we get hardware advancements?
 
You didn't really, but I'll leave it at that. In any case, your reasoning wasn't something I could agree with. A game being 15 years old doesn't make it less valuable in my eyes.
Turning PhysX off makes Nvidia cards run the game at an equal level to every other card, since no other card supports PhysX.

Turning off RT on AMD cards makes the game look worse (arguable, I get it, but that was your question) than on Nvidia cards.

So clearly, the two cases are not comparable.
 
Turning PhysX off makes Nvidia cards run the game at an equal level to every other card, since no other card supports PhysX.

Turning off RT on AMD cards makes the game look worse (arguable, I get it, but that was your question) than on Nvidia cards.

So clearly, the two cases are not comparable.
Wait... Turning off RT (shiny puddles) makes the game look worse, but turning off PhysX (flying particles) doesn't. That's some weird-ass skewed logic, dude. :wtf:
 
Wait... Turning off RT (shiny puddles) makes the game look worse, but turning off PhysX (flying particles) doesn't. That's some weird-ass skewed logic, dude. :wtf:
No, that's not what I said. Turning PhysX off makes the game look as bad as it does on any other non-Nvidia card, is what I said.
 
Let's be real. Nvidia doesn't want their cards in your home. They want them all in data centers that they rent time on. Or they train an AI model for the game on their servers and sell you a small AI card to run the model on so you can play the game.
 
Hopefully people will buy AMD GPUs, or even Intel on the low end, instead of Nvidia, but I expect not.
It all depends on AMD and Intel. Intel cards are still hard to find, and impossible in some countries, and AMD made a thriller out of their launch; let's hope it doesn't end in disappointment.
 
No, that's not what I said. Turning PhysX off makes the game look as bad as it does on any other non-Nvidia card, is what I said.
And that's a good thing because...?
 
And that's a good thing because...?
I didn't say it's a good thing. You asked me what's the difference between turning off PhysX and turning off rt.. I explained the difference.
 
No, that's not what I said. Turning PhysX off makes the game look as bad as it does on any other non-Nvidia card, is what I said.
And turning RT off makes a game look just as bad as on any other card not running RT. Your own argument doesn't make any sense.
 
Sonic Adventure 2, Live and Learn by Crush 40.

You should hear this, by Noisy Neighbors

I actually grew up with this show through various reruns of it. Grew up with tons of old and new media in my childhood. (I'm like a weird hybrid of an '80s, '90s and '00s kid in spirit, and it's crazy.) Had a pager until like late 2011.
Oh, so now we have actual reports of 5080s too? I thought we had already ruled this out before, but this seems real. Not surprising.
 
And turning RT off makes a game look just as bad as on any other card not running RT. Your own argument doesn't make any sense.
Exactly my point, thank you. If turning PhysX off isn't an issue, then turning RT off isn't an issue either. Neither of them is an essential feature, just nice to have. Still, I'd like the removal of any feature to be announced, so at least you know what you're up against.
 