
NVIDIA Investigates GeForce RTX 50 Series "Blackwell" Black Screen and BSOD Issues

So someone will move from a 4070 Ti / 4080 / 4090 to a 9070 XT because they can't find a 5090? Doesn't make sense, right? I'm telling you, anyone who owns an Nvidia high-end card from the last couple of generations doesn't have an option: they either buy Nvidia or nothing.
So 4000 series or "last couple generations", including that 3080 with lol VRAM.
 
This AI BS needs to stop.
I will keep my 4090 for now. The next generation on a new node will probably get better results; this AI race is just dragging the market and the money along for now.
 
I didn't say it's a good thing. You asked me what the difference is between turning off PhysX and turning off RT, and I explained the difference.

Just take the "L"

Even Digital Foundry, the greatest of all nGreedia shills, is extremely annoyed by this decision and "will continue to bang the drum" until something is done.
 
I think Nvidia hired software developers from AMD/ATI. There is no other explanation for this.
 
And turning RT off makes a game look just as bad as any other card not running RT. Your own argument doesn't make any sense.
Are you following the "discussion"? He asked me why I'm okay with turning PhysX off and not okay with turning RT off on AMD cards. The difference is that neither AMD nor Nvidia can run PhysX now; Nvidia can run RT, though, and AMD can't (it's too slow, is the point).

Just take the "L"

Even Digital Foundry, the greatest of all nGreedia shills, is extremely annoyed by this decision and "will continue to bang the drum" until something is done.
Ok man, I won't buy an Nvidia card because it can't run PhysX. I'll buy a different brand because... oh, they can't run PhysX either. You do get how it's not a thing that will stop you from buying one, since nobody else has it either, right?
 
Are you following the "discussion"? He asked me why I'm okay with turning PhysX off and not okay with turning RT off on AMD cards. The difference is that neither AMD nor Nvidia can run PhysX now; Nvidia can run RT, though, and AMD can't (it's too slow, is the point).

Ok man, I won't buy an Nvidia card because it can't run PhysX. I'll buy a different brand because... oh, they can't run PhysX either. You do get how it's not a thing that will stop you from buying one, since nobody else has it either, right?
That's not the point. The point is that PhysX has been with Nvidia for about 20 years now, so you would rightfully expect it to run on the newest cards as well. You wouldn't expect them to drop RT support unannounced, would you? So why is it okay if they do it with (32-bit) PhysX?
 
That's not the point. The point is that PhysX has been with Nvidia for about 20 years now, so you would rightfully expect it to run on the newest cards as well. You wouldn't expect them to drop RT support unannounced, would you? So why is it okay if they do it with (32-bit) PhysX?
Oh, I agree with that. If it doesn't cost a lot to have it supported, it's a silly thing to remove, but I'm pointing out that the removal of PhysX doesn't give a competitive advantage to anyone else, since no one else can run PhysX. If they drop RT support, other brands that do support RT gain a competitive advantage.
 
Oh, I agree with that. If it doesn't cost a lot to have it supported, it's a silly thing to remove
And if they do remove it for whatever reason, at least they could own up to it and announce it properly. We get news of new games supporting DLSS left and right even though it's pretty much a given these days, but not a word about this? That is what I'm disgusted about.

but I'm pointing out that the removal of PhysX doesn't give a competitive advantage to anyone else, since no one else can run PhysX. If they drop RT support, other brands that do support RT gain a competitive advantage.
Sure, but no one is arguing about a competitive edge. The argument is that they're taking something away from you that has been with you for 20 years (or however many years you've been playing on Nvidia).
 
I understand your mindset, but I would never want developers to stop pushing the graphical envelope. If these devs didn't strive to separate themselves, would we get hardware advancements?
Yes, we'd get them at the same pace, while graphical improvements would come at a slower pace. In general, though, high graphical fidelity is why AAA games cost hundreds of millions of dollars, which is why most of them suck.
 
Turning PhysX off makes Nvidia cards run the game at an equal level to every other card, since no other card supports PhysX.
So first Nvidia buys PhysX so the competition can't use it natively, then they remove it from their own GPUs.
Brilliant, that will show them. :kookoo:
All they are doing is forcing people to hang onto their old Nvidia GPUs to use them for their PhysX capabilities, alongside any brand graphics card.
 
While it's not cool to remove it, 32-bit is getting old, and at some point stuff gets outdone/outperformed by other things. I would be more unhappy losing "support" on a 10-year-old GPU
than a 20-year-old feature used by one game I played (UT), which even with my 2080S was stuttering when enabled with settings cranked up at 2160p.
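
Side note for anyone checking their own library: the drop only affects 32-bit PhysX titles, so a rough way to tell whether a game is in the danger zone is to see whether its executable is a 32-bit PE file. A minimal Python sketch (the game path in the usage comment is a made-up example, not anything from this thread):

```python
import struct

def is_32bit_exe(path: str) -> bool:
    """Return True if the Windows executable at `path` is 32-bit (x86)."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":                          # DOS header magic
            raise ValueError("not an EXE file")
        f.seek(0x3C)
        pe_offset = struct.unpack("<I", f.read(4))[0]   # e_lfanew: offset of the PE header
        f.seek(pe_offset)
        if f.read(4) != b"PE\0\0":                      # PE signature
            raise ValueError("no PE signature found")
        machine = struct.unpack("<H", f.read(2))[0]     # IMAGE_FILE_HEADER.Machine
        return machine == 0x014C                        # 0x014C = x86 (32-bit), 0x8664 = x64

# Hypothetical usage:
# print(is_32bit_exe(r"C:\Games\SomeOldPhysXGame\game.exe"))
```

A 32-bit executable doesn't automatically mean the game uses GPU PhysX, of course, but since the dropped path is 32-bit CUDA/PhysX, it narrows the list down quickly.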
 
If this had been in the AI and Server market there would have been bigger consequences than in the consumer market where it is....
This is all just speculation... But something tells me these might be exactly the faulty B200s the enterprise/AI side was hit with. Instead of dumping them, Nvidia just did some "magic": slicing, disabling/fusing some stuff, writing "gaming" SKU numbers on them, and trying to push them to consumers. Or maybe these are indeed separate gaming chips that had the same issues as the early B200, but instead of throwing away the expensive silicon, they just tried to get away with it. Their reasoning would be: with this scarcity and exorbitant scalping, every buyer will be "grateful" for even defective chips and hold on to them forever.
 
@Shou Miko
@Random_User
Fact is, they've (always) done some kind of binning/selection, as you will never get 100% of samples to be perfect; it just depends on what they decide to do about it.

They could have easily made things like a 5080 Ti / 5085 OC / 5090 "basic" model, instead of trying to sprinkle their defective chips in between working ones to hide it.
 
They reserved the good silicon for the gpus they sell for 40,000 bucks...
 
User fixes Nvidia's bad PhysX decision on RTX 50 series:

Since PhysX Games Are A Problem On NVIDIA RTX 50 GPUs, This User Combined RTX 5090 With RTX 3050 To Solve The Performance Issue

 
Have an update for you: someone on Reddit who already has their card went out and bought an RTX 3050 6GB low-profile GPU for it; such a configuration will work.
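
If anyone wants to try the same dual-card setup, a quick sanity check before assigning the second card as the dedicated PhysX processor in the NVIDIA Control Panel is to confirm that both GPUs are enumerated under the same driver. A small Python sketch around nvidia-smi (which ships with the driver); the example output values are illustrative only:

```python
import subprocess

# List every GPU the NVIDIA driver can see, with its driver version.
# Both the primary card and the secondary PhysX card should appear here,
# reporting the same driver version, before touching the Control Panel setting.
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=index,name,driver_version", "--format=csv,noheader"],
    text=True,
)
for line in out.strip().splitlines():
    print(line)  # e.g. "0, NVIDIA GeForce RTX 5090, <driver>" / "1, NVIDIA GeForce RTX 3050, <driver>"
```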


Looks like my RTX A2000 has a life after the 5090 arrives
Suddenly, the GT710 that has been around since forever and was already obsolete when it came out becomes somewhat valuable xD
 
Suddenly, the GT710 that has been around since forever and was already obsolete when it came out becomes somewhat valuable xD

The 710 cannot run alongside a 50-series card because the driver versions mismatch and updates were discontinued a long time ago. But you can use a GT 1030, at least until Pascal is dropped from the driver, which should be soon. Unsure about the fate of the GTX 16 series; the 1630/1650 could follow soon, so it's RTX 2060/3050 and up.
 
The 710 cannot run alongside a 50-series card because the driver versions mismatch and updates were discontinued a long time ago. But you can use a GT 1030, at least until Pascal is dropped from the driver, which should be soon. Unsure about the fate of the GTX 16 series; the 1630/1650 could follow soon, so it's RTX 2060/3050 and up.
16 and 20 series will probably be dropped at the same time I figured, since they run the same architecture. I haven't really paid attention to how NVIDIA is dropping driver support, though, so I could be very much wrong.
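
Since driver support tends to be dropped per architecture, one way to see which generation each card in a box actually belongs to is to query its compute capability (Kepler reports 3.x, Pascal 6.x, Turing 7.5, and so on). A rough Python sketch, with the caveat that the compute_cap query field only exists on reasonably recent nvidia-smi builds:

```python
import subprocess

# Print each GPU's name and compute capability so you can tell which
# architecture generation it is (Kepler = 3.x, Pascal = 6.x, Turing = 7.5, ...).
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=name,compute_cap", "--format=csv,noheader"],
    text=True,
)
print(out.strip())
```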
 