
NVIDIA Investigates GeForce RTX 50 Series "Blackwell" Black Screen and BSOD Issues

Bud, come on now. In the other thread I mentioned way more important features (like Broadcast) and you said barely anyone uses them. Now Nvidia removed PhysX from 13-to-18-year-old games (which brings them to parity with every other GPU brand that can't run them either) and suddenly that's a big issue. If PhysX were such an issue you wouldn't have an AMD GPU, and people would have mentioned it in the 20-page thread about why Nvidia has a 90% market share. Nobody did. Not a single person. But if you think it's important, then that easily explains why Nvidia has been outselling AMD 9 to 1.
I never said it's a big issue. I said it's a shady thing to do without announcement.
 
I've no idea why 2012 is some arbitrary cutoff date though. PhysX games started around 2006, and 32-bit .exes were common through to around the mid-2010s: plenty of 2013 (PayDay 2), 2014 (Gauntlet, QUBE) and even 2015 (Crookz - The Big Heist, Life is Strange) games were 32-bit PhysX titles that still aren't on your "expanded" list, nor did you add the 5 earlier titles mentioned above (Dishonored, the Mass Effect trilogy, Dragon Age: Origins, etc.) despite the individual game pages confirming they use it. Keep adding all the ones you seem to have left off and you'll find there are closer to 150+ games affected, not just 42-59.
There are 150+ games with PhysX? Did not expect that, that's a long list. 150+ games that you can only play properly with an Nvidia 40 series card or older, but not with a 50 series (or any other non-Nvidia card). That's whack.
 
Eh, they were already idiots for buying them in the first place; not a single review says anything positive about the value for money.
The 5090 is the only 50 series review I've seen so far with positives in it, unsurprisingly. The 5080 was written off as basically a nothing sandwich, and the 5070 Ti is not too dissimilar (though it arguably has more going for it than the 5080, if it were at MSRP).
 
Anyway, Win 12 will drop the full x86 architecture, lol.
 
Karma is a bitch innit Huang. You can't half ass an entire market segment and think you'll get away with it.

Either you bring proper products, or your business will go to shit. Live and learn
 
Karma is a bitch innit Huang. You can't half ass an entire market segment and think you'll get away with it.

Either you bring proper products, or your business will go to shit. Live and learn

@Vayra86 NV doesn't care about consumers anymore; almost all their earnings right now come from AI, and if a problem only affects gamers and not AI performance, Jensen won't do squat about it.

I am currently just waiting for AMD to release their RX 9070 and RX 9070 XT to see if they will be good enough for my 1440p gaming, or if I should rush out and get an RX 7900 XT / XTX while they are still there.

Because as good as my Sapphire Radeon RX 590 Nitro+ Special Edition does in Wolfenstein 2, it won't be up to much in the newer titles that I want to play this year.
 
Karma is a bitch innit Huang. You can't half ass an entire market segment and think you'll get away with it.

Either you bring proper products, or your business will go to shit. Live and learn
Living on the edge of tomorrow... Sorry, got Sonic the Hedgehog on the mind.

You realize they can still play those games?
With framerates equivalent to high-end GPUs from 6 generations ago. Then again, it isn't hard to just go out and buy a second-hand GPU that supports PhysX. I hoard hardware for a reason, man.
 
With framerates equivalent to high-end GPUs from 6 generations ago. Then again, it isn't hard to just go out and buy a second-hand GPU that supports PhysX. I hoard hardware for a reason, man.
Physx --> off
 
High fidelity graphics are not the problem, lack of optimization is. You should watch Threat Interactive; he demonstrates how modern games could be optimized but aren't. He did a video on Silent Hill showing just how many resources they waste on trees the player can hardly see in the fog. UE5 in general requires optimization for high fidelity games; out of the box it's designed for games like Fortnite.

In addition, you have games like Star Wars Outlaws that use ray tracing like a crutch. They didn't want to go through the game and do the lighting properly, so they just require software/hardware RT on. This is the problem we may be seeing more of: games with no art direction and no clue how to enhance scenes with proper lighting, so they lazily slap RT in there to make it look high fidelity and avoid hiring senior devs who know what they are doing. Just look how good Half-Life: Alyx looks and how well it performs, and couple that with its extreme level of interactivity with the environment. That's what a competent dev looks like, none of this garbage we are seeing from dying AAA studios that bled all their talent years ago to reduce costs.
For the lighting part, whether it's for games or fully path-traced movies, the lighting isn't just handled by anyone; it's a full job position called lighting artist / lighting TD. Ray tracing/path tracing makes it easier to get something that looks correct, but to make something that looks great you still need to know what you are doing. Focusing on lighting is a legit career path that some people have been doing their whole life. RT did not lower the skill requirement; it's not just a button press and "voilà". Here's a quote from a lead lighting TD who works on movies:
Other times I get to be really creative when I am told to evoke a certain atmosphere or feeling through my lighting. Then I have to use colour theory, or think about where the viewer’s eye is being drawn, or how much shadow we have in the image and if it’s telling the right story.

The more technical role of a lighting TD is knowing how to do all of this using software, and making sure that my scenes are optimized enough for the computers to be able to output images I am creating. This part is called “rendering”
BEHIND THE SCENES: ILM’S LEAD LIGHTING TD, ALANA LENNIE - Ausfilm


For the optimisation part, the CEO of Epic seemed to imply that any additional power is often used to generally increase the level of detail "as precise as your eyes can see"... which seems to mean tons of barely noticeable details that still have an impact on performance regardless.
Do We Have Enough Shaders? - ACM SIGGRAPH Blog
 
Physx --> off
That sounds funny from someone who's all in for RT. I wonder what makes shiny puddles more important than flying particles in your opinion. I also wonder why you don't find RT off an equally easy, non-issue option as PhysX off.
 
It would be interesting to see whether these problems are present on Linux as well, and debugging there might shed some light on what is going on.
I assume it's triggered under specific conditions or depends on sample quality, as otherwise it would have been mentioned in reviews.

But as always, it would be very useful for those reporting such issues to provide some info regarding how often and how reproducible the crashes are, whether it's consistent across driver versions, etc. and especially whether they have overclocked RAM on their system (which is known to cause driver instability).

More competition is certainly needed, but I want the competition to rise to compete with Nvidia's quality, not for Nvidia to lower their standards. But in order to see real competition, the other vendors also need to provide ample supply, great driver support (incl. proper DirectX 9 support), and real competitors up into the high-end.
 
That sounds funny from someone who's all in for RT. I wonder what makes shiny puddles more important than flying particles in your opinion. I also wonder why you don't find RT off an equally easy, non-issue option as PhysX off.
I don't find it important because we are talking about 15-year-old games. You don't find it important since you have an AMD GPU that doesn't support it anyway. It's not like AMD GPUs can actually run PhysX, so turning it off on the 5xxx series brings Nvidia cards to parity, it doesn't give them an advantage. Not being able to run heavy RT on AMD cards doesn't bring them to parity with Nvidia.
 
There are 150+ games with PhysX? Did not expect that, that's a long list. 150+ games that you can only play properly with an Nvidia 40 series card or older, but not with a 50 series (or any other non-Nvidia card). That's whack.
There are actually 928 games that use PhysX. The 150+ were the subset of that list confirmed to be 32-bit only. Edit: Looked through it again and it's probably closer to 160-180 games. Many post-2013 games are 64-bit, but there are still quite a lot of 2014-2016 games that were 32-bit only.
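If anyone wants to audit their own library, the quickest check is the PE header of the game's main .exe: the COFF Machine field is 0x014C for 32-bit x86 and 0x8664 for x64. A minimal sketch in standard C++ (the "Game.exe" path is just a placeholder):
Code:
// Minimal PE-header check: is this .exe 32-bit (x86) or 64-bit (x64)?
// Standard C++ only; assumes a little-endian (x86) host, which is the relevant case here.
#include <cstdint>
#include <fstream>
#include <iostream>

int main(int argc, char** argv)
{
    const char* path = (argc > 1) ? argv[1] : "Game.exe"; // placeholder path
    std::ifstream f(path, std::ios::binary);
    if (!f) { std::cerr << "Cannot open " << path << "\n"; return 1; }

    // DOS header starts with "MZ"; the 4 bytes at offset 0x3C hold the PE header offset.
    char mz[2];
    f.read(mz, 2);
    if (mz[0] != 'M' || mz[1] != 'Z') { std::cerr << "Not a PE file\n"; return 1; }

    uint32_t peOffset = 0;
    f.seekg(0x3C);
    f.read(reinterpret_cast<char*>(&peOffset), 4);

    // The "PE\0\0" signature is immediately followed by the COFF Machine field (2 bytes).
    char sig[4];
    f.seekg(peOffset);
    f.read(sig, 4);
    if (sig[0] != 'P' || sig[1] != 'E') { std::cerr << "Not a PE file\n"; return 1; }

    uint16_t machine = 0;
    f.read(reinterpret_cast<char*>(&machine), 2);

    if (machine == 0x014C)      std::cout << path << ": 32-bit (x86)\n";
    else if (machine == 0x8664) std::cout << path << ": 64-bit (x64)\n";
    else                        std::cout << path << ": other machine type 0x" << std::hex << machine << "\n";
    return 0;
}
Run it against the game's main executable; 32-bit results are the ones that lose GPU PhysX on the 50 series.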

Physx --> off
I think you're missing the point that some games will try to force it on anyway upon detecting an nVidia GPU, without the option to disable it in-game (a toggle was only expected to be needed for AMD, not nVidia, at the time). Not every game offers the option of disabling it; some will just auto-detect AMD vs nVidia, base it on that, and accidentally end up forcing it on in software mode.
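Purely to illustrate the pattern being described (every name here is invented for illustration; this is not any specific game's code), the detection logic in those titles is roughly:
Code:
// Hypothetical sketch of the auto-detect pattern described above.
// All functions are stand-ins, not a real SDK.
#include <iostream>

enum class GpuVendor { Nvidia, Amd, Other };

// Stand-in for reading the adapter's PCI vendor ID (0x10DE = NVIDIA, 0x1002 = AMD).
GpuVendor DetectGpuVendor() { return GpuVendor::Nvidia; } // pretend an NVIDIA card was found

// Stand-in for "can this 32-bit process actually get GPU PhysX?"
// On an RTX 50 card the 32-bit CUDA path is gone, so for a 32-bit game this is false.
bool GpuPhysXUsable() { return false; }

int main()
{
    if (DetectGpuVendor() == GpuVendor::Nvidia) {
        // Many older titles stop here: NVIDIA detected -> force hardware PhysX on,
        // with no in-game toggle, because a fallback was only anticipated for AMD.
        if (!GpuPhysXUsable())
            std::cout << "GPU PhysX forced on but unavailable -> silent CPU fallback, big FPS hit\n";
        else
            std::cout << "GPU PhysX enabled\n";
    } else {
        std::cout << "Non-NVIDIA GPU -> PhysX effects reduced/disabled or CPU path\n";
    }
    return 0;
}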

The real question, though, is "why" it was removed. It's one thing if they removed it in the driver due to lacking actual hardware, but something else entirely if they just arbitrarily crippled a (still perfectly working) feature with zero upside. I own an nVidia GPU too (and am thus hardly some AMD fanboy), but in my 35 years of PC gaming I've never seen the PCMR crowd become as insufferably self-destructive as this, constantly giving nVidia a free pass / making excuses for all the wrong things for all the wrong reasons... (Design a cable with zero power headroom = "You're holding it wrong (tm)"; play older games = "You're playing the 'wrong' games, but even if you weren't, it's still OK to remove features as long as the other brand lacks the same feature, just pretend it was never there".) How the hell did we go from Turing's $1k flagship, which at least had a +75% generational gain, to charging $2k for "+15-25% uplift, MOAR fake frames and a whole bunch of regressions (hardware & software)"?
 
Karma is a bitch innit Huang. You can't half ass an entire market segment and think you'll get away with it.

Either you bring proper products, or your business will go to shit. Live and learn

That's the problem: if they had tried this before the Enterprise/Server/AI boom, he would have had his hands full, but sadly, this won't even faze him.

With framerates equivalent to high-end GPUs from 6 generations ago. Then again, it isn't hard to just go out and buy a second-hand GPU that supports PhysX. I hoard hardware for a reason, man.

Yeah, but now you are forced to make space for a PhysX card. What about small form factor builds, or builds in general that conserve power? Any way you slice it, this was a bad move.

I mean, I can just keep this GTX 1650 for PhysX if I upgrade, but I don't want to, it doesn't belong in my build.
 
That's the problem: if they had tried this before the Enterprise/Server/AI boom, he would have had his hands full, but sadly, this won't even faze him.



Yeah, but now you are forced to make space for a PhysX card. What about small form factor builds, or builds in general that conserve power? Any way you slice it, this was a bad move.

I mean, I can just keep this GTX 1650 for PhysX if I upgrade, but I don't want to, it doesn't belong in my build.

Easy plug/unplug with a cable extender (PCIe x16 tho)
 
Yeah, but now you are forced to make space for a PhysX card. What about small form factor builds, or builds in general that conserve power? Any way you slice it, this was a bad move.

I mean, I can just keep this GTX 1650 for PhysX if I upgrade, but I don't want to, it doesn't belong in my build.
I didn't say it was a good fix. It's just a fix. I sure as hell ain't doing that, despite the fact that I can.
 
I don't find it important because we are talking about 15-year-old games. You don't find it important since you have an AMD GPU that doesn't support it anyway. It's not like AMD GPUs can actually run PhysX, so turning it off on the 5xxx series brings Nvidia cards to parity, it doesn't give them an advantage. Not being able to run heavy RT on AMD cards doesn't bring them to parity with Nvidia.
With that logic, Nvidia should disable RT on the 60 series to bring it to parity with AMD. I don't understand why a feature should be discarded only on the basis of it being in 15-year-old games. It was a selling point of Nvidia GPUs at that time, so I don't see why Nvidia should give up on it now.

And I do have Nvidia GPUs as well. Having an AMD GPU in my main gaming rig at the moment doesn't mean that I only care about AMD tech.
 
@Vayra86 NV doesn't care about consumers anymore; almost all their earnings right now come from AI, and if a problem only affects gamers and not AI performance, Jensen won't do squat about it.

I am currently just waiting for AMD to release their RX 9070 and RX 9070 XT to see if they will be good enough for my 1440p gaming, or if I should rush out and get an RX 7900 XT / XTX while they are still there.

Because as good as my Sapphire Radeon RX 590 Nitro+ Special Edition does in Wolfenstein 2, it won't be up to much in the newer titles that I want to play this year.
That's why I bought a Nitro 7900 XTX, a 5080 and a 5070 Ti. If the 9070 XT is comparable to or better than the 7900 XTX, I'll go that route. If it falls a bit short of the 5080 but is better than the 7900 XTX... I'm buying two, and everything else goes back.

I'm definitely keeping an AMD card though, it's just a matter of which. I'd like to split and have one from each camp, but that all depends on the 9070 XT's perf.
 
With that logic, Nvidia should disable RT on the 60 series to bring it to parity with AMD. I don't understand why a feature should be discarded only on the basis of it being in 15-year-old games. It was a selling point of Nvidia GPUs at that time, so I don't see why Nvidia should give up on it now.
They didn't exactly give up on it. Nvidia didn't explicitly break PhysX support; they made CUDA 64-bit only, which had the side effect of breaking PhysX in 32-bit games. It's a bit like how Apple canned 32-bit support in their OS and lost compatibility with a ton of older games, even though they are pushing macOS to game devs.

Edit: as to why they did it, they probably didn't want to maintain a version of CUDA that sees lower use at large. And the CUDA toolkit can have security flaws that need to be fixed, so it's not like they can just leave it alone. Trillion-dollar companies become trillion-dollar companies because they save money wherever it's deemed "acceptable" to them.
Multiple Vulnerabilities Discovered in NVIDIA CUDA Toolkit
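To make the mechanism concrete: GPU-accelerated PhysX in those old titles sits on top of 32-bit CUDA, so once the 32-bit CUDA path is gone, the GPU probe from a 32-bit process simply fails and the game is left with the CPU solver. A minimal sketch using the CUDA driver API (this would have to be built as a 32-bit binary; the exact failure mode on Blackwell is my assumption):
Code:
// Sketch: how a 32-bit process might probe for CUDA before enabling hardware PhysX.
// Uses the CUDA driver API (cuda.h); link against the driver stub (nvcuda / libcuda).
// On a GPU/driver combo without 32-bit CUDA support, init fails or reports no devices,
// and the application has no choice but to take its CPU path.
#include <cuda.h>
#include <iostream>

int main()
{
    int deviceCount = 0;
    if (cuInit(0) != CUDA_SUCCESS ||
        cuDeviceGetCount(&deviceCount) != CUDA_SUCCESS ||
        deviceCount == 0) {
        std::cout << "No usable CUDA device from this (32-bit) process -> PhysX runs on the CPU\n";
        return 0;
    }
    std::cout << "CUDA available: " << deviceCount << " device(s) -> GPU PhysX possible\n";
    return 0;
}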
 
With that logic, Nvidia should disable RT on the 60 series to bring it to parity with AMD. I don't understand why a feature should be discarded only on the basis of it being in 15-year-old games. It was a selling point of Nvidia GPUs at that time, so I don't see why Nvidia should give up on it now.

And I do have Nvidia GPUs as well. Having an AMD GPU in my main gaming rig at the moment doesn't mean that I only care about AMD tech.
You asked me why it isn't the same as disabling RT on AMD. I explained why.
 
3000 Series - Teaches people backside capacitor knowledge
4000 Series - Teaches people cables, connectors, electrical resistance and contact surfaces knowledge
5000 Series - Teaches people IC burn, PCI-E 5 signal integrity, BSOD, defective ROPs, also (cables, connectors, electrical resistance and contact surfaces)

And the 1000 series taught people that you could buy a video card and it just worked, without having to play with settings in the game.
 
Guys there's been some straying from the topic, please try to keep it on topic, and civil. thanks!
 
It's not present in the TPU GPU database yet. :(

Well, it's part of Strix Halo, maybe that's why ;)



The two articles are not letting the chip run at 125 W: the first is at 65 W and the second at 55 W, which is about half of the full power for Strix Halo. If it can beat the RTX 4070 laptop variant at that, it's still promising.
 