
NVIDIA GeForce RTX 5090 Features 16+6+7 Phase Power Delivery on 14-Layer PCB

Natural progression of the video card. The 3090 ran hot, the 4090 ran hotter, and the 5090 is the hottest (so far).
 
PCIe 5.0 makes things even more complicated. Given the memory chip layout, some lanes probably run under the memory chips and/or other lanes. This is quite unfavourable because these wires require a specific geometry (distance to ground, to other lanes etc).
Well, there is a good number of layers on the PCB for that, but I agree PCIe Gen5 doesn't make it easy. Basically 90% of the I/Os run at very high bandwidth, which is actually rare in a design.
 
You lost the bet? Lol.
Hmm, I'm not him, but apparently the bet didn't take into account DLSS 4, which is a bit ridiculous. Software and DLSS are definitely a huge part of today's GPUs, like it or not. I'm a hardware engineer, but you can't count on hardware alone to push Moore's Law further.

I'd accept the bet. Math doesn't lie. The 4080 Super has 10240 shading units vs the 3090 Ti's 10752, with the 3090 Ti being on a shitty Samsung node. The 5080 has 10752 shading units vs the 4090's 16384 on similar nodes. No way can GDDR7 speed close that gap in rasterization. Sure, it will have better RT (who really cares?) and maybe support new frame-gen tech in DLSS (again, who really cares?), but in raw performance the 4090 will be the 2nd best, only behind the 5090.
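The gap can be sanity-checked from the shader counts alone. A rough sketch, assuming shading units are the dominant factor; it deliberately ignores clocks, IPC, and memory bandwidth:

```python
# Shading-unit counts quoted above (per public spec sheets).
units = {
    "RTX 3090 Ti": 10752,
    "RTX 4080 Super": 10240,
    "RTX 4090": 16384,
    "RTX 5080": 10752,
}

# 4080S vs 3090 Ti: ~5% fewer units, but a much better node closed the gap.
print(units["RTX 4080 Super"] / units["RTX 3090 Ti"])  # ≈0.95

# 5080 vs 4090: similar nodes, so this ~34% unit deficit is hard to close
# with memory speed alone.
print(units["RTX 5080"] / units["RTX 4090"])  # 0.65625
```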
Who cares about RT? Most people who buy an xx90 and actually play games with it?

Who cares about DLSS? Probably fewer people, but again, using the card to its fullest means using it with DLSS, and no, it's not a mess anymore. Most games in 4K run below 200 fps on a 4090 with DLSS when everything else is maxed, and someone who buys an xx90 is likely to have a 4K display at 240 Hz or above. I don't think anything above 240 Hz is useful, but it's still a decent improvement to get more than 144 fps on a 240 Hz screen, which requires DLSS in a decent number of games. And Frame Gen x4 in performance mode with DLSS 4 actually gives better quality than FG x2 in quality mode with DLSS 3. So with DLSS on, the 5080 is above the 4090.

So yes, that guy lost his bet; they were talking about raw rasterization. But the assessment that we shouldn't care about RT and DLSS is absolutely ridiculous, and in a more realistic comparison, the 5080 is indeed above the 4090.
 

I currently own a 5090FE, and I upgraded from a 4090. I would much rather have a 4090 than a 5080.
 
You lost the bet? Lol.
The bet was never formalized, but yeah, I did. No shame in admitting it. And I stand by what I said: a 5080 slower than the 4090 didn't make sense then and doesn't now. No wonder we mostly look at the 5000 series as a meme; outside of the 5090, the generational uplift is questionable at best.

Hmm, I'm not him, but apparently the bet didn't take into account DLSS 4, which is a bit ridiculous. Software and DLSS are definitely a huge part of today's GPUs, like it or not. I'm a hardware engineer, but you can't count on hardware alone to push Moore's Law further.
Nah, if we do take MFG into account then sure, I would be technically correct, but that wasn’t a part of the discussion and I wouldn’t be able to force it even if I tried since most people seem to count “native only, raster only, Final Destination” type of performance.
 
Nah, if we do take MFG into account then sure, I would be technically correct, but that wasn’t a part of the discussion and I wouldn’t be able to force it even if I tried since most people seem to count “native only, raster only, Final Destination” type of performance.

Interpolation is not performance; it's a screen-smoothing effect at best, with two major downsides: added latency and visual artifacts. If Nvidia can create a similar technology where the latency goes down at each step and there are zero additional artifacts vs using DLSS SR by itself, sure, we can call it performance.

The only people who consider it performance are Nvidia and people who have drunk too much green Kool-Aid.
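The latency cost is easy to put numbers on. To interpolate between real frames N and N+1, the card has to hold frame N until N+1 has rendered, so everything shown on screen is delayed by roughly one base frame-time. A simplified model (the one-frame-buffer assumption is mine; real pipelines add smaller overheads on top):

```python
# Simplified frame-interpolation model: output fps scales with the
# generation factor, but the buffered real frame adds latency equal to
# about one base frame-time, regardless of the factor.
def interpolated_output_fps(base_fps: float, factor: int) -> float:
    # FG x2 doubles the displayed frames, x4 quadruples them, etc.
    return base_fps * factor

def added_latency_ms(base_fps: float) -> float:
    # Holding frame N until N+1 arrives delays display by ~one frame time.
    return 1000.0 / base_fps

base = 60.0  # rendered fps before frame generation
print(interpolated_output_fps(base, 4))  # 240.0 fps shown on screen
print(added_latency_ms(base))            # ~16.7 ms of extra latency
```

The fps counter quadruples, but the game still reacts at 60 fps plus that buffering penalty, which is why higher "performance" here comes with worse responsiveness.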
 
Nah, if we do take MFG into account then sure, I would be technically correct, but that wasn’t a part of the discussion and I wouldn’t be able to force it even if I tried since most people seem to count “native only, raster only, Final Destination” type of performance.
That's what I said in my next comment.

Interpolation is not performance; it's a screen-smoothing effect at best, with two major downsides: added latency and visual artifacts. If Nvidia can create a similar technology where the latency goes down at each step and there are zero additional artifacts vs using DLSS SR by itself, sure, we can call it performance.

The only people who consider it performance are Nvidia and people who have drunk too much green Kool-Aid.
Not counting side technologies as a plus is ridiculous, to say the least. You can talk about raw performance and compare cards with each other using that alone; that's an interesting comparison to make. But if you want to say this card is better than that one, you have to take everything into account. I don't play games where a few ms of latency makes any difference, so it doesn't matter to me and I won't talk about that. But I don't care about image differences that I can't see when playing, even if I try.

Make a blind test in normal playing conditions: do several runs with and without MFG (the higher fps could be compensated for by using either a slower card or by downclocking), try to guess which one is played with MFG, and if you can't see any difference, then call it a performance uplift.
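Such a blind test can also be scored properly: if the guesser does no better than chance, MFG passed. A minimal stdlib-only sketch; the trial count and 5% threshold are my arbitrary choices, not from the thread:

```python
from math import comb

def p_at_least(correct: int, trials: int, p: float = 0.5) -> float:
    """Probability of getting `correct` or more guesses right by pure chance."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

def mfg_detectable(correct: int, trials: int, alpha: float = 0.05) -> bool:
    # Significantly above chance -> the viewer could tell MFG on from MFG off.
    return p_at_least(correct, trials) < alpha

# Example: 14 correct out of 20 blind runs.
print(round(p_at_least(14, 20), 3))  # 0.058 -> not significant at 5%
print(mfg_detectable(14, 20))        # False: consistent with guessing
```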
 
Not counting side technologies as a plus is ridiculous, to say the least. You can talk about raw performance and compare cards with each other using that alone; that's an interesting comparison to make. But if you want to say this card is better than that one, you have to take everything into account. I don't play games where a few ms of latency makes any difference, so it doesn't matter to me and I won't talk about that. But I don't care about image differences that I can't see when playing, even if I try.

Make a blind test in normal playing conditions: do several runs with and without MFG (the higher fps could be compensated for by using either a slower card or by downclocking), try to guess which one is played with MFG, and if you can't see any difference, then call it a performance uplift.

I had to reread my comment; not once did I say it shouldn't be counted as a feature or as a plus/bonus, only that it isn't performance. Unless in 2025 adding artifacts and increasing latency equals performance now. I can see Nvidia's PR/marketing department grinning from ear to ear reading $#!+ like this. I can't wait for "x6, but we can't do it on Blackwell" being the main selling point of the 6000 series or whatever they call it.

If it's not noticeable to you, awesome, happy for you. I wish that was the case for me.

I would never buy a weaker GPU with MFG over a stronger one without it, assuming a similar price and all other things being equal. It's a nice cherry on top, and there are scenarios where it makes sense for sure.

I only like frame gen of any kind with a controller on a large OLED from about 8-10 feet away; otherwise the latency and artifacts are way too noticeable to me.
 
I agree that multi frame gen is amazing! But standard frame gen is also amazing and good-enough tech, as it's the most common in games right now. I work at home every day on my PC, which gives me the excuse to waste money on silly overpriced graphics cards.

I owned a 3090, and when the 4090 came out it was very upsetting not to have Frame Gen; it's all I could think about and wish for. Many titles could potentially run so much smoother, and the 4080 could turn that experience to butter. I actually upgraded to a 4080, which gave me Frame Gen, a feature I thought was groundbreaking for how dramatically it improved smoothness. Then I upgraded to a 4090, and now a 5090. Regular Frame Gen seems to be available in almost everything, but multi frame gen is not. When it is, it's great and I'll use it for sure. So I think the RTX 4000 series is still very viable, much more so than the RTX 3000 series was.
 