
NVIDIA GeForce RTX 50 Technical Deep Dive

Joined
Jun 14, 2020
Messages
3,931 (2.34/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I want it to be more expensive on the basis of more raw performance, not features. I'd consider the RT argument relevant if it were necessary to play games, but as long as there are only, what... 3 (?) games with mandatory RT, it's more of a nicety than something to pay extra money for.

Needless to say, this is my opinion; everybody is free to think whatever they want. Why do I feel like I have to put this at the end of every post to avoid personal attacks? Huh. :(
But even if there are 0 games with RT, hardware that's much faster at it has to cost more. The opinion part is where you say "it's not worth it", which is fine. You used a car example before: a car that can do 3 times the speed of mine - I'd consider it worthless because I don't care about that feature - but I can totally understand why it costs more. It costs more to make a car 3 times as fast; it's that simple.

Those are two somewhat contradictory statements. I'll give the first one a bit of credit, but the latter is nothing but a humble personal opinion.
Whether the image quality of one is better than the other... that's, again, for each person to judge.
It also depends on the display size you're viewing the DLSS output on. Your eyes' ability to see differences drops as the individual pixels get smaller.
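To put rough numbers on that, here's a quick pixel-density calculation; the 27-inch panel size is just an assumed example:

```python
# Hypothetical pixel-density (PPI) comparison for two 27-inch panels.
# PPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
import math

def ppi(width_px: int, height_px: int, diag_inches: float) -> float:
    return math.hypot(width_px, height_px) / diag_inches

print(f"27in 1440p: {ppi(2560, 1440, 27):.0f} PPI")  # ~109 PPI
print(f"27in 4K:    {ppi(3840, 2160, 27):.0f} PPI")  # ~163 PPI
```

The smaller the pixels (higher PPI), the harder it is to spot upscaling artifacts at the same viewing distance.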

Waldorf and I gave you an explanation based on facts, not on personal opinions.
When you lose bitmap information in the process, you can't recreate it, only guess or estimate it (interpolate, extrapolate). Thus, native wins over rendering at a lower resolution and upscaling.
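As a toy illustration of that loss - plain Lanczos resampling only, nothing like DLSS's temporal reconstruction; "photo.png" is a placeholder path, and Pillow plus NumPy are assumed:

```python
# Downscale an image, upscale it back, and measure the information lost.
import numpy as np
from PIL import Image

orig = Image.open("photo.png").convert("RGB")
w, h = orig.size

# Render at ~2/3 resolution per axis, then upscale back to native size.
small = orig.resize((w * 2 // 3, h * 2 // 3), Image.LANCZOS)
restored = small.resize((w, h), Image.LANCZOS)

a = np.asarray(orig, dtype=np.float64)
b = np.asarray(restored, dtype=np.float64)
mse = np.mean((a - b) ** 2)
print(f"Mean squared error vs. original: {mse:.2f}")  # > 0: lost detail can only be estimated
```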

We clearly stand on opposite sides when it comes to DLSS. I think we just keep looping the same statements. It's been a nice talk, though.
4K DLSS Q looking better than 1440p native is as close to a fact as you can reasonably get, considering image quality is all in the eye of the beholder.
 
Joined
Jan 14, 2019
Messages
13,550 (6.17/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
But even if there are 0 games with RT, hardware that's much faster at it has to cost more. The opinion part is where you say "it's not worth it", which is fine. You used a car example before: a car that can do 3 times the speed of mine - I'd consider it worthless because I don't care about that feature - but I can totally understand why it costs more. It costs more to make a car 3 times as fast; it's that simple.
Sure, I get it - I just don't get why so many gamers jump on it like money never existed, in the middle of an economic recession. But I guess I never will.

4K DLSS Q looking better than 1440p native is as close to a fact as you can reasonably get, considering image quality is all in the eye of the beholder.
Personally, I only compare X resolution native with X resolution DLSS/FSR (in which case, native always wins). I didn't buy a 1440p monitor to play at 1080p, so why would I compare to a lower res?
 
Joined
Jun 14, 2020
Messages
3,931 (2.34/day)
Sure, I get it - I just don't get why so many gamers jump on it like money never existed, in the middle of an economic recession. But I guess I never will.
Well, that's the opinion part and the eye of the beholder. You don't like DLSS, I don't like native, and so the story goes. So if I said an equivalent Nvidia card has to cost $X more because of DLSS, you can definitely say "who gives a ***k about DLSS, I'm buying the AMD card", but I can also definitely say "well, it has to cost more, because adding / training / whatever the DLSS costs money".

Instead what I usually read is "ngreedia". :D

Personally, I only compare X resolution native with X resolution DLSS/FSR (in which case, native always wins). I didn't buy a 1440p monitor to play at 1080p, so why would I compare to a lower res?
Well, I in fact disagree with your premise (that native looks better), but for the sake of argument let's say you're correct. The problem with that argument is that you're now ignoring the performance part: you're making a non-iso comparison (the two sides aren't running at equal performance), which is scientifically flawed. Of course native SHOULD look better, because performance is much worse. Isn't it more reasonable to ask what gives better image quality at the same performance? In which case DLSS, and even FSR to an extent, always wins, because that's exactly what they're made to do.

Sure, you didn't buy a 1440p monitor to play at 1080p; that's where solutions like supersampling (DLDSR etc.) come in. Downscaling from 4K while using DLSS Q looks better than 1440p native on a 1440p monitor while having pretty much identical performance, so why would you run natively?
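For what it's worth, the "pretty much identical performance" part checks out on paper: DLSS Quality is commonly cited as rendering internally at about 2/3 of the output resolution per axis, so a quick sanity check:

```python
# Internal render resolution of 4K DLSS Quality (~2/3 scale per axis, the commonly cited figure).
out_w, out_h = 3840, 2160
render_w, render_h = out_w * 2 // 3, out_h * 2 // 3
print(render_w, render_h)  # 2560 1440 -> the same pixel count as 1440p native
```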


You also have to consider my POV, which is that for the performance I want, I can only achieve it playing at 1440p native or 4K DLSS Q. Since 4K DLSS Q looks better, wasn't it reasonable for me to buy a 4K monitor and use DLSS instead of buying a 1440p monitor? Didn't DLSS essentially improve image quality? Because that's how I look at it. If there were no DLSS, I would have bought a 1440p OLED instead of a 4K one. I did pretty much the same with my laptop: I was choosing between a 1200p and a 1600p screen, and I thought, why not just get the 1600p and use FSR Q? Since it would look better than 1200p native, it seemed like a no-brainer to me.
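The same back-of-the-envelope math works for the laptop case, assuming "1600p" means 2560x1600 and FSR Quality's commonly cited ~1.5x per-axis upscale factor:

```python
# Internal render resolution of 1600p FSR Quality vs. 1200p native.
out_w, out_h = 2560, 1600
render_w, render_h = int(out_w / 1.5), int(out_h / 1.5)
print(render_w * render_h, 1920 * 1200)  # ~1.82M vs. 2.30M pixels rendered
```

Under those assumptions, the 1600p + FSR Q route actually pushes slightly fewer pixels than 1200p native, so it isn't a performance sacrifice.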
 
Joined
Jan 14, 2019
Messages
13,550 (6.17/day)
Well, that's the opinion part and the eye of the beholder. You don't like DLSS, I don't like native, and so the story goes. So if I said an equivalent Nvidia card has to cost $X more because of DLSS, you can definitely say "who gives a ***k about DLSS, I'm buying the AMD card", but I can also definitely say "well, it has to cost more, because adding / training / whatever the DLSS costs money".
DLSS costs money? Like FSR doesn't? Ehm... okay.

You say premium price for a premium product. I say, premium price for a regular product that AMD has an open solution for anyway. Then you go "but DLSS looks better", and I go "moose muffins, both look like crap compared to native". And the story goes... Better not start it again, I guess. :)

Instead what I usually read is "ngreedia". :D
Not from me, though. ;) (although I do have an opinion on Nvidia's business practices, but that's a story for another day)
 
Joined
Jun 14, 2020
Messages
3,931 (2.34/day)
DLSS costs money? Like FSR doesn't? Ehm... okay.
It was just a very hypothetical example. Forget DLSS. Take the new PCB design in Nvidia's FE cards: it stands to reason that a lot of work went into it, so it should cost more than a random AIB card with a crappy board. It's useless in itself - realistically, who cares about the size of the PCB - but I can understand why it should cost more.
 
Joined
Jan 14, 2019
Messages
13,550 (6.17/day)
Well, I in fact disagree with your premise (that native looks better), but for the sake of argument let's say you're correct. The problem with that argument is that you're now ignoring the performance part: you're making a non-iso comparison (the two sides aren't running at equal performance), which is scientifically flawed. Of course native SHOULD look better, because performance is much worse. Isn't it more reasonable to ask what gives better image quality at the same performance? In which case DLSS, and even FSR to an extent, always wins, because that's exactly what they're made to do.
Native looks better because it puts raw render results onto your screen without the GPU having to "guess" what's hidden in the pixels that aren't rendered.

Sure, you didn't buy a 1440p monitor to play at 1080p; that's where solutions like supersampling (DLDSR etc.) come in. Downscaling from 4K while using DLSS Q looks better than 1440p native on a 1440p monitor while having pretty much identical performance, so why would you run natively?
Possibly. I have no experience on that matter, so I'll just take your point.

You also have to consider my POV, which is that for the performance I want, I can only achieve it playing at 1440p native or 4K DLSS Q. Since 4K DLSS Q looks better, wasn't it reasonable for me to buy a 4K monitor and use DLSS instead of buying a 1440p monitor? Didn't DLSS essentially improve image quality? Because that's how I look at it. If there were no DLSS, I would have bought a 1440p OLED instead of a 4K one. I did pretty much the same with my laptop: I was choosing between a 1200p and a 1600p screen, and I thought, why not just get the 1600p and use FSR Q? Since it would look better than 1200p native, it seemed like a no-brainer to me.
That is not my question when playing a game. My question is "what's the highest setting I can play at my native resolution?". Anything lower than your monitor's native resolution is a blurry mess, so if my GPU can't feed my games at 4K, then I won't buy a 4K monitor.

It was just a very hypothetical example. Forget DLSS. Take the new PCB design in Nvidia's FE cards: it stands to reason that a lot of work went into it, so it should cost more than a random AIB card with a crappy board. It's useless in itself - realistically, who cares about the size of the PCB - but I can understand why it should cost more.
Still not a reason to pay more. There are AIBs with great PCB designs as well.

What I said above still holds: I can find nothing in Nvidia cards that feels more premium than an AMD equivalent (this is true vice versa as well - before anyone jumps at me).
 
Joined
Jan 8, 2017
Messages
9,598 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Joined
Oct 28, 2012
Messages
1,226 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
But AMD could gain from being more aggressive in their software deployment, and stop being so passive, waiting for Adobe to start caring about them.
Just to show how bad the situation can be outside of gaming: the Intel B580 is somehow as fast as the 7900 XTX in 3D in AE. Intel, a newcomer with a tiny market share, somehow gets better support in AE. That's not normal. (The M3 score is held back by an obvious bug in 3D performance; it's otherwise often faster than a 4090 here.) HP is about to sell a Z workstation laptop and mini PC based on Strix Halo. AMD needs to work with Adobe to fix their performance and bring competition to that space.

[attached screenshots: After Effects GPU benchmark scores]
 
Joined
Nov 7, 2017
Messages
1,992 (0.76/day)
Location
Ibiza, Spain.
System Name Main
Processor R7 5950x
Motherboard MSI x570S Unify-X Max
Cooling converted Eisbär 280, two F14 + three F12S intake, two P14S + two P14 + two F14 as exhaust
Memory 16 GB Corsair LPX bdie @3600/16 1.35v
Video Card(s) GB 2080S WaterForce WB
Storage six M.2 pcie gen 4
Display(s) Sony 50X90J
Case Tt Level 20 HT
Audio Device(s) Asus Xonar AE, modded Sennheiser HD 558, Klipsch 2.1 THX
Power Supply Corsair RMx 750w
Mouse Logitech G903
Keyboard GSKILL Ripjaws
VR HMD NA
Software win 10 pro x64
Benchmark Scores TimeSpy score Fire Strike Ultra SuperPosition CB20
@AusWolf
I have zero problem with people and their choices, or the why.
But claiming gamers don't need CUDA is purely based on your assumption that people aren't doing anything outside of gaming with their PC, which isn't the case.

In the last 30+ years I have never met a PC gamer who used their PC for nothing else, with users from 10 to 80 years old.
Of the last 5 gaming rigs I built for customers I knew from work, 3 were going to be used for de/encoding, so if they had listened to your advice ("CUDA isn't needed for gamers"), it would have meant an increase in encoding time of roughly 10-15x.
And none of those had GPUs above the xx70 tier, so it's not like only those with big cards do things like video editing.
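For context on the de/encoding point, here's a rough sketch of how that kind of comparison is typically run. It assumes an ffmpeg build with NVENC support and a placeholder "input.mp4"; strictly speaking NVENC is a dedicated hardware block rather than the CUDA cores themselves, and actual speedups vary a lot with codec, preset, and content:

```python
# Time a CPU (libx264) encode against an NVENC hardware encode via ffmpeg.
import subprocess
import time

def encode(codec: str, outfile: str) -> float:
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, outfile],
        check=True, capture_output=True,
    )
    return time.perf_counter() - start

cpu = encode("libx264", "out_cpu.mp4")     # software encode on the CPU
gpu = encode("h264_nvenc", "out_gpu.mp4")  # Nvidia hardware encoder
print(f"CPU: {cpu:.1f}s  GPU: {gpu:.1f}s  speedup: {cpu / gpu:.1f}x")
```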
 