
NVIDIA GeForce RTX 50 Technical Deep Dive

I want it to be more expensive on the basis of more raw performance, not features. I'd consider the RT argument relevant if it was necessary to play games, but as long as there's only like what... 3 (?) games with mandatory RT, it's more of a nicety than something to pay extra money for.

Needless to say, this is my opinion; everybody is free to think whatever they want. Why do I feel like I have to put this at the end of every post to avoid personal attacks? Huh. :(
But even if there are 0 games with RT, the hardware that's much faster at it has to cost more. The opinion part is where you say "it's not worth it", which is fine. You used a car example before - well, a car that can go 3 times the speed of my car, I'd consider worthless because I don't care about that feature, but I can totally understand why it costs more. It costs more to make a car 3 times as fast; it's that simple.

Those are two somewhat contrasting statements. I'll give the first one a bit of credit, but the latter is nothing more than a humble personal opinion.
Whether the image quality of one is better than the other... that's again for each person to judge.
It also depends on what display size you're viewing the DLSS output on. Your eyes' ability to see differences is lower when the display's pixels are smaller.
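For a sense of scale, here's a rough sketch (hypothetical 27-inch panels assumed, not figures from any review) of how pixel density changes with resolution, which is part of why upscaling artifacts are harder to spot on a denser display:

```python
import math

# Hypothetical example: pixel density (PPI) of two 27-inch panels.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
```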

Waldorf and I gave you an explanation based on facts, not on personal opinions.
When you lose bitmap information in the process, you can't recreate it, only guess or estimate (interpolate, extrapolate). Thus, native wins over rendering at a lower resolution and upscaling.
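As a minimal toy sketch of that point (random data and nearest-neighbour upscaling assumed, nothing to do with any actual DLSS implementation): once samples are dropped, the reconstruction can only estimate them, so it never matches the original exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
native = rng.random((8, 8))          # stand-in for a natively rendered frame

low_res = native[::2, ::2]           # "render" at quarter resolution (drop samples)

# Upscale back by repeating samples (nearest-neighbour): the missing pixels
# can only be estimated, not recovered.
upscaled = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)

print("mean absolute error vs native:", np.abs(native - upscaled).mean())  # > 0
```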

We clearly stand on opposite sides when it comes to DLSS. I think we just keep looping the same statements. It's been a nice talk, though.
4K DLSS Q looking better than 1440p native is as close to a fact as you can reasonably get, considering image quality is all in the eye of the beholder.
 
But even if there are 0 games with RT, the hardware that's much faster at it has to cost more. The opinion part is where you say "it's not worth it", which is fine. You used a car example before - well, a car that can go 3 times the speed of my car, I'd consider worthless because I don't care about that feature, but I can totally understand why it costs more. It costs more to make a car 3 times as fast; it's that simple.
Sure, I get it - I just don't get why so many gamers jump on it like money never existed, in the middle of an economic recession. But I guess I never will.

4K DLSS Q looking better than 1440p native is as close to a fact as you can reasonably get, considering image quality is all in the eye of the beholder.
Personally, I only compare X resolution native with X resolution DLSS/FSR (in which case, native always wins). I didn't buy a 1440p monitor to play at 1080p, so why would I compare to a lower res?
 
Sure, I get it - I just don't get why so many gamers jump on it like money never existed, in the middle of an economic recession. But I guess I never will.
Well, that's the opinion part and the eye of the beholder. You don't like DLSS, I don't like native, and so the story goes. So if I said an equivalent Nvidia card has to cost $X more because of DLSS, you could definitely say "who gives a ***k about DLSS, I'm buying the AMD card", but I could also definitely say "well, it has to cost more, because adding/training/whatever for DLSS costs money".

Instead what I usually read is "ngreedia". :D

Personally, I only compare X resolution native with X resolution DLSS/FSR (in which case, native always wins). I didn't buy a 1440p monitor to play at 1080p, so why would I compare to a lower res?
Well, I in fact disagree with your premise (that native looks better), but for the sake of argument let's say you're correct. The problem with that argument is that you're now ignoring the performance part; you're making a non-ISO comparison, which is scientifically flawed. Of course native SHOULD look better, because performance is much worse. Isn't it more reasonable to ask what gives better image quality at the same performance? In which case, DLSS (and even FSR to an extent) always wins, because that's exactly what they're made to do.

Sure, you didn't buy a 1440p monitor to play at 1080p - that's where solutions like supersampling (DLDSR etc.) come in. Downscaling from 4K while using DLSS Q looks better than 1440p native on a 1440p monitor while having pretty much identical performance, so why would you run natively?


You also have to consider my POV, which is that for the performance I want, I can only achieve it playing 1440p native or 4K DLSS Q. Since 4K DLSS Q looks better, wasn't it reasonable for me to buy a 4K monitor and use DLSS instead of buying a 1440p monitor? Didn't DLSS essentially improve image quality? Because that's how I look at it. If there was no DLSS, I would have bought a 1440p OLED instead of a 4K one. I did pretty much the same with my laptop: I was choosing between a 1200p and a 1600p screen and thought, why not just get the 1600p and use FSR Q? Since it would look better than 1200p native, it seemed like a no-brainer to me.
 
Well, that's the opinion part and the eye of the beholder. You don't like DLSS, I don't like native, and so the story goes. So if I said an equivalent Nvidia card has to cost $X more because of DLSS, you could definitely say "who gives a ***k about DLSS, I'm buying the AMD card", but I could also definitely say "well, it has to cost more, because adding/training/whatever for DLSS costs money".
DLSS costs money? Like FSR doesn't? Ehm... okay.

You say premium price for a premium product. I say, premium price for a regular product that AMD has an open solution for anyway. Then you go "but DLSS looks better", and I go "moose muffins, both look like crap compared to native". And the story goes... Better not start it again, I guess. :)

Instead what I usually read is "ngreedia". :D
Not from me, though. ;) (although I do have an opinion on Nvidia's business practices, but that's a story for another day)
 
DLSS costs money? Like FSR doesn't? Ehm... okay.
It was just a very hypothetical example. Forget DLSS. Let's take the new PCB design of Nvidia's FE cards. It stands to reason that a lot of work went into it, so it should cost more than a random AIB card with a crappy board. It's useless in itself - because realistically, who cares about the size of the PCB - but I can understand why it should cost more.
 
Well, I in fact disagree with your premise (that native looks better), but for the sake of argument let's say you're correct. The problem with that argument is that you're now ignoring the performance part; you're making a non-ISO comparison, which is scientifically flawed. Of course native SHOULD look better, because performance is much worse. Isn't it more reasonable to ask what gives better image quality at the same performance? In which case, DLSS (and even FSR to an extent) always wins, because that's exactly what they're made to do.
Native looks better because it puts raw render results onto your screen without the GPU having to "guess" what's hidden in the pixels that aren't rendered.

Sure, you didn't buy a 1440p monitor to play at 1080p - that's where solutions like supersampling (DLDSR etc.) come in. Downscaling from 4K while using DLSS Q looks better than 1440p native on a 1440p monitor while having pretty much identical performance, so why would you run natively?
Possibly. I have no experience on that matter, so I'll just take your point.

You also have to consider my POV, which is that for the performance I want, I can only achieve it playing 1440p native or 4K DLSS Q. Since 4K DLSS Q looks better, wasn't it reasonable for me to buy a 4K monitor and use DLSS instead of buying a 1440p monitor? Didn't DLSS essentially improve image quality? Because that's how I look at it. If there was no DLSS, I would have bought a 1440p OLED instead of a 4K one. I did pretty much the same with my laptop: I was choosing between a 1200p and a 1600p screen and thought, why not just get the 1600p and use FSR Q? Since it would look better than 1200p native, it seemed like a no-brainer to me.
That is not my question when playing a game. My question is "what's the highest setting I can play at on my native resolution". Anything lower than your monitor's native is a blurry mess, so if I can't feed 4K in my games with my GPU, then I won't buy a 4K monitor.

It was just a very hypothetical example. Forget DLSS. Let's take the new PCB design of Nvidia's FE cards. It stands to reason that a lot of work went into it, so it should cost more than a random AIB card with a crappy board. It's useless in itself - because realistically, who cares about the size of the PCB - but I can understand why it should cost more.
Still not a reason to pay more. There are AIBs with great PCB design as well.

What I said above still holds: I can find nothing in Nvidia cards that feels more premium than an AMD equivalent (this is true vice versa as well - before anyone jumps at me).
 
But AMD could gain from being more aggressive in their software deployment, and stop being so passive, waiting for Adobe to start caring about them.
Just to show how bad the situation can be outside of gaming: the Intel B580 is somehow as fast as the 7900 XTX in 3D in AE. Intel, who's a newcomer with a tiny market share, somehow gets better support in AE. That's not normal. (The M3 score is held back by an obvious bug in 3D performance; it's otherwise often faster than a 4090 here.) HP is about to sell a Z workstation laptop and mini PC based on Strix Halo. AMD needs to work with Adobe to fix their performance and bring competition to that space.

[Attached: After Effects benchmark screenshots]
 
@AusWolf
I have zero problems with people and their choice, or the why.
But claiming gamers don't need CUDA is purely based on your assumption that people aren't doing anything outside of gaming with their PC, which isn't the case.

In the last 30+ years, I have never met a PC gamer who used their PC purely for gaming and nothing else, with users from 10 to 80 years old.
Of the last 5 gaming rigs I built for customers I knew from work, 3 were going to be used for de/encoding, so if they had listened to your advice (CUDA isn't needed for gamers), it would have meant an increase in encoding time of ~10-15x.
And none of those had GPUs above the xx70, so it's not like only those with big cards do things like video editing.
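For context on that kind of speedup claim, here is a rough sketch of how one could time CPU vs GPU-accelerated encoding (assumes ffmpeg is installed and an NVIDIA card with NVENC support; input.mp4 and the output names are placeholders, and the actual ratio depends heavily on preset, resolution, and hardware):

```python
import subprocess, time

def encode(codec: str, outfile: str) -> float:
    """Encode input.mp4 with the given video codec and return elapsed seconds."""
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, outfile],
        check=True,
    )
    return time.time() - start

cpu_time = encode("libx264", "out_cpu.mp4")     # software encode on the CPU
gpu_time = encode("h264_nvenc", "out_gpu.mp4")  # hardware encode via NVENC
print(f"CPU: {cpu_time:.1f}s  GPU: {gpu_time:.1f}s")
```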
 
@AusWolf
I have zero problems with people and their choice, or the why.
But claiming gamers don't need CUDA is purely based on your assumption that people aren't doing anything outside of gaming with their PC, which isn't the case.
Sure, use your PC for whatever you need, and buy a GPU with CUDA. But if you don't, if you only game, then you don't need it. Simple.

In the last 30+ years, I have never met a PC gamer who used their PC purely for gaming and nothing else, with users from 10 to 80 years old.
Of the last 5 gaming rigs I built for customers I knew from work, 3 were going to be used for de/encoding, so if they had listened to your advice (CUDA isn't needed for gamers), it would have meant an increase in encoding time of ~10-15x.
And none of those had GPUs above the xx70, so it's not like only those with big cards do things like video editing.
And in the last 30+ years, I haven't met anyone who used their PC for anything other than web surfing, watching films and gaming. I have one friend who records game clips sometimes, but it's not a necessity for him, and he can do it just fine with OBS through the CPU, or whatever AMD GPUs have for that purpose (ReLive, I think - I haven't used it in ages so I don't know).

It appears that our circles of people we know are very different, and I find nothing wrong with that.
 
@AusWolf
Sure, but it shows that assuming what the "whole" gamer market wants, based on our small sample size, isn't a fact - and that includes CUDA.
 
@AusWolf
Sure, but it shows that assuming what the "whole" gamer market wants, based on our small sample size, isn't a fact - and that includes CUDA.
It also shows that what the gamer market wants and what the gamer market needs are entirely different things.
 
Very true.
And the 45% in that poll is mostly AMD users, because AMD's upscaler isn't always that great - that's why they use native.

Most RTX users of course use the upscaler, because DLSS is just great.


The majority use an upscaler.
Can't you see it?
39% + 9% + 4% + 3% = 55%
But most AMD users won't use it, so that's why native is at 45%. FSR is not as good, so it's better to use native when gaming on an AMD GPU.

Also, there were only under 20k votes, while Nvidia sells millions of GPUs.
Global results show 80%+ of RTX users use the upscaler.
And Nvidia dominates in market share, so there's a huge number of DLSS users.

But you don't like it, so don't use it; it's not good on AMD GPUs anyway, so it's okay to use native then.


DLSS looks so much better, plus a huge FPS boost.

But people are brand-loyal; even when they see something great, they don't like it because it's the wrong brand.


100% sure an AMD user.
Maybe try Nvidia and DLSS someday, to really see it for yourself.
Quality is better than native.


Okay, another AMD user again...

Have fun playing 4K on a 3060 Ti or 5700 XT.


A $2,000 GPU costs only $2,000; you buy it only once, sell it after 2 years, and buy a new one.
People just complain too much about prices - don't buy it if you can't or don't want to.

There are cheap GPUs too; let people buy and spend their hard-earned money if it makes them happy.
Why do you care what people buy?

I didn't say you can't buy it and waste your money. I said it is irrelevant.

I'm playing the exact same games with the same visuals but paying 1/4 the money.

Use DLSS Performance on a 4K screen instead of DLSS Balanced...
 
Why do I get the feeling that the introduction of the Cooperative Vectors API for DirectX and Slang will end up being much more significant than we now realize?
Hope not, as a Linux/Vulkan user. Unless DXVK will be able to do it.
 
Can Nvidia cut down the 5090 die for a 5080 Ti or Ti Super, building a 6-channel model? There is a big gap between the 5090 and the 5080; the 5080 can't pass 1000 GB/s, which is a bad match for a $1,000 price.
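For a rough sense of that gap, here's a sketch of the arithmetic (assuming GDDR7 at roughly 28-30 Gbps per pin and that "6-channel" means six 64-bit memory partitions, i.e. a 384-bit bus; bandwidth = per-pin data rate x bus width / 8):

```python
# Rough memory-bandwidth sketch; the data rates are assumptions, not official specs.
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gb_s(30, 256))  # 256-bit, 5080-class: 960 GB/s
print(bandwidth_gb_s(30, 384))  # 384-bit "6-channel" cut-down: 1440 GB/s
print(bandwidth_gb_s(28, 512))  # 512-bit, 5090-class: 1792 GB/s
```

A 384-bit cut of the 5090 die would land well above the 1000 GB/s mark while still staying clearly below the full 5090.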
 