
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to full monopoly)

You're thinking of DLDSR or DSR which performs downsampling from a higher rendered resolution. DLAA is just DLSS where the input and output are the same (native) resolution, or in other words the render resolution is 100%.
No, DLAA is the just AA algo from DLSS used on a native resolution render, you're perhaps thinking of DSR/DLDSR.
Ah I see. Thanks. Still, the point remains.

This goes back to a point I've made before: some people are against the tech from a purely academic "what it is doing" standpoint, rather than judging the actual results. The proof is in the pudding.
Turn on - take a look - is it better? Simple as that. No need to deep dive into pipelines and shit to form an opinion on image quality.

Every competent reviewer disagrees with you, including Wizard
And your point is...?
 
And your point is...?
You’re wrong. I imagine willfully, going by the fact that you are told this practically daily.
 
Turn on - take a look - is it better? Simple as that. No need to deep dive into pipelines and shit to form an opinion on image quality.
You're describing exactly why DLSS is such a popular and desirable feature.
 
Pssttt they are still known as ATi internally
It's funny when people think that a company got completely changed, including everyone who works there down to the last janitor, just because its name changed.

You’re wrong.
No, you're wrong. Where's my prize?
Darn, what a stupid game. :roll:

You're describing exactly why DLSS is such a popular and desirable feature.
Maybe people see something entirely different than I do. Or maybe they're not as sensitive to blur as they are to low FPS (I'm the other way around), I don't know. But I see what I see and think what I think and that's that.
 
DLSS = Lower resolution image upscaled to your monitor's native.
DLAA = Higher resolution image downscaled to your monitor.
DLAA is not higher resolution, it uses the exact same resolution.
The entire pipeline is as follows:
- Have a desired output resolution, let's say 4k for this example
- Shove an image as input for your DLSS pipeline.
- For DLSS performance, the input resolution is 1080p (this is what your GPU actually renders).
- For DLSS balanced, input res is 1253p
- For DLSS quality, input res is 1440p
- For DLAA, input res is 4k (same as the output)
- Get output image at desired resolution
that just happen to work well together.
They don't work together. DLAA is a specific case of DLSS where the scaling factor is set to 1x.
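To put the resolution math above in one place, here's a minimal sketch (the function and table are my own illustration; the scale factors are the commonly cited preset values and can vary slightly per title and SDK version):

```python
# Rough sketch of the DLSS/DLAA resolution math, assuming the commonly
# cited preset scale factors (exact values can differ per title/SDK).
PRESET_SCALE = {
    "DLAA":        1.0,    # render resolution = output resolution, AA only
    "Quality":     0.667,  # 4K output -> ~1440p internal render
    "Balanced":    0.58,   # 4K output -> ~1253p internal render
    "Performance": 0.5,    # 4K output -> 1080p internal render
}

def render_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Return the internal resolution the GPU actually renders before upscaling."""
    scale = PRESET_SCALE[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in PRESET_SCALE:
    w, h = render_resolution(3840, 2160, preset)
    print(f"{preset:12s} renders {w}x{h}, outputs 3840x2160")
```

At 4K output, Performance lands exactly on 1920x1080 and Quality on roughly 2560x1440, which is why "4K DLSS Quality" and "1440p native" start from about the same pixel count, and why DLAA is just the same pipeline with the scale factor pinned to 1.0.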


And I just noticed others have already explained the above. Well, I will not let my 5 minutes go to waste and post it anyways lol
 
It's funny when people think that a company got completely changed, including everyone who works there down to the last janitor, just because its name changed.
Are you kidding me?

It was a Canadian company, here in Canada.

Wtf lol..

ATI Technologies - Wikipedia

Look at you now that you have a modern GPU.
 
Maybe people see something entirely different than I do. Or maybe they're not as sensitive to blur as they are to low FPS (I'm the other way around), I don't know. But I see what I see and think what I think and that's that.
iirc your first-hand experience with DLSS was with an RTX 2070 @ 1080p in like 2020-21. I can see exactly why you feel the way you do: your experience was among the worst-case scenarios for any upscaling, and it was based on pre-2.5 DLSS, so it makes total sense why it was blurry given those qualifying factors.

So yeah in 2025 people are seeing something entirely different to what you did.
 
Did they also fire their engineers? Or did they get relocated?
Surely the high end AMD system you are using right now is a strong enough tool to search with?

Most were let go afaik. As well as most of the other staff.
 
Surely the high end AMD system you are using right now is a strong enough tool to search with?

Most were let go afaik. As well as most of the other staff.
Sure, I can search. I just thought we were having a conversation here. I was also hoping that either you, or someone else knows the story and can give a concise answer. I don't know why an honest question deserves a laughing emoji and such a condescending answer, but it's kind of rude.

iirc your first-hand experience with DLSS was with an RTX 2070 @ 1080p in like 2020-21. I can see exactly why you feel the way you do: your experience was among the worst-case scenarios for any upscaling, and it was based on pre-2.5 DLSS, so it makes total sense why it was blurry given those qualifying factors.

So yeah in 2025 people are seeing something entirely different to what you did.
Yeah, I'm pretty sure it's decent at 4K. I tried some FSR on my HTPC on a 1050 Ti and it wasn't too bad, especially compared to my monitor. Upscaling definitely has its uses, just "using it all the time whether you need it or not" isn't one of them, imo.
 
Yeah, I'm pretty sure it's decent at 4K.
Oh it's better than decent, it's better than Native+TAA! :toast: And that was true for the old CNN model (DLSS3), which FSR4 now appears to be a broad match for and even do some things slightly better, good times for all. Add to that lowering power consumption and heat, increasing performance and it's a win that I'll use every time, unless I have in excess of the performance I need for DLAA @ 120fps locked.
 
I don't know where you've been during the last few Nvidia generations when you needed a new card every time you wanted to use a new DLSS version.
Huh? That was never the case.

Oh it's better than decent, it's better than Native+TAA! :toast: And that was true for the old CNN model (DLSS3), which FSR4 now appears to be a broad match for and even do some things slightly better, good times for all. Add to that lowering power consumption and heat, increasing performance and it's a win that I'll use every time, unless I have in excess of the performance I need for DLAA @ 120fps locked.
DLSS and to an extent FSR give you better image quality at similar framerate to native. What's a crutch is raster, being used by slow cards that can't do RT.

The below screenshots are running at the same framerate; one is native, the other is using DLSS Q. So at a similar framerate, DLSS gives far, far better image quality. I don't understand why this discussion still keeps going on. The results speak for themselves; what does each person's opinion add to the discussion that the below screenshots don't already address?


Native
[screenshot: native 1440p, 36 fps]

DLSS Q
[screenshot: 4K DLSS Quality, 36 fps]
 
Oh it's better than decent, it's better than Native+TAA! :toast: And that was true for the old CNN model (DLSS3), which FSR4 now appears to be a broad match for and even do some things slightly better, good times for all. Add to that lowering power consumption and heat, increasing performance and it's a win that I'll use every time, unless I have in excess of the performance I need for DLAA @ 120fps locked.
Well, sitting 2 metres away from a 4K TV, there isn't much difference between native 4K and FSR/DLSS performance mode, that's for sure. Like I said, the technology has its uses. :)

Huh? That was never the case.

DLSS and to an extent FSR give you better image quality at similar framerate to native. What's a crutch is raster, being used by slow cards that can't do RT.

The below screenshots are running at the same framerate; one is native, the other is using DLSS Q. So at a similar framerate, DLSS gives far, far better image quality. I don't understand why this discussion still keeps going on. The results speak for themselves; what does each person's opinion add to the discussion that the below screenshots don't already address?

Native


DLSS Q

You're comparing to a shitty TAA implementation, that's why.
 
Well, sitting 2 metres away from a 4K TV, there isn't much difference between native 4K and FSR/DLSS performance mode, that's for sure. Like I said, the technology has its uses. :)




You're comparing to a shitty TAA implementation, that's why.
DLSS 3 is FG, the upscaling part of it works in every nvidia card since Turing.

It doesn't matter what I'm comparing; name a good TAA implementation, and there is zero chance that 1440p native will look better than 4K DLSS Q. Give me a game with a good native image and I'll compare it as well.
 
DLSS 3 is FG, the upscaling part of it works in every nvidia card since Turing.
FG isn't very interesting, I'll give you that.

It doesn't matter what I'm comparing; name a good TAA implementation, and there is zero chance that 1440p native will look better than 4K DLSS Q. Give me a game with a good native image and I'll compare it as well.
Compare a game with no TAA at all.
 
FG isn't very interesting, I give you that.


Compare a game with no TAA at all.
I genuinely do not know any game without baked-in TAA, but I can assure you it won't make a difference. It's impossible for native to look better at the same framerate: 4K DLSS Q already starts from the baseline quality of 1440p native (its internal render is roughly 2560x1440), and then you add the DLSS magic on top.
 
I genuinely do not know any game without baked-in TAA, but I can assure you it won't make a difference. It's impossible for native to look better at the same framerate: 4K DLSS Q already starts from the baseline quality of 1440p native (its internal render is roughly 2560x1440), and then you add the DLSS magic on top.
Kingdom Come Deliverance 2 doesn't force TAA, I think.
 
Ok, I'll try that in the afternoon, I'll post screenshots. Will you accept the results though?
Sure. Let's make it a blind test. :) I'll also try FSR 4 once I get my card running properly in Linux. :laugh:
 
Sure. Let's make it a blind test. :) I'll also try FSR 4 once I get my card running properly in Linux. :laugh:
Wait, we're all discussing upscaling and reasons for market share, and the 9070XT... doesn't work? I thought there were some niggles, but not outright unusable? I mean, I know they work in Windows, so the vast majority of buyers are covered.
 
Wait, we're all discussing upscaling and reasons for market share, and the 9070XT... doesn't work? I thought there were some niggles, but not outright unusable? I mean, I know they work in Windows, so the vast majority of buyers are covered.
Now that FSR 4 is good, upscaling will become important, and since AMD is stuck with the same VRAM as a 5-year-old $550 card (the 6800), VRAM doesn't matter anymore
 
Wait, we're all discussing upscaling and reasons for market share, and the 9070XT... doesn't work? I thought there were some niggles, but not outright unusable? I mean, I know they work in Windows, so the vast majority of buyers are covered.

Driver support for RDNA4 in Linux is limited at the moment :) Most distros don't have the required Mesa versions and firmware packages available.
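If you want a quick way to see which Mesa build your distro is actually shipping, something like this works (a rough sketch; it just shells out to glxinfo from mesa-utils, and the specific version RDNA4 needs isn't stated here):

```python
import shutil
import subprocess

# Rough sketch: print the OpenGL renderer/version strings reported by Mesa.
# Requires glxinfo (usually packaged as mesa-utils or mesa-demos).
if shutil.which("glxinfo") is None:
    print("glxinfo not found; install mesa-utils/mesa-demos first")
else:
    out = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "OpenGL renderer string" in line or "OpenGL version string" in line:
            print(line.strip())
```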
 
Now that FSR 4 is good, upscaling will become important, and since AMD is stuck with the same VRAM as a 5-year-old $550 card (the 6800), VRAM doesn't matter anymore
Some are saying FSR4 has comparable image quality to DLSS4; even Digital Foundry is impressed with FSR4.
And VRAM is still important. IMO 16GB is plenty for a midrange card; 5 years ago Nvidia was still stuck on 8GB.
 