
AMD FidelityFX FSR 3.1

ELI5 why should I pay copious amounts of money for a decent GPU and not ask it to be good at, say, rasterizing? Who cares about DLSS or whatever competitor is out there; if you buy a GPU for gaming you want strong raster performance, end of.
 
Raster vs RT is one thing and native vs upscaling is another.
 
But you can't even have RT without raster. It is sad that people think raster is not the foundation of 3D gaming, and that ray tracing wasn't introduced alongside polygon tech way back in 1989. Native vs upscaling used to be about the monitor's ability to output the desired resolution (those Korean 1440p no-scaler panels were great), but Nvidia made it into witchcraft by forcing it into GPU compute. Now TPU can do a 7900 XTX review and show that it is one of the fastest cards you can buy, period, and it still won't be good enough for some because it does not have DLSS 3.5 support, for $1000 more where I live. It won't matter though, because once a narrative starts it is hard to change mindsets, so no matter what AMD does, their FSR implementation will never be good enough for some.
 
I meant raster performance vs rt performance.
 
If I already have native ray tracing, why would I care about an extra layer that is in hardware? I play at 4K and that is plenty sharp for me. Once you get a certain quality of monitor, you know you can turn the saturation and contrast up for an excellent looking screen anyway; add HDR (when it works) and OMG is the acronym. You know what I mean when your 7 year old wants to watch Bluey on Daddy's PC because the colours are so rich vs the 4K TV. Which brings me back to my initial point: I don't buy GPUs thinking about FSR support. I bought a 7900 XT because it is much faster than a 6800 XT and has more VRAM. That meant gaming on my FV43U went from turning some games down to 1440p to running everything at 4K, with a little too much time enjoying my library since I work from home, so you know what happens. Strategy games like TWWH3 or Rogue Trader during work hours, an endless session of Orcs Must Die 3 while having lunch, and then a championship in AMS2 or a race in LMU once work is finished. When duties are done it is SpaceBourne 2, Everspace 2 or whatever I fancy at the time.
 
When you have hardware capable of the performance, downsampling and DLAA are superior to native with TAA. I have always said that if you aren't satisfied with the native sampled image, then you aren't going to be satisfied with super sampling from an inferior resolution. So if you don't like 1080p native, you are likely not going to like 1440p DLSS Quality or 4K DLSS Performance. If you are satisfied with native 1440p, then 4K DLSS set to Quality is going to improve that image. If you are satisfied with native 4K, then the only thing subjectively superior is DLAA or downsampling from a higher resolution. Super sampling is just a new anti-aliasing technique that replaced the traditional brute-forced anti-aliasing techniques of the past.
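To put rough numbers on those presets, here is a minimal sketch of the internal render resolution behind each one, assuming the commonly documented per-axis ratios (nominal values; individual games and driver overrides can differ):

```python
# Nominal per-axis scale factors for the usual DLSS/FSR presets.
PRESET_SCALE = {
    "DLAA / native": 1.0,
    "Quality": 1 / 1.5,            # ~66.7% per axis
    "Balanced": 0.58,              # DLSS uses 58%; FSR's 1/1.7 is ~59%
    "Performance": 1 / 2.0,        # 50% per axis
    "Ultra Performance": 1 / 3.0,  # ~33.3% per axis
}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    # The GPU samples the scene at this resolution; the upscaler then
    # reconstructs the full output resolution from it.
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

# 4K output on the Quality preset samples the scene at exactly 1440p:
print(internal_res(3840, 2160, "Quality"))  # (2560, 1440)
```

That 1440p internal figure is why the posts below keep comparing 4K DLSS Quality against native 1440p: the sampling cost is roughly the same.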
 
But even in your example, using DLSS / DLAA is always better than native, which is the point. Why would you ever use native if you have the choice to use DLSS / DLAA?
 
I do NOT care about DLSS 3.5 or any other upscaling feature/technique.

I bought a 7900XT
You've made the right decision based on your needs.

Here, I suggested both the 4070 Super and the 7900 GRE, but the OP said that they do NOT care about ray tracing and that they can afford a 7900 XT, and I said "may that 7900 XT serve you well".

I've got a friend who is interested in playing the W3CE at 4K, so I've suggested an RTX 4070 Ti Super to him based on his preference for RT-heavy titles.

RT on vs RT off is a matter of preference, simple as that.
 
But even in your example, using DLSS / DLAA is always better than native, which is the point. Why would you ever use native if you have the choice to use DLSS / DLAA?
While it is all subjective, temporal AA at 4K native is superior to 4K DLSS set to Quality. Why would a gamer choose not to run DLAA? Some titles don't have it. Why would some gamers not downsample using DLDSR? Because current hardware is not capable of running 8K DLSS set to Quality yet. Although some on TPU use 8K downsampling with DLSS Ultra Performance/Performance and are satisfied with that quality vs 4K DLAA. We can attempt to use objective data to say what's best, but again, this is all subjective on a personal computer (PC).
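For concreteness, here is the arithmetic behind that 8K-downsampling-plus-Ultra-Performance combo, as a sketch assuming the usual 1/3-per-axis Ultra Performance ratio and a 4x (2x per axis) DSR factor:

```python
# "Render low, reconstruct high, resolve back down": DSR/DLDSR presents
# an 8K target, DLSS Ultra Performance samples at a third of it per
# axis, and the 8K result is filtered back down to the 4K panel.
dsr_w, dsr_h = 7680, 4320                    # 8K output target set via DSR
render_w, render_h = dsr_w // 3, dsr_h // 3  # DLSS UP internal render
panel_w, panel_h = dsr_w // 2, dsr_h // 2    # physical 4K panel

print((render_w, render_h), (panel_w, panel_h))
# (2560, 1440) (3840, 2160)
```

Note the internal render lands at 2560x1440, the same sampling cost as plain 4K DLSS Quality, just with an extra reconstruct-then-downsample pass on top.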
 
I specifically said if given the choice. Obviously if the game doesn't support it, you don't have a choice. In that case you'd have to use native resolution, which is inferior with or without TAA.

My question is more like: why are people against a technique that gives you higher image quality under any and all circumstances? Doesn't make sense to me at all.
 
DLSS/FSR has its place; I am not against supersampling. Then again, I'll admit I am biased towards the best picture quality without compromise. You say the image quality is subjectively better. Other people say it's subjectively worse. That is why it's subjective on a personalized computer (PC). PC was never about one size fits all. That is how I love it.
To make it more objective, I view supersampling as a replacement for the traditional brute-forced anti-aliasing techniques of the past, which is exactly what it is. Some people would rather play with 4K TAA than 4K DLSS set to Quality if the performance allows. In Cyberpunk, 4K DLSS set to Quality has inferior texture quality compared to native 4K with the game's own anti-aliasing, but unfortunately native isn't playable. So 4K DLSS Quality is the only choice on current hardware for maximum quality at a playable experience (subjectively speaking). I don't play at 1440p, so for me the sampled image behind 4K DLSS set to Quality (a 1440p sample) looks worse than the native 4K image.
 
Well, if someone says 4K DLSS Quality looks worse than native 1440p, they are obviously wrong, and that's not really subjective; the difference is that obvious. I mean, if it weren't better, then DLSS / FSR wouldn't have existed in the first place.

The problem is most people view DLSS / FSR as a technique that increases FPS, so they are comparing 4K native vs 4K FSR / DLSS, which is kinda silly. The proper image quality comparison is: you have a target framerate and then see which technique gets you higher IQ. And that's never native, as you know yourself.
 

4K DLSS Quality would be upscaled from 1440p, so of course it wouldn't look worse than 1440p native. There is no need for subjective/objective assessments; it's literally the same base resolution. Very astute of you to notice this basic fact, which doesn't mean anything.
 
Given the fact that performance will be very close, of course that basic fact means something. It means that even if your card is only comfortable at 1440p, you should still get a 4K monitor and use DLSS / FSR, because you'll get better image quality. It's not complicated.
 
Well, if someone says 4K DLSS Quality looks worse than native 1440p, they are obviously wrong, and that's not really subjective; the difference is that obvious. I mean, if it weren't better, then DLSS / FSR wouldn't have existed in the first place.

The problem is most people view DLSS / FSR as a technique that increases FPS, so they are comparing 4K native vs 4K FSR / DLSS, which is kinda silly. The proper image quality comparison is: you have a target framerate and then see which technique gets you higher IQ. And that's never native, as you know yourself.
If someone says that? But who is saying that 1440p native is worse than 4K DLSS set to Quality? The performance is similar, and yes, the image quality is better in that comparison. Hence why, from my perspective, upscaling should use the minimum resolution the user is used to as the sample. Once you identify the minimum resolution you are satisfied with, because performance is often tied to that resolution and image quality, you can use upscaling to improve your image quality. So if you are not satisfied with 720p, then 4K Ultra Performance with its high frame rates might not appeal to you either. There is no free lunch unless you are satisfied with your image quality and frames per second at native; then you can use upscaling to improve your image quality with only a minimal performance hit.
 
The ghosting that is mentioned is not present on AMD GPUs, so maybe you should take that complaint to Nvidia, which nerfs FSR to make its proprietary DLSS look better?
 
I watched Tim's video on DLSS vs FSR vs XeSS last night and there was definitely some ghosting shown on AMD GPUs in Ratchet and Clank.

I am quietly wondering if developers will lean on increased upscaling fidelity to compensate for (or possibly encourage) slacking off in the optimisation department.
 
But that's a completely unfair comparison because DLSS gives you a much higher framerate, which in turn allows you to up the resolution.
I think you misunderstand how DLSS & FSR work. You only get a jump in framerate by reducing the effective resolution. Raise the resolution, lower the framerate. @wolf seems to be having the exact same problem understanding this dynamic.

It's not a matter of opinion, but sure. :toast:
You're right, it's not. Hush up about it until you understand how it all works.
 
Now who's talking down? And no, I won't, because for one that's rude, and it seems to me that it's you who doesn't understand how it works, or perhaps you've misunderstood me, so I'll give you the benefit of the doubt.

There are maybe a couple of separate things going on here, so I'll be really clear about it:

Preferring native resolution rendering to upscaling from a lower res to native? Absolutely a matter of preference and opinion, and it's highly variable; people can like what they want, zero argument from me there.

That upscaling can look better is not what I am saying here.

Rendering a video game at the native resolution of your monitor is the absolute best, untoppable image quality you can expect from that game on your monitor? Objectively, demonstrably false. You can render the game at an even higher resolution and downsample it to your monitor's native resolution, giving better image quality.
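As a minimal illustration of what that downsampling step is, here is a toy supersampling resolve using a plain box filter (real DSR/DLDSR use smarter filtering, so treat this as a sketch of the idea only):

```python
import numpy as np

def box_downsample(img: np.ndarray, n: int) -> np.ndarray:
    """Toy SSAA resolve: average each n x n block of the high-res
    render into one output pixel (a simple box filter)."""
    h, w, c = img.shape
    assert h % n == 0 and w % n == 0
    return img.reshape(h // n, n, w // n, n, c).mean(axis=(1, 3))

# Render at 2x the panel resolution per axis (4x the pixels), then
# resolve down to native 4K; every output pixel now averages 4 samples.
hires = np.random.rand(4320, 7680, 3)  # stand-in for a 2x-scale render
native = box_downsample(hires, 2)      # shape (2160, 3840, 3)
```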

Example 1, Example 2 from my native 4k setup.

Whether one has the GPU horsepower to do it, whether it is the right thing for all games, etc., is totally beside the point. The only point I'm trying to have you see here is that better-than-native-res image quality is possible. Do you refute this?
 
"Native is best" is for people suffering from the Dunning-Kruger effect LOL.

Buying a higher resolution monitor is the best way to enjoy your PC, from web surfing to playing games. Using a higher res display with upscaling is way better than using native on a lower res display.
 
Preferring native resolution rendering to upscaling from a lower res to native? Absolutely a matter of preference and opinion
What we really need is time travel to the PS7 era, so that we have enough (and affordable) horsepower to play our favorite AAA titles with all the settings maxed out (including RT effects) and 100% render resolution (aka native rendering) at 4K. (The PS6 will more than likely be upscaled 4K60, so that's a hard pass.)

I don't care whether Sony touts the PS7 as the ultimate 8K supremacy or whatever.
I found that on 8K I couldn't get to a point where I could see the pixels AND the whole screen at the same time (those with better eyesight will be able to), so coupled with the increase in power consumption, there's no actual need for it for me at all.
This^

I just want my AAA titles to run at 4K120 with 100% render resolution.

To the far shores of the PS7 era, we go. :peace:
 

By that time PC gamers will be enjoying 8K 480 Hz with path tracing.
 