
AMD FidelityFX FSR 3.1

Joined
Jun 14, 2020
Messages
3,335 (2.07/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I think you misunderstand how DLSS & FSR work. You only get a jump in framerate by reducing the effective resolution; raise the resolution and you lower the framerate. @wolf seems to be running into the exact same problem getting this dynamic understood.
1440p native vs 4K DLSS Q / FSR Q run at roughly the same framerate, and the latter looks clearly, undeniably better.
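A back-of-the-envelope check on that claim, assuming the commonly published ~0.667x per-axis render scale for the DLSS/FSR Quality presets (exact scales can vary per game, so treat this as a sketch):

```python
# Internal render resolution for a typical upscaler Quality mode.
# Quality mode renders each axis at ~0.667x, so ~44% of the output pixel count.
def internal_res(out_w, out_h, scale=2/3):
    return round(out_w * scale), round(out_h * scale)

def pixels(w, h):
    return w * h

# 4K output with Quality upscaling renders internally at 2560x1440...
q4k = internal_res(3840, 2160)
print(q4k)  # (2560, 1440)

# ...the same pixel load as 1440p native, hence the similar framerates.
print(pixels(*q4k) == pixels(2560, 1440))  # True
```

The framerate parity falls straight out of the matching pixel counts; the image-quality difference comes from the temporal reconstruction, not extra rendering work.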
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,150 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
What we really need is a time travel to the PS7 era so that we have enough (and affordable) horsepower to be able to play our favorite AAA titles with all the settings maxed out (including RT effects) and 100% render resolution (aka native rendering) at 4K.
Do you see now with my examples how rendering above 100%, beyond native res, and then displaying it gives superior IQ to 100%?

Edit : as shown again here

[screenshot attachment]
 
Joined
Sep 4, 2022
Messages
286 (0.36/day)
Do you see now with my examples how rendering above 100%, beyond native res, and then displaying it gives superior IQ to 100%?

Edit : as shown again here

[screenshot attachment]
One takeaway from Daniel Owen's video: he mentions the actual display size/PPI/resolution. 1440p at 27 inches has a higher PPI than the 4K 48 inch OLED in his comparison. I play on a 48 inch CX OLED where the pixels are stretched out, which is why we might have different experiences or perceptions. If I upgraded to a 4K 32 inch OLED the PPI would improve significantly and my perception might well change. Also, I didn't know that 1440p DLSS Quality uses a 960p sample image; I thought it was at least 1080p.
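The PPI point is easy to check with basic geometry; the panel sizes below are the ones mentioned in the post:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from pixel dimensions and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))  # ~109 ppi: 27" 1440p
print(round(ppi(3840, 2160, 48)))  # ~92 ppi: 48" 4K OLED (lower, despite more pixels)
print(round(ppi(3840, 2160, 32)))  # ~138 ppi: a 32" 4K upgrade
```

So a 48" 4K panel genuinely has coarser pixels than a 27" 1440p one, which is why viewing distance and panel size change how visible upscaling artifacts are.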
By that time PC Gamers are enjoying 8K480HZ with PathTracing
The race is on, although the current 8K monitor offering costs >$4000 and is stuck at 60 Hz.

Update: we're 1/4 of the way there, although still in concept status; TCL showed off an 8K 120 Hz 1400-nit 5000-zone mini-LED display at this past CES.
 
Joined
Mar 21, 2016
Messages
2,508 (0.80/day)
"Native is best" is for people suffering from the Dunning-Kruger effect LOL.

Buying a higher resolution monitor is the best way to enjoy your PC, from web surfing to playing games. Using a higher res display with upscaling is way better than using native on a lower res display.

Downsampling to native is arguably better than upscaling in that scenario, though. I'd rather render natively at a higher resolution and downsample to native than upscale from a lower resolution to a higher one. It's basically similar to making or simulating the native resolution having higher or lower PPI through the use of scaling. You also have to take post-processing into account, and it very easily favors downsampling on quality; on performance it's the opposite, since rendering more is harder than rendering less.
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
One takeaway from Daniel Owen's video: he mentions the actual display size/PPI/resolution. 1440p at 27 inches has a higher PPI than the 4K 48 inch OLED in his comparison. I play on a 48 inch CX OLED where the pixels are stretched out, which is why we might have different experiences or perceptions. If I upgraded to a 4K 32 inch OLED the PPI would improve significantly and my perception might well change. Also, I didn't know that 1440p DLSS Quality uses a 960p sample image; I thought it was at least 1080p.

The race is on, although the current 8K monitor offering costs >$4000 and is stuck at 60 Hz.

Display technologies are also progressing nicely; I'm upgrading my 48in OLED CX to a C4 in a couple of days :D. I was actually waiting for a 42in 4K 240Hz OLED display, but that might be 1-2 years away anyway.

Playing games & doing daily tasks on a high resolution & high refresh display is the best thing about PCMR. There is no point getting hung up on a low resolution & low refresh display just to play games at "Native is best".

Downsampling to native is arguably better than upscaling in that scenario, though. I'd rather render natively at a higher resolution and downsample to native than upscale from a lower resolution to a higher one. It's basically similar to making or simulating the native resolution having higher or lower PPI through the use of scaling. You also have to take post-processing into account, and it very easily favors downsampling on quality; on performance it's the opposite, since rendering more is harder than rendering less.

Nope, downsampling from a 2.25x factor (1.5x on each dimension) looks pretty bad on any display. In this case you would need to downsample from a 4x factor, which is very intensive.

For example, on a 1080p display you would need to render 4K and downsample, destroying your FPS in the process. In the end the final image is only 1080p; it looks better than 1080p native but still miles behind 1440p DLSS Q on a 1440p display, not to mention the massive FPS deficit.

4K Downsample (1080p) vs 1080p Native vs 1440p DLSS.Q
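The cost side of that argument reduces to pixel counts. A rough sketch, assuming GPU load scales roughly linearly with pixels shaded (it often doesn't exactly, so treat the ratios as ballpark):

```python
# Relative shading cost of each option, measured in pixels rendered per frame.
def cost(w, h):
    return w * h

native_1080 = cost(1920, 1080)
ds_225x     = cost(2880, 1620)   # 1.5x per axis = 2.25x the pixels
ds_4x       = cost(3840, 2160)   # 2x per axis = 4x the pixels (4K -> 1080p)
dlss_q_1440 = cost(1707, 960)    # ~0.667x per axis of 2560x1440

print(ds_4x / native_1080)        # 4.0 -> roughly a quarter of the FPS
print(dlss_q_1440 / native_1080)  # ~0.79 -> cheaper than even 1080p native
```

That 4x-vs-0.79x gap is the "massive FPS deficit" being described: 4K downsampling to 1080p shades about five times as many pixels per frame as 1440p DLSS Quality does.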
 
Joined
Mar 21, 2016
Messages
2,508 (0.80/day)
The higher performance impact is accurate, but otherwise even 4K downsampled to 1080p can effectively look better than 1440p. The lighting and reflections were better, and the tree foliage looks better. The jaggies are more prominent in that example given that pixel size is relative to DPI. Also, you're changing the display resolution in the example: if your intended display resolution is 1440p, native downsampling will look better than upscaling to 1440p does.

If anything, getting a higher PPI panel even if you can't run the higher resolution well makes a good bit of sense, so long as it's a high enough refresh rate and affordable, since you can simply run a lower resolution on the display, and with a higher PPI panel you can get away with that more readily.

There is another angle on this as well, since resolutions and refresh rates are pretty interconnected. Higher resolutions often come at lower refresh rates, and there is a whole class of image quality surrounding animation and motion which impacts scene detail and input lag.
 
Joined
Jan 8, 2017
Messages
9,402 (3.29/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Nope, downsample from 2.25x factor (1.5x on each dimensions) looks pretty bad on any display.
This is completely untrue and also total nonsense. Downsampling from any resolution higher than native will improve IQ.
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
The higher performance impact is accurate, but otherwise even 4K downsampled to 1080p can effectively look better than 1440p. The lighting and reflections were better, and the tree foliage looks better. The jaggies are more prominent in that example given that pixel size is relative to DPI. Also, you're changing the display resolution in the example: if your intended display resolution is 1440p, native downsampling will look better than upscaling to 1440p does.

If anything, getting a higher PPI panel even if you can't run the higher resolution well makes a good bit of sense, so long as it's a high enough refresh rate and affordable, since you can simply run a lower resolution on the display, and with a higher PPI panel you can get away with that more readily.

There is another angle on this as well, since resolutions and refresh rates are pretty interconnected. Higher resolutions often come at lower refresh rates, and there is a whole class of image quality surrounding animation and motion which impacts scene detail and input lag.

If you are willing to lose more than 50% of your FPS, practically halving your gaming experience, for 10% better IQ LOL.

I think the majority of people will take 50% more FPS at the cost of a 10% image quality reduction (or 0% with DLSS)
 

wolf

Better Than Native
If you are willing to lose more than 50% of your FPS, practically halving your gaming experience, for 10% better IQ LOL.
Agreed; unless you have plenty of GPU headroom far above what the game needs for native, then why not. What's more baffling are the ones that either didn't know it existed, totally misunderstood it, or flat out refused to believe that what supersampling does is possible, i.e. further improving IQ. I'd go as far as to say more than 10% at times too, but it can be hard to quantify, although it is immediately and obviously better. "Native" is such a weird hill to die on.
I think the majority of people will take 50% more FPS at the cost of a 10% image quality reduction (or 0% with DLSS)
Bingo. Hell, console gamers don't even get a choice and they deal with whatever depths of input res the developer chooses to get solid performance in given modes. At least on PC we get the choice to tweak and optimise our personal mode for a game.
 
Joined
Mar 21, 2016
Messages
2,508 (0.80/day)
I guess I just imagined reflections and lighting getting heavily broken with DLSS. You literally left out native 1440p in the comparison, directly comparing a DLSS 1080p upscale to 1440p against a 4K downsample to 1440p instead. My point was more that getting a higher DPI display and simply dropping resolution is a good option. Ironically, 4K downsampled to 1080p improves lighting and reflections even at a lower resolution than the native 1440p that DLSS tries to pretend it's equal or better to.

Anyway, to add to the reflections and lighting matter: if AMD can resolve its FSR issues without creating that problem DLSS suffers from, it'll be a bit of a sticking point. That's a bit of a big if at this stage, given they've got some work to do to catch up with DLSS on general quality. It seems like they need more of a mix between FSR 2.2 and FSR 3.1, and probably a bit of the lighting XeSS has mixed with FSR 3.1; but scene clarity is better with DLSS right now than the others, and the AO is mostly better, though FSR 2.2 did better with AO than FSR 3.1 seems to.
 

wolf

Better Than Native
@lexluthermiester so was it a misunderstanding about upscaling, or didn't you know about Supersampling? This isn't personal and I don't want to be rude to you, it's about the facts of the matter and helping everyone understand what's possible for image quality.
 
Joined
Jul 5, 2013
Messages
27,474 (6.63/day)
@lexluthermiester so was it a misunderstanding about upscaling, or didn't you know about Supersampling? This isn't personal and I don't want to be rude to you, it's about the facts of the matter and helping everyone understand what's possible for image quality.
Supersampling is a form of AA; not completely, but in effect. However, we're talking about FSR/DLSS. These are very different functions and effects. They're not processed the same way and do not render the same result, though one could argue that the end-result image is similar, but I digress.

Most image processing effects make the final screen render look terrible. Most forms of AA make games look like someone smeared Vaseline all over the screen. DLSS & FSR come with the potential for artifacting if not done perfectly.

The potential performance gains are just not worth the hassle. Running games fully native and managing other settings is far more effective and easier. For example, I never run AA of any type in any game. AA has always been an unnecessary resource hog that never improved the gaming experience enough to justify the hit in processing power, so it's the very first thing I disable. Everything else is on a per-game, per-setting basis.

IF FSR and DLSS can actually offer something compelling, consistently delivering performance gains WITHOUT degrading image quality, then they'll be worth using. ATM, neither is there yet. Credit to AMD & NVIDIA though; they are improving things to the point where they're getting close.
 

wolf

Better Than Native
Supersampling is a form of AA. Not completely but in effect.
Can you see how, in effect, over-rendering a game and then downsampling can and does exceed native image quality? I see you talk about not using AA at all; you could still do, for example, a 4x render with no AA and get better image quality by leveraging, among other things, sub-pixel detail. The end image does appear antialiased, but also more detailed. Even the blurb on AMD's Virtual Super Resolution (VSR) says:
Get quality that rivals up to 4K, even on a 1080p display while playing your favorite games
Whether it's worth it is a different discussion; does it enable better-than-native-res image quality? Yes.
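What supersampling buys can be sketched with the simplest possible downsampling filter, a plain 2x2 box average (drivers' DSR/VSR modes use fancier filters, so this is purely illustrative):

```python
def downsample_2x(img):
    """Average each 2x2 block: 4 rendered samples collapse into 1 display pixel."""
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w // 2)]
            for y in range(h // 2)]

# A hard diagonal edge rendered at 2x the display resolution (1 = lit, 0 = dark)...
hi = [[1.0 if x >= y else 0.0 for x in range(8)] for y in range(8)]

lo = downsample_2x(hi)
for row in lo:
    print(row)
# The edge lands on the display as fractional values (0.75 along the diagonal
# here) instead of hard 0/1 steps: that sub-pixel coverage is the extra
# detail and antialiasing that supersampling buys, even with in-game AA off.
```

This is why a 4x render with no AA still comes out looking antialiased: the sub-pixel samples survive the downsample as intermediate intensities.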
However, we're talking about FSR/DLSS. These are very different functions and effects. They're not processed the same way and do not render the same result, though one could argue that the end-result image is similar, but I digress.

Most image processing effects make the final screen render look terrible. Most forms of AA make games look like someone smeared Vaseline all over the screen. DLSS & FSR come with the potential for artifacting if not done perfectly.

The potential performance gains are just not worth the hassle. Running games fully native and managing other settings is far more effective and easier. For example, I never run AA of any type in any game. AA has always been an unnecessary resource hog that never improved the gaming experience enough to justify the hit in processing power, so it's the very first thing I disable. Everything else is on a per-game, per-setting basis.

IF FSR and DLSS can actually offer something compelling, consistently delivering performance gains WITHOUT degrading image quality, then they'll be worth using. ATM, neither is there yet. Credit to AMD & NVIDIA though; they are improving things to the point where they're getting close.
I can absolutely accept this as your opinion and experience on upscaling and how it affects image quality relative to your tastes, even though I disagree relative to my experience and tastes.
 
Joined
Jul 5, 2013
Messages
27,474 (6.63/day)
Can you see how, in effect, over-rendering a game and then downsampling can and does exceed native image quality? I see you talk about not using AA at all; you could still do, for example, a 4x render with no AA and get better image quality by leveraging, among other things, sub-pixel detail. The end image does appear antialiased, but also more detailed. Even the blurb on AMD's Virtual Super Resolution (VSR) says:

Whether it's worth it is a different discussion; does it enable better-than-native-res image quality? Yes.

I can absolutely accept this as your opinion and experience on upscaling and how it affects image quality relative to your tastes, even though I disagree relative to my experience and tastes.
While some things will always be personal preference and opinion, one cannot say the same for a native image quality render, unaltered by after-effects.

It is not unobjective to state that an unaltered rendering of a game, with no after-effects, will always look best. This is more factual, as not using after-effects will show the game as the creators intended, even IF some of those effects are worked into the game as options.

FSR and DLSS alter the image rendering in a way that does not always produce a desirable result. That is what is subjective.
 

wolf

Better Than Native
It is not unobjective to state that an unaltered rendering of a game, with no after-effects, will always look best. This is more factual, as not using after-effects will show the game as the creators intended, even IF some of those effects are worked into the game as options.
Respectfully, I am in disagreement here. What if the developer has TAA, motion blur, or chromatic aberration etc. on by default (for example)? What if it can't be disabled? Is that not the way the developer intended? Take Senua's Saga: it has chromatic aberration, depth of field, and a permanent, un-toggleable 21:9 aspect ratio set by default, so I'd argue that's the way the developer intended. But I digress, as this is not what I'm really talking about.

Resolution is also, in effect, inconsequential to the developer. If you have a 1080p monitor, a 4K monitor, or an 8K TV, a game in anno 2024 will be built to render to any of those perfectly fine. I do not see a relationship between rendering at the native resolution of any individual's display and that looking best. Irrespective of any other settings choices you make for AA, after-effects, in-game settings etc., supersampling/downsampling will increase visual fidelity in a manner that is not linked to the creator's intentions at all; it simply increases perceived resolution, and with it image quality, much like using a higher resolution display.

If a creator has their game run perfectly on a 1080p display and a 4K display, it is not unobjective to say that rendering at 4K then downsampling to 1080p, displayed on a 1080p monitor, will look more detailed and higher resolution than rendering at native 1080p on that same monitor. I would say preferring the native render to that is, by definition, personal preference when the images are compared side by side.
 
Joined
Jun 14, 2020
Messages
3,335 (2.07/day)
Again, as an owner of a variety of hardware, a lot of it low end, I'd much rather have very high resolution monitors and use upscalers than low resolution monitors and play natively. There is a big difference, especially in LODs, because LODs depend entirely on your actual render resolution.
 

wolf

Better Than Native
Again, as an owner of a variety of hardware, a lot of it low end, I'd much rather have very high resolution monitors and use upscalers than low resolution monitors and play natively. There is a big difference, especially in LODs, because LODs depend entirely on your actual render resolution.
I'm very much in agreement with that too, so much so it seems the obvious choice to me now.

The only point I'm trying to make here is that rendering a game at "native" (whatever that might be on an individual basis) is not the absolute best image quality achievable on that monitor/TV when rendering a given game; image quality (detail) does continue to improve beyond that when supersampling.

There will of course be an absolute limit: a point at which over-rendering then downsampling the game does not yield further improvements. E.g. I'd wager an 8K render vs a 16K render would be hard to tell apart when displayed at 1080p.
 
Joined
Jun 14, 2020
Messages
3,335 (2.07/day)
I'm very much in agreement with that too, so much so it seems the obvious choice to me now.

The only point I'm trying to make here is that rendering a game at "native" (whatever that might be on an individual basis) is not the absolute best image quality achievable on that monitor/TV when rendering a given game; image quality (detail) does continue to improve beyond that when supersampling.

There will of course be an absolute limit: a point at which over-rendering then downsampling the game does not yield further improvements. E.g. I'd wager an 8K render vs a 16K render would be hard to tell apart when displayed at 1080p.
Yeah, obviously; native isn't the goal. Native is the minimum acceptable starting point. Back when I briefly had a 2560x1080 ultrawide (a very low resolution) with a 3090, I was never running native; I was always supersampling, obviously. Native is for when your GPU isn't fast enough for SS.
 

wolf

Better Than Native
@fevgatos I suppose I'm just surprised that in this day and age, native resolution rendering is still thought by some to be the pinnacle of IQ, unable to be improved upon; as I said earlier, it merely serves as a reference point along the spectrum of possible image quality for a given display. And when you realise this, perhaps in conjunction with other forced settings (such as game rendering breaking without some form of TAA) it also becomes relatively easy to see how traditional supersampling isn't the only technique which can exceed that native reference point, depending on the game of course. When we accept an outcome is possible, it stands to reason that there isn't an inherent limitation of only one possible method to achieve it.

To say that FSR and DLSS take only a lower resolution image and upscale it, so therefore it cannot be higher quality, is to not understand what those techniques are actually doing. You didn't say that mind you, but others have, for some time.
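The point that these upscalers ingest more than a single low-res image can be illustrated with a toy temporal accumulator. This is a drastic simplification (real implementations also use motion vectors, depth, and learned or hand-tuned heuristics to reject stale history), and every name here is made up for illustration:

```python
import random

def render_jittered(scene, n_frames, seed=0):
    """Toy 1-pixel 'renderer': each frame samples the scene at a random
    jittered sub-pixel offset, and the results accumulate over frames."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_frames):
        total += scene(rng.random())  # one jittered sample per frame
    return total / n_frames

# Scene: half of the pixel's footprint is lit. Any single frame sees only
# 0.0 or 1.0; the accumulated estimate converges toward the true 0.5
# coverage -- information no individual low-res frame contains.
scene = lambda offset: 1.0 if offset < 0.5 else 0.0
print(render_jittered(scene, 500))
```

The jittered history is why a temporal upscaler can resolve detail a single low-res frame cannot: over time it has effectively sampled the scene at a higher rate than any one frame's resolution.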
 
Joined
Mar 21, 2016
Messages
2,508 (0.80/day)
While some things will always be personal preference and opinion, one cannot say the same for a native image quality render, unaltered by after-effects.

It is not unobjective to state that an unaltered rendering of a game, with no after-effects, will always look best. This is more factual, as not using after-effects will show the game as the creators intended, even IF some of those effects are worked into the game as options.

FSR and DLSS alter the image rendering in a way that does not always produce a desirable result. That is what is subjective.

I'm perfectly fine with post-processing done well that isn't clearly and obviously obtrusive to native details in ways that would be viewed as worse. As far as altering the intended design by the creators, I would agree the goal should be to not heavily alter the native intended look. I don't mind layering something like lighting and shading to improve the relative quality of those things; done tastefully, it doesn't really change the intended look in a negative manner, it simply makes those things look more balanced and natural.

That's the kind of thing that enhances native image quality in a tangible way. If you can toggle something on and off, and after your eyes adjust it looks clearly more natural and detailed in subtle ways, that's a positive. It shouldn't jump out at you and be in your face, like omg piles of lighting or piles of shading that completely alter the native look. Those are just example cases of post-process effects; it's equally true of other forms of post-processing, from blur to sharpen and more.

As far as an individual deciding to colorize a game a particular way for style and interest, I don't really care. If they want to enjoy their synthwave experience or whatever kind of atmosphere, so be it; as long as it's not negatively impacting others it doesn't really matter. I mean, hell, I haven't created custom just-for-fun style stuff like that personally, but I should. I've done it with AI art and it can be a pretty cool vibe, especially the red, green and yellow reggae look, and maybe some orange. It makes for some cool ambience. It's not exactly natural, but that doesn't matter if it's the intended look you're after.
 
Joined
Apr 15, 2020
Messages
409 (0.25/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
8K480HZ with PathTracing
XD!

rendering above 100%, beyond native res and then displaying it is superior IQ to 100%
No denying that, and no argument there.

The thing is, at the moment, not even a 4090 can natively provide 60 FPS at 4K in CP2077 with RT (no PT, just plain RT) enabled, so there's no point in going above 100% render resolution on that one.

It's about native rendering vs upscaling (or rather, reduction scaling!). This is NOT about native rendering vs going above 100% render resolution, which actually improves the IQ at the cost of performance (hands down, the best AA there is).

I will NEVER go below 100% render resolution.

Of course, with enough resources, I've got NO PROBLEM with going above 100% render resolution.

IF FSR and DLSS can actually offer something compelling, consistently delivering performance gains WITHOUT degrading image quality, then they'll be worth using. ATM, neither is there yet. Credit to AMD & NVIDIA though; they are improving things to the point where they're getting close.
This^
 

wolf

Better Than Native
No denying that, and no argument there ..... going above 100% render resolution which actually improves the IQ at the cost of performance (hands down, the best AA there is).....Of course, with enough resources, I've got NO PROBLEM with going above 100% render resolution.
It's nice to see you say it, I'm baffled at the lengths some go to either argue better than 100%/native isn't possible, or to not admit it.
It's about native rendering vs upscaling (or rather, reduction scaling!). This is NOT about native rendering vs going above 100% render resolution which actually improves the IQ at the cost of performance (hands down, the best AA there is).

I will NEVER go below 100% render resolution.
That's quite separate from the point I was trying to make about 100%+/supersampling, which I'm glad can be put to rest now. However, suffice to say there have been games where upscaling has absolutely provided better IQ (I'll give an example below), but I suspect that won't occur much, or at all, moving forward, as upscaling implementations now finally include a native AA option like DLAA, FSR Native AA etc. Because those AA solutions are so much better than TAA, they are of course better than upscaling from lower res: same algorithm, more data.

My example where upscaling has provided better IQ goes as follows.

A AAA game releases with a rendering pipeline dependent on TAA to not break visual effects, so you cannot outright disable TAA (or another temporally underpinned AA solution) at all in the game options; the developer made sure of this so their game doesn't look visually broken. The standard TAA that comes with the game has a very blurry resolve and its own temporal artefacts like smearing and ghosting. The game has also implemented DLSS, and because the standard TAA is that bad, and the AA algorithm in DLSS is that good, DLSS Quality mode provides superior image quality, higher detail retention, image stability etc. Article covering some examples here.
However, it is true that DLSS can produce a better than native image, it just depends on the game. At 4K using the Quality mode, for example, we'd say that around 40% of the time DLSS was indeed producing a better than native image. Often this comes down to the native image, on a handful of occasions native rendering wasn't great, likely due to a poor TAA implementation. But on other occasions the native image looks pretty good, it's just that DLSS looks better or has an advantage at times.
Again, this is very unlikely to happen now, as we can use these AA algorithms like FSR Native AA/XeSS Native AA/DLAA, giving undeniably better results than starting from a lower base resolution (more data, so duh). But that above scenario certainly played out in multiple games, because of straight garbage TAA so bad it fumbles even with craploads more pixels to work with.

Image quality in a given game, on a given monitor/panel, sits on a spectrum from sub-native through massively supersampled, and because better than native is categorically possible, it's objective to say that it isn't inherently limited to one singular way to achieve it.
 