
AMD FidelityFX FSR 3.1

Joined
Jun 14, 2020
Messages
3,457 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
In my opinion, I fail to see the hype about frame generation. Maybe I am wrong, but my understanding is that we want higher frame rates so that latency will be lower. With frame gen, fake frames are added, but that by itself introduces higher latency. So I am not sure this technology is looking to solve anything other than giving people the false sense that performance is better. After all, most people are very fixated on frame rates. If it is about latency, just flick Nvidia Reflex or AMD Anti-Lag on.
You can actually have low latency with low framerates, but it's not an enjoyable experience. You want both low latency and a smooth image, which is what FG provides.
 
Joined
Jul 20, 2020
Messages
1,123 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / EVGA BQ 500
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
In my opinion, I fail to see the hype about frame generation. Maybe I am wrong, but my understanding is that we want higher frame rates so that latency will be lower. With frame gen, fake frames are added, but that by itself introduces higher latency. So I am not sure this technology is looking to solve anything other than giving people the false sense that performance is better. After all, most people are very fixated on frame rates. If it is about latency, just flick Nvidia Reflex or AMD Anti-Lag on.

Higher framerates typically have 2 features:

video smoothness
lower latency

Framegen gives you the smoothness without the latency improvements, so if all you need is that half of the feature set, then it's good tech. This can work well in RPGs and any game that's not online MP or esports. If you're playing esports or the various online MP battlefests, then FG is useless, because lower latency means more accuracy.

This is just veering into an AMD hate-bashing thread by NVIDIA fans, honestly.

Fanboys on all sides gonna fanboy, people love defending their fragile tribes from perceived invaders.
 
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
I tried FSR upscaling plus framegen and it looked and felt horrible. Flickering and shimmering and aliasing all over the place plus high latency. Switched to DLSS upscaling with FSR framegen and now it looks much better but the higher latency is really annoying. Ended up disabling FSR framegen.
Framegen ALWAYS needs a Reflex/Anti-Lag option present and enabled in the game to mitigate a large portion of the added latency. Without it you are just asking for trouble.
That's bullshit; I played the whole of Ghost of Tsushima with DLSS + AMD FG without one crash.
With 3.1? I highly doubt that. GoT has a 3.0 implementation and is the only game that allows FSR FG without FSR upscaling. I suspect it may have been a 3.1 beta test - at least the decoupling portion. I too completed GoT, but without any upscaler or even DLAA/FSR Native AA, as all of them softened/smudged the image way too much for my liking and messed with distant particles such as fire embers. FSR FG on my RTX 20 series card was great though - obviously I had Reflex enabled.
Maybe I am just overly sensitive to frametime differences.
People are sensitive to different things. Some people are sensitive to low fps. Others always hate tearing. Some are really latency-sensitive, etc.
For example, I've never much cared about tearing - especially if the FPS is high and the tearing occurs for a very short time. I can easily live without G-Sync/FreeSync.
My latency and framerate sensitivity is higher. On the desktop I notice the lower framerate and added latency right away when it's 60Hz instead of my usual 165Hz. I even notice stuttering when some programs don't run smoothly. I am, however, very sensitive to the smoothing added by any AA, as I absolutely hate it. I want my games sharp. I'm even willing to disable AA altogether if it gets me more sharpness. I'm also willing to mess with colors to get a more "vivid" experience. I find the default colors at 6500K "dull and lifeless", which is why I have Digital Vibrance set to 65%.
So this article is actually suggesting that FSR 3.1 works as a replacement for DLSS 3, i.e. that Nvidia's requirement of an RTX 40-series card for DLSS 3 is a scam and Tensor AI cores are not needed?
No, it does not. But I agree that Nvidia's reasons for not enabling DLSS FG on the 30 series are extremely dubious. If the 30 series did not support Reflex then I would almost understand it, but it does. The fact that AMD has FG (no, not AFMF) running smoothly on the 30 series and older proves this. I really don't see the supposed benefit from 40-series-specific hardware for DLSS FG.
In my opinion, I fail to see the hype about frame generation. Maybe I am wrong, but my understanding is that we want higher frame rates so that latency will be lower. With frame gen, fake frames are added, but that by itself introduces higher latency. So I am not sure this technology is looking to solve anything other than giving people the false sense that performance is better. After all, most people are very fixated on frame rates. If it is about latency, just flick Nvidia Reflex or AMD Anti-Lag on.
Nvidia Reflex or AMD Anti-Lag is a must-have every time you enable FG. Also, as I explained above, not all people are sensitive to the same things, due to how our eyes and brains work. Saying otherwise would be like saying everyone likes sugar and hates salt in their food, which is patently untrue.
 
Joined
Sep 4, 2022
Messages
308 (0.38/day)
Higher framerates typically have 2 features:

video smoothness
lower latency

Framegen gives you the smoothness without the latency improvements, so if all you need is that half of the feature set, then it's good tech. This can work well in RPGs and any game that's not online MP or esports. If you're playing esports or the various online MP battlefests, then FG is useless, because lower latency means more accuracy.



Fanboys on all sides gonna fanboy, people love defending their fragile tribes from perceived invaders.
Yep. If gamers accept 60 fps, that's ~16 ms per frame; doubling the rendered fps without frame generation halves the latency to ~8 ms linearly. When adding frame gen, 120 fps gives you about 16 ms as the minimum penalty, and sometimes the penalty is in the 20s of milliseconds. Hence the disconnect between the frame counter and the perceived latency response.
Funny how I was hating on frame gen since getting my 4090 in 2022, and now that it's part of FSR, some are defending it - although it's mostly Nvidia paid trolls who tend to get triggered by my past criticism of frame gen.
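The frame-time arithmetic above can be sketched in a few lines. A minimal illustration, assuming the common simplification that interpolation-based frame gen still samples input at the base rendered rate and must hold a frame to blend between (real pipelines and Reflex change the exact numbers):

```python
# Frame time vs. perceived latency, per the 60 fps / 120 fps example above.
def frame_time_ms(fps):
    """Duration of one frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

base = frame_time_ms(60)      # ~16.7 ms per frame at 60 rendered fps
doubled = frame_time_ms(120)  # ~8.3 ms per frame at 120 rendered fps

# With frame generation the counter shows 120 fps, but input is still
# sampled at the 60 fps base rate, so the responsiveness floor stays at
# the base frame time (and in practice the penalty is often higher).
fg_counter_fps = 60 * 2
fg_latency_floor = frame_time_ms(60)

print(round(base, 1), round(doubled, 1), fg_counter_fps, round(fg_latency_floor, 1))
# -> 16.7 8.3 120 16.7
```

The gap between `fg_counter_fps` and `fg_latency_floor` is exactly the fps-counter-vs-feel disconnect the post describes.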
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
As expected, it's just more blurry; it crushes higher-frequency details like DLSS and XeSS do, so it gets rid of more shimmering at the cost of worse surface detail.
 
Joined
Jul 20, 2020
Messages
1,123 (0.71/day)
As expected, it's just more blurry; it crushes higher-frequency details like DLSS and XeSS do, so it gets rid of more shimmering at the cost of worse surface detail.

It serves a purpose. My purpose is not to retain FPS after enabling the dubious benefits of eye candy, but to make gaming more viable on lower-end cards. If the choice is DLSS/FSR vs. lower resolution, then both upscaling techs preserve fine detail better than simple scaling.

I find DLSS to be visually superior to simple scaling all the time, though the irony of its ~98% excellent results is that it really calls attention to the errors I often see on vertical and horizontal lines.

I find FSR to be better than simple scaling in most games, but in ones with worse implementations (detail shimmering, especially on foliage), I will sometimes choose simple scaling instead. Soft but more stable detail wins sometimes.

A drawback of both, however, is that they can give a notably smaller FPS boost than simple scaling, especially at 1080p, so simple scaling sometimes wins again. It's game-dependent, and FPS is king.
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
Framegen ALWAYS needs a Reflex/Anti-Lag option present and enabled in the game to mitigate a large portion of the added latency. Without it you are just asking for trouble.

With 3.1? I highly doubt that. GoT has a 3.0 implementation and is the only game that allows FSR FG without FSR upscaling. I suspect it may have been a 3.1 beta test - at least the decoupling portion. I too completed GoT, but without any upscaler or even DLAA/FSR Native AA, as all of them softened/smudged the image way too much for my liking and messed with distant particles such as fire embers. FSR FG on my RTX 20 series card was great though - obviously I had Reflex enabled.

People are sensitive to different things. Some people are sensitive to low fps. Others always hate tearing. Some are really latency-sensitive, etc.
For example, I've never much cared about tearing - especially if the FPS is high and the tearing occurs for a very short time. I can easily live without G-Sync/FreeSync.
My latency and framerate sensitivity is higher. On the desktop I notice the lower framerate and added latency right away when it's 60Hz instead of my usual 165Hz. I even notice stuttering when some programs don't run smoothly. I am, however, very sensitive to the smoothing added by any AA, as I absolutely hate it. I want my games sharp. I'm even willing to disable AA altogether if it gets me more sharpness. I'm also willing to mess with colors to get a more "vivid" experience. I find the default colors at 6500K "dull and lifeless", which is why I have Digital Vibrance set to 65%.

No, it does not. But I agree that Nvidia's reasons for not enabling DLSS FG on the 30 series are extremely dubious. If the 30 series did not support Reflex then I would almost understand it, but it does. The fact that AMD has FG (no, not AFMF) running smoothly on the 30 series and older proves this. I really don't see the supposed benefit from 40-series-specific hardware for DLSS FG.

Nvidia Reflex or AMD Anti-Lag is a must-have every time you enable FG. Also, as I explained above, not all people are sensitive to the same things, due to how our eyes and brains work. Saying otherwise would be like saying everyone likes sugar and hates salt in their food, which is patently untrue.

It depends on how frame gen is handled and the overhead incurred to do so. The native resolution has an implicit impact as well. Interpolation itself can be handled in different ways; it can increase or decrease frame rates and/or input latency depending on the approach.

As far as AA goes, I just run 1 to 3 configuration layers of FXAA. You can interpolate between different configuration setups. It's more overhead, but it addresses more jaggies. Depending on how you tune them, it can actually be less blurry and address more jaggies than a single native FXAA configuration does.
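The layered-pass idea above can be illustrated with a toy. This is not FXAA itself - just a hypothetical edge-aware smoothing filter on a 1D row of grayscale values, showing how running a cheap post-process pass more than once keeps attacking high-contrast steps (jaggies) while leaving flat regions untouched:

```python
def smooth_pass(row, threshold=0.5):
    """One cheap AA-style pass: blend a pixel toward its neighbours only
    where local contrast is high (a crude stand-in for edge detection)."""
    out = list(row)
    for i in range(1, len(row) - 1):
        window = (row[i - 1], row[i], row[i + 1])
        if max(window) - min(window) > threshold:  # detected "edge"
            out[i] = (row[i - 1] + 2 * row[i] + row[i + 1]) / 4
    return out

edge = [0.0, 0.0, 1.0, 1.0, 1.0]  # a hard step, i.e. a jaggy
once = smooth_pass(edge)          # -> [0.0, 0.25, 0.75, 1.0, 1.0]
twice = smooth_pass(once)         # a second "layer" softens the step further
```

Each extra layer costs another full pass over the image, which is the overhead trade-off the post describes.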

I'm not too excited about motion vectors for interpolation in its current form, particularly with the added latency, as steep as it is. If they can make it more granular it'll at least be more usable. On the other hand, interpolating lighting is a really effective way to make a scene appear more natural, but it comes at added overhead. It's great if you've got enough overhead to spare, though. The same goes for other post-process techniques, with or without layering of the same technique.

The extra overhead for some of that is actually one of the things I most look forward to with a GPU upgrade. What you can do with ReShade and layering of real-time post-processing is adaptable. If you use lower-overhead techniques, it's not too different from applying several GIMP layers to an image using transparency and such. Just 2 to 3 layers of any given post-process technique can have a tangible positive impact when configured in an appropriately balanced way.

I agree on the sensitivity thing. Some people are oversensitive to trace amounts of sharpening, meanwhile I'm really not fond of blur that others seem not to mind much. Ironically enough, FSR 3.1 looks blurry to me in the stills and somewhat in the videos, though not as pronounced as in the stills, since everything tends to look better in motion outside of really obvious flickering shimmer. No one wants a strobe-like experience. That's also sometimes an issue with LODs and pop-in.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
This is just veering into an AMD hate-bashing thread by NVIDIA fans, honestly. Not really surprising, and it's the usual people inciting it. Don't you have a DLSS-owners-only discussion forum to circle jerk about it in rather than derailing this thread as well!?
The thread's about FSR, and the discussion is about FSR. Throwing stones about people derailing threads and hate-bashing (your assertion about me, with which I strongly disagree) is amusing, considering how that goes most of the time in Nvidia threads. In fact, I'd say it's being derailed more by others arguing against DLSS and HUB than it is by talking about FSR, even if negatively, because there isn't that much positive to say, unfortunately. I know - I've tried it.
I don't know where you get the idea that people don't like upscaling; it's been used for decades, long before DLSS even came into the picture. It's more advanced now than it was, but it's always been drifting in this direction in reality. The same goes for ray tracing - it isn't new. Hardware has simply matured, and these technologies have been improved upon and reinterpreted, like interpolation, which is just interlacing with a new name and a different approach.
I suppose if you straight-up ignore the multitude of people who directly said they don't like upscaling, then no, you wouldn't know where I get the idea. I think it's the bee's knees, and in isolation (i.e. without competition to compare to), FSR 3.1 is really good.
I don't think anyone is labeling it a DLSS killer with a straight face unless they're entirely delusional. It's a DLSS alternative with some improved maturity. Much like XeSS, which has likewise been making strides at improving.
Some are (delusional or not is up to them), and I've consistently said alternatives are good; they're absolutely a good thing for everyone (who is willing to use them). That doesn't mean there aren't other discussion points and amusing things happening that surround the tech.
Believe it or not, people don't all want a proprietary option from one brand in control of what is, in essence, a legal monopoly. We need competing solutions like FSR and XeSS to avoid one company just being a de facto monopoly. The more open source these technologies are, the easier it is for developers to simply take the best of them and apply them for everyone to enjoy, instead of this duck-duck-goose scenario we've seen.
I don't remember saying that they wanted only a proprietary option; if that's what you've gleaned from my many comments about FSR, DLSS, etc. over the years, then you haven't understood me at all. I consistently advocate for them all improving and for them all being included.
 
Joined
Jul 10, 2011
Messages
797 (0.16/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Western Digital/Kingston
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse Glorious
Keyboard UniKey
Software Windows 10 x64
How do red glasses help you see the difference between VA and OLED while blocking you from seeing this slop?



 
Joined
Jun 14, 2020
Messages
3,457 (2.13/day)
How do red glasses help you see the difference between VA and OLED while blocking you from seeing this slop?



As with every tool, what's important is knowing how to use it. Yes, DLSS looks better than FSR, but both look WAY better than traditional upscaling techniques, and that's what's important. I don't get people who are against it. It sounds insane to me. When 4K + DLSS/FSR Quality looks a lot better than native 1440p, it makes it obvious that you are better off buying a 4K monitor and playing with FSR/DLSS. Always.
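For context on why "4K + Quality mode" gets compared to native 1440p: the published FSR 2/3 quality presets scale each axis down by a fixed factor (Quality = 1.5x, i.e. roughly a 67% render scale), and DLSS uses very similar ratios. A quick sketch of the internal render resolutions behind each preset:

```python
# Per-axis downscale factors for the FSR 2/3 quality presets.
PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w, out_h, preset):
    """Resolution the game actually renders at before upscaling to output."""
    factor = PRESETS[preset]
    return round(out_w / factor), round(out_h / factor)

# 4K output in Quality mode renders at 1440p internally - hence the
# "4K + Quality vs. native 1440p" comparison.
print(internal_resolution(3840, 2160, "Quality"))      # -> (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)
```

So "4K + Quality" spends roughly 1440p's worth of shading work while targeting a 4K output, which is the trade the post is arguing for.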
 
Joined
Jul 5, 2013
Messages
27,774 (6.67/day)
As with every tool, what's important is knowing how to use it. Yes, DLSS looks better than FSR, but both look WAY better than traditional upscaling techniques, and that's what's important. I don't get people who are against it. It sounds insane to me. When 4K + DLSS/FSR Quality looks a lot better than native 1440p, it makes it obvious that you are better off buying a 4K monitor and playing with FSR/DLSS. Always.
While you make some good points, you're missing one: not using scaling at all. Native rendering is always best and always LOOKS best. Some of us don't care at all about these scaled re-rendering methods and do not use them.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
Native rendering is always best and always LOOKS best.
Not true - not when you're forced to use TAA or a TAA derivative, because it's often so average that upscaling can look better, FSR included.

If native is always best and always looks the best, why does supersampling exist?

Native isn't the be-all and end-all, some top level that's unsurpassable because of the arbitrary limitation of the panel one is displaying on; it merely serves as a reference point along the possible IQ spectrum/range.
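The super-sampling argument above is easy to demonstrate: render more samples than the panel has pixels, then average them down. A toy sketch in plain Python (grayscale values, 2x2 box downsample):

```python
# Toy illustration of why supersampling can beat native rendering: render at
# 2x per axis, then average each 2x2 block down to one output pixel. Each
# output pixel is built from 4 samples instead of 1, so edges that alias at
# native resolution come out with intermediate coverage values.
def downsample_2x(image):
    """image: list of rows of grayscale values; dimensions must be even."""
    h, w = len(image), len(image[0])
    return [
        [
            (image[y][x] + image[y][x + 1] + image[y + 1][x] + image[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard diagonal edge rendered at 2x: native sampling would snap each output
# pixel to 0 or 1; the downsampled result gets fractional coverage instead.
hi_res = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(downsample_2x(hi_res))  # -> [[1.0, 0.25], [0.25, 0.0]]
```

The fractional values are information a single native-resolution sample per pixel simply never gathers, which is the sense in which native is a reference point rather than a ceiling.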
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
Joined
Jul 5, 2013
Messages
27,774 (6.67/day)
It really isn't, and in the year 2024 people really need to accept this already.
Are you really trying to talk down to someone who is an active gamer and frequently tests features to see what they have to offer? Think that over for a minute.

Native rendering at native resolution ALWAYS looks best when compared to reduction scaling (which is what DLSS and FSR actually do), unless you're talking about pre-rendering at a higher resolution and then downscaling, but that has zero performance benefit.

Denying it doesn't make it not true.
Oh please with that. Really?
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
Are you really trying to talk down to someone who is an active gamer and frequently tests features to see what they have to offer? Think that over for a minute.
I'm not talking down to you; I'm correcting you because you said something wrong. The appeal to authority and/or experience is inconsequential.
Native rendering at native resolution ALWAYS looks best when compared to reduction scaling (which is what DLSS and FSR actually do), unless you're talking about pre-rendering at a higher resolution and then downscaling, but that has zero performance benefit.
Forget reduction scaling for a moment: you've asserted that rendering a game at a given panel's native resolution always looks best, and that notion is not correct. There are methods which gather significantly more data, then down-sample back to native resolution for display, offering image quality improvements over rendering at the panel's native resolution. One such example is super-sampling - do you dispute this?
Oh please with that. Really?
Yes.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
Except that I'm NOT wrong.

I'm also done debating this. Cheers.
If you're not wrong, why not address my question? It should be easy to prove me wrong, right? But you are wrong. Debate it or don't, accept it or don't - it doesn't really matter to me, except to disprove incorrect information presented in a public forum.

Exceeding the image quality of rendering a game at a given panel's native resolution is possible, and has been for a long time - well before DLSS/FSR. I remain willing to listen to any evidence or argument against that, but "I'm not wrong" isn't an argument.
 
Joined
Jun 14, 2020
Messages
3,457 (2.13/day)
While you make some good points, you're missing one: not using scaling at all. Native rendering is always best and always LOOKS best. Some of us don't care at all about these scaled re-rendering methods and do not use them.
Native 4K may look better than 4K DLSS Q (I disagree with that, but let's go with it), but sometimes that's not the choice you get.

Real example: I was trying to play Horizon Zero Dawn on my laptop. My choice for locking 60 fps was to either drop to medium @ 1440p, or high @ 1440p with FSR Q. I don't usually mind dropping to medium, but in this game specifically, medium just looks like crap in comparison - all of the shadows are completely removed. So using FSR Q gave me both a higher framerate and higher IQ.

My general point is, if I had, say, a card only capable of 1440p, I'd still buy a 4K monitor and use DLSS/FSR, because 4K DLSS/FSR looks a lot better than native 1440p.
 
Joined
Jul 5, 2013
Messages
27,774 (6.67/day)
Native 4K may look better than 4K DLSS Q (I disagree with that, but let's go with it), but sometimes that's not the choice you get.
If you have a 4070/7800XT or better, then most titles do not need DLSS or FSR.
on my laptop
Unless that is a VERY high end laptop, then yeah DLSS/FSR would be of benefit.
My general point is, if I had, say, a card only capable of 1440p, I'd still buy a 4K monitor and use DLSS/FSR, because 4K DLSS/FSR looks a lot better than native 1440p.
And that's a valid point. However, my point was that, on a 1440p display, native 1440p rendering will look better than DLSS/FSR on the same display - likely at 1080p and even 4K too. If you have the rendering power, running native is best.
 
Joined
Jun 14, 2020
Messages
3,457 (2.13/day)
If you have a 4070/7800XT or better, then most titles do not need DLSS or FSR.

Unless that is a VERY high end laptop, then yeah DLSS/FSR would be of benefit.

And that's a valid point. However, my point was that, on a 1440p display, native 1440p rendering will look better than DLSS/FSR on the same display - likely at 1080p and even 4K too. If you have the rendering power, running native is best.
As I've already acknowledged for the sake of the discussion, sure, let's say X resolution native looks better than X resolution + DLSS Q.

But that's a completely unfair comparison, because DLSS gives you a much higher framerate, which in turn allows you to up the resolution.

Saying that games don't require DLSS if you have a 4070 or higher is just not true, first of all because it doesn't take the resolution into account; at 4K I'm sure a 4070 needs upscaling. So the question is, if we agree that 4K + DLSS Q is better than 1440p native, why would you not go for the 4K monitor and just permanently use DLSS/FSR? Most games nowadays support it.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
So the question is, if we agree that 4K + DLSS Q is better than 1440p native, why would you not go for the 4K monitor and just permanently use DLSS/FSR? Most games nowadays support it.
I agree with that, and I'm glad I also made that purchase choice (going 4K). But another option, if you have the power at your disposal and are on, say, 1440p, is to use (DL)DSR to render at 4K - perhaps even using DLSS too - and get better-than-native-1440p image quality, as has been tried and tested many, many times with consistent results.

@Sunny and 75 I see the reacts you dropped, but he's wrong, mate. Go try it for yourself; you'll see what we've known to be true for quite some time now ;)
 
Joined
Jun 2, 2017
Messages
9,134 (3.34/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Threads like these, with the "DLSS is better" angle, look less genuine when they don't acknowledge that Sapphire's Trixx software was the first to offer scaling options in terms of resolution. The kicker is that any GPU with more than 16 GB of VRAM can run 4K just fine. I bet it would be hard to tell the difference between Ultra and High at 4K on the Mini-LED monitors we have today.
 
Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
I was watching a car repair video and the influencer managed to insert a shout-out to his magnificent 4090 for no reason whatsoever!
XD!

Native rendering is always best and always LOOKS best.
Nothing comes close to native rendering in terms of experience and the overall look and feel of a game, especially in AAA titles.

Native rendering at native resolution ALWAYS looks best when compared to reduction scaling (which is what DLSS and FSR actually do)
I always start my RPG playthroughs with the render resolution set at 100%. Now why would I blur the beautiful scenery with upscaling techniques?!

If you have the rendering power, running native is best.
Simplicity itself.

@Sunny and 75 I see the reacts you dropped, but he's wrong, mate. Go try it for yourself; you'll see what we've known to be true for quite some time now ;)
To each their own, mate, good day to you :)

Cheers ;)
 