Wednesday, March 20th 2024

AMD Announces FSR 3.1, Improves Super Resolution Quality, Allows Frame Generation to Work with Other Upscaling Tech

AMD at GDC 2024 announced FidelityFX Super Resolution 3.1 (FSR 3.1). While the original FSR 3.0 feature set largely carried forward the super resolution upscaler from FSR 2.2, adding frame generation on top, FSR 3.1 makes several image quality improvements to the upscaler itself, at every performance preset. Specifically, it improves the temporal stability of the output both at rest and in motion, reducing the flickering, shimmering, and "fizziness" around objects in motion. The new upscaler also reduces ghosting and better preserves detail.

Next up is a rather important change in the way the frame generation technology works. AMD has decoupled FSR 3.1 frame generation from the upscaling tech, which allows frame generation to work with other upscaling solutions, such as DLSS or XeSS. The possibilities of such a decoupling are endless: have an RTX 30-series "Ampere" GPU that lacks DLSS 3 frame generation support? No worries, use DLSS 2 for the upscaling and FSR 3.1 for the frame generation. AMD is also consolidating its FidelityFX family of technologies into a new FidelityFX API that makes it easier for developers to debug, and paves the way for forward compatibility with future versions of FSR. Lastly, FSR 3.1 supports the Vulkan API and the Microsoft Xbox GDK. AMD plans to release FSR 3.1 to developers through its GPUOpen platform in Q2 2024, and its first implementations in games are expected later this year. In the meantime, AMD has implemented FSR 3.1 in "Ratchet & Clank: Rift Apart" to showcase the new upscaler.
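The decoupling described above can be sketched architecturally. The following is an illustrative C++ sketch only, not the actual FidelityFX API: every name here (`Upscaler`, `DlssUpscaler`, `FrameGenerator`, `Frame`) is invented for the example. The point is the interface boundary: a frame generator that only consumes finished, already-upscaled frames does not care which upscaler produced them.

```cpp
// Illustrative sketch only -- NOT the actual FidelityFX API. All names here
// are invented to model the architectural change: FSR 3.1 frame generation
// consumes finished frames, so any upscaler can sit in front of it.
#include <string>
#include <vector>

// Stand-in for a rendered frame; real code would hold GPU resources,
// motion vectors, and depth rather than a plain struct.
struct Frame {
    int width = 0;
    int height = 0;
    std::string producedBy;  // which upscaler emitted this frame
};

// Any upscaler (FSR, DLSS, XeSS, ...) only has to satisfy this contract:
// take a low render resolution and produce a full-resolution frame.
class Upscaler {
public:
    virtual ~Upscaler() = default;
    virtual Frame upscale(int renderW, int renderH, int outW, int outH) = 0;
};

// Hypothetical wrapper around a third-party upscaler.
class DlssUpscaler : public Upscaler {
public:
    Frame upscale(int /*renderW*/, int /*renderH*/, int outW, int outH) override {
        return Frame{outW, outH, "DLSS"};
    }
};

// Because it only sees finished frames, the frame generator never needs to
// know which upscaler ran: it interpolates a frame between two inputs and
// presents both, doubling the output frame rate.
class FrameGenerator {
public:
    std::vector<Frame> generate(const Frame& prev, const Frame& curr) {
        (void)prev;  // a real interpolator blends prev and curr
        Frame interpolated{curr.width, curr.height, curr.producedBy + "+FSR3.1-FG"};
        return {interpolated, curr};  // one generated frame + one real frame
    }
};
```

In the real SDK the boundary is GPU resources rather than structs, but the contract is analogous: any upscaler that can hand over full-resolution frames (plus whatever motion data frame generation needs) can sit in front of FSR 3.1's interpolation pass.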
Source: AMD

44 Comments on AMD Announces FSR 3.1, Improves Super Resolution Quality, Allows Frame Generation to Work with Other Upscaling Tech

#26
stimpy88
Will we see a AAA title, new or old, supporting FSR 3.1 this year?
#27
Trunks0
stimpy88: Will we see a AAA title, new or old, supporting FSR 3.1 this year?
Well... already confirmed is Ratchet and Clank: Rift Apart :p
#28
Prima.Vera
btarunr: AMD has decoupled FSR 3.1 frame generation from the upscaling tech, which allows frame generation to work with other upscaling solutions, such as DLSS or XeSS. The possibilities of such a decoupling are endless—have an RTX 30-series "Ampere" GPU that lacks DLSS 3 frame generation support? No worries, use DLSS 2 for the upscaling, and FSR 3.1 for the frame generation.
This! Allow us to give KUDOS to AMD, and both middle fingers to nGreedia!
Even so, I have a feeling they will bring their Frame Generation to the RTX 20 and 30 series too....
Denver: imgsli.com/MjI3Mjcz/0/3

Both DLSS and TAA are image blurring techniques, which is extremely annoying. I know what temporal filters are.
Yes, DLSS is crap in most games, except Starfield for some reason. Not sure why, but even with low-quality DLSS settings and a little sharpening, the image is almost as good as native at 3x the FPS. Guess it's all in the implementation. Motion blur is also very well done in the same game.
#29
wheresmycar
Prima.Vera: Even so I have a feeling they will bring their Frame Generation to 20 and 30 RTX series too
Hopefully that would be the expected FSR 3.1 consequence, although knowing Nvidia, everything and nothing is possible... nV doesn't like decoupling and sharing the good stuff (well, no one does really) - we can only hope FSR's FG measures up to nV's offering on the 40-series with no compromises... a good knee in the balls for nV to surrender!
#30
stimpy88
wheresmycar: Hopefully that would be the expected FSR 3.1 consequence. Although knowing Nvidia everything and nothing is possible... nV doesn't like decoupling and sharing the good stuff (well no-one does really) - we can only hope FSR's FG measures up to nV's offering on 40-series with no compromises... a good knee in the balls for nV to surrender!
Don't forget nGreedia's lies regarding their "optical flow" accelerators! The 30x0 and 20x0 just don't have them to make it work! We all know it's BS BTW, just like ReBAR support on the 20x0 series!
#31
Vya Domus
Chrispy_: Honestly, the only good thing about upscaling is the fact that you can natively render the UI and run the game at a lower resolution, but all of these comparisons to native on Youtube are disingenuous because honestly it's very hard to really equate any upscaling with native in most games. Something like BG3, where movement is slow and predictable, is an acceptable compromise for upscaling, but those are also the sort of games that need the least upscaling help in the first place.
Upscaling should only have been used in scenarios where the base resolution isn't really all that low, like 75% of the native resolution. But instead of using it sparingly, developers are absolutely leaning on it as a crutch all the time. And if you think we have it bad, there are games on consoles that are upscaled from resolutions close to 720p lol.
#32
Chrispy_
Vya Domus: Upscaling should only have been used in scenarios where the base resolution isn't really all that low, like 75% of the native resolution. But instead of being used sparingly developers are absolutely using it as a crutch all the time, if you think we have it bad enough there are games on consoles that are upscaled from resolutions close to 720P lol.
Yup, I've been shouting that from the rooftops ever since DLSS first launched to prop up the untenable performance hit of RT.

Starfield might have been the worst example of this to date - even the highest presets included a pretty bad FSR implementation to cover up the pathetic performance of Bethesda's 20-year-old, woefully obsolete Creation Engine pushed so far beyond its capabilities that even a 4090 struggles to make it look okay.

I got Starfield free with a 7800XT purchase, so it ran well enough to play, but I simply couldn't believe how bad the game looked and ran. There are DX9 XB360 games that look better.
#33
nguyen
wolf: As in, Upscaling (specifically DLSS and in my case at 4k on an OLED) producing a better result than the forced TAA at native res, that cannot be totally disabled in a majority of modern games and is often very average at best.

If the comparatively rarer game comes out where TAA is not forced, sure you can get a crisper image at native with say FXAA or SMAA or even no AA (if you can tolerate that unstable mess), at the expense of other faults in the image, no denying that.

Personally I can't stand shimmer, but that's highly personal, I 100% understand that to some the softer resolve is what they can't stand and are willing to trade that perhaps against shimmer for example.

But I try my best to not put forward my opinion as if it's a universal fact that applies to everyone. For me, a potentially slightly softer (4k mitigates the majority of softness; this gets worse as resolution lowers) but very stable image without shimmer, fizzle, flicker and breakup is eminently desirable over a slightly sharper image with one or more of those artefacts persisting.

All decided on a per game basis mind you, I can't remember the titles right now but recently I did play a couple of games where I did go with FXAA or SMAA as I had the rendering budget to spare and the art/geometry etc style didn't present many opportunities for the artefacts I can't stand.
I guess there are people who just can't grasp that with AI upscaling enabled, they will get superior image quality (higher res and higher details) than without upscaling at the same FPS :rolleyes:.

Though FSR2.2 is still pretty far from being ideal, with so many visual artifacts that negate all the benefits.
#34
wolf
Better Than Native
nguyen: I guess there are people who just can't grasp that with AI upscaling enabled, they will get superior image quality (higher res and higher details) than without upscaling at the same FPS :rolleyes:.
Yeah, to some extent it is what it is with people: either they can't grasp it because they haven't seen it for themselves, or they just don't want to admit it and so make other random, disingenuous arguments unconnected to IQ and fps. To what end? I'm really not certain, but to some people it is clearly political.

To me it's simply that the proof is in the pudding: if it's as good or better IQ, that's that, and in effect it doesn't matter how we got there. And you make a great case for performance-normalised IQ too. I am of course fascinated by all manner of rendering technology, but it's always struck me as odd that people will die on the hill of what's going on behind the curtain being somehow totally unacceptable. Granted, of course, it's a bit of a dog's breakfast as to which upscaling solution you use, which sub-version, which game and how well it's implemented, what output resolution and monitor technology you game at, and so on. You and I, on 4k120 OLEDs and using DLSS, are effectively a best-case scenario for the technology.
nguyen: Though FSR2.2 is still pretty far from being ideal, with so many visual artifacts that negate all the benefits.
I truly want it to be better, I really do, and I am keen to do some back-to-back testing myself when the 3.1 update drops in R&C Rift Apart. It is unfortunate that one of the solutions with the broadest compatibility, and perhaps broadest appeal, has so many flaws (up to v3.0) at the resolutions where it's needed the most.
#35
stimpy88
nguyen: I guess there are people who just can't grasp that with AI upscaling enabled, they will get superior image quality (higher res and higher details) than without upscaling at the same FPS :rolleyes:.

Though FSR2.2 is still pretty far from being ideal, with so many visual artifacts that negate all the benefits.
We will just ignore the visual artifacts then...
#36
AusWolf
nguyen: 80-90 FPS with Upscaling looks much crisper than 60 FPS Native in motion though.
120 FPS with Upscaling + Frame Gen is even way, way crisper than 60 FPS Native
I don't need upscaling at 60 FPS native. I would at 30 FPS, but then, it works with too little input to give me an acceptable picture. The only place for upscaling, imo, is when I fire up my 4K TV with the small HTPC it's connected to, because 1. There's no other way to play at 4K with a 1660 Ti, and 4K+FSR looks better than 1080p on a 4K screen, and 2. I sit far enough from the TV not to really care about the slight loss of image quality.

As for the article, I'm glad AMD is trying to improve. I'm wondering, though, if already released games will get FSR 3.1 support, or if you can just pop some DLLs in, like you can with DLSS.
#37
wolf
Better Than Native
stimpy88: We will just ignore the visual artifacts then...
Sure seems like people do ignore visual artefacts when decrying upscaling, when Native without AA, or Native + TAA, FXAA, SMAA, MSAA etc all have artifacts and visual/performance drawbacks of their own.
#38
stimpy88
wolf: Sure seems like people do ignore visual artefacts when decrying upscaling, when Native without AA, or Native + TAA, FXAA, SMAA, MSAA etc all have artifacts and visual/performance drawbacks of their own.
It's the ghosting and motion smearing mostly.
#39
theouto
stimpy88: It's the ghosting and motion smearing mostly.
And oftentimes that motion smearing and ghosting (plus other breakup in motion, especially in wires/thin lines, a blurrier image, etc.) can be distracting enough for some to prefer the shimmer of more conservative AA techniques (SMAA, MSAA, CMAA2, etc.) over the blur that can be caused by more extreme/temporal AA techniques (TAA, DLAA, FSR Native).
I myself prefer the sharpness and non-temporal artifacting of SMAA over TAA (for reasons found in a very long text that I put in my profile).
It's a choosing game, it becomes a problem when we can no longer choose (either due to TAA/Upscaling being forced, or poor performance that encourages Upscaling).

(Also yes, I am lumping TAA in with Upscaling; they are all just TAA at the end of the day, the upscalers are just better at it)
#40
Super XP
AMD continuing to support open source is a huge Plus.
#41
ben dover
I just tried a 7900 XTX and, to be honest, the driver is not very good. No better than the old days. While fiddling around, the power tuning refused to turn off again, resulting in a massive fps drop every 10 secs and rendering the game unplayable. Sorry, but I really wanted this to work. Sold the card and went straight back to the 3080, which just works. No new card until next-gen Nvidia.
#42
theouto
I never know how people have these catastrophic failures, but somehow it's 99% TPU users that have these issues happen to them.
#43
Trunks0
theouto: I never know how people have these catastrophic failures, but somehow it's 99% TPU users that have these issues happen to them.
His comment is also incredibly random and outta place.
For counter balance? Running a PowerColor Red Devil 7900 XTX, with basically zero issues now for a little over a year? It's been great :toast:
#44
kapone32
Trunks0: His comment is also incredibly random and outta place.
For counter balance? Running a PowerColor Red Devil 7900 XTX, with basically zero issues now for a little over a year? It's been great :toast:
I had more issues with my 6800XT than my 7900XT. I also love it for what Gaming is for me today.