
AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

Status
Not open for further replies.
I am sorry, but what exactly is the revolutionary technology being introduced here?

Real-time frame interpolation is nothing new; TVs have had it for what, a decade now? To be perfectly frank, it's embarrassing that Nvidia and AMD couldn't find a way of enabling this at the driver level ages ago.
Computer-generated graphics are much finer than film. Lossy compression is widely employed for content streaming. The idea is not revolutionary, but until now we didn't have the means to come up with approximations that wouldn't cause shimmering left and right.
 
Actually, it makes it far more likely that you're talking out of your arse, since FSR 3 and DLSS 3 are completely different technologies, which means anyone extrapolating the technical characteristics of one from the other is a moron.


As long as there are people, there will be stupid people. As long as there are stupid people, they will make stupid comments. Like "fake frames".


[citation needed]


It's 2023. RX 550 was released in 2017. 6 years is an eon in technology terms. Stop whining and deal with it.


Another person extrapolating technical characteristics of something from a completely unrelated thing, and making themselves look Homer Simpson smart in the process.
Being a jackass doesn't win you points. Both of them interpolate frames, and the fact that AMD was able to find a generic way of doing it means Nvidia could have done it the same way and chose not to. Why they didn't is left as an exercise for the reader. In my opinion, if you're looking to buy a bridge in Brooklyn, then Nvidia's explanation makes sense.
 
View attachment 310639
Oh jeez, AMD, for the love of god, please do not rush it out in this state... This checkerboard effect I'm seeing in the second Forspoken image example does not bode well.

The first image looks similar, so it looks like it's just a bad game to me.
 
More excited for FSR 3 than anything.

If they can turn this into a G-Sync/FreeSync situation, it would be a huge win.
 
I was most concerned by the ridiculous oversharpening on the wall...

but to be fair, it is zoomed in. Granted, they chose to do so, but still, it might not be this bad at normal viewing distance.



That's... actually not how this works, Dr. Fanboy.

Nvidia simply took over research that had been done before them by others; frame generation was first shown as a tech demo for The Force Unleashed 2 by LucasArts....
Secondly, they did not get blasted at all; it was really well received, so I'm not sure why you're playing some weird victim card here.
The only criticism was the latency it introduced, and we would be super hypocritical NOT to raise some points about that, considering the whining people have been doing about monitor response rates and running no V-Sync because "moar latency bruv"....

And lastly, AMD is the darling because... it's not proprietary; it works for borderline everyone. Where big daddy Nvidia leaves GTX 1080 Ti owners behind to die off, AMD comes in with some of that FSR for them to run and extends the longevity of their purchase.

Surely even you can see how AMD might be a bit more appreciated for that reason.
Apparently the 1080 Ti isn't supported by FSR 3.


Your point is otherwise well taken.
 
Last edited:
What can I say? They persistently offer, once again, no support for my not-that-old Radeon RX 550. I'm really sorry, AMD, but I can't get excited about your limited solutions. Maybe in 2029, when I'll be assembling my next 'low TDP enthusiast' PC, I can finally start to get excited about it. That's if you're still working on it.
I own the Sapphire Pulse version of the RX 550. It was never meant to be a gaming powerhouse. With 2GB of VRAM, it mostly exists to provide graphics for systems that don't have an iGPU built into the CPU and to offer a selection of connector options.

When I bought mine for $65 in September 2019 (below the $79 launch price), I never planned to use it for gaming. And I never did. Today it sits in a closet as an emergency backup just in case one of my GPUs blows up.

I don't even consider it a "true" Polaris card (I also have an RX 580). It's a Lexa PRO GPU, so really a baby Polaris. The architecture is different; Apple doesn't even support it for eGPUs with the Intel Macs.

Anyhow, good on AMD for releasing a platform-agnostic frame generation technology. No one should complain, because if you don't want it you can turn it off.
 
Another person extrapolating technical characteristics of something from a completely unrelated thing, and making themselves look Homer Simpson smart in the process.
They all work in exactly the same way: dividing the frames into a grid, using motion vectors to estimate which blocks within the grid have changed between frames, and then reconstructing the intermediate frame according to how those blocks have moved. Stop embarrassing yourself and look up how this works; this is neither new nor particularly groundbreaking.
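For what it's worth, the generic block-matching idea described above can be sketched in a few lines. This is a toy illustration of the technique, not AMD's or Nvidia's actual pipeline; the function name and parameters are made up for this example:

```python
import numpy as np

def interpolate_frame(prev, nxt, block=8, search=4):
    """Synthesize an in-between frame by block matching: for each block
    of `prev`, find the best-matching block in `nxt` within a small
    search window, then splat the averaged block halfway along its
    motion vector. Grayscale only; real interpolators are far smarter."""
    h, w = prev.shape
    mid = np.zeros((h, w), dtype=np.float64)
    weight = np.zeros((h, w), dtype=np.float64)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev[by:by + block, bx:bx + block].astype(np.float64)
            best_sad, best = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate block would fall off the frame
                    cand = nxt[y:y + block, x:x + block].astype(np.float64)
                    sad = np.abs(ref - cand).sum()  # sum of absolute differences
                    if sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            dy, dx = best
            cand = nxt[by + dy:by + dy + block, bx + dx:bx + dx + block].astype(np.float64)
            # place the averaged block halfway along its motion vector,
            # clamped so it stays inside the frame
            my = min(max(by + dy // 2, 0), h - block)
            mx = min(max(bx + dx // 2, 0), w - block)
            mid[my:my + block, mx:mx + block] += (ref + cand) / 2
            weight[my:my + block, mx:mx + block] += 1
    weight[weight == 0] = 1  # uncovered holes stay zero instead of dividing by 0
    return mid / weight
```

Feed it two frames of a square moving 4 px to the right and the square lands 2 px along its path in the synthesized frame. The holes and double-covered regions this naive splatting leaves behind are exactly the kind of places where visible artifacts show up in such schemes.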
 
It's amusing how it works, Nvidia introduces a pioneering technology, gets blasted for it, AMD releases an inferior copy and it's everyone's darling for doing so.

No, frame interpolation isn't "pioneering technology", and I see very few people praising this. There are more people here complaining that it's being praised, ironically; a lot of Nvidia me-tooers.
 
Nice! FSR 3 looks promising, but I'm not sure why AMD chose Forspoken as the game to showcase FSR 3, as it's not a popular game.
Immortals of Aveum is badly optimized, but nonetheless I look forward to the 7700 XT and 7800 XT testing being done.
 
View attachment 310639
Oh jeez, AMD, for the love of god, please do not rush it out in this state... This checkerboard effect I'm seeing in the second Forspoken image example does not bode well.
Don't worry. What you're seeing in DLSS 3 isn't extra image fidelity either. You can't have everything from nothing with a little guessing. That's why upscaling is bad as a headline feature and should only have been marketed as a last resort.

Now devs can happily start programming in MS Basic.
 
Nice! FSR3 looks promising, but i am not sure why amd chose forspoken as a game to showcase the FSR3 as its not a popular game
Immortals of Aveum is badly optimized but none the less i look forward to 7700xt and 7800xt testing done

The games were probably chosen long before reviews came out. These things take time.

I'm just glad Cyberpunk 2077 will get it. I wish Red Dead Redemption 2 were getting it; those are the two most demanding games on my wish list.
 
It's amusing how it works, Nvidia introduces a pioneering technology, gets blasted for it, AMD releases an inferior copy and it's everyone's darling for doing so.
Well, what choice do they have? Have you read a review or comparison on a tech website that doesn't recommend Nvidia due to better ray tracing or DLSS? Either release your own version, or continue to get beaten up by the tech press because you don't have this "great" feature.

If the press thought it was crap, which I think it is... it would be a non-starter. But everything Nvidia comes out with is "must have" (until it isn't... PhysX, anyone?).
 
It'll be interesting to see if the early implementations suffer from the same UI artifacts and frame-stretching issues that DLSS 3 had/has.
 
It'll be interesting to see if the early implementations suffer from the same UI artifacts and frame-stretching issues that DLSS 3 had/has.
We'll see.

As I pointed out earlier, these technologies really need to be reviewed in real life: in real applications, in real-time action, especially because this technology is specifically designed to improve moving images. Looking at a blown-up still image of one moment in a game isn't a good methodology.
 
We'll see.

As I pointed out earlier, these technologies really need to be reviewed in real life: in real applications, in real-time action, especially because this technology is specifically designed to improve moving images. Looking at a blown-up still image of one moment in a game isn't a good methodology.

Again, and as I repeatedly mention in every frame-gen thread: if there are significant artifacts to be seen in stills and in motion, the technology is useless. There's no reason to buy expensive hardware and then introduce artifacts by enabling upscalers and frame gen.
 
No, frame interpolation isn't "pioneering technology", and I see very few people praising this. There are more people here complaining that it's being praised, ironically; a lot of Nvidia me-tooers.

I've little interest in the brand fighting, really. Not gonna change anyone's mind here.

But I'm eagerly waiting to see the same people who called DLSS frame generation "fake frames" and "Vaseline smear" praise this thing just because they can use it now. It's amusing how far people will go to justify their allegiance to a brand or their investment ;)

Like, I have an Ada card now, but I've never used DLSS FG; for me, it's not even a thing I care about. But it's funny anyway.

My growing list of grievances against AMD at this point is well known, but it's truly beside the point.

I keep my frustration for anger-management sessions, calling them names on my relatively private Discord server :)
 
But I'm eagerly waiting to see the same people who called DLSS frame generation "fake frames" and "Vaseline smear" praise this thing just because they can use it now. It's amusing how far people will go to justify their allegiance to a brand or their investment ;)

That's 100% going to happen and I'm here for it too haha.
 
Regardless of AMD or Nvidia, I find it funny that not long ago, limiting the frame queue to minimise latency was a thing, and now we have two pre-rendered frames and the GPU inserting a third one in between.
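Back-of-the-envelope math for why that queue matters, under the simplifying assumption that an interpolating frame generator has to hold the newest real frame back until the following real frame exists (the exact pipelines differ per vendor, and the function here is made up for illustration):

```python
def frame_gen_latency_ms(real_fps: float, gen_cost_ms: float = 0.0) -> float:
    """Extra input-to-photon latency from frame interpolation: roughly
    one real-frame interval of buffering, plus the cost of generating
    the in-between frame. A simplification, not any vendor's model."""
    return 1000.0 / real_fps + gen_cost_ms

# The lower the real frame rate, the bigger the latency penalty,
# even though the displayed frame rate doubles.
for fps in (30, 60, 120):
    extra = frame_gen_latency_ms(fps)
    print(f"{fps:>3} real fps -> shown as {2 * fps} fps, ~{extra:.1f} ms added latency")
```

Which is exactly the irony: the people who were most sensitive to one extra queued frame are the target audience for a feature that effectively adds one.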
 
late September and late December, right?
Last time I checked it is early September to late November.
It's amusing how it works, Nvidia introduces a pioneering technology, gets blasted for it, AMD releases an inferior copy and it's everyone's darling for doing so.
Everything Nvidia has introduced was available before. PhysX was Ageia tech; Nvidia bought it, gave the middle finger to those who had bought Ageia cards, and locked it to only work in systems where the Nvidia card was primary. A simple patch unlocked it, making it totally clear that CUDA and PhysX could work in a system with an AMD GPU as primary and an Nvidia GPU as secondary. At the same time, programmers coincidentally forgot how to program good physics on the CPU, and everything heavily promoted by Nvidia had $h!tty physics without hardware PhysX. When it was pretty clear that hardware PhysX was dead, Nvidia removed the lock.

Variable refresh rate was already available in laptops, used to reduce battery consumption. Nvidia took it, used it to smooth graphics in games, introduced a hardware-only solution, and made money even from those hardware G-Sync boards. AMD offered FreeSync to the masses, and after it was clear that G-Sync was losing the battle, Nvidia decided to offer G-Sync Compatible, which is VESA's VRR, or in other words, FreeSync.

Ray tracing is older than many of us. Upscaling and frame generation, the same. But as far as frame generation goes, the main rule for the last 10, 15, 20, I don't know how many years, has been to prefer a TV with a TRUE high refresh rate instead of fake refresh rate stuff.
Fake Refresh Rate Conversion: How To Not Get Tricked By A TV Manufacturer - RTINGS.com

Let's take SONY as an example:
What is Motionflow XR and X-Motion Clarity? | Sony UK
The human eye can see a slight ‘judder’ as it perceives small differences in position between successive TV frames.

This judder or movement is most noticeable when on-screen objects are moving rapidly, and there is a significant difference between the image's position in each frame. By increasing the number of images seen every second, the TV produces a smoother, more natural TV picture.

The native panel frequency (also called the refresh rate) refers to how many images a panel can display in one second (Hz). Motionflow XR combines native panel frequency with other techniques, so pictures become smoother and more natural:
  • Frame insertion, which creates new motion-compensated frames that are inserted between the original TV frames
  • LED Backlight control, for controlling the panel’s LED backlighting to make images sharper
  • Image Blur Reduction, to compensate for blurred images, cleaning up the subject so that it appears clearer

Motionflow XR can go up to 1200 Hz depending on the TV model. To check the specific range for your TV, please go to Specifications located on your TV's product page.


You can adapt Motionflow by going to the [Picture] settings menu.

I have a feeling we'll see a lot less "I don't want no fake frames" messages from now on.
But good job making this vendor agnostic. Let's see the image quality.
With AMD following, and probably Intel not far behind, fake frames are becoming a standard part of the hardware. So whether we want it or not, it becomes the standard. And with fake frames in games, we get stagnation in real performance. Because what is the difference between an RTX 3060 Ti and an RTX 4060 Ti without FG as an advantage for the 4060?
Also, expect games to be even more unoptimised in the future, and with next-gen consoles getting upscaling and FG from day one, get ready to see plenty of 30 fps on an RTX 4090 WITHOUT FG in 2-3 years from now.

But I'm eagerly waiting to see the same people who called DLSS frame generation "fake frames" and "Vaseline smear" praise this thing just because they can use it now. It's amusing how far people will go to justify their allegiance to a brand or their investment ;)
It's still fake frames. Expect to be more amused when your RTX 4080 absolutely needs FG to get 60 fps in future games. I will definitely be amused by RTX 4080 and RTX 4090 owners screaming about optimisation.
 
I have a feeling we'll see a lot less "I don't want no fake frames" messages from now on.
But good job making this vendor agnostic. Let's see the image quality.
You're probably right, although I still don't want fake frames, just like I still don't want fake resolutions, either.
 
Last time I checked it is early September to late November.

Everything Nvidia had introduced was available before.

Paraphrasing the Doktor, "ich liebe kapitalismus..." but you can't say that AMD hasn't undergone some serious corporate mergers of their own ;)
 
Paraphrasing the Doktor, "ich liebe kapitalismus..." but you can't say that AMD hasn't undergone some serious corporate mergers of their own ;)
"I love capitalism" is a nice excuse when having plenty of money around and don't care about others.
 
"I love capitalism" is a nice excuse when having plenty of money around and don't care about others.

You don't really think AMD cares about you, do you? That they're doing this out of the kindness of their hearts?

Don't you think the timing is just a bit too convenient? It couldn't possibly be intended to deflect from the overwhelmingly negative press about the 7800 XT being slower than the 6800 XT; I guess that's just silly ol' me and my ngreediot tendencies :rolleyes:

By the time my 4080 runs any game at 30 fps, there are only two courses of action: I'm upgrading (or rather, I've already upgraded twice since then), or, if it's due to unreasonably high system requirements that just make no sense, I'm simply not playing the game.
 
More excited for FSR 3 than anything.

If they can turn this into a G-Sync/FreeSync situation, it would be a huge win.

Definitely no need to fanboy about it, for sure, no pun intended :laugh:

As an owner of a 4090, I hope it's better than Nvidia's alternative, or at the very least comparable. I really like the feature, and as others have stated, it's a bummer that it's locked to 40-series cards.

Late is better than never, and honestly, hopefully AMD has used this time to perfect it as much as they can.
 