Friday, August 25th 2023

AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

In addition to the Radeon RX 7800 XT and RX 7700 XT graphics cards, AMD announced FidelityFX Super Resolution 3 Fluid Motion (FSR 3 Fluid Motion), the company's performance enhancement designed to rival NVIDIA DLSS 3 Frame Generation. The biggest piece of news here is that unlike DLSS 3, which is restricted to GeForce RTX 40-series "Ada" GPUs, FSR 3 enjoys the same kind of cross-brand hardware support as FSR 2. It works on the latest Radeon RX 7000 series, on previous-generation RX 6000 series RDNA 2 graphics cards, and on NVIDIA GeForce RTX 40-series, RTX 30-series, and RTX 20-series GPUs. It might even be possible to use FSR 3 with Intel Arc A-series, although AMD wouldn't confirm it.

FSR 3 Fluid Motion is a frame-rate doubling technology that generates alternate frames by estimating an intermediate frame between two frames rendered by the GPU (which is essentially what DLSS 3 does). The company did not detail the underlying technology behind FSR 3 in its pre-briefing, but showed an example of FSR 3 implemented in "Forspoken": the game, which puts out 36 FPS at 4K native resolution, is able to run at 122 FPS with the FSR 3 "Performance" preset (upscaling + Fluid Motion + Anti-Lag). At 1440p native with ultra-high ray tracing, "Forspoken" puts out 64 FPS, which rises to 106 FPS at native resolution (no upscaling) with Fluid Motion frames and Anti-Lag. The Maximum Fidelity preset of FSR 3 is essentially AMD's version of DLAA, using the detail-reconstruction and anti-aliasing features of FSR without dropping the rendering resolution.
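AMD has not published the algorithm behind Fluid Motion frames, but the general shape of interpolation-based frame generation can be sketched in a few lines. Below is a deliberately naive illustration in Python, assuming a plain per-pixel blend between two already-rendered frames; the real FSR 3 pass relies on game motion vectors and an optical-flow field, and the function names here are hypothetical.

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Synthesize an in-between frame as a simple per-pixel blend (a toy stand-in:
    FSR 3 reprojects pixels using motion vectors and optical flow instead)."""
    return ((1.0 - t) * prev_frame + t * next_frame).astype(prev_frame.dtype)

def present_with_frame_generation(rendered: list[np.ndarray]) -> list[np.ndarray]:
    """Insert one generated frame between each pair of rendered frames, which is
    why presented frame rates land near 2x the rendered rate, minus overhead."""
    presented = []
    for prev_frame, next_frame in zip(rendered, rendered[1:]):
        presented.append(prev_frame)
        presented.append(interpolate_frame(prev_frame, next_frame))
    presented.append(rendered[-1])
    return presented

# Two rendered 4x4 "frames" become three presented frames.
frames = [np.zeros((4, 4), dtype=np.float32), np.ones((4, 4), dtype=np.float32)]
print(len(present_with_frame_generation(frames)))  # 3
```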
AMD announced just two title debuts for FSR 3 Fluid Motion: the already-released "Forspoken," and "Immortals of Aveum," which released earlier this week. The company announced that it is working with game developers to bring FSR 3 support to "Avatar: Frontiers of Pandora," "Cyberpunk 2077," "Warhammer 40,000: Space Marine 2," "Frostpunk 2," "The Alters," "Squad," "Starship Troopers: Extermination," "Black Myth: Wukong," "Crimson Desert," and "Like a Dragon: Infinite Wealth." The company is working with nearly all leading game publishers and game engine developers to add FSR 3 support, including Ascendant Studios, Square Enix, Ubisoft, CD Projekt Red, Saber Interactive, Focus Entertainment, 11 bit studios, Unreal Engine, Sega, and Bandai Namco Reflector.
AMD is also working to make FSR 3 Fluid Motion frames part of the HYPR-RX feature that the company is launching soon. This is big, as it means practically any DirectX 11 or DirectX 12 game will get Fluid Motion frames when the feature launches in Q1 2024.

Both "Forspoken" and "Immortals of Aveum" will get FSR 3 patches this Fall.

362 Comments on AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

#276
oxrufiioxo
TheoneandonlyMrKYou're lucky you can afford to.

If some hadn't chosen to go with the competition regardless, for their own reasons, you wouldn't be able to afford a GPU now.


Now imagine if Huang had his way, a monopoly, THEN this AI boom kicked in.

As I said, you're lucky to have that option.
Not for much longer at the rate GPU pricing is increasing... And really, they've already been pricing their flagships at whatever they want for a decade.
#277
fevgatos
TheoneandonlyMrKYou really are the worst version of Poirot or Clouseau or Agatha Christie; imagine, you would hypothetically bang up more innocent people than anyone: it's him, he was there, I said so.

You're lucky you can afford to.

If some hadn't chosen to go with the competition regardless, for their own reasons, you wouldn't be able to afford a GPU now.


Now imagine if Huang had his way, a monopoly, THEN this AI boom kicked in.

As I said, you're lucky to have that option.
Yeah, let's buy the worst product for competition's sake!

People weren't buying AMD CPUs for like 10 years or something; I was still able to afford CPUs. In fact, mainstream CPUs cost a third of what they cost today back then, when there was no "competition".
#278
kapone32
fevgatosYeah, let's buy the worst product for competition's sake!

People weren't buying AMD CPUs for like 10 years or something; I was still able to afford CPUs. In fact, mainstream CPUs cost a third of what they cost today back then, when there was no "competition".
You are talking trash. The focus may be on 16-core monsters, but there have been chips like the 3300X, 2400G, and 10400F that are perfect for what you want on a budget.
#279
TheoneandonlyMrK
oxrufiioxoNot for much longer at the rate GPU pricing is increasing... And really, they've already been pricing their flagships at whatever they want for a decade.
Well, there's an extreme angle: Intel, AMD and Nvidia are all working on APUs to sell cheap and game on, and perhaps eventually low-end discrete cards will die out. I'm actually OK with that so long as the APUs can cope, but that's all too off-topic.

Because this thread is about FSR 3: I do like its wide support, in fact I prefer having the option to the alternatives, but then again there are few games where I don't ever spin 180° fast at some point, and that lag I just can't do.

@fevgatos see paragraph two, any chance of on-topic chat please?
#280
oxrufiioxo
TheoneandonlyMrKWell, there's an extreme angle: Intel, AMD and Nvidia are all working on APUs to sell cheap and game on, and perhaps eventually low-end discrete cards will die out. I'm actually OK with that so long as the APUs can cope, but that's all too off-topic.

Because this thread is about FSR 3: I do like its wide support, in fact I prefer having the option to the alternatives, but then again there are few games where I don't ever spin 180° fast at some point, and that lag I just can't do.

@fevgatos see paragraph two, any chance of on-topic chat please?
Well, you know any AMD/Nvidia article is always going to go to shite; people treat these companies like their favorite football team.

I find FSR 3 promising, but I really want to see it independently tested to judge how well it works with asynchronous compute.
#281
AusWolf
fevgatosYeah, let's buy the worst product for competition's sake!
Worst product? Why? Because it doesn't support DLSS?
fevgatosPeople weren't buying AMD CPUs for like 10 years or something; I was still able to afford CPUs. In fact, mainstream CPUs cost a third of what they cost today back then, when there was no "competition".
You could afford them because one (Intel) generation was exactly the same as the last one. GPUs nowadays, on the other hand, keep getting faster, but also more expensive. I could easily afford the high-end 10 years ago while I still lived on my measly student loan. Now I can only afford mid-tier on a full-time salary with a night shift premium. At this rate, even the low-end will be out of my reach in the next 10-15 years.
#282
fevgatos
AusWolfWorst product? Why? Because it doesn't support DLSS?
In general, AMD cards are equal in raster to competing Nvidia cards, have higher power draw, way fewer features, and worse RT performance. So yes, they are worse, and the only way for AMD to compete is by price cuts. Which is great, but it would be better if they actually launched their cards with lower prices to begin with. Take the 7900 XT as a prime example: at launch it was worse in RASTER performance per dollar compared to the 4070 Ti. That is absolutely absurd, considering all the other pros the 4070 Ti has over it.
AusWolfYou could afford them because one (Intel) generation was exactly the same as the last one
Well, that's a great myth people keep repeating, but it's not true. You don't have to believe me, do the math yourself. Compare the multithreaded performance of an i7-2600K vs. an i7-6700K (same price point) and compare it to the multithreaded performance increase between an R7 1700 and an R5 5600X (same price point). You'll realize that Intel gave us more performance per gen back then than AMD does today :roll:
#283
kapone32
fevgatosWell, that's a great myth people keep repeating, but it's not true. You don't have to believe me, do the math yourself. Compare the multithreaded performance of an i7-2600K vs. an i7-6700K (same price point) and compare it to the multithreaded performance increase between an R7 1700 and an R5 5600X (same price point). You'll realize that Intel gave us more performance per gen back then than AMD does today :roll:
Yes, an 8-core CPU vs. a 6-core, but it doesn't matter. Regardless of how you feel, there are applications where the 5600X would be faster even with 2 fewer cores.
#284
fevgatos
kapone32Yes, an 8-core CPU vs. a 6-core, but it doesn't matter. Regardless of how you feel, there are applications where the 5600X would be faster even with 2 fewer cores.
The number of cores is irrelevant. We are talking about performance. Who actually cares about cores? Do you care how many cores your GPU has, or how fast it is? Personally I've no idea how many cores my 4090 has.
#285
AusWolf
fevgatosIn general, AMD cards are equal in raster to competing Nvidia cards, have higher power draw, way fewer features, and worse RT performance. So yes, they are worse, and the only way for AMD to compete is by price cuts. Which is great, but it would be better if they actually launched their cards with lower prices to begin with. Take the 7900 XT as a prime example: at launch it was worse in RASTER performance per dollar compared to the 4070 Ti. That is absolutely absurd, considering all the other pros the 4070 Ti has over it.
You can pack it full of "features", wrap Christmas lights around the box and glaze it in exquisite Swiss chocolate; a $799 MSRP for a 12 GB card in 2023 is still nothing short of a joke (or rather, a slap in the face).
fevgatosWell, that's a great myth people keep repeating, but it's not true. You don't have to believe me, do the math yourself. Compare the multithreaded performance of an i7-2600K vs. an i7-6700K (same price point) and compare it to the multithreaded performance increase between an R7 1700 and an R5 5600X (same price point). You'll realize that Intel gave us more performance per gen back then than AMD does today :roll:
Let's agree to disagree on that one (and not visit off-topic territories).
#286
kapone32
fevgatosThe number of cores is irrelevant. We are talking about performance. Who actually cares about cores? Do you care how many cores your GPU has, or how fast it is? Personally I've no idea how many cores my 4090 has.
CPU cores and GPU cores cannot be compared, but debating with you is better understood once you realize you have a serious case of price-justification syndrome. When you are looking at CPUs, transistor count, clock speed and memory support are paramount (for AM4). The fact alone that you could get RAM running at 4000 MHz with a 5600X is an advantage; there is no 1700X that can run memory above 3200. Then look at IPC. The difference, especially in multi-threaded BENCHMARKS, will be mitigated by core count in CPUs. And yes, I come from a world where 1 GB of VRAM was OMG. So yes, when my 7900 XT goes to 2898 MHz I am happy, but why the hell do you have a 4090 if it doesn't matter? BTW, TPU is your friend to ascertain the specs of your PC.
#287
AusWolf
fevgatosThe number of cores is irrelevant. We are talking about performance. Who actually cares about cores? Do you care how many cores your GPU has, or how fast it is? Personally I've no idea how many cores my 4090 has.
Well, I do care. ;)

Not only do I like knowing how things work, but I also like not being ripped off with a partially disabled GPU that's advertised as the flagship of the flagships, only for the Ti version to take its place 6-12 months later.
#288
gt362gamer
fevgatosIn general, AMD cards are equal in raster to competing Nvidia cards, have higher power draw, way fewer features, and worse RT performance. So yes, they are worse, and the only way for AMD to compete is by price cuts. Which is great, but it would be better if they actually launched their cards with lower prices to begin with. Take the 7900 XT as a prime example: at launch it was worse in RASTER performance per dollar compared to the 4070 Ti. That is absolutely absurd, considering all the other pros the 4070 Ti has over it.
Well, AMD GPUs in general have more VRAM and less driver overhead at the same price points, unless I'm mistaken (except in certain games, usually DirectX 11 or older ones, where the higher Nvidia driver overhead is not a performance issue and you instead get more draw calls, and therefore better FPS [if you have enough CPU power, of course]).
#289
fevgatos
kapone32but why the hell do you have a 4090 if it doesn't matter?
Because of its performance. I don't care if it delivers that performance with 1 core or a trillion.
AusWolfWell, I do care. ;)

Not only do I like knowing how things work, but I also like not being ripped off with a partially disabled GPU that's advertised as the flagship of the flagships, only for the Ti version to take its place 6-12 months later.
If you buy a product at a price that makes sense for you, then you can't get ripped off, regardless of whether a Ti or a Super variant takes its place. Would you buy a product that is full of specs (cores, RAM, bandwidth, etc.) if it, for whatever reason, underperforms? Probably not. So you don't really care about specs either. Specs tell you how a product should perform. Actual performance metrics tell you how it does in fact perform. I buy stuff based on the latter. The former is useless when it comes to buying decisions.
#290
TheoneandonlyMrK
fevgatosThe number of cores is irrelevant. We are talking about performance. Who actually cares about cores? Do you care how many cores your GPU has, or how fast it is? Personally I've no idea how many cores my 4090 has.
This I agree with.

The number of cores does seem to be irrelevant to FSR 3 :) :D
#291
kapone32
fevgatosBecause of its performance. I don't care if it delivers that performance with 1 core or a trillion.
Why are you on TPU if none of the nuance matters to you? Your 4090 draws almost 100 W more than my 7900 XT, which is all I need for 4K. The fact that it has 20 GB of VRAM matters more to me than some feature like lighting. I gamed on Atari, so I can appreciate how f'ing sweet Armored Core 6 looks, and I appreciate the 100+ FPS I get all day at 4K. I know that it is good because I read, and only now am I fully behind AMD, but that is because the 7900 XT is everything I expected it to be. If I play CP2077 at high in 4K I get over 200 FPS. You can DLSS your way to that, but that is your choice. Don't rag on people because they did not make the same decision as you when purchasing a GPU.
#292
Dr. Dro
kapone32Why are you on TPU if none of the nuance matters to you? Your 4090 draws almost 100 W more than my 7900 XT, which is all I need for 4K. The fact that it has 20 GB of VRAM matters more to me than some feature like lighting. I gamed on Atari, so I can appreciate how f'ing sweet Armored Core 6 looks, and I appreciate the 100+ FPS I get all day at 4K. I know that it is good because I read, and only now am I fully behind AMD, but that is because the 7900 XT is everything I expected it to be. If I play CP2077 at high in 4K I get over 200 FPS. You can DLSS your way to that, but that is your choice. Don't rag on people because they did not make the same decision as you when purchasing a GPU.
Well, it's 100 W for almost twice the performance in some situations, so I guess it's not only winning but also significantly more efficient in perf/W.

I'll agree the 4090 is quite overkill; the XTX or 4080 is all most people are gonna need this generation.

I don't think I'll be buying 90-class hardware anymore myself unless the price comes back down to the $1k price point, so that the custom Asus cards go for $1.5k tops.

This niche of a binned midrange ASIC on an overengineered flagship VRM and cooling solution, which the 4080 Strix OC and the 980 Kingpin I used to have all the way back both belong to, is quite awesome for a gamer who looks for performance and finesse without throwing balance out the window.
#293
AusWolf
fevgatosBecause of its performance. I don't care if it delivers that performance with 1 core or a trillion.


If you buy a product at a price that makes sense for you, then you can't get ripped off, regardless of whether a Ti or a Super variant takes its place. Would you buy a product that is full of specs (cores, RAM, bandwidth, etc.) if it, for whatever reason, underperforms? Probably not. So you don't really care about specs either. Specs tell you how a product should perform. Actual performance metrics tell you how it does in fact perform. I buy stuff based on the latter. The former is useless when it comes to buying decisions.
It's worth knowing whether a higher-tier version with unlocked cores (potentially for a similar MSRP) can be expected later, or not. I don't buy a product that underperforms (unless I'm extremely curious for some reason), but I also don't buy a product that is based on a partially disabled chip, knowing that the full version hasn't been released yet. For example, what's stopping Nvidia (except for yields) from offering a slight price reduction on the 4090 and releasing a 4090 Ti with a fully enabled GPU die for the same MSRP as the 4090 any time between the release dates of the 4090 and the 5090?
#294
95Viper
Stick to the topic.
And, stop your insults and belittling of others.
Discuss civilly and do not argue.
#295
mkppo
fevgatosIn general, AMD cards are equal in raster to competing Nvidia cards, have higher power draw, way fewer features, and worse RT performance. So yes, they are worse, and the only way for AMD to compete is by price cuts. Which is great, but it would be better if they actually launched their cards with lower prices to begin with. Take the 7900 XT as a prime example: at launch it was worse in RASTER performance per dollar compared to the 4070 Ti. That is absolutely absurd, considering all the other pros the 4070 Ti has over it.
4070 Ti never had faster raster than the 7900 XT, and the gap only increased over time as the RDNA 3 drivers matured. Right now the 7900 XT is clearly faster in raster and has more VRAM, while consuming just under 40 W more power, which is much less than the power difference between a 7950X3D and the slower 13900K. Go figure.
#296
fevgatos
mkppo4070 Ti never had faster raster than the 7900 XT,
I never said the 4070 Ti was faster in raster; you are making stuff up. I said the 4070 Ti had better raster per dollar. Which is true.
kapone32Why are you on TPU if none of the nuance matters to you? Your 4090 draws almost 100 W more than my 7900 XT, which is all I need for 4K.
My 4090 is power-limited to 320 W, and even at that wattage it sends your 7900 XT to meet its maker in terms of performance. But why does that matter? How is that relevant to anything?
AusWolfoffering a slight price reduction on the 4090 and releasing a 4090 Ti with a fully enabled GPU die for the same MSRP as the 4090 any time between the release dates of the 4090 and the 5090?
Nothing is stopping them; they did that with the 3090 Ti as well. But who cares? They can release a 4090 Ti, a 4090 Super and a 4090 Ti Super; it doesn't really matter. When I bought the 4090 I thought it was good value for the performance (and the longevity, since the next gen is at least 2 years later), so I don't really care what other products release.
#297
AusWolf
fevgatosMy 4090 is power-limited to 320 W, and even at that wattage it sends your 7900 XT to meet its maker in terms of performance.
For double the price, damn well it should!
fevgatosNothing is stopping them; they did that with the 3090 Ti as well. But who cares? They can release a 4090 Ti, a 4090 Super and a 4090 Ti Super; it doesn't really matter. When I bought the 4090 I thought it was good value for the performance (and the longevity, since the next gen is at least 2 years later), so I don't really care what other products release.
If you thought/think it's good value, that's within your rights. Personally, I don't want to buy anything with the thought that something better is lurking around the corner for a potentially similar price.
#298
mkppo
fevgatosI never said the 4070 Ti was faster in raster; you are making stuff up. I said the 4070 Ti had better raster per dollar. Which is true.
Yeah... no, it isn't. When the 4070 Ti launched, the 7900 XT was already at $850 and falling, and the 7900 XT is anywhere between 8-13% faster. Do the math (see the quick sketch below).

Now you're going to say the MSRP is $900. Well, it doesn't matter for shit, because street price is what counts when comparing against the 4070 Ti at launch.

Also, I'm not sure about the "way fewer features" part. DLSS 2/3? Sure, but what else? ReLive is just as good as ShadowPlay, and AMD's control center is leaps and bounds better than the shitfest from Nvidia that I have to deal with. Some extra features, sure, I'll give them that. But don't make it sound like Nvidia has a ton more features when they really don't.
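A quick back-of-the-envelope check of that raster-per-dollar claim, taking the post's figures at face value ($850 street price and an 8-13% raster lead for the 7900 XT) and assuming the 4070 Ti's $799 launch MSRP as the baseline. This is a minimal sketch, not an independent benchmark:

```python
# Relative raster performance per dollar, using the figures quoted in this thread.
# The $850 street price and the 8-13% spread come from the post above, not verified here.
cards = {
    "RTX 4070 Ti (baseline)": {"price_usd": 799, "relative_raster": 1.00},
    "RX 7900 XT (+8%)": {"price_usd": 850, "relative_raster": 1.08},
    "RX 7900 XT (+13%)": {"price_usd": 850, "relative_raster": 1.13},
}

for name, card in cards.items():
    perf_per_dollar = card["relative_raster"] / card["price_usd"] * 1000
    print(f"{name}: {perf_per_dollar:.3f} relative raster per $1000")

# Prints ~1.252 for the 4070 Ti vs. ~1.271 to ~1.329 for the 7900 XT at $850,
# i.e. the 7900 XT edges ahead on raster per dollar under these assumptions.
```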
#299
kapone32
fevgatosMy 4090 is power-limited to 320 W, and even at that wattage it sends your 7900 XT to meet its maker in terms of performance. But why does that matter? How is that relevant to anything?
Good for you. I guess you were getting random shutdowns, as the MSI Gaming channel has a setup like yours and it was pulling 1300 W at times from the wall. There is also the fact that with the money I saved vs. a 4090, I bought a 7900X3D and an X670E board. Of course, you would refuse to believe that my performance is good because it consumes half the power vs. yours, but we have different parameters for our purchases. I try to get the most performance per dollar.
#300
Vayra86
kapone32The biggest Takeaway for me was Frame Generation in all Games.
*'while stocks last, no rights reserved, subject to change'

I'll see it when it's there and actually beneficial, instead of preferring native, which is much of the current state of FSR as a whole.

And frankly, I don't care. I like my graphics card as it is, not loaded with semi-required patchwork for every other game release. When the industry decides on a uniform, catch-all approach, count me in; until then, this is marketing.