Friday, August 25th 2023

AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

In addition to the Radeon RX 7800 XT and RX 7700 XT graphics cards, AMD announced FidelityFX Super Resolution 3 Fluid Motion (FSR 3 Fluid Motion), the company's performance enhancement that's designed to rival NVIDIA DLSS 3 Frame Generation. The biggest piece of news here is that unlike DLSS 3, which is restricted to GeForce RTX 40-series "Ada," FSR 3 enjoys the same kind of cross-brand hardware support as FSR 2. It works on the latest Radeon RX 7000 series and previous-generation RX 6000 series RDNA 2 graphics cards, as well as NVIDIA GeForce RTX 40-series, RTX 30-series, and RTX 20-series GPUs. It might even be possible to use FSR 3 with Intel Arc A-series, although AMD wouldn't confirm it.

FSR 3 Fluid Motion is a frame-rate doubling technology that generates alternate frames by estimating an intermediate frame between two frames rendered by the GPU (which is essentially how DLSS 3 works). The company did not detail the underlying technology behind FSR 3 in its pre-briefing, but showed an example of FSR 3 implemented in "Forspoken," where the game, which puts out 36 FPS at 4K native resolution, is able to run at 122 FPS with the FSR 3 "Performance" preset (upscaling + Fluid Motion + Anti-Lag). At 1440p native with ultra-high RT, "Forspoken" puts out 64 FPS, which climbs to 106 FPS without upscaling (native resolution), using Fluid Motion frames and Anti-Lag. The Maximum Fidelity preset of FSR 3 is essentially AMD's version of DLAA (it uses the detail-reconstruction and anti-aliasing features of FSR without dropping the render resolution).
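AMD hasn't published FSR 3's internals, but the general class of technique, motion-compensated frame interpolation, is easy to sketch. The following is a minimal illustration only; the buffer layout, the two-sided half-vector reprojection, and every name in it are assumptions, not AMD's algorithm, and a real interpolator additionally handles occlusion, disocclusion, and UI layers on the GPU.

```cpp
#include <cstdint>
#include <vector>

// Illustrative sketch of motion-compensated frame interpolation, the general
// class of technique FSR 3 Fluid Motion and DLSS 3 Frame Generation belong to.
// All names and the simple two-sided reprojection are assumptions, not AMD's code.
struct Frame {
    int width = 0, height = 0;
    std::vector<uint32_t> rgba;    // packed 8-bit RGBA, width * height entries
    std::vector<float>    motion;  // per-pixel motion vectors (x,y pairs, in pixels),
                                   // pointing from this frame back to the previous one
};

static uint32_t sampleClamped(const Frame& f, int x, int y) {
    x = x < 0 ? 0 : (x >= f.width  ? f.width  - 1 : x);
    y = y < 0 ? 0 : (y >= f.height ? f.height - 1 : y);
    return f.rgba[static_cast<size_t>(y) * f.width + x];
}

// Average each 8-bit channel of two packed RGBA values (drops one LSB; fine for a sketch).
static uint32_t avgRGBA(uint32_t a, uint32_t b) {
    return ((a >> 1) & 0x7F7F7F7Fu) + ((b >> 1) & 0x7F7F7F7Fu);
}

// Estimate the frame halfway between 'prev' and 'curr': walk half a motion
// vector back into 'curr' and half forward into 'prev', then blend the two taps.
Frame interpolateMidFrame(const Frame& prev, const Frame& curr) {
    Frame mid;
    mid.width = curr.width; mid.height = curr.height;
    mid.rgba.resize(curr.rgba.size());
    for (int y = 0; y < curr.height; ++y) {
        for (int x = 0; x < curr.width; ++x) {
            const size_t i  = static_cast<size_t>(y) * curr.width + x;
            const int    mx = static_cast<int>(curr.motion[2 * i]     * 0.5f);
            const int    my = static_cast<int>(curr.motion[2 * i + 1] * 0.5f);
            mid.rgba[i] = avgRGBA(sampleClamped(curr, x - mx, y - my),
                                  sampleClamped(prev, x + mx, y + my));
        }
    }
    return mid;
}
```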
AMD announced just two launch titles for FSR 3 Fluid Motion: the already-released "Forspoken," and "Immortals of Aveum," which released earlier this week. The company announced that it is working with game developers to bring FSR 3 support to "Avatar: Frontiers of Pandora," "Cyberpunk 2077," "Warhammer 40,000: Space Marine II," "Frostpunk 2," "The Alters," "Squad," "Starship Troopers: Extermination," "Black Myth: Wukong," "Crimson Desert," and "Like a Dragon: Infinite Wealth." The company is working with nearly all leading game publishers and game engine developers to add FSR 3 support, including Ascendant Studios, Square Enix, Ubisoft, CD Projekt Red, Saber Interactive, Focus Entertainment, 11 bit studios, Unreal Engine, Sega, and Bandai Namco Reflector.
AMD is also working to make FSR 3 Fluid Motion frames part of the AMD Hypr-RX feature that the company is launching soon. This is big, as it means pretty much any DirectX 11 or DirectX 12 game will get Fluid Motion frames when the feature launches in Q1 2024.

Both "Forspoken" and "Immortals of Aveum" will get FSR 3 patches this Fall.

362 Comments on AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

#301
gffermari
I still don't get how AMD managed to present FSR 3 without mentioning the number one sponsored title, Starfield.
I mean... who the hell cares about Forspoken and the other one whose name I don't even remember?

You've already delayed FSR 3 for nearly a year. Postpone 1-2 months or whatever, cancel everything and just make it work for the number one title.

AMD's GPU department has just committed suicide on this matter.
#302
fevgatos
mkppoYeah..no it isn't. When the 4070Ti launched, the 7900XT was already $850 and falling. But 7900XT is anywhere between 8-13% faster. Do the math.

Now you're going to say the MSRP is $900. Well it doesn't matter for shit, because it legit doesn't matter when comparing the launch of 4070Ti.

Also, i'm not sure about the way less features part. DLSS2/3? Sure, but what else? Relive is just as good as Shadowplay, AMD's control center is leaps and bounds better than the shitfest from nvidia that I have to deal with. Some extra features, sure i'll give them that. But don't make it sound like nvidia has a ton more features when they really don't.
At the 4070ti launch the 7900xt was between 120 and 140 euros more expensive. I'm not talking about msrp, I'm talking about actual pricing in Europe. So yes, the 4070ti was even better at raster per dollar, on top of better rt performance, much lower power draw etc.

Nvidia does have a ton more features. It supports both dlss and fsr, both fg and now fsr 3, Cuda, reflex. And the list goes on
kapone32Good for you. I guess you were getting random shutdowns as the MSI Gaming channel has a setup like yours and it was pulling 1300 Watts at times from the wall. There is also the fact that with the money I saved vs a 4090 I bought a 7900X3D and X670E board. Of course you would refuse to believe that my performance is good because it consumes half the power vs yours but we have different parameters for our purchase. I try to get the most performance per dollar.
Why would I be getting shutdowns? I have no idea what you are talking about. My system draws 550w max from the wall while playing tlou, and of course it absolutely demolishes yours in performance. We can test it if you want to
#303
kapone32
fevgatosWhy would I be getting shutdowns? I have no idea what you are talking about. My system draws 550w max from the wall while playing tlou, and of course it absolutely demolishes yours in performance. We can test it if you want to
You are entitled to your own opinion but since you don't live with me your comment is just plain conjecture.
#304
mkppo
fevgatosAt the 4070ti launch the 7900xt was between 120 and 140 euros more expensive. I'm not talking about msrp, I'm talking about actual pricing in Europe. So yes, the 4070ti was even better at raster per dollar, on top of better rt performance, much lower power draw etc.

Nvidia does have a ton more features. It supports both dlss and fsr, both fg and now fsr 3, Cuda, reflex. And the list goes on


Why would I be getting shutdowns? I have no idea what you are talking about. My system draws 550w max from the wall while playing tlou, and of course it absolutely demolishes yours in performance. We can test it if you want to
Well in the US it certainly wasn't the case and I'm not sure how 7900xt was more than 100 euros more expensive as that's larger than the difference in MSRP. Either way, reflex is basically pointless without a monitor to support it and even then it's pretty pointless and AMD anti-lag works pretty well. List goes on? In your 'list' I only see DLSS and CUDA which are well known but FSR works almost as well in general. So uh..CUDA? Is that the list that goes on?
#305
dyonoctis
mkppoWell in the US it certainly wasn't the case and I'm not sure how 7900xt was more than 100 euros more expensive as that's larger than the difference in MSRP. Either way, reflex is basically pointless without a monitor to support it and even then it's pretty pointless and AMD anti-lag works pretty well. List goes on? In your 'list' I only see DLSS and CUDA which are well known but FSR works almost as well in general. So uh..CUDA? Is that the list that goes on?
The 7900XT launched at 1049€ in Europe vs 910€ for the 4070ti. The EU market is special to say the least.
#306
mkppo
dyonoctisThe 7900XT launched at 1049€ in Europe vs 910€ for the 4070ti. The EU market is special to say the least.
Holy shit that 7900XT was a ripoff at launch, even more so in Europe.
#307
fevgatos
kapone32You are entitled to your own opinion but since you don't live with me your comment is just plain conjecture.
Exactly, and so was yours :D
#308
AusWolf
fevgatosAt the 4070ti launch the 7900xt was between 120 and 140 euros more expensive. I'm not talking about msrp, I'm talking about actual pricing in Europe. So yes, the 4070ti was even better at raster per dollar, on top of better rt performance, much lower power draw etc.

Nvidia does have a ton more features. It supports both dlss and fsr, both fg and now fsr 3, Cuda, reflex. And the list goes on
120 euros for +12 GB VRAM is a no-brainer at this price point.

As for features, why would you count on DLSS/FSR version X if your game runs well at native resolution? Because it does with a card like that. Cuda... Sure, for what? Reflex... what's that?

Edit: Like I said, there is no feature (gimmick?) in existence that would make me buy an 8 GB GPU for $799.
#309
kapone32
fevgatosExactly, and so was yours :D
I don't know if you can do this with Nvidia but here is my Gaming Panel. These are all at 4K 144hz. As you can see I have no issues with any Games running at high FPS. Here is my CPU power and here is GPU power
[screenshots: gaming panel FPS readouts at 4K 144 Hz, CPU power, and GPU power]
#310
fevgatos
AusWolf120 euros for +12 GB VRAM is a no-brainer at this price point.

As for features, why would you count on DLSS/FSR version X if your game runs well at native resolution? Because it does with a card like that. Cuda... Sure, for what? Reflex... what's that?
Why would you use dlss?

Well, for starters, higher fps.
Or you want to use rt, in which case you need dlss.
Or you want to play a game like hogwarts that is incredibly cpu bound even on a 13900k / 7800X3D, so the only way to stay above 60 in all areas of the game is with FG.
Or you want to improve the native image quality which in some cases is horrible due to bad TAA implementation.
Or you want to lower power draw.

Plenty of reasons, I'm using it on my 4090, so sure as hell I'd be using it on a 4070ti
kapone32I don't know if you can do this with Nvidia but here is my Gaming Panel. These are all at 4K 144hz. As you can see I have no issues with any Games running at high FPS. Here is my CPU power and here is GPU power
And are you suggesting I have issues running games on high fps on my rig? I don't understand what point you are trying to make.

My gpu is power limited to 70%, so basically 320w max. Highest peak power draw was on tlou at 550w, average is around 440-480 measured from the wall, that's on 4k. Cpu usually draws 50 to 70w depending on the game unless I go 1080p dlss performance in which case it peaks at 110w in tlou. Other games it stays below 100.

Btw your cpu power draw is disgusting. 50w at almost idle, yikes. Mine drops down to less than 3 watts. I'm browsing the web watching 2 videos streaming at 5-8 watts.
#311
AusWolf
fevgatosWhy would you use dlss?

Well, for starters, higher fps.
Or you want to use rt, in which case you need dlss.
Or you want to play a game like hogwarts that is incredibly cpu bound even on a 13900k / 7800X3D, so the only way to stay above 60 in all areas of the game is with FG.
Or you want to improve the native image quality which in some cases is horrible due to bad TAA implementation.
Or you want to lower power draw.

Plenty of reasons, I'm using it on my 4090, so sure as hell I'd be using it on a 4070ti
Higher FPS that you don't need because you have high enough FPS on your expensive card anyway.
If you need FG to stay above 60 FPS, then your latency must be horrible.
Improved image quality with an upscaled low-res image? Let's just say, I'll believe it when I see it.
Lower power draw? I've got a permanent 60 FPS cap for that, so I never have to lower my image quality in the first place.

Any more reasons? ;) I see the point of DLSS/FSR on a low/mid-tier card, but FG is just a gimmick. It's unnecessary at high FPS, and it presents latency issues at low FPS.
#312
Palindrome
Interested to see how this will work out, and whether the implementation will be good enough to allow people to hang on to their cards for longer. As with upscaling, I fear developers will lean on frame gen rather than making their games run well, since you can just turn on upscaling + framegen to get 60 fps... but hopefully that won't be the case. I hope it'll allow cards to remain relevant to their respective markets for longer :D

On another note, I find it very amusing to read people malding over brand loyalty... makes for good Monday morning entertainment :toast:
#313
fevgatos
AusWolfHigher FPS that you don't need because you have high enough FPS on your expensive card anyway.
If you need FG to stay above 60 FPS, then your latency must be horrible.
Improved image quality with an upscaled low-res image? Let's just say, I'll believe it when I see it.
Lower power draw? I've got a permanent 60 FPS cap for that, so I never have to lower my image quality in the first place.

Any more reasons? ;) I see the point of DLSS/FSR on a low/mid-tier card, but FG is just a gimmick. It's unnecessary at high FPS, and it presents latency issues at low FPS.
How do you know you don't need higher fps? What does that even mean? There are games coming out right now that struggle even on a 4090. Without rt that is.

Your latency is perfectly fine due to reflex. Fg + reflex = lower latency than your amd card has running native.

Hardware Unboxed tested it and yes, in most cases DLSS improves image quality due to poor TAA implementation. Have you never seen a game and thought that it looks like a blurry mess? That's due to TAA. SotTR, RDR2, and to an extent Cyberpunk are examples of that. DLSS fixes that issue.
#314
Vayra86
AusWolfHigher FPS that you don't need because you have high enough FPS on your expensive card anyway.
If you need FG to stay above 60 FPS, then your latency must be horrible.
Improved image quality with an upscaled low-res image? Let's just say, I'll believe it when I see it.
Lower power draw? I've got a permanent 60 FPS cap for that, so I never have to lower my image quality in the first place.

Any more reasons? ;) I see the point of DLSS/FSR on a low/mid-tier card, but FG is just a gimmick. It's unnecessary at high FPS, and it presents latency issues at low FPS.
On top of that, you're wasting precious time tweaking nonsensical features, to then proceed pixel peeping for whatever reason. I've been there, also with 'Nvidia features', and honestly I never gave it more attention than the first half hour. I much prefer just settling on whatever works all over the place, and then leaving it be. Simplicity is king; I'm here to focus on the content, not the hardware running it... The first course of action on the 7900XT was figuring out how to disable everything that could get in my way. It's done, I'm done, install game and go.

I can't shake the feeling Nvidia's featureset advantage is a problem looking for solutions rather than vice versa. Continuous improvement on your 'hardware featureset' also means it's never done, especially if you've chosen to make it a thing for yourself. Which is another aspect I personally really don't like: I like my games and my setup finished, polished and complete. The endless fiddling around is fun for a period of time, sure, but the novelty wears off and there are no boss fights.

FG/interpolation is now heralded as a thing. Lol. We've had this for decades and it was never beneficial; we preferred progressive image quality every time. But now it's tied to an overpriced graphics card, and it's an advantage? Hilarious, but then again, how else can you fill up that 'list that goes on', eh :D

And TAA versus an upscale... sure. I can see there is a perceivable difference... but again, it's of a nature that really is at the pixel-peeping level. There's still an equal number of pixels; this is nothing else than every new AA method ever introduced in the history of gaming. Remember the MSAA/SMAA etc. discussions? This is it all over again. Nothing is new here. I'm just content running without AA most of the time, honestly; just make sure the pixel density/view distance fits and done.
#315
AusWolf
fevgatosHow do you know you don't need higher fps? What does that even mean? There are games coming out right now that struggle even on a 4090. Without rt that is.
That means, there isn't a single game that doesn't reach 60 FPS on my 6750 XT at 1080p right now. When that happens, I'll just lower a setting or two, or consider using FSR while I'm planning my next upgrade.
fevgatosYour latency is perfectly fine due to reflex. Fg + reflex = lower latency than your amd card has running native.
Ah, so Reflex is the Nvidia equivalent of Radeon Anti-Lag! Got ya. ;)
fevgatosHardware Unboxed tested it and yes, in most cases DLSS improves image quality due to poor TAA implementation. Have you never seen a game and thought that it looks like a blurry mess? That's due to TAA. SotTR, RDR2, and to an extent Cyberpunk are examples of that. DLSS fixes that issue.
They can test whatever they want, if I don't see it with my own eyes, I won't believe it, it's that simple. I put 100+ hours into Cyberpunk using a 2070 with DLSS Quality, and no, my image quality wasn't better than native.
Vayra86I can't shake the feeling Nvidia's featureset advantage is a problem looking for solutions rather than vice versa.
That's it! :) That's why I call them gimmicks instead of features. They're something no one ever needed, but now can't live without all of a sudden. Like a new drug. And by upping some version with every new generation of graphics cards released, Nvidia is only making people more and more addicted, and ensuring that they pay for the next shiny toy, regardless of its price.
#316
Vayra86
AusWolfThat means, there isn't a single game that doesn't reach 60 FPS on my 6750 XT at 1080p right now. When that happens, I'll just lower a setting or two, or consider using FSR while I'm planning my next upgrade.


Ah, so Reflex is the Nvidia equivalent of Radeon Anti-Lag! Got ya. ;)


They can test whatever they want, if I don't see it with my own eyes, I won't believe it, it's that simple. I put 100+ hours into Cyberpunk using a 2070 with DLSS Quality, and no, my image quality wasn't better than native.
@phanbuey had a pretty clear example of how the upscale improves IQ in Baldur's Gate 3. It's definitely more crisp. It also has a more artificial look because it looks so crisp. It's similar to a sharpening effect at normal view distance, except with a much higher fidelity. So there is the impression of more detail, certainly, but it's a bit (to me at least) like looking at oversaturated images. It looks surreal, even if it pops that much more. I tried internal upscale as well; I can see why people prefer one over the other, but I don't miss a thing running at just TAA either. It's different, much like a Reshade pass, but not ubiquitously better.
AusWolfThat's it! :) That's why I call them gimmicks instead of features. They're something no one ever needed, but now can't live without all of a sudden. Like a new drug. And by upping some version with every new generation of graphics cards released, Nvidia is only making people more and more addicted, and ensuring that they pay for the next shiny toy, regardless of its price.
"Reflex"(tm)... is hilarious indeed, its just frame pre-gen / prediction set at 0 or 1 frames but with a new splash of marketing. Nvidia's had that since what, Pascal? AMD has it too... gotta love the fancy stickers eh. And since Nvidia sells those features at premium, every buyer will speak of them because otherwise they've wasted their precious extra money on stuff they don't use.

And they have to talk about it a lot because otherwise nobody can see how they're having an advantage :)
#317
AusWolf
Vayra86@phanbuey had a pretty clear example of how the upscale improves IQ in Baldur's Gate 3. It's definitely more crisp. It also has a more artificial look because it looks so crisp. It's similar to a sharpening effect at normal view distance, except with a much higher fidelity. So there is the impression of more detail, certainly, but it's a bit (to me at least) like looking at oversaturated images. It looks surreal, even if it pops that much more.
I don't know. Cyberpunk at DLSS Quality seemed simply just a tad blurry to me.
#318
fevgatos
AusWolfThat means, there isn't a single game that doesn't reach 60 FPS on my 6750 XT at 1080p right now. When that happens, I'll just lower a setting or two, or consider using FSR while I'm planning my next upgrade.


Ah, so Reflex is the Nvidia equivalent of Radeon Anti-Lag! Got ya. ;)
No, Reflex isn't Anti-Lag. Reflex is equivalent to Hypr-RX, which is still being developed. FG + Reflex has similar latency to what you are getting with an AMD card playing natively. Surely you have no latency issues, right? So why do you assume you will have issues with FG?

Well okay you have no performance issues cause you are playing at 1080p.
AusWolfI don't know. Cyberpunk at DLSS Quality seemed simply just a tad blurry to me.
Because you are at 1080p, I assume. I can post you some screenshots at 4K and you'll see that DLSS looks much better. Furthermore, you have to take into account that it also increases your framerate. If you want to make a proper comparison of image quality, you should target the same framerate.

So for example, try native 720p vs 1080p DLSS Q. The framerate will be pretty similar, but the DLSS image will be, like, tons better. So what that really means is that you can upgrade your monitor to 1440p, use DLSS, and end up with the same framerate but much better image quality than you are getting right now with your native 1080p monitor. If you don't find that actually impressive then I don't know, man
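For concreteness on that comparison: upscalers render internally at a fraction of the output resolution. The sketch below uses the commonly cited per-axis scale ratios for the DLSS 2 presets (FSR 2's are very similar); treat the factors as approximate reference values rather than an official specification.

```cpp
#include <cmath>
#include <cstdio>
#include <utility>

// Commonly cited per-axis render-scale ratios for DLSS 2 presets; approximate
// reference numbers, not an official specification.
struct Preset { const char* name; double scale; };

int main() {
    const Preset presets[] = {
        {"Quality",           2.0 / 3.0},  // ~0.667
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},
    };
    const std::pair<int, int> outputs[] = {{1920, 1080}, {2560, 1440}, {3840, 2160}};

    for (const auto& [w, h] : outputs) {
        std::printf("Output %dx%d:\n", w, h);
        for (const auto& p : presets)
            std::printf("  %-17s renders internally at ~%ldx%ld\n", p.name,
                        std::lround(w * p.scale), std::lround(h * p.scale));
    }
    // 1080p "Quality" renders at ~1280x720, which is why the post compares
    // 1080p DLSS Q against native 720p at a similar framerate.
}
```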
Vayra86"Reflex"(tm)... is hilarious indeed, its just frame pre-gen / prediction set at 0 or 1 frames but with a new splash of marketing. Nvidia's had that since what, Pascal? AMD has it too... gotta love the fancy stickers eh. And since Nvidia sells those features at premium, every buyer will speak of them because otherwise they've wasted their precious extra money on stuff they don't use.

And they have to talk about it a lot because otherwise nobody can see how they're having an advantage :)
No, Reflex is not the same as Anti-Lag or prerendered frames. The concept is of course similar, but it's not the same thing. It combines prerendered frames with a non-static fps cap. You get much lower latency with Reflex than with Anti-Lag; Igor's Lab tested it. Reflex at 60 fps gets you lower latency than an AMD card with Anti-Lag at 200 fps. That is insane.
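The mechanism being described, a render queue held near zero combined with a non-static pacing cap, can be sketched generically. Everything below (the names, the fixed timing estimates, the one-line sleep heuristic) is an illustrative assumption about the concept, not NVIDIA's actual Reflex implementation or API.

```cpp
#include <chrono>
#include <thread>

// Generic just-in-time pacing sketch of the idea described above: keep the
// render queue near empty and sample input as late as possible. Everything
// here (names, constants, fixed estimates) is an illustrative assumption.
using Clock = std::chrono::steady_clock;

static void pollInput()   {}  // stand-ins for a real engine's input/sim/submit steps
static void simulate()    {}
static void submitToGpu() {}

void lowLatencyLoop() {
    // In a real implementation both estimates are updated every frame from
    // CPU timers and GPU timestamp queries; fixed values keep the sketch short.
    const auto estGpuFrameTime = std::chrono::microseconds(7000); // ~143 FPS GPU-bound
    const auto estCpuFrameTime = std::chrono::microseconds(2000);

    for (;;) {
        // When will the GPU be ready for the next frame's work?
        const auto gpuFreeAt = Clock::now() + estGpuFrameTime;

        // Sleep so CPU work finishes just as the GPU goes idle. Starting any
        // earlier would park finished frames in a queue and add latency.
        std::this_thread::sleep_until(gpuFreeAt - estCpuFrameTime);

        pollInput();    // freshest possible input -> lower click-to-photon latency
        simulate();
        submitToGpu();  // arrives just-in-time; queue depth stays ~0-1 frames
    }
}
```

The cap is "non-static" in exactly this sense: the sleep target tracks the measured GPU frame time each frame instead of sitting at a fixed fps limit.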
#319
Vayra86
fevgatosNo, Reflex is not the same as Anti-Lag or prerendered frames. The concept is of course similar, but it's not the same thing. It combines prerendered frames with a non-static fps cap. You get much lower latency with Reflex than with Anti-Lag; Igor's Lab tested it. Reflex at 60 fps gets you lower latency than an AMD card with Anti-Lag at 200 fps. That is insane.
I learned a thing, thanks :) Still, that doesn't sound awfully different from what Fast Sync had at one point, right? I certainly do hope it's better than that, because that was stuttery and not quite as dynamic as you'd want.

Still, the gist is: you keep stacking more and more (support required!) added technologies on top of each other, because new Nvidia 'solutions' introduce 'new problems' to fix. Is there an improvement... sure. But only with several layers of TLC. Reality checks are nice... I don't need Reflex, because I don't need FG, and 'native latency' is already low enough.
#320
fevgatos
Vayra86I learned a thing, thanks :) Still, that doesn't sound awfully different from what Fast Sync had at one point, right? I certainly do hope it's better than that, because that was stuttery and not quite as dynamic as you'd want.

Still, the gist is: you keep stacking more and more (support required!) added technologies on top of each other, because new Nvidia 'solutions' introduce 'new problems' to fix. Is there an improvement... sure. But only with several layers of TLC. Reality checks are nice... I don't need Reflex, because I don't need FG, and 'native latency' is already low enough.
Well if you are not into competitive games reflex is useless. But the same can be said about fg, do you really care about the increased latency that much? Especially when you are using reflex?

I kid you not, on a game like hogwarts, I couldn't enjoy that game without FG. It was unplayable for me, and not because of the gpu performance but because of the cpu performance. There were areas in the game that even a fully oced tuned 12900k couldn't keep 60. FG just transformed the game to a much smoother experience.
#321
AusWolf
fevgatosNo, Reflex isn't Anti-Lag. Reflex is equivalent to Hypr-RX, which is still being developed. FG + Reflex has similar latency to what you are getting with an AMD card playing natively. Surely you have no latency issues, right? So why do you assume you will have issues with FG?
As far as I understand, Hypr-RX is not a thing, but rather a combination of things. So what is Reflex exactly? I'll have to look for some official documentation because I'm curious.
fevgatosBecause you are at 1080p, I assume. I can post you some screenshots at 4K and you'll see that DLSS looks much better. Furthermore, you have to take into account that it also increases your framerate. If you want to make a proper comparison of image quality, you should target the same framerate.

So for example, try native 720p vs 1080p DLSS Q. The framerate will be pretty similar, but the DLSS image will be, like, tons better. So what that really means is that you can upgrade your monitor to 1440p, use DLSS, and end up with the same framerate but much better image quality than you are getting right now with your native 1080p monitor. If you don't find that actually impressive then I don't know, man
That's not how it works. If I'm playing at 1080p, I will never ever in my whole life compare to a 720p image. My comparison is 1080p native vs 1080p DLSS/FSR. In this comparison, native always wins in terms of quality.
fevgatosNo, Reflex is not the same as Anti-Lag or prerendered frames. The concept is of course similar, but it's not the same thing. It combines prerendered frames with a non-static fps cap. You get much lower latency with Reflex than with Anti-Lag; Igor's Lab tested it. Reflex at 60 fps gets you lower latency than an AMD card with Anti-Lag at 200 fps. That is insane.
Ah, so it's Anti-Lag combined with Radeon Chill/Boost?
#322
Vayra86
fevgatosWell if you are not into competitive games reflex is useless. But the same can be said about fg, do you really care about the increased latency that much? Especially when you are using reflex?

I kid you not, on a game like hogwarts, I couldn't enjoy that game without FG. It was unplayable for me, and not because of the gpu performance but because of the cpu performance. There were areas in the game that even a fully oced tuned 12900k couldn't keep 60. FG just transformed the game to a much smoother experience.
It's a bit of a circle we're going into. Why would I care about the doubled frames if it doesn't improve latency? That's exactly why it's a problem looking for solutions and not vice versa. If I already had 30 FPS native with ditto latency, Reflex won't save me if I use FG to get 60, and if I already had 60 FPS, I can do without both.

Latency in competitive games should already be low enough; if you game at 120+ FPS there really isn't much to gain, if anything. There is no tangible advantage there, only a perceived advantage; it's fully in placebo territory, much like a 360 Hz panel.

As for Hogwarts... myeah. Whatever. Any game that fully eats a CPU needs a reality check. It's a turd anyway :) If that edge case makes FG a feature that can't be missed... I think we're looking too hard into it. It's not a perk of FG, it's just shitty code.
#323
AusWolf
fevgatosWell if you are not into competitive games reflex is useless. But the same can be said about fg, do you really care about the increased latency that much? Especially when you are using reflex?

I kid you not, on a game like hogwarts, I couldn't enjoy that game without FG. It was unplayable for me, and not because of the gpu performance but because of the cpu performance. There were areas in the game that even a fully oced tuned 12900k couldn't keep 60. FG just transformed the game to a much smoother experience.
I don't play competitively, either, but I like a fast input response. I don't like it when the game reacts in a 'spongy' way to mouse movement, for example. I haven't tried Hogwarts, though (still waiting for a discount).
#324
fevgatos
AusWolfAs far as I understand, Hypr-RX is not a thing, but rather a combination of things. So what is Reflex exactly? I'll have to look for some official documentation because I'm curious.


That's not how it works. If I'm playing at 1080p, I will never ever in my whole life compare to a 720p image. My comparison is 1080p native vs 1080p DLSS/FSR. In this comparison, native always wins in terms of quality.
Ok, let me put it another way. You are about to buy a monitor, and you can't decide between a 1440p and a 4k (or a 1080p vs a 1440p). Buying the higher-resolution monitor and using DLSS (or FSR for that matter) will get you a similar framerate but better image quality than buying the lower-resolution monitor and playing natively.

That's mainly the reason I went for a 4k monitor.
Vayra86It's a bit of a circle we're going into. Why would I care about the doubled frames if it doesn't improve latency? That's exactly why it's a problem looking for solutions and not vice versa. If I already had 30 FPS native with ditto latency, Reflex won't save me if I use FG to get 60, and if I already had 60 FPS, I can do without both.

Latency in competitive games should already be low enough; if you game at 120+ FPS there really isn't much to gain, if anything. There is no tangible advantage there, only a perceived advantage; it's fully in placebo territory, much like a 360 Hz panel.

As for Hogwarts... myeah. Whatever. Any game that fully eats a CPU needs a reality check. It's a turd anyway :) If that edge case makes FG a feature that can't be missed... I think we're looking too hard into it. It's not a perk of FG, it's just shitty code.
Regarding Hogwarts, of course it's mainly down to poor optimization. That's kinda irrelevant though; there will always be poorly optimized games, and FG allows you to play them.

Regarding FG, you'd be impressed by how optically fluid games are with it. Say you are watching someone else play the game; for you as a spectator, the game will be much, much smoother with FG. If you are the one playing the game then yes, it doesn't improve responsiveness and usually it offers worse input latency, but it's not as bad as people claim. You can get a game from 70 fps all the way up to 120 to max out your monitor with no noticeable latency increase.
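Rough numbers behind the 70-to-120 fps example being debated, using a deliberately simplified model in which frame generation holds one extra real frame (actual per-game overhead and pacing vary):

```cpp
#include <cstdio>

// Simplified latency model for frame generation: an interpolated frame can only
// be shown once the *next* real frame exists, so the pipeline holds roughly one
// extra real frame. The numbers and the model itself are rough assumptions.
int main() {
    const double baseFps      = 65.0;              // real rendered frames/s (FG has some overhead)
    const double baseFrameMs  = 1000.0 / baseFps;  // ~15.4 ms per real frame
    const double displayedFps = 2.0 * baseFps;     // ~130 frames/s actually presented

    const double latencyNoFgMs = baseFrameMs;        // ~1 frame of render latency
    const double latencyFgMs   = 2.0 * baseFrameMs;  // + ~1 held frame for interpolation

    std::printf("displayed: ~%.0f fps | latency without FG: ~%.1f ms | with FG: ~%.1f ms\n",
                displayedFps, latencyNoFgMs, latencyFgMs);
    // Reflex-style pacing typically claws back several ms of queueing, which is
    // the basis of the claim that the added ~15 ms is hard to notice here.
}
```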
#325
AusWolf
fevgatosOk, let me put it another way. You are about to buy a monitor, and you can't decide between a 1440p and a 4k (or a 1080p vs a 1440p). Buying the higher-resolution monitor and using DLSS (or FSR for that matter) will get you a similar framerate but better image quality than buying the lower-resolution monitor and playing natively.

That's mainly the reason I went for a 4k monitor.
I'll never buy a new monitor not being sure if my PC will play games fluidly on it without DLSS/FSR/FG. A couple weeks ago, I was briefly considering opting in for a 1440p ultrawide (3440x1440), but considering that it's 1.34x the pixels as normal 1440p, or 2.38x the pixels as 1080p, I'm not so sure. I can't justify replacing a monitor that I'm perfectly happy with for one that may or may not play my games fluidly at native res.
But each to their own.
fevgatosRegarding FG, you'd be impressed by how optically fluid games are with it. Say you are watching someone else play the game; for you as a spectator, the game will be much, much smoother with FG. If you are the one playing the game then yes, it doesn't improve responsiveness and usually it offers worse input latency, but it's not as bad as people claim. You can get a game from 70 fps all the way up to 120 to max out your monitor with no noticeable latency increase.
Are you saying that FG is mainly meant for spectators? Nobody watches me play my games (thank God), so what's the point, then? 70 or 120 FPS doesn't make any difference to me whatsoever.