Monday, January 27th 2025

FSR 4 Support Arriving Day One for All Current FSR 3.1 Game Titles According to Leak

AMD Radeon engineers are spending newly allocated extra time on optimizing their upcoming FidelityFX Super Resolution 4 (FSR 4) technology—industry watchers believe that a finalized version will launch alongside the initial lineup of RDNA 4 graphics cards, now scheduled for release in March. Recently, David McAfee—Vice President and General Manager of Ryzen and Radeon products—revealed that his colleagues were working hard on maximizing performance and enabling "more FSR 4 titles." Insiders have started theorizing about how the current landscape of FSR 3.1-compatible games will translate to next-gen "AI-driven" upscaling techniques—several outlets believe that a freshly patched PC version of The Last of Us Part I is paving the way for eventual "easy" updates.

Kepler_L2—an almost endless fountain of Team Red-related insider knowledge—picked up on a VideoCardz report from the past weekend and proceeded to add some extra tidbits via social media. They started off by claiming that Team Red's "RDNA 4 driver replaces FSR 3.1 DLL with FSR 4." When queried about the implications of this development, Kepler said that all FSR 3.1 game titles should be ready to support FSR 4 on day one. The upgrade process—possibly achieved through a driver-level DLL swap—is reportedly quite easy to implement. According to the insider: "yeah, it should just work."
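For context on what a "DLL swap" of this sort involves, here is a minimal conceptual sketch; the actual driver-side mechanism has not been documented by AMD, and the DLL names, paths, and replacement logic below are assumptions for illustration only. FSR 3.1's move to a self-contained, DLL-based runtime is what makes this kind of drop-in upgrade plausible in the first place.

```python
import shutil
from pathlib import Path

# Assumed FSR 3.1 runtime DLL names (hypothetical for this sketch); a real driver
# would more likely redirect the DLL at load time rather than copy files on disk.
FSR31_DLLS = ["amd_fidelityfx_dx12.dll", "amd_fidelityfx_vk.dll"]


def swap_fsr_dlls(game_dir: str, newer_dll_dir: str) -> list[Path]:
    """Back up any FSR 3.1 runtime found under a game folder and drop in a newer build."""
    replaced: list[Path] = []
    for name in FSR31_DLLS:
        for old_dll in Path(game_dir).rglob(name):
            newer = Path(newer_dll_dir) / name
            if not newer.exists():
                continue
            shutil.copy2(old_dll, old_dll.with_suffix(".dll.bak"))  # keep a backup
            shutil.copy2(newer, old_dll)                            # overwrite with the newer runtime
            replaced.append(old_dll)
    return replaced


if __name__ == "__main__":
    # Hypothetical paths, purely for demonstration.
    for dll in swap_fsr_dlls(r"C:\Games\SomeFSR31Game", r"C:\FSR4Runtime"):
        print(f"Swapped: {dll}")
```

Whether AMD implements this as a file replacement, a loader-level redirect, or an in-driver override is exactly the detail the leak leaves open; the sketch only illustrates why FSR 3.1's standalone DLL makes a blanket day-one upgrade feasible.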
Sources: Kepler_L2 Tweet, VideoCardz, Wccftech

76 Comments on FSR 4 Support Arriving Day One for All Current FSR 3.1 Game Titles According to Leak

#26
Contra
oxrufiioxoand possibly announce something to combat MFG.
Why would they invent something new when they already have AFMF and FSR frame generation, and those have been working for over a year? No sane person would combine them, but that is what NV proposes to do with their shiny new MFG garbage.
Posted on Reply
#27
ToxicTaZ
Not sure why people are going crazy for "Upscaling" & "Fake Frames" it's Ludacris!

I'll take quality over Blurry Fake Frames any day.

Destroying gaming quality...

Cheers
Posted on Reply
#28
Neo_Morpheus
AusWolfHardware-dependent closed solutions are hurting gaming - they're only good for increasing mindshare and revenue of a single company.
Platform- and hardware-agnostic tech used to be a desired trait in the tech world, but for reasons that I really don't understand, it is now treated as a con and ignored.

And agreed, it does hurt gaming. As a matter of fact, it is already doing a great deal of damage, and the overall openness that was a major selling point of the PC platform is being rapidly eroded in front of our eyes.
Posted on Reply
#29
Vayra86
ToxicTaZ"Fake Frames" it's Ludacris!


You calling me a fake frame, b****?
Neo_MorpheusPlatform- and hardware-agnostic tech used to be a desired trait in the tech world, but for reasons that I really don't understand, it is now treated as a con and ignored.

And agreed, it does hurt gaming. As a matter of fact, it is already doing a great deal of damage, and the overall openness that was a major selling point of the PC platform is being rapidly eroded in front of our eyes.
Nah, Nvidia doesn't have that power, and it certainly won't hinge on whether upscalers work or not, or are any good or not.

It's a nice-to-have, and that is all it really is right now. People overvalue this shit waaay too much. It's a temporary thing.

Devs are never going to stop or go depending on whether X or Y upscaler might work or not. Same thing as RT: it's there by grace of the current motions, but it's totally not required to make a good game. As long as single devs in attics and small teams can release content that takes the world by storm (and they do, more than ever before in the history of gaming; it's been one surprise hit after another for the last 10 years), the PC is as secure as it'll ever be. Everything else is just dancing around that reality, trying to exercise control they can never truly gain.
Posted on Reply
#30
InVasMani
NhonhoAMD and Intel should team up on this upscaling development part.
They should team up in general in the GPU space to compete against Nvidia, but it's hard to envision it actually happening.
Posted on Reply
#31
oxrufiioxo
Vayra86

You calling me a fake frame, b****?
I try to refrain from calling them fake frames and rather refer to them as interpolated frames, although I still slip up from time to time... Technically, everything you see in a game is faked visually; it just comes down to how you like your image faked lol.
Posted on Reply
#32
AusWolf
oxrufiioxoI try to refrain from calling them fake frames and rather refer to them as interpolated frames, although I still slip up from time to time... Technically, everything you see in a game is faked visually; it just comes down to how you like your image faked lol.
By my definition:
"Normal" frames = ones generated using geometry data, physics and user input.
Fake frames = ones generated using data from another frame.
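To make the distinction concrete, here's a toy sketch of the second kind: a naive linear blend between two already-rendered frames. This is not how FSR or DLSS frame generation actually work (those use motion vectors and optical flow), but it shows that the synthesized frame comes from other frames rather than from geometry, physics, or input.

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive linear blend between two rendered frames (H x W x 3 arrays in [0, 1]).

    No game-simulation data is involved; the output is derived entirely from the
    two surrounding frames, which is what the "fake frame" definition above means.
    """
    return (1.0 - t) * prev_frame + t * next_frame

# Toy usage with random arrays standing in for rendered images.
prev_frame = np.random.rand(720, 1280, 3)
next_frame = np.random.rand(720, 1280, 3)
in_between = interpolate_frame(prev_frame, next_frame)
```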
Posted on Reply
#33
dyonoctis
Vayra86Wasn't open and democratized the name of the game with this swanky new cons00mer AI? ;)

It'll happen once there is no fruit left, and I think that'll be sooner rather than later. Look at how fast AMD is chasing now with featureset. If there is nothing that sets these upscalers apart, there's no point wasting money on three approaches anymore.
We might also end up with a GPGPU situation:
- Nvidia immediately went for a closed API.
- Apple collaborated with AMD, Intel, and Nvidia to make OpenCL, believing that open source was the way to democratize GPGPU.
- Nvidia made a massive sweep by committing hard to CUDA while the other players stayed fairly passive and wished for the best.
- We now have CUDA/Metal/HIP/oneAPI, with HIP and oneAPI somehow being open source but only backed by the companies that created them. And Apple had to make Nvidia persona non grata on macOS so that Metal could have a fighting chance.
Devs now have to work with four different APIs, with no unification in sight, and DirectX's DirectCompute seems to be a fucking joke. (OptiX is merely CUDA optimized for offline 3D RT/denoising.)
Posted on Reply
#34
InVasMani
AusWolfBy my definition:
"Normal" frames = ones generated using geometry data, physics and user input.
Fake frames = ones generated using data from another frame.
It's more like adaptive frames than fake. A fake one would be generated from no base data, essentially from scratch. AI text-to-image is more of a "fake frame," and even that is based on data and training, but the training algorithms certainly aren't perfect and can't fully guess what we intend or mean with our prompts. A hundred people can ask AI to draw a bird, but they aren't all necessarily thinking of the same type of bird. Does that make it wrong or fake if the results differ or vary? No, the AI is just pooling at random from data on many types of birds and doing its best to come up with an acceptable bird of any type, since it wasn't specified whether it's an eagle, a hawk, a blue jay, or a purple flamingo with a beaver tail that breathes butterflies on fire.
Posted on Reply
#35
oxrufiioxo
AusWolfBy my definition:
"Normal" frames = ones generated using geometry data, physics and user input.
Fake frames = ones generated using data from another frame.
I get what you mean, but technically the frame is fake regardless of whether it's generated locally or not. Developers have to fake all sorts of stuff just so we have a cohesive image to begin with. GPU makers are just helping them fake it more, input latency and image quality be damned...
Posted on Reply
#36
AusWolf
dyonoctisWe might also end up with a GPGPU situation:
- Nvidia immediately went for a closed API.
- Apple collaborated with AMD, Intel, and Nvidia to make OpenCL, believing that open source was the way to democratize GPGPU.
- Nvidia made a massive sweep by committing hard to CUDA while the other players stayed fairly passive and wished for the best.
- We now have CUDA/Metal/HIP/oneAPI, with HIP and oneAPI somehow being open source but only backed by the companies that created them. And Apple had to make Nvidia persona non grata on macOS so that Metal could have a fighting chance.
Devs now have to work with four different APIs, with no unification in sight, and DirectX's DirectCompute seems to be a fucking joke. (OptiX is merely CUDA optimized for offline 3D RT/denoising.)
Are you suggesting that it's AMD/Intel/Apple's fault that Nvidia pushed their own standard so much and devs jumped on it?
InVasManiIt's more like adaptive frames than fake. A fake one would be generated from no base data, essentially from scratch. AI text-to-image is more of a "fake frame," and even that is based on data and training, but the training algorithms certainly aren't perfect and can't fully guess what we intend or mean with our prompts. A hundred people can ask AI to draw a bird, but they aren't all necessarily thinking of the same type of bird. Does that make it wrong or fake if the results differ or vary? No, the AI is just pooling at random from data on many types of birds and doing its best to come up with an acceptable bird of any type, since it wasn't specified whether it's an eagle, a hawk, a blue jay, or a purple flamingo with a beaver tail that breathes butterflies on fire.
oxrufiioxoI get what you mean, but technically the frame is fake regardless of whether it's generated locally or not. Developers have to fake all sorts of stuff just so we have a cohesive image to begin with. GPU makers are just helping them fake it more, input latency and image quality be damned...
That was my definition. I never said you have to agree with it, but I'm sticking to it. A game is based on user input-output. Anything the engine does outside of it is fake. It increases latency for no reason.
Posted on Reply
#37
InVasMani
Fake it till you make it. The way it's meant to be played. It's really adapting previous frame data rather than faking it. FXAA is as much a fake frame as frame interpolation or upscaling that tries to spatially scale additional pixels that didn't exist out of existing ones through clever quantized algorithm approaches.
Posted on Reply
#38
oxrufiioxo
AusWolfThat was my definition. I never said you have to agree with it, but I'm sticking to it. A game is based on user input-output. Anything the engine does outside of it is fake. It increases latency for no reason.
By your definition, using TAA would make a frame fake... Almost every game uses it...

TAA, which stands for "Temporal Anti-Aliasing," works by combining information from previous frames with the current frame to smooth out jagged edges on moving objects, essentially "blending" pixels from multiple frames to create a more refined image.

It also technically has a performance cost and increases latency...
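For reference, the "blending" described above can be sketched as a simple exponential accumulation. This is a deliberately simplified illustration, not any real engine's TAA, which also reprojects the history buffer with per-pixel motion vectors and clamps it to limit ghosting.

```python
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Blend the freshly rendered frame into an accumulated history buffer.

    Real TAA reprojects and clamps the history before blending; this sketch only
    shows why TAA ends up reusing pixel data from previous frames.
    """
    return alpha * current + (1.0 - alpha) * history

# Toy usage: the history buffer smooths out noise across successive frames.
history = np.zeros((720, 1280, 3))
for _ in range(10):
    current = np.random.rand(720, 1280, 3)  # stand-in for a freshly rendered frame
    history = taa_accumulate(history, current)
```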
Posted on Reply
#39
AusWolf
oxrufiioxoBy your definition, using TAA would make a frame fake... Almost every game uses it...

TAA, which stands for "Temporal Anti-Aliasing," works by combining information from previous frames with the current frame to smooth out jagged edges on moving objects, essentially "blending" pixels from multiple frames to create a more refined image.

It also technically has a performance cost and increases latency...
I knew this would come up. ;)

That's probably why everybody hates it. It works with data from different frames that have nothing to do with the current one, and therefore isn't entirely accurate.
Posted on Reply
#40
oxrufiioxo
AusWolfI knew this would come up. ;)

That's probably why everybody hates it. It works with data from different frames that have nothing to do with the current one, and therefore isn't entirely accurate.
I like TAA less than I like frame gen, and I don't even currently like frame gen in most scenarios... On a monitor, anyway; on a TV it's fine with a controller from at least 10 feet away...


I hate the direction both technologies are going, personally... Especially now, with GPU makers including them in their benchmarks and gamers going "man, I love me some of dat"...
Posted on Reply
#41
Visible Noise
AusWolfI think everybody should team up and create an industry standard like DirectX or OpenGL and call it OpenUpscale or something.

Hardware-dependent closed solutions are hurting gaming - they're only good for increasing mindshare and revenue of a single company.
There’s no money in open.

Proprietary = profitable. Evidence: Nvidia vs AMD
Posted on Reply
#42
AusWolf
Visible NoiseThere’s no money in open.

Proprietary = profitable. Evidence: Nvidia vs AMD
And as a consumer, why would I need to support that?
Posted on Reply
#43
Visible Noise
AusWolfAnd as a consumer, why would I need to support that?
Because without profit companies don’t exist.
Posted on Reply
#44
dyonoctis
AusWolfAre you suggesting that it's AMD/Intel/Apple's fault that Nvidia pushed their own standard so much and devs jumped on it?
What I've heard from devs in the 3D industry is that CUDA wasn't just artificially pushed by Nvidia; OpenCL was also a pain to work with by comparison, especially with Apple. Getting help to fix issues wasn't as easy. Devs had bugs with OpenCL that weren't an issue with CUDA. Blender removed OpenCL support entirely once AMD provided HIP because the performance wasn't good, and Redshift3D only started to support AMD once HIP was available as well.
OpenCL is still popular in some areas, but that API is also scaring developers away in other fields. OpenCL isn't "dead"; they just decided to avoid using it since weird stuff kept happening with it and no sign of improvement was in sight.
OpenCL rendering support was removed. The combination of the limited Cycles kernel implementation, driver bugs, and stalled OpenCL standard has made maintenance too difficult. We are working with hardware vendors to bring back GPU rendering support on AMD and Intel GPUs, using other APIs.
I have some serious reservations about the lack of an Nvidia GPU option, partially because of poor OpenCL support from Apple compared to CUDA support from Nvidia. So I did some digging and contacted the developers of Neat Image, a denoise plugin for images and video that supports both CUDA and OpenCL. They confirmed my impression that Apple has lackluster support compared to Nvidia's CUDA support for OS X or even AMD's OpenCL and Nvidia on Windows and Linux.
A pro with serious workstation needs reviews Apple’s 2013 Mac Pro - Ars Technica
Cycles - Blender Developer Documentation

It would be one thing if CUDA only made it because Nvidia gave people money, but CUDA also made it because it was easier to use. I refuse to believe that Nvidia poached all the talent in the US, such that anything they aren't involved in is doomed to fail.
Posted on Reply
#45
oxrufiioxo
AusWolfAnd as a consumer, why would I need to support that?
That's something every consumer has to ask themselves, though, and neither conclusion is really wrong.

People just need to buy whatever gives them the most for their money. The thing is, that isn't going to be the same for every person. For me, that was the 4090 last generation; for someone else, that might have been a 4060 or a 7800 XT. None of those choices are wrong, though.

People who love frame gen are gonna love MFG; it's basically just frame gen on crack, for better or worse.

AMD does not have a high bar to clear this generation to make a compelling product; the GeForce products under $1,000 are 4060 Ti-like improvements over their Super counterparts, so if there was a gen to turn it around, it would be this one. Hopefully both FSR 4 and the 9000 series are awesome, not because I am likely to use or buy either, but because good products benefit everyone.
Posted on Reply
#46
AusWolf
Visible NoiseBecause without profit companies don’t exist.
That's not a reason. You can buy products that work with open standards. For example, FSR and XeSS work on every modern GPU, but AMD and Intel are still making money on the GPUs themselves. We've got Linux for free, maintained by the community. The music industry isn't going tits up just because I'm not buying or renting my music on Amazon or iTunes.
Posted on Reply
#47
wolf
Better Than Native
Visible NoiseThere’s no money in open.
I'd argue there is some money in it; there are clearly people who will put open/hardware-agnostic above a solution that's proprietary but qualitatively superior. So much so that they would even decry and refuse to use the feature itself, yet see fit to draw a line in the sand over it.

And then there are the people to whom that facet of the argument is effectively below the waterline: they pay more to get the better features, and they get access to the hardware-agnostic ones too.

I don't think either buyer is necessarily right or wrong, and I don't see a moral high ground to be claimed either. Those who innovate and have the most desirable product get to charge for it, then the open standards follow and eventually take over.
Posted on Reply
#48
Visible Noise
AusWolfThat's not a reason. You can buy products that work with open standards. For example, FSR and XeSS work on every modern GPU, but AMD and Intel are still making money on the GPUs themselves. We've got Linux for free, maintained by the community. The music industry isn't going tits up just because I'm not buying or renting my music on Amazon or iTunes.
Have you seen AMD’s and Intel’s financials? They are losing their asses on GPUs.

Linux for free? Take out all the for profit developers that work on it and you have nothing. Do you know who makes 75% of the kernel commits? Intel. They aren’t doing it out of the goodness of their heart. They do it because it makes them a profit.

Do you really think AMD is being altruistic when they open their software? They open their software with the hope that someone develops it for them, because they suck at software. Evidence: ROCm.
Posted on Reply
#49
AusWolf
Visible NoiseHave you seen AMD’s and Intel’s financials? They are losing their asses on GPUs.
They are right now because everyone's into Nvidia's proprietary technology.
Visible NoiseLinux for free? Take out all the for profit developers that work on it and you have nothing. Do you know who makes 75% of the kernel commits? Intel. They aren’t doing it out of the goodness of their heart. They do it because it makes them a profit.
Which for-profit developers? We need Wine and Proton for compatibility with Windows software because there are so many for-profit organisations developing native Linux software, obviously. :rolleyes:
Visible NoiseDo you really think AMD is being altruistic when they open their software?
No, it's a business plan just the same. You can develop an open technology and hope that customers will catch up. I'm just saying that I find this business plan more likeable than locking people into a closed standard.
Posted on Reply
#50
Visible Noise
AusWolfThey are right now because everyone's into Nvidia's proprietary technology
Thus proprietary = profit. Not that difficult. I assume you have an employer? Are they the same as everyone else in the field or do they have some special sauce that keeps them in business? I’m sure they don’t just give away all their corporate secrets to anyone that asks.
AusWolfWhich for profit developers?
Dude, really??? You don’t have any IT exposure, do you?

www.linuxfoundation.org/about/members
AusWolfYou can develop an open technology and hope that customers will catch up
Who is this “you” that has no expenses and doesn’t need to eat?

Maybe AMD should stop hoping and start executing.

Edit: I can't believe you even brought up Proton. Where would it be without Valve? Proton makes Gabe a fuckton of money. I wonder why Steam isn't open? Oh that's right, there's no money in it for Valve.
Posted on Reply