Tuesday, February 6th 2024
Mod Unlocks FSR 3 Fluid Motion Frames on Older NVIDIA GeForce RTX 20/30 Series Cards
NVIDIA's latest RTX 40 series graphics cards feature impressive new technologies like DLSS 3 that can significantly enhance performance and image quality in games. However, owners of older GeForce RTX 20 and 30 series cards cannot officially benefit from these cutting-edge advances: DLSS 3's Frame Generation feature, in particular, requires dedicated hardware found only in NVIDIA's Ada Lovelace architecture. Where NVIDIA has declined to enable frame generation on older generation hardware, the ingenious modding community has stepped in with a creative workaround. A new third-party modification can unofficially combine upscaling (FSR, DLAA, DLSS, or XeSS) with AMD Fluid Motion Frames on older NVIDIA cards equipped with Tensor Cores. Replacing two key DLL files and making a small edit to the Windows registry enables the "DLSS 3" option to be activated in games running on older hardware.
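For readers curious how such a mod hooks in, the two steps described above (a DLL swap plus a registry flag) can be sketched in a few lines of Python. This is a minimal illustration only: every path, file name, and registry key below is a made-up placeholder, not the mod's actual layout; follow the mod's own installation instructions for the real names.

```python
import shutil
import winreg  # standard-library registry access; Windows-only

# Hypothetical names used purely for illustration -- the real mod ships
# its own DLLs and documents its own registry key.
GAME_DIR = r"C:\Games\ExampleGame"
MOD_DLLS = [r"C:\Mods\replacement_a.dll", r"C:\Mods\replacement_b.dll"]
REG_PATH = r"SOFTWARE\ExampleVendor\ExampleKey"  # placeholder key path
REG_NAME = "EnableOverride"                      # placeholder value name

# Step 1: drop the two replacement DLLs into the game's folder,
# overwriting the stock files (back them up first in practice).
for dll in MOD_DLLS:
    shutil.copy2(dll, GAME_DIR)

# Step 2: set a single DWORD under HKEY_LOCAL_MACHINE so the swapped DLLs
# are accepted. Writing to HKLM requires an elevated (administrator) process.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, REG_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, REG_NAME, 0, winreg.REG_DWORD, 1)
```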
In testing conducted by Digital Foundry, this modification delivered up to a 75% FPS boost - on par with the performance uplift official DLSS 3 provides on RTX 40 series cards. Games like Cyberpunk 2077, Spider-Man: Miles Morales, and A Plague Tale: Requiem were used to benchmark performance. However, there can be minor visual flaws, including incorrect UI interpolation or random frame time fluctuations. Ironically, while the FSR 3 technology itself originates from AMD, the mod currently works only on NVIDIA cards. So, while not officially supported, the resourceful modding community has remarkably managed to bring cutting-edge frame generation to more NVIDIA owners - at least until AMD RDNA 3 cards can utilize it as well. This shows the incredible potential of community-driven software modification and innovation.
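To see why frame generation tops out below a 2x uplift, note that inserting one generated frame per rendered frame would ideally double the frame rate, but the generation work itself costs GPU time. A back-of-the-envelope sketch with assumed numbers (illustrative only, not Digital Foundry's measurements):

```python
# Illustrative arithmetic only -- not measured data.
base_fps = 60.0       # assumed native frame rate without frame generation
interp_cost_ms = 3.0  # assumed per-frame cost of generating an extra frame

# One generated frame per rendered frame would ideally double FPS, but the
# generation work lengthens each rendered frame's time budget.
base_frame_ms = 1000.0 / base_fps               # 16.67 ms per real frame
effective_ms = base_frame_ms + interp_cost_ms   # 19.67 ms per real frame
fps_with_fg = 2 * (1000.0 / effective_ms)       # two displayed frames per real frame

print(f"{fps_with_fg:.0f} FPS, a {fps_with_fg / base_fps - 1:.0%} uplift")
# -> ~102 FPS, a ~69% uplift: in the same ballpark as the 75% figure above.
```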
Source: via HardwareLuxx
27 Comments on Mod Unlocks FSR 3 Fluid Motion Frames on Older NVIDIA GeForce RTX 20/30 Series Cards
FTFY.
How long must we wait for AMD to actually compete? Or price their product like a product that's inferior?
Erm, both these technologies are designed to cheat the player/customer. Both offer apparent performance increases at the cost of visual quality, which can be extreme in some cases. It's a great way to hide the performance and specification deficits of a product while allowing manufacturers to charge a fortune for it.
Buy the new $1000+ 4080 Super, but if you want to use RT, please lower your resolution and click this button to generate the fake extra FPS you need to run at your monitor's refresh rate... You may notice lower picture quality and artifacts, but hey, it works, right?
Remember when desperate AMD owners used modded BIOSes to put a Zen 3 in an X300 motherboard?
Do you remember when AMD "discovered" that chips other than Zen 3 could benefit from Resizable BAR, but only after nVidia announced support for ancient processors (Broadwell, I think)?
Buy an AMD video card; no one disputes your choice. However, I have doubts that you have enough viable neurons when you criticize others without understanding that AMD sells at a lower price only because it can compete with NVidia solely in rasterization. Otherwise, you are at the mercy of God! Only now is functionality that was supposed to have worked for years actually being tested. It's good, however, that you have enough VRAM for the 25 frames generated by an outdated graphics processor to run fluently.
It's 2024. Is there a GPU that can run full RT above 60 FPS at 4K without upscaling or fake frames? I keep seeing resources and energy wasted in this grotesque way. That's just stupid.
Nvidia isn't selling GPUs. It's selling subscriptions. If you stay too long with an old... subscription, you get no new important features.
As far as raytracing performance is concerned, I agree; AMD is in danger of falling behind Intel if RDNA4 doesn't improve upon RDNA3 and Intel manages to improve their abysmal performance per watt and per mm^2.
I find it interesting that Nvidia users complain when AMD doesn't have a feature, then still complain when AMD adds it and allows everyone to use it.
Moving along, I do love this new battle cry of performance per watt that seems to coincide with a very specific company. What truly matters is the end result. Regardless of how inefficient the 4090 is, nothing AMD makes holds a candle to it (except in poorly optimized games). If Intel gets their drivers in a better place, they will beat AMD, because when you throw those watts into a calculator over the course of a year, they amount to nothing meaningful; the metric will only serve as one of the last bastions favoring a company that is no better or worse than its competition ethically.
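For what it's worth, that watts-into-a-calculator exercise is easy to reproduce. A quick sketch with assumed figures for the power delta, daily play time, and electricity price (all three are illustrative assumptions):

```python
# Illustrative numbers -- substitute your own wattage delta, hours, and tariff.
power_delta_w = 100   # assumed extra draw of the less efficient card, in watts
hours_per_day = 2     # assumed daily gaming time
price_per_kwh = 0.15  # assumed electricity price, USD per kWh

kwh_per_year = power_delta_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# -> 73 kWh/year -> $10.95/year: real money, but small next to a
#    several-hundred-dollar price gap between cards.
```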
As for the per-mm^2 subject, I have not seen anyone bring that one up before. TSMC is the one making the nodes, and AMD does not have much to do with that. Comparing a larger node with a smaller one in efficiency would be unfair in its premise: it ignores performance and transistor density and would simply favor whoever paid to have their architecture built on the newest/smallest node. Even then, there are examples where that does not pan out to a good product. Was the Radeon VII, which AMD released to be the first 7 nm gaming card, any good? Was it quickly usurped by AMD's own 5700 XT shortly after, with all support abandoned for it? No to the former and yes to the latter. In fact, since Nvidia ended up using Samsung's 8 nm process, there was no reason to release a subpar rebranded workstation card, but they did it regardless. The fact that they have not yet released a Ryzen 3 CPU for the 7000 series, and that their "budget" GPUs are laptop graphics chips that would not work for the main audience they should be catering to because they decided not to populate all the PCIe lanes, are not positives. AMD is not some paragon of virtue, and by no means are the others. I do not understand this need to act as though they are.
I believe we have gone off on a tangent. As far as this mod is concerned, it is good that owners of cards based on Ampere and Turing can experiment with frame generation and make up their own minds about its usefulness. It is also another example of why some people prefer AMD to Nvidia.
As for the Intel subject, that is just cherry-picking of information. The A770 is not, and has not been, Intel's best-value card; that would be the A750, which offers most of the A770's performance at a reduced cost. As seen in the performance-per-dollar chart in the A580 review (an actually important metric), the A750 holds a 24% lead at 1080p, 16% at 1440p, and 23% at 4K when compared to the 7600. The drivers are still being improved considerably; per a TPU article dated 01/24/24, certain DX11 games got a +268% performance increase, so I would say there is still more headroom for gains.
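Performance per dollar is also trivial to recompute with current prices. A sketch with placeholder FPS and price figures (assumptions for illustration, not TPU's actual review data):

```python
# Placeholder figures -- substitute real review FPS and current street prices.
cards = {
    "Card A": {"avg_fps": 80.0, "price": 200.0},
    "Card B": {"avg_fps": 90.0, "price": 270.0},
}

# Performance per dollar = average FPS divided by price.
perf_per_dollar = {name: c["avg_fps"] / c["price"] for name, c in cards.items()}
a, b = perf_per_dollar["Card A"], perf_per_dollar["Card B"]
print(f"Card A leads Card B by {a / b - 1:.0%} in FPS per dollar")
# -> 80/200 = 0.400 vs 90/270 = 0.333, a 20% lead for the cheaper card.
```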
I can understand when cost is a concern; however, you seem to be focusing on efficiency and die size, which really do not affect most people in any meaningful way.