Tuesday, February 6th 2024

Mod Unlocks FSR 3 Fluid Motion Frames on Older NVIDIA GeForce RTX 20/30 Series Cards

NVIDIA's latest RTX 40 series graphics cards feature impressive new technologies like DLSS 3 that can significantly enhance performance and image quality in games. However, owners of older GeForce RTX 20 and 30 series cards cannot officially benefit from these cutting-edge advances. DLSS 3's Frame Generation feature, in particular, requires dedicated hardware found only in NVIDIA's new Ada Lovelace architecture. But where NVIDIA has refused to enable frame generation on older generation hardware, the ingenious modding community has stepped in with a creative workaround. A new third-party modification can unofficially activate both upscaling (FSR, DLAA, DLSS or XeSS) and AMD Fluid Motion Frames on older NVIDIA cards equipped with Tensor Cores. Replacing two key DLL files and making a small edit to the Windows registry enables the "DLSS 3" option to be activated in games running on older hardware.
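For readers curious what the install boils down to, here is a minimal sketch of the two steps in Python. Every name in it - the mod folder, the game path, the replacement DLL file names, and the registry key and value - is an illustrative placeholder standing in for whatever the specific mod's instructions list, not the mod's real identifiers.

```python
import shutil
import winreg
from pathlib import Path

# Placeholder locations and names -- substitute whatever the mod's readme actually specifies.
MOD_DIR = Path(r"C:\mods\fsr3-frame-gen")    # where the mod archive was extracted (hypothetical)
GAME_DIR = Path(r"C:\Games\ExampleGame")     # folder containing the game's executable (hypothetical)
REPLACEMENT_DLLS = ["mod_frame_gen.dll", "mod_upscaler.dll"]  # stand-ins for the two key DLLs

def copy_mod_dlls() -> None:
    """Back up any same-named DLLs already in the game folder, then copy the mod's versions in."""
    for name in REPLACEMENT_DLLS:
        target = GAME_DIR / name
        if target.exists():
            shutil.copy2(target, str(target) + ".bak")  # keep a backup so the mod is easy to undo
        shutil.copy2(MOD_DIR / name, target)

def set_registry_override() -> None:
    """Write the single DWORD the mod asks for; the key path and value name here are assumptions.
    Must be run from an elevated prompt because the key sits under HKEY_LOCAL_MACHINE."""
    key_path = r"SOFTWARE\ExampleVendor\ExampleOverride"  # hypothetical key path
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_WRITE) as key:
        winreg.SetValueEx(key, "EnableOverride", 0, winreg.REG_DWORD, 1)  # hypothetical value name

if __name__ == "__main__":
    copy_mod_dlls()
    set_registry_override()
```

Restoring the backed-up DLLs and deleting the registry value reverses the change.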

In testing conducted by Digital Foundry, this modification delivered up to a 75% FPS boost - on par with the performance uplift official DLSS 3 provides on RTX 40 series cards. Games like Cyberpunk 2077, Spider-Man: Miles Morales, and A Plague Tale: Requiem were used to benchmark performance. However, there can be minor visual flaws, including incorrect UI interpolation or random frame time fluctuations. Ironically, while the FSR 3 tech itself originates from AMD, the mod currently only works on NVIDIA cards. So, while not officially supported, the resourceful modding community has remarkably managed to bring cutting-edge frame generation to more NVIDIA owners - at least until AMD's RDNA 3 cards can utilize it as well. This shows the incredible potential of community-driven software modification and innovation.
Source: via HardwareLuxx

27 Comments on Mod Unlocks FSR 3 Fluid Motion Frames on Older NVIDIA GeForce RTX 20/30 Series Cards

#1
Arkz
Mod Unlocks FSR3 Performance on Older NVIDIA GeForce RTX 20/30 Series Cards

FTFY.
Posted on Reply
#2
JAB Creations
Yet another example of how Nvidia treats its own mindless fanboys. There are actually people still buying 12GB cards for $800 right now, while I'm running out of VRAM with 16GB, and those same people will still buy whatever overpriced entry-level garbage Nvidia puts out. They want AMD to compete so that Nvidia prices come down; maybe if they stopped buying Nvidia's junk, Nvidia would have no choice but to drop its prices. :rolleyes:
Posted on Reply
#3
HisDivineOrder
JAB Creations said: Yet another example of how Nvidia treats its own mindless fanboys. There are actually people still buying 12GB cards for $800 right now, while I'm running out of VRAM with 16GB, and those same people will still buy whatever overpriced entry-level garbage Nvidia puts out. They want AMD to compete so that Nvidia prices come down; maybe if they stopped buying Nvidia's junk, Nvidia would have no choice but to drop its prices. :rolleyes:
Mindless fanboys would buy an inferior product just because the company who made it is the only competition to a more performant product, not because it's actually better. How many generations of sad ray tracing performance must we endure before Lisa has her team actually match Nvidia or Intel? At first, it was fine. Nvidia beat them to the punch. But then the 30 series came. And the 40 series.

How long must we wait for AMD to actually compete? Or price their product like a product that's inferior?
Posted on Reply
#4
JAB Creations
HisDivineOrder said: Mindless fanboys would buy an inferior product just because the company who made it is the only competition to a more performant product, not because it's actually better. How many generations of sad ray tracing performance must we endure before Lisa has her team actually match Nvidia or Intel? At first, it was fine. Nvidia beat them to the punch. But then the 30 series came. And the 40 series.

How long must we wait for AMD to actually compete?
Not sure - how long does AMD need to recover from literal decades of Intel giving billions of dollars to OEMs not to buy AMD products? Don't mock the people struggling to provide competition while you blindly buy the products of criminal organizations like Intel and Nvidia.
Posted on Reply
#5
stimpy88
"new technologies like DLSS 3 that can significantly enhance performance and image quality in games."

Erm, both these technologies are designed to cheat the player/customer. Both offer apparent performance increases at the cost of visual quality, which can be extreme in some cases. A great way to hide the performance and specification deficits of your products while allowing manufacturers to charge a fortune for them.

Buy the new $1000+ 4080 Super, but if you want to use RT please lower your resolution and click this button to generate the fake extra FPS you need to run at your monitor's refresh rate... You may notice lower picture quality and artifacts, but hey, it works, right!
Posted on Reply
#7
Fluffmeister
Arkz said: Mod Unlocks FSR3 Performance on Older NVIDIA GeForce RTX 20/30 Series Cards

FTFY.
Yeah I got excited at first, then saw the headline change from DLSS to FSR :(
Posted on Reply
#8
ZeroFM
stimpy88"new technologies like DLSS 3 that can significantly enhance performance and image quality in games."

Erm, both these technologies are designed to cheat the player/customer. Both offer apparent performance increases at the cost of visual quality, which can be extreme in some cases. A great way to hide performance and specification deficits of your products, while allowing manufacturers to charge a fortune for their products.

Buy the new $1000+ 4080 Super, but if you want to use RT please lower your resolution and click this button to generate the fake extra FPS you need to run at your monitors refresh rate... You may notice lower picture quality and artifacts, buy hey, it works right!
Absolutely correct <3
Posted on Reply
#9
arbiter
ZeroFM said: Absolutely correct :love:
Except it's worth pointing out that AMD's tech does the exact same thing you're crying about Nvidia doing - but team red can do no wrong, right?
Posted on Reply
#10
ZeroFM
arbiter said: Except it's worth pointing out that AMD's tech does the exact same thing you're crying about Nvidia doing - but team red can do no wrong, right?
Except AMD is doing it the right way! Open source plus backwards support for older GPUs, versus Ngreedia, whose $2,000 3090 Ti isn't even two years old and can't have the latest DLSS.
Posted on Reply
#11
Gica
JAB Creations said: Yet another example of how Nvidia treats its own mindless fanboys. There are actually people still buying 12GB cards for $800 right now, while I'm running out of VRAM with 16GB, and those same people will still buy whatever overpriced entry-level garbage Nvidia puts out. They want AMD to compete so that Nvidia prices come down; maybe if they stopped buying Nvidia's junk, Nvidia would have no choice but to drop its prices. :rolleyes:
It's sad when an AMD zealot can't see the forest because of their blinders.

Remember when desperate AMD owners used modded BIOSes to put a Zen 3 in an X300 motherboard?
Do you remember when AMD "discovered" that processors other than Zen 3 could benefit from Resizable BAR only after NVIDIA announced support for ancient processors (Broadwell, I think)?

Buy an AMD video card; no one disputes your choice. However, I doubt you have enough viable neurons to criticize others without understanding that AMD sells at a lower price only because it can only compete with NVidia in rasterization. Otherwise, you are at the mercy of God! Features that were supposed to have worked for years are only now being tested, but it's good, at least, that you have enough VRAM for the 25 frames generated by an outdated graphics processor to run fluently.
Posted on Reply
#12
theouto
FMF is the driver-level motion interpolation, not the name for FSR 3 frame gen.
Posted on Reply
#13
Denver
HisDivineOrder said: Mindless fanboys would buy an inferior product just because the company who made it is the only competition to a more performant product, not because it's actually better. How many generations of sad ray tracing performance must we endure before Lisa has her team actually match Nvidia or Intel? At first, it was fine. Nvidia beat them to the punch. But then the 30 series came. And the 40 series.

How long must we wait for AMD to actually compete? Or price their product like a product that's inferior?
It was Nvidia who promised RT, photorealistic games, etc. and created this wave of crap. What they have delivered so far is not even close to what could be called realistic RT; most of the time they turn everything into mirrors in an absurdly stupid way, and the other half of the time I see expensive hardware running at 20-30 fps.

It's 2024 - is there a GPU that can run full RT above 60 fps at 4K without using upscaling or fake frames? I keep seeing resources and energy wasted in this grotesque way. That's just stupid.
Posted on Reply
#14
jmcosta
stimpy88"new technologies like DLSS 3 that can significantly enhance performance and image quality in games."

Erm, both these technologies are designed to cheat the player/customer. Both offer apparent performance increases at the cost of visual quality, which can be extreme in some cases. A great way to hide performance and specification deficits of your products, while allowing manufacturers to charge a fortune for their products.

Buy the new $1000+ 4080 Super, but if you want to use RT please lower your resolution and click this button to generate the fake extra FPS you need to run at your monitors refresh rate... You may notice lower picture quality and artifacts, buy hey, it works right!
I can't say much about DLSS 3, but the previous one actually improves image quality over native in some games due to the overuse of TAA by developers, and RT also makes a difference, but only in a few titles like Metro, Cyberpunk, Control, etc. These are also playable on an RTX 3070.
Posted on Reply
#15
john_
AMD is doing more for Nvidia fans than Nvidia is willing to do.

Nvidia isn't selling GPUs. It's selling subscriptions. If you stay too long with an old... subscription, you get no new important features.
Posted on Reply
#16
JAB Creations
Gica said: It's sad when an AMD zealot can't see the forest because of their blinders.

Remember when desperate AMD owners used modded BIOSes to put a Zen 3 in an X300 motherboard?
Do you remember when AMD "discovered" that processors other than Zen 3 could benefit from Resizable BAR only after NVIDIA announced support for ancient processors (Broadwell, I think)?

Buy an AMD video card; no one disputes your choice. However, I doubt you have enough viable neurons to criticize others without understanding that AMD sells at a lower price only because it can only compete with NVidia in rasterization. Otherwise, you are at the mercy of God! Features that were supposed to have worked for years are only now being tested, but it's good, at least, that you have enough VRAM for the 25 frames generated by an outdated graphics processor to run fluently.
Remember when AMD's stock price was $2 and Intel was giving away billions of dollars to OEMs not to use AMD products? I sure remember when Zen 1 launched and motherboard manufacturers only released boards out of obligation, and those boards didn't have enough BIOS storage for future CPU compatibility, because no one believed AMD would still be in business, let alone support the same CPU socket for almost a decade. Maybe consider the glass house you're standing in before blindly throwing bricks.
Posted on Reply
#17
Zendou
JAB Creations said: Not sure - how long does AMD need to recover from literal decades of Intel giving billions of dollars to OEMs not to buy AMD products? Don't mock the people struggling to provide competition while you blindly buy the products of criminal organizations like Intel and Nvidia.
Bit of irony in propping up AMD, a company that originally illegally reverse engineered Intel chips. Luckily for AMD, Intel needed more foundries at the time, so they granted them an x86 license to manufacture their chips instead of suing the company out of existence. A company that only exists now because of thievery is not suddenly ethical. All three of these companies have done suspect things; not sure why so many try to purity-test corporations. They all have skeletons, so buy the product that best suits your needs.
Posted on Reply
#18
AnotherReader
Zendou said: Bit of irony in propping up AMD, a company that originally illegally reverse engineered Intel chips. Luckily for AMD, Intel needed more foundries at the time, so they granted them an x86 license to manufacture their chips instead of suing the company out of existence. A company that only exists now because of thievery is not suddenly ethical. All three of these companies have done suspect things; not sure why so many try to purity-test corporations. They all have skeletons, so buy the product that best suits your needs.
You might want to check your history there. Intel was required by IBM to find it another supplier, and in 1982, AMD and Intel signed a contract allowing AMD to use their chip designs. Intel reneged on this agreement by the time the 386 came out and AMD had to reverse engineer the 386 and the 486 using clean room design. At every step along the way, when disputes arose, courts ruled in AMD's favour.
Posted on Reply
#19
Zendou
AnotherReader said: You might want to check your history there. Intel was required by IBM to find it another supplier, and in 1982, AMD and Intel signed a contract allowing AMD to use their chip designs. Intel reneged on this agreement by the time the 386 came out and AMD had to reverse engineer the 386 and the 486 using clean room design. At every step along the way, when disputes arose, courts ruled in AMD's favour.
Depends on your source and perspective. I do not believe the contract necessarily covered the 386/486, yet AMD reverse engineered them regardless, without Intel's consent. Legal outcomes are not always the be-all and end-all, as many laws and legal decisions have been overturned or changed over the years. The burden of proof in civil cases is only 51% - more likely than not - so I hardly see that as a smoking gun that what AMD did was justified. Harkening back to the original post, I do find it odd that ray tracing support remains poor, in that even Intel was able to implement a better version of RT than AMD, and I am not even sure if they have plans to ever improve it on their cards. If anything, AMD is in danger of Intel eating into its GPU market share, as Intel seems to be going a more value-oriented route - something AMD seems to have abandoned for the most part, as they just make their cards slightly cheaper than Nvidia, who doesn't seem to care, as they currently make the best card on the market.
Posted on Reply
#20
AnotherReader
Zendou said: Depends on your source and perspective. I do not believe the contract necessarily covered the 386/486, yet AMD reverse engineered them regardless, without Intel's consent. Legal outcomes are not always the be-all and end-all, as many laws and legal decisions have been overturned or changed over the years. The burden of proof in civil cases is only 51% - more likely than not - so I hardly see that as a smoking gun that what AMD did was justified. Harkening back to the original post, I do find it odd that ray tracing support remains poor, in that even Intel was able to implement a better version of RT than AMD, and I am not even sure if they have plans to ever improve it on their cards. If anything, AMD is in danger of Intel eating into its GPU market share, as Intel seems to be going a more value-oriented route - something AMD seems to have abandoned for the most part, as they just make their cards slightly cheaper than Nvidia, who doesn't seem to care, as they currently make the best card on the market.
Given that all arguments around the rights of AMD to x86 were settled in 1995, there's no matter of perspective.

As far as raytracing performance is concerned, I agree; AMD is in danger of falling behind Intel if RDNA4 doesn't improve upon RDNA3 and Intel manages to improve their abysmal performance per watt and per mm^2.
Posted on Reply
#21
arbiter
ZeroFM said: Except AMD is doing it the right way! Open source plus backwards support for older GPUs, versus Ngreedia, whose $2,000 3090 Ti isn't even two years old and can't have the latest DLSS.
If it wasn't open source it likely wouldn't get used, given AMD's history of lackluster documentation and of putting very little work behind features while expecting game devs to "figure it all out".
AnotherReader said: Given that all arguments around the rights of AMD to x86 were settled in 1995, there's no matter of perspective.

As far as raytracing performance is concerned, I agree; AMD is in danger of falling behind Intel if RDNA4 doesn't improve upon RDNA3 and Intel manages to improve their abysmal performance per watt and per mm^2.
Intel is far behind in GPU tech versus two companies that have been in the game for 15+ years. Their performance isn't great, but they are pricing things around where they compete. More competition is rarely a bad thing if they can make some waves in a few years, making gains each time.
Posted on Reply
#22
Hecate91
arbiter said: Except it's worth pointing out that AMD's tech does the exact same thing you're crying about Nvidia doing - but team red can do no wrong, right?
Except it is AMD's tech which allows Nvidia users with 20 and 30 series cards to use frame generation.
arbiter said: If it wasn't open source it likely wouldn't get used, given AMD's history of lackluster documentation and of putting very little work behind features while expecting game devs to "figure it all out".
Making tech open source is, for example, why FreeSync monitors are more affordable than G-Sync-only monitors, and why people with 20 and 30 series cards can run games at high frame rates if they want to use frame gen. Nvidia would rather you buy a new GPU than allow the 20 and 30 series to use more features.
I find it interesting that Nvidia users complain that AMD doesn't have a feature, then still complain when AMD adds it and allows everyone to use it.
Posted on Reply
#23
Zendou
AnotherReader said: Given that all arguments around the rights of AMD to x86 were settled in 1995, there's no matter of perspective.

As far as raytracing performance is concerned, I agree; AMD is in danger of falling behind Intel if RDNA4 doesn't improve upon RDNA3 and Intel manages to improve their abysmal performance per watt and per mm^2.
There is a matter of perspective; I gave a nuanced argument around the subject, but you can believe whatever you like - that can be your truth.

Moving along, I do love this new battle cry of performance per watt that seems to coincide with a very specific company. What truly matters is the end result. Regardless of how inefficient the 4090 is, nothing AMD makes holds a candle to it (except in poorly optimized games). If Intel gets its drivers in a better place, it will beat AMD, because when you throw those watts into a calculator over the course of a year they amount to nothing meaningful, and the metric will only serve as one of the last bastions favoring a company that is no better or worse than its competition ethically.
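Just to put some illustrative numbers on that point (the 100 W gap, three hours of gaming a day and $0.15 per kWh below are made-up round figures, not measurements - plug in your own):

```python
# Back-of-the-envelope annual running-cost difference between two GPUs.
# Every input is an assumed round number for illustration, not a measured figure.
extra_power_w = 100    # assumed extra draw of the less efficient card, in watts
hours_per_day = 3      # assumed daily gaming time
price_per_kwh = 0.15   # assumed electricity price in USD per kWh

extra_kwh_per_year = extra_power_w / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"{extra_kwh_per_year:.0f} kWh/year, about ${extra_cost_per_year:.0f}/year")
# With these numbers: roughly 110 kWh and about $16 per year.
```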

As for the per-mm^2 subject, I have not seen anyone bring that one up before. TSMC is the one making the nodes, and AMD does not have much to do with that. Comparing a larger node with a smaller one on efficiency would be unfair in its premise: it ignores performance and transistor density and would just favor whoever paid to have their architecture built on the newest/smallest node. Even then, there are examples where that does not pan out to be a good product. Was the Radeon VII, which AMD released to be the first 7nm gaming card, any good? Was it quickly usurped by AMD's own 5700 XT shortly after, with all support abandoned for it? No to the former and yes to the latter. In fact, since Nvidia ended up using Samsung's 8nm process, there was no reason to release a subpar rebranded workstation card, but they did it regardless. The fact that they have not released a Ryzen 3 CPU for the 7000 series yet, and that their "budget" GPUs are laptop graphics cards that would not work for the main audience they should be catering to because they decided not to populate all the PCIe lanes, are not positives. AMD is not some paragon of virtue, and by no means are the others. I do not understand this need to act as though they are.
Posted on Reply
#24
AnotherReader
Zendou said: There is a matter of perspective; I gave a nuanced argument around the subject, but you can believe whatever you like - that can be your truth.

Moving along, I do love this new battle cry of performance per watt that seems to coincide with a very specific company. What truly matters is the end result. Regardless of how inefficient the 4090 is, nothing AMD makes holds a candle to it (except in poorly optimized games). If Intel gets its drivers in a better place, it will beat AMD, because when you throw those watts into a calculator over the course of a year they amount to nothing meaningful, and the metric will only serve as one of the last bastions favoring a company that is no better or worse than its competition ethically.

As for the per-mm^2 subject, I have not seen anyone bring that one up before. TSMC is the one making the nodes, and AMD does not have much to do with that. Comparing a larger node with a smaller one on efficiency would be unfair in its premise: it ignores performance and transistor density and would just favor whoever paid to have their architecture built on the newest/smallest node. Even then, there are examples where that does not pan out to be a good product. Was the Radeon VII, which AMD released to be the first 7nm gaming card, any good? Was it quickly usurped by AMD's own 5700 XT shortly after, with all support abandoned for it? No to the former and yes to the latter. In fact, since Nvidia ended up using Samsung's 8nm process, there was no reason to release a subpar rebranded workstation card, but they did it regardless. The fact that they have not released a Ryzen 3 CPU for the 7000 series yet, and that their "budget" GPUs are laptop graphics cards that would not work for the main audience they should be catering to because they decided not to populate all the PCIe lanes, are not positives. AMD is not some paragon of virtue, and by no means are the others. I do not understand this need to act as though they are.
Let's not exaggerate; I don't believe that any of these companies are paragons of virtue. However, Apple, Intel, and Nvidia are unarguably worse than AMD when it comes to anti-competitive tactics. As far as Alchemist's lackluster performance is concerned, it has nothing to do with drivers or the 4090 being top dog. By now, they have had well over a year to work on their drivers and they are much improved. Let's compare two GPUs built using the same process: the A770 and the RX 7600. The A770 has a die size of 406 mm^2 while the RX 7600 is only 204 mm^2. The A770 is almost twice the size of the latter, but it struggles to distinguish itself even in newer games. Overall, in TechPowerUp's latest reviews of GPUs with comparable performance, the A770 has lower average fps than the 7600 at 1080p. This is despite using 50% more power to render. Unlike AMD's Vega or RDNA 1, Intel didn't suffer from a lack of resources when they were designing Alchemist. The A770 clocks only a little lower than the 7600. Contrast this with AMD, where RDNA 2 clocks much higher than RDNA 1. Given that instruction latencies stayed unchanged between RDNA 1 and RDNA 2, this suggests a lack of resources. Given all this, it would be a minor miracle if Battlemage managed to match the 4070.
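Putting those figures side by side (die sizes as quoted above; the equal-fps and 50%-more-power inputs are the rough claims from the post treated as exact, not TechPowerUp's measured numbers):

```python
# Ratios implied by the A770 vs RX 7600 comparison above.
a770_area_mm2, rx7600_area_mm2 = 406, 204  # die sizes stated in the post
fps_ratio = 1.0    # assume the A770 at best matches the RX 7600's average 1080p fps
power_ratio = 1.5  # the post's "50% more power to render", treated as exact

area_ratio = a770_area_mm2 / rx7600_area_mm2  # ~1.99x the silicon area
perf_per_watt = fps_ratio / power_ratio       # ~0.67x the efficiency
perf_per_mm2 = fps_ratio / area_ratio         # ~0.50x the performance per area

print(f"A770 vs RX 7600: {area_ratio:.2f}x area, "
      f"{perf_per_watt:.2f}x perf/W, {perf_per_mm2:.2f}x perf/mm^2")
```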

I believe we have gone off on a tangent. As far as this mod is concerned, it is good that owners of cards based on Ampere and Turing can experiment with frame generation and make up their own minds about its usefulness. It is also another example of why some people prefer AMD to Nvidia.
Posted on Reply
#25
Zendou
AnotherReader said: Let's not exaggerate; I don't believe that any of these companies are paragons of virtue. However, Apple, Intel, and Nvidia are unarguably worse than AMD when it comes to anti-competitive tactics. As far as Alchemist's lackluster performance is concerned, it has nothing to do with drivers or the 4090 being top dog. By now, they have had well over a year to work on their drivers and they are much improved. Let's compare two GPUs built using the same process: the A770 and the RX 7600. The A770 has a die size of 406 mm^2 while the RX 7600 is only 204 mm^2. The A770 is almost twice the size of the latter, but it struggles to distinguish itself even in newer games. Overall, in TechPowerUp's latest reviews of GPUs with comparable performance, the A770 has lower average fps than the 7600 at 1080p. This is despite using 50% more power to render. Unlike AMD's Vega or RDNA 1, Intel didn't suffer from a lack of resources when they were designing Alchemist. The A770 clocks only a little lower than the 7600. Contrast this with AMD, where RDNA 2 clocks much higher than RDNA 1. Given that instruction latencies stayed unchanged between RDNA 1 and RDNA 2, this suggests a lack of resources. Given all this, it would be a minor miracle if Battlemage managed to match the 4070.

I believe we have gone off on a tangent. As far as this mod is concerned, it is good that owners of cards based on Ampere and Turing can experiment with frame generation and make up their own minds about its usefulness. It is also another example of why some people prefer AMD to Nvidia.
I would say that a possible reason for AMD not being in that position is that they have never had the market share to try. If they were the dominant platform and could, they probably would.

As for the Intel subject, that is just cherry-picking of information. The A770 is not and has not been Intel's best-value card; that would be the A750, which has most of the A770's performance at a reduced cost. As seen in the performance-per-dollar chart in the review of the A580 (which is an actually important metric), the A750 has a 24% lead at 1080p, 16% at 1440p, and 23% at 4K when compared to the 7600. The drivers are still being much improved; in fact, per the TPU article dated 01/24/24, certain DX11 games got a +268% performance increase, so I would say there is still more headroom for gains.

I can understand when cost is a concern; however, you seem to be focusing on efficiency and die size, which really do not affect most people in any meaningful way.
Posted on Reply