
NVIDIA's Frame Generation Technology Could Come to GeForce RTX 30 Series

But nvidia greedy no?

I mean, Nvidia backported all the DLSS improvements as far back as Turing (2018), GPUs that are still capable of running new games just fine (contrary to its competition at the time, lol), but sure, planned obsolescence and ngreedia.
Yes, the Nvidia hate is strong. I'm not aware of a single feature they've ever artificially restricted to newer hardware, yet people continue to expect them to do it in every single instance and make up reasons to claim it's true when older hardware isn't supported. The caveat to this, of course, is that Nvidia likes to maintain their premium brand image, so they don't want to downgrade features to run on older hardware (like, for instance, making a lower-quality version of DLSS that could run on GTX cards).

In any case, I wouldn't get too excited for frame gen on the 30 series, considering it may not improve performance as much as people expect on the older/slower hardware, and it also uses up a bunch of VRAM that those cards don't really have to spare in modern games where frame gen would be most useful.
 
Seeing that MFG or "fake frames" is considered a joke in the community (and by potential customers), maybe this is an attempt to convince people - by making it available on their current GPUs - that this is a useful feature and not just a marketing gimmick

especially if the raw performance gains of the new 50xx series are not that impressive
 
Seeing that MFG or "fake frames" is considered a joke in the community
It's because you are listening to the vocal tiny minority that doesn't have access to them. Everyone else is busy enjoying DLSS and FG.
 
I knew this would happen even back during the Ada launch, but everybody was like "nah, it's got this super-duper advanced high tech optical flow thingy, so it's not possible". Yeah, right. :rolleyes:

So c'mon lads, get your piping hot RTX 50 series GPUs today, because DLSS 4 definitely won't run on older hardware, pinkie promise. :roll:
Totally agree.
They did the same with DLSS on older cards. The same will go for DLSS Frame Generation. I'm disappointed Nvidia didn't make any serious hardware improvements or advancements with the 50 series.

Skip---> click here.

Here's hoping the next Gen will be next Gen.
 
"and we don't talk about our $7 competition. Have good day."
 
I'm not aware of a single feature they've ever artificially restricted to newer hardware, yet people continue to expect them to do it in every single instance and make up reasons to claim it's true when older hardware isn't supported. The caveat to this, of course, is that Nvidia likes to maintain their premium brand image, so they don't want to downgrade features to run on older hardware (like, for instance, making a lower-quality version of DLSS that could run on GTX cards).

In any case, I wouldn't get too excited for frame gen on the 30 series, considering it may not improve performance as much as people expect on the older/slower hardware, and it also uses up a bunch of VRAM that those cards don't really have to spare in modern games where frame gen would be most useful.

They didn't support VESA Adaptive Sync for a while before they finally caved and added it to the drivers; you needed to use a monitor with a G-Sync module with an Nvidia card.

Shortly after they acquired PhysX, they disabled hardware acceleration if the drivers detected a Radeon card in your system. So even if you bought and paid for an Nvidia card to use for PhysX acceleration, you got screwed over.

One of the more useful situations for frame generation is games that have a 60 (or 30) fps cap and can't unlock the framerate. In these titles, the GPU will most likely have computational power to spare for frame generation.
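As a back-of-the-envelope sketch (my own illustrative numbers, nothing official), the pacing argument for a capped game with 2x frame generation looks like this:

```python
# Rough sketch (assumed numbers, not vendor figures): frame pacing in a
# 60 fps capped game, with and without 2x frame generation.
cap_fps = 60.0
frame_time_ms = 1000.0 / cap_fps  # ~16.7 ms between rendered frames

# Without FG: frames are presented at 0, 16.7, 33.3, ... ms.
rendered = [i * frame_time_ms for i in range(4)]

# With 2x FG: a generated frame is shown halfway between each real pair,
# so presentation happens every ~8.3 ms -> 120 fps shown, 60 fps rendered.
presented = [i * frame_time_ms / 2 for i in range(7)]

print("rendered at (ms): ", [f"{t:.1f}" for t in rendered])
print("presented at (ms):", [f"{t:.1f}" for t in presented])
print(f"shown: {cap_fps * 2:.0f} fps, rendered: {cap_fps:.0f} fps")
```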
 
It's because you are listening to the vocal tiny minority that doesn't have access to them. Everyone else is busy enjoying DLSS and FG.
Upscaling, sure, but frame gen? Nope! I would say the ones supporting it are the vocal minority at this point!
 
Thanks, Nvidia! You may be a complete monopoly and overprice most of your products, but you will finally let me have a feature my GPU should have had from the day I bought it. Yay!
 
Thanks, Nvidia! You may be a complete monopoly and overprice most of your products, but you will finally let me have a feature my GPU should have had from the day I bought it. Yay!
Your GPU couldn't have that feature from day one because it needed some highly advanced hardware for it to work, but now it suddenly doesn't. Never mind that AMD never needed said hardware for their tech.

At this point I wouldn't be surprised to find out that Tensor cores actually do nothing in a game and locking DLSS is just an artificial driver limitation.
 
Reminds me of when they brought Intellisample 4.0 to the GF 6 series even though it was first a GF 7 feature.

...yeah, for some reason, I remembered a ~20-year-old scenario where they brought a feature from the newer gen to the older gen, since it was practically artificially limited. :laugh:
 
Seeing that MFG or "fake frames" is considered a joke in the community (and by potential customers), maybe this is an attempt to convince people - by making it available on their current GPUs - that this is a useful feature and not just a marketing gimmick

especially if the raw performance gains of the new 50xx series are not that impressive

Guess what? Everything your video card produces is fake. It’s fake frames all the way down.

Your GPU couldn't have that feature from day one because it needed some highly advanced hardware for it to work, but now it suddenly doesn't. Never mind that AMD never needed said hardware for their tech.

At this point I wouldn't be surprised to find out that Tensor cores actually do nothing in a game and locking DLSS is just an artificial driver limitation.

Or, software research takes time. I mean how many years did it take AMD to develop FSR?
 
Guess what? Everything your video card produces is fake. It’s fake frames all the way down.
Fake in this context = generated from another frame instead of live geometry data.
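To make that concrete, here's a rough sketch, using plain OpenCV optical flow, of how a frame can be "generated" by warping an existing one along estimated motion. Purely illustrative: real DLSS/FSR frame generation uses game-provided motion vectors, depth, and ML models, none of which appear here.

```python
# Illustrative only: a "generated" frame is synthesized from existing
# frames via estimated motion, not rendered from live scene geometry.
import cv2
import numpy as np

def interpolate_midpoint(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize an approximate frame halfway between two BGR frames."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense per-pixel motion estimate from frame_a towards frame_b.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # Warp frame_a halfway along the flow field. This is a crude
    # approximation; proper interpolation would also blend frame_b
    # and handle occlusions.
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y + 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```

Even this crude version shows where the artifacts come from: the warp has no idea what's behind a moving object, which is why generated frames tend to smear around occlusions.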

Or, software research takes time. I mean how many years did it take AMD to develop FSR?
Is that what Nvidia wants us to believe? That it didn't take them long to develop it for Ada, only for Ampere? And that they happened to get it finished right in time for the Blackwell launch? I'm not buying any of it.
 
The same old fake Nvidia promises. It's 2025 and there's still no Resizable BAR for the RTX 2000 series. Nvidia simply doesn't want to support past-generation GPUs; they sell software updates like hardware upgrades.
 
The same old fake Nvidia promises. It's 2025 and there's still no Resizable BAR for the RTX 2000 series. Nvidia simply doesn't want to support past-generation GPUs; they sell software updates like hardware upgrades.

Sure, that’s why every feature of DLSS 4 outside of MFG is going to work on GPUs going all the way back to Turing, because they don’t want to support prior generations :kookoo:

Is that what Nvidia wants us to believe? That it didn't take them long to develop it for Ada, only for Ampere? And that they happened to get it finished right in time for the Blackwell launch? I'm not buying any of it.

What field do you work in? I'm assuming nothing science-related, because you don't seem to understand the basics of scientific research.

Nvidia has a LOT of highly educated, well-paid engineers on this. Pathfinding takes time and research - it's literally why it's called pathfinding.
 
What field do you work in? I'm assuming nothing science-related, because you don't seem to understand the basics of scientific research.

Nvidia has a LOT of highly educated, well-paid engineers on this. Pathfinding takes time and research - it's literally why it's called pathfinding.
They've also got a lot of highly educated and well-paid marketing people. Just saying.

I know enough about science to know that blindly trusting a company (primary source) whose main goal is to sell you stuff without any info from independent sources is dumb.
 
They've also got a lot of highly educated and well-paid marketing people. Just saying.

I know enough about science to know that blindly trusting a company (primary source) whose main goal is to sell you stuff without any info from independent sources is dumb.

Ok, you’re a layman who doesn’t know what he doesn’t know. Wikipedia has an article about it.
 
Is that what Nvidia wants us to believe? That it didn't take them long to develop it for Ada, only for Ampere? And that they happened to get it finished right in time for the Blackwell launch? I'm not buying any of it.
But who uses it anyway? It increases latency, blurs the image, lowers the details, makes textures bland, et cetera. So who cares if it was supported in previous gens or not, aye?
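The latency complaint at least has a simple mechanical basis. A simplified model (illustrative assumptions, not measurements): an interpolated frame between N and N+1 can't be shown until N+1 has been rendered, so N gets held back roughly one render interval, and the penalty grows as the base framerate drops:

```python
# Simplified latency model for interpolation-based frame generation
# (illustrative only): the generated frame between N and N+1 needs N+1,
# so presenting frame N is delayed by about one render interval. The
# penalty is worst at low base framerates, where FG is most tempting.
for base_fps in (120, 60, 30):
    held_ms = 1000.0 / base_fps
    print(f"{base_fps:>3} fps base -> ~{held_ms:.1f} ms added latency")
```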
 
But who uses it anyway? It increases latency, blurs the image, lowers the details, makes textures bland, et cetera. So who cares if it was supported in previous gens or not, aye?
Now that is a very good point.
 
Now that is a very good point.
It is a good point, but it's only used when it suits an argument against Nvidia. In this thread it doesn't, so how dare you, Nvidia, not already have support for it in older gens. But when it comes to a DLSS thread, it's useless and nobody uses it, blah blah blah. It's fascinating.

I mean, in this very thread I've read about how fine FSR FG is running, so obviously it doesn't need all the Nvidia tech to run FG. No mention of loss of detail here. If you go to a thread about RT, oh boy, FG is so bad and the loss of image quality is insane.
 
It is a good point, but it's only used when it suits an argument against Nvidia. In this thread it doesn't, so how dare you, Nvidia, not already have support for it in older gens. But when it comes to a DLSS thread, it's useless and nobody uses it, blah blah blah. It's fascinating.
I don't care if it supports it or not. I just can't help noticing the irony in all the negativity I got during the Ada launch when I suspected that it was an artificial limit. But it seems some people still believe in fairy tales. :rolleyes:
 
I don't care if it supports it or not. I just can't help noticing the irony in all the negativity I got during the Ada launch when I suspected that it was an artificial limit. But it seems some people still believe in fairy tales. :rolleyes:
Likely because you had and still have no data to support your claim. If it was an AMD card, it wouldn't be an artificial limit. It's your preconceived notion that nvidia is eveeel and ngreedia that makes you think that.

Like, even if it was an artificial limit, who cares, honestly. There was no FG when you bought the card; you thought it was a good deal without FG, and that's why you bought it. Why you expect someone to do something for free is beyond me, honestly.
 
Likely because you had and still have no data to support your claim.
Does Nvidia? As far as I know, there are no technical details available on Tensor cores, DLSS or FG. Nobody except for an Nvidia engineer knows how they work. Therefore, I believe what I want, and I'd rather not put my trust in a company whose goal is making profit, whether it's called Nvidia, Intel or AMD.

If it was an AMD card, it wouldn't be an artificial limit. It's your preconceived notion that nvidia is eveeel and ngreedia that makes you think that.
Ah, the typical "you said something wrong about Nvidia -> it can only be because you're not a fan -> all your points are irrelevant" argument. Very childish.

This is not an AMD vs Nvidia fight, so please don't make it so. This is a fight for our rights to be informed consumers regardless of the colour of the box. The "you must love everything about Nvidia, otherwise you're an AMD fan and can't have an opinion" bullshit is getting very tiresome.

If AMD limits FSR 4 to RDNA 4 because of its AI cores, I'll question it just the same, considering that RDNA 3 has AI cores, too. I want details, not excuses.

Like, even if it was an artificial limit, who cares, honestly. There was no FG when you bought the card; you thought it was a good deal without FG, and that's why you bought it. Why you expect someone to do something for free is beyond me, honestly.
Do you think it's unimaginable that someone swapped a 30-series card for a 40-series one specifically for FG? Then why is FG such a huge part of Nvidia's marketing?
 
Does Nvidia? As far as I know, there are no technical details available on Tensor cores, DLSS or FG. Nobody except for an Nvidia engineer knows how they work. Therefore, I believe what I want, and I'd rather not put my trust in a company whose goal is making profit, whether it's called Nvidia, Intel or AMD.


Ah, the typical "you said something wrong about Nvidia -> it can only be because you're not a fan -> all your points are irrelevant" argument. Very childish.

This is not an AMD vs Nvidia fight, so please don't make it so. This is a fight for our rights to be informed consumers regardless of the colour of the box. The "you must love everything about Nvidia, otherwise you're an AMD fan and can't have an opinion" bullshit is getting very tiresome.

If AMD limits FSR 4 to RDNA 4 because of its AI cores, I'll question it just the same, considering that RDNA 3 has AI cores, too. I want details, not excuses.


Do you think it's unimaginable that someone swapped a 30-series card for a 40-series one specifically for FG? Then why is FG such a huge part of Nvidia's marketing?
RDNA doesn't actually have AI cores though (matrix units).

Do you think it's unimaginable that someone swapped a 30-series card for a 40-series one specifically for FG? Then why is FG such a huge part of Nvidia's marketing?
Yes, I think it is. I'm not even entirely sure what you are arguing here. That if Nvidia tried, they could somehow make it work on Ampere and Turing? Sure they could, I have no doubt about it. The question is: why should they spend resources doing that for free, and what would the end result be? Would it actually work properly? And if it didn't, we would go back to the same argument about artificial limitations and Nvidia being greedy, making it work like crap.

You keep acting like you are unbiased, and how dare I state otherwise, but your posts... I mean, you posted the other day about how the GPU market is crap, but you had to add that one manufacturer is mostly to blame. You keep adding that poisonous bias into your posts. It is what it is.
 