Tuesday, March 7th 2023

AMD Could Tease DLSS 3-rivaling FSR 3.0 at GDC 2023

AMD could tease its next-generation graphics performance enhancement rivaling NVIDIA DLSS 3, at the 2023 Game Developers Conference (GDC 2023), slated for March 23. While the company didn't name it, its GDC 2023 session brief references an "exciting sneak peek of new FidelityFX technologies" that will be "available soon," meaning that it isn't the recently released FSR 2.2. We expect this to be the very first look at FSR 3.0.

AMD hastily dropped the first mention of FSR 3.0 into its Radeon RX 7900 series RDNA 3 announcement presentation. The company revealed precious few details about the new technology beyond the claim that it offers double the frame rate of FSR 2 (at comparable image quality). Does this involve a frame-rate doubling technology similar to DLSS 3? We don't know yet. It could just be a more advanced upscaling algorithm that doubles performance at a given quality target compared to FSR 2. We'll know for sure later this month. It would be a coup of sorts for AMD if FSR 3.0 doesn't require RX 7000-series GPUs and can run on older Radeon GPUs, whereas DLSS 3 requires the latest GeForce RTX 40-series GPUs.
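Neither the session brief nor the RDNA 3 slide explains how FSR 3's frame-rate doubling would actually work. As a purely illustrative sketch (not AMD's or NVIDIA's method), generated frames conceptually amount to synthesizing an extra image between two rendered ones; the naive version below simply blends two frames with NumPy, whereas real implementations rely on motion vectors or optical flow to avoid ghosting.

# Minimal, purely illustrative sketch of interpolated frame generation.
# Assumes nothing about AMD's or NVIDIA's actual algorithms.
import numpy as np

def generate_intermediate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                                t: float = 0.5) -> np.ndarray:
    """Linearly blend two rendered frames at position t in [0, 1]."""
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two fake 1080p RGB frames standing in for consecutive rendered frames.
frame_n = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_n_plus_1 = np.full((1080, 1920, 3), 255, dtype=np.uint8)

# The synthesized frame is shown between the two real ones, doubling the
# displayed frame rate without the GPU rendering an extra frame itself.
print(generate_intermediate_frame(frame_n, frame_n_plus_1)[0, 0])  # [127 127 127]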
Sources: Lance Lee (Twitter), VideoCardz

70 Comments on AMD Could Tease DLSS 3-rivaling FSR 3.0 at GDC 2023

#51
evernessince
wolfWell yeah, if as a gamer all you want is 1080p60 without visual bells and whistles (like RT), then I doubt FG of any flavour is for you. Having tried it, I see it as a great little piece of tech that essentially has no downsides; sure, it doesn't improve latency AND visual fluidity, but just one is still a net benefit to the experience, and as you know from me, high fidelity and high framerates are right up my alley.

Really keen to see if AMD can pull a rabbit out of a hat on this one, it took a minute, but they basically did with FSR 1.0 and 2.X all things considered.
The problem is there are significant downsides. The latency hit is massive at lower framerates and the visual quality of the generated frames is lower than native frames. You are going to see a serious latency penalty if you are trying to crank up the visuals with a 30 FPS base, especially in a fast-moving game where the AI has to make larger guesses and smears larger sections of the frame (a rough sketch of the latency math is at the end of this post).

At the end of the day it's really going to depend on the person whether they want to enable it or not. FG isn't going to be as popular as DLSS or FSR without a fundamental change to how the tech works (assuming that's possible). It will likely never be suitable for VR / AR / XR applications, as any added latency in those applications can increase motion sickness (this can apply to desktop monitor usage as well, but to a much lesser degree).
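A rough back-of-the-envelope sketch of that latency point, assuming the generated frame can only be shown once the next real frame exists, so roughly one base-frame interval of delay is added on top of whatever the generation pass itself costs. The numbers are illustrative, not measured DLSS 3 or FSR 3 figures.

def added_latency_ms(base_fps: float) -> float:
    """Approximate extra latency from holding back one real frame, in ms."""
    return 1000.0 / base_fps

for base_fps in (30, 60, 120):
    displayed_fps = base_fps * 2  # one interpolated frame per real frame
    print(f"{base_fps:>3} fps base -> ~{displayed_fps} fps shown, "
          f"~{added_latency_ms(base_fps):.0f} ms extra latency")
# 30 fps base  -> ~60 fps shown,  ~33 ms extra latency
# 120 fps base -> ~240 fps shown, ~8 ms extra latency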
Posted on Reply
#52
wolf
Better Than Native
ratirtI don't think you understand the problems and/or doubts people are bringing here.
Pretty sure I do.
ratirtIt is not about trying and loving it. It is a tech a feature that is giving something that gamers can enjoy.
Not sure where this is going... it is a feature giving something gamers can enjoy, yes. I see all these features as above and beyond the standard (raster) game performance you get. "paying for it".. sure, with nvidia you pay more but you get more, and not just FG, I'm certain you don't need me to elaborate.
ratirtI mean, you say it like there isn't any other way to get better FPS that is one thing.
I mean, I can't help the way you interpret things, but no, that's not how I was saying it, at all, like not even close. I said what I said, don't add meaning to it that I never did or intended to.
ratirtBTW: Just because you have not experienced something or seen it, doesn't mean you know nothing about it. We have brains to think as well not just experience things.
Nothing is too strong a word, but some things in life, a lot of things really, yeah you do need to actually experience it to know. FG is one of those things. You might disagree, fine, but I can say with absolute certainty that the 'blurb' or specs only tell you so much; playing around with it is a must.
evernessinceThe problem is there are significant downsides. The latency hit is massive at lower framerates
I agree, it's not meant for 30 fps experiences. And yeah, again, it's not perfect; there are image quality issues they need to sort out, most prominently UI elements, which they've already begun to address. My point here was latency vs visual fluidity: in the realm where it is useful, the latency remains roughly as it was before, and visual fluidity doubles. I won't sit here and pig-headedly deny the many scenarios where it doesn't work, like poor (30 or less) starting FPS or e-sports titles, but by the same token, having experienced it, I can't deny the situations where it works well, damn well.
Posted on Reply
#53
ratirt
wolfPretty sure I do.
Pretty sure you don't, since you bring the argument 'experience it first if you want to comment about it'. So you honestly don't. It is not about how DLSS 3 works or whether it works OK.
wolfNot sure where this is going... it is a feature giving something gamers can enjoy, yes. I see all these features as above and beyond the standard (raster) game performance you get. "paying for it".. sure, with nvidia you pay more but you get more, and not just FG, I'm certain you don't need me to elaborate.
You get more, you pay more. OK, but that is not the point here. There are still concerns you have not addressed, and they are not related to buying and using it.
wolfI mean, I can't help the way you interpret things, but no, that's not how I was saying it, at all, like not even close. I said what I said, don't add meaning to it that I never did or intended to.
In order to have a conversation or see the viewpoint of another person, you have to take a good look at what has been said. Interpreting what the other person said is crucial to any conversation. I'm not adding any meaning.
wolfNothing is too strong a word, but some things in life, a lot of things really, yeah you do need to actually experience it to know. FG is one of those things. You might disagree, fine, but I can say with absolute certainty that the 'blurb' or specs only tell you so much; playing around with it is a must.
It is not about the functionality per se; it's about my case and my concerns about the feature, for instance. You have not addressed any of my concerns, and then you tell me I interpret your words wrong. Ridiculous, I would say.
Posted on Reply
#54
wolf
Better Than Native
ratirtPretty sure you don't, since you bring the argument 'experience it first if you want to comment about it'. So you honestly don't. It is not about how DLSS 3 works or whether it works OK.
Man, this is getting silly. I never said 'experience it first if you want to comment about it' - if you're going to quote me, actually quote me, like this:
"I'd highly recommend finding a way to try it for yourself, and hey you might still think it's snake oil after, but it's the only real way to get a sense of it. I found it very impressive."
Experience counts, period. An opinion from the spec sheet counts too, sure, but for my money, it counts less. Sorry not sorry.
ratirtIn order to have a conversation or see the viewpoint of another person, you have to take a good look at what has been said. Interpreting what the other person said is crucial to any conversation. I'm not adding any meaning.
I agree 100%, please read what I said carefully. And dude, you said, and I quote:
I mean, you say it like there isn't any other way to get better FPS that is one thing.
You are taking meaning beyond what I said. I never even remotely put forward that there isn't any other way to get better FPS. Don't put words in my mouth; I'll vomit every time, so please, stop it.
ratirtYou have not addressed any of my concerns, and then you tell me I interpret your words wrong. Ridiculous, I would say.
I'm not here to address your concerns; in fact, you are on ignore, and I put my better judgment aside to respond to you, my blunder there, especially given you can't even quote or rebut my points when you appear to be trying. Back to ignoring, methinks. I hope someone else addresses your concerns, or you figure it out yourself.
Posted on Reply
#55
ratirt
wolfI'm not here to address your concerns; in fact, you are on ignore, and I put my better judgment aside to respond to you, my blunder there, especially given you can't even quote or rebut my points when you appear to be trying. Back to ignoring, methinks. I hope someone else addresses your concerns, or you figure it out yourself.
I have noticed. You are here to share your experience with the feature, disregarding anything else that has been said.
For the sake of the argument.
wolfI'd highly recommend finding a way to try it for yourself, and hey you might still think it's snake oil after, but it's the only real way to get a sense of it. I found it very impressive.
For you, the only thing that matters is how it performs, and you have clearly said so, disregarding any other concerns.
So next time somebody raises a subject, try addressing their concerns and talking about what they think, not about your experience. Because I really don't care about that, and I'm sure there are a lot of people who do not care about your experience with a feature when they are asking about totally different things.
Posted on Reply
#56
AusWolf
ratirtI don't think you understand the problems and/or doubts people are bringing here. It is not about trying and loving it. It is a tech a feature that is giving something that gamers can enjoy. I mean, you say it like there isn't any other way to get better FPS that is one thing. Second, you argue that people did not try it so they should not speak about it or raise concerns. For instance, I have not tried it but my concern is not how it works or performs. (it has flaws but these will be addressed for sure) I only hope your 50-70 FPS turning into 80-120 won't be the future of top-end graphics that will be forced to use these technologies to be able to achieve that performance, costing horrendous money (as we see now) still, despite the graphics card's actual performance. In my eyes, that would have been a disaster no matter how great FG, DLSS 3 in this case, is.
That's another thing: I don't want to see the gaming world moving into a state where you need DLSS/FSR, FG and other trickery to achieve acceptable frame rates. I want my 1080p 60 FPS to be exactly that, and not some upscaled, high latency smear on my screen.
Posted on Reply
#57
wolf
Better Than Native
AusWolfThat's another thing: I don't want to see the gaming world moving into a state where you need DLSS/FSR, FG and other trickery to achieve acceptable frame rates. I want my 1080p 60 FPS to be exactly that, and not some upscaled, high latency smear on my screen.
I think I can safely say nobody wants that. If that time is coming, my bet is it's years off yet, as these technologies are so new. The entire gaming community very much wants and needs games to run well enough on low-end hardware without upscaling or frame generation. I do truly believe that standard rendering will remain the baseline for some years, with perhaps FSR 1/2-style upscaling added at most; I dread the day that frame generation is necessary for an entry-level experience. Let's hope that consoles play their part in keeping all that grounded.
Posted on Reply
#58
ratirt
AusWolfThat's another thing: I don't want to see the gaming world moving into a state where you need DLSS/FSR, FG and other trickery to achieve acceptable frame rates. I want my 1080p 60 FPS to be exactly that, and not some upscaled, high latency smear on my screen.
I'm not saying it will come to that, but there is a chance it will. The market has been looking in that direction. If that involves RT and FG, well, RT is demanding and we are not fully there yet. Plus, devs can always make the game even more demanding when RT is involved, and you would need to bump the performance of the cards as well. I only hope it will not stagnate the market, which it kinda already has at this point.
If it is a smear or lacks contrast or whatever, that can be fixed, but if companies start moving in a direction where you purchase hardware with limited or proprietary features, which may or may not lose support in the near future and which you pay a lot for, that's a problem for me, and I would be very cautious with purchases. Especially if you are not sure what direction the company is going. As for latency, in some games it does not really matter, and those games can benefit greatly from the feature if they meet the required conditions. Some games are sensitive to latency, and that might be a problem.
It kinda reminds me of the time when console gaming was limited to 25 or 30 FPS. It was smooth and you could not say a bad thing about it, but then, when you looked over at a PC running 60 FPS, well, it was a different experience.
Posted on Reply
#59
Vayra86
BoboOOZFor influencers, testers and the general public, which is more than 90% of the market. You must realize that the average tech-savvy TechPowerUp forumite is at least in the top 5% of the population understanding-wise. So Nvidia's marketing doesn't work on you; well, they still win over the other 95% of the population, and they force AMD to react and scramble instead of innovate, because AMD is also interested in that larger market.
Correctamundo...
Posted on Reply
#60
Legacy-ZA
Calling it now:

If AMD succeeds in its Frame Generation with FSR 3.0, nVidia will magically find a way to make the old RTX2000/3000 series work with DLSS 3.0. Of course, with a nice PR spin that "they were working on the problem since the DLSS 3.0 release."

:roll:
Posted on Reply
#61
AusWolf
Legacy-ZAIf AMD succeeds in its Frame Generation with FSR 3.0, nVidia will magically find a way to make the old RTX2000/3000 series work with DLSS 3.0.
I've been expecting that ever since DLSS 3.0 was announced. It's all a PR move, like everything with Nvidia these days. They might wait for 4000-series sales to pick up first, though.
Posted on Reply
#62
trsttte
They knew demand was dwindling and about to crash, so they needed a new selling point for the 4000 series. It was clear from the start. "Oh no, it won't perform as well" lol; neither has ray tracing since its inception.
Posted on Reply
#63
medi01
btarunrDoes this involve a frame-rate doubling technology similar to DLSS 3?
Oh, that frame interpolation thing some TVs have had for more than a decade?

Technology is amazing. :D
matarLet's hope this move by AMD will push Nvidia to let the RTX 3000 series use DLSS 3.0
Let's hope more people will grasp the concept of "vote with your wallet".
BoboOOZInstead, they keep implementing new non-standard stuff
G-Sync is so cool indeed. :D :kookoo:
Posted on Reply
#64
dir_d
So Nvidia made SVP4 for games, and now AMD wants to follow suit. I think AMD could do it without AI, and I am looking forward to what they show us.
Posted on Reply
#65
medi01
ratirtI don't think you understand the problems and/or doubts people are bringing here. It is not about trying and loving it. It is a tech a feature that is giving something that gamers can enjoy. I mean, you say it like there isn't any other way to get better FPS that is one thing.
The problem here is that an absolute BS tech that even older TVs could pull off has been successfully sold by green marketing, so now AMD has had to waste time on that shit.

There is simply no good use case for it. If you want higher fps, you want lower latency, not higher.

It also further increases the effort to test stuff.


G-Sync, as shitty as it was, being proprietary, force-feeding useless chips, and driving up prices, at least brought something worthwhile.
dir_ddo it without AI
There is no "AI" to it, bar marketing, either way.
Posted on Reply
#66
BoboOOZ
medi01The problem here is that an absolute BS tech that even older TVs could pull off has been successfully sold by green marketing, so now AMD has had to waste time on that shit.
The old TV argument is not that great. Yes, frame interpolation has been done for a while, but it's very easy when you can afford a few seconds of latency. It's much harder when you are trying to do it in real time.
Posted on Reply
#67
medi01
BoboOOZYes, frame interpolation has been done for a while, but it's very easy when you can afford a few seconds of latency
It was 1 second or so, on crappy chips in ancient TVs. Come on...
Posted on Reply
#68
BoboOOZ
medi01It was 1 second or so, on crappy chips in ancient TVs. Come on...
I mean, let's just not crap all over it before we see what AMD comes up with. It might be a hard problem to solve with only 20 or so milliseconds of added latency.
Posted on Reply
#69
medi01
BoboOOZI mean, let's just not crap all over it before we see what AMD comes up with
There is no way to add interpolated frames without increasing latency, even if it were Jesus himself programming it.
Posted on Reply
#70
BoboOOZ
medi01There is no way to add interpolated frames without increasing latency, even if it were Jesus himself programming it.
Of course, the question is how much latency is added, and how good the interpolated frames will be.
If DLSS/FSR history is anything to go by, and given that AMD is coming to the party later as usual, their solution will offer slightly lower quality and more latency, but it'll work on more graphics cards. So Nvidia will still come out on top, as usual.
Posted on Reply