
AMD Could Tease DLSS 3-rivaling FSR 3.0 at GDC 2023

as an AMD fanboy even I go /doubt

lulz.

The bar is not high to beat DLSS 3.0.

Yeah I didn't quite understand the whole point of FG in the first place. It's a bit of a self-defeating exercise: if you can make do with higher latency, you generally don't need high FPS, and if you want low latency, you'll want high FPS. In both cases, FG misses the mark.

So now you get that cinematic-looking game running at 120 FPS... with the latency of 60. What's the damn point? Especially if you do run the minor risk of interpolation artifacts.

Yep, the technology is kind of pointless unless they figure out a way to generate the next frame and get rid of the latency hit.
 
Yep, the technology is kind of pointless unless they figure out a way to generate the next frame and get rid of the latency hit.
That is impossible as long as you generate the extra frames without taking user input into account, which is what the whole idea of frame generation is about.
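
To put that in code: a toy Python sketch (my own illustration, not actual DLSS/FSR code; render() and interpolate() are made-up stand-ins) of why a generated frame can't react to new input:

```python
def render(frame_id: int, player_x: float) -> dict:
    """Stand-in for the engine rendering a frame from already-sampled input."""
    return {"id": frame_id, "player_x": player_x}

def interpolate(a: dict, b: dict) -> dict:
    """Synthesize the in-between frame purely from the two rendered frames."""
    return {"id": (a["id"] + b["id"]) / 2,
            "player_x": (a["player_x"] + b["player_x"]) / 2}

frame1 = render(1, player_x=0.0)  # input was sampled before frame 1
frame2 = render(2, player_x=1.0)  # input was sampled before frame 2
# The player yanks the mouse NOW -- too late: the generated frame below is
# already fully determined by frames 1 and 2; no input is read anywhere here.
print(interpolate(frame1, frame2))  # {'id': 1.5, 'player_x': 0.5}
```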
 
FSR and DLSS are just fancy ways of reducing image quality or making other sacrifices to get performance?

You can probably just tweak game graphics settings slightly for similar results in most titles.
I'm only basing this opinion on one of Moore's Law Is Dead's guests, a game developer. He put it roughly like this: it will probably make devs lazy, but it also makes it easy for them to develop games that are already optimized from the get-go, and if they have to adopt it and do it natively, they're fine with that. But he pointed out that we're now at a stage where users' expectations of graphics quality matter a lot, and that requires an atrocious amount of geometry processing; doing it natively will demand a very expensive GPU like the 4090.

And if their game only runs on that GPU, how the hell can they sell it when only a handful of gamers can afford one? That's where the beauty of upscaling like FSR and DLSS comes in. In a sense, these technologies save us money by letting us buy a mid-tier card instead of stretching our wallets for a 4080 or 4090, and at the same time they help the gaming industry by letting devs ship games built around FSR/DLSS while still capitalizing on raw hardware specs like DisplayPort 2.1 bandwidth and the sharpness of 4K/8K resolution, reaching that quality at upscaled settings.

The point is: if you have a 1080p or 2K monitor, run your games at native resolution; if you have a 4K monitor, then for the love of games run them with FSR/DLSS, so you can enjoy the sharpness of your 4K resolution while playing at high fps. I don't have a 4K monitor myself, but with that much sharpness it's probably hard to pinpoint the rough edges you can see clearly on a 1080p monitor.
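
For anyone wondering what running upscaled at 4K means in numbers, here's a quick sketch using FSR 2's published per-axis scale factors (the helper function is just mine):

```python
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before FSR upscales to the output."""
    scale = FSR2_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

for mode in FSR2_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders internally at {w}x{h}")
# Quality mode builds the 3840x2160 output from a 2560x1440 render,
# which is where the big fps gains come from.
```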

I'm not fluent in English; my apologies if my grammar is bad here.
 
The bar is not high to beat DLSS 3.0.

Yep, the technology is kind of pointless unless they figure out a way to generate the next frame and get rid of the latency hit.

I was thinking more about frame generation when I made that comment.
 
Not necessarily accurate; you're only taking DLSS 3's implementation of frame generation into account. In some ways AMD has admitted that it's somewhat like frame generation, but the way they implement their AI processing is different. It will only become clear once they truly unveil FSR 3.
It is accurate. Frame generation will never decrease input latency because you're not actually interacting with the frames the frame generation is producing.

Latency is increased slightly, though typically not by a noticeable amount (+5-10 ms), because it's an extra thing, in this case inserting frames, that the GPU has to do.
 
If it's not limited to AMD cards only, I'm all in.
Yeah, you should buy Nvidia instead because DLSS works great on my RX 6800. :roll:
 
That is impossible as long as you generate the extra frames without taking user input into account, which is what the whole idea of frame generation is about.

Yes, impossible with the way game engines are currently designed. You'd need access to a bunch of data in addition to user inputs that would only be available once the CPU completes another main game loop.

It's likely a better approach to reduce the CPU overhead associated with feeding the GPU; that way you can generate real frames instead of fake, latent frames.
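
Roughly, the loop in question (a simplified sketch, not real engine code):

```python
def poll_input(tick: int) -> dict:
    """Stand-in for reading mouse/keyboard state."""
    return {"move": 1}

def simulate(state: dict, inputs: dict) -> dict:
    """CPU-side game logic: new input only affects the world here."""
    return {"pos": state["pos"] + inputs["move"]}

def render(state: dict) -> str:
    """GPU work derives entirely from the state simulate() produced."""
    return f"frame at pos {state['pos']}"

state = {"pos": 0}
for tick in range(3):  # three passes of the main game loop
    state = simulate(state, poll_input(tick))
    print(render(state))
    # An extrapolated "real" frame would have to slot in here, but there is
    # no fresh state for it until the CPU finishes another pass above --
    # hence the argument for cutting CPU overhead instead.
```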

It is accurate. Frame generation will never decrease input latency because you're not actually interacting with the frames the frame generation is producing.

Latency is increased slightly, though typically not by a noticeable amount (+5-10 ms), because it's an extra thing, in this case inserting frames, that the GPU has to do.

The latency hit has been in excess of 30 ms in some reviews. As HardwareUnboxed points out, the latency hit is noticeable if your initial FPS is too low, and it doesn't make sense to enable it at all if your FPS is already high. I don't remember the exact sweet-spot numbers, but I believe you want to be between 70 FPS and 120 FPS for the benefits to outweigh the cons.
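
For a rough sense of why that sweet spot exists, some back-of-the-envelope math (my own simplified model, not HardwareUnboxed's methodology; the ~3 ms generation cost is an assumption, and real pipelines add queueing on top, which is how reviews get to 30 ms+):

```python
def min_added_latency_ms(base_fps: float, gen_cost_ms: float = 3.0) -> float:
    """Lower bound on added latency: half a base frame of hold-back, plus
    an assumed ~3 ms to synthesize the in-between frame."""
    return (1000 / base_fps) / 2 + gen_cost_ms

for fps in (30, 70, 120):
    print(f"{fps:>3} fps base -> at least {min_added_latency_ms(fps):.1f} ms added")
# 30 fps base:  ~19.7 ms on top of an already sluggish game
# 70 fps base:  ~10.1 ms, small next to ~14 ms frame times
# 120 fps base: ~7.2 ms, but you barely need the extra frames up there
```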
 
Let's hope this move by AMD will push Nvidia to let the RTX 3000 series use DLSS 3.0.
 
It is accurate. Frame generation will never decrease input latency because you're not actually interacting with the frames the frame generation is producing.

Latency is increased slightly, though typically not by a noticeable amount (+5-10 ms), because it's an extra thing, in this case inserting frames, that the GPU has to do.
When I commented on FSR 3, I wasn't talking about latency, just fps performance. But since you brought it up, you do have a point there; how RDNA 3's AI accelerators are used for Fluid Motion Frames with respect to the latency penalty is yet to be seen in games. With FSR 3 + Fluid Motion Frames + HYPR-RX, it seems like they're considering every angle to boost the performance of their RDNA 3 cards.
 
He put it roughly like this: it will probably make devs lazy, but it also makes it easy for them to develop games that are already optimized from the get-go

There's an easier way around this that most companies are already using or jumping on: just use an established engine (usually Unreal Engine, but there are other options; Decima from Guerrilla Games seems pretty nice, for example) instead of reinventing the wheel over and over.

It's a compromise between doing new things all the time and meeting basic quality goals without major sacrifices elsewhere. Given how the quality of new releases has been decreasing further and further without being particularly innovative, I'd say it won't be that much of a sacrifice for a while.
 
AMD should make it so FSR 3 leverages Tensor cores; then developers could abandon DLSS, as they would use AMD's open-source FSR and get accelerated performance on both Nvidia and AMD cards. It could also leverage whatever Intel uses in Alchemist. Having to support three different upscaling technologies must be a total PITA.
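
For a sense of why it's a PITA, every engine ends up writing an abstraction like this (hypothetical Python sketch; the class names are made up, though cross-vendor wrappers such as Nvidia's Streamline exist for exactly this reason):

```python
from abc import ABC, abstractmethod

class Upscaler(ABC):
    """The one interface the renderer talks to, whatever the vendor."""
    @abstractmethod
    def upscale(self, frame, motion_vectors, depth): ...

class FSRBackend(Upscaler):
    def upscale(self, frame, motion_vectors, depth):
        return f"FSR({frame})"   # would call into the FSR library here

class DLSSBackend(Upscaler):
    def upscale(self, frame, motion_vectors, depth):
        return f"DLSS({frame})"  # would call into the DLSS SDK here

class XeSSBackend(Upscaler):
    def upscale(self, frame, motion_vectors, depth):
        return f"XeSS({frame})"  # would call into the XeSS SDK here

def pick_backend(gpu_vendor: str) -> Upscaler:
    backends = {"amd": FSRBackend, "nvidia": DLSSBackend, "intel": XeSSBackend}
    return backends.get(gpu_vendor, FSRBackend)()  # FSR runs on anything

print(pick_backend("nvidia").upscale("frame0", None, None))  # DLSS(frame0)
```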
 
Very nice, I look forward to this potentially doubling my performance in games that use FSR 2. Thanks, AMD and their investors!
 
The way AMD tails after NV with those "upscaling" features (FSRx/DLSSx) just shows why NV can charge more for their products.
Everyone agrees that it's a must-have thing now, and the game is about who has more of them, at what quality (and with what game adoption).
Good luck to both, as Arc still stands shy in the corner, yet to enter the big boys' fight.
 
There's an easier way around this that most companies are already using or jumping on: just use an established engine (usually Unreal Engine, but there are other options; Decima from Guerrilla Games seems pretty nice, for example) instead of reinventing the wheel over and over.

It's a compromise between doing new things all the time and meeting basic quality goals without major sacrifices elsewhere. Given how the quality of new releases has been decreasing further and further without being particularly innovative, I'd say it won't be that much of a sacrifice for a while.
Nvidia has great marketing. They know that if they did the same thing as everybody else, they would end up having to compete on price. Instead, they keep implementing new, non-standard stuff; this gives them the ability to constantly move the goalposts and keep the competition scrambling behind.
Yes, the users get slightly less performance in the end, but that's not what matters; what matters is that they are prepared to pay more for innovative Nvidia products. Brilliant.
 
DLSS 3 works well. In fact, it's probably better than DLSS 2. Hopefully AMD's answer won't be far behind Nvidia's.
 
The Million Dollar Question: Will AMD expose Nvidia's lies once again by showing a solution that doesn't require dedicated hardware (ASIC) and AI? :P
 
The latency hit has been in excess of 30 ms in some reviews. As HardwareUnboxed points out, the latency hit is noticeable if your initial FPS is too low, and it doesn't make sense to enable it at all if your FPS is already high. I don't remember the exact sweet-spot numbers, but I believe you want to be between 70 FPS and 120 FPS for the benefits to outweigh the cons.
And that, in my opinion, is what makes frame generation useless. I don't need more performance when the game already runs above 70 FPS, and I most definitely don't want more latency when it doesn't.
 
I'm not a fan of the FG feature, and I hardly think I ever will be. You can always gain some FPS by adjusting settings. Plus, there are those bugs and image quality problems with the generated frames. I only hope this feature won't make companies produce far less powerful GPUs to make a quick buck and mitigate the low-FPS problem with the feature. If it's there to improve the experience, sure, but I hope we won't get to the point where we rely on it no matter what hardware we get.
 
Yeah I didn't quite understand the whole point of FG in the first place.
Just say it's not for you, because surely you understand the point, if you've seen it with your own eyes at least. It looks more fluid at broadly the same latency as before it was enabled, and entirely reasonable people that have used it give it some merit. It has a point, e.g.:
I upgraded my work PC to RTX 4080, and have been playing with FG on for the last few hours and it's just absolutely stunning. No issues or anything, just double the FPS. I am constantly hoping to find issues to report, but nothing

Don't get me wrong, it's far from perfect and I understand the criticism, nothing is above constructive criticism, but the feature has merit, at least AMD agrees...
 
Just say it's not for you, because surely you understand the point, if you've seen it with your own eyes at least. It looks more fluid at broadly the same latency as before it was enabled, and entirely reasonable people that have used it give it some merit. It has a point, e.g.:
Sure, but is it that great below 60 FPS as well? I think we'll see how great FG really is once FG-capable graphics cards start running the newest games below acceptable frame rates. If it makes 60 out of 30 without added latency, I'll agree that it's great. Making 200 FPS out of 100 is snake oil territory for me.

Don't get me wrong, it's far from perfect and I understand the criticism, nothing is above constructive criticism, but the feature has merit, at least AMD agrees...
It's not about agreeing. It's about following trends to stay competitive.
 
but the feature has merit, at least AMD agrees...
But for what purpose? That's the real question I'm talking about.
 
But for what purpose? That's the real question I'm talking about.
For influencers, testers, and the general public, which is more than 90% of the market. You must realize that the average tech-savvy TechPowerUp forumite is at least in the top 5% of the population, understanding-wise. So Nvidia's marketing doesn't work on you; well, they still win with the other 95% of the population, and they force AMD to react and scramble instead of innovating, because AMD is also interested in that larger market.
 
Making 200 FPS out of 100 is snake oil territory for me.
I mean, sure, that's one end of the useful range; from what I've seen it's excellent at turning 50-70 fps into 80-120 fps, and it feels and looks fantastic.

I'd highly recommend finding a way to try it for yourself, and hey you might still think it's snake oil after, but it's the only real way to get a sense of it. I found it very impressive.
 
I mean, sure, that's one end of the useful range; from what I've seen it's excellent at turning 50-70 fps into 80-120 fps, and it feels and looks fantastic.

I'd highly recommend finding a way to try it for yourself, and hey you might still think it's snake oil after, but it's the only real way to get a sense of it. I found it very impressive.
That's what I mean: the 50-70 FPS range is fluid enough for me not to want anything more. I play with a driver-level 60 FPS lock anyways. Freesync kicks in at 48 Hz/FPS on my monitor, so I'm not even bothered by minor fluctuations. When we can test how FG makes 60 FPS out of 30, I'll be interested enough to form a more elaborate opinion about it. Until then, it's snake oil (imo).
 
That's what I mean: the 50-70 FPS range is fluid enough for me not to want anything more. I play with a driver-level 60 FPS lock anyways. Freesync kicks in at 48 Hz/FPS on my monitor, so I'm not even bothered by minor fluctuations. When we can test how FG makes 60 FPS out of 30, I'll be interested enough to form a more elaborate opinion about it. Until then, it's snake oil (imo).
Well yeah, if as a gamer all you want is 1080p60 without visual bells and whistles (like RT), then I doubt FG of any flavour is for you. Having tried it, I see it as a great little piece of tech that essentially has no downsides; sure, it doesn't improve latency AND visual fluidity, but just one of the two is still a net benefit to the experience, and as you know from me, high fidelity and high framerates are right up my alley.

Really keen to see if AMD can pull a rabbit out of the hat on this one; it took a minute, but they basically did with FSR 1.0 and 2.x, all things considered.
 