
NVIDIA DLSS Test in Battlefield V

The Turing generation is very disappointing. RT seems close yet far away because of the huge performance hit, and it's still a gimmick because it's not really full RT rendering. DLSS is even worse because it's just way too limited: it needs to be title-agnostic instead of depending on NVIDIA's supercomputers, and it needs to run locally, something like the free 2x/4x MSAA the Xbox 360 tried to offer when it launched.
The more I think about it, the more I'm convinced that NVIDIA wasted silicon on Tensor cores instead of adding many more regular CUDA cores, which could have improved performance dramatically over Pascal. But to be honest, AMD are to blame too: even two years after Pascal, the 1080 Ti is still beating everything from them...

Unlike DLSS, ray tracing is actually nice to have. The performance loss when playing at 3440x1440 on a 2080 Ti with RTX low isn't that bad, being able to see around corners using reflections is quite useful (until your squad mate spawns on you and exits via the window), and the added immersion from the reflections makes things really pretty. The global illumination in Metro Exodus looks great in the reviews I have seen; it's a pity that I cannot experience it myself due to Epic's greed in charging me more for the game than Steam did... :(
 
Colors are more saturated in DLSS images, while highlights are less pronounced. The conclusion describes a 'softer' image, but that's not what this is; this is detail and definition that is lost and cannot be brought back with machine learning. 'Trees look great' because they don't shimmer... hello?! TAA has been doing just about the same for years, without the blur. This all reads like a desperate struggle for USPs where there aren't any. I guess the FPS counter blinded some people to the truth.

So basically, this technique completely changes the image you're looking at. It's not like AA or a simple upscale at all; you're looking at something radically changed. Yes, radically. The information fed to you on screen is simply different, and it creates new artifacts and imperfections. I, for one, calibrate my monitor to at least not have oversaturated and crushed colors; DLSS brings that right back with no way to eliminate it. The color information simply isn't there.

Hard Pass on this tech so far in every implementation I've seen.

I honestly don't understand how TPU can pass this off as a positive, great thing for gaming. I suppose you have to value in-game RT very highly to take these IQ hits to maintain performance... very strange. For decades we have been hypercritical about AA techniques and their shortcomings, and here we get a low-res upscaled image served with noticeable drawbacks and it's 'a step forward'. I guess I'm weird or something. It gets even better when people support that stance by saying 'so you render at higher res with DLSS'... yeah, and that means you've immediately lost your performance uplift and you're back where you started :roll: with blur. A rough pixel-count sketch of that point is below.
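For illustration, here's a back-of-the-envelope sketch of that argument in Python. The ~1440p internal render resolution for a 4K DLSS output is my assumption for the sake of the example, and the "cost scales with pixel count" model is a simplification, not a measurement:

```python
# Rough pixel-count sketch: why rendering at a higher resolution "to fix" DLSS
# gives back the performance uplift. Assumes DLSS renders internally at roughly
# 2/3 of the output resolution per axis (~1440p internal for a 4K output) and
# that GPU cost scales roughly with pixel count -- both simplifying assumptions.

def pixels(width, height):
    return width * height

native_4k     = pixels(3840, 2160)   # what you'd render without DLSS
dlss_internal = pixels(2560, 1440)   # assumed internal render res for 4K output

print(f"Native 4K:        {native_4k / 1e6:.1f} MPix")
print(f"DLSS internal:    {dlss_internal / 1e6:.1f} MPix "
      f"(~{dlss_internal / native_4k:.0%} of native -> the performance uplift)")

# 'Just render at a higher res with DLSS' so the internal frame is ~3840x2160:
# same pixel count as native rendering, so the uplift is gone -- and the
# reconstruction blur stays.
boosted_internal = pixels(3840, 2160)
print(f"Boosted internal: {boosted_internal / 1e6:.1f} MPix "
      f"(~{boosted_internal / native_4k:.0%} of native -> uplift gone)")
```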

And the best new argument I'm hearing lately... 'it's a fast-paced game, so you won't see it'... uhm... OK. I guess some people really have a different take on reality from time to time. Since the '90s we've been trying to make low-poly, low-res-textured models look as nice as possible, starting with fast-paced shooters like Quake, Doom and Unreal, and now that we've got nice detail at a nice native res, we use downscaled content to throw that away and call it fine because now we can 'run at native res'. You can't get much more ironic than that... And there are simple resolution scale sliders to help you out there: no NVIDIA proprietary BS required, and they even provide a more consistent image without altered colors.
 
It seems that DLSS is much worse for ultrawide users, as shown by JayzTwoCents... and it indeed looks horrible on my 3440x1440 monitor.
But it still looks bad at 4K. Disappointing; I used to trust TPU, so I'll need to be more cautious I guess.
 

What in the heck does TechPowerUp have to do with anything? At no point in the article is ultrawide even mentioned.
 
With the recent update it has improved at least; the sharpness is much better. But there is still some detail loss on the textures. If they manage to improve on that, then there is still hope for it. And that's not to mention 21:9; not a concern for me, but it shouldn't be left out in the cold.
 
I would be very curious to see the impact DLSS has on input latency.

Not much different from anything else you add to the pipeline, and you can see the number in your OSD as frametime or FPS... Your input lag is the time it takes to send a signal from button press to the game; the GPU only affects that through the number of FPS it can push out, but it effectively adds no extra input lag. The frametime is also the interval at which you can 'respond' with your input to what is happening on screen, but it is not the actual input lag.

The factors that influence input lag are basically everything up to the point where you send a signal through the GPU and 'enter the render pipeline'. The CPU/RAM/software determines how fast it can 'use' your input, the GPU renders it, and your monitor displays it. The monitor has a delay for how fast its pixels can switch from one color to another (the G2G response time), plus some signal processing that adds latency. Effectively, the GPU is the only factor with no influence on 'lag'; when it has a frame, it sends it to the monitor. From the GPU onwards you could refer to this as 'output lag'.

In this sense it's best to speak of 'button-to-pixel' latency (which is effectively input lag + frametime + output lag), because that covers the entire pipeline, and in a continuous setting that is what you're really dealing with in gaming. Once you see an image, you can respond to it. So when a pixel is drawn, you are only suffering the 'input lag' for your response to be registered and sent to the game. When that info reaches the GPU, it's already history for you; you just get to enjoy the effect.
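To make that breakdown concrete, here's a minimal sketch in Python; the millisecond values are made-up placeholders to illustrate the formula, not measurements:

```python
# Minimal sketch of the 'button-to-pixel' breakdown described above:
#   button-to-pixel = input lag + frametime + output lag
# The numbers below are placeholder assumptions, not measurements.

def frametime_ms(fps):
    """Frametime in milliseconds for a given FPS."""
    return 1000.0 / fps

def button_to_pixel_ms(input_lag_ms, fps, output_lag_ms):
    """Total latency from pressing a button to seeing the result on screen."""
    return input_lag_ms + frametime_ms(fps) + output_lag_ms

# Example: 10 ms for the CPU/RAM/software to 'use' the input, 144 FPS render
# rate, 8 ms of monitor processing + G2G response ('output lag').
print(f"Frametime at 144 FPS: {frametime_ms(144):.1f} ms")
print(f"Button-to-pixel:      {button_to_pixel_ms(10.0, 144, 8.0):.1f} ms")

# Something like DLSS only shows up in this total through the frametime term:
# if it raises FPS, the frametime shrinks; it doesn't add a separate input lag.
print(f"Button-to-pixel at 180 FPS: {button_to_pixel_ms(10.0, 180, 8.0):.1f} ms")
```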

EDIT: apparently a bit of a necro post here, though the body is still warm I guess :P
 

Well yeah, how would increasing FPS possibly increase lag anyway...

On a side note, Anthem got DLSS support.
 