
NVIDIA DLSS Test in Battlefield V

Emu

Joined
Jan 5, 2018
Messages
28 (0.01/day)
The Turing generation is very disappointing. RT seems close yet far because of the huge performance hit, and it's still a gimmick because it's not really full RT rendering. DLSS is even worse because it's just way too limited: it needs to be title-agnostic, not dependent on Nvidia's supercomputers, and it needs to run locally, something like a free 2x/4x MSAA, like what the Xbox 360 tried to achieve when it launched.
The more I think about it, the more I'm convinced that Nvidia wasted silicon on Tensor cores instead of adding many more regular CUDA cores, which could have improved performance dramatically over Pascal. But to be honest, AMD are to blame too; even two years after Pascal, the 1080 Ti is still beating everything from them...

Unlike DLSS, raytracing is actually nice to have. The performance loss when playing at 3440x1440 using a 2080 Ti and RTX low isn't that bad, being able to see around corners using reflections is quite useful (until your squad mate spawns on you and exits via the window), and the increased immersion from the reflections makes things really pretty. The global illumination in Metro Exodus looks great in the reviews I have seen; it is a pity that I cannot experience it myself due to Epic's greed in charging me more for the game than Steam was... :(
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Colors are more saturated in DLSS images, while highlights are less pronounced. The conclusion describes a 'softer' image, but that's not what this is; this is detail and definition that is lost and cannot be brought back with machine learning. 'Trees look great' because they don't shimmer... hello?! TAA has been doing just about exactly the same for years, without the blur. This all reads like a desperate struggle for USPs where there aren't any. I guess the FPS counter blinded some people to the truth.

So basically, this technique completely changes the image you're looking at. It's not like AA or a simple upscale at all; you're looking at something radically changed. Yes, radical. The information fed to you on screen is simply different, and it creates new artifacts and imperfections. I, for one, calibrate my monitor to at least not have oversaturated and crushed colors; DLSS brings that right back with no way to eliminate it. The color information simply isn't there.

Hard Pass on this tech so far in every implementation I've seen.

I honestly don't understand how TPU can pass this off as a positive, great thing for gaming. I suppose you have to value in-game RT very highly to take these IQ hits to maintain performance... very strange. For decades we have been hypercritical about AA techniques and their shortcomings, and here we get a low-res upscaled image served with noticeable drawbacks and it's 'a step forward'. I guess I'm weird or something. It gets even better when people support that stance by saying 'so you render at higher res with DLSS'... yeah, and that means you've immediately lost your performance uplift and you're back where you started :roll: with blur.

And the best new argument I'm hearing lately... 'it's a fast-paced game, so you won't see it'... uhm, OK. I guess some people really have a different take on reality from time to time. Since the 90s we've been trying to make low-poly, low-res textured models look as nice as possible, starting with fast-paced shooters like Quake, Doom and Unreal, and now that we've got nice detail at a nice native res, we use downscaled content to throw that away and say it's fine because now we can 'run at native res'. You can't get much more ironic than that... And there are simple resolution scale sliders to help you out there, no Nvidia proprietary BS required, and they even provide a more consistent image without altered colors.
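To illustrate the point about resolution scale sliders: all a slider really does is drop the internal render resolution and let the display pipeline stretch the result back to native output. A minimal Python sketch, with a made-up function name and example numbers purely for illustration:

# Rough sketch of what an in-game resolution scale slider does.
# The function name and numbers are hypothetical; real engines handle
# this internally per render target.

def internal_resolution(native_w: int, native_h: int, scale: float) -> tuple[int, int]:
    # Render at scale x native size; the result is then upscaled back to native output.
    return int(native_w * scale), int(native_h * scale)

# Example: a 78% scale at 3440x1440 renders roughly 2683x1123 internally,
# which is then stretched back to 3440x1440 for display.
print(internal_resolution(3440, 1440, 0.78))  # -> (2683, 1123)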
 
Last edited:
Joined
Dec 3, 2014
Messages
347 (0.10/day)
Location
Marabá - Pará - Brazil
System Name KarymidoN TitaN
Processor AMD Ryzen 7 5700X
Motherboard ASUS TUF X570
Cooling Custom Watercooling Loop
Memory 2x Kingston FURY RGB 16gb @ 3200mhz 18-20-20-39
Video Card(s) MSI GTX 1070 GAMING X 8GB
Storage Kingston NV2 1TB| 4TB HDD
Display(s) 4X 1080P LG Monitors
Case Aigo Darkflash DLX 4000 MESH
Power Supply Corsair TX 600
Mouse Logitech G300S
Joined
Aug 15, 2015
Messages
7 (0.00/day)
It seems that DLSS is much worse for ultrawide users, as shown by JayzTwoCents... and it indeed looks horrible on my 3440x1440 monitor.
But it still looks bad at 4K. Disappointing. I used to trust TPU; I'll need to be more cautious, I guess.
 
Last edited:

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,793 (1.66/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Thermalright Phantom Spirit SE
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage Nextorage NE1N 2TB ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard SteelSeries Apex 7
Software Windows 11 +startisallback
It seems that DLSS is much worse for ultrawide users, as shown by JayzTwoCents... and it indeed looks horrible on my 3440x1440 monitor.
But it still looks bad at 4K. Disappointing. I used to trust TPU; I'll need to be more cautious, I guess.

What in the heck does TechPowerUp have to do with anything? At no point in the article is ultrawide even mentioned.
 
Joined
Dec 14, 2011
Messages
273 (0.06/day)
Processor 12900K @5.1all Pcore only, 1.23v
Motherboard MSI Edge
Cooling D15 Chromax Black
Memory 32GB 4000 C15
Video Card(s) 4090 Suprim X
Storage Various Samsung M.2s, 860 evo other
Display(s) Predator X27 / Deck (Nreal air) / LG C3 83
Case FD Torrent
Audio Device(s) Hifiman Ananda / AudioEngine A5+
Power Supply Seasonic Prime TX 1000W
Mouse Amazon finest (no brand)
Keyboard Amazon finest (no brand)
VR HMD Index
Benchmark Scores I got some numbers.
With the recent update it has at least improved; the sharpness is much better. But there is still some data loss on the textures. If they manage to do anything to improve on that, then there is still hope for it. Of course, that's not to mention 21:9; not a concern for me, but it shouldn't be left out in the cold.
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I would be very curious to see the impact DLSS has on input latency.

Not much different from anything else you add to the pipeline, and you can see the number in your OSD as frametime or FPS... Your input lag is the time it takes to send a signal from button press to the game; the GPU only affects that through the amount of FPS it can push out, but it is effectively not additional input lag. The frametime is also the interval at which you can 'respond' with your input to what is happening on screen, but it is not the actual input lag.

The factors that influence input lag are basically everything up to the point where you send a signal through the GPU and 'enter the render pipeline'. The CPU/RAM/software determines how fast it can 'use' your input, the GPU renders it, and your monitor displays it. The monitor has a delay for how fast each hue can switch from one color to another, called G2G, and it has some signal processing that adds latency. Effectively, the GPU is the only factor with no influence on 'lag': when it has a frame, it sends it to the monitor. From the GPU onwards you could refer to this as 'output lag'.

In this sense it's best to speak of 'button-to-pixel' latency (which is effectively input lag + frametime + output lag), because that covers the entire pipeline, and in a continuous setting that is what you're really dealing with in gaming. Once you see an image, you can respond to it. So when a pixel is drawn, you only suffer the 'input lag' to have your response registered and sent to the game. When that info reaches the GPU, it's already history for you; you just get to enjoy the effect.
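To make the arithmetic concrete, here is a minimal Python sketch of that button-to-pixel sum; all the numbers are made up, purely illustrative:

# Rough illustration of the 'button-to-pixel' idea described above.
# All values are hypothetical; real figures depend on the peripheral,
# CPU/engine, GPU frame rate, and display.

def button_to_pixel_ms(input_lag_ms: float, fps: float, output_lag_ms: float) -> float:
    # Total latency = input lag + frametime + output lag (milliseconds).
    frametime_ms = 1000.0 / fps  # the GPU's contribution is just its frametime
    return input_lag_ms + frametime_ms + output_lag_ms

# Example: 5 ms for polling + input sampling, a 60 FPS render rate,
# 10 ms for monitor processing + G2G.
print(button_to_pixel_ms(5.0, 60.0, 10.0))  # ~31.7 ms; raising FPS only shrinks the middle term.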

EDIT: apparently a bit of a necro post here, though the body is still warm I guess :p
 
Last edited:
Joined
Mar 10, 2014
Messages
1,793 (0.46/day)
Not much different from anything else you add to the pipeline, and you can see the number in your OSD as frametime or FPS... Your input lag is the time it takes to send a signal from button press to the game; the GPU only affects that through the amount of FPS it can push out, but it is effectively not additional input lag. The frametime is also the interval at which you can 'respond' with your input to what is happening on screen, but it is not the actual input lag.

The factors that influence input lag are basically everything up to the point where you send a signal through the GPU and 'enter the render pipeline'. The CPU/RAM/software determines how fast it can 'use' your input, the GPU renders it, and your monitor displays it. The monitor has a delay for how fast each hue can switch from one color to another, called G2G, and it has some signal processing that adds latency. Effectively, the GPU is the only factor with no influence on 'lag': when it has a frame, it sends it to the monitor. From the GPU onwards you could refer to this as 'output lag'.

In this sense it's best to speak of 'button-to-pixel' latency (which is effectively input lag + frametime + output lag), because that covers the entire pipeline, and in a continuous setting that is what you're really dealing with in gaming. Once you see an image, you can respond to it. So when a pixel is drawn, you only suffer the 'input lag' to have your response registered and sent to the game. When that info reaches the GPU, it's already history for you; you just get to enjoy the effect.

EDIT: apparently a bit of a necro post here, though the body is still warm I guess :p

Well yeah, how would increasing FPS possibly increase lag anyway...

On a side note, Anthem got DLSS support.
 