
EA DICE Names NVIDIA the Battlefield 2042 Official PC Platform Partner

Literally every comparison shows that DLSS Quality (version 2.x) looks better than native, at least in 4K. It just works. ;)

What I want to see is Dynamic DLSS. I want the game to lower the preset when GPU usage is maxed out. I hate framerate drops. I feel like dynamic resolution scaling on PC was abandoned because of G-SYNC/FreeSync. Personally I prefer Black Frame Insertion over VRR, but framerate drops are very annoying. It sucks to see the GPU at 50% most of the time because some moments are very heavy and drop below the refresh rate.
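Just to sketch what I mean (this is my own toy logic in Python, not anything DICE or NVIDIA have announced; the preset names and thresholds are made up): the game would watch recent frame times and step the DLSS preset down whenever it blows the refresh-rate budget, then step back up when there's headroom again.

[CODE]
# Hypothetical dynamic-DLSS loop: preset names and thresholds are invented.
DLSS_PRESETS = ["Quality", "Balanced", "Performance", "Ultra Performance"]

def pick_preset(current_index, avg_frame_ms, target_ms):
    """Pick the DLSS preset index to use for the next few frames."""
    if avg_frame_ms > target_ms * 1.05 and current_index < len(DLSS_PRESETS) - 1:
        return current_index + 1   # over budget: drop to a cheaper preset
    if avg_frame_ms < target_ms * 0.80 and current_index > 0:
        return current_index - 1   # plenty of headroom: restore quality
    return current_index           # within budget: keep the current preset

# Example: 144 Hz display -> ~6.9 ms budget; a heavy moment averaging 8.1 ms steps down.
index = pick_preset(0, avg_frame_ms=8.1, target_ms=1000 / 144)
print(DLSS_PRESETS[index])  # Balanced
[/CODE]

That way the GPU could sit near the refresh rate instead of at 50% most of the time just to cover the worst moments.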
 
Since BF 2042 is a no-go for me, I'm gonna play Doom Eternal again. I actually didn't complete the campaign (got to about midway, I guess), but now that RT has been enabled with Adrenalin 21.6.2, I tested it at 3840x1080, Nightmare preset, RT on, and it ran silky smooth at about 120 fps to >200 fps. I'm unfamiliar with the KB mapping by now, so I guess I'll restart to better enjoy the game.
I enjoyed the old BF up to BF1 but BFV lost my interest in a few weeks.
Since the days of Bad Company I've owned every one of these games.
Let's see how good this one will be.
I also liked the single-player campaigns as well as multiplayer.
It's sad that they don't make single-player anymore. :(
I even enjoyed the single-player campaigns in COD.

I loved BF3 and BF4 with their modern setting, so I will buy this and hopefully it will not disappoint me. :D

In the past, the Frostbite engine worked as well on AMD as on NVIDIA. I hope it's no different this time. I played all those generations of BF primarily on different generations of AMD cards. :rolleyes:
 
Wait? NVIDIA said DLSS makes the image better than native? I can't understand how, unless the textures used in-game already come a bit bigger and blurrier, maybe with less contrast and less saturation, just enough so that when the "magic" sauce comes in, everything looks right. So basically a scam, and that's why it needs "support", otherwise what the fu... can DLSS do with perfectly optimized assets? This is just stupid. We will soon have keyboards and mice with AI that light up when you turn your computer on, sensing you.

Yes, DLSS can be better than native because it's trained on 16K x 16K images and can render details not available otherwise.

FSR, on the other hand, is semi-OK at Ultra Quality (i.e. as a 1440p-to-4K conversion), but in all other modes it's just pure crap, because it's a freaking spatial image upscaler that doesn't use motion vectors, so even the TAA implemented in many games, including UE4/5, beats it hands down. DLSS, on the other hand, has been shown to work even in Ultra Performance mode, where it has the bare minimum of pixels. In that mode FSR simply falls apart, because there's no AI or high-res image data behind it.
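To make the motion-vector point concrete, here's a deliberately tiny toy in Python (my own simplification, nothing to do with AMD's or NVIDIA's actual code; the jitter offset stands in for real per-pixel motion vectors): a pure spatial upscaler only ever sees the current low-res frame, while a temporal method can reproject what the previous frame sampled and recover detail the current frame never saw.

[CODE]
# Ground-truth "high-res" signal: eight samples the renderer would ideally produce.
high_res = [0, 1, 2, 3, 4, 5, 6, 7]
frame_even = high_res[0::2]   # one frame only samples pixels 0, 2, 4, 6
frame_odd = high_res[1::2]    # the jittered next frame samples pixels 1, 3, 5, 7

def spatial_upscale(lowres):
    """Pure spatial upscale: duplicate each sample; no history, no motion vectors."""
    return [s for s in lowres for _ in range(2)]

def temporal_combine(current, history, history_offset):
    """Reproject the previous frame's samples to their known offset (the stand-in
    for a motion vector) and interleave them with the current frame."""
    out = [0] * (len(current) * 2)
    out[history_offset::2] = history        # what the previous frame saw
    out[1 - history_offset::2] = current    # what the current frame sees
    return out

print(spatial_upscale(frame_odd))                                 # [1, 1, 3, 3, 5, 5, 7, 7]
print(temporal_combine(frame_odd, frame_even, history_offset=0))  # [0, 1, 2, 3, 4, 5, 6, 7]
[/CODE]

Real TAA/DLSS obviously also has to deal with disocclusion, ghosting and moving objects, but this is the ceiling a spatial-only upscaler like FSR 1.0 runs into.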


Here's probably the best overview of FSR on the net: https://www.eurogamer.net/articles/...performance-wins-but-what-about-image-quality

FSR is currently for people with already good enough GPUs capable of rendering at 1440p. For all others it's as good as dead.
 
Yes, DLSS can be better than native because it's trained on 16K x 16K images and can render details not available otherwise.

DLSS 2.0 is trained as a single generalised network, not specifically on images from one particular game. Now, whilst it's true that DLSS 2.0 can add detail, the reality is it still usually falls short of a native image, and DLSS at lower resolutions, whilst it certainly resolves more than the base resolution, still usually looks noticeably worse than native resolution.

Of course, it's interesting that you post Control, though. I think this rubbish about DLSS somehow always magically making games look better comes from Control and Death Stranding, two games with such notoriously shit implementations of TAA that swapping in DLSS and getting rid of the developer-implemented TAA makes for a remarkable increase in image quality.

As for FSR, yep, it's worse than native all the time (in a technical sense; whether people can tell is a different story), but it gives people a better-looking image than if they ran with plain resolution scaling, so all those guys back on 9xx, 10xx, 4xx, 5xx and other low-power cards can get something close to a native-looking image regardless. And really, what's more important: maybe resolving an unnoticeable amount of extra detail, or actually being able to hit 1080/60 or 1440/60 at better IQ than previously possible?
 
The amount of hubris and illiteracy in this post is staggering. Yes, DLSS can be better than native because it's trained on 16K x 16K images and can render details not available otherwise.
To the second point: as I understand it, DLSS is trained with higher-quality assets to produce better lower-resolution output, so WHY NOT USE THEM PRE-RENDERED? What happens then?
Can't EA downscale a 16K x 16K image properly themselves? Do they need NVIDIA for that?
The funny thing is that this can only work on NVIDIA, because they dedicated a part of the die especially for it, so that part would be sitting there doing nothing if not for DLSS.
 