
Marvel's Spider-Man Miles Morales: FSR 2.1 vs. DLSS 2 vs. DLSS 3 Comparison

Marvel's Spider-Man Miles Morales is out now, with support for NVIDIA's Deep Learning Anti-Aliasing (DLAA), NVIDIA's DLSS Super Resolution (also known as DLSS 2.4), NVIDIA's DLSS Frame Generation (also known as DLSS 3) and AMD's FidelityFX Super Resolution 2.1 (FSR 2.1). In this mini-review we take a look, comparing the image quality and performance gains offered by these technologies.

 
Very nice. Double the fps with frame generation is insane
 
I am waiting for the prices to come down on the 4080/4090 cards and for more games to support DLSS 3 Frame Generation. I also can't wait to see how AMD's cards benchmark and whether they actually compete with Nvidia's RT capabilities, which I doubt.
 
I am waiting for the prices to come down on the 4080/4090 cards and for more games to support DLSS 3 Frame Generation. I also can't wait to see how AMD's cards benchmark and whether they actually compete with Nvidia's RT capabilities, which I doubt.
They have ALWAYS competed if you look at price to performance, especially with Nvidia drawing way more watts, which raises your running costs even further if you live in a hot area. If you have to have 4K ultra, go Nvidia (and pay the electricity bill). If you don't own a 4K monitor, or you've seen the Digital Foundry, Linus, or Hardware Unboxed videos (channels with millions of subs) doing side-by-side comparisons of ultra vs. high, you know the performance increase from dropping to high is insane. The fact is a $400 PS5 can run God of War, one of the best-looking games out in 2022. RT is a scam: the 20-series cards had to use very low RT settings just to hit 1080p 60 fps in easy-to-run games. And if your fanboyism needs more proof:

Nvidia themselves released a trailer running on a raw, expensive computer (we know it was high-end but not the full specs, because they don't show them, so the video is useless for any math), a 2022 high-end machine running Portal RTX at 23 fps. Meanwhile that game came out in the era of the Xbox 360, which had 512 MB of total console memory, versus a possible 24 GB of just VRAM today: 48x the memory in VRAM alone.
 
FSR has come such a long way in a short amount of time. I'm impressed and glad it's available to all. I'm still not on the frame generation train. I'd rather they modify that tech to make a new antialiasing method, one that doesn't require an existing TAA pipeline and can be applied to any game at driver level.
 
Does DLSS 3 officially require Windows 11? In my personal experience with Cyberpunk and Darktide, it was not available as an option in Windows 10 and GeForce Experience was getting confused, but when I installed Windows 11 on a second drive it showed up as an option.
 
Does DLSS 3 officially require Windows 11? In my personal experience with Cyberpunk and Darktide, it was not available as an option in Windows 10 and GeForce Experience was getting confused, but when I installed Windows 11 on a second drive it showed up as an option.
You have to turn on hardware-accelerated GPU scheduling (HAGS) in Windows 10 for it to work; HAGS is on by default in Windows 11. Also, Cyberpunk doesn't have DLSS 3 yet (what has been shown online is not publicly available). Lastly, if you're playing Darktide through the Windows Store rather than Steam, you have to use a workaround to get DLSS 3 operational.
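For anyone who wants to verify the HAGS state directly rather than digging through the Settings app, Windows stores it under a documented registry value (this is the standard location; the "driver default" behavior when the value is absent is my understanding, not something from this thread):

```
Key:   HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers
Value: HwSchMode (DWORD)
       2 = HAGS enabled
       1 = HAGS disabled
       (absent = driver/OS default)
```

A reboot is required after changing it, same as toggling it in Settings > System > Display > Graphics.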
 
Latency was mentioned but without any actual figures.
Could you show the ones displayed by Nvidia's overlay?

I run my games in the 3-8 ms render-latency range (<15 ms system latency; games show one or the other), and I've seen reports that DLSS 3 adds up to 20 ms on top of those existing figures. That might be fine at 30 or 60 FPS with a controller, but it becomes absolute hell for someone used to faster-paced, responsive controls.
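To put those numbers in perspective, here is a rough back-of-the-envelope sketch. The 20 ms frame-generation overhead is the figure reported above, not an official number, and real system latency has more components than this:

```python
# Rough latency arithmetic for the figures quoted above.
# Assumption: the ~20 ms DLSS 3 overhead is the user-reported
# worst case, simply added on top of existing render latency.

def total_latency_ms(render_latency_ms: float,
                     frame_gen_overhead_ms: float = 0.0) -> float:
    """Approximate latency: render latency plus frame-gen overhead."""
    return render_latency_ms + frame_gen_overhead_ms

def frame_time_ms(fps: float) -> float:
    """Frame time for a given frame rate."""
    return 1000.0 / fps

# A fast setup from the post: 5 ms render latency, no frame generation.
base = total_latency_ms(5.0)          # 5 ms
# Same setup with the reported worst-case DLSS 3 overhead.
with_fg = total_latency_ms(5.0, 20.0) # 25 ms, 5x the baseline

print(base, with_fg)          # 5.0 25.0
print(frame_time_ms(60.0))    # ~16.7 ms: at 30-60 FPS the added 20 ms
                              # is on the order of one frame, which is
                              # why it is far less noticeable there
```

The point of the arithmetic: a 20 ms penalty roughly equals one whole frame at 60 FPS, but it is several frames' worth of delay for someone already running at 3-8 ms.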
 
I find that the gap has narrowed to where DLSS and FSR are closer than ever, really strong showing from FSR here.

The biggest difference to my eye is broken lines/shimmering on straight edges, visible and eye-catching in most scenes at 4K output in Quality mode, and that sometimes translates into distant break-up of certain complex angular geometry. Ghosting and disocclusion seem fixed to the point where they never 'grab' my eye, which is excellent; I'd need to really nit-pick to find them.

Smoothed edges and no shimmer or break-up on straight lines and distant detail are two of DLSS's biggest strengths imo, and possibly the biggest hurdle for FSR to climb now?

Very strong showing from all technologies I'd say, and without a DLSS capable card I'd go as far as to say (at least for my use case of 4k output/quality mode), turning on FSR is a no brainer.
 
The biggest problem is that neither company goes back and updates their code.

Some games run DLSS versions that never get updated (though we can swap the DLL in manually), while others get FSR 2.0 with broken settings that look worse than FSR 1.0.


DRG has FSR 1 (and it works as well as FSR 1 ever has; it's decent but soft)
FSR 2 is a shimmery mess
DLSS is great

But then you can run the FSR 2 -> DLSS 2 conversion mod and FSR 2 looks as great as DLSS


It really seems like they just need to let the game devs tweak the values, or let driver updates handle it, or something.
 
Hate them for being greedy, but if it wasn't for Nvidia's DLSS, AMD would never have made FSR; even consoles benefit from this now.
AMD is like those Chinese "brands" copying everything Apple does; this must be quite annoying for Nvidia.
 
Lots of fuzzy, boxy pixels with FSR, especially on the player model and the windows. :cool: Guess the ray tracing is destroying pixels here.

Hate them for being greedy, but if it wasn't for Nvidia's DLSS, AMD would never have made FSR; even consoles benefit from this now.
AMD is like those Chinese "brands" copying everything Apple does; this must be quite annoying for Nvidia.

Or you can say they're taking inspiration from the market leader, developing their own version, and bringing it "open source" to the masses. :laugh:
 
Hate them for being greedy, but if it wasn't for Nvidia's DLSS, AMD would never have made FSR; even consoles benefit from this now.
AMD is like those Chinese "brands" copying everything Apple does; this must be quite annoying for Nvidia.
Or you can look at it another way: if it wasn't for FSR, Nvidia would not have made DLSS 2.0, which is exactly when DLSS became usable. Rinse and repeat: now we're at DLSS 3 and FSR 3.0, soon DLSS 4, and so on. Without open standards, closed proprietary tech will stagnate too; and without closed source spending money on R&D, constantly fortifying its dominant position (Nvidia), and putting pressure on open source, eventually open source would stagnate. We need both for progress and innovation. Questions?
 
Hate them for being greedy, but if it wasn't for Nvidia's DLSS, AMD would never have made FSR; even consoles benefit from this now.
AMD is like those Chinese "brands" copying everything Apple does; this must be quite annoying for Nvidia.
Consoles have always had something for scaling, from simply using large fonts to keeping UI elements at native res while the game rendered at a lower one.

It's always been odd how little of that was used on the PC side, but I suppose it must have had downsides we weren't aware of (such as breaking with other aspect ratios?).
 
Playing this now and I can confirm DLSS (2.4) looks way better here than in Spider-Man Remastered. I happily leave it on even when using frame gen.

Frame gen looks flawless to me maxing out my 160 Hz 4K monitor, but just a bit off when hooked up to my 120 Hz TV, so I leave it off when playing there. Since the image quality of the frame-gen experience can depend on the final output fps, I think the reviewer's max monitor refresh rate and the final fps output should be mentioned in future reviews.
 
Playing this now and I can confirm DLSS (2.4) looks way better here than in Spider-Man Remastered. I happily leave it on even when using frame gen.

Frame gen looks flawless to me maxing out my 160 Hz 4K monitor, but just a bit off when hooked up to my 120 Hz TV, so I leave it off when playing there. Since the image quality of the frame-gen experience can depend on the final output fps, I think the reviewer's max monitor refresh rate and the final fps output should be mentioned in future reviews.
The monitor is what should be mentioned, as each display can behave wildly differently even with the same basic specs.

A VA panel has more smearing than an IPS, for example, so frame gen might make that worse.
 