
Immortals of Aveum: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison

Low quality post by Upgrayedd
Joined
Mar 14, 2014
Messages
1,492 (0.37/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
This is one of the games getting FSR3 at launch.
There is a problem on the PC version where motion vectors aren't working on the main character's hand during spell selection, hence the problems with FSR2.







But everywhere else FSR2 seems to be working pretty well. The motion vectors are working fine on the console versions, so we should expect this to be solved on the PC front quite soon, probably in time for the FSR3 patch, which is cool.
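For anyone wondering why a bad motion vector on one object trashes the upscaled image on just that object: temporal upscalers (FSR2, DLSS, even plain TAA) reproject last frame's output through the motion vectors and blend it with the current frame, pixel by pixel. Here's a heavily simplified sketch of that resolve step - my own illustration, not FSR2 or DLSS code, with all names made up:

```cpp
#include <cstddef>
#include <vector>

struct Float2 { float x, y; };
struct Color  { float r, g, b; };

static Color lerp(Color a, Color b, float t) {
    return { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t, a.b + (b.b - a.b) * t };
}

// Blend the current low-res sample with history fetched via the motion vector.
// current/history/motion are w*h buffers in row-major order, coordinates in pixels.
Color resolvePixel(const std::vector<Color>& current,
                   const std::vector<Color>& history,
                   const std::vector<Float2>& motion,
                   int w, int h, int x, int y,
                   float historyWeight /* typically high, e.g. ~0.9 */) {
    const std::size_t idx = static_cast<std::size_t>(y) * w + x;
    const Float2 mv = motion[idx];              // where this surface was last frame
    const int px = x - static_cast<int>(mv.x);  // reproject into the previous frame
    const int py = y - static_cast<int>(mv.y);
    const Color cur = current[idx];
    if (px < 0 || px >= w || py < 0 || py >= h)
        return cur;                             // no usable history: fall back to the current frame
    const Color hist = history[static_cast<std::size_t>(py) * w + px];
    // If the motion vector is wrong (e.g. the hand reporting no motion while it
    // actually moves), 'hist' comes from a different surface, and blending it in
    // is exactly the smearing/fizzle being blamed on FSR2 here.
    return lerp(cur, hist, historyWeight);
}
```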
I'm also guessing FSR3 will land on consoles, meaning the 45-60 FPS might turn into 80-120 FPS.




These numbers are completely made up. No average numbers were mentioned or shown in the video. The Series X and PS5 suffer framerate drops from 60 to ~45 FPS in intensive scenes with many enemies and spells, and the PS5 tends to run ~5-8 FPS lower because it's apparently using a higher LOD bias.






Don Allen, who wrote the PC technical review, says otherwise, and any reasonable person will say otherwise. He even states FSR2 Quality offers higher detail than DLSS2 Quality and native. He says he doesn't recommend using FSR2 Performance, which is a long way from claiming "it's absolute trash".

But of course this is the internet and some internet people think ridiculous hyperbole makes them look cool and intellectual, so here we are.
This is some truly hard-core AMD fanboying.
Cope.
I don't think I've seen a single person say FSR2 is better than DLSS2. Ever.
 
Joined
May 11, 2020
Messages
268 (0.15/day)
It became fashionable to talk crap about FSR2.

This is some truly hard-core AMD fanboying.
Cope.
I don't think I've seen a single person say FSR2 is better than DLSS2. Ever.

Too bad Nvidia can't afford better build quality and more VRAM on their mid/high-range GPUs (up to the 4070 Ti), so I'll have to wait longer to enjoy DLSS2, but someday.
 
Last edited:
Joined
Dec 10, 2022
Messages
486 (0.59/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
The solution:

Prioritise real gaming (rasterised) performance when choosing a video card because then you won't have to rely on gimmicks like upscaling or frame-generation when playing the latest games. Your card will be fast enough to play the game without needing any crutches. This is why I'm far more concerned with hardware than software when I buy PC parts, especially video cards. You can always add a software implementation later but the hardware that a card is born with is the same hardware that it dies with. People have become so concerned with gimmicks like RT, DLSS, FSR, XeSS, Frame-Generation, etc. that they've stopped being concerned with whether or not the card is fast enough to actually play new games.

There's no substitute for native resolution and this article does a good job of explaining why.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,929 (2.87/day)
Location
north
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance @ 6000Mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
The solution:

Prioritise real gaming (rasterised) performance when choosing a video card because then you won't have to rely on gimmicks like upscaling or frame-generation when playing the latest games. Your card will be fast enough to play the game without needing any crutches. This is why I'm far more concerned with hardware than software when I buy PC parts, especially video cards. You can always add a software implementation later but the hardware that a card is born with is the same hardware that it dies with. People have become so concerned with gimmicks like RT, DLSS, FSR, XeSS, Frame-Generation, etc. that they've stopped being concerned with whether or not the card is fast enough to actually play new games.

There's no substitute for native resolution and this article does a good job of explaining why.

Lowering settings is a thing too. And for this specific game no GPU can do a solid 60FPS @ 4K, which honestly isn't great.

I don't think I've seen a single person say FSR2 is better than DLSS2. Ever.

To be fair that's not what he said.
 
Joined
Oct 28, 2012
Messages
1,354 (0.30/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
This is some truly hard-core AMD fanboying.
Cope.
I don't think I've seen a single person say FSR2 is better than DLSS2. Ever.
It's not about FSR being better in terms of technical quality, but about it being hardware agnostic. Some games seem to work better with one approach over the other, or they have a different set of issues, and you have to pick your poison.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,523 (1.31/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
He even states FSR2 Quality offers higher detail than DLSS2 Quality and native.
The default sharpening leads to perceived detail in stills (and it's genuinely done rather well for a single checkbox solution), one of FSR's strengths, but in motion it's not the same story.
But of course this is the internet and some internet people think ridiculous hyperbole makes them look cool and intellectual, so here we are.
I agree, but I suppose I differ in stance because, from my perspective, the hyperbole is often reversed when talking about upscaling, among other things. For example, the perspective that, totally irrespective of IQ or performance, FSR is better purely because it's multi-vendor, while refusing to concede that XeSS can do the same but better, and can even serve AMD users better than FSR. Or the users here and across the web who repeatedly feel the need, in every relevant thread (and sometimes in irrelevant ones), to say they dislike upscaling, think it's a gimmick, will never use it and so on, like in this very thread... Perspective counts for a lot.

I certainly don't think FSR is "absolute trash" - there are some really good implementations where it's more than serviceable, and downright good, roughly on par with DLSS on a balance of IQ and artefacts. The issue seems to be older versions or poorer implementations really letting it down, where honestly the presentation has an overall look and feel that I absolutely cannot stand, but I still wouldn't go as far as to blanket-say "FSR [2.X] is trash" - that's an extreme take.
There's no substitute for native resolution and this article does a good job of explaining why.
Interesting, as it seems that in this game and many others, the article's explanation says the opposite: that DLSS is a better substitute for native rendering.
The in-game TAA solution has very poor rendering of small object detail—thin steel objects and power lines, incomplete and shimmery small particle effects, tree leaves, and vegetation in general. The TAA solution also has shimmering issues on the whole image, even when standing still, which are especially visible at lower resolutions such as 1080p. All of these issues are resolved when DLSS is enabled, due to a superior anti-aliasing solution. With DLSS you can expect an improved level of detail rendered in vegetation and tree leaves in comparison to the in-game TAA solution. Small details in the particle effects in different variety of magic abilities are rendered more stably, correctly and completely in all Quality modes.
And this
[attached screenshot]


Now I suppose one could easily say, well, DLAA is better than them all, because it is native resolution plus a better AA solution than the standard TAA - I accept that. But simply compared to native + (standard) TAA, which is the option most commonly presented, it seems obvious to me that there absolutely is a substitute: high-quality temporal reconstruction upscaling with a top-tier temporal anti-aliasing solution. I'm not here to say it's the only way forward, or that it will be true 100% of the time, or that developers should rely on lower input resolution targets and let upscaling do all the heavy lifting and so on - far from it. But because it is true at least once, it has shown that it absolutely can be a substitute, especially as an easily user-selectable option without overthinking or deep-diving into tweaking and optimising; just set and forget (and get more fps to boot).

I do fully agree with you about prioritising hardware specifications that meet your needs for rasterised rendering first; I would not want to buy a card where, from the outset, I must rely on upscaling of any given flavour to achieve the critical performance targets I have. I also vehemently disagree that they are gimmicks - I don't think they even meet the dictionary definition of the word. I think they're great features that can be enjoyed at the user's discretion, which millions of users do. I am willing to accept that you see it differently, quite passionately so, and that nothing I say can convince you; just showing you another point of view, mate.
 
Joined
Oct 31, 2013
Messages
187 (0.05/day)
Looks like the devs had some issues implementing upscaling well. Probably too many time constraints.

DLSS requires the following inputs: color, motion vectors, depth buffer, output buffer, previous output buffer.
FSR 2 requires the following inputs: color, motion vectors, depth buffer, reactive mask (auto-generated if not provided), transparency & composition mask (auto-generated if not provided), output buffer, previous output buffer.
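To put those two lists side by side, here's a minimal sketch of the per-frame inputs as data structures. The struct and field names are made up for illustration - they are not the actual NGX/Streamline (DLSS) or FidelityFX (FSR 2) API - but they capture the point: FSR 2 only adds two optional masks on top of what DLSS already needs, and they can be auto-generated if the engine doesn't supply them.

```cpp
// Hypothetical illustration only, not SDK code.
struct Texture;  // opaque GPU resource handle for this sketch

struct TemporalUpscalerInputs {   // what both upscalers need every frame
    Texture* color;               // current frame at the lower internal resolution
    Texture* motionVectors;       // per-pixel motion for everything that moves
    Texture* depth;               // depth buffer, used for disocclusion handling
    Texture* output;              // full-resolution result written this frame
    Texture* previousOutput;      // last frame's upscaled result (the history)
};

// DLSS 2: the common inputs are sufficient.
using DlssInputs = TemporalUpscalerInputs;

// FSR 2: two extra, optional masks to stabilise particles and transparencies;
// left null, approximations are generated automatically.
struct Fsr2Inputs : TemporalUpscalerInputs {
    Texture* reactiveMask               = nullptr;  // fast-changing pixels (particles, alpha FX)
    Texture* transparencyAndComposition = nullptr;  // transparent / composited surfaces
};
```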
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,412 (7.83/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
This is one of the games getting FSR3 at launch.
Because it needs a frame doubler on the consoles; they'll proclaim the amazing quality and performance gains as they finally run native 720p without removing half the environment.

These numbers are completely made up. No average numbers were mentioned nor shown in the video. The Series X and PS5 suffer from framerate drops from 60 to ~45 in intensive scenes with many enemies and spells, and the PS5 tends to go ~5-8FPS lower because it's apparently using a higher LOD bias.
They show them in the corner and discuss it multiple times? Did you watch the same video I posted?

There's no substitute for native resolution and this article does a good job of explaining why.
DLSS quality can actually be better than native res in many titles, as it's effectively a form of anti-aliasing and hides the shimmer effect.
That's the one exception, IMO (And I hope devs implement FSR properly in the future so it ends up the same)

I've confirmed this (literal side-by-side testing) with Deep Rock Galactic. Two 32" displays, one 1440p, one 4K.
4K had a better view distance with more visible long-distance details, like legible text on faraway display screens. DLSS Quality made it slightly easier to read, like ClearType does.
Changing the 4K display to 1440p made the text harder to read, just like on the native 1440p display, so the higher monitor resolution has benefits even with lower render resolutions.

Most interestingly, DLSS Quality on the 1440p display (I assume 1080p render res) actually increased the legibility of that text - it was roughly halfway between native 1440p and 4K in terms of how far away you could walk before it stopped being readable.


Actually, I think the ClearType analogy is probably the most accurate way to describe how some things look better with it.
 
Last edited:

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,523 (1.31/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Most interestingly, DLSS quality on the 1440p display (I assume 1080p render res)
I agree with the other stuff you said for sure, but this bit I believe was 960p for a native 1440p display, if I recall correctly from my previous 3440x1440 monitor.
 
Joined
Sep 17, 2014
Messages
23,474 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
"Absolute trash" is the kind of hyperbole that makes any further discussion completely void of meaning.
I.e. it's an absolute trash statement.
No, it's trash. Darktide has the same issue. FSR is pretty much unusable in first-person games more often than not; you get a disturbed pixel mess every time something moves fast. Contrast edges just trail too much. It becomes detrimental to image quality and it's hard to 'unsee'.

In isometric perspectives FSR fares a lot better, generally, or if the action isn't as fast paced.

But honestly, native should just run properly. I care as little for FSR as I do for DLSS at this point. They all suffer from drawbacks.

The solution:

Prioritise real gaming (rasterised) performance when choosing a video card because then you won't have to rely on gimmicks like upscaling or frame-generation when playing the latest games. Your card will be fast enough to play the game without needing any crutches. This is why I'm far more concerned with hardware than software when I buy PC parts, especially video cards. You can always add a software implementation later but the hardware that a card is born with is the same hardware that it dies with. People have become so concerned with gimmicks like RT, DLSS, FSR, XeSS, Frame-Generation, etc. that they've stopped being concerned with whether or not the card is fast enough to actually play new games.

There's no substitute for native resolution and this article does a good job of explaining why.
This.
Trickery is trickery and it means compromises, and you'll never know what they look like until every game is in your hands and possibly already purchased too. There are also no guarantees of continued and all-encompassing support. Plus there are a lot of technologies getting stacked now, so the support required per game will be even greater. All of this is lipstick on pigs, no more and no less, until we land on a single, universal way of implementation that is standard with every game release.

We'll probably get there eventually, but we're not there yet - not with XeSS, DLSS, or FSR.
 
Last edited:
Joined
Jun 14, 2020
Messages
4,614 (2.67/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Because it needs a frame doubler on the consoles; they'll proclaim the amazing quality and performance gains as they finally run native 720p without removing half the environment.


They show them in the corner and discuss it multiple times? Did you watch the same video I posted?


DLSS quality can actually be better than native res in many titles, as it's effectively a form of anti-aliasing and hides the shimmer effect.
That's the one exception, IMO (And I hope devs implement FSR properly in the future so it ends up the same)

I've confirmed this (literal side-by-side testing) with Deep Rock Galactic. Two 32" displays, one 1440p, one 4K.
4K had a better view distance with more visible long-distance details, like legible text on faraway display screens. DLSS Quality made it slightly easier to read, like ClearType does.
Changing the 4K display to 1440p made the text harder to read, just like on the native 1440p display, so the higher monitor resolution has benefits even with lower render resolutions.

Most interestingly, DLSS Quality on the 1440p display (I assume 1080p render res) actually increased the legibility of that text - it was roughly halfway between native 1440p and 4K in terms of how far away you could walk before it stopped being readable.


Actually, I think the ClearType analogy is probably the most accurate way to describe how some things look better with it.
A proper comparison, since you had both monitors, is to compare native 1440p on your 1440p panel vs 4K + DLSS Q on your 4K panel. I assume the latter will look a lot better while providing similar performance, which should end all debate about the usefulness of DLSS. It actually improves image quality, period.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,523 (1.31/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
A proper comparison, since you had both monitors, is to compare native 1440p on your 1440p panel vs 4K + DLSS Q on your 4K panel. I assume the latter will look a lot better while providing similar performance, which should end all debate about the usefulness of DLSS. It actually improves image quality, period.
I can tell you right now that's easy, I've tried native 1440p on a native 1440p panel, and 4k using DLSS Quality, with an input res of 1440p, and DLSS w/input 1440p looks better, easily, hands down, with performance well above native 4k.

People saying "gimmick", or "lipstick on a pig", legit don't know what they're missing out on. They're entitled of course to an opinion, but if said opinion is not borne from first hand experience... Well, not only does it mean less but I can see how they arrived there, because if you see and game with it, said opinion tends to change pretty fast, and if you haven't, well that sort of goes in a different, less meritorious bucket doesn't it.

Having said that, the difference between a gimmick and a feature does indeed lie in the eye of the beholder, which makes it so interesting to see and hear those people call it a gimmick. But then again, when they can't enable it and game with it through the entirety of a 15-20+ hour long game, possibly multiple times, it's no wonder they say what they say, because from their non-user-of-the-tech/biased perspective, of course it's a gimmick - how could it not be?
 
Joined
Jun 14, 2020
Messages
4,614 (2.67/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
I can tell you right now that's easy, I've tried native 1440p on a native 1440p panel, and 4k using DLSS Quality, with an input res of 1440p, and DLSS w/input 1440p looks better, easily, hands down, with performance well above native 4k.

People saying "gimmick", or "lipstick on a pig", legit don't know what they're missing out on. They're entitled of course to an opinion, but if said opinion is not borne from first hand experience... Well, not only does it mean less but I can see how they arrived there, because if you see and game with it, said opinion tends to change pretty fast, and if you haven't, well that sort of goes in a different, less meritorious bucket doesn't it.

Having said that, the difference between a gimmick and a feature does indeed lie in the eye of the beholder, which makes it so interesting to see and hear those people call it a gimmick. But then again, when they can't enable it and game with it through the entirety of a 15-20+ hour long game, possibly multiple times, it's no wonder they say what they say, because from their non-user-of-the-tech/biased perspective, of course it's a gimmick - how could it not be?
Problem is Nvidia is pushing it as an FPS increaser, which I guess it kinda is, but if you use it to increase your FPS, you take a hit to image quality in some games (and get a boost in others, of course, in games with mediocre TAA implementations). Personally I just use it to play on a higher resolution monitor than I would otherwise use.

You have a card that can only handle 1080p natively? Great, buy a 1440p monitor and use upscaling. Got a card that can only handle 1440p? Buy a 4K monitor and use upscaling. It's actually kinda nuts what you can do with it.
 
Joined
Sep 17, 2014
Messages
23,474 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I can tell you right now that's easy, I've tried native 1440p on a native 1440p panel, and 4k using DLSS Quality, with an input res of 1440p, and DLSS w/input 1440p looks better, easily, hands down, with performance well above native 4k.

People saying "gimmick", or "lipstick on a pig", legit don't know what they're missing out on. They're entitled of course to an opinion, but if said opinion is not borne from first hand experience... Well, not only does it mean less but I can see how they arrived there, because if you see and game with it, said opinion tends to change pretty fast, and if you haven't, well that sort of goes in a different, less meritorious bucket doesn't it.

Having said that, the difference between a gimmick and a feature does indeed lie in the eye of the beholder, which makes it so interesting to see and hear those people call it a gimmick. But then again, when they can't enable it and game with it through the entirety of a 15-20+ hour long game, possibly multiple times, it's no wonder they say what they say, because from their non-user-of-the-tech/biased perspective, of course it's a gimmick - how could it not be?
It's a gimmick because it is not unified, not because the game shows an advantage in either or both IQ and FPS. No support and you haven't got those benefits. That's the problem ;) not what it offers in itself - I applaud that move. But only if it is not abused to wrestle you to a new GPU every couple of years; and that's clearly the case with DLSS.
 
Joined
Dec 10, 2022
Messages
486 (0.59/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Lowering settings is a thing too.
You could, but that has a sour taste to it if you spent $400 or more on your card.
And for this specific game no GPU can do a solid 60FPS @ 4K, which honestly isn't great.
Yes but this game is the outlier of outliers and there are far more badly-optimised games that can have their bad optimisation nullified by a card that's powerful enough to handle them. Don't forget that initially, Jedi Survivor ate 21GB of VRAM and struggled on the RTX 4090:
Pre-Launch 'Jedi: Survivor' Eats 21GB of VRAM, Struggles on RTX 4090 | Tom's Hardware (tomshardware.com)

Clearly, the game just wasn't ready yet at that point, because I played Jedi Survivor on an RX 6800 XT and it was a solid 60 FPS (I lock it at 60 because my display is 60Hz) from beginning to end at 1440p Ultra. Immortals of Aveum is not indicative of most upcoming titles.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,412 (7.83/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I agree with the other stuff you said for sure, but this bit I believe was 960p for a native 1440p display, if I recall correctly from my previous 3440x1440 monitor.
2560x1440 and 3440x1440 will not math the same

You could, but that has a sour taste to it if you spent $400 or more on your card.

Yes but this game is the outlier of outliers and there are far more badly-optimised games that can have their bad optimisation nullified by a card that's powerful enough to handle them. Don't forget that initially, Jedi Survivor ate 21GB of VRAM and struggled on the RTX 4090:
Pre-Launch 'Jedi: Survivor' Eats 21GB of VRAM, Struggles on RTX 4090 | Tom's Hardware (tomshardware.com)

Clearly, the game just wasn't ready yet at that point, because I played Jedi Survivor on an RX 6800 XT and it was a solid 60 FPS (I lock it at 60 because my display is 60Hz) from beginning to end at 1440p Ultra. Immortals of Aveum is not indicative of most upcoming titles.
Hogwarts legacy, Jedi Survivor, Starfield, Immortals...
All these console ports feel horribly unfinished.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,523 (1.31/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
2560x1440 and 3440x1440 will not math the same
Naturally the total pixel counts are different, and the first number is different, but it's still 1440 vertical resolution.

2560 x1440 DLSS Q input res is 1706x960
3440 x1440 DLSS Q input res is 2293x960
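For anyone who wants to check those numbers or work out other modes: the presets are fixed per-axis scale factors (Quality renders at 1/1.5 of each output axis, which is where 1706x960 and 2293x960 come from). Quick sketch below; the Balanced/Performance/Ultra Performance ratios are the commonly cited DLSS 2 defaults and can vary per title, so treat them as assumptions:

```cpp
#include <cstdio>

// render axis = output axis / divisor
struct UpscaleMode { const char* name; double divisor; };

int main() {
    // Commonly cited DLSS 2 per-axis scale factors; individual titles can
    // deviate, so treat these as assumptions rather than spec.
    const UpscaleMode modes[] = {
        {"Quality",           1.5},   // ~66.7% per axis
        {"Balanced",          1.72},  // ~58%
        {"Performance",       2.0},   // 50%
        {"Ultra Performance", 3.0},   // ~33%
    };
    const int outputs[][2] = { {2560, 1440}, {3440, 1440}, {3840, 2160} };

    for (const auto& out : outputs) {
        for (const auto& m : modes) {
            const int rw = static_cast<int>(out[0] / m.divisor);  // truncate fractional pixels
            const int rh = static_cast<int>(out[1] / m.divisor);
            std::printf("%4dx%4d %-17s -> %4dx%4d\n", out[0], out[1], m.name, rw, rh);
        }
    }
    return 0;
}
```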
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,412 (7.83/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Naturally the total pixel counts are different, and the first number is different, but it's still 1440 vertical resolution.

2560 x1440 DLSS Q input res is 1706x960
3440 x1440 DLSS Q input res is 2293x960
Thank.

Math is annoying; some things I can do easily in my head and others require several copies of the calculator app open.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,777 (4.68/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Memory 32gb ddr5 (2x16) cl 30 6000
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710
Power Supply Corsair RM850x
@maxus24 just wanted to let you know I tried the demo of this just now on my 7900 XT and it is amazing. Gorgeous game and honestly really beautiful; I've not had any issues with FSR3 and frame gen. They must have updated and fixed a lot of stuff, or it's just better on AMD 7000 series, not sure. Anyways, yeah, I am really liking this, so glad they put out a free demo this weekend.

This is one of the games getting FSR3 at launch.
There is a problem on the PC version where motion vectors aren't working on the main character's hand during spell selection, hence the problems with FSR2.







But everywhere else FSR2 seems to be working pretty well. The motion vectors are working fine on the console versions, so we should expect this to be solved on the PC front quite soon, probably in time for the FSR3 patch, which is cool.
I'm also guessing FSR3 will land on consoles, meaning the 45-60 FPS might turn into 80-120 FPS.




These numbers are completely made up. No average numbers were mentioned or shown in the video. The Series X and PS5 suffer framerate drops from 60 to ~45 FPS in intensive scenes with many enemies and spells, and the PS5 tends to run ~5-8 FPS lower because it's apparently using a higher LOD bias.






Don Allen, who wrote the PC technical review, says otherwise, and any reasonable person will say otherwise. He even states FSR2 Quality offers higher detail than DLSS2 Quality and native. He says he doesn't recommend using FSR2 Performance, which is a long way from claiming "it's absolute trash".

But of course this is the internet and some internet people think ridiculous hyperbole makes them look cool and intellectual, so here we are.

They must have updated the game and drivers then, because playing the demo on my 7900 XT has looked jaw-droppingly beautiful.

This game has beautiful environments, and the combat is actually more fun than I was expecting. Really neat opening story too. If you play the demo, make sure to slow down and enjoy just looking around the environment some; it's really beautiful and has nice attention to detail, plus some hidden treasures.
 
Joined
Dec 10, 2022
Messages
486 (0.59/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Hogwarts legacy, Jedi Survivor, Starfield, Immortals...
All these console ports feel horribly unfinished.
I couldn't agree more but Immortals of Aveum takes it to another level. At least those other games didn't have the developers saying that you must use upscaling to play them. :laugh:
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,777 (4.68/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Memory 32gb ddr5 (2x16) cl 30 6000
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710
Power Supply Corsair RM850x
Aveum is on sale for $29 at the moment. I am considering getting it, but it feels like the type of game that will make it into a Humble Bundle sooner rather than later... so I am biding my time to think on this one. I am going to get it eventually though; I really enjoyed the demo, minus the FPS crashes during a few cutscenes... hopefully by the time I get it they will have patched that up.
 
Joined
Dec 10, 2022
Messages
486 (0.59/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
A funny thing happened today. An ex-co-worker of mine (let's call him Jake) called me to talk about the video card that he just bought. Some years ago, he needed to upgrade from his GTX 1660 Super. This was back in the bad-old mining boom days of 2021 and I asked him what games he plays and at what resolution. He said games like Far Cry, Witcher III, CS:GO, COD, etc. and that he had a 1080p60Hz display. I told him to get a Radeon and he looked at me as if I had just suggested that he sleep with his sister! He said "In almost ten years of gaming, I've never owned a Radeon... don't they suck or something?" After laughing at him over his reaction and his words, I pointed out to him that I had been using Radeons since 2008 and have been very happy with them. I told him that he wouldn't be able to tell what GPU he was using just based on whether it was a GeForce or a Radeon, just like he didn't know if his phone's GPU was an Adreno or an Imagination.

With the prices being what they were, he caved and bought a Gigabyte RX 6600 XT. Knowing that his experience was going to be just fine, I would tease him about defecting to "THE RED TEAM" by asking him just how terrible his gaming experience was. One day, he finally said "Ok, ok... I admit it, my Radeon card is awesome, the Adrenalin interface is WAY better than GeForce Experience, I got far more for my money with a Radeon card and I'm glad that you pushed me in that direction!". So, he got a 1440p display and now of course, his RX 6600 XT isn't good enough for it. He said that he started using FSR and that it really helped. Since it was his first time using FSR I asked him if he noticed any difference in image quality when he turned FSR on at 1440p and he said that it looked exactly the same to him but he knew that the RX 6600 XT wasn't going to be sufficient for 1440p going forward, especially with only 8GB of VRAM.

As far as I'm concerned, this means that even if DLSS is somewhat "better" than FSR, if you only have the one tech to use, it really doesn't matter. He was so pleased with his experience with his RX 6600 XT that he just upgraded to a Powercolor RX 7800 XT Hellhound (which is why he called). He said that he wanted to thank me for "opening his eyes to better possibilities" (his words, not mine). I told him that I was glad to help and that he should keep his RX 6600 XT as a spare in case he ever needs one. He told me that his girlfriend's PC now has the RX 6600 XT and he has his old GTX 1660 Super stored safely in a box for just that purpose. I said "That's an even better plan. There's nothing wrong with your GTX 1660 Super, it just can't hang with modern cards but it will make a great spare." and I believe it will.

The end of the call sounded like this:
Me: Oh, and Jake?
Jake: Yeah?
Me: Welcome to the Red Team.
Jake: (laughing) "Yeah, I know, f$#@ you!"
Me: Heheheheheh

*click!*
 
Joined
Nov 11, 2016
Messages
3,573 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
A funny thing happened today. An ex-co-worker of mine (let's call him Jake) called me to talk about the video card that he just bought. Some years ago, he needed to upgrade from his GTX 1660 Super. This was back in the bad-old mining boom days of 2021 and I asked him what games he plays and at what resolution. He said games like Far Cry, Witcher III, CS:GO, COD, etc. and that he had a 1080p60Hz display. I told him to get a Radeon and he looked at me as if I had just suggested that he sleep with his sister! He said "In almost ten years of gaming, I've never owned a Radeon... don't they suck or something?" After laughing at him over his reaction and his words, I pointed out to him that I had been using Radeons since 2008 and have been very happy with them. I told him that he wouldn't be able to tell what GPU he was using just based on whether it was a GeForce or a Radeon, just like he didn't know if his phone's GPU was an Adreno or an Imagination.

With the prices being what they were, he caved and bought a Gigabyte RX 6600 XT. Knowing that his experience was going to be just fine, I would tease him about defecting to "THE RED TEAM" by asking him just how terrible his gaming experience was. One day, he finally said "Ok, ok... I admit it, my Radeon card is awesome, the Adrenalin interface is WAY better than GeForce Experience, I got far more for my money with a Radeon card and I'm glad that you pushed me in that direction!". So, he got a 1440p display and now of course, his RX 6600 XT isn't good enough for it. He said that he started using FSR and that it really helped. Since it was his first time using FSR I asked him if he noticed any difference in image quality when he turned FSR on at 1440p and he said that it looked exactly the same to him but he knew that the RX 6600 XT wasn't going to be sufficient for 1440p going forward, especially with only 8GB of VRAM.

As far as I'm concerned, this means that even if DLSS is somewhat "better" than FSR, if you only have the one tech to use, it really doesn't matter. He was so pleased with his experience with his RX 6600 XT that he just upgraded to a Powercolor RX 7800 XT Hellhound (which is why he called). He said that he wanted to thank me for "opening his eyes to better possibilities" (his words, not mine). I told him that I was glad to help and that he should keep his RX 6600 XT as a spare in case he ever needs one. He told me that his girlfriend's PC now has the RX 6600 XT and he has his old GTX 1660 Super stored safely in a box for just that purpose. I said "That's an even better plan. There's nothing wrong with your GTX 1660 Super, it just can't hang with modern cards but it will make a great spare." and I believe it will.

The end of the call sounded like this:
Me: Oh, and Jake?
Jake: Yeah?
Me: Welcome to the Red Team.
Jake: (laughing) "Yeah, I know, f$#@ you!"
Me: Heheheheheh

*click!*

Is telling people to buy Radeon some sort of religious thing or something? It's just a video card; Jake would be just as happy with a 4070, and possibly even happier when he tries out some next-gen looking games :cool:
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,777 (4.68/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Memory 32gb ddr5 (2x16) cl 30 6000
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710
Power Supply Corsair RM850x
Is telling people to buy Radeon some sort of religious thing or something? It's just a video card; Jake would be just as happy with a 4070, and possibly even happier when he tries out some next-gen looking games :cool:

Yeah, I don't regret my 7900 XT considering the price I paid, but nothing beats DLSS (each iteration, not just a specific version), and nothing beats frame gen. Frame gen is going to make it so cards last 10+ years now. Wild times.
 
Joined
Jun 14, 2020
Messages
4,614 (2.67/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Is telling people to buy Radeon some sort of religious thing or something? It's just a video card, Jake would be just as happy with a 4070, and possibly even more when he try out some next-gen looking games :cool:
Especially someone who can't tell the difference between FSR and native at 1080p...
 