
Starfield: FSR 3 vs DLSS 3 vs XeSS Comparison

Starfield has finally been updated with official support for AMD's FidelityFX Super Resolution 3, featuring Frame Generation technology, and Intel's Xe Super Sampling (XeSS 1.2) upscaler. Bethesda have also improved the quality of their DLSS implementation. In this mini-review we compare the image quality and performance gains offered by these technologies.

 
Thanks for the comparison, it's a shame it's on such a mediocre and underwhelming game.

I can't believe how unfinished and empty Starfield still is, almost 6 months after a launch which was itself delayed by almost a full year, with 450 staff working on it full-time and 250 staff still assigned to it in 2024. I enjoyed it enough to finish the campaign, but I gave up on collecting star powers because they were repetitive and boring, I had no desire to sit through any of the poorly-scripted, poorly-acted side quests, and having finished the campaign, New Game+ was just about the least appealing thing I could imagine. Those 6/10 ratings are well deserved, and the 10/10 reviews at launch are absolutely all from sources I now deem corrupt and in the pockets of the big publishers.

All the interesting handcrafted world stuff from prior Bethesda games is gone, and this FSR3 frame-gen article is hard to fully appreciate because Bethesda's Creation Engine is so decrepit and obsolete that expecting FSR and frame-gen to be bonus additions is just lipstick on a pig. They're necessary but unwelcome crutches for the game - which runs upscaled with frame-gen worse than anyone would expect it to run natively without frame-gen. In so many scenes where my 7800 XT + 5800X3D were struggling to maintain 60 fps, I was thinking "why does this game run so badly? It doesn't even look good - it wouldn't be out of place running on a PS4 Pro or XB1!". Your last paragraph really highlights the godawful state of this game's performance and its dated, unsuitable engine.

Sadly, Bethesda games have historically been fixed by modders and Bethesda have become increasingly hostile and uncooperative towards the modding community. As far as I'm concerned, Bethesda are done, they've not made a top-tier game in over a decade, and even then the games were technical disasters that the community stepped in to fix because Bethesda couldn't be bothered.
 
Yeah, I played Oblivion and I feel that's how open-world games should be: a living, vibrant world. Albeit with a ton of bugs which, as you said, got fixed by modders.

From your description, Starfield sounds like it's adopted the MMO model of game creation? MMOs, I feel, are such a lazy approach to game development.
 
Sadly, Bethesda games have historically been fixed by modders and Bethesda have become increasingly hostile and uncooperative towards the modding community. As far as I'm concerned, Bethesda are done, they've not made a top-tier game in over a decade, and even then the games were technical disasters that the community stepped in to fix because Bethesda couldn't be bothered.
Two decades even, if you are like me and consider everything after Morrowind to be a sharp decline.
 
Why do we have TAA again?
TAA looks like garbage.
Can we please get native 2160p/4K with no AA & no AF on the list?
 
I dunno, XeSS (on the right) makes the text, indicators and lights in the cockpit look very washed out and, idk, faded (FSR3 on the left).
(attached comparison screenshot: memeb.jpg)
 
Why do we have TAA again?
TAA looks like garbage.
Can we please get native 2160p/4K with no AA & no AF on the list?
Ok, I get the no-AA angle, a lot of modern methods give questionable results, fair enough. But why, in God's name, would you want no AF? It has pretty much been a free setting performance-wise for a long while now and has no visual drawbacks. We aren't in the '90s, where running without texture filtering was an intended look for a game in some cases.
 
I bet these games would look a lot better if they applied whatever upscaling method before the TAA pass rather than after, which is how I'm sure most, if not all, of them do it.

TAA reduces perceived resolution in motion massively, so when you upscale something from 1440p to 4K or whatever, in reality you're trying to upscale from something that looks way worse than 1440p would if it weren't for TAA.
 
I bet these games would look a lot better if they applied whatever upscaling method before the TAA pass rather than after, which is how I'm sure most, if not all, of them do it.
Aren't all three current methods (DLSS, FSR, XeSS) themselves temporal by nature, dropping in in place of the TAA stage rather than before or after it? At least that's how NV and Intel describe the process; unsure about AMD.
 
Aren't all three current methods (DLSS, FSR, XeSS) themselves temporal by nature, dropping in in place of the TAA stage rather than before or after it? At least that's how NV and Intel describe the process; unsure about AMD.
The current frame goes through the TAA pass before it's upscaled, and then the upscaler may have a temporal component as well. Most new games don't even give you the option to turn off TAA because it's an integral part of the rendering pipeline.

The point is I suspect the results would be much better if TAA was applied after the image is upscaled.
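To make the ordering concrete, here's a rough toy sketch in Python (a 1D list of numbers stands in for a frame, a box blur stands in for TAA's temporal softening, and linear interpolation stands in for the upscaler; none of this is real engine code, just an illustration of the two orderings being debated):

```python
# Toy illustration of the two orderings being debated; not real engine code.
# A 1D "image" stands in for a frame, a simple box blur stands in for the
# softening TAA introduces, and linear interpolation stands in for upscaling.

def blur(frame, radius=1):
    """Crude stand-in for TAA: average each sample with its neighbours."""
    out = []
    for i in range(len(frame)):
        lo, hi = max(0, i - radius), min(len(frame), i + radius + 1)
        out.append(sum(frame[lo:hi]) / (hi - lo))
    return out

def upscale(frame, factor=2):
    """Linear-interpolation upscale, standing in for a spatial upscaler."""
    out = []
    for i in range(len(frame) * factor):
        pos = i / factor
        i0 = min(int(pos), len(frame) - 1)
        i1 = min(i0 + 1, len(frame) - 1)
        t = pos - i0
        out.append(frame[i0] * (1 - t) + frame[i1] * t)
    return out

low_res = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]      # a hard edge at low resolution

taa_then_upscale = upscale(blur(low_res))      # edge is softened before upscaling
upscale_then_taa = blur(upscale(low_res))      # edge is upscaled first, then softened

print(taa_then_upscale)   # the blur spans 3 low-res samples = 6 output samples
print(upscale_then_taa)   # the blur spans only 3 output samples, keeping a tighter edge
```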
 
Doesn't matter, they just poorly implemented the TAA and they are very unlikely to fix it. You should avoid the native TAA if at all possible in this game and use DLAA if you're on NVIDIA, or FSR3 at Native or XeSS at UQ if you're on AMD or Intel. Especially at 1080p: you're already at a lower resolution and the native TAA just destroys image quality, so again, use DLAA, FSR3 Native or XeSS UQ.
 
Ok, I get the no-AA angle, a lot of modern methods give questionable results, fair enough. But why, in God's name, would you want no AF? It has pretty much been a free setting performance-wise for a long while now and has no visual drawbacks. We aren't in the '90s, where running without texture filtering was an intended look for a game in some cases.

Because we do not need it with the larger 4K texture packs that games come with now.
 
Because we do not need it with the larger 4K texture packs that games come with now.
AF has nothing to do with texture resolution though. You would still get quality degradation at oblique viewing angles and over distance regardless of resolution; that's just an inherent limitation of how textures work with 3D rendering. Considering the negligible performance impact, leaving it off just because "muh 4K" makes absolutely no sense. And yes, EXTREMELY high-res textures do mitigate the need for AF, but that is a far more taxing and VRAM-intensive way of doing things.
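To put rough numbers on the oblique-angle point, here's a back-of-the-envelope Python sketch (simplified geometry, hypothetical function name, not how a real GPU sampler is implemented): the more a surface tilts away from the camera, the more texels a single screen pixel covers along one axis, and AF just takes extra samples along that stretched axis, capped at the driver's 2x-16x setting. Texture resolution doesn't change the angle, which is why AF still matters with 4K texture packs.

```python
import math

def af_samples_needed(view_angle_deg, max_aniso=16):
    """Rough sketch of how stretched a pixel's footprint on a textured surface
    gets as the surface tilts away from the camera, and how many samples
    anisotropic filtering would take along the long axis (capped at max_aniso).
    Simplified geometry; real GPUs derive this from per-pixel UV derivatives."""
    angle = math.radians(view_angle_deg)
    stretch = 1.0 / max(math.cos(angle), 1e-6)   # footprint elongation with tilt
    return min(math.ceil(stretch), max_aniso)

# 0 degrees = surface faces the camera head-on; near 90 = almost edge-on,
# like a road or floor receding into the distance.
for angle in (0, 45, 60, 75, 85):
    print(f"{angle:>2} deg tilt -> ~{af_samples_needed(angle)}x samples along the long axis")
```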
 
The history of FSR vs DLSS in TPU tests is consistent, so there's no need to read the review. The conclusion is already known.

And when the review starts with
To the dismay of many expectant gamers, Starfield only had support for AMD's FidelityFX Super Resolution (FSR 2.2)
the reader already knows that FSR would need to be better than real life to even have a chance at a... draw.

Skipping past the review and going straight to the conclusion, one line is enough to confirm that there is nothing new to read here compared to all those previous comparisons.
Unfortunately, despite the FSR implementation being updated from FSR 2.2 to FSR 3 with the latest patch, the quality of FSR upscaling remains the same as it was on launch day.
 
I find it funny how they're implementing this shit to a game that barely anyone's playing.

- Starfield had "$5 bargain bin" purchase written all over it from some of the very first launch trailers. It's been obvious for a while that Bethsoft just didn't know how to make the concept work or breathe life into it. Everything smacked of "Mass Effect but no soul".

MAYBE we'll get 2-3 DLCs that basically flesh out base building, survival, defense, and some additional story and pump the value up to $10 GOTY Edition purchase.
 
Skipping past the review and going straight to the conclusion, one line is enough to confirm that there is nothing new to read here compared to all those previous comparisons.

Yep. It's pretty pointless because by the end of 2024 all proprietary upscaling tech will be shitcanned in deference to standards implemented in DirectX and Vulkan, just like they did with raytracing.
 
I've tested FSR3 in both Talos Principle and Starfield now. I find that DLSS without FG looks and feels much better than FSR3, especially on foliage.
 
- Starfield had "$5 bargain bin" purchase written all over it from some of the very first launch trailers. It's been obvious for a while that Bethsoft just didn't know how to make the concept work or breathe life into it. Everything smacked of "Mass Effect but no soul".

MAYBE we'll get 2-3 DLCs that basically flesh out base building, survival, defense, and some additional story and pump the value up to $10 GOTY Edition purchase.
The problem is they really have kept and tried to build on an old engine. This game was overhyped and was never going to deliver a truly engaging and lasting experience. I am heartened by Intel's efforts to make XeSS work, however.
 
Yep. It's pretty pointless because by the end of 2024 all proprietary upscaling tech will be shitcanned in deference to standards implemented in DirectX and Vulkan, just like they did with raytracing.
Er, proprietary ray tracing seems to be alive and well. Of those who use RT, most have NVIDIA cards, although that's obviously also true of those who don't use RT. Regardless, proprietary RT vendor tech, such as DLSS 3.5 Ray Reconstruction, is certainly still being updated and included in new games.

I highly doubt upscaling standards will go differently. Even Sony is rumoured to be moving away from FSR to their own model with the next PlayStation, which is a dominant slice of the console market. There's a chance Microsoft will do the same, perhaps with XeSS, if they do end up going with an Intel chip with the next Xbox.
 
I appreciate your work @maxus24. I would just like to say that FSR 3 is designed first and foremost for 7000-series cards, and we all know that. I personally believe that for accurate results it should be tested on 7000-series cards and not a 4080. Not your fault, as that is all you have.

That being said, I do believe the results you have would be different if done on a 7900 XT or XTX, and if I am wrong then so be it; at least there would be evidence to prove so.

@Vya Domus why am I the only one who ever talks about this? I don't get it.
 
FSR, by design, is hardware-agnostic, built for maximum cross-platform support, hence its integration everywhere from consoles to phones to PCs. It is not like XeSS, which has three different kernels, or DLSS, which requires tensor cores and is specifically built around NVIDIA architecture. FSR3 isn't really changed from FSR2; it just adds some features for attempted parity with NVIDIA, such as frame generation.

AMD's frame generation technology also does not use or require hardware exclusive to AMD, beyond simply having enough GPU horsepower to generate a good enough base FPS; unlike DLSS3 FG, it doesn't need the optical flow accelerator in Ada cards, so there's no real difference there either.

Your argument would be more suitable to instead support also running game testing on Intel Arc GPUs, but without speaking for reviewers, I'd wager it's a limited enough consumer base to not warrant the expense and time at this point, beyond acknowledging that IQ and performance will be slightly better on native Intel hardware. Perhaps when Battlemage comes out.

Besides, other reviewers and the general public encounter similar issues with FSR. Unfortunately AMD has simply not improved the upscaler component for quite a long time at this point.

Personally I think the XeSS approach has the best balance: DLSS for sure has the best quality, but it's not universal, requiring specific hardware, while FSR goes too far toward universal support without enough development to make the end result good enough.
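To make the hardware-requirements point concrete, here's a hypothetical Python sketch (every name below is made up and doesn't correspond to any real SDK call) of how an "auto" setting might pick a default upscaler based on what the GPU exposes: DLSS gated on tensor cores, XeSS falling back from its XMX kernel to the generic DP4a one, and FSR available everywhere as plain shader code.

```python
# Hypothetical capability check, purely illustrative; the flag names and the
# pick_upscaler function are invented, not any vendor's actual API.

def pick_upscaler(gpu):
    """gpu is a dict of capability flags for a hypothetical detection layer."""
    if gpu.get("tensor_cores"):      # DLSS: NVIDIA RTX only (tensor cores required)
        return "DLSS"
    if gpu.get("xmx"):               # XeSS fast path: Intel Arc XMX units
        return "XeSS (XMX kernel)"
    if gpu.get("dp4a"):              # XeSS fallback: generic DP4a kernel
        return "XeSS (DP4a kernel)"
    return "FSR"                     # FSR: pure shader code, runs on anything

examples = [
    {"vendor": "nvidia", "tensor_cores": True,  "xmx": False, "dp4a": True},
    {"vendor": "intel",  "tensor_cores": False, "xmx": True,  "dp4a": True},
    {"vendor": "amd",    "tensor_cores": False, "xmx": False, "dp4a": True},
    {"vendor": "older",  "tensor_cores": False, "xmx": False, "dp4a": False},
]
for gpu in examples:
    print(gpu["vendor"], "->", pick_upscaler(gpu))
```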
 
fair enough, thank you for explaining it to me

I personally thought Starfield looked good on my 1440p 7900 XT rig; I recall even sharing some artistic screenshots on the Starfield thread here. I think sometimes with this upscaling stuff we hyperfocus on things that we would never actually notice when playing the game. It is what it is though, and I agree with your conclusion. Interesting to see that Intel is actually the winner; I really hope they can improve next gen and get the drivers sorted for the majority of games. It would be nice to have a third competitor to keep prices in check.
 
DLSS is the winner in technical performance, and the exclusivity isn't really a big problem since NVIDIA cards are the vast majority, with RTX features being around for three generations now. Intel XeSS just has the best approach to balancing technical performance with compatibility, likely so as to support non-PC platforms in future, as I can't see a reason why they'd spend the development resources otherwise.
 
Er, proprietary ray tracing seems to be alive and well. Of those who use RT, most have NVIDIA cards, although that's obviously also true of those who don't use RT. Regardless, proprietary RT vendor tech, such as DLSS 3.5 Ray Reconstruction, is certainly still being updated and included in new games.
Sort of.

Ray Reconstruction occurs after the ray tracing calculations happen (which are done at the API level and are vendor-independent) but replaces the normal denoiser with a custom pre-trained AI denoiser. That's the "reconstruction" part.

Technically, AMD and Intel could come up with their own denoiser that leverages their own tensor/vector/matrix cores.

It's just math.
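To expand on that: the denoiser Ray Reconstruction swaps out is, at its core, temporal accumulation, blending each noisy ray-traced frame into a running history. A bare-bones Python sketch of that basic idea is below (toy numbers, invented function name, nothing like a production denoiser); RR replaces this kind of hand-tuned blend with a pre-trained network, which is why AMD or Intel could, in principle, ship an equivalent on their own matrix hardware.

```python
import random

def temporal_denoise(noisy_frames, alpha=0.2):
    """Toy temporal-accumulation denoiser: exponentially blend each noisy frame
    into a running history. Real denoisers add spatial filtering, variance
    estimation and disocclusion handling; this only shows the core idea."""
    history = noisy_frames[0][:]
    for frame in noisy_frames[1:]:
        history = [(1 - alpha) * h + alpha * x for h, x in zip(history, frame)]
    return history

# Simulate a constant signal of 0.5 per pixel buried in heavy per-frame RT noise.
random.seed(0)
frames = [[0.5 + random.uniform(-0.4, 0.4) for _ in range(4)] for _ in range(60)]

print("one noisy frame:   ", [round(v, 2) for v in frames[-1]])
print("after accumulation:", [round(v, 2) for v in temporal_denoise(frames)])
```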

I highly doubt upscaling standards will go differently. Even Sony is rumoured to be moving away from FSR to their own model with the next PlayStation. There's a chance Microsoft will do the same, perhaps with XeSS, if they do end up going with an Intel chip with the next Xbox.

That information comes from two MLID videos/podcasts.
  • Tom covered that Microsoft is trying to pressure AMD for lower pricing on next-gen custom silicon and threatened to go to ARM or Intel.
    • Microsoft also did this last-gen and AMD called their bluff.
    • Intel is super-hungry for Microsoft's business, which is when Tom and Dan started spitballing the idea of the Xbox using XeSS.
    • They also mentioned why it's highly unlikely that Microsoft would switch to Intel unless Intel just made them a ridiculous at-cost deal.
  • Tom and his guest in a more recent podcast mentioned that Sony is rumored to be developing an upscaling tech. It's important to point out that:
    • The PS4 and 5 don't use DirectX or Vulkan. They have their own low-level APIs, GNM and GNMX, and their shader language is PSSL.
    • They're developing their own hardware accelerated upscaling tech for the same reason Microsoft are.

So why are Microsoft and Sony developing their own tech?
  • To prevent vendor lock-in (e.g. switching from AMD to Intel)
  • To make it easier to develop for and port games from console to PC and vice versa.
  • Developing for three different proprietary upscaling technologies means committing extensive engineering time to three different code paths and at least three more quality control efforts. That's ridiculously expensive.
    • Coincidentally, that's also why most studios ship with FSR initially - they can hit both consoles and the PC in one swimlane (a rough sketch of that single-code-path idea is at the end of this post).
I hope I explained it more clearly this time rather than just making a simple statement. I sometimes forget that most people don't have a lot of experience working in software or hardware development.
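On the three-code-paths cost, the usual mitigation is a thin abstraction that exposes one interface to the game and plugs vendor backends in behind it (frameworks like NVIDIA's Streamline go in this direction). The sketch below is entirely hypothetical Python with invented class and method names, just to show the "one code path, many backends" idea rather than any real SDK:

```python
# Entirely hypothetical abstraction sketch; the class and method names are
# invented to illustrate "one code path, many backends", not any real SDK.

from abc import ABC, abstractmethod

class Upscaler(ABC):
    @abstractmethod
    def evaluate(self, color, depth, motion_vectors, output_size):
        """Take the low-res frame plus depth and motion vectors, return an upscaled frame."""

class FSRBackend(Upscaler):
    def evaluate(self, color, depth, motion_vectors, output_size):
        return f"FSR upscale of {color} to {output_size}"

class DLSSBackend(Upscaler):
    def evaluate(self, color, depth, motion_vectors, output_size):
        return f"DLSS upscale of {color} to {output_size}"

class XeSSBackend(Upscaler):
    def evaluate(self, color, depth, motion_vectors, output_size):
        return f"XeSS upscale of {color} to {output_size}"

def render_frame(upscaler: Upscaler):
    # The game feeds the same inputs regardless of backend: one path to maintain.
    return upscaler.evaluate("frame_1440p", "depth_buffer", "motion_vectors", (3840, 2160))

for backend in (FSRBackend(), DLSSBackend(), XeSSBackend()):
    print(render_frame(backend))
```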
 