Wednesday, September 13th 2023

Starfield to Finally Get DLSS Support

The space opera RPG everyone's been lapping up over the past week, Starfield, is finally getting official support for NVIDIA DLSS, along with fixes for a handful of other glaring omissions in its PC version. Bethesda Game Studios on Wednesday announced that it will release a "small hotfix patch" that adds a few must-haves for PC players. To begin with, it adds support for the 32:9 ultra-wide aspect ratio (something that should have shipped with a space sim at launch), along with an FOV slider. The new display settings include brightness and contrast controls and an HDR calibration menu. The most important of these announcements is that the game will now receive first-party support for NVIDIA DLSS. This will be DLSS 2 (Super Resolution), not the newer DLSS 3 frame generation.
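Until the patch arrives, the community's stopgap for the missing FOV slider has been an ini override. Here is a minimal sketch in Python of applying it; the [Camera] keys (fFPWorldFOV/fTPWorldFOV) are community-documented rather than an official API, and the Documents-folder path is an assumption, so verify both against your own install.

```python
# Sketch: write the community-documented FOV override to StarfieldCustom.ini.
# Key names and file location are assumptions based on community findings.
import configparser
from pathlib import Path

ini_path = Path.home() / "Documents" / "My Games" / "Starfield" / "StarfieldCustom.ini"

config = configparser.ConfigParser()
config.optionxform = str  # preserve the engine's case-sensitive key names
if ini_path.exists():
    config.read(ini_path)

if not config.has_section("Camera"):
    config.add_section("Camera")
config.set("Camera", "fFPWorldFOV", "100.0")  # first-person field of view
config.set("Camera", "fTPWorldFOV", "100.0")  # third-person field of view

ini_path.parent.mkdir(parents=True, exist_ok=True)
with ini_path.open("w") as f:
    config.write(f)
```

The official slider should make this obsolete, but the sketch shows how little the workaround involves.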
Source: Bethesda

73 Comments on Starfield to Finally Get DLSS Support

#51
kapone32
lasThis is not really bias, more like reality. Most if not all other reviewers come up with the same results. Even TechSpot, which certainly doesn't talk AMD down, showed DLSS being superior to FSR. If DF is Nvidia-biased, TechSpot is AMD-biased. It comes down to actual experience with the different cards. I have tried a lot of different Nvidia and AMD cards and I know for sure that AMD doesn't match Nvidia in terms of features and drivers when you don't cherry-pick a few games but look at the overall picture across multiple titles, especially when you don't solely look at the most popular games, which reviewers tend to use.

Also, RTX can do so much more than simple rasterization that you have a hard time settling for AMD after using an RTX GPU. Not a single AMD feature is on par with Nvidia's, and AMD doesn't even have a counter for many RTX features (DLAA, DLDSR, DLSS 3 + FG + 3.5, to name a few).

Even if the 7900XTX had matched the 4090 in raster, I would have picked the 4090 anyway, mostly because of DLSS, DLAA, DLDSR and Reflex, plus much better RT performance. I am using many RTX mods and features to improve the experience of older games (DLDSR, RT mods, etc.) - features like this can transform old games - and I can't wait for Half-Life 2 RTX. There are many other features present (with RTX or Nvidia in general), and every single one of them beats what AMD is offering. Pretty much all streamers use Nvidia because of ShadowPlay and the native integration with Twitch and most big streaming platforms. ReLive is not really close: it has a bigger performance hit, less native platform support and way more issues.

This is why AMD has lower prices. Lack of features. Worse features. Worse drivers and support, especially when you leave the most popular games. AMD spends most of its time and money optimizing for games that actively get benchmarked so its GPUs look good in reviews. AMD's performance in early-access games, betas or just less popular titles is not on par with Nvidia's 9 out of 10 times. Most developers use Nvidia and optimize for Nvidia because 80% of the PC gaming segment uses Nvidia. Nvidia has tons of money for inventing features, improving them and perfecting the in-game experience (driver optimization). AMD has a much smaller R&D budget and software department in general.

Also, AMD's main business is not GPUs but CPUs and APUs. Nvidia's is. They are the industry leader in gaming GPUs, enterprise GPUs and AI GPUs.

Gaming GPUs are not really profitable for AMD. They earn more per wafer by selling CPUs and APUs (both OEM and consoles), and all their chips use the same TSMC lines, meaning it makes more sense for AMD to just make CPUs and APUs. AMD decides which chips to put out, and CPUs are more profitable. More chips per wafer equals more money for AMD.

I know it's hard to accept the fact that AMD lacks features, but it does. This is why every AMD GPU user hates the words upscaling, RT, downsampling and more: because their cards can't really do those things properly. They are stuck with "native" and good old raster, which is why they praise native all the time and talk RTX features down - because they can't use them = denial.

Yet native can easily be improved on and beaten with features like DLAA, or even DLSS on the higher presets if you want some performance on top as well. DLAA beats native and every other AA solution every single time when it comes to visuals, and DLAA is a preset of DLSS now. DLSS doesn't just mean upscaling (with built-in AA); it also means the best AA method today: DLAA.

Even developers embrace upscaling, downsampling, sharpening filters and next-gen anti-aliasing. Like I said several times now, FSR2 is enabled by default in Starfield. Upscaling is enabled by default in Remnant 2, and its developers officially stated the game was designed with DLSS/FSR/XeSS image upscaling in mind.

Native is not really better these days, especially not if you use DLAA or DLDSR. Only on the lower presets will DLSS make visuals worse, but then performance skyrockets - that trade-off is up to the user. AMD can't match these features at all. This is what you pay extra for when you buy Nvidia. And the resale value of AMD hardware is lower as well, because demand is lower and AMD cuts prices several times through a generation.
Is the 7900 series not just as fast as a 3090 in RT? Then you have to realize that only 2 Nvidia SKUs are OK for RT. You can also read and learn that information. Then you seem to forget that the 7900XT costs half of what a 4090 does where I live. Would I spend another $1200 to get features that are in less than 1% of games, and not in the games I focus on?

I guess you have used ReLive and seen that Twitch and YouTube are fully supported in it, and that it now has AV1 encoding and decoding, but I guess I don't have the same access that you have.

AMD prices are lower because they don't fully agree with Nvidia and its price structure. I can promise you that if the 7900XT was $500, you would not be able to buy one. Another thing you don't seem to understand is that the 6700XT has been the best-value GPU since last year.

When I look at AMD software and compare it to Nvidia's, I certainly do not agree with the assessment that AMD users only use native because they lack features.


Yep, that is why they were so desperate to buy ARM. The fact that AMD AND Intel have CPUs and GPUs on the x86 license is not a good look long term for Nvidia.

DLAA, DLSS... and do you know that I turn off AA? It is my PC, after all, but to say it looks worse is just wrong. You are talking about DLSS, not RT.

Yep, Bethesda was quick to embrace DLSS when they were creating Starfield. Modern Warfare is not faster on AMD cards. 4K is not achievable on 7900 cards and AMD drivers are garbage.

The only game in my library with XeSS support is Redout 2. Do you know what that game is?

The resale value of AMD GPUs is lower because the retail price is lower. I can promise you that plenty of people made profits selling AMD cards during the mining boom. You know what I see in that? Do you know which CPU company was in open source from the beginning? I look (again) at the 6500XT that AMD made so that users with less money than us can enjoy PC gaming. The fact remains, though, that Nvidia's propaganda campaign is strong.

I have Humble Choice and there has not been a game that my card can't run through, so try again.

In the end you are trying to establish that AMD GPUs are not good by stating that they don't have Nvidia-specific features. When I watched the State of Play yesterday, not once did I think about DLSS or DLAA while watching trailers for Spiderman 2, Pandora and a few others that will be made on AMD hardware. I also guess Microsoft is not one of the biggest publishers in the space right now, but you can go on with your opinion.
Posted on Reply
#52
fevgatos
kapone32Is the 7900 series not just as fast as a 3090 in RT? Then you have to realize that only 2 Nvidia SKUs are OK for RT.
That's not how it works, at all. Yes, you need a 4080 or a 4090 to play 4K RT (and usually with DLSS on top of that), but you don't need a 4080 or a 4090 to play at 1440p. Cards like the 4070 that are targeted at 1440p can handle RT just fine. That is not the case with the 7900XTX, since that's targeted at 4K but can't do RT at that resolution.
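The resolution-tier argument is easy to put numbers on: 4K pushes 2.25 times the pixels of 1440p, so a card holding 60 fps with RT at 1440p would land near 27 fps at 4K if performance scaled linearly with pixel count. A rough sketch (the linear-scaling assumption is illustrative only; real games deviate from it):

```python
# Back-of-the-envelope: pixel load per resolution and naive fps scaling.
# Assumes fps scales inversely with pixel count; a first-order approximation.

RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

def pixels(res: str) -> int:
    w, h = RESOLUTIONS[res]
    return w * h

def naive_fps(target: str, baseline: str, baseline_fps: float) -> float:
    """Estimate fps at `target` from a measured fps at `baseline`."""
    return baseline_fps * pixels(baseline) / pixels(target)

print(f"4K / 1440p pixel ratio: {pixels('4K') / pixels('1440p'):.2f}x")  # 2.25x
print(f"60 fps @ 1440p -> ~{naive_fps('4K', '1440p', 60):.0f} fps @ 4K")  # ~27
```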
Posted on Reply
#53
kapone32
fevgatosThat's not how it works, at all. Yes, you need a 4080 or a 4090 to play 4K RT (and usually with DLSS on top of that), but you don't need a 4080 or a 4090 to play at 1440p. Cards like the 4070 that are targeted at 1440p can handle RT just fine. That is not the case with the 7900XTX, since that's targeted at 4K but can't do RT at that resolution.
So I guess the TPU review of the 7900 cards was just false. I guess my experience has also been false for the last 8 months. You are one of the people who will move goalposts to support your argument that AMD sucks and Nvidia is the only option. I can't do RT at 4K? Are you sure about that? Don't forget that my monitor's FreeSync range is 45-144 Hz, meaning that even if I am getting 60 FPS at 4K it would still be butter smooth. I am really tired of people who do not own the card I have telling me what it cannot do.
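For what it's worth, the FreeSync part of that claim checks out mechanically: 60 fps sits inside a 45-144 Hz variable-refresh window, so the panel simply refreshes whenever a frame is ready. A small sketch of that logic, including the low-framerate compensation (LFC) drivers typically apply below the floor; the behavior here is an illustration, not any vendor's implementation:

```python
# Illustration of variable refresh rate (VRR) handling for a 45-144 Hz panel.
# Inside the window: refresh tracks the frame rate directly. Below the floor:
# LFC repeats frames so the effective rate re-enters the window.

VRR_MIN_HZ, VRR_MAX_HZ = 45, 144

def vrr_mode(fps: int) -> str:
    if fps > VRR_MAX_HZ:
        return "above window: capped or tearing, depending on settings"
    if fps >= VRR_MIN_HZ:
        return "direct sync: panel refreshes at the frame rate"
    n = -(-VRR_MIN_HZ // fps)  # ceiling division: repeats needed per frame
    return f"LFC: each frame shown {n}x (~{fps * n} Hz effective)"

for fps in (30, 60, 120, 160):
    print(f"{fps:>4} fps -> {vrr_mode(fps)}")
```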
Posted on Reply
#54
fevgatos
kapone32So I guess the TPU review of the 7900 cards was just false. I guess my experience has also been false for the last 8 months. You are one of the people who will move goalposts to support your argument that AMD sucks and Nvidia is the only option. I can't do RT at 4K? Are you sure about that? Don't forget that my monitor's FreeSync range is 45-144 Hz, meaning that even if I am getting 60 FPS at 4K it would still be butter smooth. I am really tired of people who do not own the card I have telling me what it cannot do.
That is not the point at all. I'm saying that when someone says the 7900 series is not good enough for RT, it doesn't mean that only the 4080 and 4090 can play RT. It means that the 7900 cannot play games with RT at its intended use case. A 4070 is a 1440p card, so you are not going to buy that one if you have a 4K monitor.

And yes, you are correct, the 7900 series cannot do RT. Are you counting Far Cry 6 as an RT game?
Posted on Reply
#55
kapone32
fevgatosAnd yes, you are correct, the 7900 series cannot do RT. Are you counting Far Cry 6 as an RT game?
This has to be the most uneducated response I have ever seen. Once again you are telling me what my own card cannot do, when I play CP2077 at 4K and sometimes use ray tracing. Then you are purporting that the only cards that can do 4K are the 4090 and 4080. I am not going to bother showing you any screenshots; I have done that before and you did not even look at them, but it's OK. You and a few other users on here are known as Nvidia fanatics. Far Cry 6? I don't even play Far Cry. Please keep in mind that even though gaming looks nice, it is all about math. Before you go on waxing about RT, you should learn what Trip Hawkins from EA did more than 30 years ago. By that principle, a board with 57 billion transistors is plenty; add another 16 billion for the CPU and you will understand that the 1s and 0s move super fast with that many gates opening and closing.
Posted on Reply
#56
Dr. Dro
Thanks to the fine work of brilliant folks like LukeFZ and Kaldaien I've been enjoying the hell out of Starfield with Special K and DLSS 3.5 Frame Generation for the last 15 days... I'm devouring this game, I love it so much, it single-handedly reignited my love for video games.

www.nexusmods.com/starfield/mods/761

Buttery smooth 144 fps even with the settings all on ultra and full-resolution DLSS (DLAA preset F). Pure bliss, just like this fine lady right here. I just started Playthrough 3 (NG++), currently at level 54. I'm going to keep replaying this one for a few years to come, just like I always revisit Oblivion. I hope Bethesda's official implementation is great; the community has done well for itself so far.

lasRTX can do so much more than simple rasterization that you have a hard time settling for AMD after using an RTX GPU. Not a single AMD feature is on par with Nvidia's, and AMD doesn't even have a counter for many RTX features (DLAA, DLDSR, DLSS 3 + FG + 3.5, to name a few).
Yup! I must agree. The things that GeForce RTX cards can do are like arcane magic; it's really impressive. I'm hopeful that once AMD's Hypr-RX thing matures they'll have a similar suite, but it will take some time.
Posted on Reply
#57
fevgatos
kapone32Then you are purporting that the only cards that can do 4K are the 4090 and 4080.
4K, yes. That's what I'm saying.

But if you are happy, I'm happy. If your card can do 4K RT in Cyberpunk, that's great; enjoy.
kapone32You and a few other users on here are known as Nvidia fanatics.
Irrelevant. I could claim you are known as an AMD fanatic. It doesn't matter; it's not an argument.
Posted on Reply
#58
kapone32
fevgatos4K, yes. That's what I'm saying.

But if you are happy, I'm happy. If your card can do 4K RT in Cyberpunk, that's great; enjoy.


Irrelevant. I could claim you are known as an AMD fanatic. It doesn't matter; it's not an argument.
Stop spreading misinformation and we will be Golden
Posted on Reply
#59
Dr. Dro
kapone32Stop spreading misinformation and we will be Golden
I mean, playable is quite a variable standard... I have a rough estimate of what the 7900 XTX can do with RT (not that it's relevant to Starfield - it doesn't support RT anyway; there's some engine code regarding support, but it's entirely non-functional, and the toggle in the ini does nothing), since I had the RTX 3090 and they have roughly the same RT performance (a bit tilted towards the 3090 in this respect, but the 7900 XTX's superior raster power more or less levels it out). Speaking for myself and myself only, I don't think it's adequate for 4K ray tracing. AMD hasn't achieved this yet. But then again, if I may be entirely honest, neither the 3090 Ti nor the 4080 is adequate either: that leaves you with the 4090, and that's... stretching.

I think we'll need the 5090 (and, unless AMD catches up to Blackwell with RDNA 4, the 9900 XTX/RDNA 5 flagship) before we can start really talking about ray- or path-traced 4K gaming at >60 fps without resorting to upscalers or frame generation.
Posted on Reply
#60
fevgatos
kapone32Stop spreading misinformation and we will be Golden
I'm taking my information from this



Your card at 4K with FSR Quality will get lower fps than the above, so below 35. Again, if you are happy with that, I'm happy; we are both happy. The above is of course without path tracing; with PT you are looking at single digits or something.
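For reference, "4K with FSR Quality" means the game renders internally at roughly 1440p and reconstructs to 2160p. A quick sketch of the render resolutions behind the standard presets, using the per-axis scale factors documented for FSR 2 (DLSS 2 uses near-identical ratios; exact per-game values can differ):

```python
# Internal render resolution for each upscaler preset at a 3840x2160 output.
# Scale factors follow FSR 2's documented ratios; treat them as nominal.

PRESETS = {
    "Quality":           1 / 1.5,  # ~0.667 per axis
    "Balanced":          1 / 1.7,  # ~0.588
    "Performance":       1 / 2.0,  # 0.500
    "Ultra Performance": 1 / 3.0,  # ~0.333
}

def render_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

for name in PRESETS:
    w, h = render_resolution(3840, 2160, name)
    print(f"{name:>17}: {w}x{h}")
# Quality at 4K works out to 2560x1440, i.e. the GPU is doing 1440p work.
```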
Posted on Reply
#61
kapone32
fevgatosI'm taking my information from this



Your card at 4K with FSR Quality will get lower fps than the above, so below 35. Again, if you are happy with that, I'm happy; we are both happy. The above is of course without path tracing; with PT you are looking at single digits or something.
Once again you do not realize that I have my own information
Posted on Reply
#62
fevgatos
kapone32Once again you do not realize that I have my own information
Your "information" tells me nothing about how the game runs with RT. If that's the information you got, yeah, okay buddy, im sold.
Posted on Reply
#63
AnotherReader
Dr. DroI mean, playable is quite a variable standard... I have a rough estimate of what the 7900 XTX can do with RT (not that it's relevant to Starfield - it doesn't support RT anyway; there's some engine code regarding support, but it's entirely non-functional, and the toggle in the ini does nothing), since I had the RTX 3090 and they have roughly the same RT performance (a bit tilted towards the 3090 in this respect, but the 7900 XTX's superior raster power more or less levels it out). Speaking for myself and myself only, I don't think it's adequate for 4K ray tracing. AMD hasn't achieved this yet. But then again, if I may be entirely honest, neither the 3090 Ti nor the 4080 is adequate either: that leaves you with the 4090, and that's... stretching.

I think we'll need the 5090 (and, unless AMD catches up to Blackwell with RDNA 4, the 9900 XTX/RDNA 5 flagship) before we can start really talking about ray- or path-traced 4K gaming at >60 fps without resorting to upscalers or frame generation.
Looking at the performance in Cyberpunk's Overdrive mode, it's rather unlikely that even the 5090 will be fast enough for path tracing at 4K without DLSS.
Posted on Reply
#65
Dr. Dro
AnotherReaderLooking at the performance in Cyberpunk's Overdrive mode, it's rather unlikely that even the 5090 will be fast enough for path tracing at 4K without DLSS.
Yeah, I believe it. It seems like really hardcore stuff, still a few gens out of reach, unless they manage another +100% over the 4090... and worst of all, I may very well have to sell my 4080 and buy a 5080... I won't be buying anything in the 90 segment until prices come down a bit, if ever. I hold a faint hope that Blackwell will be like Ampere, i.e. a refinement pass for less money, the way Turing and Ada brought the tech to the table (for a cost). If Nvidia keeps that cadence, it might just work, I dunno.
Posted on Reply
#66
AnotherReader
Dr. DroYeah, I believe it. It seems like really hardcore stuff, still a few gens out of reach, unless they manage another +100% over the 4090... and worst of all, I may very well have to sell my 4080 and buy a 5080... I won't be buying anything in the 90 segment until prices come down a bit, if ever. I hold a faint hope that Blackwell will be like Ampere, i.e. a refinement pass for less money, the way Turing and Ada brought the tech to the table (for a cost). If Nvidia keeps that cadence, it might just work, I dunno.
The 3090-to-4090 jump was due to the disparity between Samsung's 8 nm node and TSMC's 4N. Such a jump with N3 is rather unlikely. We have already seen Apple deliver ho-hum gains with their A17 Pro SoC, and Apple's design chops are second to none. SRAM scaling finally died with N3 (backside power delivery should restore it in future nodes), which means the SRAM in these chips won't shrink at all from the previous designs, so the amount of logic that can be packed in compared to N4 won't increase as much as one might expect.
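The SRAM point becomes concrete with a little area arithmetic: if SRAM stops shrinking while logic still does, the whole-die shrink is far smaller than the logic shrink alone. A toy model in Python; the 60/40 logic/SRAM split and the 0.58x logic factor are made-up illustrative numbers, not measured data for any real chip:

```python
# Toy model of die-area scaling when SRAM no longer shrinks (as with N3).
logic_fraction = 0.60  # share of die area that is logic (assumption)
sram_fraction = 0.40   # share that is SRAM and analog (assumption)
logic_scale = 0.58     # per-area shrink for logic on the new node (assumption)
sram_scale = 1.00      # SRAM cells barely shrink on N3

die_scale = logic_fraction * logic_scale + sram_fraction * sram_scale
print(f"Full-die area scale: {die_scale:.2f}x of the old area")  # ~0.75x
print(f"Density gain: {1 / die_scale:.2f}x overall vs {1 / logic_scale:.2f}x for logic alone")
```

With those assumptions the die only shrinks to about 75% of its old area even though the logic itself shrinks to 58%, which is why transistor budgets grow less than the headline node numbers suggest.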
Posted on Reply
#67
las
kapone32Is the 7900 series not just as fast as a 3090 in RT? Then you have to realize that only 2 Nvidia SKUs are OK for RT. You can also read and learn that information. Then you seem to forget that the 7900XT costs half of what a 4090 does where I live. Would I spend another $1200 to get features that are in less than 1% of games, and not in the games I focus on?

I guess you have used ReLive and seen that Twitch and YouTube are fully supported in it, and that it now has AV1 encoding and decoding, but I guess I don't have the same access that you have.

AMD prices are lower because they don't fully agree with Nvidia and its price structure. I can promise you that if the 7900XT was $500, you would not be able to buy one. Another thing you don't seem to understand is that the 6700XT has been the best-value GPU since last year.

When I look at AMD software and compare it to Nvidia's, I certainly do not agree with the assessment that AMD users only use native because they lack features.

Yep, that is why they were so desperate to buy ARM. The fact that AMD AND Intel have CPUs and GPUs on the x86 license is not a good look long term for Nvidia.

DLAA, DLSS... and do you know that I turn off AA? It is my PC, after all, but to say it looks worse is just wrong. You are talking about DLSS, not RT.

Yep, Bethesda was quick to embrace DLSS when they were creating Starfield. Modern Warfare is not faster on AMD cards. 4K is not achievable on 7900 cards and AMD drivers are garbage.

The only game in my library with XeSS support is Redout 2. Do you know what that game is?

The resale value of AMD GPUs is lower because the retail price is lower. I can promise you that plenty of people made profits selling AMD cards during the mining boom. You know what I see in that? Do you know which CPU company was in open source from the beginning? I look (again) at the 6500XT that AMD made so that users with less money than us can enjoy PC gaming. The fact remains, though, that Nvidia's propaganda campaign is strong.

I have Humble Choice and there has not been a game that my card can't run through, so try again.

In the end you are trying to establish that AMD GPUs are not good by stating that they don't have Nvidia-specific features. When I watched the State of Play yesterday, not once did I think about DLSS or DLAA while watching trailers for Spiderman 2, Pandora and a few others that will be made on AMD hardware. I also guess Microsoft is not one of the biggest publishers in the space right now, but you can go on with your opinion.
No lol. AMD's 7900XTX doesn't even beat the 4070 Ti in RT. Nvidia has tons of SKUs that are usable for RT, and DLSS can easily make most of their cards do proper RT. The 4090 does 4K60 RT with no upscaling (read the 4090 TPU review for proof). I don't really care much about RT though. I care about DLAA, DLSS, DLDSR and Reflex. AMD has no features that even come close. They don't even have an answer for many Nvidia features at all.

Tons of people are having issues with ReLive on Twitch; simply Google it. ShadowPlay is higher quality and uses fewer (minimal) resources, with much better integration and support. This is why 99% of streamers use Nvidia.

If AMD sold the 7900XT for 500 dollars, they might as well stop selling GPUs, because they would make zero money from it.

Nvidia doesn't care about the x86 market at all. Haha... Nvidia wanted ARM because it's better for them; they already make ARM chips. x86 is pointless for Nvidia.

You turn AA off and have terrible image quality with jaggies all over; good for you. Even at 4K, no anti-aliasing sucks. Some of us want good quality with no shimmering or jaggies, and DLAA can do just that. No other AA solution comes close, and AMD users are forced to use the worse AA solutions.

AMD is cheaper because they would have no business if they priced their GPUs on par with or higher than Nvidia. They would sell zero cards. AMD is the cheaper choice, and it is cheaper for a reason. Even with lower prices, they are struggling with market share. Nvidia sits at 80% in the Steam hardware survey. AMD's market share went down over the last few years, not up.
Posted on Reply
#68
AnotherReader
lasNvidia doesn't care about the x86 market at all. Haha... Nvidia wanted ARM because it's better for them; they already make ARM chips. x86 is pointless for Nvidia.
That's just sour grapes. Nvidia only cares about ARM because they couldn't get their hands on x86. Project Denver started out as an x86 processor, like Transmeta's Crusoe. Nvidia even considered being acquired by AMD. As Zen4C has shown, at the high end, x86 vs. ARM doesn't matter: in addition to high performance, it is simultaneously power efficient and area efficient.
Posted on Reply
#69
las
AnotherReaderThat's just sour grapes. Nvidia only cares about ARM because they couldn't get their hands on x86. Project Denver started out as an x86 processor, like Transmeta's Crusoe. Nvidia even considered being acquired by AMD. As Zen4C has shown, at the high end, x86 vs. ARM doesn't matter: in addition to high performance, it is simultaneously power efficient and area efficient.
ARM is gaining more and more market share in the enterprise segment. Besides, Nvidia is fully focused on GPUs; you know, the AI boom.
Posted on Reply
#70
Why_Me
lasNo lol. AMD's 7900XTX doesn't even beat the 4070 Ti in RT. Nvidia has tons of SKUs that are usable for RT, and DLSS can easily make most of their cards do proper RT. The 4090 does 4K60 RT with no upscaling (read the 4090 TPU review for proof). I don't really care much about RT though. I care about DLAA, DLSS, DLDSR and Reflex. AMD has no features that even come close. They don't even have an answer for many Nvidia features at all.

Tons of people are having issues with ReLive on Twitch; simply Google it. ShadowPlay is higher quality and uses fewer (minimal) resources, with much better integration and support. This is why 99% of streamers use Nvidia.

If AMD sold the 7900XT for 500 dollars, they might as well stop selling GPUs, because they would make zero money from it.

Nvidia doesn't care about the x86 market at all. Haha... Nvidia wanted ARM because it's better for them; they already make ARM chips. x86 is pointless for Nvidia.

You turn AA off and have terrible image quality with jaggies all over; good for you. Even at 4K, no anti-aliasing sucks. Some of us want good quality with no shimmering or jaggies, and DLAA can do just that. No other AA solution comes close, and AMD users are forced to use the worse AA solutions.

AMD is cheaper because they would have no business if they priced their GPUs on par with or higher than Nvidia. They would sell zero cards. AMD is the cheaper choice, and it is cheaper for a reason. Even with lower prices, they are struggling with market share. Nvidia sits at 80% in the Steam hardware survey. AMD's market share went down over the last few years, not up.
las blowing up this thread with truth bombs. AMD cultists won't like this ^^
Posted on Reply
#71
vacsati
srekal34Well, you have a point. Both upscalers should be supported from the get-go. I guess there is less drama over missing FSR because of its low market share.
Bad example, since BG3 doesn't even need upscaling; the game engine is not really taxing on hardware. So it's just normal that no one cares about the upscaling options in BG3...
AnotherReaderThat's just sour grapes. Nvidia only cares about ARM because they couldn't get their hands on x86. Project Denver started out as an x86 processor, like Transmeta's Crusoe. Nvidia even considered being acquired by AMD. As Zen4C has shown, at the high end, x86 vs. ARM doesn't matter: in addition to high performance, it is simultaneously power efficient and area efficient.
Seems like you are living in the past...
Posted on Reply
#72
AnotherReader
vacsatiBad example, since BG3 doesn't even need upscaling; the game engine is not really taxing on hardware. So it's just normal that no one cares about the upscaling options in BG3...


Seems like you are living in the past...
The past informs the present. I'm only pointing out that while Nvidia is very successful now, there was a time when they wanted to make x86 CPUs.
Posted on Reply