
Starfield to Finally Get DLSS Support

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Well, Digital Foundry is fairly frequently sponsored by Nvidia, so you inherently have to assume they are not an unbiased source of information on GPUs. The fact that they still review GPU products despite this tells me they are not very ethical; you cannot review products when you are being paid by competitors of those products.
This is not really bias, more like reality. Most if not all other reviewers come up with the same results. Even TechSpot, which certainly doesn't talk AMD down, showed DLSS being superior to FSR. If DF is Nvidia-biased, TechSpot is AMD-biased. It's more about actual experience with the different cards. I have tried a lot of different Nvidia and AMD cards, and I know for sure that AMD doesn't match Nvidia in terms of features and drivers when you don't cherry-pick a few games but look at the overall picture across multiple titles, especially when you don't solely look at the most popular games, which reviewers tend to use.

Also, RTX can do so much more than simple rasterization that you have a hard time settling for AMD after using an RTX GPU. Not a single AMD feature is on par with Nvidia's, and AMD doesn't even have a counter for many RTX features (DLAA, DLDSR, DLSS 3 + FG + 3.5, to name a few).

Even if the 7900 XTX had matched the 4090 in raster, I would have picked the 4090 anyway, mostly because of DLSS, DLAA, DLDSR and Reflex, plus much better RT performance. I use many RTX mods and features to improve the experience of older games (DLDSR, RT mods etc.) - features like this can transform old games - and I can't wait for Half-Life 2 RTX. There are many other features present (with RTX or Nvidia in general), and every single one of them beats what AMD is offering. Pretty much all streamers use Nvidia because of ShadowPlay and the native integration on Twitch and most big streaming platforms. ReLive is not really close: it has a bigger performance hit, less native support on platforms, and way more issues.

This is why AMD has lower prices. Lack of features. Worse features. Worse drivers and support, especially when you leave the most popular games. AMD spends most of its time and money optimizing for games that actively get benchmarked, so its GPUs look good in reviews. AMD's performance in early-access games, betas or just less popular titles is not on par with Nvidia 9 out of 10 times. Most developers use Nvidia and optimize for Nvidia, because 80% of the PC gaming segment uses Nvidia. Nvidia has tons of money for inventing features, improving features and perfecting the experience in games (driver optimization). AMD has much smaller R&D funds and a smaller software department in general.

Also, AMD's main business is not GPUs but CPUs and APUs. Nvidia's is GPUs: they are the industry leader in gaming GPUs, enterprise GPUs and AI GPUs.

Gaming GPUs are not really profitable for AMD. They earn more per wafer by selling CPUs and APUs (both OEM and consoles), and all their chips use the same TSMC lines, meaning it makes more sense for AMD to just make CPUs and APUs. AMD decides what chips to put out, and CPUs are more profitable: more chips per wafer equals more money for AMD.
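The chips-per-wafer point can be made concrete with a rough sketch. The dies-per-wafer formula below is a standard textbook approximation, and the die sizes are illustrative assumptions, not AMD's actual figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross dies per wafer via the classic approximation:
    wafer area / die area, minus an edge-loss term along the circumference.
    No defect or yield modeling."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Illustrative sizes only: a small CPU chiplet (~70 mm^2)
# vs a mid-size GPU die (~300 mm^2), both on a 300 mm wafer.
cpu_dies = dies_per_wafer(70)
gpu_dies = dies_per_wafer(300)
print(cpu_dies, gpu_dies)  # the small die yields several times more chips per wafer
```

Smaller dies also tend to yield better, which skews the economics even further toward small CPU chiplets; this sketch ignores defects entirely.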

I know it's hard to accept the fact that AMD lacks features, but it does. This is why every AMD GPU user hates words like upscaling, RT and downsampling: because their cards can't really do them properly. They are stuck with "native" and good old raster, which is why they praise native all the time and talk RTX features down - because they can't use them = denial.

Yet native can easily be improved upon and beaten with features like DLAA, or even DLSS on the higher presets if you want some performance on top as well. DLAA beats native and every other AA solution every single time when it comes to visuals, and DLAA is a preset of DLSS now. DLSS doesn't just mean upscaling (with built-in AA); it also includes the best AA method today, DLAA.
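For context on what those presets mean, here is a sketch of the internal render resolutions commonly reported for the DLSS quality modes (the scale factors are the widely published approximate values; DLAA is the native-resolution case of the same pipeline):

```python
# Widely reported DLSS input-resolution scale factors per quality preset.
# Treat the exact numbers as approximate; DLAA renders at native (1.0x).
DLSS_SCALES = {
    "DLAA": 1.0,
    "Quality": 1 / 1.5,        # ~0.667 per axis
    "Balanced": 1 / 1.72,      # ~0.58 per axis
    "Performance": 1 / 2.0,    # 0.5 per axis
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before the upscale to output size."""
    s = DLSS_SCALES[preset]
    return round(width * s), round(height * s)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So at a 4K output, the "Quality" preset renders internally at 1440p before reconstruction, which is where the performance headroom comes from.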

Even developers embrace upscaling, downsampling, sharpening filters and next-gen anti-aliasing. Like I said several times now, FSR2 is enabled by default in Starfield. Upscaling is enabled by default in Remnant 2, and the developers officially stated the game was designed with DLSS/FSR/XeSS image upscaling in mind.

Native is not really better these days, especially not if you use DLAA or DLDSR. Only on the lower presets will DLSS make visuals worse, but performance skyrockets - that trade-off is up to the user. AMD can't match these features at all. This is what you pay extra for when you buy Nvidia. And the resale value of AMD hardware is lower as well, because demand is lower and AMD cuts prices several times through a generation.
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
I would argue it has, tbh. When it goes from a nicety to something you always turn on for playable performance, then yeah, it's too useful. And it's not like the games coming out recently are anything to write home about graphics-wise: Starfield has its physics, but that's it, and every other game just looks bland and bad. Perhaps if it were a jaw-dropping experience then I could - could - excuse it, but not only are we getting crippling optimization due to laziness, games also look the same as before.

So yeah, it has become too useful. As someone who likes to play at native res, I am sad.

I've been playing Dune: Spice Wars. It has no DLSS, and I wouldn't think it would need it either; it's an indie RTS/strategy mix, and the graphics are what you'd expect from an indie RTS.

I then hit my hotkey for GPU stats, and my 3080 needs to run at about 50% utilisation at max turbo clocks for 60 fps o_O. So although the game runs fine, that's very heavy utilisation for what it's actually displaying; modern game development just seems to have everything requiring so many resources. This is even the case if I have the game paused on a menu or something.
 
Joined
Jun 2, 2017
Messages
9,373 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
This is not really bias. More like reality. [...] I know it's hard to accept the fact that AMD lacks features, but they do. [...] Native is not really better these days, especially not if you use DLAA or DLDSR. [...] This is what you pay extra for, when you buy Nvidia. And resell value of AMD hardware is lower as well.
Is the 7900 series not just as fast as a 3090 in RT? Then you have to realize that only two Nvidia SKUs are OK for RT. You can read and learn that information too. You also seem to forget that the 7900 XT is half the cost of a 4090 where I live. Would I spend another $1,200 to get features that are in less than 1% of games, and not in the games I focus on?

I guess you have used ReLive and seen that Twitch and YouTube are fully supported in it, and that it now has AV1 encoding and decoding - but I guess I don't have the same access that you have.

AMD's prices are lower because they don't fully agree with Nvidia and its price structure. I can promise you that if the 7900 XT were $500, you would not be able to buy one. Another thing you don't seem to understand is that the 6700 XT has been the best-value GPU since last year.

When I look at AMD's software and compare it to Nvidia's, I certainly do not agree with the assessment that AMD users only use native because they lack features.


Yep, that is why they were so desperate to buy Arm. The fact that AMD and Intel both have CPUs and GPUs on the x86 license is not a good look long-term for Nvidia.

DLAA, DLSS - and do you know that I turn off AA? It is my PC, after all, but to say it looks worse is just wrong. You are talking about DLSS, not RT.

Yep, Bethesda was quick to embrace DLSS when they were creating Starfield. Modern Warfare is not faster on AMD cards, 4K is not achievable on 7900 cards, and AMD drivers are garbage.

The only game in my library with XeSS support is Redout 2. Do you know what that game is?

The resale value of AMD GPUs is lower because the retail price is lower. I can promise you that plenty of people made profits selling AMD cards during the mining boom. You know what I see in that? Do you know which CPU company was into open source from the beginning? I look (again) at the 6500 XT, which AMD made so that users with less money than us can enjoy PC gaming. The fact remains, though, that Nvidia's propaganda campaign is strong.

I have Humble Choice, and there has not been a game there that my card can't run, so try again.

In the end you are trying to establish that AMD GPUs are not good by stating that they don't have Nvidia-specific features, yet when I watched the State of Play yesterday, not once did I think about DLSS or DLAA while watching trailers for Spider-Man 2, Pandora and a few others that will be made on AMD hardware. I also guess that Microsoft is not one of the biggest publishers in the space right now, but you can go on with your opinion.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Is the 7900 series not just as fast as a 3090 in RT? Then you have to realize that only two Nvidia SKUs are OK for RT.
That's not how it works. At all. Yes, you need a 4080 or a 4090 to play 4K RT (and usually with DLSS on top of that), but you don't need a 4080 or a 4090 to play at 1440p. Cards like the 4070 that are targeted at 1440p can play RT just fine. That is not the case with the 7900 XTX, since it's targeted at 4K but can't do RT at that resolution.
 
Joined
Jun 2, 2017
Messages
9,373 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
That's not how it works. At all. Yes, you need a 4080 or a 4090 to play 4K RT (and usually with DLSS on top of that), but you don't need a 4080 or a 4090 to play at 1440p. Cards like the 4070 that are targeted at 1440p can play RT just fine. That is not the case with the 7900 XTX, since it's targeted at 4K but can't do RT at that resolution.
So I guess the TPU review of the 7900 cards was just false. I guess my experience has also been false for the last 8 months. You are one of the people who will move goalposts to support your argument that AMD sucks and Nvidia is the only option. I can't do RT at 4K? Are you sure about that? Don't forget that my monitor's FreeSync range is 45-144 Hz, meaning that even if I am getting 60 FPS at 4K it would still be butter smooth. I am really tired of people who do not own the card I have telling me what it cannot do.
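For what it's worth, the VRR claim holds on paper: 60 fps sits inside a 45-144 Hz window, and because the range is wider than 2:1, low framerate compensation (LFC) can also cover dips below 45 fps. A simplified sketch of the arithmetic (real drivers handle LFC adaptively; this only models the ranges):

```python
def vrr_smooth(fps: float, vrr_min: float = 45.0, vrr_max: float = 144.0) -> bool:
    """True if fps is covered by VRR directly, or via LFC frame multiplication."""
    if vrr_min <= fps <= vrr_max:
        return True  # inside the native sync window, no tearing or judder
    # LFC: the driver repeats frames, but only works if max >= 2 * min
    if fps < vrr_min and vrr_max >= 2 * vrr_min:
        return any(vrr_min <= fps * m <= vrr_max for m in (2, 3, 4))
    return False

print(vrr_smooth(60))   # True: 60 fps sits inside the 45-144 Hz window
print(vrr_smooth(30))   # True via LFC: frames doubled to an effective 60 Hz
```

Above the window (e.g. 150+ fps) the display falls back to fixed refresh with v-sync or tearing, which this sketch reports as False.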
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
So I guess the TPU review of the 7900 cards was just false. I guess my experience has also been false for the last 8 months. You are one of the people who will move goalposts to support your argument that AMD sucks and Nvidia is the only option. I can't do RT at 4K? Are you sure about that? Don't forget that my monitor's FreeSync range is 45-144 Hz, meaning that even if I am getting 60 FPS at 4K it would still be butter smooth. I am really tired of people who do not own the card I have telling me what it cannot do.
That is not the point at all. I'm saying that just because someone says the 7900 series is not good enough for RT, that doesn't mean only the 4080 and 4090 can play RT. It means the 7900 cannot play games with RT at its intended use case. A 4070 is a 1440p card, so you are not going to buy that one if you have a 4K monitor.

And yes, you are correct, the 7900 series cannot do RT. Are you counting Far Cry 6 as an RT game?
 
Joined
Jun 2, 2017
Messages
9,373 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
And yes, you are correct, the 7900 series cannot do RT. Are you counting Far Cry 6 as an RT game?
This has to be the most uneducated response I have ever seen. Once again you are telling me what my own card cannot do, when I play CP2077 at 4K and sometimes use ray tracing. Then you are purporting that the only cards that can do 4K are the 4090 and 4080. I am not going to bother showing you any screenshots; I have done that before and you did not even look at them, but it's OK. You and a few other users on here are known as Nvidia fanatics. Far Cry 6? I don't even play Far Cry. Please keep in mind that even though gaming looks nice, it is all about math. Before you go on waxing about RT, you should learn what Trip Hawkins at EA did more than 30 years ago. Using that principle, a 57-billion-transistor GPU would be beneficial, then add another 16 billion transistors for the CPU, and you will understand that 1s and 0s move super fast with that many gates opening and closing.
 
Joined
Dec 25, 2020
Messages
7,018 (4.81/day)
Location
SĂŁo Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Thanks to the fine work of brilliant folks like LukeFZ and Kaldaien, I've been enjoying the hell out of Starfield with Special K and DLSS 3.5 Frame Generation for the last 15 days... I'm devouring this game. I love it so much that it single-handedly reignited my love for video games.


Buttery smooth 144 fps even with the settings all on ultra and full-resolution DLSS (DLAA, preset F). Pure bliss, just like this fine lady right here. I just started Playthrough 3 (NG++), currently at level 54. I'm going to keep replaying this one for a few years to come, just like I always revisit Oblivion. I hope Bethesda's official implementation is great; the community has done well for itself so far.

[screenshot]


RTX can do so much more than simple rasterization that you have a hard time settling for AMD after using an RTX GPU. Not a single AMD feature is on par with Nvidia's, and AMD doesn't even have a counter for many RTX features (DLAA, DLDSR, DLSS 3 + FG + 3.5, to name a few).

Yup! I must agree. The things GeForce RTX cards can do are like arcane magic; it's really impressive. I'm hopeful that once AMD's Hypr-RX thing matures they will have a similar suite, but it will take some time.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Then you are purporting that the only cards that can do 4K are the 4090 and 4080.
4K, yes. That's what I'm saying.

But if you are happy, I'm happy. If your card can do 4K RT in Cyberpunk, that's great - enjoy.

You and a few other users on here are known as Nvidia fanatics.
Irrelevant. I can claim you are known as an AMD fanatic. It doesn't matter; it's not an argument.
 
Joined
Jun 2, 2017
Messages
9,373 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
4K, yes. That's what I'm saying.

But if you are happy, I'm happy. If your card can do 4K RT in Cyberpunk, that's great - enjoy.


Irrelevant. I can claim you are known as an AMD fanatic. It doesn't matter; it's not an argument.
Stop spreading misinformation and we will be Golden
 
Joined
Dec 25, 2020
Messages
7,018 (4.81/day)
Location
SĂŁo Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Stop spreading misinformation and we will be Golden

I mean, "playable" is quite a variable standard... I have a rough estimate of what the 7900 XTX can do with RT (not that it's relevant to Starfield, which doesn't support RT anyway - there is some engine code regarding support, but it's entirely non-functional, and the toggle in the ini does nothing). I had the RTX 3090, and the two have roughly the same RT performance (tilted a bit towards the 3090 in this respect, but the 7900 XTX's superior raster power more or less levels it out), and speaking for myself and myself only, I don't think that's adequate for 4K ray tracing. AMD hasn't achieved it yet. But then again, if I may be entirely honest, neither have the 3090 Ti nor the 4080; that leaves you with the 4090, and even that is stretching it.

I think we'll need the 5090, and unless AMD catches up to Blackwell with RDNA 4, the 9900 XTX/RDNA 5 flagship, before we can start really talking about ray- or path-traced 4K gaming at >60 fps without resorting to upscalers or frame generation.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Stop spreading misinformation and we will be Golden
I'm taking my information from this



Your card at 4K with FSR Quality will get lower fps than the above, so below 35. Again, if you are happy with that, I'm happy; we are both happy. The above is of course without path tracing; with PT you are looking at single digits or something.
 
Joined
Jun 2, 2017
Messages
9,373 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I'm taking my information from this



Your card at 4K with FSR Quality will get lower fps than the above, so below 35. Again, if you are happy with that, I'm happy; we are both happy. The above is of course without path tracing; with PT you are looking at single digits or something.
Once again, you do not realize that I have my own information:
[Screenshot: AMD Software: Adrenalin Edition, 2023-09-15]
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I mean, playable is quite a variable standard... [...] I think we'll need the 5090, and unless AMD catches up to Blackwell with RDNA 4, the 9900 XTX/RDNA 5 flagship for us to start really talking about ray or pathtraced 4K gaming at >60fps without resorting to upscalers or frame generation.
Looking at the performance in Cyberpunk's Overdrive mode, it's rather unlikely that even the 5090 will be fast enough for path tracing at 4K without DLSS.
 
Joined
Dec 25, 2020
Messages
7,018 (4.81/day)
Location
SĂŁo Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Looking at the performance in Cyberpunk's Overdrive mode, it's rather unlikely that even the 5090 will be fast enough for path tracing at 4K without DLSS.

Yeah, I believe it. It seems like really hardcore stuff, still a few generations out of reach, unless they manage another +100% over the 4090... and worst of all, I may very well have to sell my 4080 and buy a 5080; I won't be buying anything in the 90 segment until the prices pipe down a bit, if ever. I hold a faint hope that Blackwell will be like Ampere, a refinement pass for less money, whereas Turing and Ada brought the tech to the table (at a cost). If Nvidia keeps that cadence, it might just work, I dunno.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Yeah, I believe it. It seems like really hardcore stuff, still a few generations out of reach, unless they manage another +100% over the 4090... and worst of all, I may very well have to sell my 4080 and buy a 5080; I won't be buying anything in the 90 segment until the prices pipe down a bit, if ever. I hold a faint hope that Blackwell will be like Ampere, a refinement pass for less money, whereas Turing and Ada brought the tech to the table (at a cost). If Nvidia keeps that cadence, it might just work, I dunno.
The 3090 to 4090 jump was due to the disparity between Samsung's 8 nm node and TSMC's N4. Such a jump with N3 is rather unlikely. We have already seen Apple deliver ho-hum gains with their A17 Pro SoC, and Apple's design chops are second to none. SRAM scaling finally died with N3, though backside power delivery should restore it in future nodes. For now that means the SRAM in these chips won't shrink at all from previous designs, so the amount of logic that can be packed in compared to N4 won't increase as much as one might expect.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Is the 7900 series not just as fast as a 3090 in RT? Then you have to realize that only two Nvidia SKUs are OK for RT. You can also read and learn that information. Then you seem to forget that the 7900 XT is half the cost of a 4090 where I live. Would I spend another $1200 to get features that are in less than 1% of games, and not in the games I focus on?

I guess you have used Relive and seen that Twitch and YouTube are fully supported in it, and that it now has AV1 encoding and decoding, but I guess I don't have the same access that you have.

AMD's prices are lower because they don't fully agree with Nvidia and their price structure. I can promise you that if the 7900 XT was $500 you would not be able to buy one. Another thing you don't seem to understand is that the 6700 XT has been the best-value GPU since last year.

When I look at AMD software and compare it to Nvidia's, I certainly do not agree with the assessment that AMD users only use native because they lack features.


Yep, that is why they were so desperate to buy ARM. The fact that AMD and Intel have CPUs and GPUs on the x86 license is not a good look long term for Nvidia.

DLAA, DLSS, and do you know that I turn off AA? It is my PC after all, but to say it looks worse is just wrong. You are talking about DLSS, not RT.

Yep, Bethesda was quick to embrace DLSS when they were creating Starfield. Modern Warfare is not faster on AMD cards, 4K is not achievable on 7900 cards, and AMD drivers are garbage.

The only game in my library with XeSS support is Redout 2. Do you know what that game is?

The resale value of AMD GPUs is lower because the retail price is lower. I can promise you that plenty of people made profits selling AMD cards during the mining boom. You know what I see in that? Do you know which CPU company was in open source from the beginning? I look (again) at the 6500 XT that AMD made so that users with less money than us can enjoy PC gaming. The fact remains, though, that Nvidia's propaganda campaign is strong.

I have Humble Choice and there has not been a game that my card can't run, so try again.

In the end you are trying to establish that AMD GPUs are not good by stating that they don't have Nvidia-specific features, because when I watched the State of Play yesterday, not once did I think about DLSS or DLAA while watching trailers for Spider-Man 2, Pandora and a few others that will be made on AMD hardware. I also guess Microsoft is not one of the biggest publishers in the space right now, but you can go on with your opinion.
No lol. AMD's 7900 XTX doesn't even beat the 4070 Ti in RT. Nvidia has tons of SKUs that are usable for RT, and DLSS easily makes most of their cards capable of proper RT. The 4090 does 4K60 RT with no upscaling (read the 4090 TPU review for proof). I don't really care much about RT tho. I care about DLAA, DLSS, DLDSR and Reflex. AMD has no features that even come close. They don't even have an answer for many Nvidia features at all.

Tons of people are having issues with Relive on Twitch. Simply Google it. Shadowplay is higher quality and uses fewer (minimal) resources with much better integration and support. This is why 99% of streamers use Nvidia.

If AMD sold the 7900 XT for 500 dollars they might as well stop selling GPUs, because they would make zero money from it.

Nvidia doesn't care about the x86 market at all. Haha.. Nvidia wanted ARM because it's better for them; they already make ARM chips. x86 is pointless for Nvidia.

You turn AA off and have terrible image quality with jaggies all over, good for you. Even at 4K, no anti-aliasing sucks. Some of us want good quality with no shimmering or jaggies, and DLAA can do just that. No other AA solution comes close, and AMD users are forced to use worse AA solutions.

AMD is cheaper because they would have no business if they priced their GPUs on par with or higher than Nvidia. They would sell zero cards. AMD is the cheaper choice, and it is cheaper for a reason. Even with lower prices, they are struggling with market share. Nvidia sits at 80% in the Steam hardware survey. AMD's market share went down over the last few years, not up.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Nvidia doesn't care about the x86 market at all. Haha.. Nvidia wanted ARM because it's better for them; they already make ARM chips. x86 is pointless for Nvidia.
That's just sour grapes. Nvidia only cares about ARM because they couldn't get their hands on x86. Project Denver started out as an x86 processor like Transmeta's Crusoe. Nvidia even considered being acquired by AMD. As Zen 4c has shown, at the high end x86 vs ARM doesn't matter. In addition to high performance, it is simultaneously power efficient and area efficient.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
That's just sour grapes. Nvidia only cares about ARM because they couldn't get their hands on x86. Project Denver started out as an x86 processor like Transmeta's Crusoe. Nvidia even considered being acquired by AMD. As Zen 4c has shown, at the high end x86 vs ARM doesn't matter. In addition to high performance, it is simultaneously power efficient and area efficient.
ARM is gaining more and more market share in the enterprise space. Besides, Nvidia is fully focused on GPUs now, you know, the AI boom.
 
Joined
Jan 29, 2021
Messages
1,876 (1.32/day)
Location
Alaska USA
No lol. AMD's 7900 XTX doesn't even beat the 4070 Ti in RT. Nvidia has tons of SKUs that are usable for RT, and DLSS easily makes most of their cards capable of proper RT. The 4090 does 4K60 RT with no upscaling (read the 4090 TPU review for proof). I don't really care much about RT tho. I care about DLAA, DLSS, DLDSR and Reflex. AMD has no features that even come close. They don't even have an answer for many Nvidia features at all.

Tons of people are having issues with Relive on Twitch. Simply Google it. Shadowplay is higher quality and uses fewer (minimal) resources with much better integration and support. This is why 99% of streamers use Nvidia.

If AMD sold the 7900 XT for 500 dollars they might as well stop selling GPUs, because they would make zero money from it.

Nvidia doesn't care about the x86 market at all. Haha.. Nvidia wanted ARM because it's better for them; they already make ARM chips. x86 is pointless for Nvidia.

You turn AA off and have terrible image quality with jaggies all over, good for you. Even at 4K, no anti-aliasing sucks. Some of us want good quality with no shimmering or jaggies, and DLAA can do just that. No other AA solution comes close, and AMD users are forced to use worse AA solutions.

AMD is cheaper because they would have no business if they priced their GPUs on par with or higher than Nvidia. They would sell zero cards. AMD is the cheaper choice, and it is cheaper for a reason. Even with lower prices, they are struggling with market share. Nvidia sits at 80% in the Steam hardware survey. AMD's market share went down over the last few years, not up.
las is blowing up this thread with truth bombs. AMD cultists won't like this ^^
 
  • Haha
Reactions: las
Joined
Mar 10, 2023
Messages
39 (0.06/day)
Well, you have a point. Both upscalers should be supported from the get-go. I guess there is less drama over missing FSR because of its low market share.
Bad example, since BG3 doesn't even need upscaling; the game engine is not really taxing on hardware. So it's just normal that no one cares about the upscaling options in BG3...

That's just sour grapes. Nvidia only cares about ARM because they couldn't get their hands on x86. Project Denver started out as an x86 processor like Transmeta's Crusoe. Nvidia even considered being acquired by AMD. As Zen 4c has shown, at the high end x86 vs ARM doesn't matter. In addition to high performance, it is simultaneously power efficient and area efficient.
Seems like you are living in the past...
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Bad example, since BG3 doesn't even need upscaling; the game engine is not really taxing on hardware. So it's just normal that no one cares about the upscaling options in BG3...


Seems like you are living in the past...
The past informs the present. I'm only pointing out that while Nvidia is very successful now, there was a time when they wanted to make x86 CPUs.
 