
NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Sure, but market prices are dictated by sales. If nobody buys AMD GPUs, of course prices will drop. That's not the point. If you want to compare greediness between companies, you have to compare MSRPs.
I'm not comparing greediness. I just offered an explanation of the current market conditions. Like I said, Nvidia is oftentimes 25% more expensive than the equivalent Radeon, but people still buy it because it's green.

That is just not true. 10 years ago, Radeon had over 40% of the market share. That's because they were making good cards. Now they aren't.

How is Ryzen a highly esteemed name? It's a much newer brand than Radeon. Obviously, it's because they made good products. Radeon stopped doing that many, many years ago. Thinking that Nvidia sells because of brand name is just full copium.

Starting from the Pascal era (2016), AMD has been failing nonstop: too late to market, not competitive with Nvidia's high-end cards, exorbitant prices, very limited availability. They are not selling because they are failing on multiple fronts.
That's true up to RDNA 2. Both 6000 and 7000 series cards are good products at reasonable prices and with wide availability. The majority still wants Nvidia because "DLSS is soooo much better at making my game look like crap, muh! Hardware Unboxed said so with their pixel-by-pixel analysis that no gamer in their right mind ever does at home, but still!"

Ryzen is much newer than Radeon, but it doesn't get half as much crap from the internet as Radeon does, especially with things like X3D.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I'm not comparing greediness. I just offered an explanation of the current market conditions. Like I said, Nvidia is oftentimes 25% more expensive than the equivalent Radeon, but people still buy it because it's green.


That's true up to RDNA 2. Both 6000 and 7000 series cards are good products at reasonable prices and with wide availability. The majority still wants Nvidia because "DLSS is soooo much better at making my game look like crap, muh! Hardware Unboxed said so with their pixel-by-pixel analysis that no gamer in their right mind ever does at home, but still!"
It was definitely not true for RDNA 2, at least not in Europe. And I'm talking about big EU retailers like Mindfactory, Alternate, etc. You couldn't even find LISTINGS in EU shops for RDNA 2 for months after launch. The first cards in stock started showing up around February 2021, and that's when, of course, mining took off.

And just because you don't like DLSS for whatever reason, even though it's straight up better than native, why do you expect other people to feel the same?
 
Joined
Aug 13, 2010
Messages
5,479 (1.04/day)
BTW, who called it, eh?
Everyone who witnessed the RTX 40 series cards' initial launch and saw that many of them are made of slightly cut-down versions of their own full-core configs.
This road has been paved from the very start, really. With no competition, NVIDIA is used to making two product generations out of one architecture.
 
Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It was definitely not true for RDNA 2, at least not in Europe. And I'm talking about big EU retailers like Mindfactory, Alternate, etc. You couldn't even find LISTINGS in EU shops for RDNA 2 for months after launch. The first cards in stock started showing up around February 2021, and that's when, of course, mining took off.
Okay, fair enough. I'll give you that one. It doesn't explain current conditions, though.

it's straight up better than native
Riiiight... :roll:
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,116 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX 3.0)
Software W10
much higher RT performance

That's actually a fallacy I called out in the reviews for the 4xxx series. The RT performance hit on Ada is practically the same as on Ampere. Only the 4090 appears to pull clear. You need to look at each game, as the results vary, but the drop in performance on a 3080 with RT enabled is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack up the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same moving from Ampere to Ada. RT fps only improves because the base (rasterisation) performance increases, but the actual hit is the same.
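To put some rough arithmetic behind that point (the figures below are made up purely for illustration, not measured results), the relative hit can stay identical across generations while RT fps still rises, simply because the raster baseline rises:

```python
# Hypothetical illustration: the relative RT "hit" stays constant across
# generations, yet RT fps still improves because the raster baseline improves.
# All figures are made up for the example.

def fps_with_rt(raster_fps: float, rt_hit: float) -> float:
    """Frame rate with RT enabled, given a raster baseline and a relative hit."""
    return raster_fps * (1.0 - rt_hit)

older_card = fps_with_rt(raster_fps=80.0, rt_hit=0.50)   # 40 fps with RT
newer_card = fps_with_rt(raster_fps=120.0, rt_hit=0.50)  # 60 fps with RT

print(older_card, newer_card)  # 40.0 60.0 - same 50% hit, higher RT fps
```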
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
That's actually a fallacy I called out in the reviews for the 4xxx series. The RT performance hit on Ada is practically the same as on Ampere. Only the 4090 appears to pull clear. You need to look at each game, as the results vary, but the drop in performance on a 3080 with RT enabled is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack up the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same moving from Ampere to Ada. RT fps only improves because the base (rasterisation) performance increases, but the actual hit is the same.
What difference does that make? I don't care about the % drop in performance, I care about the framerate in games worth playing with RT. And in those, sadly, AMD cards are just one or two generations behind. Someone might not care about RT, and that's fine, but those are the facts right now.
 
Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
That's actually a fallacy I called out in the reviews for the 4xxx series. The RT performance hit on Ada is practically the same as on Ampere. Only the 4090 appears to pull clear. You need to look at each game, as the results vary, but the drop in performance on a 3080 with RT enabled is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack up the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same moving from Ampere to Ada. RT fps only improves because the base (rasterisation) performance increases, but the actual hit is the same.
That's why I call Ada Ampere 2.0 and Ampere Turing 2.0. There has been zero change in the base architecture since 2018, and I fail to see what makes the new generation of RT cores actually new.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Okay, fair enough. I'll give you that one. It doesn't explain current conditions, though.


Riiiight... :roll:
Well, see, just because you disagree doesn't make it true. Of course, that works both ways. That's why I'm willing to offer you a blind test: I'll post a game at native 1440p and then the same at 4K with DLSS Q. Same internal resolution, same performance, and the DLSS one will look so much better you'd throw native into the trash where it belongs.
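For context on the "same internal resolution" claim, here is a quick sketch using the commonly cited DLSS per-axis scale factors (approximate values, treated as assumptions here), which is why 4K at DLSS Quality is usually equated with a native 1440p render:

```python
# Commonly cited per-axis render scales for the standard DLSS presets
# (approximate, taken as assumptions for this sketch).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

# 4K output with DLSS Quality renders roughly the same pixel count as native 1440p.
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```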
 

bug

Joined
May 22, 2015
Messages
13,842 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Call me silly, but in my mind, the company that has 10-15-20% market share is the one that should be trying to compete by lowering prices, not the one that has 80%. When Nvidia knows (and they know, because it has happened for the last 6-7 years now) that whatever they release, at whatever price, people will buy it BECAUSE AMD will offer an inferior product for a similar amount of money (okay, maybe with a 50€ discount), what is the incentive for Nvidia to lower its prices? This gen was extra fascinating, because some Nvidia cards, on top of having more and better features, lower power draw and much higher RT performance, even had better raster performance per dollar, which is absolutely absurd. Check the launch prices of the 4070 Ti vs the 7900 XT: the 4070 Ti had higher raster per $. But people feel the need to blame Nvidia even when AMD is doing even worse in some cases.
No, no, no. You got it all wrong.

You see, Nvidia is the bad guy here (that's axiomatic, doesn't need a demonstration). AMD, doing the exact same things as Nvidia, is only forced to do so (again, axiomatic), thus, while doing the exact same things, they are obviously the good guys.

Now, go write that down 100 times so it sticks to your brain so you won't make further silly comments on the Internet, ok? ;)
 
Joined
Apr 19, 2018
Messages
1,227 (0.50/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
That's actually a fallacy I called out in the reviews for the 4xxx series. The RT performance hit on Ada is practically the same as on Ampere. Only the 4090 appears to pull clear. You need to look at each game, as the results vary, but the drop in performance on a 3080 with RT enabled is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack up the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same moving from Ampere to Ada. RT fps only improves because the base (rasterisation) performance increases, but the actual hit is the same.
NOBODY that I'm aware of has actually gone in-depth and fully compared RT performance clock-for-clock on different nVidia GPUs. It seems funny to me when nVidia bangs on about a 3x RT performance boost, yet the penalty for enabling RT is always around the same. If the RT cores were really three times faster than the previous generation's, then surely in the same game, at the same settings, enabling RT would have less of a performance impact, as the IQ has certainly not improved, so nothing extra is being done, yet the performance penalty is virtually the same.

I have always smelled shenanigans when nVidia markets its RT performance, ever since the 30x0 series came out. It's also made more perplexing by the fact that a Radeon GPU with no or limited HW RT performs as well as it does compared to nVidia's offerings with their "state of the art RT cores"... I never trust nVidia marketing, and the whole HW RT core thing makes me suspicious. It makes me think that HW RT is more like SW RT with some HW assist.

EDIT:
FYI - I was just looking into nVidia's performance claims for their RT cores. The 30x0 series is listed as having 2x the RT perf of the 20x0 series. The 40x0 series is listed as having 2x the RT perf of the 30x0 series...
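One way to reconcile a claimed 2x-per-generation RT-core speedup with a near-constant penalty (purely illustrative, Amdahl's-law-style arithmetic with hypothetical frame times) is that RT work is only part of the frame, and if the raster part speeds up by a similar factor, the relative hit barely moves:

```python
# Illustrative Amdahl's-law-style arithmetic with hypothetical frame times:
# if both the raster portion and the RT portion of a frame get ~2x faster,
# the *relative* penalty for enabling RT stays roughly the same.

def rt_penalty(raster_ms: float, rt_extra_ms: float) -> float:
    """Relative fps drop from enabling RT, given per-frame costs in ms."""
    fps_off = 1000.0 / raster_ms
    fps_on = 1000.0 / (raster_ms + rt_extra_ms)
    return 1.0 - fps_on / fps_off

old_gen = rt_penalty(raster_ms=10.0, rt_extra_ms=10.0)  # 50% hit at lower fps
new_gen = rt_penalty(raster_ms=5.0, rt_extra_ms=5.0)    # still a 50% hit, higher fps

print(f"{old_gen:.0%} {new_gen:.0%}")  # 50% 50%
```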
 
Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Well, see, just because you disagree doesn't make it true. Of course, that works both ways. That's why I'm willing to offer you a blind test: I'll post a game at native 1440p and then the same at 4K with DLSS Q. Same internal resolution, same performance, and the DLSS one will look so much better you'd throw native into the trash where it belongs.
No. Post a screenshot at 4K native vs 4K DLSS Q. Or post it at 1440p native vs 1440p DLSS Q. Nobody with a 4K monitor plays at 1440p ever.

Edit: I have to add, I had or have no intention to turn the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
 
Joined
Apr 19, 2018
Messages
1,227 (0.50/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
No. Post a screenshot at 4K native vs 4K DLSS Q. Or post it at 1440p native vs 1440p DLSS Q. Nobody with a 4K monitor plays at 1440p ever.

Edit: I have to add, I had or have no intention to turn the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
If you have a small monitor or poor eyesight, then yes, some will view DLSS as "free performance"...

I certainly do not see DLSS as anything other than a tool to increase performance when your card lacks performance, at the expense of image quality.

DLSS is unusable for me, as it looks like crud on my 50" 4K screen, and while it's fine and arguably a great feature for lower-end cards, I see it as unacceptable that this is now starting to be mandated on cards costing well over a thousand dollars. DLSS has turned into a performance crutch that game devs are now exploiting, with nVidia's blessing, after realising that they can shift more low-end product for a higher price and higher profits.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
No. Post a screenshot at 4K native vs 4K DLSS Q. Or post it at 1440p native vs 1440p DLSS Q. Nobody with a 4K monitor plays at 1440p ever.

Edit: I have to add, I had or have no intention to turn the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
Why would I post a screenshot where one is running at 100 fps and the other one at 140? That's not a fair comparison, because a static image doesn't take framerate into account. A proper comparison is done by testing at the same framerate, and at the same framerate, DLSS absolutely smashes native rendering.
 
Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Why would I post a screenshot where one is running at 100 fps and the other one at 140? That's not a fair comparison, because a static image doesn't take framerate into account. A proper comparison is done by testing at the same framerate, and at the same framerate, DLSS absolutely smashes native rendering.
You said "it's straight up better than native" which is not the same as "it runs faster because it looks worse".
 
Joined
Apr 19, 2018
Messages
1,227 (0.50/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
You said "it's straight up better than native" which is not the same as "it runs faster because it looks worse".
It is straight up better than native when you compare properly, i.e. at iso-framerate.

It's also better than native in most games even when you compare them the way you do.
 
Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It is straight up better than native when you compare properly, i.e. at iso-framerate.
It seems our definitions of the word "properly" are entirely different. I'll leave it at that.

It's also better than native in most games even when you compare them the way you do.
Erm, no.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It seems our definitions of the word "properly" are entirely different. I'll leave it at that.


Erm, no.
Well, you can say no and I can say yes; that's why blind tests exist.

The whole point of DLSS is that you can get better image quality with similar performance to native. So in order to properly test if that is the case, you have to equalize the framerate. It's not really something to argue about. It is what it is.
 
Joined
Jan 14, 2019
Messages
12,566 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Well, you can say no and I can say yes; that's why blind tests exist.
Which I've seen and done a few of and then drawn my conclusion.

The whole point of DLSS is that you can get better image quality with similar performance to native. So in order to properly test if that is the case, you have to equalize the framerate. It's not really something to argue about. It is what it is.
No. The point of DLSS is that you can make your game run faster with a slight loss in image quality. With a 4K monitor, you will never ever in your whole life play any game at 1440p, so that comparison is utterly and entirely pointless.
 

bug

Joined
May 22, 2015
Messages
13,842 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Well, you can say no and I can say yes; that's why blind tests exist.

The whole point of DLSS is that you can get better image quality with similar performance to native. So in order to properly test if that is the case, you have to equalize the framerate. It's not really something to argue about. It is what it is.
I think there are two issues here.
1. Straight up image quality. If you look at still images, frame rates are rather irrelevant and DLSS may or may not look better than native, depending on its training.
2. Gameplay. This is where DLSS will falter more, when things get into motion. Frame rates definitely matter here. But keep in mind artifacting and ghosting in motion can happen even in the absence of DLSS.

Imho, this is all largely irrelevant. Why? Because there are two types of games: fast-paced and non-fast-paced. Fast-paced games can exhibit the most problems, but at the same time you are less likely to spot them in the heat of the action, unless there's annoying flickering or something like that. In that case, turn off DLSS and lower details; there's no way around that. For games that aren't so fast-paced, you can get by with 60 fps or even less, so turn off DLSS if you pixel-peep.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Which I've seen and done a few of and then drawn my conclusion.


No. The point of DLSS is that you can make your game run faster with a slight loss in image quality. With a 4K monitor, you will never ever in your whole life play any game at 1440p, so that comparison is utterly and entirely pointless.
The point isn't whether you are going to play at 1440p on a 4K monitor. The point is that you can supersample and then use DLSS. Say you have a 1440p monitor: you use DLDSR to play at 4K and then upscale with DLSS. The render resolution is still 1440p, which is your monitor's native resolution, but the image quality is way above native.
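As a rough sketch of that DLDSR + DLSS round trip (assuming the common 2.25x DLDSR factor, i.e. 1.5x per axis, and the usual 2/3 per-axis scale for DLSS Quality), the numbers come back to the panel's native resolution:

```python
# Hedged sketch of the DLDSR + DLSS combination described above, assuming a
# 2.25x DLDSR factor (1.5x per axis) and DLSS Quality at 2/3 per axis.
monitor = (2560, 1440)                                   # native 1440p panel

dldsr_output = (round(monitor[0] * 1.5), round(monitor[1] * 1.5))
dlss_internal = (round(dldsr_output[0] * 2 / 3), round(dldsr_output[1] * 2 / 3))

print(dldsr_output, dlss_internal)  # (3840, 2160) (2560, 1440)
# The internal render resolution ends up back at the monitor's native 1440p.
```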

I think there are two issues here.
1. Straight up image quality. If you look at still images, frame rates are rather irrelevant and DLSS may or may not look better than native, depending on its training.
2. Gameplay. This is where DLSS will falter more, when things get into motion. Frame rates definitely matter here. But keep in mind artifacting and ghosting in motion can happen even in the absence of DLSS.

Imho, this is all largely irrelevant. Why? Because there are two types of games: fast-paced and non-fast-paced. Fast-paced games can exhibit the most problems, but at the same time you are less likely to spot them in the heat of the action, unless there's annoying flickering or something like that. In that case, turn off DLSS and lower details; there's no way around that. For games that aren't so fast-paced, you can get by with 60 fps or even less, so turn off DLSS if you pixel-peep.
The issue here is that, as you mentioned yourself, there are games that falter in motion at native. E.g. TLOU has flickering at native, and so does Starfield. People act like native is the end-all-be-all, when in fact, in most cases, DLSS looks straight up better even in motion while increasing your framerate.
 
Joined
Apr 19, 2018
Messages
1,227 (0.50/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
The whole point of DLSS is that you can get better image quality with similar performance to native.
Upscaled 1080p/1440p to 4K is BETTER quality than native 4K rendering.

Wow, ok, that's one hell of a statement right there! I need some time for your statement to sink in...
 

bug

Joined
May 22, 2015
Messages
13,842 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The issue here is that, as you mentioned yourself, there are games that falter in motion at native. E.g. TLOU has flickering at native, and so does Starfield. People act like native is the end-all-be-all, when in fact, in most cases, DLSS looks straight up better even in motion while increasing your framerate.
At the end of the day, DLSS is just another tool in the GPU's toolbox. Use it if you will, or don't. Just don't tell me having one more tool is a bad thing. That's the attitude (not yours) that I don't get.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Upscaled 1080p/1440p to 4K is BETTER quality than native 4K rendering.

Wow, ok, that's one hell of a statement right there! I need some time for your statement to sink in...
I can post you two screenshots running at a similar framerate and you tell me which one is native. Deal?

Also, that's not what I said at all; read again. I said 4K DLSS Q looks better than native 1440p while performing similarly.
 