Monday, October 23rd 2023
NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102
NVIDIA's upcoming mid-life refresh for its GeForce RTX 40-series "Ada" product stack sees the introduction of three new SKUs, led by the GeForce RTX 4080 SUPER, as was reported last week. In that older report, we speculated on how NVIDIA could go about creating the RTX 4080 SUPER. BenchLife reports that the RTX 4080 SUPER will ship with 20 GB of memory as standard, and will be based on the larger "AD102" silicon. The SKU will utilize a 320-bit wide memory interface carved out of the 384-bit available to the silicon. The "AD102" has 144 streaming multiprocessors (SM) on die, of which the flagship RTX 4090 enables 128, so NVIDIA could pick an SM count that's lower than that of the RTX 4090 while being higher than the 76 of the current RTX 4080.
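For what it's worth, the 20 GB figure falls straight out of the bus width, assuming the 32-bit-wide, 2 GB GDDR6X chips NVIDIA uses across the rest of the RTX 40 lineup. A back-of-the-envelope sketch (the chip density is our assumption, not anything BenchLife confirms):

```python
# Hedged sketch: a GPU's memory size follows from its bus width once you
# fix the per-chip density. Each GDDR6X chip has a 32-bit interface;
# 2 GB per chip is assumed here, matching the rest of the RTX 40 series.

def memory_config(bus_width_bits: int, gb_per_chip: int = 2) -> tuple[int, int]:
    """Return (chip_count, total_gb) for a given memory bus width."""
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return chips, chips * gb_per_chip

print(memory_config(384))  # full AD102, as on the RTX 4090 -> (12, 24)
print(memory_config(320))  # rumored RTX 4080 SUPER cut-down -> (10, 20)
print(memory_config(256))  # current RTX 4080 -> (8, 16)
```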
Sources:
Wccftech, BenchLife.info
145 Comments on NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102
Yes, you quoted me but you don't understand what I said in the quote. Let me repeat it once more. 4k DLSS Q looks way better (emphasis on the WAY) than native 1440p
Also, 4K DLSS Q will still have motion artifacts that aren't present in native rendering (assuming the AA isn't garbage). And 4K DLSS Q runs worse than native 1440p, since the upscale still has a performance cost over the base resolution. It's not magic, mate; it's very easy to tell it's not magic, so don't treat it as such. Many people will prefer native, whether for stability or sharpness.
DLSS also breaks many screen-space effects by causing them to render at a lower resolution, and it breaks effects that rely on TAA, as is the case with the DOF in MWII.
I don't get why people argue otherwise, even a blind person can tell you that applying dlss to a supersampled image looks way better than native.
Everybody sees what they want to see. The distinction is mostly psychological: look at the system specs of people dismissing DLSS, the vast majority are running AMD hardware.
Another aspect is monitors. If you have a 4K monitor at 32" or less, you're not seeing all the detail the panel is capable of anyway, so you won't see minute upscaling artifacts either. Move to an 80" TV from 2' away and that can change quite a bit.
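To put rough numbers on that pixel-density point, here's a quick sketch. The ~60 pixels-per-degree figure for 20/20 acuity is a common rule of thumb I'm assuming, not anything from the post:

```python
import math

# Hedged sketch: pixels per inch for a 4K panel at various sizes, and the
# viewing distance beyond which 20/20 vision (~60 pixels per degree, an
# assumed rule-of-thumb threshold) stops resolving individual pixels.

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_distance_in(ppi_value: float, ppd: float = 60.0) -> float:
    """Distance (inches) at which one pixel subtends 1/ppd degrees."""
    return 1.0 / (ppi_value * math.tan(math.radians(1.0 / ppd)))

for size in (27, 32, 80):
    d = ppi(3840, 2160, size)
    print(f'{size}" 4K: {d:.0f} PPI, pixels blend beyond ~{retina_distance_in(d):.0f}"')
```

At 32" this works out to roughly 138 PPI, with pixels blending at about 25", while an 80" 4K TV needs over 60" of distance; so sitting 2' from that TV, upscaling artifacts are far more visible, which is the point above.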
I just wish things were better, I guess we all share that desire
I know you weren't throwing anything at me, but I still feel it's worth clarifying that I'm not talking out of ignorance, I swear! I agree here: AMD could pick up the slack. The GPUs they have are very good (some, not all; many are still awful), but given the unfortunate situation they find themselves in, they need to do more than that, like they did with Ryzen back in the day. In hindsight this was a terrible moment to begin testing chiplets, since this generation was a missed golden opportunity, and we know chiplets on GPUs are giving them a hard time (leading to the cancellation of the top-end 8000-series GPUs). Still, I feel they may arrive at a good position if things work out in the end. But that's an if, not a when; chances are it could blow up in their faces, and there's no way to know.
I choose to remain hopeful: if they could achieve wonders with Ryzen, I want to believe they can do the same with Radeon.
*If anyone is talking out of ignorance, that would be me. I have been watching TPU's DLSS IQ comparisons, but I am still stuck with my 1060, so I definitely don't have first-hand experience with either DLSS or FSR. I do have some background in computer graphics and about two decades of gaming behind me, so there's that.
i.ibb.co/swbznGb/4k-dlss-performance.png
i.ibb.co/M2RJqs6/1080p-native-1.png
i.ibb.co/JFztSLy/1440p-DLSS-Q-2.png
BTW, as you seem to be an avid NV fan, have you ever seen the Nvidia logo? At least one glance? :rolleyes:
Trim out the DLSS fake frames and, performance-wise, the card would basically be a 3090 Ti with better power efficiency. Completely agree. Though the chiplet idea might possibly help AMD in the future, letting them pair GPU chiplets with CPU ones in desktop/mobile APUs, so they won't need to make monolithic chips any more for those products. However, these are my own guesses and speculation.
But yeah, they lost so much money on an untested lineup that it got them into trouble, with the possibility of no decent next-gen discrete cards at all. They'd be better off doing the testing before launching a full product line, or at least releasing the new designs in limited quantities as a side product, e.g. the Radeon VII.
Also, I am astonished to report that they all look meh. I'm a bit surprised, actually; it's higher res, sure, but it retains many weaknesses of the native 1080p image, such as this, which looks to be AO stair-stepping.
(See what I said about it not making things look as much better as you may think?) (4K DLSS Perf on the left, 1080p on the right.)
(1440p DLSS Q on the left, native 1080p in the middle, 4K DLSS Perf on the right.) I actually find this particle effect looks better at native 1080p than at 1440p DLSS Quality, because of what I mentioned before: it looks noticeably more blocky with DLSS. It's also why the transparent window looks arguably nicer at native 1080p than with DLSS at 1440p. And hell, I'd say it looks better than in the 4K DLSS Perf image, where it looks incredibly out of place: DLSS is trying to upscale an effect (the smudginess of the window) that depends on the internal resolution, yet it upscales it to 4K the same way it would the particles or SSR, so it looks jarring by comparison.
I guess the lines themselves are sharper, sure, but the texture quality doesn't feel quite crystal-clear 4K (a problem I've had in many games back when I ran DLSS: textures never feeling quite right; whether that's because of an incorrect negative LOD bias or just DLSS, that's not my problem).
I'd also argue that Cyberpunk 2077 isn't the best game to test this with, since it easily has one of the worst TAA solutions I've seen in any game. Spider-Man might be a better pick, to be honest, especially with SMAA.
Anyway, I think I'll be sticking to 1080p, thank you. Low res, but not jarring (though the TAA is terrible, so I'll just not look at this game).
This was pointless