Monday, October 23rd 2023
NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102
NVIDIA's upcoming mid-life refresh for its GeForce RTX 40-series "Ada" product stack sees the introduction of three new SKUs, led by the GeForce RTX 4080 SUPER, as was reported last week. In that report, we speculated on how NVIDIA could go about creating the RTX 4080 SUPER. BenchLife reports that the RTX 4080 SUPER will ship with 20 GB as its standard memory size, and will be based on the larger "AD102" silicon. The SKU will use a 320-bit wide memory interface carved out of the 384-bit available to the silicon. The "AD102" has 144 streaming multiprocessors (SM) on the die, of which the flagship RTX 4090 is configured with 128, so NVIDIA could pick an SM count that's lower than that of the RTX 4090 while being higher than the 76 of the current RTX 4080.
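For context, the rumored 20 GB capacity follows directly from a 320-bit bus if one assumes the usual one 2 GB (16 Gbit) GDDR6X module per 32-bit channel, as on other Ada cards; the minimal sketch below (Python, illustrative names, not official specifications) works through that arithmetic.

# Sketch of the memory-capacity arithmetic behind the rumored 320-bit configuration.
# Assumption: one 2 GB (16 Gbit) GDDR6X module per 32-bit channel, as on other Ada SKUs.
GDDR6X_CHANNEL_WIDTH_BITS = 32   # each memory module occupies a 32-bit channel
MODULE_CAPACITY_GB = 2           # 16 Gbit module density

def memory_capacity_gb(bus_width_bits: int) -> int:
    """Total VRAM in GB for a given memory bus width."""
    channels = bus_width_bits // GDDR6X_CHANNEL_WIDTH_BITS
    return channels * MODULE_CAPACITY_GB

print(memory_capacity_gb(384))  # full AD102 bus (RTX 4090): 24 GB
print(memory_capacity_gb(320))  # rumored RTX 4080 SUPER bus: 20 GB
print(memory_capacity_gb(256))  # current RTX 4080 bus: 16 GB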
Sources:
Wccftech, BenchLife.info
And just because you don't like DLSS for whatever reason, even though it's straight up better than native, why do you expect other people to feel the same way?
This road has been paved from the very start, really. Competition-less, NVIDIA is used to making 2 product generations out of 1 architecture.
You see, Nvidia is the bad guy here (that's axiomatic, doesn't need a demonstration). AMD, doing the exact same things as Nvidia, is only forced to do so (again, axiomatic), thus, while doing the exact same things, they are obviously the good guys.
Now, go write that down 100 times so it sticks to your brain so you won't make further silly comments on the Internet, ok? ;)
I have always smelled shenanigans when nVidia markets its RT performance, ever since the 30x0 series came out. It's also made more perplexing that a Radeon GPU with no or limited HW RT performs as well as it does compared to nVidia's offerings with their "state of the art RT cores"... I never trust nVidia marketing, and the whole HW RT core thing makes me suspicious. It makes me think that HW RT is more like SW RT with some HW assist.
EDIT:
FYI - I was just looking into nVidia's performance claims for their RT cores. The 30x0 series is listed as having 2x the RT performance of the 20x0 series. The 40x0 series is listed as having 2x the RT performance of the 30x0 series...
Edit: I have to add, I had and have no intention of turning the conversation into an AMD vs Nvidia comparison in any way. You did that yourself, like you always do for some unknown reason.
I certainly do not see DLSS as anything other than a tool to increase performance when your card lacks it, at the expense of image quality.
DLSS is unusable for me, as it looks like crud on my 50" 4K screen, and while it's fine and arguably a great feature for lower-end cards, I see it as unacceptable that it is now starting to be mandated on cards costing well over a thousand dollars. DLSS has turned into a performance crutch that game devs are now exploiting, with nVidia's blessing, after realising that they can shift more low-end product for a higher price and higher profits.
It's also better than native in most games even when you compare them the way you do.
The whole point of DLSS is that you can get better image quality with similar performance to native. So in order to properly test whether that is the case, you have to equalize framerate. It's not really something to argue about. It is what it is.
1. Straight up image quality. If you look at still images, frame rates are rather irrelevant and DLSS may or may not look better than native, depending on its training.
2. Gameplay. This is where DLSS will falter more, when things get into motion. Frame rates definitely matter here. But keep in mind artifacting and ghosting in motion can happen even in the absence of DLSS.
Imho, this is all largely irrelevant. Why? Because there are two types of games: fast-paced and non fast-paced. Fast-paced games can exhibit the most problems, but at the same time you are less likely to spot them in the heat of the action, unless there's annoying flickering or something like that. In that case, turn off DLSS and lower details; there's no way around that. For games that aren't so fast-paced, you can get by with 60 fps or even less, so turn off DLSS if you pixel-peep.
Wow, ok, that's one hell of a statement right there! I need some time for your statement to sink in...
Also, that's not what I said at all; read again. I said 4K DLSS Quality looks better than native 1440p while performing similarly.
2.) I quoted you.