Monday, October 23rd 2023

NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

NVIDIA's upcoming mid-life refresh of its GeForce RTX 40-series "Ada" product stack introduces three new SKUs, led by the GeForce RTX 4080 SUPER, as reported last week. In that earlier report, we speculated on how NVIDIA could go about creating the RTX 4080 SUPER. BenchLife now reports that the RTX 4080 SUPER will ship with 20 GB of memory as standard, and will be based on the larger "AD102" silicon. The SKU will use a 320-bit memory interface carved out of the 384-bit bus available to the chip. The "AD102" die has 144 streaming multiprocessors (SM), of which the flagship RTX 4090 enables 128, so NVIDIA could pick an SM count that is lower than that of the RTX 4090 while being higher than the 76 of the current RTX 4080.
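The 20 GB figure follows directly from the narrower bus: current AD102/AD103 boards pair one 16 Gbit (2 GB) GDDR6X module with each 32-bit memory channel, so capacity scales with bus width. Below is a minimal sketch of that arithmetic; the helper function is purely illustrative and assumes standard 2 GB modules with no clamshell configuration.

# Rough sketch: VRAM capacity implied by a memory bus width, assuming one
# 16 Gbit (2 GB) GDDR6X module per 32-bit memory channel (no clamshell).
def vram_gb(bus_width_bits: int, module_gb: int = 2) -> int:
    channels = bus_width_bits // 32   # each module occupies a 32-bit channel
    return channels * module_gb       # total capacity in GB

print(vram_gb(384))  # 24 GB -- full AD102 bus, as on the RTX 4090
print(vram_gb(320))  # 20 GB -- the rumored RTX 4080 SUPER configuration
print(vram_gb(256))  # 16 GB -- current RTX 4080 (AD103)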
Sources: Wccftech, BenchLife.info

145 Comments on NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

#101
rv8000
Dr. DroDon't take this personally, it's not aimed at you... I just wish I saw such armchair analysis of why AMD releases something like the 6750 and 7900 GRE, which you aren't even going to see in Western markets outside of OEM channels, and why it isn't equally slammed for it. Instead, we see praise, as if they're championing something by re-releasing old or low-grade hardware at lower MSRPs in laser-targeted channels that won't benefit anyone building a PC.
If I remember correctly, most YouTube-based reviewers said exactly that about the GRE, though, and did call it pointless. The majority of them typically slam both Nvidia and AMD for re-releases and pointless refreshes.
Posted on Reply
#102
bug
rv8000If I remember correctly, most YouTube-based reviewers said exactly that about the GRE, though, and did call it pointless. The majority of them typically slam both Nvidia and AMD for re-releases and pointless refreshes.
I wouldn't read too much into it. There's too much time between Ada and its successor, so Nvidia had to release something in between. They did it before with Turing, so really, nothing new here. It's not like Nvidia invented the stopgap solution.
Posted on Reply
#103
rv8000
bugI wouldn't read too much into it. There's too much time between Ada and its successor, so Nvidia had to release something in between. They did it before with Turing, so really, nothing new here. It's not like Nvidia invented the stopgap solution.
Point being no one championed the release of the GRE, and most of the internet forums take every stab they can at AMD because reasons. His comment just doesn’t make any sense.
Posted on Reply
#104
theouto
fevgatosThe issue here is that, as you mentioned yourself, there are games that falter in motion at native. E.g. TLOU has flickering at native, and so does Starfield. People act like native is the end-all, be-all when in fact, in most cases, DLSS looks straight up better even in motion while increasing your framerate.
That happens because of over-reliance on bad anti-aliasing methods, as is the case with most TAA implementations like the one in Unreal Engine. Meanwhile, anti-aliasing methods like SMAA don't display such behaviour; they can be a bit less stable, but they're substantially sharper and the more ideal AA solution as resolution increases. Also, "straight up better" is a stretch when both look meh.
Posted on Reply
#105
JustBenching
stimpy881.) I can run it on my own system, and have.
2.) I quoted you.
Yes, you can run it on your own system, but you can also lie about your findings. So, do you want me to post 2 screenshots and you tell me which is which?

Yes, you quoted me, but you don't understand what I said in the quote. Let me repeat it once more: 4K DLSS Q looks way better (emphasis on the WAY) than native 1440p.
Posted on Reply
#106
kapone32
Where I live, the cheapest 4090 is $2,300 and the cheapest 4080 is $1,300. That leaves $1,000 of room to place the 20 GB 4080, which will probably land between $1,600 and $1,700 as a baseline. I do feel sorry for those who bought cards like the 4080, and I am sure the 4070 Ti SUPER will come with at least 16 GB of VRAM. There is also the thought that unless they change the chip, we may have a repeat of the 4060 Ti 8 GB and 16 GB cards.
fevgatosYes, you can run it on your own system, but you can also lie about your findings. So, do you want me to post 2 screenshots and you tell me which is which?

Yes, you quoted me, but you don't understand what I said in the quote. Let me repeat it once more: 4K DLSS Q looks way better (emphasis on the WAY) than native 1440p.
Why are you comparing 1440p to 4K anything? Obviously there are more pixels generated at 4K. There's also the fact that DLSS isn't in every game, whereas raster is the baseline. Of course, the GPU is only one part of the equation, as the quality and type of monitor have more impact on picture quality than the GPU does.
Posted on Reply
#107
theouto
fevgatosYes, you can run it on your own system, but you can also lie about your findings. So, do you want me to post 2 screenshots and you tell me which is which?

Yes, you quoted me, but you don't understand what I said in the quote. Let me repeat it once more: 4K DLSS Q looks way better (emphasis on the WAY) than native 1440p.
And like you've been told, no one will go from 1440p to 4K DLSS Q. If you have a 1440p monitor, you'd go to 1440p DLSS Q, not 4K DLSS Q. You're reaching for a scenario that 99% of users won't ever encounter, so it's largely irrelevant.
Posted on Reply
#108
JustBenching
kapone32Why are you comparing 1440p to 4K anything? Obviously there are more pixels generated at 4K. There's also the fact that DLSS isn't in every game, whereas raster is the baseline. Of course, the GPU is only one part of the equation, as the quality and type of monitor have more impact on picture quality than the GPU does.
I'm not comparing 1440p to 4K, I'm comparing image quality at iso-framerates. Whichever method gives you better image quality at the same FPS is obviously superior.
Posted on Reply
#109
kapone32
fevgatosI'm not comparing 1440p to 4K, I'm comparing image quality at iso-framerates. Whichever method gives you better image quality at the same FPS is obviously superior.
Please tell me what you know about Upscaling
Posted on Reply
#110
JustBenching
kapone32Please tell me what you know about Upscaling
The fact that 3 people are trying to convince me I'm wrong but none of them are willing to take a blind test tells me everything I need to know about upscaling
theoutoAnd like you've been told, no one will go from 1440p to 4K DLSS Q. If you have a 1440p monitor, you'd go to 1440p DLSS Q, not 4K DLSS Q. You're reaching for a scenario that 99% of users won't ever encounter, so it's largely irrelevant.
And as you've been answered, if you have a 1440p monitor you don't need to play at 4K; you use supersampling. DLDSR to 4K + DLSS Q >>> native.
Posted on Reply
#111
theouto
fevgatosThe fact that 3 people are trying to convince me I'm wrong but none of them are willing to take a blind test tells me everything I need to know about upscaling
...Answer the question you've been asked, mate; it takes a quick search on your search engine of choice.

Also, 4K DLSS Q will still have motion artifacts that are not present in native rendering (if the AA is not garbage), and 4K DLSS Q runs worse than native 1440p, as the upscale still has a performance impact versus the base resolution. It's not magic, mate; it's very easy to tell it's not magic, so don't treat it as such, and many people will prefer native, be it for stability or sharpness.

DLSS also breaks many screen-space effects, as it causes them to render at a lower resolution, or breaks effects that rely on TAA, as is the case with the DOF in MWII.
Posted on Reply
#112
JustBenching
theouto...Answer the question you've been asked, mate; it takes a quick search on your search engine of choice.

Also, 4K DLSS Q will still have motion artifacts that are not present in native rendering (if the AA is not garbage), and 4K DLSS Q runs worse than native 1440p, as the upscale still has a performance impact versus the base resolution. It's not magic, mate; it's very easy to tell it's not magic, so don't treat it as such, and many people will prefer native, be it for stability or sharpness.

DLSS also breaks many screen-space effects, as it causes them to render at a lower resolution, or breaks effects that rely on TAA, as is the case with the DOF in MWII.
So it would be quite easy to tell on a blind test...
Posted on Reply
#113
bug
fevgatosSo it would be quite easy to tell on a blind test...
Except you can't do a blind test on images rendered at different resolutions. Their resolution is a dead giveaway. If you crop, the image with bigger objects was rendered at lower resolution. If you rescale, the quality of the rendering is gone.
Posted on Reply
#114
JustBenching
bugExcept you can't do a blind test on images rendered at different resolutions. Their resolution is a dead giveaway. If you crop, the image with bigger objects was rendered at lower resolution. If you rescale, the quality of the rendering is gone.
Indeed. But the point is, the 4k dlss will look much better than the native 1440p, even on a 1440p monitor, for obvious reasons.

I don't get why people argue otherwise, even a blind person can tell you that applying dlss to a supersampled image looks way better than native.
Posted on Reply
#115
TumbleGeorge
TotallyDespite what you say, it has been the case for more than a decade
A little side example, but a smarter, more efficient architecture is more important than the node used to make a chip. I hope Nvidia, AMD and Intel work on better architectures rather than relying on lithography changes and overclocking to pull them out of the hole of mediocrity.
Posted on Reply
#116
bug
fevgatosIndeed. But the point is, the 4k dlss will look much better than the native 1440p, even on a 1440p monitor, for obvious reasons.

I don't get why people argue otherwise, even a blind person can tell you that applying dlss to a supersampled image looks way better than native.
Easy explanation: there is no mathematical formula for "will look much better".
Everybody sees what they want to see. The distinction is mostly psychological: look at the system specs of people dismissing DLSS, the vast majority are running AMD hardware.

Another aspect is monitors. If you have a 4k at 32" or less, you're not seeing all the details the monitor is capable of anyway, so you won't see minute upscaling artifacts either. Move to an 80" TV from 2' away and that can change quite a bit.
Posted on Reply
#117
Dr. Dro
AusWolfCPUs have nothing to do with GPUs. Ryzen is a highly esteemed name. Radeon isn't.
The question that needs to be asked is why that is. Why is Ryzen, in the five years it has existed as a brand/product/series, considered positive, while Radeon, despite existing since 2000, isn't?
the54thvoidI don't think poking fun at Nvidia is hatred. I see people getting tired of their pricing bullshit. AMD, as a second-tier company with shareholders, has to follow the lead. If Nvidia released price-sensible products in terms of market affordability, then AMD would really have to push hard to make something matter; otherwise, their prices would have to flatline to compete. But Nvidia pushing prices so high means AMD can raise theirs too, to appease their own shareholders.

I think Nvidia treats the gaming consumer base with contempt. Its decision to hyper-inflate card prices (my "budget" model 4070 Ti at £800 is a prime example) shows that much. It's absolutely all right to give them a hard time, just as it is to give AMD a hard time for trying to do the same (although, again, shareholder expectations tie AMD's pricing in sync with Nvidia's).

It would be lovely to think a 4080 SUPER would push the 4080 down in price, but at this point I see three Nvidia cards priced above 1K, which is just plain nuts from a consumer standpoint. It is this behaviour that turns people away (often out of financial necessity).
I guess when it comes down to it we all want cheaper cards, I agree... But then again, it's like @fevgatos mentioned: AMD's the one that's got to undercut and innovate to make inroads in market share, and they haven't done that. They've done half-hearted copies of Nvidia's recent technologies (DLSS-G, Reflex Low Latency, etc.) which tend to perform worse, each time introducing more and more issues in their drivers, using open source as both an excuse and a crutch.

I just wish things were better, I guess we all share that desire
Posted on Reply
#118
theouto
bugEasy explanation: there is no mathematical formula for "will look much better".
Everybody sees what they want to see. The distinction is mostly psychological: look at the system specs of people dismissing DLSS, the vast majority are running AMD hardware.
This is true, but in my defense, I've had a good deal of firsthand experience with DLSS! It's simply that my latest GPU is an AMD GPU, but before this I was running a laptop with a 2070, so I was almost forced to use it wherever possible, lol.

I know you weren't throwing anything at me, but I still feel like it's worth clarifying that I am not talking out of ignorance, I swear!
Dr. DroI guess when it comes down to it we all want cheaper cards, I agree... But then again, it's like @fevgatos mentioned: AMD's the one that's got to undercut and innovate to make inroads in market share, and they haven't done that. They've done half-hearted copies of Nvidia's recent technologies (DLSS-G, Reflex Low Latency, etc.) which tend to perform worse, each time introducing more and more issues in their drivers, using open source as both an excuse and a crutch.
I agree here. AMD could pick up the slack; the GPUs they have are very good (some, not all, many are still awful), but because of the unfortunate situation they find themselves in, they need to do more than that, like they did with Ryzen back in the day. While in hindsight this was a terrible moment to begin testing chiplets, since this gen was a missed golden opportunity, and we know that chiplets on GPUs are giving them a hard time (leading to the cancellation of top-end 8000-series GPUs), I do feel like they may be arriving at a good position if things work out in the end. But that's an if, not a when; chances are it could blow up in their faces, and there's no way to know.

I choose to remain hopeful: if they could achieve wonders with Ryzen, I want to hope they can do the same with Radeon.
Posted on Reply
#119
bug
theoutoThis is true, but in my defense, I've had a good deal of firsthand experience with DLSS! It's simply that my latest GPU is an AMD GPU, but before this I was running a laptop with a 2070, so I was almost forced to use it wherever possible, lol.

I know you weren't throwing anything at me, but I still feel like it's worth clarifying that I am not talking out of ignorance, I swear!
I know most people here aren't talking out of ignorance*, my point is two people can look at the same screen and still see different things. Under these circumstances, I don't think it's possible to establish once and for all who's right, so I'm not even trying anymore.

*If anyone is talking out of ignorance, that would be me. I have been watching TPU's DLSS IQ comparisons, but I am still stuck with my 1060, thus I definitely don't have first hand experience with either DLSS or FSR. I do have some background in computer graphics and about two decades of gaming behind me, so there's that.
Posted on Reply
#120
JustBenching
bugEasy explanation: there is no mathematical formula for "will look much better".
Everybody sees what they want to see. The distinction is mostly psychological: look at the system specs of people dismissing DLSS, the vast majority are running AMD hardware.

Another aspect is monitors. If you have a 4k at 32" or less, you're not seeing all the details the monitor is capable of anyway, so you won't see minute upscaling artifacts either. Move to an 80" TV from 2' away and that can change quite a bit.
Sure, there is no mathematical formula, but I'm pretty confident that if I post 2 pictures, with and without DLSS, running at the same framerate, you will instantly and without hesitation declare the one with DLSS the better one.
Posted on Reply
#121
bug
fevgatosSure, there is no mathematical formula, but I'm pretty confident that if I post 2 pictures, with and without DLSS, running at the same framerate, you will instantly and without hesitation declare the one with DLSS the better one.
Don't be so sure, I've done my fair share of photo editing, I can be quite the pixel peeper ;)
Posted on Reply
#122
JustBenching
bugDon't be so sure, I've done my fair share of photo editing, I can be quite the pixel peeper ;)
Well, here you go. Not a blind test since I have the names, but it doesn't need to be. In my eyes the 4K DLSS Performance shot is by far the best one, though performance drops a bit; 1440p DLSS Q and native 1080p have the same FPS, but the DLSS shot is apparently better. It retains more detail on pretty much everything. Sadly, even imgbb compressed them a bit: the original images were 25 MB, imgbb hosted them at 10 MB.

i.ibb.co/swbznGb/4k-dlss-performance.png
i.ibb.co/M2RJqs6/1080p-native-1.png
i.ibb.co/JFztSLy/1440p-DLSS-Q-2.png
Posted on Reply
#123
Random_User
Dr. Drops. a kitten dies every time someone writes nVIDIA or nVidia. Have a heart.
Nobody dies. It just hurts your personal feelings, since you have too much entanglement with Nvidia's corporate mentality. No offence; Nvidia won't cradle you before bed. :laugh:
BTW, as you seem to be an avid NV fan, have you ever seen the Nvidia logo? At least one glance? :rolleyes:

Not to rub your wounds, however...
Now, here goes the real punch :D

the54thvoidThat's actually a fallacy I called out in the reviews for the 4xxx. The RT performance hit on ADA is practically the same as Ampere. Only the 4090 appears to get clear. You need to look at each game where the results vary, but the drop in performance on a 3080 using RT is around 48-50%. For the 4080 it's also in that ballpark. The other cards stack the same. Apart from the 4090 (arguably the best GPU ever), the hit is the same transferring to ADA from Ampere. RT fps only improves because the base performance (rasterisation) increases, but the actual hit is the same.
Yep. I saw a video recently of a guy testing 4K native RT with a 4090, and in e.g. C2077 it was struggling desperately, below 30 FPS. So without upscaling it's a bad idea.

Trim DLSS and the fake frames out, and performance-wise the card would basically be a 3090 Ti with better power efficiency.
theouto...While in hindsight this was a terrible moment to begin testing chiplets, since this gen was a missed golden opportunity...
Completely agree. Though the chiplet idea might help AMD in the future, letting them pair GPU chiplets with CPU ones in desktop/mobile APUs, so they won't need to make monolithic chips for those products any more. However, these are my own guesses and speculations.

But yeah, they lost so much money on an untested lineup that it got them into trouble, with the possibility of getting no decent next-gen discrete cards at all. They'd do better to do the testing before launching a full product line, or at least release them in limited quantities as a side product, e.g. the Radeon VII.
Posted on Reply
#124
theouto
fevgatosWell, here you go. Not a blind test since I have the names, but it doesn't need to be. In my eyes the 4K DLSS Performance shot is by far the best one, though performance drops a bit; 1440p DLSS Q and native 1080p have the same FPS, but the DLSS shot is apparently better. It retains more detail on pretty much everything. Sadly, even imgbb compressed them a bit: the original images were 25 MB, imgbb hosted them at 10 MB.

i.ibb.co/swbznGb/4k-dlss-performance.png
i.ibb.co/M2RJqs6/1080p-native-1.png
i.ibb.co/JFztSLy/1440p-DLSS-Q-2.png
1440p DLSS Quality is not 1080p internally, btw; it renders at roughly 960p (quick arithmetic below).

Also, I am astonished to report that they all look meh. I am a bit surprised, actually: it's higher res, sure, but it retains many of the weaknesses of the native 1080p image, such as this, which looks to be AO stair-stepping.
(See what I said about it not making things look as much better as you might think?) (DLSS 4K Perf on the left, 1080p on the right)

(1440p DLSS Q on the left, native 1080p in the middle, 4K DLSS Perf on the right.) I actually find this particle effect looks better at native 1080p than at 1440p DLSS Quality because of what I mentioned before: it looks noticeably more blocky with DLSS. It's also why the transparent window looks arguably nicer at native 1080p compared to DLSS at 1440p, and hell, I'd say it looks better than in the 4K DLSS Perf image, because there it looks incredibly out of place: it's trying to upscale an effect (the smudginess of the window) that depends on the internal resolution, but still upscales it to 4K the same way the particles or SSR would be, so it looks jarring by comparison.

I guess the lines themselves are sharper, sure, but texture quality does not feel quite crystal 4K (which is a problem I've had in many games back when I ran DLSS: textures never feeling quite right, and whether that's down to incorrect use of negative bias or just DLSS, that's not my problem).
I'd also argue that Cyberpunk 2077 is not the best game to test this with, since that game has easily one of the worst TAA solutions I've seen in any game; Spider-Man might be a better pick, tbh, especially with SMAA.

Anyway, I think I'll be sticking with 1080p, thank you. Low res, but not jarring (though the TAA is terrible, so I'll just not look at this game).

This was pointless
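For reference, a rough sketch of the internal render resolutions behind these modes, assuming NVIDIA's usual per-axis DLSS scale factors (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333); individual games can override these, and the helper below is purely illustrative.

# Approximate internal render resolution for a given output resolution and
# DLSS preset, using the commonly cited per-axis scale factors.
SCALE = {
    "Quality": 2 / 3,            # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,  # ~0.333 per axis
}

def internal_res(width, height, preset):
    s = SCALE[preset]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "Quality"))      # (1707, 960): 1440p DLSS Q renders near 960p
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): 4K DLSS Performance renders at 1080p
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440): 4K DLSS Q renders at 1440p

Which also means the earlier "4K DLSS Performance vs. native 1080p" comparison is, roughly speaking, 1080p-internal reconstruction versus 1080p-native.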
Posted on Reply
#125
JustBenching
theouto1440p DLSS Quality is not 1080p internally, btw; it renders at roughly 960p (quick arithmetic below).

Also, I am astonished to report that they all look meh. I am a bit surprised, actually: it's higher res, sure, but it retains many of the weaknesses of the native 1080p image, such as this, which looks to be AO stair-stepping.
(See what I said about it not making things look as much better as you might think?) (DLSS 4K Perf on the left, 1080p on the right)

(1440p DLSS Q on the left, native 1080p in the middle, 4K DLSS Perf on the right.) I actually find this particle effect looks better at native 1080p than at 1440p DLSS Quality because of what I mentioned before: it looks noticeably more blocky with DLSS. It's also why the transparent window looks arguably nicer at native 1080p compared to DLSS at 1440p, and hell, I'd say it looks better than in the 4K DLSS Perf image, because there it looks incredibly out of place: it's trying to upscale an effect (the smudginess of the window) that depends on the internal resolution, but still upscales it to 4K the same way the particles or SSR would be, so it looks jarring by comparison.

I guess the lines themselves are sharper, sure, but texture quality does not feel quite crystal 4K (which is a problem I've had in many games back when I ran DLSS: textures never feeling quite right, and whether that's down to incorrect use of negative bias or just DLSS, that's not my problem).
I'd also argue that Cyberpunk 2077 is not the best game to test this with, since that game has easily one of the worst TAA solutions I've seen in any game; Spider-Man might be a better pick, tbh, especially with SMAA.

Anyway, I think I'll be sticking with 1080p, thank you. Low res, but not jarring (though the TAA is terrible, so I'll just not look at this game).

This was pointless
Yeah, 1080p native looks obviously better. LOL
Posted on Reply