Monday, September 5th 2022

NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants

NVIDIA's upcoming GeForce RTX 4080 "Ada," the successor to the RTX 3080 "Ampere," reportedly comes in two distinct variants based on memory size, memory bus width, and possibly even core-configuration. MEGAsizeGPU reports having seen two reference designs for the RTX 4080: one with 12 GB of memory and a 10-layer PCB, and the other with 16 GB of memory and a 12-layer PCB. A higher PCB layer count enables denser wiring around the ASIC. At debut, NVIDIA's flagship product is expected to be the RTX 4090, with 24 GB of memory and a 14-layer PCB. Apparently, the 12 GB and 16 GB variants of the RTX 4080 feature vastly different PCB designs.

We know from past memory-based variants, such as the GTX 1060 (3 GB vs. 6 GB) or the more recent RTX 3080 (10 GB vs. 12 GB), that NVIDIA turns to other levers to differentiate them, such as core-configuration (the number of enabled CUDA cores), and the same is highly likely with the RTX 4080. The RTX 4080 12 GB, RTX 4080 16 GB, and RTX 4090 could be NVIDIA's answers to AMD's RDNA3-based successors of the RX 6800, RX 6800 XT, and RX 6950 XT, respectively.
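None of the bus widths are confirmed, but if the rumored configurations hold (192-bit for the 12 GB card, 256-bit for the 16 GB card, and 384-bit for the RTX 4090, all at 21 Gbps), the peak-bandwidth gap between the variants is easy to sketch:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * data rate in Gbps.
# All configurations below are rumored figures, not confirmed by NVIDIA.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rumored_cards = {
    "RTX 4080 12GB (192-bit?)": (192, 21),
    "RTX 4080 16GB (256-bit?)": (256, 21),
    "RTX 4090 24GB (384-bit?)": (384, 21),
}
for name, (bus, rate) in rumored_cards.items():
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
```

On those assumptions the 12 GB card would have 504 GB/s against the 16 GB card's 672 GB/s — a 25% deficit, which is why the memory-size difference alone understates the gap between the two variants.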
Sources: Wccftech, MEGAsizeGPU (Twitter)

61 Comments on NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants

#51
Selaya
Vayra86: Plenty... a handful ;) And only if you go completely crazy on them. So yeah, sure, for that tiny 1% subset there is a market over 16GB RAM (in gaming).

The very vast majority can make everything happen with far less. VRAM, sure, you could get 16GB today and fill it up. But I think with the new architectures, 12GB is fine, and it will also carry all older titles that can't use said architectural improvements - for that, the olde 8GB was just on the edge of risk, and it's also why those 11GB Pascal GPUs were in a fine top spot. They will munch through anything you can get right now, and will run into GPU core constraints if you push the latest hard. It's similar to the age of 4GB Maxwell, while the 980 Ti carried 6GB; it is almost exclusively that 6GB that carried it another few years longer than everything else that competed. I think we're at that point now, but for 12GB versus 8 or 10 - not 16.

The reason is console developments, and parity with the PC mainstream. We're still dealing with developers, market share, and specs targeted at those crowds. Current consoles have 16GB total, but they have to run an OS, and as a result both run in some crippled combo that incurs speed penalties. At best, either console has 13.5 GB at a (much) lower bandwidth/speed. That's the reason a tiny handful of games can already exhibit small problems with 8 or 10.

16GB is perhaps only going to matter in the very top end, much like it does now - nice to have at best, in a niche. Newer titles that lean heavily on VRAM will also be leaning on the new architectures. We've also seen that 10GB 3080s are mostly fine, but occasionally/rarely not. 12GB should carry another few years at least, even at the top end.

That said... I think we're very close to 'the jump to 32GB RAM' as a mainstream gaming option for PCs, but that'll really only be useful in more than a few games once the next console gen is out and DDR5 at decent speeds is priced sanely.

The TL;DR / my point: there is no real 'sense' in chasing way ahead of the mainstream 'high end' gaming spec. Games won't utilize it properly, won't even support it sometimes, or you'll just find a crapload of horsepower wasted on more heat and noise. It's just getting the latest and greatest for giggles. Let's call it what it is - the same type of sense that lives in gamer minds buying that 3090 (Ti) over that 3080 at an immense premium.
Oh, you're absolutely correct on the VRAM part; I should've commented that out, I guess.
But system memory is just so cheap these days, it makes very little sense not to go for at least 32 GiB unless, again, it's a hard budget build. And if you're buying something like a 3070, you might as well spring for the 32 GiB of RAM.
#52
Hxx
ixi: Give me a 4080 16GB for 500e and we have a deal, ngreedia.
lmao, amazing some folks still believe this. Nvidia has been successfully selling high-end cards for $1K for almost 3 years. There is almost zero chance a 4080 will be less than about $900 retail. Might as well ignore this release and grab a used 3080 or a 3070 Ti for 500 and call it a day.
#53
erocker
ZoneDymo: I mean sure, but who is going to get an RTX 4080 only to run limited settings because you got the 12GB model... it's just a weird limitation that should not exist.
Nvidia could price them right. At the right price it could be a really good value for 1440p. ...but this is Nvidia, that usually doesn't happen.
#54
kapone32
erocker: Nvidia could price them right. At the right price it could be a really good value for 1440p. ...but this is Nvidia, that usually doesn't happen.
The thing is, the current gen is already plenty for 1440p.
#55
ModEl4
Lol, Nvidia is trolling AMD and/or the leakers!
It's not on a different die; there is no way AD102, based on the 12-GPC rumor, is cut down so much that it fits AD103 performance - they would have to fully disable 4 GPCs, which isn't happening.
Probably it's the same AD103, but I doubt the name would stay the same; it's just there to confuse whoever wants to figure out Nvidia's model placement.
Probably a model sitting above the full AD104 (if the full AD104 is the 4070, then this is the 4070 Ti, for example).
According to rumors up till now, the 4080 was a cut-down AD103 with 9728 CUDA cores, a 256-bit bus, and 21 Gbps GDDR6.
A week or two ago we had the rumor regarding 23 Gbps memory, which correlates very well with a cut-down 6-GPC AD103 model, because the extra memory bandwidth (23/21) will be needed with a 192-bit bus, since the ROP cut is only 15% (7/6).
Of course it could have the same 7 GPCs and 9728 CUDA cores as the previous rumor (or 7 GPCs with 10240 CUDA cores) supported by a 192-bit bus with 23 Gbps GDDR6X instead of a 256-bit bus with 21 Gbps, but it would be bandwidth-limited.
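The ratio argument above can be sketched numerically. This is a rough sanity check only - every figure here (bus widths, data rates, GPC counts) comes from the rumors being discussed, none is confirmed:

```python
# Rumored figures for a cut-down AD103-based 4080 vs. the full configuration.
full_bus, cut_bus = 256, 192      # bus width in bits
old_rate, new_rate = 21, 23       # memory data rate in Gbps
full_gpc, cut_gpc = 7, 6          # rumored GPC counts

# Bandwidth retained: 192-bit @ 23 Gbps relative to 256-bit @ 21 Gbps.
bw_ratio = (cut_bus * new_rate) / (full_bus * old_rate)
# Compute retained, assuming performance scales roughly with GPC count.
compute_ratio = cut_gpc / full_gpc

print(f"Bandwidth retained: {bw_ratio:.1%}")    # ~82%, i.e. ~18% less
print(f"Compute retained:   {compute_ratio:.1%}")  # ~86%, i.e. ~14% less
```

Even with the 23 Gbps bump, bandwidth falls faster (~18%) than compute (~14%) on these rumored numbers, which is the basis for the "bandwidth-limited" conclusion above.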
Below is what the rumors were suggesting regarding the Ada architecture/AD103 model:

#56
Minus Infinity
So the 16GB will be on a 256-bit bus and the 12GB will be on a 384-bit bus?

BTW, for RDNA3 AMD is increasing bus width after telling us Infinity Cache was the answer: 7900 384-bit, 7800 320-bit, 7700 256-bit. Nvidia is cutting bus width on everything other than the 4090 series. AMD realised that with 4K gaming, Infinity Cache wasn't enough, as they watched often-large 1440p leads get reversed at 4K.

RDNA3 is looking better all the time.
#57
ixi
Hxx: lmao, amazing some folks still believe this. Nvidia has been successfully selling high-end cards for $1K for almost 3 years. There is almost zero chance a 4080 will be less than about $900 retail. Might as well ignore this release and grab a used 3080 or a 3070 Ti for 500 and call it a day.
Luckily I can get a 3070 Ti for 400e and a 3080 for 500 (new, not used) :) - of course it's via contacts, not retail. You can call me or those "others" whatever you want. If you want to overpay for a product - do it, your will and choice. And if that makes you happy - do it.
#58
Vayra86
Hxx: lmao, amazing some folks still believe this. Nvidia has been successfully selling high-end cards for $1K for almost 3 years. There is almost zero chance a 4080 will be less than about $900 retail. Might as well ignore this release and grab a used 3080 or a 3070 Ti for 500 and call it a day.
It's not about belief; it's about where the limits are for people in this market.

I'm of much the same nature - the fact some idiots overpay for GPUs doesn't mean I somehow need to feel forced to do the same. I just get those off their hands at half price a bit later. NP, ty. It's the same in every market: those who want the latest and greatest pay a premium. To each their own.

A good deal is one where you're deciding what it is, not someone else deciding it for you or 'circumstance' somehow giving you FOMO. Zero chance of whatever being there at what price? We'll see. People are just going to sit on their money, it has happened in the PC and in the gaming world quite a few times already, alongside all those who overpay for product just the same. Both groups are relevant.

And it's fun looking back on those good deals too. I'm still using a GTX 1080 bought in 2017, playing anything I really want to play to date, even at 3440x1440, as the GPU survived the move from 1080p just fine. Still smooth frames, and even >80 FPS in most stuff I find worthwhile. And then realizing I paid under 500 for that GPU - that's value ;) And it's still running as it did on day one, too. And what did I miss over the years since 2017? A few ray-traced light sources in a handful of utterly forgettable titles? It's hilarious what a little patience and restraint have on offer in that sense ;)

Another thing you might not fully realize is that we're about to crash into a recession; inflation is high, and any entertainment-based market is going to feel the biggest blow as large consumer groups are forced to devote time and money to survival instead of 'fun'. GL with $1K GPUs in that environment. The time of free money is behind us...
#59
neatfeatguy
Tigerfox: They stayed away from asymmetric memory setups after the infamous GTX 970, and I don't expect them to resume this bad habit on the 4080 in particular. Plus, there would be a huge performance difference between 192-bit with 12GB and 256-bit with 16GB.
Look at the RTX 2060 12GB vs. the RTX 2060 Super:
the Super uses a 256-bit bus and 8GB;
the 12GB uses a 192-bit bus and, clearly, 12GB.
Both have the same specs in terms of clock speeds (1470/1750) and memory type (GDDR6), but the Super is only around 6% faster.

I couldn't imagine a 4080 12GB being that far behind a 4080 16GB. I think it would just be stupid if they dropped both models. How are they going to leave room for a Ti or Super model down the road if they're hogging all the small stepping stones of performance right out of the gate?
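That ~6% gap is striking given the bandwidth difference between the two cards. A quick sketch, using the published 14 Gbps GDDR6 data rate both cards ship with, shows the 12GB model gives up a quarter of the Super's bandwidth:

```python
# RTX 2060 Super vs. RTX 2060 12GB: same 14 Gbps GDDR6, different bus width.
def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * rate_gbps

super_bw = bandwidth_gbs(256, 14)   # RTX 2060 Super: 448 GB/s
twelve_bw = bandwidth_gbs(192, 14)  # RTX 2060 12GB: 336 GB/s
deficit = 1 - twelve_bw / super_bw  # 25% less peak bandwidth

print(f"{super_bw:.0f} GB/s vs {twelve_bw:.0f} GB/s ({deficit:.0%} less bandwidth)")
```

A 25% bandwidth cut translating to only ~6% real-world difference supports the point above: the performance spread between a 192-bit 4080 12GB and a 256-bit 4080 16GB may be smaller than the raw bandwidth numbers suggest, though core-count differences could widen it.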
#60
Hxx
Vayra86: It's not about belief; it's about where the limits are for people in this market.

I'm of much the same nature - the fact some idiots overpay for GPUs doesn't mean I somehow need to feel forced to do the same. I just get those off their hands at half price a bit later. NP, ty. It's the same in every market: those who want the latest and greatest pay a premium. To each their own.

A good deal is one where you're deciding what it is, not someone else deciding it for you or 'circumstance' somehow giving you FOMO. Zero chance of whatever being there at what price? We'll see. People are just going to sit on their money, it has happened in the PC and in the gaming world quite a few times already, alongside all those who overpay for product just the same. Both groups are relevant.

And it's fun looking back on those good deals too. I'm still using a GTX 1080 bought in 2017, playing anything I really want to play to date, even at 3440x1440, as the GPU survived the move from 1080p just fine. Still smooth frames, and even >80 FPS in most stuff I find worthwhile. And then realizing I paid under 500 for that GPU - that's value ;) And it's still running as it did on day one, too. And what did I miss over the years since 2017? A few ray-traced light sources in a handful of utterly forgettable titles? It's hilarious what a little patience and restraint have on offer in that sense ;)

Another thing you might not fully realize is that we're about to crash into a recession; inflation is high, and any entertainment-based market is going to feel the biggest blow as large consumer groups are forced to devote time and money to survival instead of 'fun'. GL with $1K GPUs in that environment. The time of free money is behind us...
Those "limits" or whatever you're calling them - it's called market price, and obviously there are more than a few idiots if Nvidia thinks they can sell their inventory at a certain price. It's not about being a good deal - that's what you want it to be, which btw contradicts your first sentence - when realistically things are worth what people are willing to pay. That's capitalism for you.
ixi: Luckily I can get a 3070 Ti for 400e and a 3080 for 500 (new, not used) :) - of course it's via contacts, not retail. You can call me or those "others" whatever you want. If you want to overpay for a product - do it, your will and choice. And if that makes you happy - do it.
If you can get a 500-euro RTX 3080, then ignore this upcoming release. At least I would. Nothing will come out of Nvidia this year that will beat that value.
#61
Vayra86
Hxx: Those "limits" or whatever you're calling them - it's called market price, and obviously there are more than a few idiots if Nvidia thinks they can sell their inventory at a certain price. It's not about being a good deal - that's what you want it to be, which btw contradicts your first sentence - when realistically things are worth what people are willing to pay. That's capitalism for you.

If you can get a 500-euro RTX 3080, then ignore this upcoming release. At least I would. Nothing will come out of Nvidia this year that will beat that value.
My idea of a good deal is one where people consider the price fair for the product they get, whatever their arguments in that sense - so sure, this is subjective as hell!