Wednesday, September 14th 2022

NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

When we first got news that NVIDIA's upcoming GeForce RTX 4080 "Ada" would come in 12 GB and 16 GB variants, we knew there was more setting the two apart than just memory size and memory bus width. It turns out there's a lot more. According to detailed specifications leaked to the web, while the 16 GB variant of the RTX 4080 is based on the AD103, the second-largest chip after the AD102, the 12 GB RTX 4080 is based on the smaller AD104 chip, which has a physically narrower memory bus.

It looks like NVIDIA is debuting the RTX 40-series with at least three models: RTX 4090 24 GB, RTX 4080 16 GB, and RTX 4080 12 GB. The RTX 4090 is the top-dog part, with the ASIC code "AD102-300-xx." It's endowed with 16,384 CUDA cores, a boost frequency of up to 2.52 GHz, 24 GB of 21 Gbps GDDR6X memory, and a typical graphics power (TGP) of 450 W, which is "configurable" up to 600 W. The RTX 4080 16 GB, based on the "AD103-300-xx," comes with 9,728 CUDA cores, a boost frequency of 2.50 GHz, and 16 GB of 23 Gbps GDDR6X memory across a narrower memory bus than the RTX 4090's. This card reportedly has a 340 W TGP, configurable up to 516 W.
The GeForce RTX 4080 12 GB is positioned a notch below its 16 GB namesake, but is based on the smaller AD104 chip, with 7,680 CUDA cores running at speeds of up to 2.61 GHz, 12 GB of 21 Gbps GDDR6X memory, and a TGP of 285 W that's configurable up to 366 W. It's interesting how the leak includes not just TGP, but also maximum configurable TGP. Board partners can use the latter as the power limit on their factory-overclocked cards. Even the NVIDIA Founders Edition board is technically a "custom design," and so it could feature a higher-than-stock TGP.
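The narrower bus matters because peak memory bandwidth scales directly with it. Here's a minimal sketch of the arithmetic, assuming the 256-bit bus that a 16 GB GDDR6X configuration implies and the 192-bit bus that 12 GB implies; neither width is confirmed by the leak:

```python
# Theoretical peak memory bandwidth in GB/s:
# bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Bus widths below are assumptions implied by the memory sizes, not leaked figures.
print(peak_bandwidth_gb_s(256, 23.0))  # RTX 4080 16 GB -> 736.0 GB/s
print(peak_bandwidth_gb_s(192, 21.0))  # RTX 4080 12 GB -> 504.0 GB/s
```

If those widths hold, the 12 GB card would offer roughly two-thirds of the 16 GB card's memory bandwidth, on top of its lower shader count.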
Source: VideoCardz

66 Comments on NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

#26
SOAREVERSOR
LenneTruly some ugly mismarketing. I feel bad for those who don't know that much about computers and get the cheaper one just because "12GB is fine for me," when the card is actually way slower than the similarly named one.
I think the last nvidia GPU I bought that I didn't feel fucked after buying was the 8800gtx SLI config. The last GPU I bought that I felt good about and was really happy with the price of was a 6800gt for a second rig. The first rig had dual 6800 ultras and was really nice, but that 6800gt only took one power connector, was single-slot, and spat out Doom 3 and HL2 at 1600 vertical just fine. The 9700 pro was also a gem. Then it goes back to Glide graphics and good lord what a mess.
Posted on Reply
#27
Ruru
S.T.A.R.S.
SOAREVERSORI think the last nvidia GPU I bought that I didn't feel fucked after buying was the 8800gtx SLI config. The last GPU I bought that I felt good about and was really happy with the price of was a 6800gt for a second rig. The first rig had dual 6800 ultras and was really nice, but that 6800gt only took one power connector, was single-slot, and spat out Doom 3 and HL2 at 1600 vertical just fine. The 9700 pro was also a gem. Then it goes back to Glide graphics and good lord what a mess.
Yeah, as the GT wasn't disabled in any way, just slightly lower clocks, and those practically always OC'd to Ultra clocks.
Posted on Reply
#28
1d10t
TheoneandonlyMrKAD106 afaik, not these.

Don't like this personally, and there is likely to be a performance disparity between the 12/16GB parts, how could there not be?!
Was meant for sarcasm but continue on :D
Posted on Reply
#29
Pumper
50% price difference confirmed, lol.
Posted on Reply
#30
64K
The 4080 will be a cluster-fuck.
Posted on Reply
#31
bug
64KThe 4080 will be a cluster-fuck.
Because of the label on the box? Sure.
Posted on Reply
#32
TheoneandonlyMrK
1d10tWas meant for sarcasm but continue on :D
Continue on with what?!

My reply had no snark, and your original post had no :p either.
Posted on Reply
#33
64K
bugBecause of the label on the box? Sure.
No, because the 4080 will not only have two different VRAM sizes but also different shader counts.
Posted on Reply
#34
TheoneandonlyMrK
64KNo, because the 4080 will not only have two different VRAM sizes but also different shader counts.
Not to mention memory bus width, and hence bandwidth.
Posted on Reply
#35
oxrufiioxo
64KNo, because the 4080 will not only have two different VRAM sizes but also different shader counts.
I personally don't really care, as anyone with half a brain should be able to do their own research when buying a likely $700+ GPU... My worry is Nvidia is doing this to hit the $699 price point with the 12GB version while the 16GB variant (the real 4080) will be closer to $999.
Posted on Reply
#36
64K
oxrufiioxoI personally don't really care, as anyone with half a brain should be able to do their own research when buying a likely $700+ GPU... My worry is Nvidia is doing this to hit the $699 price point with the 12GB version while the 16GB variant (the real 4080) will be closer to $999.
imo most gamers don't visit tech sites so they won't even know the differences. It's like the two 1060s. Even game developers didn't distinguish the two in their system requirements at first.
Posted on Reply
#37
oxrufiioxo
64Kimo most gamers don't visit tech sites so they won't even know the differences. It's like the two 1060s. Even game developers didn't distinguish the two in their system requirements at first.
So you're on a crusade to help gamers who don't educate themselves and are likely buying an Alienware or iBuyPower system? Cool.

Get the good word out there I guess.

Personally those gamers are better off on a PS5/SeriesX

Edit: On a $200-400 product I'm with you, but on a $700+ one I don't feel bad for people who make bad decisions, especially whole-system purchases that likely cost $1,500-2,000. I also told countless people not to buy the 1060 3GB; most didn't listen, stating 3GB is plenty for 1080p...
Posted on Reply
#38
TheoneandonlyMrK
oxrufiioxoI personally don't really care, as anyone with half a brain should be able to do their own research when buying a likely $700+ GPU... My worry is Nvidia is doing this to hit the $699 price point with the 12GB version while the 16GB variant (the real 4080) will be closer to $999.
And we all know how reliable their MSRP is, eh.
Posted on Reply
#39
oxrufiioxo
TheoneandonlyMrKAnd we all know how reliable their MSRP is, eh.
For realz... Here's your 12GB 4080 for the low, low price of $699 (but really $800, sucka). I am hopeful that in the current market they aren't able to get away with this, though.
Posted on Reply
#40
64K
oxrufiioxoPersonally those gamers are better off on a PS5/SeriesX
There are tens of millions of PC gamers out there. The word is already out here, but there's no mention on the PC Gamer site.
Posted on Reply
#41
oxrufiioxo
64KThere are tens of millions of PC gamers out there. The word is already out here, but there's no mention on the PC Gamer site.
People who can't even type "what's the difference between X GPU vs Y GPU" into Google have bigger problems than Nvidia's choice to segment its products. For those who don't even know what a GPU is, ignorance is bliss I guess.
Posted on Reply
#42
64K
oxrufiioxoPeople who can't even type "what's the difference between X GPU vs Y GPU" into Google have bigger problems than Nvidia's choice to segment its products. For those who don't even know what a GPU is, ignorance is bliss I guess.
But there's no reason for the uninformed to question the difference in the 4080s, except of course for the price. That is what Nvidia is counting on, and the developers of games will just say "4080" in requirements for 4K.

I don't even know if the differences will show up on sites like PC Gamer. Their tech editor is mostly clueless about tech. He once stated in a review of a GTX 690 that it would give better 4K performance because it had twice as much VRAM.
Posted on Reply
#43
oxrufiioxo
64KBut there's no reason for the uninformed to question the difference in the 4080s, except of course for the price. That is what Nvidia is counting on, and the developers of games will just say "4080" in requirements for 4K.

I don't even know if the differences will show up on sites like PC Gamer. Their tech editor is mostly clueless about tech. He once stated in a review of a GTX 690 that it would give better 4K performance because it had twice as much VRAM.
I actually liked the GTX 690, it was a pretty neat card. I preferred my SLI 680s, but only taking up one slot was pretty cool. A tear for the death of SLI, although I can't even imagine doing it with my 3080 Ti :laugh:

I only see the two 4080s as being an issue if they are priced the same, or at least very close in price. I doubt that will be the case; the specs are pretty different. I'm also not convinced this isn't an OEM-only variant, where 300+ watt cards are a bad idea as it is.

This isn't the same situation as with the 1030, which came in both DDR4 and GDDR5 variants that cost about the same and performed very differently, and which preyed on the market that would be most impacted by this.
Posted on Reply
#44
TheoneandonlyMrK
oxrufiioxoI actually liked the GTX 690, it was a pretty neat card. I preferred my SLI 680s, but only taking up one slot was pretty cool. A tear for the death of SLI, although I can't even imagine doing it with my 3080 Ti :laugh:

I only see the two 4080s as being an issue if they are priced the same, or at least very close in price. I doubt that will be the case; the specs are pretty different. I'm also not convinced this isn't an OEM-only variant, where 300+ watt cards are a bad idea as it is.

This isn't the same situation as with the 1030, which came in both DDR4 and GDDR5 variants that cost about the same and performed very differently, and which preyed on the market that would be most impacted by this.
Isn't it.

Because I expect the 16GB to be out and reviewed first.

Then they ship the LESSER-performing part with increased customer excitement for the slightly cheaper one.
Hopefully W1zzard will straighten it all out for people, but I will shit in the thread if I am right and the 12GB is slightly delayed.

Because that's a scum tactic.
Posted on Reply
#45
oxrufiioxo
TheoneandonlyMrKIsn't it.

Because I expect the 16GB to be out and reviewed first.

Then they ship the LESSER-performing part with increased customer excitement for the slightly cheaper one.
Hopefully W1zzard will straighten it all out for people, but I will shit in the thread if I am right and the 12GB is slightly delayed.

Because that's a scum tactic.
Not really sure what Nvidia is thinking, but going by the specs I expect these cards to be $200+ apart, making them hard to confuse. They'd look better just calling the 12GB variant the 4070 Ti; they could price it around $650 if it's indeed faster than or similar to a 3090, and look good. People smarter than me when it comes to marketing a product are making these decisions, so if they think people are stupid enough to confuse them, I guess it must be true.

None of these companies are our friends; they are simply here to make money. So really it comes down to how good a product both variants are, not what they're named. Hopefully they both end up decent products without highly inflated prices vs. the 30 series.
Posted on Reply
#46
TheoneandonlyMrK
oxrufiioxoNot really sure what Nvidia is thinking, but going by the specs I expect these cards to be $200+ apart, making them hard to confuse. They'd look better just calling the 12GB variant the 4070 Ti; they could price it around $650 if it's indeed faster than or similar to a 3090, and look good. People smarter than me when it comes to marketing a product are making these decisions, so if they think people are stupid enough to confuse them, I guess it must be true.

None of these companies are our friends; they are simply here to make money. So really it comes down to how good a product both variants are, not what they're named. Hopefully they both end up decent products without highly inflated prices vs. the 30 series.
Paul's Hardware mentioned a very good point.

A lot of the cases people have now are not going to work well with 800 watts under load.

There's going to be a glut of random restarts etc. too, since many will chance that their "future-proof" 850 W PSU can do it, and some can't.
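A back-of-the-envelope check of that headroom problem; the 600 W figure is the leaked maximum configurable TGP, while the CPU and rest-of-system numbers are rough assumptions, not leaked figures:

```python
# Rough PSU headroom estimate. Only the GPU figure comes from the leak;
# the CPU and rest-of-system numbers are assumptions for illustration.
gpu_max_w = 600   # leaked maximum configurable TGP (RTX 4090)
cpu_w = 250       # assumed high-end CPU under combined load
rest_w = 100      # assumed motherboard, fans, drives, etc.

total_w = gpu_max_w + cpu_w + rest_w
psu_w = 850       # the "future-proof" PSU in question
print(f"Estimated peak draw: {total_w} W vs. {psu_w} W PSU")  # 950 W vs. 850 W
```

Worst case, an overclocked card at its maximum configurable TGP could push total draw past 850 W before transient spikes are even counted.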
Posted on Reply
#47
Easo
I really, really do not want to know how much they will cost. And this is totally unnecessary, but I guess the marketing people at Nvidia know what they are doing.
Posted on Reply
#48
oxrufiioxo
TheoneandonlyMrKPaul's Hardware mentioned a very good point.

A lot of the cases people have now are not going to work well with 800 watts under load.

There's going to be a glut of random restarts etc. too, since many will chance that their "future-proof" 850 W PSU can do it, and some can't.
The 12GB variant may end up being the better card for a lot of people. Thankfully my O11 Dynamic XL with 7 Phanteks T30 fans should be up to the task, I hope lol... I've been recommending people grab at least a 1 kW PSU since Ampere launched if they plan on going higher than a 3070, just in case. It should be interesting regardless.

My 3080 Ti actually runs pretty cool at 450 W maxed-out power limits, 65-70°C, but I doubt 600 W is realistically doable. I still might try lol.
Posted on Reply
#49
evernessince
P4-630If the performance is there, I have no issues with it, even consuming less power....
The problem lies in that it's clearly a tactic designed to take advantage of customer expectations. The name would imply that it lands close to the 4080 16GB, but rumors suggest that it's more like a 4070 Ti. If the performance is too far from the 4080, then the naming is misleading, plain and simple.

In addition, it sets Nvidia up for pulling more shenanigans in the future. PC gamers might have to worry about 4080s and future generations of cards with significant differences in performance despite the name suggesting they are reasonably the same. If people don't flag this now, there's nothing stopping Nvidia from bifurcating other SKUs and potentially increasing the gap in performance between said SKUs.

I really hope PC gamers put their foot down because this kind of trend only hurts consumers.
Posted on Reply
#50
64K
evernessinceThe problem lies in that it's clearly a tactic designed to take advantage of customer expectations. The name would imply that it lands close to the 4080 16GB, but rumors suggest that it's more like a 4070 Ti. If the performance is too far from the 4080, then the naming is misleading, plain and simple.

In addition, it sets Nvidia up for pulling more shenanigans in the future. PC gamers might have to worry about 4080s with significant differences in performance despite the name suggesting they are reasonably the same. If people don't flag this now, there's nothing stopping Nvidia from bifurcating other SKUs and potentially increasing the gap in performance between said SKUs.

I really hope PC gamers put their foot down because this kind of trend only hurts consumers.
Nvidia has been pulling shenanigans since the GTX 680.
Posted on Reply