Wednesday, September 14th 2022

NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

When we first got news that NVIDIA's upcoming GeForce RTX 4080 "Ada" would come in 12 GB and 16 GB variants, we knew there was more setting the two apart than just memory size and memory bus width. It turns out there's a lot more. According to detailed specifications leaked to the web, while the 16 GB variant of the RTX 4080 is based on the AD103, the second-largest chip after the AD102, the 12 GB RTX 4080 is based on the smaller AD104 chip, which has a physically narrower memory bus.

It looks like NVIDIA is debuting the RTX 40-series with at least three models—RTX 4090 24 GB, RTX 4080 16 GB, and RTX 4080 12 GB. The RTX 4090 is the top-dog part, with the ASIC code "AD102-300-xx." It's endowed with 16,384 CUDA cores, a boost frequency of up to 2.52 GHz, 24 GB of 21 Gbps GDDR6X memory, and a typical graphics power (TGP) of 450 W, which is "configurable" up to 600 W. The RTX 4080 16 GB, based on the "AD103-300-xx," comes with 9,728 CUDA cores, a boost frequency of 2.50 GHz, and 16 GB of 23 Gbps GDDR6X memory across a narrower memory bus than the one the RTX 4090 comes with. This card reportedly has a 340 W TGP configurable up to 516 W.
The GeForce RTX 4080 12 GB is positioned a notch below its 16 GB namesake, but is based on the smaller AD104 chip, with 7,680 CUDA cores running at speeds of up to 2.61 GHz, 12 GB of 21 Gbps GDDR6X memory, and a TGP of 285 W that's configurable up to 366 W. It's interesting that the leak includes not just the TGP, but also the maximum configurable TGP. Board partners will use the latter as the power limit for their overclocked designs. Even the NVIDIA Founders Edition board is technically a "custom design," and so it could feature a higher-than-stock TGP.
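The narrower bus matters because peak memory bandwidth follows directly from bus width and data rate. A minimal sketch of that arithmetic, assuming the widely rumored bus widths (384-bit for the AD102 card, 256-bit for AD103, 192-bit for AD104), which are inferred from the memory capacities and not confirmed by the leak itself:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
# The bus widths below are assumptions based on the rumored configurations,
# not figures stated in the leak.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 4090 24 GB": (384, 21.0),  # assumed 384-bit, 21 Gbps GDDR6X
    "RTX 4080 16 GB": (256, 23.0),  # assumed 256-bit, 23 Gbps GDDR6X
    "RTX 4080 12 GB": (192, 21.0),  # assumed 192-bit, 21 Gbps GDDR6X
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
```

Under those assumptions, the 12 GB card would have roughly two-thirds the bandwidth of the 16 GB card, which is a far bigger gap than the shared name suggests.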
Source: VideoCardz

66 Comments on NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

#1
oxrufiioxo
Just looking at specs, the lower-tier 4080 looks like it should be a 4070 Ti or an OEM-only variant....

Never been a huge fan of two different SKUs carrying the same name, though.
#2
P4-630
btarunrThe GeForce RTX 4090 12 GB is positioned a notch below its 16 GB namesake, but is based on the smaller AD104 chip, with 7,680 CUDA cores
I guess you mean the RTX 4080 12GB....
#3
Garrus
These are tricks to keep nVidia's outrageous 67 percent gross margin going (or at least attempt to keep the good times going for them, not for us).

The RTX 3080 used the top 102 chip; if the 4080 12GB uses AD104 instead of the 103 or 102, you are actually paying the same money for 2 tiers lower performance in the stack compared to last gen.
#5
P4-630
Garrusyou are actually paying the same money for 2 tiers lower performance in the stack compared to last gen.
If the performance is there, I have no issues with it, even consuming less power....
#6
oxrufiioxo
P4-630If the performance is there, I have no issues with it, even consuming less power....
I agree; if the 4080 is 30-40% faster at the same-ish price as the 3080, it'll be fine.

The 600, 700, 1000, and 2000 non-Ti 80-series cards all used the 104 die. Ampere was the first time since the GTX 580 that we got above that for an 80-tier non-Ti card.
#7
HisDivineOrder
"Buy a 4080 today! Starting at $699.99."

Back in the day, these "4080"s would have been a 4070. Now they want to charge more without admitting they've once again raised the price of the tiers, so they're disguising it by calling it a 4080. Far from "who cares if it performs," I'd care, because they're charging you more for a performance tier they would have sold you for less in prior generations, because it would have been designated a lower tier. This way, they get to reduce relative performance per tier, because if THIS is considered a 4080, then imagine the 4070 below it. Or the 4060 below that.
#8
Minus Infinity
So: 4090 at 384-bit, 4080 16GB at 256-bit, and 4080 12GB at 192-bit, i.e. a glorified 3060 Ti. This confirms the 4070 Ti will be 192-bit at most, and the 4070 maybe 160-bit.
#9
ZoneDymo
Sure, you can say people buying a GPU will be tech-savvy enough, but I still feel this is done just to confuse and borderline scam people.
They already make too many SKUs, but to then also have different versions of cards with the same name...

And yes I know both AMD and Nvidia have done this in the past (which I hated then as well), this needs to stop honestly.

Reviewers now have a lot more work telling people how each version of the same GPU performs....
#10
1d10t
Which one utilizes PCIe x8?
#11
TheoneandonlyMrK
1d10tWhich one utilizes PCIe x8?
AD106, AFAIK, not these.

Don't like this personally, and there is likely to be a performance disparity between the 12 GB and 16 GB parts; how could there not be?
#12
Unregistered
Why not just call it a 4070ti and avoid confusion, unless it's the aim.
#13
Arco
Xex360Why not just call it a 4070ti and avoid confusion, unless it's the aim.
Always the aim; why not make 30 cards with very similar names and then have people end up with an inferior card?
#14
ModEl4
Lol, regarding model rumors for the full AD104, we went from RTX 4070 12GB (best case) to RTX 4080 12GB (worst case).
So from a 3070 successor ($499) we went to a 3080 successor ($699), skipping the 3070 Ti successor's ($599) price level entirely.
So I don't know what pricing level this rumor suggests; maybe something like the scenario below:

$999-$799 16GB cut-down AD103 based
$799-$649 12GB Full AD104 based
$649-$499 10GB cut-down AD104 based

If true, pricing seems to be getting worse by the day!
#15
Arco
ModEl4Lol, regarding model rumors for the full AD104, we went from RTX 4070 12GB (best case) to RTX 4080 12GB (worst case).
So from a 3070 successor ($499) we went to a 3080 successor ($699), skipping the 3070 Ti successor's ($599) price level entirely.
So I don't know what pricing level this rumor suggests; maybe something like the scenario below:

$999-$799 16GB cut-down AD103 based
$799-$649 12GB Full AD104 based
$649-$499 10GB cut-down AD104 based

If true, pricing seems to be getting worse by the day!
Oof, do the Black Friday deals change the prices much at all?
#16
ModEl4
ArcoOof, do the Black Friday deals change the prices much at all?
Don't take the rumors too seriously (and more so the prices I quoted, since I don't have any info and the reply was spontaneous, without much thought).
In less than one week, we will probably have the real deal from Nvidia themselves at GTC.
#17
Arco
ModEl4Don't take the rumors too seriously (and more so the prices I quoted, since I don't have any info and the reply was spontaneous, without much thought).
In less than one week, we will probably have the real deal from Nvidia themselves at GTC.
Yeah, the rumor mill is going strong. Personally, I'm building my first setup on Black Friday to get the best deals I can. When are we going to get 6-slot coolers, or better yet...
#18
dom99
Another generation of GPUs I won't be buying because it costs more than my mortgage.

I don't know why people buy them at this price. £250 maximum is a sensible amount to spend on a GPU in my opinion.
#19
SOAREVERSOR
Xex360Why not just call it a 4070ti and avoid confusion, unless it's the aim.
RTXYZ 4090 TI GT ULTRA SUPER OVERCLOCKED
#20
Arco
dom99Another generation of GPUs I won't be buying because it costs more than my mortgage.

I don't know why people buy them at this price. £250 maximum is a sensible amount to spend on a GPU in my opinion.
If my build wasn't delayed like 2 years, that would be the case for me. But I have to buy basically another setup, excluding mic, mouse, and keyboard. So I might as well go insane and get a great setup with upgrade options (due to AM5, ATX 3.0, DDR5, and PCIe 5.0).
#21
DeathtoGnomes
I have to agree on the 12GB: it should be labeled a 4070/Ti. Calling it a 4080 is more like a bait-and-switch cash grab. :shadedshu:
#22
mama
I think we need to stop seeing the 4080 as being a direct replacement upgrade for the 3080. It's not. It won't be. It's marketing. Whatever Nvidia chooses to name a GPU, it should be reviewed on price-for-performance alone.
#23
Ruru
S.T.A.R.S.
Truly some ugly mismarketing. I feel bad for those who don't know that much about computers and get the cheaper one just because "12 GB is fine for me," when the card is actually way slower than the similarly named one.
#24
bug
ZoneDymoSure, you can say people buying a GPU will be tech-savvy enough, but I still feel this is done just to confuse and borderline scam people.
If you buy solely based on the sticker on the box, nothing can save you from making wrong choices.
If, however, you do the smallest amount of due diligence and read a review before buying, then you know exactly what performance you're buying. Internal organization of the GPU is pretty much irrelevant to the average buyer (and even for some enthusiasts).

I'm not a fan of using the same moniker for essentially different GPUs, but that's not the end of the world.
#25
DeathtoGnomes
bugbut that's not the end of the world.
LIES! Nvidia wants you to see their flagship cards as the end-all, be-all.

End-all of your wallet. :laugh: