Wednesday, September 14th 2022

NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

When we first got news that NVIDIA's upcoming GeForce RTX 4080 "Ada" would come in 12 GB and 16 GB variants, we knew there was more setting the two apart than just memory size and memory bus width. Turns out there's a lot more. According to detailed specifications leaked to the web, the 16 GB variant of the RTX 4080 is based on the AD103, the second-largest chip after the AD102, while the 12 GB RTX 4080 is based on the smaller AD104, which has a physically narrower memory bus.

It looks like NVIDIA is debuting the RTX 40-series with at least three models: RTX 4090 24 GB, RTX 4080 16 GB, and RTX 4080 12 GB. The RTX 4090 is the top-dog part, with the ASIC code "AD102-300-xx." It's endowed with 16,384 CUDA cores, a boost frequency of up to 2.52 GHz, 24 GB of 21 Gbps GDDR6X memory, and a typical graphics power (TGP) of 450 W, which is "configurable" up to 600 W. The RTX 4080 16 GB, based on the "AD103-300-xx," comes with 9,728 CUDA cores, a boost frequency of 2.50 GHz, and 16 GB of 23 Gbps GDDR6X memory across a narrower memory bus than that of the RTX 4090. This card reportedly has a 340 W TGP, configurable up to 516 W.
The GeForce RTX 4080 12 GB is positioned a notch below its 16 GB namesake, but is based on the smaller AD104 chip, with 7,680 CUDA cores running at speeds of up to 2.61 GHz, 12 GB of 21 Gbps GDDR6X memory, and a TGP of 285 W that's configurable up to 366 W. It's interesting how the leak includes not just the TGP, but also the maximum configurable TGP. Board partners will use the latter as the power limit for their factory-overclocked designs. Even the NVIDIA Founders Edition board is technically a "custom design," and so it could feature a higher-than-stock TGP.
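For a rough sense of how the three memory subsystems compare, peak bandwidth can be estimated from the leaked data rates and the memory bus width. The bus widths used below (384-bit for AD102, 256-bit for AD103, 192-bit for AD104) are assumptions inferred from the memory capacities, not figures from the leak itself; a minimal sketch:

```python
# Back-of-the-envelope memory bandwidth: bus_width (bits) * data_rate (Gbps) / 8 = GB/s.
# Data rates are from the leak; bus widths are assumed from the capacities, not confirmed.
cards = {
    "RTX 4090 24 GB": {"bus_bits": 384, "gbps": 21},  # AD102 (assumed 384-bit)
    "RTX 4080 16 GB": {"bus_bits": 256, "gbps": 23},  # AD103 (assumed 256-bit)
    "RTX 4080 12 GB": {"bus_bits": 192, "gbps": 21},  # AD104 (assumed 192-bit)
}

for name, c in cards.items():
    bandwidth_gbs = c["bus_bits"] * c["gbps"] / 8
    print(f"{name}: ~{bandwidth_gbs:.0f} GB/s")
# RTX 4090 24 GB: ~1008 GB/s
# RTX 4080 16 GB: ~736 GB/s
# RTX 4080 12 GB: ~504 GB/s
```

If those assumed bus widths hold, the two RTX 4080 cards would differ not just in capacity but in raw bandwidth as well.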
Source: VideoCardz

66 Comments on NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

#51
TheoneandonlyMrK
64KNVIDIA has been pulling shenanigans since the GTX 680.
In other news, refreshed 3060, 3060 Ti, and 3070 Ti cards using a cut-down GA102 are on the way too.

So yeah, they're not alone, but they do it best.
#52
Unregistered
dom99Another generation of GPUs I won't be buying because it costs more than my mortgage.

I don't know why people buy them at this price. £250 maximum is a sensible amount to spend on a GPU in my opinion.
Yep. I just wait and pick up used cards nowadays. My last couple of cards have come from the second-hand market. I refuse to contribute to normalizing the pricing they are pushing.
#53
bug
64KNo, because the 4080 will not only have two different VRAM sizes but also will have different amounts of shaders.
It happened to the 1060 and 2060 before and they were the actual opposite of a clusterfuck.
They'll just be different cards with the same name on the box. It won't matter the minute you install the card in your system and throw away the box.
#54
dyonoctis
bugIt happened to the 1060 and 2060 before and they were the actual opposite of a clusterfuck.
They'll just be different cards with the same name on the box. It won't matter the minute you install the card in your system and throw away the box.
Yeah, the 1060 3 GB was just 11% slower, and most knew to avoid it. The 2060 was a special case; it released within the Ampere generation to make up for the lack of stock. But they didn't have a 26% difference in CUDA count.
Anyway, we'll see once they are out whether they are really GPUs of the same caliber, deserving of sharing the same name. (Maybe it's time to bring back suffixes: "RTX 4080 GS" or "RTX 4080 MX".)
#56
THU31
ArcoYeah, the rumor mill is going strong. Personally, I'm building my first setup on Black Friday to get the best deals I can. When are we going to get 6-slot coolers, or better yet.
At the start of the RTX 20 announcement, Jensen said he was there to announce the GTX 1180.

It would be so cool if they followed up by showing a comically large 4080 and 4090 in honor of that April 1 video. I would have so much respect for them. :D


As for the article, it looks like a rebrand of the 4070 to the 4080 12 GB. If they sell that for $700, I will be skipping this generation. No point in upgrading for anything less than 30%. I usually prefer 50%. And my 2070 SUPER to 3080 upgrade gave me 80% in 4K.
#57
bug
THU31As for the article, it looks like a rebrand of the 4070 to the 4080 12 GB. If they sell that for $700, I will be skipping this generation. No point in upgrading for anything less than 30%. I usually prefer 50%. And my 2070 SUPER to 3080 upgrade gave me 80% in 4K.
Exactly. What matters when upgrading is performance (like you, I feel that if there's not an extra 50% to be gained, it doesn't really let you use higher settings to the point where you can actually see the difference). If the performance is there, would you dismiss the card just because it said 4090 on the box? Or maybe 4050? Or even 1234567890?
#58
Arco
bugExactly. What matters when upgrading is performance (like you, I feel that if there's not an extra 50% to be gained, it doesn't really let you use higher settings to the point where you can actually see the difference). If the performance is there, would you dismiss the card just because it said 4090 on the box? Or maybe 4050? Or even 1234567890?
My upgrades are like 6-8 years apart, but I get 300% per upgrade. (560 SE to 1060 3 GB.) This setup will get me at least 500% of my current performance! (1060 3 GB to 4080/4090.)
#59
bug
ArcoMy upgrades are like 6-8 years apart, but I get 300% per upgrade. (560 SE to 1060 3 GB.) This setup will get me at least 500% of my current performance! (1060 3 GB to 4080/4090.)
I'm not that stingy, but I always skip a generation or two. Now that I don't really have time for gaming anymore, I can probably slow down.
#60
Arco
bugI'm not that stingy, but I always skip a generation or two. Now that I don't really have time for gaming anymore, I can probably slow down.
I just need a good setup for 4K @ 120 FPS for a very long time. I don't play the newest games or run the biggest prosumer workloads, but I do want a lot of headroom.
#61
TheoneandonlyMrK
So according to the VideoCardz site, the 12 GB card uses 21 Gbps GDDR6X and the 16 GB card uses 23 Gbps GDDR6X.

With the bus width and shader differences, these are not going to perform the same at all; the 16 GB card is looking like a Ti version by comparison.
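On paper, that shader gap alone translates into a sizeable compute difference. A quick back-of-the-envelope sketch of peak FP32 throughput, counting two FMA operations per CUDA core per clock and using the leaked core counts and boost clocks (this ignores memory bandwidth and real-world scaling):

```python
# Rough peak FP32 throughput in TFLOPS: 2 ops per core per clock (FMA) * cores * boost clock (GHz) / 1000.
# Core counts and clocks are from the leaked specs; real-world performance will scale differently.
def peak_fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return 2 * cuda_cores * boost_ghz / 1000

tflops_16gb = peak_fp32_tflops(9728, 2.50)  # ~48.6 TFLOPS
tflops_12gb = peak_fp32_tflops(7680, 2.61)  # ~40.1 TFLOPS
print(f"RTX 4080 16 GB: ~{tflops_16gb:.1f} TFLOPS")
print(f"RTX 4080 12 GB: ~{tflops_12gb:.1f} TFLOPS")
print(f"Paper advantage for the 16 GB card: ~{tflops_16gb / tflops_12gb - 1:.0%}")  # ~21%
```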
#62
oxrufiioxo
TheoneandonlyMrKSo according to the VideoCardz site, the 12 GB card uses 21 Gbps GDDR6X and the 16 GB card uses 23 Gbps GDDR6X.

With the bus width and shader differences, these are not going to perform the same at all; the 16 GB card is looking like a Ti version by comparison.
The rumor is the 12 GB version was going to be the 4070 or 4070 Ti, but NVIDIA changed the name at the last minute so that they could charge 700+ for it. I guess we will know more on the 20th.
#63
64K
TheoneandonlyMrKSo according to the VideoCardz site, the 12 GB card uses 21 Gbps GDDR6X and the 16 GB card uses 23 Gbps GDDR6X.

With the bus width and shader differences, these are not going to perform the same at all; the 16 GB card is looking like a Ti version by comparison.
That is what I think should happen too. Call the 12 GB card a 4080 and the 16 GB card with more cores a 4080 Ti. That would end the confusion between the two GPUs and be the sensible way forward for NVIDIA. If they wanted to release faster 4080s later on, they could just tack "Super" onto the names.
#64
bug
oxrufiioxoThe rumor is the 12 GB version was going to be the 4070 or 4070 Ti, but NVIDIA changed the name at the last minute so that they could charge 700+ for it. I guess we will know more on the 20th.
These last-minute changes aren't usually done on a whim; they're most often done because one player learns something about the other's lineup.
#65
oxrufiioxo
bugThese last-minute changes aren't usually done on a whim; they're most often done because one player learns something about the other's lineup.
Personally, I couldn't care less what they call them; all I care about is the performance over what I currently own.