Friday, September 23rd 2022

NVIDIA AD103 and AD104 Chips Powering RTX 4080 Series Detailed

Here's our first look at the "AD103" and "AD104" chips powering the GeForce RTX 4080 16 GB and RTX 4080 12 GB, respectively, thanks to Ryan Smith from AnandTech. These are the second- and third-largest implementations of the GeForce "Ada" graphics architecture, with the "AD102" powering the RTX 4090 being the largest. Both chips are built on the same TSMC 4N (4 nm EUV) silicon fabrication process as the AD102, but trail it significantly in specifications. For example, the AD102 has a staggering 80 percent more number-crunching machinery than the AD103, and a 50 percent wider memory interface. The sheer numbers at play here enable NVIDIA to carve out dozens of SKUs based on these three chips alone, before we're shown the mid-range "AD106" in the future.

The AD103 die measures 378.6 mm², significantly smaller than the 608 mm² of the AD102, which is reflected in a much lower transistor count of 45.9 billion. The chip physically features 80 streaming multiprocessors (SM), which work out to 10,240 CUDA cores, 320 Tensor cores, 80 RT cores, and 320 TMUs. The chip is endowed with a healthy ROP count of 112, and has a 256-bit wide GDDR6X memory interface. The AD104 is smaller still, with a die size of 294.5 mm², a transistor count of 35.8 billion, 60 SM, 7,680 CUDA cores, 240 Tensor cores, 60 RT cores, 240 TMUs, and 80 ROPs. Ryan Smith says that the RTX 4080 12 GB maxes out the AD104, which means the chip's memory interface is physically just 192-bit wide.
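The per-chip unit counts above scale directly with the SM count. As a rough sketch (assuming Ada's ratio of 128 CUDA cores, 4 Tensor cores, 4 TMUs, and 1 RT core per SM, which the figures quoted above are consistent with), the totals can be derived like this:

```python
# Derive Ada unit totals from the SM count.
# Per-SM ratios are assumptions consistent with the figures above:
# 128 CUDA cores, 4 Tensor cores, 1 RT core, 4 TMUs per SM.
def ada_units(sm_count: int) -> dict:
    return {
        "CUDA cores": sm_count * 128,
        "Tensor cores": sm_count * 4,
        "RT cores": sm_count * 1,
        "TMUs": sm_count * 4,
    }

for chip, sm in (("AD103", 80), ("AD104", 60)):
    print(chip, ada_units(sm))
# AD103 {'CUDA cores': 10240, 'Tensor cores': 320, 'RT cores': 80, 'TMUs': 320}
# AD104 {'CUDA cores': 7680, 'Tensor cores': 240, 'RT cores': 60, 'TMUs': 240}
```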
Sources: Ryan Smith (Twitter), VideoCardz

152 Comments on NVIDIA AD103 and AD104 Chips Powering RTX 4080 Series Detailed

#126
Dirt Chip
AusWolfIt's not irrelevant.
Realising an item's real worth vs. what the price tag says is part of what makes you an informed consumer. Just because something costs $900 doesn't mean it's actually worth $900, even if you can afford it.
So you'll settle for a lesser product, even if you can afford the one that's best for you (and costs more), just because it offers less value compared to a past product?

If so, you don't really need to buy the product in the first place.
Posted on Reply
#127
Vayra86
Dirt ChipSo you shop by performance per Watt or per $, not by die size.
The same way you don't shop by memory bus or, in most situations, by memory size.
No, it's not really like that. If you have some experience across GPU generations with buying the 'best one', also in terms of performance/$, you know it's far too simple to just look at performance/$ for one isolated product at one moment in time.

GPUs last. And if you buy the right one, they can last very long. And when they last very long, the cost metric works out differently. The longer you can use a GPU, the more value you can extract from it. So damn right things like VRAM capacity matter. If your MO is buying small generational upgrades of, say, 30~40% each time, or skipping just one gen to get perhaps 50% on the same tier, you've not selected a price point; you just want every upgrade you can get your hands on, and it's not going to be cost-effective. This is the very reason I stuck with my GTX 1080 for so long. It still games fine, I don't miss much honestly. But the more important reasoning here is that any upgrade won't gain me anything meaningful, and the tier I usually buy into, to make sure I HAVE a GPU that can last 5 years plus, is far too expensive. Let's face it: a 10GB 3080 at a higher price than what I came from is simply not a great deal, no matter how much core perf it has over the past one. It's just missing resources. And a cheaper 12GB with much lower core power? Same issue: what's the point?! Gain 12 FPS in games you could already play?

There is a sweet spot within the high end of every GPU stack where you get something that 'exceeds the norm' for quite a while. Some call it future proofing, but I prefer to call it the PC sweet spot, and it relates closely to where consoles are at that time. It applies to CPU, GPU, RAM, storage. Going to the very top end of available performance is extremely expensive and not cost-effective, and because consoles aren't there yet, it won't pay off either, because that level isn't being optimized for. But staying right under, not buying at launch but afterwards, and taking a careful look at market developments, is extremely cost-effective. You'll have something that no software can reasonably bring to its knees in the first few years, and after that you'll still have 'enough' for quite a long time as the mainstream/median performance level in the market starts catching up.

So you can be damn sure I shop by memory size/bus/featureset and numerous other aspects of a GPU. When all those stars align, relative to what I'm upgrading from, and there is a gain of royally over 50% at a good price point (75~100% is better, and that time is close now for me), that's when I know I can get a new deal that will run anything for a long time and then some, while not getting ripped off.

The math is beautiful. In 2017 I paid 420 eur for a GTX 1080. I might soon sell it for 150-ish. I might buy a 550-600 eur replacement that's twice as fast. That's about 50 eur/year for high end gaming. Before mining, I made a similar move going from a 780 Ti to the 1080.
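For what it's worth, that arithmetic checks out (a quick sketch; the ~5.5 years of ownership is an assumption based on buying in 2017 and selling now):

```python
# Rough cost-of-ownership math from the post above.
# years_owned is an assumption: bought in 2017, sold in late 2022.
purchase_price = 420.0   # EUR paid for the GTX 1080 in 2017
resale_value   = 150.0   # EUR, estimated resale today
years_owned    = 5.5

net_cost_per_year = (purchase_price - resale_value) / years_owned
print(f"~{net_cost_per_year:.0f} EUR/year")  # ~49 EUR/year
```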
Posted on Reply
#128
AusWolf
Dirt ChipSo you'll settle for a lesser product, even if you can afford the one that's best for you (and costs more), just because it offers less value compared to a past product?
I'm not comparing it to a past product. I'm comparing it to its actual worth. If it doesn't match its price tag, I won't buy it. Even if I had a million £, I wouldn't buy a Lamborghini because it offers terrible value.
Dirt ChipIf so, you don't really need to buy the product in the first place.
That's right. Gaming is a hobby, not a necessity. There's still hundreds of old classics that I can play just fine with my 6500 XT. I'll want an upgrade eventually, but it won't have to be a ridiculously overpriced halo product from Nvidia.
Vayra86No, it's not really like that. If you have some experience across GPU generations with buying the 'best one', also in terms of performance/$, you know it's far too simple to just look at performance/$ for one isolated product at one moment in time.

GPUs last. And if you buy the right one, they can last very long. And when they last very long, the cost metric works out differently. The longer you can use a GPU, the more value you can extract from it. So damn right things like VRAM capacity matter. If your MO is buying small generational upgrades of, say, 30~40% each time, or skipping just one gen to get perhaps 50% on the same tier, you've not selected a price point; you just want every upgrade you can get your hands on, and it's not going to be cost-effective. This is the very reason I stuck with my GTX 1080 for so long. It still games fine, I don't miss much honestly. But the more important reasoning here is that any upgrade won't gain me anything meaningful, and the tier I usually buy into, to make sure I HAVE a GPU that can last 5 years plus, is far too expensive. Let's face it: a 10GB 3080 at a higher price than what I came from is simply not a great deal, no matter how much core perf it has over the past one. It's just missing resources. And a cheaper 12GB with much lower core power? Same issue: what's the point?! Gain 12 FPS in games you could already play?

There is a sweet spot within the high end of every GPU stack where you get something that 'exceeds the norm' for quite a while. Some call it future proofing, but I prefer to call it the PC sweet spot, and it relates closely to where consoles are at that time. It applies to CPU, GPU, RAM, storage. Going to the very top end of available performance is extremely expensive and not cost-effective, and because consoles aren't there yet, it won't pay off either, because that level isn't being optimized for. But staying right under, not buying at launch but afterwards, and taking a careful look at market developments, is extremely cost-effective. You'll have something that no software can reasonably bring to its knees in the first few years, and after that you'll still have 'enough' for quite a long time as the mainstream/median performance level in the market starts catching up.

So you can be damn sure I shop by memory size/bus/featureset and numerous other aspects of a GPU. When all those stars align, relative to what I'm upgrading from, and there is a gain of royally over 50% at a good price point (75~100% is better, and that time is close now for me), that's when I know I can get a new deal that will run anything for a long time and then some, while not getting ripped off.

The math is beautiful. In 2017 I paid 420 eur for a GTX 1080. I might soon sell it for 150-ish. I might buy a 550-600 eur replacement that's twice as fast. That's about 50 eur/year for high end gaming. Before mining, I made a similar move going from a 780 Ti to the 1080.
That's a good way of thinking about it. :)

My thinking is that there are three categories:
1. Halo products - They offer terrible value for money and depreciate quickly. In GPUs, avoid at all costs. In CPUs, only buy if your budget allows it. CPUs keep their usage value for a bit longer as generational upgrades aren't so significant. Unfortunately, Nvidia is positioning a bigger and bigger chunk of their product stack into this category. Spending nearly £1,000 on a graphics card that you'll swap for something else only a couple years later when DLSS 4 comes out and only runs on the next high-end thing is a terrible value, whichever way you look at it.
2. Mid-range - It gives you nearly the same experience as the top-end, but at a significantly lower price, and lower depreciation over time. The best value for money is here.
3. Low-end - Low price, low performance, and almost no resale value. This is where I shop when I'm curious about something new, but don't want to touch my savings for something unknown.

Edit: It's not just about price per performance, either. I couldn't care less if my games run at 60 or 100 fps, so from my point of view, the only difference between a midrange and high-end graphics card (besides price and power consumption / cooling requirements) is how well future games will run on it - which Nvidia kills by making new DLSS iterations run only on the newest generation. So effectively, the high-end offers worse value than ever before.
Posted on Reply
#129
N/A
The idea of a mid-range is broken with the 40 series, since the 4090 provides the best value. The 4080 12 GB is less than half of a 4090, yet costs more than half as much.
Take into account that Nvidia also skipped 7 nm; we are getting the 2024 cards ahead of time, so the die is more tightly packed than usual. The 4080 16 GB would be a 600 mm² chip unpacked.
AD104 is not your regular 4070. To compete with the 3080 Ti, a 4070 would have needed only 25.6 billion transistors, a 256-bit bus, and less L2.
Because of this, the whole lineup is messed up now, except the 4090, which provides incredible value. But for how long depends on how soon Nvidia moves to N3 or 3N.
Posted on Reply
#130
ARF
N/AThe idea of a mid-range is broken with the 40 series, since the 4090 provides the best value. The 4080 12 GB is less than half of a 4090, yet costs more than half as much.
Take into account that Nvidia also skipped 7 nm; we are getting the 2024 cards ahead of time, so the die is more tightly packed than usual. The 4080 16 GB would be a 600 mm² chip unpacked.
AD104 is not your regular 4070. To compete with the 3080 Ti, a 4070 would have needed only 25.6 billion transistors, a 256-bit bus, and less L2.
Because of this, the whole lineup is messed up now, except the 4090, which provides incredible value. But for how long depends on how soon Nvidia moves to N3 or 3N.
Yeah, what has changed is that today the low end and mid-range are charged premium taxes, while the halo enthusiast parts are left without them.

In a normal world, the 4080-12 should cost 599, while the 4090 should cost 1999.
What we see is the opposite: the halo is cheaper, while the mid-range is more expensive.
Posted on Reply
#131
AusWolf
ARFYeah, what has changed is that today the low end and mid-range are charged premium taxes, while the halo enthusiast parts are left without them.

In a normal world, the 4080-12 should cost 599, while the 4090 should cost 1999.
What we see is the opposite: the halo is cheaper, while the mid-range is more expensive.
What I see is the mid-range moving up into high-end and halo categories, while the low-end gets basically no attention.

What I mean is, x90 used to be halo tier, x80 high-end, x70 and x60 mid-range, x50 entry-level gamer, and x30 and x10 low-end, but now x90 and x80 are halo products, x70 is high-end, x60 and x50 are mid-range, and the low-end has basically ceased to exist. AMD is only a tiny bit better in this regard. They actually have entry-level gaming cards with the 6400 and 6500 XT.

This is very strange in a time when chip manufacturing costs more than ever and people have less and less money for hobbies.
Posted on Reply
#132
Vayra86
AusWolfI'm not comparing it to a past product. I'm comparing it to its actual worth. If it doesn't match its price tag, I won't buy it. Even if I had a million £, I wouldn't buy a Lamborghini because it offers terrible value.


That's right. Gaming is a hobby, not a necessity. There's still hundreds of old classics that I can play just fine with my 6500 XT. I'll want an upgrade eventually, but it won't have to be a ridiculously overpriced halo product from Nvidia.


That's a good way of thinking about it. :)

My thinking is that there are three categories:
1. Halo products - They offer terrible value for money and depreciate quickly. In GPUs, avoid at all costs. In CPUs, only buy if your budget allows it. CPUs keep their usage value for a bit longer as generational upgrades aren't so significant. Unfortunately, Nvidia is positioning a bigger and bigger chunk of their product stack into this category. Spending nearly £1,000 on a graphics card that you'll swap for something else only a couple years later when DLSS 4 comes out and only runs on the next high-end thing is a terrible value, whichever way you look at it.
2. Mid-range - It gives you nearly the same experience as the top-end, but at a significantly lower price, and lower depreciation over time. The best value for money is here.
3. Low-end - Low price, low performance, and almost no resale value. This is where I shop when I'm curious about something new, but don't want to touch my savings for something unknown.

Edit: It's not just about price per performance, either. I couldn't care less if my games run at 60 or 100 fps, so from my point of view, the only difference between a midrange and high-end graphics card (besides price and power consumption / cooling requirements) is how well future games will run on it - which Nvidia kills by making new DLSS iterations run only on the newest generation. So effectively, the high-end offers worse value than ever before.
Your point 1 conclusion hits the nail right on the head there: Nvidia is pushing a larger part of the stack into halo-product territory, and ironically that's caused not by monster specs, but only by a monstrous MSRP without much to show for it. It won't last.
Posted on Reply
#133
ARF
A 12 GB part cannot be "halo", because today's games and future games need more VRAM allocation.
Posted on Reply
#134
AusWolf
Vayra86Your point 1 conclusion hits the nail right on the head there: Nvidia is pushing a larger part of the stack into halo-product territory, and ironically that's caused not by monster specs, but only by a monstrous MSRP without much to show for it. It won't last.
The specs are monstrous, too, considering that you only need that category for 4K. Even my 6500 XT can play everything at 1080p. If I end up buying a 4070, 4060 or 7700 XT, I'll be sorted for a good few years.
Posted on Reply
#135
Vayra86
ARFA 12 GB part cannot be "halo", because today's games and future games need more VRAM allocation.
But that's just allocation, it's not usage, and you will never notice!!! /S — quote: Nvidia Ampere early adopters
AusWolfThe specs are monstrous, too, considering that you only need that category for 4K. Even my 6500 XT can play everything at 1080p. If I end up buying a 4070, 4060 or 7700 XT, I'll be sorted for a good few years.
4K on a 12 GB part? Lmao, that will last all of 12 months at best. All I see is monstrous shader counts that say nothing, alongside way too little bandwidth.
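To put a rough number on that bandwidth point (a back-of-the-envelope sketch; the 192-bit bus is from the article, while the 21 Gbps GDDR6X data rate is an assumption):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate per pin).
bus_width_bits = 192     # from the article: AD104 maxed out on the RTX 4080 12 GB
data_rate_gbps = 21.0    # assumed effective GDDR6X data rate per pin

bandwidth_gbs = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")  # 504 GB/s
```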
Posted on Reply
#136
AusWolf
ARFA 12 GB part cannot be "halo", because today's games and future games need more VRAM allocation.
Allocation and usage are different things. Most modern games allocate as much VRAM as they can without using all of it.
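If you want to see what a game has actually reserved on your own card, here's a minimal sketch using the NVML Python bindings (pynvml; assumes the package and an NVIDIA driver are installed). Note that NVML reports memory that has been allocated on the device, not the active working set, which is exactly the allocation-vs-usage distinction being made here:

```python
# Minimal sketch: read how much VRAM is currently reserved on GPU 0.
# Requires the pynvml (nvidia-ml-py) bindings and an NVIDIA driver.
# NVML reports allocated/reserved memory, not the active working set.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"VRAM reserved: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```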
Vayra864K on a 12 GB part? Lmao, that will last all of 12 months at best.
Depends on the game, I guess. The GPU resources are there, nonetheless. 12 months sounds about as long as Nvidia wants it to last. By that time, the 4080 Super 24 GB will be out.
Posted on Reply
#137
Vayra86
AusWolfAllocation and usage are different things. Most modern games allocate as much VRAM as they can without using all of it.


Depends on the game, I guess. The GPU resources are there, nonetheless. 12 months sounds about as long as Nvidia wants it to last. By that time, the 4080 Super 24 GB will be out.
Full allocation plus low bandwidth relative to core perf = stutter heaven. We've been here a few times in Nvidia history...

AMD also tried a top-end product with low VRAM but very high relative bandwidth, btw... the Fury X with a measly 4 GB. We have never seen a GPU fall off in performance over time faster than the Fury X. It lost against 6 GB cards with lower bandwidth every step of the way... losing its 1440p & 4K lead by the time Pascal released; the 980 Ti stayed relevant, while the Fury got relegated to midrange.

VRAM matters; it's the most effective tool for planned obsolescence.
Posted on Reply
#138
AusWolf
Vayra86Full allocation plus low bandwidth relative to core perf = stutter heaven. We've been here a few times in Nvidia history...
That will be magically solved by DLSS 4.0. ;) Oh wait... you'll need a 50-series card for that. :slap:

Edit: This is probably why they never released the 3070 Ti 16 GB. It would have made the entire 40-series pointless.
Posted on Reply
#139
Vayra86
AusWolfThat will be magically solved by DLSS 4.0. ;) Oh wait... you'll need a 50-series card for that. :slap:
With 13.5GB right :D
Posted on Reply
#140
AusWolf
Vayra86With 13.5GB right :D
That's the Nvidia recipe recently...
1. Pee in your pants seeing how fast the new generation is.
2. Fork out some money, or take out a loan, to buy a shiny new x90 card.
3. Wait a year until the Ti / Super version is out with better efficiency and more VRAM, and your halo card isn't worth crap anymore.
4. Start again.
Posted on Reply
#141
Guwapo77
ModEl4The die size differences (12-12.5% for AD102/Navi31 and 9-8% for AD103/Navi32) are based on the figures that leakers claimed for AMD.
The performance/W is just my estimation (the 4090 will be at most 10% less efficient if compared at the same TBP).
AMD fans saying otherwise just aren't doing AMD a favour, because anything more will lead to disappointment.
Even what I'm saying is probably too much, because if you take a highly OC'd Navi31 flagship partner card like the PowerColor Red Devil, ASUS Strix, or ASRock Formula, with a TBP close to 450 W, what I just said means the Navi31 flagship would be at 100% performance and the 4090 at 90%, which probably isn't going to happen...
Your time to shine is just around the corner. We'll see.
Posted on Reply
#142
pavle
We'll see how much performance is raised; more and more I suspect that the large L2 cache might be a tile-architecture element rather than an Infinity Cache copy, so there might be some impressive jumps.
If so, then Gigapixel's stuff is finally getting full use (anyone remember that name?)...
Posted on Reply
#143
AusWolf
To those of you who don't understand why having two different cards called the same thing (4080) is bad:


TLDR: It's 1. intentionally misleading customers, and 2. the way Nvidia plans to get away with selling a card for 900 bucks that realistically should launch at around 600.
Posted on Reply
#144
R-T-B
The Quim Reaper..are the gaming tech review sites going to persist in playing along with Nvidia's sham naming of this card or will they have the integrity to call it out for what it actually is?
I mean, it's what they name it, man. It'd be a sham to name it anything else. We leave the opinions to the end user.
TheoneandonlyMrKNo they don't
Why would everyone naming it whatever they feel like be better and create less confusion? Do tell.
Posted on Reply
#145
TheoneandonlyMrK
R-T-BI mean, it's what they name it, man. It'd be a sham to name it anything else. We leave the opinions to the end user.


Why would everyone naming it whatever they feel like be better and create less confusion? Do tell.
In all seriousness, I wouldn't expect to call it a 4070, and at no point did I suggest otherwise.

It should be pointed out as THE slightly gimped 4080, or the shit one.
Posted on Reply
#146
AusWolf
TheoneandonlyMrKIn all seriousness, I wouldn't expect to call it a 4070, and at no point did I suggest otherwise.

It should be pointed out as THE slightly gimped 4080, or the shit one.
But it's not. It's called a 4080, same as the 16 GB version. But it's not the same.

The average Joe will walk into a store and think "hey, this 4080 is cheaper than that one. Cool, I don't need 16 GB VRAM anyway" only to find out later that his card is a lot slower than what he expected it to be. Or maybe he doesn't even realise it straight away. I'm not sure which scenario is more sad.

It's not like the RX 480, which you could buy with either 4 or 8 GB of VRAM on the same GPU, even though the small price difference suggests so.

I agree with JayzTwoCents: it should be legally mandated to include specs on the box.
Posted on Reply
#147
TheoneandonlyMrK
AusWolfBut it's not. It's called a 4080, same as the 16 GB version. But it's not the same.

The average Joe will walk into a store and think "hey, this 4080 is cheaper than that one. Cool, I don't need 16 GB VRAM anyway" only to find out later that his card is a lot slower than what he expected it to be. Or maybe he doesn't even realise it straight away. I'm not sure which scenario is more sad.

It's not like the RX 480, which you could buy with either 4 or 8 GB of VRAM on the same GPU, even though the small price difference suggests so.

I agree with JayzTwoCents: it should be legally mandated to include specs on the box.
I meant in the press; we can't rename it, as others have said, but we can make it known that we don't like it and recognise the BS it represents.

The 12 GB has no Founders Edition, so EVGA were on the right track IMHO.

MSRP means nothing, so clearly the 12 GB cards are going to end up at or above the cost of Nvidia's 16 GB card.

I wouldn't touch Nvidia with yours this time out, but many couldn't care less about ethics; a shame, as this shit's not new.

I agree with your statements though so no argument here.
Posted on Reply
#148
AusWolf
TheoneandonlyMrKI meant in the press; we can't rename it, as others have said, but we can make it known that we don't like it and recognise the BS it represents.

The 12 GB has no Founders Edition, so EVGA were on the right track IMHO.

MSRP means nothing, so clearly the 12 GB cards are going to end up at or above the cost of Nvidia's 16 GB card.

I wouldn't touch Nvidia with yours this time out, but many couldn't care less about ethics; a shame, as this shit's not new.

I agree with your statements though so no argument here.
I see what you mean. If I were press, I'd start my review by stating that the 4080 12 GB is only a 4080 in name, and that I don't recommend anyone buy one at MSRP. It's essentially a scam.

I agree with you too - I'll give Nvidia a hard pass on the 40-series as well. In fact, I'm tempted to build an all-AMD machine again.

Edit: The really sad thing is what JayzTwoCents said in his video. The card may be awesome, but the whole Nvidia experience gets soured by their shady practices around naming and pricing.
Posted on Reply
#149
Luke357
Skipping this gen. Both CPU and GPU from this year are minuscule upgrades from last gen. Waiting on 15th gen and RTX 5000 series.
Posted on Reply
#150
Athlonite
AusWolfThe really sad thing is what JayzTwoCents said in his video. The card may be awesome, but the whole Nvidia experience gets soured by their shady practices around naming and pricing.
I'll agree with that. Jay is right that full specs should be on the box so people can make an informed decision before purchase... Do I want the full-fat 4080, or should I take the gimped 4080 model instead?
Posted on Reply