
NVIDIA AD103 and AD104 Chips Powering RTX 4080 Series Detailed

Joined
Oct 27, 2020
Messages
799 (0.53/day)
The 6 extra chips are just infinity cache, apparently.

I wonder how well the GPU would work without it. Unlike RDNA2, the RDNA3 flagship will have huge bandwidth to play with.
No, they aren't just Infinity Cache.
At 7nm, 128 MB of Infinity Cache was around 78 mm², so 96 MB would be around 58.5 mm². At 6nm with the same transistor libraries it would be around 51-54 mm², or in that ballpark.
Even if they targeted much higher throughput using faster libraries, I don't see it more than doubling from that, so 108 mm² at most.
According to rumors, the die area of the chiplets in Navi31 will be at least 225 mm², so what you're saying doesn't add up imo.
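As a sanity check, the estimate above can be reproduced with a quick back-of-the-envelope script. The 78 mm² figure is the rumored RDNA2 number from this thread, and the ~10% 6nm density gain for SRAM-heavy blocks is an assumption (logic shrinks more, SRAM less), not official TSMC data:

```python
# Back-of-the-envelope Infinity Cache area scaling.
# Assumptions: 128 MB ~ 78 mm^2 on 7nm (rumored RDNA2 figure), SRAM area
# scales linearly with capacity, and 6nm buys roughly 10% density over 7nm.

AREA_128MB_7NM = 78.0  # mm^2

def cache_area_mm2(capacity_mb, density_gain=0.10):
    """Estimated cache area at 6nm, scaled linearly from the 7nm figure."""
    area_7nm = AREA_128MB_7NM * capacity_mb / 128.0
    return area_7nm * (1.0 - density_gain)

area_96mb = cache_area_mm2(96)
print(f"96 MB at 6nm: ~{area_96mb:.1f} mm^2")  # lands in the 51-54 mm^2 ballpark
print(f"doubled for faster libraries: ~{2 * area_96mb:.1f} mm^2")  # near the 108 mm^2 ceiling
```

With these assumptions the 96 MB estimate comes out around 52.7 mm², and doubling it stays just under the 108 mm² upper bound quoted above.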
 
Joined
Dec 12, 2016
Messages
1,955 (0.67/day)
The 6 extra chips are just infinity cache, apparently.

I wonder how well the GPU would work without it. Unlike RDNA2, the RDNA3 flagship will have huge bandwidth to play with.
The six extra chips are cache AND memory controllers. These must be counted as die area, as they would typically be part of a monolithic chip, and the GPU wouldn't work without them.
 
Joined
Jun 18, 2021
Messages
2,570 (2.00/day)
So they pumped out 3 different chips right from the start? That's new, as is the 4080 not using the top 102 die.

Also, wasn't Jensen saying Moore's Law is dead? Seems pretty alive to me when the 3080 used a 600+ mm² chip and the 4080 is using a 300+ mm² chip :D
 
Joined
Mar 1, 2021
Messages
115 (0.08/day)
Processor R7 7800X3D
Motherboard MSI B650 Tomahawk
Memory 2x32GB 6000CL30
Video Card(s) RTX 3070 FE
Case Lian Li O11 Air Mini
Power Supply Corsair RM1000x
A 295 mm², 192-bit bus card for 1100€? Good luck with that one, NV.

For reference, the previous biggest (consumer) 104-die cards were:

GA104 - RTX 3070ti 392 mm² 256bit ~600€
TU104 - RTX 2080S 545 mm² 256bit ~699€
 
Joined
Oct 27, 2020
Messages
799 (0.53/day)
That is a lot of assumptions, it will be interesting to see if you are correct.
The die size differences (12-12.5% for AD102/Navi31 and 8-9% for AD103/Navi32) are based on the figures that leakers claimed for AMD.
The performance/W is just my estimate (the 4090 will be at most 10% less efficient if compared at the same TBP).
AMD fans saying otherwise just aren't doing AMD a favour, because anything more will only lead to disappointment.
Even what I'm saying is probably too much: if you take a highly OC'd Navi31 flagship partner card like the PowerColor Red Devil, Asus Strix, or ASRock Formula, with a TBP close to 450W, what I just said puts the Navi31 flagship at 100% and the 4090 at 90% in performance, which probably isn't going to happen...
 
Joined
Oct 26, 2019
Messages
169 (0.09/day)
AD103 was supposed to be the RTX 4070 and AD104 was supposed to be the RTX 4060, but as there is no competition, they renamed them upwards and bumped the prices up threefold.
 
Joined
Nov 6, 2016
Messages
1,777 (0.60/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
AD103 was supposed to be the RTX 4070 and AD104 was supposed to be the RTX 4060, but as there is no competition, they renamed them upwards and bumped the prices up threefold.
No competition? What do you mean by that? RDNA2 matched or beat the 30 series in raster, FSR 2.0 has great reviews, and RDNA3 will most certainly compete; and because AMD's chiplet approach should be cheaper to manufacture, RDNA3 should offer better performance per dollar... but despite all of that, everyone will buy Nvidia, rewarding the behavior and perpetuating Nvidia's constant price increases.

Let's be honest everyone, AMD could release a GPU that matched Nvidia in every way including raytracing, and have FSR equal to DLSS in every way and charge less than Nvidia for it, and everyone would STILL buy Nvidia (which only proves consumer choices are quite irrational and are NOT decided by simply comparing specs, as the existence of fanboys should testify to...)...and as long as that's true, the GPU market will ALWAYS be hostile to consumers. The ONLY way things are going to improve for consumers is if AMD starts capturing marketshare and Nvidia is punished by consumers... but based on historical precedent, I have no hope for that...

And I don't believe Intel's presence would have improved the situation much, not as much as a wholly new company in the GPU space would have, because Intel would have leveraged success in the GPU market (which would probably have been carved away from AMD's limited marketshare instead of Nvidia's, leaving Nvidia's marketshare at 80% and AMD's 20% divided between AMD and Intel) to further marginalize AMD in the x86 space (for example, by using its influence with OEMs to pair an Intel CPU with an Intel GPU and further diminish AMD's position among OEMs, which is how Intel devastated AMD in the 2000s, BTW), and it would have been trading a marginally better GPU market for a much worse CPU market, imo. Although it'd never happen, what would really improve the market would be if Nvidia got broken up like AT&T was in the 80s...
 
Joined
Dec 16, 2017
Messages
2,939 (1.15/day)
System Name System V
Processor AMD Ryzen 5 3600
Motherboard Asus Prime X570-P
Cooling Cooler Master Hyper 212 // a bunch of 120 mm Xigmatek 1500 RPM fans (2 ins, 3 outs)
Memory 2x8GB Ballistix Sport LT 3200 MHz (BLS8G4D32AESCK.M8FE) (CL16-18-18-36)
Video Card(s) Gigabyte AORUS Radeon RX 580 8 GB
Storage SHFS37A240G / DT01ACA200 / ST10000VN0008 / ST8000VN004 / SA400S37960G / SNV21000G / NM620 2TB
Display(s) LG 22MP55 IPS Display
Case NZXT Source 210
Audio Device(s) Logitech G430 Headset
Power Supply Corsair CX650M
Software Whatever build of Windows 11 is being served in Canary channel at the time.
Benchmark Scores Corona 1.3: 3120620 r/s Cinebench R20: 3355 FireStrike: 12490 TimeSpy: 4624
They're marketing the card as the RTX 4080 12GB to take more money from buyers, when in reality it is the RTX 4070. It's time to tell people the truth and not take any more bullshit from Nvidia.
... the 4080 16 GB variant is barely a 4070, tbh, much less the 12 GB variant.
 
Joined
Jun 10, 2021
Messages
126 (0.10/day)
AD103 was supposed to be the RTX 4070 and AD104 was supposed to be the RTX 4060, but as there is no competition, they renamed them upwards and bumped the prices up threefold.

How so?

GK104 GTX 680 @ 294mm² full die with 1536 CUDA cores = $499; adjusted for inflation, $645 USD.

GP104 GTX 1080 @ 314mm² full die with 2560 CUDA cores = $599; adjusted for inflation, $740 USD.
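For what it's worth, the inflation math behind those figures can be reproduced like this. The cumulative CPI factors below are approximations I picked to match the quoted numbers, not official BLS data:

```python
# Inflation-adjusting the GTX 680 and GTX 1080 launch MSRPs to 2022 dollars.
# The CPI factors are assumed approximations, not official figures.
launches = {
    "GTX 680 (GK104, 2012)":  (499, 1.293),  # ~29% cumulative inflation 2012-2022
    "GTX 1080 (GP104, 2016)": (599, 1.235),  # ~23.5% cumulative inflation 2016-2022
}

for card, (msrp, factor) in launches.items():
    print(f"{card}: ${msrp} -> ~${msrp * factor:.0f} in 2022 dollars")
```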


I'll agree the 4080 12GB is overpriced, but it's not the first time Nvidia has done this, relatively speaking. :) Not to defend Nvidia... but they've been doing this crap for years.

If they had priced it at $700, it would have been more in line with some of their previous G104 full-die x80-class GPUs. Margins are obviously higher now... hardware EE is more expensive too.
 
Joined
Dec 16, 2017
Messages
2,939 (1.15/day)
System Name System V
Processor AMD Ryzen 5 3600
Motherboard Asus Prime X570-P
Cooling Cooler Master Hyper 212 // a bunch of 120 mm Xigmatek 1500 RPM fans (2 ins, 3 outs)
Memory 2x8GB Ballistix Sport LT 3200 MHz (BLS8G4D32AESCK.M8FE) (CL16-18-18-36)
Video Card(s) Gigabyte AORUS Radeon RX 580 8 GB
Storage SHFS37A240G / DT01ACA200 / ST10000VN0008 / ST8000VN004 / SA400S37960G / SNV21000G / NM620 2TB
Display(s) LG 22MP55 IPS Display
Case NZXT Source 210
Audio Device(s) Logitech G430 Headset
Power Supply Corsair CX650M
Software Whatever build of Windows 11 is being served in Canary channel at the time.
Benchmark Scores Corona 1.3: 3120620 r/s Cinebench R20: 3355 FireStrike: 12490 TimeSpy: 4624
How so?

GK104 GTX 680 @ 294mm² full die with 1536 CUDA cores = $499; adjusted for inflation, $645 USD.

GP104 GTX 1080 @ 314mm² full die with 2560 CUDA cores = $599; adjusted for inflation, $740 USD.


I'll agree the 4080 12GB is overpriced, but it's not the first time Nvidia has done this relatively speaking. :)
Look at the sheer difference in SM counts between the 4080 16 GB and the 4090. The 4090 has far more SMs than the 4080 16 GB (128 vs 76). So the 4080 16 GB variant is around 60% of the 4090, and the 4080 12 GB variant, with 60 SMs, is around 47% of the 4090.

Meanwhile, the 3090 vs the 3080: 82 vs 68, which means the 3080 has around 83% of the 3090's SM count active. And the 3070 Ti has 58% of the 3090's SM count.

So, yes. You know what, the 4080 16 GB variant is actually a 4070 Ti. And the 4080 12 GB variant actually reminds me of the 3060 Ti (since both have around 47% of their respective lineup's 90-class card's core count).

So Nvidia basically named them both "4080" just so they wouldn't be seen asking 1000+ euros for a 60- or 70-class card.
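Those ratios are easy to verify from the published SM counts. The counts below are the cards' known specs; percentages are rounded, so 76/128 comes out at 59% rather than the ~60% quoted:

```python
# SM-count ratios relative to each generation's flagship.
SM_COUNTS = {
    "RTX 4090": 128, "RTX 4080 16GB": 76, "RTX 4080 12GB": 60,
    "RTX 3090": 82, "RTX 3080": 68, "RTX 3070 Ti": 48, "RTX 3060 Ti": 38,
}

def pct_of(card, flagship):
    """Card's SM count as a percentage of the flagship's."""
    return 100.0 * SM_COUNTS[card] / SM_COUNTS[flagship]

for card, flagship in [("RTX 4080 16GB", "RTX 4090"), ("RTX 4080 12GB", "RTX 4090"),
                       ("RTX 3080", "RTX 3090"), ("RTX 3060 Ti", "RTX 3090")]:
    print(f"{card} vs {flagship}: {pct_of(card, flagship):.0f}%")
```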
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
I have some performance figures:

Cyberpunk 2077 @3840x2160 RT on:

RTX 4090: 46.5 FPS (+97% higher performance)
RTX 3090 Ti: 23.6 FPS
ASRock Radeon RX 6950 XT OC Formula Review - Ray Tracing | TechPowerUp


GeForce RTX 4090 Performance Figures and New Features Detailed by NVIDIA (wccftech.com)
 
Joined
Dec 16, 2017
Messages
2,939 (1.15/day)
System Name System V
Processor AMD Ryzen 5 3600
Motherboard Asus Prime X570-P
Cooling Cooler Master Hyper 212 // a bunch of 120 mm Xigmatek 1500 RPM fans (2 ins, 3 outs)
Memory 2x8GB Ballistix Sport LT 3200 MHz (BLS8G4D32AESCK.M8FE) (CL16-18-18-36)
Video Card(s) Gigabyte AORUS Radeon RX 580 8 GB
Storage SHFS37A240G / DT01ACA200 / ST10000VN0008 / ST8000VN004 / SA400S37960G / SNV21000G / NM620 2TB
Display(s) LG 22MP55 IPS Display
Case NZXT Source 210
Audio Device(s) Logitech G430 Headset
Power Supply Corsair CX650M
Software Whatever build of Windows 11 is being served in Canary channel at the time.
Benchmark Scores Corona 1.3: 3120620 r/s Cinebench R20: 3355 FireStrike: 12490 TimeSpy: 4624
Worthless comparison, because the test systems are completely different. W1zzard was using a Ryzen 5800X with 16 GB of RAM on Windows 10.

Those guys were using an unknown version of Windows 11, a Core i9-12900K, and 32 GB of RAM of unknown speed and timings.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
Worthless comparison, because the test systems are completely different. W1zzard was using a Ryzen 5800X with 16 GB of RAM on Windows 10.

Those guys were using an unknown version of Windows 11, a Core i9-12900K, and 32 GB of RAM of unknown speed and timings.

At 4K it doesn't matter. :D
 
Joined
Feb 3, 2012
Messages
202 (0.04/day)
Location
Tottenham ON
System Name Current
Processor i7 12700k
Motherboard Asus Prime Z690-A
Cooling Noctua NHD15s
Memory 32GB G.Skill
Video Card(s) GTX 1070Ti
Storage WD SN-850 2TB
Display(s) LG Ultragear 27GL850-B
Case Fractal Meshify 2 Compact
Audio Device(s) Onboard
Power Supply Seasonic 1000W Titanium
I’m assuming once 3000 series stock sells out nVidia will quietly release a 4070 that is identical to the 4080 12gb and replaces it.
 
Joined
Mar 14, 2008
Messages
511 (0.08/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus Tuf RTX 4090
Storage Adata SX8200 PRO 1 and 2 TB, Samsung 960EVO, Crucial MX300 750GB Limited edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
I am looking forward to the 4080 10GB and the 4080 8GB. I think one of those would be a good upgrade for my wife... or maybe a 4080 6GB...
 
Joined
Jun 10, 2021
Messages
126 (0.10/day)
Look at the sheer difference in SM counts between the 4080 16 GB and the 4090. The 4090 has far more SMs than the 4080 16 GB (128 vs 76). So the 4080 16 GB variant is around 60% of the 4090, and the 4080 12 GB variant, with 60 SMs, is around 47% of the 4090.

Meanwhile, the 3090 vs the 3080: 82 vs 68, which means the 3080 has around 83% of the 3090's SM count active. And the 3070 Ti has 58% of the 3090's SM count.

So, yes. You know what, the 4080 16 GB variant is actually a 4070 Ti. And the 4080 12 GB variant actually reminds me of the 3060 Ti (since both have around 47% of their respective lineup's 90-class card's core count).

So Nvidia basically named them both "4080" just so they wouldn't be seen asking 1000+ euros for a 60- or 70-class card.

So you do realize that SM count doesn't translate linearly to performance from generation to generation, right? Nvidia has also moved the "class" of its GPUs around for multiple generations.

Like I said, the 4080 12GB isn't too far off from what cards like the GTX 680 or GTX 1080 were if you factor in inflation. The only difference these days is that Nvidia moved the "top end" to a higher goalpost. That's it.

Is the 4080 12GB overpriced? Yes, but it isn't too far off from certain previous x80 GPUs with full G104 dies. Like I said, EE design/cooling is also WAY more expensive these days. We're not talking about 150-200W cards anymore.

Am I the only one who realizes Nvidia has been doing this shit for years?
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
GTX680 or GTX1080

Speaking of which... if history is anything to go by, we won't see competition from Radeon. Those were exactly the worst times for Radeon, with the Vega 64 and HD 7970.

What does AMD have prepared to counter the RTX 4090 launch? :confused:
 
Joined
Jun 10, 2021
Messages
126 (0.10/day)
Speaking of which... if history is anything to go by, we won't see competition from Radeon. Those were exactly the worst times for Radeon, with the Vega 64 and HD 7970.

What does AMD have prepared to counter the RTX 4090 launch? :confused:

Who knows. I hope the leaks aren't true. AMD seems to be going the Nvidia route by downgrading specs each generation to ensure people "upgrade" sooner.

And I don't trust AMD to be a savior either. The MSRP pricing on the later RX 6000 cards released during the mining crisis was a joke...

The truth is, both these companies only care about your dollar. Let them fight for it.
 
Joined
Oct 4, 2017
Messages
706 (0.27/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
..are the gaming tech review sites going to persist in playing along with Nvidia's sham naming of this card or will they have the integrity to call it out for what it actually is?

First of all, TPU is not a ''gaming'' tech reviewer; it is an information website centered on technology.

Secondly, reviewers are bound to call a given product what the manufacturer calls it in its presentations; it's not like they can go out and call it random names... They can express their opinion on the naming/segmentation, but they can't make up names, so no need to make a fuss about it.

As long as they provide accurate information about the performance and price/performance ratio, they've done their job. It's up to the customer to make the final decision based on that information.
 
Joined
Jun 18, 2017
Messages
118 (0.04/day)
And I don't believe Intel's presence would have improved the situation much, not as much as a wholly new company in the GPU space would have, because Intel would have leveraged success in the GPU market (which would probably have been carved away from AMD's limited marketshare instead of Nvidia's, leaving Nvidia's marketshare at 80% and AMD's 20% divided between AMD and Intel) to further marginalize AMD in the x86 space (for example, by using its influence with OEMs to pair an Intel CPU with an Intel GPU and further diminish AMD's position among OEMs, which is how Intel devastated AMD in the 2000s, BTW), and it would have been trading a marginally better GPU market for a much worse CPU market, imo. Although it'd never happen, what would really improve the market would be if Nvidia got broken up like AT&T was in the 80s...
To be fair, I don't think I've ever seen an Intel CPU paired with an AMD dGPU in a laptop.
 
Joined
Oct 4, 2017
Messages
706 (0.27/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
Insane world when a 4090 is the best value GPU.

There is no information out there to allow for an educated opinion about value; for that, you need reviews first...
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
So, they are basically 4070 and 4060.

Yes, the confusion comes from the seemingly large leap but bear in mind that the old generation was built on Samsung 8N, which is a 12nm process node in reality.
 
Joined
May 26, 2021
Messages
138 (0.11/day)
Yes, the confusion comes from the seemingly large leap but bear in mind that the old generation was built on Samsung 8N, which is a 12nm process node in reality.
I don't think we are confused about anything here. There were node leaps in the past as well (case in point: Maxwell to Turing), but I've never seen the core ratio of the Titan (biggest chip) to the 1080 be this skewed.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Yes, the confusion comes from the seemingly large leap but bear in mind that the old generation was built on Samsung 8N, which is a 12nm process node in reality.
It is closest to TSMC's 10 nm; TSMC's 12 nm is 16 nm with different standard cells.
 