Monday, April 29th 2024

NVIDIA Builds Exotic RTX 4070 From Larger AD103 by Disabling Nearly Half its Shaders

A few batches of GeForce RTX 4070 graphics cards are based on the 5 nm "AD103" silicon, a significantly larger chip than the "AD104" that powers the original RTX 4070. A reader has reached out to us with a curiously named MSI RTX 4070 Ventus 3X E 12 GB OC graphics card, saying that TechPowerUp GPU-Z wasn't able to detect it correctly. When we took a closer look at their GPU-Z submission data, the device ID revealed that the card is based on the larger "AD103" silicon. Interestingly, current NVIDIA drivers, such as the 552.22 WHQL used here, are able to seamlessly present the card to the user as an RTX 4070. We dug through older versions of GeForce drivers and found that the oldest driver to support this card is 551.86, which NVIDIA released in early March 2024.
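Utilities like GPU-Z identify a card by looking up its PCI device ID. A minimal sketch of such a lookup follows; note that the device IDs below are placeholders for illustration, not NVIDIA's actual values:

```python
# Hypothetical device-ID table, the way a utility like GPU-Z might map a
# PCI device ID to a chip and a product name. The hex IDs are placeholders.
KNOWN_DEVICE_IDS = {
    0x2786: ("AD104", "GeForce RTX 4070"),  # placeholder ID for the original card
    0x2709: ("AD103", "GeForce RTX 4070"),  # placeholder ID for the new AD103 variant
}

def identify(device_id: int) -> str:
    """Return 'product (chip)' for a known device ID, or 'unknown (unknown)'."""
    chip, name = KNOWN_DEVICE_IDS.get(device_id, ("unknown", "unknown"))
    return f"{name} ({chip})"
```

An ID missing from the table is exactly the situation the reader hit: the driver knows the card, but an older GPU-Z build does not.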

NVIDIA created the original GeForce RTX 4070 by enabling 46 out of 60 streaming multiprocessors (SM), or a little over 76% of the available shaders. To create an RTX 4070 out of an "AD103," NVIDIA has to enable 46 out of 80 SM, or just 57% of the available shaders, and just 36 MB of the 64 MB of available on-die L2 cache. The company also has to narrow the memory bus down to 192-bit from the available 256-bit, to drive the 12 GB of memory. The PCB footprint, pin-map, and package size of the "AD103" and "AD104" are similar, so board partners are able to seamlessly integrate the chip with their existing AD104-based RTX 4070 board designs. End-users would probably not even notice the change until they fire up diagnostic utilities and find themselves surprised.
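The enabled-unit arithmetic above can be checked with a few lines; the numbers are taken directly from the specs in this paragraph:

```python
# Enabled-unit fractions for the two RTX 4070 configurations.
AD104_TOTAL_SM = 60   # full AD104
AD103_TOTAL_SM = 80   # full AD103
RTX_4070_SM = 46      # SMs enabled on either variant

ad104_fraction = RTX_4070_SM / AD104_TOTAL_SM  # ~0.767, "a little over 76%"
ad103_fraction = RTX_4070_SM / AD103_TOTAL_SM  # 0.575, "just 57%" (rounded down)

# The L2 cache and memory bus are cut similarly on the AD103 variant:
ad103_l2_fraction = 36 / 64     # 36 MB enabled of 64 MB on-die L2
ad103_bus_fraction = 192 / 256  # 192-bit enabled of the 256-bit bus
```

So an AD103-based RTX 4070 leaves more than 42% of the chip's shaders dark, which is where the headline's "nearly half" comes from.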
Why NVIDIA would make an RTX 4070 using the significantly larger "AD103" silicon is anyone's guess. The company probably has a stash of chips that are good enough to match the specs of the RTX 4070, so it makes sense to harvest RTX 4070s out of them; the card sells for at least $500 in the market. This also opens up the possibility of RTX 4070 SUPER cards based on this chip: all NVIDIA has to do is dial up the SM count to 56 and increase the available L2 cache to 48 MB. How the switch to AD103 affects power and thermals is an interesting thing to look out for.

Our next update of TechPowerUp GPU-Z will be able to correctly detect RTX 4070 cards based on AD103 chips.

57 Comments on NVIDIA Builds Exotic RTX 4070 From Larger AD103 by Disabling Nearly Half its Shaders

#51
Assimilator
mubarak ahmed said: "I tried searching the box for any manufacturing date but I didn't find anything, and the BIOS is not available in the database"
Please upload that BIOS to help W1zz in adding support for this card to GPU-Z.
#52
Chrispy_
stimpy88 said: "And the performance of the non-RT hardware in the newer cards is not having an impact... :kookoo: Clockspeeds, shaders, VRAM bandwidth etc? Try looking at the performance hit from enabling RT."
You were complaining solely about RT performance stagnation, so I showed that RT performance jumped by an enormous 83% and 76% per generation.
RT performance is improving faster than non-RT performance, since general per-generation gains, including all the raster/shader/FLOPS stuff, are more like 25-50% depending on which models you pick to compare.

So even relative to the non-RT performance gains per generation, the RT hardware is improving at about double that rate. How is that a "glacial pace" of RT improvement?
Feel free to explain it in a way that makes sense, but as far as I can see it's pretty black and white; RT performance is improving faster than anything else in the GPU space right now. It's the opposite of glacial.
#53
mubarak ahmed
Assimilator said: "Please upload that BIOS to help W1zz in adding support for this card to GPU-Z."
I've already uploaded it to the database, but just to make sure nothing I did was wrong, I will upload it here, because I saw the PCIe is set to Gen 3 and the card itself is Gen 3 also.
#54
Assimilator
mubarak ahmed said: "I've already uploaded it to the database, but just to make sure nothing I did was wrong, I will upload it here, because I saw the PCIe is set to Gen 3 and the card itself is Gen 3 also."
That's due to your motherboard/CPU, one or both of which doesn't support PCIe 4.0.
#55
mubarak ahmed
Assimilator said: "That's due to your motherboard/CPU, one or both of which doesn't support PCIe 4.0."
I know, but it should say that the card supports PCIe 4.0 but is running at 3.0, right?
#56
Assimilator
mubarak ahmed said: "I know, but it should say that the card supports PCIe 4.0 but is running at 3.0, right?"
Yes, you are right! Maybe the fact that GPU-Z doesn't support this model is affecting its ability to detect the bus speed correctly.
#57
Dr. Dro
mubarak ahmed said: "I know, but it should say that the card supports PCIe 4.0 but is running at 3.0, right?"
That depends. What CPU/motherboard do you have?