Friday, December 9th 2022

NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

It turns out that NVIDIA has not one, but two new GeForce RTX 40-series "Ada" SKUs on the anvil this January. One of these is the RTX 4070 Ti, which we know to be a rebranding of the RTX 4080 12 GB after backlash forced NVIDIA to "unlaunch" it. The other, as it turns out, is the RTX 4070, with an interesting set of specifications. Based on the same 4 nm AD104 silicon as the RTX 4070 Ti, the new RTX 4070 is significantly cut down. NVIDIA enabled 46 of the 60 streaming multiprocessors (SM) physically present on the silicon, which yields 5,888 CUDA cores, the same count as the previous-gen RTX 3070, and well short of the 7,680 that the maxed-out RTX 4070 Ti enjoys.

The GeForce RTX 4070, besides 5,888 CUDA cores, gets 46 RT cores, 184 Tensor cores, 184 TMUs, and a reduced ROP count of 64, compared to 80 of the RTX 4070 Ti. The memory configuration remains the same as the RTX 4070 Ti, with 12 GB of 21 Gbps GDDR6X memory across the chip's 192-bit memory interface, working out to 504 GB/s of memory bandwidth. An interesting aspect of this SKU is its board power, rated at 250 W, compared to the 285 W of the RTX 4070 Ti, and the 220 W of its 8 nm predecessor, the RTX 3070.
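For reference, the 504 GB/s figure quoted above follows directly from the memory data rate and bus width. A minimal sketch of the arithmetic (the helper function is purely illustrative):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte
def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 4070 and RTX 4070 Ti: 21 Gbps GDDR6X on a 192-bit bus
print(mem_bandwidth_gb_s(21, 192))  # 504.0
```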
Source: harukaze5719 (Twitter)

93 Comments on NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

#26
bug
pavleYeah, that's correct, bug. Flower power generations of graphics cards. :)
You can get full trilinear either by turning trilinear optimizations off in the NVIDIA Control Panel or by choosing the "High Quality" rendering option. With NV Quadro cards you get true trilinear under OpenGL even with rendering set to "Quality".
I did that for a while. But since I wasn't seeing any difference (I'm not a pixel peeper), I kinda forgot about it.

For those not in the know, this isn't about artificial patterns, but about a trilinear optimization that translated to texture shimmering and, iirc, visible transitions between various LOD levels.
Posted on Reply
#27
pavle
I haven't yet found any problems with using brilinear, but I anticipate a lack of pixels from the RTX 4070 only having 64 ROPs instead of the full 80. :)
Posted on Reply
#28
bug
pavleI haven't yet found any problems with using brilinear, but I anticipate a lack of pixels from the RTX 4070 only having 64 ROPs instead of the full 80. :)
Sometimes ROPs get scaled back simply because they're underutilized. We'll need benchmarks to see actual impact.

Fwiw, I would even be ok with the same or up to 10% less performance than current gen if that translates to a 25% or more reduction in price.
Posted on Reply
#29
Blaeza
bugSometimes ROPs get scaled back simply because they're underutilized. We'll need benchmarks to see actual impact.

Fwiw, I would even be ok with the same or up to 10% less performance than current gen if that translates to a 25% or more reduction in price.
Nvidia and value for mon... I can't say that sentence as it doesn't exist as of now. AMD need to do something special.
Posted on Reply
#30
N/A
In case the 4070 is a 3080 12 GB, $100 cheaper and 3 months later than the Ti, it's good enough for me. Can't wait. And knowing Nvidia, the 2070 was as expensive as it was faster than the 1070; after a mining crash they are on the offensive, just because. For no good reason.
Posted on Reply
#31
john_
Remember

"The more you buy(pay), the more you save(get)".
Posted on Reply
#32
bug
N/AIn case the 4070 is a 3080 12 GB, $100 cheaper and 3 months later than the Ti, it's good enough for me. Can't wait. And knowing Nvidia, the 2070 was as expensive as it was faster than the 1070; after a mining crash they are on the offensive, just because. For no good reason.
I think the rumor is both 4070 and 4070Ti will launch together. However, launch is one thing, availability is another.
Posted on Reply
#34
Pumper
bugThis is potentially good news. 1/4 smaller than the previous gen, about the same specs and a narrower memory bus. All the premises for something that's a little faster than Ampere at lower prices. I am now curious about how this plays out.
What lower prices?
Posted on Reply
#35
64K
DenverWow... I just saw that Nvidia planned to launch the 4080 12gb with a mid-end chip, what a joke. lol
If you take a look at the specs the 4080 16GB is also a midrange Ada GPU. Nvidia has been playing this trickery since the GTX 680 release and even on tech sites such as this one it goes mostly unnoticed by the members. The 4090 is the only high end Ada so far.
Posted on Reply
#36
Blaeza
64KIf you take a look at the specs the 4080 16GB is also a midrange Ada GPU. Nvidia has been playing this trickery since the GTX 680 release and even on tech sites such as this one it goes mostly unnoticed by the members. The 4090 is the only high end Ada so far.
Until the 4090 Ti/Super/Turbo.
Posted on Reply
#37
64K
BlaezaUntil the 4090 Ti/Super/Turbo.
The 4090 will remain a high end Ada GPU forever. Just not the Flagship high end when/if the 4090 Ti comes out.
Posted on Reply
#38
bug
PumperWhat lower prices?
I said "premises", didn't I?

Nvidia has also reported declining sales, so we know there is some pressure on them. We'll see how this plays out.
Posted on Reply
#39
Recus
ARFThe only hope is that AMD could wish to save us :(

RTX 3070 - 5888 shaders, 256-bit, 392 sq. mm, $499
RTX 4070 - 5888 shaders, 192-bit, 295 sq. mm, $699-799




NVIDIA GeForce RTX 4070 rumored to feature 5888 CUDA cores, 12GB memory and 250W TDP - VideoCardz.com

nvidia is digging new deeper holes.

The shitshow by nvidia continues... now it's up to us to vote with our wallets...
AMD is selling the 6700 XT's successor for $999. How is that a save?

6700 XT 335 mm², 96 MB Infinity Cache $479
7900 XTX 306 mm², 96 MB Infinity Cache $999

But people still gonna buy it and claim they are voting with their wallets for better future.

A lot of people in Nvidia threads are furious about Nvidia's pricing, but they weren't going to buy anyway; they're actually praising Nvidia for increasing prices so that AMD responds with lower ones (Titan Z $3000 vs RX 295X2 $1500), so they can buy only AMD, as always.

Posted on Reply
#40
ARF
N/AIn case the 4070 is a 3080 12 GB, $100 cheaper
Let's see:

RTX 3080: 8704 shaders, 272 TMUs, 96 ROPs, 320-bit, 628 sq. mm.
RTX 4070: 5888 shaders, 184 TMUs, 64 ROPs, 192-bit, 295 sq. mm.

I don't see how RTX 4070 will get close to RTX 3080 even in the most cherry-picked cases.
RecusAMD is selling the 6700 XT's successor for $999. How is that a save?

6700 XT 335 mm², 96 MB Infinity Cache $479
7900 XTX 306 mm², 96 MB Infinity Cache $999
You need to calculate the total die area, which for Navi 31 is not 306 sq. mm but over 500 sq. mm.
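ARF's "over 500 sq. mm" point checks out arithmetically once the memory cache dies are counted. A rough sketch, assuming the publicly reported figures of roughly 305 mm² for the Navi 31 GCD and about 37.5 mm² per MCD:

```python
# Navi 31 is a chiplet design: one graphics compute die (GCD) plus six
# memory cache dies (MCDs). Die sizes are assumptions from public reports.
gcd_mm2 = 305.0
mcd_mm2 = 37.5
total_mm2 = gcd_mm2 + 6 * mcd_mm2
print(total_mm2)  # 530.0, i.e. "over 500 sq. mm" as stated
```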

Posted on Reply
#41
xorbe
Seems like nVidia has successfully sold the idea that next-gen performance should simply build upon last-gen pricing. Maybe the 5000 series will bring something palpable.
Posted on Reply
#42
PrettyKitten800
Databasedgod
b1k3rdudeIm not overly impressed. Apart from the extra L2 cache, and some increased numbers on the fake 4080 the new GPU's seems to have lower specs. I guess the 3rd party reviews will confirm or refute my viewpoint.
The only spec that is lower on the 4070s compared to the 3070s is the 192-bit memory bus width. But this is way, way, way more than compensated for by the TWELVE TIMES increase in L2 cache. "Extra" is a bit of an understatement.

And look at their theoretical compute performance. The 4070ti has almost 2x the compute power and the 4070 has about 1.5x the 3070s. Those aren’t numbers they can fudge. They don’t necessarily translate one-to-one in game performance, but the 4070s are going to be *significantly* faster than the 3070s. Roughly 1.5x-2x faster.
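The "almost 2x" and "about 1.5x" compute ratios above can be sanity-checked from shader counts and clocks. A sketch, with boost clocks taken as assumptions (Ampere reference specs and leaked Ada figures):

```python
# FP32 throughput: TFLOPS = 2 (FMA ops/clock) * shaders * boost clock (GHz) / 1000.
# Boost clocks below are assumptions, not confirmed spec-sheet values.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000

rtx_3070    = fp32_tflops(5888, 1.73)  # ~20.4
rtx_3070_ti = fp32_tflops(6144, 1.77)  # ~21.8
rtx_4070    = fp32_tflops(5888, 2.48)  # ~29.2
rtx_4070_ti = fp32_tflops(7680, 2.61)  # ~40.1
print(rtx_4070 / rtx_3070)        # ~1.43x
print(rtx_4070_ti / rtx_3070_ti)  # ~1.84x
```

Under these assumed clocks the ratios land near 1.4x and 1.8x, roughly matching the claim.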
Posted on Reply
#43
ARF
PrettyKitten800the 4070s are going to be *significantly* faster than the 3070s. Roughly 1.5x-2x faster.
No, they won't be.



Because it would mean roughly RTX 3090 - RTX 4080 performance.
Posted on Reply
#44
bug
PrettyKitten800The only spec that is lower on the 4070s when compared to the 3070s is the 192bit memory bus width. But this is way, way, way more than compensated for by the TWELVE TIMES increase in L2 cache. “Extra” is a bit of an understatement.
Actually, since 4070 is on GDDR6X, it still has more bandwidth than the 3070 even with the narrower bus. It's the 4070Ti that loses some bandwidth (~16%) compared to the 3070Ti.

Still, at the end of the day internal details can be irrelevant. What matters is $$$/fps.
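bug's bandwidth comparison can be checked with the same spec-sheet arithmetic (data rates and bus widths below are the reference figures for each card):

```python
def bw_gb_s(gbps: float, bits: int) -> float:
    # Peak bandwidth in GB/s from per-pin data rate and bus width
    return gbps * bits / 8

rtx_3070    = bw_gb_s(14, 256)  # 448.0 (GDDR6)
rtx_4070    = bw_gb_s(21, 192)  # 504.0 (GDDR6X): more than the 3070
rtx_3070_ti = bw_gb_s(19, 256)  # 608.0 (GDDR6X)
rtx_4070_ti = bw_gb_s(21, 192)  # 504.0: less than the 3070 Ti
print(1 - rtx_4070_ti / rtx_3070_ti)  # ~0.17
```

That works out to a ~17% deficit for the 4070 Ti, in line with the ~16% cited.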
Posted on Reply
#45
N/A
ARFLet's see:

RTX 3080: 8704 shaders, 272 TMUs, 96 ROPs, 320-bit, 628 sq. mm.
The 4070's boosted clock speed makes it roughly equivalent to 8,960 shaders and 96 ROPs, or about 50% more, but the 192-bit bus will affect things badly. I'm just grateful for the extra 2 GB. At first they hinted at 10 GB, and that would have been really bad.
Posted on Reply
#46
PrettyKitten800
Databasedgod
RecusAMD is selling the 6700 XT's successor for $999. How is that a save?

6700 XT 335 mm², 96 MB Infinity Cache $479
7900 XTX 306 mm², 96 MB Infinity Cache $999

But people still gonna buy it and claim they are voting with their wallets for better future.

A lot of people in Nvidia threads are furious about Nvidia's pricing, but they weren't going to buy anyway; they're actually praising Nvidia for increasing prices so that AMD responds with lower ones (Titan Z $3000 vs RX 295X2 $1500), so they can buy only AMD, as always.

uhh… the 7900XTX is not the successor to the 6700XT. The specs you’re comparing mean nothing because those are two different architectures, each manufactured using a different process node.

6700XT transistor count: 17.2 billion
7900XTX transistor count: 58 billion

6700XT memory: 12gb, 192bit bus, 16Gbps rate
7900XTX memory: 24gb and 384bit bus, 20Gbps rate

Sooooo the 7900XTX has 3.37x more transistors, 2x the amount of RAM, 2.5x more RAM bandwidth, about 4.5x more compute power, and costs about 2x more. Seems like a pretty decent deal to me, even if it *was* the successor to the 6700xt.
bugActually, since 4070 is on GDDR6X, it still has more bandwidth than the 3070 even with the narrower bus. It's the 4070Ti that loses some bandwidth (~16%) compared to the 3070Ti.

Still, at the end of the day internal details can be irrelevant. What matters is $$$/fps.
My point was that the GDDR6/X bus width and data rate doesn’t really matter on the 4070s because the insane increase in L2 cache more than compensates for any loss in memory bus throughput.

They’re using the same strategy AMD started using with the 6000 series. The 6800xt/6900xt could get away with 256-bit wide GDDR6 at 16 Gbps because they had 128 MB of L3 cache to compensate for it. Now we’re seeing Nvidia following suit with a 1200% increase in the size of their L2 cache. Nvidia’s massive L2 cache is analogous to AMD’s Infinity Cache solution: GDDR6/X is too slow to keep up with these new GPUs, and the solution is to implement an insane amount of cache.
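The cache argument can be illustrated with a toy model (ours, not an NVIDIA or AMD figure): if a fraction of memory requests hit the on-die cache, the DRAM bus only sees the misses, so effective bandwidth scales with 1/(1 - hit rate).

```python
# Toy model of cache bandwidth amplification; hit rates are illustrative only.
def effective_bw_gb_s(dram_bw_gb_s: float, cache_hit_rate: float) -> float:
    # Only cache misses reach DRAM, so a hit rate h amplifies
    # effective bandwidth by a factor of 1 / (1 - h).
    return dram_bw_gb_s / (1 - cache_hit_rate)

print(effective_bw_gb_s(504, 0.0))  # 504.0: no cache help
print(effective_bw_gb_s(504, 0.5))  # 1008.0: a 50% hit rate doubles effective bandwidth
```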
Posted on Reply
#47
ARF
N/AThe 4070's boosted clock speed makes it roughly equivalent to 8,960 shaders and 96 ROPs, or about 50% more
How do you know?
I am inclined to believe that clock speeds don't scale performance linearly. It's more like 10% higher clock means 5% higher performance.
N/Abut the 192-bit will affect things badly. I'm just grateful for the extra 2GB. At first they hinted 10GB and this would be really bad.
I agree. I think 4070 is a badly designed chip with severe internal imbalances which would cause performance issues, especially at 2160p, 4320p and beyond.
Posted on Reply
#48
Alien_Zero
nVidia are anti Consumer, Consumers then should be Anti-nVidia......
nVidia can take these RTX 40 series and shove them up their @$$
Posted on Reply
#49
ARF
Alien_ZeronVidia are anti Consumer, Consumers then should be Anti-nVidia......
nVidia can take these RTX 40 series and shove them up their @$$


Posted on Reply