Tuesday, August 13th 2024

NVIDIA Readies GeForce RTX 4070 GDDR6 Variant?

NVIDIA is possibly preparing a more cost-effective variant of its GeForce RTX 4070 graphics card that features GDDR6 memory in place of the GDDR6X it originally launched with. The new SKU would also be better differentiated from the RTX 4070 SUPER. When NVIDIA refreshed its RTX 40-series "Ada" product stack in January, it discontinued the RTX 4070 Ti and RTX 4080, replacing them in the lineup with the RTX 4070 Ti SUPER and RTX 4080 SUPER; at the time, however, it didn't touch the RTX 4070, which continued to sell at a price roughly $50 lower than the RTX 4070 SUPER. This new SKU could be an attempt by NVIDIA to push the RTX 4070's pricing further below the $500 mark.

The RTX 4070 originally launched with 21 Gbps GDDR6X memory. This new variant sees the memory replaced with 20 Gbps conventional GDDR6. The JEDEC-standard GDDR6 chips could be cheaper than GDDR6X, and could very well be the same GDDR6 chip models AMD uses in some of its higher-end Radeon RX 7000 series SKUs. The change, however, comes with a roughly 4.76% drop in memory bandwidth, which NVIDIA could offset by increasing the GPU clocks a touch. The ASIC code for this SKU is AD104-251, compared to the AD104-250 of the original RTX 4070. The core configuration is otherwise unchanged: you get 5,888 CUDA cores across 46 streaming multiprocessors. Galax already has a card based on this SKU ready.
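
To put the memory change in perspective, here is a quick back-of-the-envelope calculation sketched in Python; it assumes the RTX 4070's 192-bit memory bus (unchanged between the two SKUs) and the stated per-pin data rates of 21 Gbps for GDDR6X and 20 Gbps for GDDR6.

# Rough peak-bandwidth comparison for the two memory configurations
# (assumes a 192-bit bus, 21 Gbps GDDR6X vs. 20 Gbps GDDR6).
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth (GB/s) = bus width in bytes x per-pin data rate (Gbps)
    return (bus_width_bits / 8) * data_rate_gbps

gddr6x = peak_bandwidth_gbs(192, 21.0)  # original RTX 4070 -> 504 GB/s
gddr6 = peak_bandwidth_gbs(192, 20.0)   # GDDR6 variant     -> 480 GB/s
drop = (1 - gddr6 / gddr6x) * 100       # ~4.76% lower bandwidth
print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR6: {gddr6:.0f} GB/s, drop: {drop:.2f}%")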
Source: VideoCardz

32 Comments on NVIDIA Readies GeForce RTX 4070 GDDR6 Variant?

#26
basco
Back on topic: they should mention it on the box, so everybody can see the difference at first glance, even if it's just 1% slower.
Posted on Reply
#27
N/A
It may actually be faster, since GDDR6X has complicated timings and fake bandwidth that relies on a bunch of voltage levels to transmit data, and I don't believe it does that as efficiently all the time.
Posted on Reply
#28
Ruru
S.T.A.R.S.
Should be called the 4060 Ultra or 4070 SE. This reminds me of the infamous GT 1030 DDR4.
Posted on Reply
#29
Caring1
Dr. Dro: Same :toast:
You were driving with his wife too? :roll:
Posted on Reply
#30
Dr. Dro
N/A: It may actually be faster, since GDDR6X has complicated timings and fake bandwidth that relies on a bunch of voltage levels to transmit data, and I don't believe it does that as efficiently all the time.
I don't think that a different encoding/signaling scheme is actually "fake bandwidth" - DDR5 does something similar and it does tend to deliver. But a ~5% reduction in memory bandwidth can be compensated for by a ~5% increase in default clock speeds or by the activation of an extra SM partition. With 20 Gbps G6 vs. 21 Gbps G6X, if Nvidia opted for the extra-SM route (bringing it up to 6,144 cores), I think I would prefer the G6 version in this situation.
Ruru: Should be called the 4060 Ultra or 4070 SE. This reminds me of the infamous GT 1030 DDR4.
The G5 and D4 versions of the 1030 were so radically different in performance that you could call the D4 a 1010 (which eventually released in the OEM market) or even a 1005 in comparison, but in this case it's practically a nothingburger. If they compensate for it somehow by increasing the engine clock or activating an extra core partition, it's gonna balance itself out, and the G6 version may even enjoy reduced power consumption. In any case, I don't think it'll be a showstopper.
Caring1: You were driving with his wife too? :roll:
Whoa, I'm not that kind of guy :laugh:
Posted on Reply
#31
RJARRRPCGP
Ruru: Should be called the 4060 Ultra or 4070 SE. This reminds me of the infamous GT 1030 DDR4.
Reminds me of the AGP era (Radeon 8500 > Radeon 9000 Pro, and Radeon 9500 > Radeon 9550).
Posted on Reply