Wednesday, September 11th 2024

GeForce RTX 4070 with Slower GDDR6 Memory Priced on-par with Regular RTX 4070 Cards

NVIDIA GeForce board partners are preparing a silent launch of a GeForce RTX 4070 variant with slower 20 Gbps GDDR6 memory in place of the 21 Gbps GDDR6X that's standard to the RTX 4070, which results in a roughly 5% reduction in memory bandwidth. Other specs, such as GPU clocks and core configuration, are unchanged, so nothing compensates for the reduced memory bandwidth. ASUS is among the first board partners with an RTX 4070 GDDR6 card, the ASUS DUAL RTX 4070 GDDR6, which was briefly listed on Newegg for $569 before it went out of stock. VideoCardz reports this is the same price as the regular ASUS DUAL RTX 4070 with GDDR6X.
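For reference, the bandwidth figures follow directly from the per-pin data rate and the bus width; the quick sketch below assumes the RTX 4070's standard 192-bit memory bus.

```python
# Back-of-the-envelope bandwidth comparison, assuming the RTX 4070's
# standard 192-bit memory bus (data rates are per-pin, in Gbps).
BUS_WIDTH_BITS = 192

def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak bandwidth in GB/s = per-pin data rate * bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

gddr6x = peak_bandwidth_gbs(21.0)  # original RTX 4070 (GDDR6X): 504 GB/s
gddr6 = peak_bandwidth_gbs(20.0)   # GDDR6 variant:              480 GB/s
print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR6: {gddr6:.0f} GB/s, "
      f"reduction: {(1 - gddr6 / gddr6x) * 100:.1f}%")  # ~4.8%
```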

ASUS isn't the only NVIDIA board partner with an RTX 4070 GDDR6: Wccftech spotted a GALAX-branded card with the model string "RTX 4070 D6 1-click OC." Its retail box features a large specs sheet on the front face that clearly lists GDDR6 as the memory type. NVIDIA's decision to re-spec the RTX 4070 with 20 Gbps GDDR6 was originally seen as a cost-cutting measure that would let the card be sold closer to the $500 mark. It remains to be seen whether real-world prices settle below those of the original RTX 4070 cards.
Sources: VideoCardz, Hassan Mujtaba (Twitter)

75 Comments on GeForce RTX 4070 with Slower GDDR6 Memory Priced on-par with Regular RTX 4070 Cards

#52
Dr. Dro
The Norwegian Drone Pilot: GALAX GeForce RTX 4070 OC 2X Graphics Card Review: Out Goes GDDR6X, In Goes GDDR6 (wccftech.com)

The RTX 4070 GDDR6 offers 99% of the performance of the GDDR6X variant.
The other thing is that we saw the GDDR6 temperatures to be moderately lower than GDDR6X and the power consumption of the card was also a bit lower.
Yep. It's like I was saying all along. Net zero performance impact, a slight increase in power efficiency and lower thermals on the memory area. Nothing to see here.
Posted on Reply
#53
AnotherReader
Dr. Dro: Yep. It's like I was saying all along. Net zero performance impact, a slight increase in power efficiency and lower thermals on the memory area. Nothing to see here.
The text doesn't match the power consumption graphs. All in all, it looks like GDDR6X is now slightly more efficient than GDDR6.



Posted on Reply
#54
wNotyarD
Dr. Dro: Yep. It's like I was saying all along. Net zero performance impact, a slight increase in power efficiency and lower thermals on the memory area. Nothing to see here.
For the common none-the-wiser buyer, it may be just this simple.
But for us, tech-savvy enthusiasts who know what's going on, there's no way we won't ask for a price reduction when we know the BOM was lowered and NVIDIA's profit increased.
Posted on Reply
#55
Dr. Dro
AnotherReader: The text doesn't match the power consumption graphs. All in all, it looks like GDDR6X is now slightly more efficient than GDDR6.



Could be swapped; there are more than a few errors in that review. They need a proofreader badly, and most of it seems to refer to the card as a Super. It's such a minute change anyway that the normal variance between chips, as well as the probably slightly higher clocks of the "OC" variant, could account for a difference of less than one percent. That won't invalidate the fact that they are functionally equal: the G6 variant delivers 99% of the performance at 101% of the efficiency.

Hopefully W1zz will get a sample as well.
wNotyarD: For the common none-the-wiser buyer, it may be just this simple.
But for us, tech-savvy enthusiasts who know what's going on, there's no way we won't ask for a price reduction when we know the BOM was lowered and NVIDIA's profit increased.
I agree but that's where they'll tell us to go pound sand :oops:
Posted on Reply
#56
64K
wNotyarD: For the common none-the-wiser buyer, it may be just this simple.
But for us, tech-savvy enthusiasts who know what's going on, there's no way we won't ask for a price reduction when we know the BOM was lowered and NVIDIA's profit increased.
We don't know that Nvidia's profit increased. They only sell the GPU. The card manufacturers have the cost of whichever VRAM they buy factored into their bottom line. Shouldn't they be the ones to adjust the price?
Posted on Reply
#57
evernessince
Onasi: @evernessince
I haven’t mentioned pricing in this thread at all. Of course, hypothetically, it would be nice for the 4070 GDDR6 version to be cheaper. Hell, the 4070 itself has always been relatively overpriced for what it was. But here’s the rub - it won’t be cheaper. You know it, I know it. There’s no reason for it to be. It will still sell. Because that’s what the market in which one of the players holds 80%+ share looks like. You saying it’s an “objectively cheaper card to build” kind of ignores the fact that there are A LOT of objectively inexpensive to make products that are still priced with insane margins because that’s how branding and mind share works. What, you think additional 128 gigs on an iPhone cost Apple 150 dollars?

I see no reason to charge against windmills. Saying that something should be cheaper in a comment on an enthusiast forum won’t make NV reconsider their pricing policy. What would is real competition. And we’re fairly barren on that nowadays.
Can't argue with any of this. I just wish the market wasn't the way it is...
Posted on Reply
#58
kondamin
Shame they didn’t double it
Posted on Reply
#59
wolf
Better Than Native
Man, the persecution complex for some who have entangled their whole identity in AMD is real; to show that level of support, I'd expect a paycheck or free products.

This is looking like pretty much a nothing burger, as predicted. I do think Nvidia should at least make it transparent which memory variant you're buying, and with a smidge of luck, price cuts will start in 3-6 months as the 50 series and RDNA4 hit the market.
Posted on Reply
#60
Pepamami
well that would be nice, but not for 12GB
Posted on Reply
#61
shadad
Next, 4070 with no memory.
Posted on Reply
#62
zenlaserman
las: Well, some people are still buying Radeon 6000 and RTX 3000 series. I can't imagine that either, yet it happens.
Check out the guy who's chased the performance dragon so far up the high-horse's arse that he lost sight of what the real world does. Or pretends to be clueless anyway - maybe you're one of those fake elitists. So, the real world buys what's available where they are at, at what they can afford. You prolly know this, your own online ego has been pressuring you to lie to yourself to gain clout points you'll never gain from people you'll never see. There's a lot of guys acting like you in this forum atm smh.
las: Has nothing to do with brand loyalty, most people don't really care - If AMD GPUs were competitive, they would sell.
Now you think you speak for "most people" lol. You're one of those. When it comes to die-hard Radeon users, there _is_ brand loyalty (I for example have not bought nV since G92), and there are many gamers using exclusively Radeons because they ARE competitive and they DO sell. As most of us know, most consoles use Radeons, and there are more console users than there are PC gamers. BTW, Radeons work just fine in PC games, too, even if it's onboard.
Posted on Reply
#63
Hecate91
wolf: Man, the persecution complex for some who have entangled their whole identity in AMD is real; to show that level of support, I'd expect a paycheck or free products.

This is looking like pretty much a nothing burger, as predicted. I do think Nvidia should at least make it transparent which memory variant you're buying, and with a smidge of luck, price cuts will start in 3-6 months as the 50 series and RDNA4 hit the market.
Not having brand loyalty to Nvidia so strong that you defend a product priced the same despite not having the same specs as before isn't a persecution complex.
Is the leather jacket man paying well? When Nvidia launches a different version of the same card it's nothing, but when AMD does it, people are upset over it.
Posted on Reply
#64
las
zenlaserman: Check out the guy who's chased the performance dragon so far up the high-horse's arse that he lost sight of what the real world does. Or pretends to be clueless anyway - maybe you're one of those fake elitists. So, the real world buys what's available where they are at, at what they can afford. You prolly know this, your own online ego has been pressuring you to lie to yourself to gain clout points you'll never gain from people you'll never see. There's a lot of guys acting like you in this forum atm smh.



Now you think you speak for "most people" lol. You're one of those. When it comes to die-hard Radeon users, there _is_ brand loyalty (I for example have not bought nV since G92), and there are many gamers using exclusively Radeons because they ARE competitive and they DO sell. As most of us know, most consoles use Radeons, and there are more console users than there are PC gamers. BTW, Radeons work just fine in PC games, too, even if it's onboard.
Yeah AMD is doing really well in the GPU space:

www.tomshardware.com/pc-components/gpus/nvidias-grasp-of-desktop-gpu-market-balloons-to-88-amd-has-just-12-intel-negligible-says-jpr

store.steampowered.com/hwsurvey/videocard/ - Barely any AMD dGPUs in top 25

www.techpowerup.com/326415/amd-confirms-retreat-from-the-enthusiast-gpu-segment-to-focus-on-gaining-market-share

www.theverge.com/2024/9/9/24240173/amd-udna-gpu-ai-gaming-rdna-cdna-jack-huynh

This is a hardware forum; some of us care about hardware.
Posted on Reply
#65
wolf
Better Than Native
Hecate91: Is the leather jacket man paying well? When Nvidia launches a different version of the same card it's nothing, but when AMD does it, people are upset over it.
When it performs within a margin of error, it's effectively the same product; plus, you can quote me saying they should tell buyers what version they're getting and that changing the specs is scummy.
Hecate91: Not having brand loyalty to Nvidia so strong that you defend a product priced the same despite not having the same specs as before isn't a persecution complex.
Zero brand loyalty, literally zero. From what's been tested, the spec difference *checks notes* doesn't seem to make a difference to the product's performance. How about we see when and if that happens to an AMD product, rather than having a pre-emptive cry about a made-up argument? We've got people in here with AMD tattooed on their brains complaining about something that hasn't happened; if it does, mark this post so you can quote me on how I feel about that [currently theoretical] situation. Because yes, having a pre-emptive cry that if AMD did the same they'd be treated worse is tantamount to a persecution complex. I hold everyone to the same standard, unlike some who wear their bias on their sleeve.
Posted on Reply
#66
bug
Hecate91: Not having brand loyalty to Nvidia so strong that you defend a product priced the same despite not having the same specs as before isn't a persecution complex.
Happens all the time. Sometimes manufacturers will replace resistors or capacitors because there are cheaper alternatives, or because they can't source the original version anymore*. You were getting a 4070 before, and you're still getting a 4070 now. If you were after a specific GDDR version, that's on the box, too.

*And then there's Kingston, who will only guarantee certain parameters for an SSD model while reserving the right to replace any and all components inside. That's a valid way to do business, too, even if it gives some tech-savvy users pause.
Posted on Reply
#67
evernessince
Onasi: @Dr. Dro
Right. I don’t think anything indicated that the 4070 was ever bandwidth starved (complaints were about the AMOUNT of VRAM) and 5% is a negligible decrease. And the regular GDDR6 should be less power hungry and cooler than 6X, so it might be an arguable improvement.
Just a follow-up to this: we can now confirm the GDDR6 model is 4% slower on average and less power efficient:

The 4070 GDDR6 model consumes 1.52% less power while being 4% slower. It also consumes more power at idle. So overall a small reduction in efficiency.
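Putting those two percentages together (a minimal sketch using only the figures quoted above, not a substitute for any review's own data), the drop in performance per watt works out to roughly 2-3%:

```python
# Rough performance-per-watt comparison built only from the percentages above.
perf_ratio = 0.96     # GDDR6 card is ~4% slower on average
power_ratio = 0.9848  # GDDR6 card draws ~1.52% less power

efficiency_ratio = perf_ratio / power_ratio  # performance per watt, relative to GDDR6X
print(f"Relative efficiency: {efficiency_ratio * 100:.1f}% of the GDDR6X card")
# -> about 97.5%, i.e. roughly a 2.5% drop in performance per watt
```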




I believe the video describes it perfectly: it's a small, sneaky downgrade. It's baffling why Nvidia does things like this when it would be so easy to avoid.
Posted on Reply
#68
bug
evernessince: Just a follow-up to this: we can now confirm the GDDR6 model is 4% slower on average and less power efficient:

The 4070 GDDR6 model consumes 1.52% less power while being 4% slower. It also consumes more power at idle. So overall a small reduction in efficiency.

I believe the video describes it perfectly: it's a small, sneaky downgrade. It's baffling why Nvidia does things like this when it would be so easy to avoid.
First of all, I don't see how a 5% bandwidth reduction results in a 4% performance penalty*.
Second of all, 4% is within the margin of error; you can get that much benchmarking the same video card model from different batches.

*Notice how you have to go from 8200 all the way down to 6800 to see that kind of difference. Even then, it's not across the board.
Posted on Reply
#69
Lew Zealand
bug: First of all, I don't see how a 5% bandwidth reduction results in a 4% performance penalty*.
Second of all, 4% is within the margin of error; you can get that much benchmarking the same video card model from different batches.

*Notice how you have to go from 8200 all the way down to 6800 to see that kind of difference. Even then, it's not across the board.
GPU memory bandwidth is generally taxed harder at higher framerates, thus showing a greater effect at lower resolutions. The 4% drop is at 1080p, whereas it's 3% at 1440p, where this GPU is more likely to be used, and 2% at 4K, FWIW, further showing the trend. If it's repeatable, then it's not within the margin of error, but it's well within the margin of not noticing it while playing if there's no fps counter active. However, with no reduction in price it's also lower value, and frankly, I'd like that 3% price reduction to a $534 list price.
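To illustrate why lower resolutions can lean harder on memory bandwidth, here's a rough sketch with purely hypothetical per-frame traffic and frame-rate numbers; the point is only that demand scales as data moved per frame times frames per second.

```python
# Purely hypothetical illustration: memory traffic demanded per second is
# roughly (data touched per frame) x (frames per second). At lower resolutions
# the frame rate can rise faster than the per-frame data shrinks.
scenarios = {
    # resolution: (hypothetical GB of VRAM traffic per frame, hypothetical fps)
    "1080p": (2.0, 200),
    "1440p": (3.0, 120),
    "4K":    (5.5, 60),
}

for resolution, (gb_per_frame, fps) in scenarios.items():
    demand_gbs = gb_per_frame * fps
    print(f"{resolution}: ~{demand_gbs:.0f} GB/s of memory traffic demanded")
# With these made-up numbers: 1080p ~400 GB/s > 1440p ~360 GB/s > 4K ~330 GB/s,
# so a bandwidth cut pinches hardest at the lowest resolution.
```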
Posted on Reply
#70
evernessince
bug: First of all, I don't see how a 5% bandwidth reduction results in a 4% performance penalty*.
Second of all, 4% is within the margin of error; you can get that much benchmarking the same video card model from different batches.

*Notice how you have to go from 8200 all the way down to 6800 to see that kind of difference. Even then, it's not across the board.
Most card variants are within margin of error of each other. That doesn't mean the data on their performance delta relative to the reference is irrelevant. If the data can be repeated, it's valid.

Mind you, IMO that's only part of the problem. Nvidia is selling customers a materially cheaper product without letting them know (for example, by adding something to the model name). Some SSD manufacturers do this, and it ruins trust in their brand because people don't know the exact specs of what they're getting. Customers are required to be vigilant, because in a capitalist market with no laws preventing this sort of thing, that's the only protection we have against it becoming even worse.
Posted on Reply
#71
bug
evernessince: Mind you, IMO that's only part of the problem. Nvidia is selling customers a materially cheaper product without letting them know (for example, by adding something to the model name).
Are you insane? It says GDDR6 on the box, that's not different enough for you?
Do you also happen to think car manufacturers should release a different model for each trim?
Posted on Reply
#72
Lew Zealand
bug: Are you insane? It says GDDR6 on the box, that's not different enough for you?
Do you also happen to think car manufacturers should release a different model for each trim?
Look at the graphics on a typical GPU box. There is a single-letter difference between these two models. One letter, and it's not very large, and that's only if you have the two GPUs side by side to compare and you know there's supposed to be a difference. It's the GPU version of Where's Waldo.

Yes, by the time you're spending $550 on a GPU you should be doing a lot of homework, but the difference in labeling between the two is very subtle. There was a thread in the forums here last week from someone who wanted a 3060-class GPU and thought the 3060 8GB was the standard configuration rather than cut down from 12GB. Same problem with a single different number, and an experienced forum member didn't know the product existed and assumed it was new when it's been out for almost two years now.

These changes are deceptive when the major model name doesn't change.
Posted on Reply
#73
londiste
Lew Zealand: GPU memory bandwidth is generally taxed harder at higher framerates, thus showing a greater effect at lower resolutions. The 4% drop is at 1080p, whereas it's 3% at 1440p, where this GPU is more likely to be used, and 2% at 4K, FWIW, further showing the trend. If it's repeatable, then it's not within the margin of error, but it's well within the margin of not noticing it while playing if there's no fps counter active.
This is really strange. Memory bandwidth usually plays a larger part when resolution increases. The difference percentages would make much more sense the other way around.
Will TPU make a full review of a 4070 GDDR6? I am interested in how the power consumption and clock speeds turn out.
Posted on Reply
#74
Lew Zealand
londiste: This is really strange. Memory bandwidth usually plays a larger part when resolution increases. The difference percentages would make much more sense the other way around.
Will TPU make a full review of a 4070 GDDR6? I am interested in how the power consumption and clock speeds turn out.
I'm just going on what I observe in GPU-Z when testing the same GPU at different resolutions. As the fps goes up at smaller resolutions, while overall GPU power usage stays the same, you'll see the Memory Controller Load % readout go up. Maybe it only works this way in some games or benchmarks and not others? I certainly haven't done an exhaustive look at this, but now I'm interested. Generally I test this with 3DMark Time Spy's second Graphics Test, as it's picky about both memory speed and GPU core overclocking, though I also use Speed Way and, more recently, Steel Nomad.
Posted on Reply
#75
64K
I don't know what is going on with this. I just read an article over on VideoCardz saying the 4070 GDDR6 is launching in Germany at €30 more than the GDDR6X version.
Posted on Reply