Wednesday, September 11th 2024

GeForce RTX 4070 with Slower GDDR6 Memory Priced on-par with Regular RTX 4070 Cards

NVIDIA GeForce board partners are preparing a silent launch of a variant of the GeForce RTX 4070 with slower 20 Gbps GDDR6 memory in place of the 21 Gbps GDDR6X that's standard to the RTX 4070, which results in a roughly 5% reduction in memory bandwidth. It turns out that other specs, such as GPU clocks or core configuration, aren't changed to compensate for the reduced memory bandwidth. ASUS is among the first board partners with an RTX 4070 GDDR6 card, the ASUS DUAL RTX 4070 GDDR6, which was briefly listed on Newegg for $569 before it went out of stock. This is reported by VideoCardz as being the same price as the regular ASUS DUAL RTX 4070 with GDDR6X.

ASUS isn't the only NVIDIA board partner with an RTX 4070 GDDR6; Wccftech spotted a GALAX-branded card that carries the model string "RTX 4070 D6 1-click OC." Its retail box features a large specs sheet on the front face that clearly lists GDDR6 as the memory type. NVIDIA's decision to re-spec the RTX 4070 with 20 Gbps GDDR6 was originally seen as a cost-reduction measure that would let the card be sold closer to the $500 mark. It remains to be seen whether real-world prices settle below those of the original RTX 4070 cards.
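For reference, the roughly 5% figure follows directly from the per-pin data rates and the RTX 4070's 192-bit memory bus (the bus width is the card's standard published spec, not something this listing changes); a quick back-of-the-envelope sketch of the arithmetic, in Python:

# Peak bandwidth comparison for the two memory configurations.
# Assumes the RTX 4070's standard 192-bit memory interface.
BUS_WIDTH_BITS = 192

def peak_bandwidth_gbs(data_rate_gbps: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak memory bandwidth in GB/s for a given per-pin data rate."""
    return data_rate_gbps * bus_bits / 8

gddr6x = peak_bandwidth_gbs(21)  # 504.0 GB/s (stock GDDR6X RTX 4070)
gddr6 = peak_bandwidth_gbs(20)   # 480.0 GB/s (new GDDR6 variant)
print(f"{gddr6x:.0f} -> {gddr6:.0f} GB/s ({(1 - gddr6 / gddr6x) * 100:.1f}% less)")  # 504 -> 480 GB/s (4.8% less)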
Sources: VideoCardz, Hassan Mujtaba (Twitter)

75 Comments on GeForce RTX 4070 with Slower GDDR6 Memory Priced on-par with Regular RTX 4070 Cards

#26
Neo_Morpheus
Kind of mild comments, with some rightful and realistic ones.

I can guarantee that if this was an AMD product….oh boy, these comments would be….spicy. :D
#27
64K
Neo_Morpheus: Kind of mild comments, with some rightful and realistic ones.

I can guarantee that if this was an AMD product….oh boy, these comments would be….spicy. :D
AMD can't do this because they already use the slower GDDR6 with their GPUs.
#28
gffermari
It's not the first time nVidia has done something like this.
Although the difference between the versions will most likely be minimal to non-existent, it should have been noted in the product name.
#29
bug
Dr. Dro: Nonsense, hardware revision for high-volume SKUs is common practice. How many variants of Polaris did AMD release? ;)
Gigabyte mobos come to mind. They tend to get a v2.0 or v1.1 right after the initial release.
As long as the box says GDDR6 instead of GDDR6X, that's enough differentiation for me. (It will also be a different SKU, but who looks up those numbers?)
#30
Nater
64K: That's not what this topic is about and the GDDR6 decision isn't an across the board change so it was a bad analogy on Nater's part. Right?
They didn't shrink the salt and vinegar chips at the same time they did the sour cream and onion. It's still fuggin shrinkflation.
#31
evernessince
64K: AMD can't do this because they already use the slower GDDR6 with their GPUs.
Lower memory speed without changing model name?

Both AMD and Nvidia have done it before, particularly in the low end recently.
64K: That's not what this topic is about and the GDDR6 decision isn't an across the board change so it was a bad analogy on Nater's part. Right?
He was providing an example of customers getting less value, which directly relates to this as that's exactly what's happening here.

The apathy towards the continued gradual reduction in value is palpable here. We've been getting less and paying more generation over generation, and it frankly astounds me that there are people who can't see where that's already gotten us and will continue to take us. This small decrease on its own isn't huge, but the series of reductions (whether that be the VRAM or chip size you get at a given budget) compared to what we should have or would have historically gotten, plus the price increases, has more than added up.

Nvidia is testing the waters with this move to see how people react to them lowering the specs on an x70-class card, just like they were testing the waters with the 2nd nuked 4080. They will give you as little as you are willing to accept, which for some appears to be continuously shrinking.
Onasi: @Dr. Dro
Right. I don’t think anything indicated that the 4070 was ever bandwidth starved (complaints were about the AMOUNT of VRAM) and 5% is a negligible decrease. And the regular GDDR6 should be less power hungry and cooler than 6X, so it might be an arguable improvement.
I suppose this is a nice Nvidia marketing line: "Oh look, we 'upgraded' you to a more power-efficient product by making it slower".

Right in line with Intel marketing as well. How can we be making the argument that when Nvidia uses GDDR6 it's an "arguable" upgrade, but when AMD uses GDDR6 it's inferior?

Can't believe I even have to point out the extremely obvious double standard, aside from the fact that we shouldn't be arguing for a memory downgrade.
#32
64K
evernessince: Lower memory speed without changing model name?

Both AMD and Nvidia have done it before, particularly in the low end recently.



He was providing an example of customers getting less value, which directly relates to this as that's exactly what's happening here.

The apathy towards the continued gradual reduction in value is palpable here. We've been getting less and paying more generation over generation, and it frankly astounds me that there are people who can't see where that's already gotten us and will continue to take us. This small decrease on its own isn't huge, but the series of reductions (whether that be the VRAM or chip size you get at a given budget) compared to what we should have or would have historically gotten, plus the price increases, has more than added up.

Nvidia is testing the waters with this move to see how people react to them lowering the specs on an x70-class card, just like they were testing the waters with the 2nd nuked 4080. They will give you as little as you are willing to accept, which for some appears to be continuously shrinking.
What apathy? Members here, including myself, are saying the price should be lower on the GDDR6 version, but we don't know what the real-world impact will be on benches until we see it. If you recall, Nvidia released a version of the 1060 with faster VRAM and didn't call it anything but a 1060; go back and read the review here. The faster VRAM hardly made a difference.

Edit: here is the link to the review and benches:

www.techpowerup.com/review/kfa2-gtx-1060-6-gb-gddr5x/31.html
#33
Onasi
evernessince: Right in line with Intel marketing as well. How can we be making the argument that when Nvidia uses GDDR6 it's an "arguable" upgrade but AMD's GDDR6 is inferior?

Can't believe I even have to point out the extremely obvious double standard, aside from the fact that we shouldn't be arguing for memory downgrade.
That's a nice talking point, except I have never, not once in my whole time on TPU, argued that AMD using G6 is "inferior" and, in fact, have said multiple times that G6X is a power hog with dubious performance benefits, called it a "janky experiment" by NV and Micron, and called for them to stop circumventing JEDEC spec for marketing reasons and just use GDDR6 for anything short of a 4090 (that thing is halo, they can throw whatever experimental BS on that one). So no, no double standards on my part there. In my PERSONAL view - this isn't a downgrade. Losing ~5% bandwidth on a card that isn't starved for bandwidth, in exchange for lower power consumption and heat, is an acceptable trade off.
#34
evernessince
64K: What apathy? Members here, including myself, are saying the price should be lower on the GDDR6 version, but we don't know what the real-world impact will be on benches until we see it. If you recall, Nvidia released a version of the 1060 with faster VRAM and didn't call it anything but a 1060; go back and read the review here. The faster VRAM hardly made a difference.
Aside from Onasi saying it's arguably an improvement, your comment was a lot more wishy-washy than you are implying:
They list the GDDR6 in the specs on Newegg and put it on the box but I doubt most gamers look at that. They probably just see 4070 name. Price should be lower but I have yet to see how much difference the slower VRAM makes so it's hard to tell what the price should be. Might just amount to a minuscule difference in benches.
No, regardless of what the benches say, it should be cheaper because it's cheaper for Nvidia. GDDR6 is cheaper than GDDR6X. There was no need to condition your statement on performance. You as a customer are getting an objectively lesser product with cheaper RAM; you should want a discount.
#35
AnotherReader
64K: What apathy? Members here, including myself, are saying the price should be lower on the GDDR6 version, but we don't know what the real-world impact will be on benches until we see it. If you recall, Nvidia released a version of the 1060 with faster VRAM and didn't call it anything but a 1060; go back and read the review here. The faster VRAM hardly made a difference.

Edit: here is the link to the review and benches:

www.techpowerup.com/review/kfa2-gtx-1060-6-gb-gddr5x/31.html
In the 1060's case, the review notes that the GDDR5X was clocked lower to match the bandwidth of 8 Gbps GDDR5.
#36
Hecate91
The double standards are interesting. Nvidia gets praised and defended for marketing the VRAM change as an efficiency improvement, even though the cost of the cheaper GDDR6 isn't being passed down to the consumer. On top of that, the product naming is misleading; anyone looking to buy a 4070 may not notice they're buying a GDDR6 version.
If this was AMD doing it there would be a massive outrage and every tech channel would be bashing them for it.
#37
64K
evernessince: Aside from Onasi saying it's arguably an improvement, your comment was a lot more wishy-washy than you are implying:



No, regardless of what the benches say, it should be cheaper because it's cheaper for Nvidia. GDDR6 is cheaper than GDDR6X. There was no need to condition your statement on performance. You as a customer are getting an objectively lesser product with cheaper RAM; you should want a discount.
Nonsense, re-read my reply. It's not at all wishy-washy. Here, I'll paste it for you:

"Same price as GDDR6X version. They list the GDDR6 in the specs on Newegg and put it on the box but I doubt most gamers look at that. They probably just see 4070 name. Price should be lower but I have yet to see how much difference the slower VRAM makes so it's hard to tell what the price should be. Might just amount to a minuscule difference in benches."

There is nothing wishy-washy about wanting to see some benchmarks before bashing Nvidia. Don't you want to be fair and unbiased?
#38
evernessince
Onasi: That's a nice talking point, except I have never, not once in my whole time on TPU, argued that AMD using G6 is "inferior" and, in fact, have said multiple times that G6X is a power hog with dubious performance benefits, called it a "janky experiment" by NV and Micron, and called for them to stop circumventing JEDEC spec for marketing reasons and just use GDDR6 for anything short of a 4090 (that thing is halo, they can throw whatever experimental BS on that one). So no, no double standards on my part there. In my PERSONAL view - this isn't a downgrade. Losing ~5% bandwidth on a card that isn't starved for bandwidth, in exchange for lower power consumption and heat, is an acceptable trade off.
GDDR6 is cheaper than GDDR6X. There's no arguing around that, and as a result the card should be cheaper. In essence you are saying that it's fine if Nvidia changes parts out in instances where the performance impact isn't huge, regardless of considerations like cost difference. Mind you, if you want lower power/heat on GDDR6X, nothing is stopping you from tuning it, but the whole performance argument is irrelevant to the cost difference.

FYI, I never said that you said GDDR6 was inferior. I was pointing out that those statements were made in general, to show that the overall response to a memory downgrade in this thread has ranged from indifferent to supportive (as in your case).
#39
Assimilator
Onasi: That's a nice talking point, except I have never, not once in my whole time on TPU, argued that AMD using G6 is "inferior" and, in fact, have said multiple times that G6X is a power hog with dubious performance benefits, called it a "janky experiment" by NV and Micron, and called for them to stop circumventing JEDEC spec for marketing reasons and just use GDDR6 for anything short of a 4090 (that thing is halo, they can throw whatever experimental BS on that one). So no, no double standards on my part there. In my PERSONAL view - this isn't a downgrade. Losing ~5% bandwidth on a card that isn't starved for bandwidth, in exchange for lower power consumption and heat, is an acceptable trade off.
TBH I've never understood NVIDIA's obsession with weird nonstandard memories. As we've already demonstrated, the nominal impact is ~5%, which really doesn't seem to be necessary, or worth all the R&D cost (and increased power draw) that has to be paid. NVIDIA gets a lot of hate on these forums for a lot of stupid irrelevant bullshit, but making VRAM more expensive by forking its production is an unnecessary waste.
#40
evernessince
64K: Nonsense, re-read my reply. It's not at all wishy-washy. Here, I'll paste it for you:

"Same price as GDDR6X version. They list the GDDR6 in the specs on Newegg and put it on the box but I doubt most gamers look at that. They probably just see 4070 name. Price should be lower but I have yet to see how much difference the slower VRAM makes so it's hard to tell what the price should be. Might just amount to a minuscule difference in benches."

There is nothing wishy-washy about wanting to see some benchmarks before bashing Nvidia. Don't you want to be fair and unbiased?
Performance has no bearing on what is an objectively cheaper product to produce.
#41
sethmatrix7
Dr. Dro: Nonsense, hardware revision for high-volume SKUs is common practice. How many variants of Polaris did AMD release? ;)
#42
evernessince
Assimilator: TBH I've never understood NVIDIA's obsession with weird nonstandard memories. As we've already demonstrated, the nominal impact is ~5%, which really doesn't seem to be necessary, or worth all the R&D cost (and increased power draw) that has to be paid. NVIDIA gets a lot of hate on these forums for a lot of stupid irrelevant bullshit, but making VRAM more expensive by forking its production is an unnecessary waste.
If AMD can afford to release different VRAM variants of low-end cards when it moves a fraction of the volume that Nvidia does, I'd wager it's really not that expensive to do so.
#43
Dr. Dro
In the earlier thread about this product it was mentioned that these spec revisions, as well as halo products, rarely if ever see a price reduction. This is technically enshittification, but there are a few salient points to be accounted for. G6's lower electrical and thermal footprint may be beneficial (even though Ada's newer-gen G6X chips aren't as bad as Ampere's used to be), and timings might offset the difference - we'll simply have to wait and see.

Regardless, anyone who currently owns a GDDR6X RTX 4070 can likely replicate this new card's performance by lowering memory speed by 1 Gbps, which would be reducing the G6X frequency to 1250 MHz. Maybe one of you fine folks can make a thread about it - 20 Gbps G6 would be running at roughly 2500 MHz, so there's the signaling change.
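A small sketch of the conversion behind those clock figures, assuming the usual reporting convention that GDDR6X (PAM4 signaling) data rates are shown as 16x the displayed memory clock while plain GDDR6 (NRZ) is shown as 8x:

# Convert a per-pin data rate to the memory clock typically displayed by
# monitoring tools (assumed convention: GDDR6X/PAM4 = 16x clock, GDDR6 = 8x clock).
def displayed_clock_mhz(data_rate_gbps: float, mem_type: str) -> float:
    multiplier = 16 if mem_type == "GDDR6X" else 8
    return data_rate_gbps * 1000 / multiplier

print(displayed_clock_mhz(20, "GDDR6X"))  # 1250.0 MHz - G6X underclocked to 20 Gbps
print(displayed_clock_mhz(20, "GDDR6"))   # 2500.0 MHz - stock 20 Gbps GDDR6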
#44
Onasi
@evernessince
I haven’t mentioned pricing in this thread at all. Of course, hypothetically, it would be nice for the 4070 GDDR6 version to be cheaper. Hell, the 4070 itself has always been relatively overpriced for what it was. But here’s the rub - it won’t be cheaper. You know it, I know it. There’s no reason for it to be. It will still sell. Because that’s what the market in which one of the players holds 80%+ share looks like. You saying it’s an “objectively cheaper card to build” kind of ignores the fact that there are A LOT of objectively inexpensive-to-make products that are still priced with insane margins because that’s how branding and mind share works. What, you think an additional 128 gigs on an iPhone cost Apple 150 dollars?

I see no reason to tilt at windmills. Saying that something should be cheaper in a comment on an enthusiast forum won’t make NV reconsider their pricing policy. What would is real competition. And we’re fairly barren on that nowadays.
#45
kapone32
Dr. Dro: Nonsense, hardware revision for high-volume SKUs is common practice. How many variants of Polaris did AMD release? ;)
2
#46
RedelZaVedno
Ngreedia at its best. Milking uneducated consumers.
#47
tussinman
Remember being excited when the original 4070 got announced. 3080-level performance with less power, could possibly be a good value? Nope, most of the models in stock were like $650. Here we are in almost 2025 and they will still most likely try to charge over $500 for this variant.
#48
yfn_ratchet
64K: Bad analogy. That would imply that Nvidia will just replace GDDR6X with GDDR6 VRAM in all GPUs and keep the same MSRP.
Don't jinx it now. :laugh:
#49
Random_User
This is utter BS. Even if the performance is on par with GDDR6X, the 5% lower bandwidth spec should manifest as a 5% lower price. Period.
The sole fact that "usual" GDDR6 has a much broader supply from more RAM makers should have already driven the price down significantly. I'm somewhat sure that the "savings" from this transition alone should have freed up a huge pile of money. Not to mention, this VRAM change may allow smaller coolers due to the lower heat output, and thus even lower expenses.
Unless the VRAM in these cards comes from a similar corrupt "exclusive" deal or contract with Micron, there's no way this is the real price.
But yeah, this is targeted at non-savvy/unaware people, who will just see the giant "4070" lettering on the green fancy box and will mistakenly part with more money than they should.
#50
Dr. Dro
kapone32: 2
Disregarding workstation and mining, these are all the same Ellesmere die (the most common Polaris 10/20 variant), sold as Radeon RX gaming GPUs. The only differences between them are minor revisions, such as memory speed or core configuration:

RX 470
RX 470D
RX 480
RX 570
RX 570X
RX 580G
RX 580X
RX 580 2048SP
RX 580
RX 590 GME

Let's see, that's 2x5, aka 10. It's perfectly normal for high volume, middle segment products to have subvariants and different SKUs. Not the first time NVIDIA did it, not the first time AMD did it - both will continue to do so in the future.
Random_User: This is utter BS. Even if the performance is on par with GDDR6X, the 5% lower bandwidth spec should manifest as a 5% lower price. Period.
The sole fact that "usual" GDDR6 has a much broader supply from more RAM makers should have already driven the price down significantly. I'm somewhat sure that the "savings" from this transition alone should have freed up a huge pile of money. Not to mention, it should show up in smaller coolers, due to the lower heat output.
Unless the VRAM in these cards comes from a similar corrupt "exclusive" deal or contract with Micron, there's no way this is the real price.
But yeah, this is targeted at non-savvy/unaware people, who will just see the giant "4070" lettering on the green fancy box and will mistakenly part with more money than they should.
The RTX 4070 uses the same memory ICs that are in demand for building the high-margin RTX 4090. This is why this GDDR6 variant was even created: so they can reallocate the chip supply to the RTX 4090 assembly lines. If the performance is in the same ballpark, most users won't mind, so in that regard, it makes no business sense to reduce the price. Demand is as high as it has ever been, after all.