Monday, March 27th 2023
12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings
It looks like 12 GB will be the standard memory size for the NVIDIA GeForce RTX 4070 graphics card the company plans to launch in mid-April 2023. The card very likely has 12 GB of memory across the 192-bit memory bus width of the "AD104" silicon the SKU is based on. The RTX 4070 is already heavily cut down from the RTX 4070 Ti that maxes out the "AD104," with the upcoming SKU featuring just 5,888 CUDA cores, compared to the 7,680 of the RTX 4070 Ti. The memory sub-system, however, could see NVIDIA use the same 21 Gbps-rated GDDR6X memory chips, which, across the 192-bit memory interface, produce 504 GB/s of memory bandwidth. Confirmation of the memory size came from regulatory filings of several upcoming custom-design RTX 4070 board models by MSI and GIGABYTE with the Eurasian Economic Commission (EEC) and the Korean NRRA.
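The article's 504 GB/s figure follows directly from the stated specs. A minimal sketch of the arithmetic, assuming the usual convention that bandwidth is the per-pin data rate times the bus width, divided by 8 to convert bits to bytes:

```python
# Reproduce the 504 GB/s figure from the specs stated in the article.
data_rate_gbps = 21    # per-pin data rate of the GDDR6X chips (Gbps)
bus_width_bits = 192   # AD104 memory interface width on the RTX 4070

# Bandwidth (GB/s) = data rate per pin * bus width, converted from Gb to GB.
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gbs)  # 504.0
```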
Sources:
harukaze5719 (Twitter), VideoCardz
62 Comments
See, saving money is easy, it's all about expectations.
Shader count isn't either, and $300 for an x70 isn't realistic to begin with.
Let's refresh our memories a bit: a crippled 3.5 GB GTX 970 already had a $329 MSRP.
Nine years ago.
But we all know this x70 won't release for $400-450, it'll do $550+ at least.
And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.
But people hate thinking that the prices today are caused by anything but greed. If the market will pay it, there's no reason not to; leaving money on the table would be silly.
Then again, $500 for a custom model means you got the MSRP right.
How about some current precedent? The 6900 XT with a 256-bit bus is able to keep up in raster with the 320-bit 3080 and 384-bit 3090, depending on whether AMD bothered to release optimized drivers. The 7900 XT, a 320-bit card, averages out to the same speed as the 4070 Ti, a 192-bit card, and loses to the 4080, a 256-bit card.
www.techspot.com/review/2642-radeon-7900-xt-vs-geforce-rtx-4070-ti/#1440p
Bits != speed.
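The commenter's point can be made concrete by computing raw bandwidth for the cards named above. The per-pin data rates below are taken from public spec sheets and should be treated as assumptions; note that raw GB/s still doesn't map directly onto gaming performance (cache sizes and drivers matter too), which is exactly the point:

```python
# Bus width alone doesn't set bandwidth, and neither metric equals speed.
# Data rates (Gbps) are commonly cited spec-sheet values; treat as assumptions.
cards = {
    "RX 6900 XT":  (256, 16.0),   # (bus width in bits, data rate in Gbps)
    "RTX 3080":    (320, 19.0),
    "RTX 3090":    (384, 19.5),
    "RX 7900 XT":  (320, 20.0),
    "RTX 4070 Ti": (192, 21.0),
    "RTX 4080":    (256, 22.4),
}
for name, (bus_bits, rate_gbps) in cards.items():
    # Bandwidth (GB/s) = bus width * per-pin rate / 8 bits-per-byte.
    print(f"{name}: {bus_bits * rate_gbps / 8:.0f} GB/s")
```

Despite its narrow 192-bit bus, the 4070 Ti keeps pace with the 320-bit 7900 XT in the linked review, largely thanks to its big L2 cache.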
*No, the irony of talking cost saving when video cards cost as much as they do today is not lost on me.
Nvidia's data center revenue % has been growing for years.
I agree with you, though: $300-400 is not happening ever again on an XX70 card. People will be lucky if the 4060 is remotely close to $300.
We got spoiled by years of cheap cards thanks to the 2008 recession and slow recovery.
Poor Nvidia, only making 7% above their average during an economic expansion while everyone else is struggling to put food on the table. Woe is them.
I still don't think that it's nearly enough for that potent of a video card though.
Milking too
I was pretty underwhelmed with the 7000 series, to the point that I'm not surprised at 4000 series pricing. I feel like AMD took at least a step back versus the 6000 series, which in general competed better with Nvidia's 90-tier card. Not that the performance is bad; it's actually pretty good. But the 4080 is one of the most underwhelming Nvidia cards from a price perspective, literally a 71% price increase versus its predecessor. Even at that ridiculous $1,200 MSRP, Nvidia left AMD with a huge window to obliterate the 4080/4070 Ti, and at best they are matching them, which is kind of sad.
I really hope that at the XX70 tier and lower, where RT matters a lot less, the 7000 series is much more impressive.
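The "71% price increase" figure checks out, assuming the commonly cited launch MSRPs of $699 for the RTX 3080 and $1,199 for the RTX 4080 (values not stated in the thread, so treat them as assumptions):

```python
# Quick check of the "~71% price increase" claim using commonly cited
# launch MSRPs (assumptions, not figures from the thread itself).
rtx_3080_msrp = 699    # RTX 3080 launch MSRP (USD)
rtx_4080_msrp = 1199   # RTX 4080 launch MSRP (USD)

increase = (rtx_4080_msrp - rtx_3080_msrp) / rtx_3080_msrp
print(f"{increase:.1%}")  # 71.5%
```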