Monday, March 27th 2023

12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

It looks like 12 GB will be the standard memory size for the NVIDIA GeForce RTX 4070 graphics card the company plans to launch in mid-April 2023. It is very likely that the card has 12 GB of memory across the 192-bit memory bus of the "AD104" silicon the SKU is based on. The RTX 4070 is already heavily cut down from the RTX 4070 Ti that maxes out the "AD104," with the upcoming SKU featuring just 5,888 CUDA cores, compared to the 7,680 of the RTX 4070 Ti. The memory sub-system, however, could see NVIDIA use the same 21 Gbps-rated GDDR6X memory chips, which, across the 192-bit memory interface, produce 504 GB/s of memory bandwidth. Confirmation of the memory size came from regulatory filings of several upcoming custom-design RTX 4070 board models by MSI and GIGABYTE with the Eurasian Economic Commission (EEC) and the Korean NRRA.
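For reference, the 504 GB/s figure follows directly from the per-pin data rate and the bus width. A minimal sketch of the arithmetic, assuming the 21 Gbps GDDR6X chips and 192-bit interface mentioned above:

```python
# Peak bandwidth = (per-pin data rate x bus width) / 8 bits per byte.
data_rate_gbps = 21          # rated GDDR6X speed per pin (from the article)
bus_width_bits = 192         # AD104 memory interface on this SKU

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")          # -> 504 GB/s

# Capacity: a 192-bit bus is six 32-bit channels, so six 2 GB (16 Gbit)
# chips would give the 12 GB total indicated by the filings.
chips = bus_width_bits // 32
print(f"{chips} x 2 GB = {chips * 2} GB")    # -> 6 x 2 GB = 12 GB
```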
Sources: harukaze5719 (Twitter), VideoCardz

62 Comments on 12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

#51
bug
KovoetNvidia will make the price at whatever as unfortunately they know people will pay for it.
No, they don't: fourweekmba.com/nvidia-revenue-by-segment/
Even at the inflated prices, they managed to lose revenue. That means people aren't buying as much anymore. Fingers crossed this trend continues.
#52
mama
Why_MeSays who.

And this from Wizard on a GPU analysis for The Last of Us Part 1: "Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K."
#53
bug
mamaAnd this from Wizard on a GPU analysis for The Last of Us Part 1: "Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K."
Tbh, neither is marketed as a 4k card. They can push 4k, but, of course, you're going to have to lower something to get there. Both those cards and their AMD counterparts are below the 60fps threshold. Nvidia cards can usually use the DLSS3 escape hatch (lower VRAM requirements as a side effect), but with this title they don't get that option.

And then there's the remake of RE4, where these cards push 4k pretty comfortably.
#54
Why_Me
mamaAnd this from Wizard on a GPU analysis for The Last of Us Part 1: "Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K."
Because 'The Last of Us' is such a terrifically optimized game /sarcasm. And btw who's going to purchase this card for gaming at 4K?
#55
Bwaze
bugTbh, neither is marketed as a 4k card.
The RTX 3080 was of course marketed as a 4K card, and it was also marketed as a flagship. The RTX 3090 was marketed as offering "TITAN class performance" and not meant as a normal gaming card product.
#56
mama
Why_MeSays who.

And this commentary...
#57
Why_Me
mamaAnd this commentary...
Because console gamers pretending to be PC gamers are probably the only ones that will play that horribly optimized game.
#58
Hecate91
Interesting that the nvidia fans only complain a game is "badly optimized" when it runs worse on an nvidia gpu, due to nvidia putting just enough vram on their cards so you're forced to replace them sooner.
And 12GB on a 4070 isn't really enough, especially for a $600 gpu, while you can buy an RX 6950XT for around $650.
#59
Why_Me
Hecate91Interesting that the nvidia fans only complain a game is "badly optimized" when it runs worse on an nvidia gpu, due to nvidia putting just enough vram on their cards so you're forced to replace them sooner.
And 12GB on a 4070 isn't really enough, especially for a $600 gpu, while you can buy an RX 6950XT for around $650.
The game has issues and anyone familiar with PC gaming would know that.
#60
droopyRO
BwazeI don't think Nvidia listens to customer feedback in random forums.

And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.
Good. Then that will 1. mean death to new AAA PC games, or 2. game devs will have to really scratch their heads and optimize the shit out of every game they make or port to PC, so that users can game with old hardware like 8GB vRAM cards.
#62
Mister300
BwazeNah, Jensen said that this era is over, that Moore's Law is dead. And months later, Moore actually died. Coincidence?
Actually, if you use HBM like AMD did with the Fury X (circa 2015), you can hit TB/sec BW. It should be standard now instead of GDDR6. The NVIDIA mafia squashed it. Remember the AMD 4K killer claims; BW isn't everything for high FPS.
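For context, a minimal sketch of the peak-bandwidth arithmetic behind that comparison; the Fury X and HBM2 per-pin rates used here are my own assumptions for illustration, not figures from the post:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width x per-pin rate) / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

# Assumed configurations for illustration only:
print(peak_bandwidth_gb_s(4096, 1.0))   # Fury X-style HBM1 stack: 512 GB/s
print(peak_bandwidth_gb_s(4096, 2.0))   # HBM2-class stack: 1024 GB/s (~1 TB/s)
print(peak_bandwidth_gb_s(192, 21.0))   # rumored RTX 4070 GDDR6X: 504 GB/s
```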