Monday, April 24th 2023
Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift
A memory mod for the NVIDIA GeForce RTX 3070 that doubles its VRAM to 16 GB shows impressive performance gains, especially in the most recent games. While the mod is more complicated than earlier ones, requiring some additional PCB soldering, the one game tested shows a remarkable boost in the average frame rate and, above all, in the 1% and 0.1% lows.
Modding the NVIDIA GeForce RTX 3070 to 16 GB of VRAM is not a bad idea, since NVIDIA itself planned a similar card (the RTX 3070 Ti 16 GB) before eventually cancelling it. With today's games using more than 8 GB of VRAM, some RTX 30-series graphics cards can struggle to push playable frame rates. The modder benchmarked the new Resident Evil 4 at very high settings, showing that the additional 8 GB of VRAM is the difference between stuttering and smooth gameplay. As mentioned, this mod is a bit more complicated than the earlier ones performed on other graphics cards: some resistors had to be grounded in order to support the higher-capacity memory ICs, and the modded graphics card had to be set to high-performance mode in the NVIDIA Control Panel to fix flickering.
AMD marketing has recently called out NVIDIA and pulled the VRAM card, but with NVIDIA launching the GeForce RTX 4070 with 12 GB of VRAM, it appears this won't change anytime soon. These mods show that there is definitely a need for more VRAM, at least in some games.
Sources:
Paulo Gomes (YouTube), via VideoCardz
80 Comments on Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require at low and ultra settings.
I don't get this sentiment at all. You pay through the nose for even a midrange GPU, and then people are content with all sorts of quality reductions to keep it afloat, while there are ALSO similarly priced midrangers that don't force you into that, today or in the next three to five years.
Like... why even bother to begin with, just use your IGP and save money, 720p is after all just a setting eh.
The engine isn't even using any new tech, it's just a big game.
The cards even allocate and use VRAM differently just from changing brands, while having the same VRAM size.
How are you concluding anything about maximums and minimums based on VRAM size? Talking about the 8 GB comparisons, the 4K numbers are a bit irrelevant for this discussion.
Fresh and still hot
No AMD vs Nvidia in this video. Only 8GB vs 16GB, pure Nvidia.
We need that clapping hand emoji.
Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW3.
Apple offers an easy way to read memory pressure in macOS's Activity Monitor, but Microsoft has yet to do something like this on Windows. A dead giveaway that you are short on RAM is when you begin to see the Compressed Memory figure rise (ideally you want this at 0 MB), and you'll be practically out of RAM when your commit charge exceeds your physical RAM capacity. This is also the reason why you will never see RAM usage maxed out in the Windows Task Manager: it attempts to conserve about 1 GB of RAM for emergency use at all times. This leads many people with the mindset of "I paid for 16 GB of RAM and use 16 GB of RAM I shall" to think that their computers are doing OK and that they aren't short at all.
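For what it's worth, here's a minimal Python sketch of that commit-charge check, using the Win32 GlobalMemoryStatusEx call via ctypes. The commit-charge figure (commit limit minus available commit) is an approximation rather than an exact Task Manager value, so treat it as a rough indicator only.

# Rough check of physical RAM vs. commit charge on Windows.
# The commit charge here is approximated as the commit limit minus
# the remaining commit headroom reported by GlobalMemoryStatusEx.
import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

def memory_status():
    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    return status

GB = 1024 ** 3
s = memory_status()
commit_charge = s.ullTotalPageFile - s.ullAvailPageFile  # approximate
print(f"Physical RAM : {s.ullTotalPhys / GB:.1f} GB")
print(f"Commit charge: {commit_charge / GB:.1f} GB (approx.)")
if commit_charge > s.ullTotalPhys:
    print("Commit charge exceeds physical RAM -- effectively out of RAM.")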
A similar concept applies to GPU memory allocation on Windows. As you may know by now, Windows doesn't treat memory as absolute values, but rather as an abstract concept of addressable pages, with the MB values reported by the OS being estimates rather than a precise, accurate metric. Normally, the WDDM graphics driver will allocate the physical memory present on the graphics adapter plus up to 50% of system RAM; for example, if you have a 24 GB GPU plus 32 GB of RAM, you will have a maximum of 24 + 16 = around 40 GB of addressable GPU memory.
This means that an 8 GB GPU such as the RTX 3070 on a PC with 32 GB of RAM actually has around 20 GB of addressable memory. However, at that point, the graphics subsystem is no longer interested in performance but rather in preventing crashes, as it's fighting for resources demanded by programs in main memory. When running games that reasonably demand a GPU with that much dedicated memory to begin with, you can see where this is going fast.
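To put numbers on that, here's a back-of-the-envelope sketch in Python using the roughly-50%-of-system-RAM rule of thumb described above; that fraction is my assumption for illustration, not an official WDDM figure.

# Approximate addressable GPU memory = dedicated VRAM + a shared slice of system RAM.
# The 0.5 shared fraction is the rule of thumb from the paragraph above, not an exact value.
def addressable_gpu_memory_gb(vram_gb, system_ram_gb, shared_fraction=0.5):
    return vram_gb + system_ram_gb * shared_fraction

print(addressable_gpu_memory_gb(24, 32))  # 24 GB card + 32 GB RAM -> ~40 GB
print(addressable_gpu_memory_gb(8, 32))   # RTX 3070 (8 GB) + 32 GB RAM -> ~20 GB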
I believe we may be seeing this symptom in Hardware Unboxed's video, in The Last of Us, where the computer is attempting to conserve memory at all costs.
This is, of course, my personal understanding of things. Don't take it as gospel, I might be talking rubbish - but one thing's for sure, by ensuring that I always have more RAM than an application demands, I have dodged that performance problem for many years now.
Sure, 8GB was stupid, my RX480 came with 8GB back in 2016
using a 3070, especially considering the cost, to game at 1080p is absurd
using a 3070 to run a game at 1440p at ultra is also absurd, it brings nothing to the experience, HU thinks the same as me, they even made a video about it
don't buy one, not because of the vram, but because of the price even if you play CSGO competitively
if you already have one, let HU make another video just so you feel even more buyer's remorse, where were they when the 3070s released?!
and we keep beating these points around.
But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low but the IQ win is high.
I am waiting to see if anyone does that.
GDDR6X | Memory | Micron Technology
It seems that there are two versions of the Micron chips, 8 Gbit and 16 Gbit, used as 12 pcs in a x32 I/O package.