Monday, April 24th 2023

Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift

A memory mod for the NVIDIA GeForce RTX 3070 that doubles the amount of VRAM showed some impressive performance gains, especially in the most recent games. While the mod was more complicated than earlier ones, since it required some additional PCB soldering, the one game tested shows an incredible performance boost, especially in the 1% and 0.1% lows, as well as the average frame rate.

Modding the NVIDIA GeForce RTX 3070 to 16 GB of VRAM is not a bad idea, since NVIDIA had already planned a similar card (RTX 3070 Ti 16 GB) but eventually cancelled it. With today's games using more than 8 GB of VRAM, some RTX 30-series graphics cards can struggle to push playable FPS. The modder benchmarked the new Resident Evil 4 at very high settings, showing that the additional 8 GB of VRAM make the difference between stuttering and smooth gameplay.
As mentioned, this mod is a bit more complicated than similar mods done on older graphics cards, as some resistors needed to be grounded in order to support the higher-capacity memory ICs, and the modded graphics card had to be set to high-performance mode in the NVIDIA Control Panel in order to fix flickering.
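
For readers who want to confirm what the driver actually reports after such a mod, the card's memory can be queried through NVML. Below is a minimal sketch, assuming the pynvml Python bindings are installed; it simply reads the driver-reported totals and is not part of the modder's procedure:

# Minimal sketch: check what the NVIDIA driver reports for the first GPU (assumes pynvml is installed).
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetName,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
info = nvmlDeviceGetMemoryInfo(handle)      # reports .total, .used and .free in bytes
print(nvmlDeviceGetName(handle))
print(f"Total VRAM: {info.total / 1024**3:.1f} GiB")   # a successful 16 GB mod should show roughly 16 GiB
print(f"Used VRAM:  {info.used / 1024**3:.1f} GiB")
nvmlShutdown()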

AMD marketing has recently called out NVIDIA and pulled the VRAM card, but with NVIDIA launching the GeForce RTX 4070 with 12 GB of VRAM, it appears this won't change anytime soon. These mods show that there is definitely a need for more VRAM, at least in some games.

Sources: Paulo Gomes (YouTube), via VideoCardz

80 Comments on Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift

#51
Outback Bronze
Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? : )
Posted on Reply
#52
BSim500
Outback Bronze: Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? : )
I remember some of the old Matrox Millennium cards let you do exactly that...
Posted on Reply
#53
TumbleGeorge
Outback Bronze: Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? : )
No. Given the extreme temperatures around the GPU, and especially the frequent swings across a large range, both from the GPU and from the operation of the memory chips themselves, it is better to have the chips soldered.
Posted on Reply
#54
Hyderz
Bomby569: That was another video that made no sense at all. You shouldn't test that on cards with 24 GB; it's like upgrading from 16 GB of RAM to 32 GB, and all your applications suddenly report they are using more. And I'm not sure Afterburner or any similar software ever reported real VRAM usage.
I don't get what is so difficult about testing things in realistic scenarios: not ultra settings, not 4090s at 1080p. It's infuriating.

I'm sorry, but they are sellers, all of them. They want to sell you their videos, they make a living from our clicks, and testing realistic scenarios is not click material.

Next we'll be testing Cyberpunk in 4K using a 3dfx card, with no keyboard and one hand tied behind my back, while drinking cola and eating pizza driving down the highway, because lols.
I think the whole point of the video is to see how much VRAM it uses, and the 4090 and 7900 XTX are the current-generation cards with the most VRAM... from low settings at 1080p and 4K to max settings at 1080p and 4K.
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require for low and ultra settings.
Posted on Reply
#55
Juancito
This just shows the bad value proposition of the RTX 4070. True, the power consumption has gone down significantly comparing the RTX 3070 to the RTX 4070. However, if the RTX 3070 shows such performance with extra VRAM, this confirms what reviewers have been saying: the RTX 4070 is actually a 4050 or 4060 at a higher price than the 3070 launched at. I just sold my 3070 thinking of the 4070, but will actually wait for AMD's next launch.
Posted on Reply
#56
Vayra86
mb194dc: Surely totally depends on res and settings. You don't need more than 8GB if you're using 1080p.

Can probably just tweak settings at higher resolutions to stay within it. Easier than modding hardware.
Sure, you can run on low as well; why even bother buying a new GPU at all, right?!

I don't get this sentiment at all. You pay through the nose for even a midrange GPU and then people are content with all sorts of quality reductions to keep it afloat. While there are ALSO similarly priced midrangers that don't force you into that, today or in the next three to five years.
Like... why even bother to begin with, just use your IGP and save money, 720p is after all just a setting eh.
Posted on Reply
#57
Prima.Vera
Looks like 10 GB is more than enough, since those games were pulling a maximum of ~9,600 MB of VRAM usage...
Posted on Reply
#58
Vayra86
Prima.Vera: Looks like 10 GB is more than enough, since those games were pulling a maximum of ~9,600 MB of VRAM usage...
I'm already seeing north of 12 GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech; it's just a big game.
Posted on Reply
#59
Bomby569
Hyderz: I think the whole point of the video is to see how much VRAM it uses, and the 4090 and 7900 XTX are the current-generation cards with the most VRAM... from low settings at 1080p and 4K to max settings at 1080p and 4K.
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require for low and ultra settings.
A card with more VRAM will allocate more VRAM, the usage numbers are not reliable, and both cards are probably doing the same thing because they scale in the same way. You don't get any useful information this way.
Posted on Reply
#60
TumbleGeorge
Bomby569: A card with more VRAM will allocate more VRAM, the usage numbers are not reliable, and both cards are probably doing the same thing because they scale in the same way. You don't get any useful information this way.
Improving the minimum frames and the delta between minimum and maximum is useful information. There must be a reason for this... The only difference is VRAM size. Think! :)
Posted on Reply
#61
Bomby569
TumbleGeorge: Improving the minimum frames and the delta between minimum and maximum is useful information. There must be a reason for this... The only difference is VRAM size. Think! :)
What are we talking about now? Brian's video? I see different minimums, maximums, and 0.1% lows using the same VRAM (24 GB) and just using a different card, AMD or NVIDIA.

The cards even allocate and use VRAM differently just by changing brands while having the same VRAM size.

How are you concluding anything about maximums and minimums based on VRAM size? Talking about the 8 GB comparisons, the 4K numbers are a bit irrelevant for this discussion.
Posted on Reply
#62
BSim500
Vayra86: I don't get this sentiment at all. You pay through the nose for even a midrange GPU and then people are content with all sorts of quality reductions to keep it afloat.
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying +£600 for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc, effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536 Yottabyte VRAM GPU...
Posted on Reply
#63
Dr. Dro
BSim500: Whilst I agree it doesn't make sense to skimp on VRAM if you're paying +£600 for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc, effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536 Yottabyte VRAM GPU...
Agreed, but head bob is great though, realism!!! o_O:laugh:
Posted on Reply
#64
john_

Fresh and still hot

No AMD vs Nvidia in this video. Only 8GB vs 16GB, pure Nvidia.
Posted on Reply
#65
Prima.Vera
Vayra86I'm already seeing north of 12GB in TW WH3 at sub 4K (3440x1440).
The engine isn't even using any new tech, its just a big game.
It just caches the VRAM. Nothing new. Games have done that for more than 10 years now...
Posted on Reply
#66
TheinsanegamerN
You know, it's amazing. People in this forum are jumping over themselves saying how the 3070 should have had 16 GB at launch. Now, if NVIDIA had launched a 16 GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how NVIDIA is da big evil guy.
Vayra86: I'm already seeing north of 12 GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech; it's just a big game.
Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW WH3.
Posted on Reply
#67
Dr. Dro
TheinsanegamerN: You know, it's amazing. People in this forum are jumping over themselves saying how the 3070 should have had 16 GB at launch. Now, if NVIDIA had launched a 16 GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how NVIDIA is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW WH3.
Allocation may not be utilization, but we've reached a point where games in general are beginning to no longer run adequately on 8 GB GPUs or 16 GB RAM PCs. People who make that argument often forgo or reject the concept of memory pressure. As physical memory nears exhaustion, data will first be compressed (which costs CPU cycles, but should still be manageable in most cases) and then, in order of priority, swapped onto slower memory tiers (whenever available) until it reaches storage, which is comparably glacial even on high-speed NVMe SSDs.

Apple offers an easy way to read memory pressure in macOS's Activity Monitor, but Microsoft has yet to do something like this on Windows. A dead giveaway that you are short on RAM is when you begin to see the Compressed Memory figure rise (ideally you want this at 0 MB), and you'll be practically out of RAM when your commit charge exceeds your physical RAM capacity. This is also the reason why you will never see RAM usage maxed out in the Windows Task Manager: it will attempt to conserve about 1 GB of RAM for emergency use at all times. This leads many people with the mindset of "I paid for 16 GB of RAM and use 16 GB of RAM I shall" to think that their computers are doing OK and that they aren't short at all.
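
For anyone who wants to watch this directly, the commit charge and physical totals can be read programmatically. The following is a minimal sketch for Windows using the documented GetPerformanceInfo API; it is an illustration added here, not something from the post or the videos being discussed:

# Minimal sketch (Windows only): compare commit charge against physical RAM via GetPerformanceInfo.
# Field layout follows the documented PERFORMANCE_INFORMATION structure; counts are in pages.
import ctypes
from ctypes import wintypes

class PERFORMANCE_INFORMATION(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("CommitTotal", ctypes.c_size_t),
        ("CommitLimit", ctypes.c_size_t),
        ("CommitPeak", ctypes.c_size_t),
        ("PhysicalTotal", ctypes.c_size_t),
        ("PhysicalAvailable", ctypes.c_size_t),
        ("SystemCache", ctypes.c_size_t),
        ("KernelTotal", ctypes.c_size_t),
        ("KernelPaged", ctypes.c_size_t),
        ("KernelNonpaged", ctypes.c_size_t),
        ("PageSize", ctypes.c_size_t),
        ("HandleCount", wintypes.DWORD),
        ("ProcessCount", wintypes.DWORD),
        ("ThreadCount", wintypes.DWORD),
    ]

info = PERFORMANCE_INFORMATION()
info.cb = ctypes.sizeof(info)
if not ctypes.windll.psapi.GetPerformanceInfo(ctypes.byref(info), info.cb):
    raise ctypes.WinError()

page = info.PageSize
print(f"Commit charge: {info.CommitTotal * page / 1024**3:.1f} GiB")
print(f"Physical RAM:  {info.PhysicalTotal * page / 1024**3:.1f} GiB")
# If the commit charge exceeds physical RAM, the system is leaning on compression and the pagefile.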

A similar concept applies to GPU memory allocation on Windows. As you may know by now, Windows doesn't treat memory as absolute values, but rather as an abstract concept of addressable pages, with the MB values reported by the OS being more estimates than a precise, accurate metric. Normally, the WDDM graphics driver will allocate the physical memory present on the graphics adapter plus up to 50% of system RAM, so, for example, if you have a 24 GB GPU plus 32 GB of RAM, you will have a maximum of 24 + 16 = around 40 GB of addressable GPU memory.

This means that an 8 GB GPU such as the RTX 3070 on a PC with 32 GB of RAM actually has around 20 GB of addressable memory. However, at that point, the graphics subsystem is no longer interested in performance but rather in preventing crashes, as it's fighting for resources demanded by programs in main memory. By running games that reasonably demand a GPU with that much dedicated memory to begin with, you can see where this is going fast.
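
Expressed as a quick calculation, the budget rule described above (dedicated VRAM plus up to half of system RAM) works out as follows; this is a minimal sketch of that rule of thumb, not an official WDDM constant:

# Sketch of the shared-memory budget described above:
# addressable GPU memory ~= dedicated VRAM + up to 50% of system RAM (assumed rule of thumb).
def addressable_gpu_memory_gb(vram_gb, system_ram_gb, shared_fraction=0.5):
    return vram_gb + system_ram_gb * shared_fraction

print(addressable_gpu_memory_gb(24, 32))  # 24 GB card + 32 GB RAM -> 40.0 GB
print(addressable_gpu_memory_gb(8, 32))   # 8 GB RTX 3070 + 32 GB RAM -> 20.0 GB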

I believe we may be seeing this symptom in Hardware Unboxed's video, in The Last of Us, where the computer is attempting to conserve memory at all costs.

This is, of course, my personal understanding of things. Don't take it as gospel; I might be talking rubbish. But one thing's for sure: by ensuring that I always have more RAM than an application demands, I have dodged that performance problem for many years now.
Posted on Reply
#68
Vayra86
TheinsanegamerN: You know, it's amazing. People in this forum are jumping over themselves saying how the 3070 should have had 16 GB at launch. Now, if NVIDIA had launched a 16 GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how NVIDIA is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW WH3.
Certainly, but it is an indicator for sure.
Posted on Reply
#69
Bomby569
Me watching these videos:

Sure, 8GB was stupid; my RX 480 came with 8GB back in 2016.
Using a 3070, especially considering the cost, to game at 1080p is absurd.
Using a 3070 to run a game at 1440p at ultra is also absurd; it brings nothing to the experience. HU thinks the same as me, they even made a video about it.
Don't buy one, not because of the VRAM, but because of the price, even if you play CS:GO competitively.
If you already have one, let HU make another video just so you feel even more buyer's remorse. Where were they when the 3070 released?!

And we keep going around and around on these points.
Posted on Reply
#70
Vayra86
BSim500: Whilst I agree it doesn't make sense to skimp on VRAM if you're paying +£600 for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc, effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536 Yottabyte VRAM GPU...
Absolutely, I disable the same crap every single time :D

But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low but the IQ win is high.
Posted on Reply
#71
sepheronx
Vayra86: Absolutely, I disable the same crap every single time :D

But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low but the IQ win is high.
High textures in RE4R are demanding on VRAM though. My 3080 crashed a few times till I reduced some settings at 1440p.
Posted on Reply
#73
mama
Bomby569: A card with more VRAM will allocate more VRAM, the usage numbers are not reliable, and both cards are probably doing the same thing because they scale in the same way. You don't get any useful information this way.
Yes, but if the game doesn't crash (which happens), then the quality of the game diminishes, sometimes very noticeably, because the card is scrambling to allocate resources and has to compromise. If that's what you want, then the 4070 is for you, because that's the future, like it or not.
TheinsanegamerN: You know, it's amazing. People in this forum are jumping over themselves saying how the 3070 should have had 16 GB at launch. Now, if NVIDIA had launched a 16 GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how it's too expensive and how NVIDIA is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW WH3.
Let's wait for the stuttering then :D
Posted on Reply
#74
Bomby569
mama: Yes, but if the game doesn't crash (which happens), then the quality of the game diminishes, sometimes very noticeably, because the card is scrambling to allocate resources and has to compromise. If that's what you want, then the 4070 is for you, because that's the future, like it or not.
Not crash, but fail to start, or stutter; it's true, I'm not contesting that.
Posted on Reply
#75
SoNic67
I am interested in doing this memory mod, to maybe 24 GB on my 3080 Ti with 12 GB, just for the fun of it.
I am waiting to see if anyone does that.
GDDR6X | Memory | Micron Technology
It seems that there are two versions of the Micron chips, 8 Gbit and 16 Gbit, with the card using 12 pieces of the x32 I/O package.
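
As a rough sanity check on the capacity, assuming the 3080 Ti keeps its 12 memory packages (one x32 package per 32-bit channel of its 384-bit bus):

# Back-of-the-envelope capacity check, assuming 12 GDDR6X packages as on the stock 3080 Ti.
packages = 12
for density_gbit in (8, 16):                  # the two Micron densities mentioned above
    total_gb = packages * density_gbit / 8    # convert Gbit per package to GB total
    print(f"{packages} x {density_gbit} Gbit = {total_gb:.0f} GB")
# 12 x 8 Gbit  -> 12 GB (stock configuration)
# 12 x 16 Gbit -> 24 GB (the hoped-for mod)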
Posted on Reply