Monday, April 24th 2023

Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift

A memory mod for the NVIDIA GeForce RTX 3070 that doubles the amount of VRAM has shown some impressive performance gains, especially in the most recent games. While the mod was more complicated than earlier ones, since it required some additional PCB soldering, the one game tested shows an incredible performance boost in the 1% and 0.1% lows as well as in the average frame rate.
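
For context, the 1% and 0.1% lows quoted in such comparisons are derived from frame-time logs rather than from an FPS counter. Below is a minimal sketch of one common way to compute them; it is a generic illustration, not the tool the modder used, and capture tools differ slightly in their exact definitions:

# minimal sketch: average FPS plus 1% / 0.1% lows from a list of frame times in milliseconds
# here the "lows" are the average FPS over the slowest n% of frames (definitions vary by tool)
def fps_metrics(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)  # longest frame times first
    def low(pct):
        count = max(1, int(n * pct / 100))
        worst = slowest[:count]
        return 1000.0 * len(worst) / sum(worst)
    return avg_fps, low(1), low(0.1)

# example: a mostly ~60 FPS run (16.7 ms frames) with ten 50 ms stutter spikes
print(fps_metrics([16.7] * 990 + [50.0] * 10))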

Modding the NVIDIA GeForce RTX 3070 to 16 GB of VRAM is not a bad idea, since NVIDIA already planned a similar card (RTX 3070 Ti 16 GB) but eventually cancelled it. With today's games using more than 8 GB of VRAM, some RTX 30-series graphics cards can struggle to push playable FPS. The modder benchmarked the new Resident Evil 4 at very high settings, showing that the additional 8 GB of VRAM make the difference between stuttering and smooth gameplay.
As mentioned, this mod is a bit more complicated than similar ones done on earlier graphics cards: some resistors needed to be grounded in order to support the higher-capacity memory ICs, and the modded graphics card had to be set to high-performance mode in the NVIDIA Control Panel to fix flickering.

AMD marketing has recently called out NVIDIA and pulled the VRAM card, but with NVIDIA launching the GeForce RTX 4070 with 12 GB of VRAM, it appears this won't change anytime soon. These mods show that there is definitely a need for more VRAM, at least in some games.

Sources: Paulo Gomes (YouTube), via VideoCardz

80 Comments on Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift

#76
sepheronx
SoNic67I am interested in doing this memory mod to maybe 24 GB, on my 3080 Ti with 12 GB, just for the fun of it.
I am waiting to see if anyone does that.
GDDR6X | Memory | Micron Technology
It seems that there are 2 versions of Micron chips, 8 Gbit and 16 Gbit, in the same x32 I/O package; the card uses 12 of them.
If you ever figure it out, post it here. I gotta do this otherwise all my 3070's will end up as drink trays.
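
A quick sanity check of what the quoted chip options imply; the chip count comes from the quote above (a 3080 Ti's 384-bit bus fed by twelve x32 packages), the rest is simple arithmetic:

# twelve x32 GDDR6X packages on a 3080 Ti's 384-bit bus (per the quote above)
chips = 12
for density_gbit in (8, 16):
    capacity_gb = chips * density_gbit / 8  # 8 Gbit = 1 GB per package
    print(f"{density_gbit} Gbit chips -> {capacity_gb:.0f} GB total")
# 8 Gbit chips give the stock 12 GB; 16 Gbit chips would give the proposed 24 GB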
#77
chrcoluk
Wish someone opened a shop in the UK to do this stuff, but I expect Nvidia would get it shut down super fast.

--

So much misunderstanding in this thread; it's frustrating how misinformed people are.

Allocation is utilisation, as VRAM cannot be over-committed.

In addition, a game might run on less VRAM, but that doesn't mean there is no penalty in doing so: many modern engines now dynamically adjust quality to manage VRAM, with things like textures staying loaded, draw distance, etc. also being affected.

I can play FF7 Remake on a 10 GB GPU (just about); does that mean 10 GB is enough for the game? Well, yeah, if you're OK with stuttering, frequent texture/asset swapping, low-quality textures due to LOD being reduced dynamically, and the occasional crash.
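
For anyone who wants to check this on their own card, note that the number the driver reports is itself the allocated/reserved figure, not what the engine actually touches each frame. A rough sketch using the NVML bindings (assuming the nvidia-ml-py / pynvml package is installed; per-process figures may come back empty on some OS/driver combinations):

# rough sketch: total vs. used VRAM and a per-process breakdown via NVML (pynvml bindings)
# note: "used" here is memory the driver has handed out, i.e. allocation, not per-frame utilisation
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")

# per-process breakdown (usedGpuMemory may be None on some platforms)
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gib = p.usedGpuMemory / 2**30 if p.usedGpuMemory else float("nan")
    print(f"pid {p.pid}: {used_gib:.2f} GiB")

pynvml.nvmlShutdown()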
#78
Hyderz
chrcolukWish someone opened a shop in the UK to do this stuff, but I expect Nvidia would get it shut down super fast.

--

So much misunderstanding in this thread; it's frustrating how misinformed people are.

Allocation is utilisation, as VRAM cannot be over-committed.

In addition, a game might run on less VRAM, but that doesn't mean there is no penalty in doing so: many modern engines now dynamically adjust quality to manage VRAM, with things like textures staying loaded, draw distance, etc. also being affected.

I can play FF7 Remake on a 10 GB GPU (just about); does that mean 10 GB is enough for the game? Well, yeah, if you're OK with stuttering, frequent texture/asset swapping, low-quality textures due to LOD being reduced dynamically, and the occasional crash.
FF7R uses more than 10 GB? Is it another case of bad optimization or just a demanding game?
#79
chrcoluk
HyderzFF7R uses more than 10 GB? Is it another case of bad optimization or just a demanding game?
I would say it's more a lack of optimization; it is using VRAM for RAM-type stuff, as it's inherited from a unified-memory console design. This type of thing will become more popular in future games, as most games are designed for their main platform (usually consoles).

The amount it uses is dynamic: the game detects the hardware and auto-configures a VRAM budget, a feature available to any UE4 game.

I ended up doing a combination of setting shadows to low quality and various manual UE4 tweaks. Also launching the game in DX11 mode lowers the impact of asset swapping.

In addition, launching it on defaults with GPU-accelerated apps running (Discord, browsers, etc.) caused it to crash without "out of VRAM" errors, as its auto budget assumes it doesn't have to share VRAM with anything else on the system.
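
For context, in a stock UE4 game the sort of manual tweaks mentioned above usually go into an Engine.ini override; whether this particular port reads them the same way isn't stated in the post, so treat the snippet below as a generic UE4 illustration with placeholder values rather than the exact settings used:

; illustrative Engine.ini override capping the texture streaming pool to leave VRAM headroom
; PoolSize is in MB (tune it to the card's free VRAM); LimitPoolSizeToVRAM clamps it to physical VRAM
[SystemSettings]
r.Streaming.PoolSize=2048
r.Streaming.LimitPoolSizeToVRAM=1

Forcing the D3D11 renderer, as mentioned above, is done with the standard UE4 launch argument -dx11.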
#80
LupintheIII
BSim500Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference (I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, etc. effects) and think they look ridiculous far more than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536 Yottabyte VRAM GPU...
It's incredible how you guys are actively asking for worse products; is this some kind of Stockholm syndrome?
More VRAM is always better for the same price, no matter what, be it for gaming or work applications, and at $500+ you should have plenty, not barely enough.

Also, there is only so much you can do by turning down settings when stuff related to the gameplay itself is asking for VRAM (NPC AI, geometry, view distance, etc.).
So even if you are happy paying $500 to play at 1080p low settings (for whatever reason), you will soon find yourself limited in the games you can play.
TheinsanegamerNYou know, it's amazing. People in this forum are jumping over themselves saying how the 3070 should have had 16GB at launch. Now, if nvidia had launched a 16gb 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price and how its too expensive and how nvidia is da big evil guy.

Allocation. :)Is. :)Not. :)Utilization.:)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has displayed that with TW3.
That doesn't mean the GPU is allocating those resources for no reason; I don't get how that's still not clear.
Also, if you see 7.4 GB utilization (or allocation) on an 8 GB card, you are running out of VRAM; you won't see 8 GB, or the game would just crash.
Microstutter is not the only sign of running out of VRAM; image quality degradation is actually the most common, but one many won't notice (which shows how few are actually able to tell the difference between "ultra" and low, or even RT on vs. RT off).