Thursday, February 22nd 2024

Modders Pull Off 16GB GeForce RTX 2080 Upgrade, Modded Card Posts 8% Performance Boost

Brazilian tech enthusiast Paulo Gomes, in association with Jefferson Silva and Ygor Mota, successfully modded an EVGA GeForce RTX 2080 "Turing" graphics card to 16 GB. This was done by replacing each of its eight 8 Gbit GDDR6 memory chips with ones of double the density, at 16 Gbit. Over the GPU's 256-bit wide memory bus, eight of these chips add up to 16 GB. The memory speed was unchanged at the 14 Gbps reference, as were the GPU clocks.
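The capacity math can be sanity-checked quickly. A minimal sketch, assuming the standard 32-bit interface per GDDR6 chip (the figures below are derived from the specs above, not taken from the source):

```python
# Sanity check of the mod's memory arithmetic (derived figures; the
# 32-bit per-chip interface is the standard GDDR6 configuration).
bus_width_bits = 256
chip_interface_bits = 32   # standard GDDR6 chip interface width
chip_density_gbit = 16     # replacement chips: 16 Gbit each
data_rate_gbps = 14        # per-pin data rate, unchanged by the mod

chips = bus_width_bits // chip_interface_bits         # 8 chips
capacity_gb = chips * chip_density_gbit / 8           # Gbit -> GB
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8  # bits -> bytes

print(chips, capacity_gb, bandwidth_gb_s)  # 8 16.0 448.0
```

The 448 GB/s figure matches the stock RTX 2080, since the mod doubles chip density but leaves bus width and memory speed untouched.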

The process of modding involves de-soldering each of the eight 8 Gbit chips, clearing out the memory pads of any shorted pins, using a GDDR6 stencil to place replacement solder balls, and then soldering the new 16 Gbit chips onto the pad under heat. Besides replacing the memory chips, a series of SMD jumpers need to be adjusted near the BIOS ROM chip, which lets the GPU correctly recognize the 16 GB memory size. The TU104 silicon by default supports higher density memory, as NVIDIA uses this chip on some of its professional graphics cards with 16 GB memory, such as the Quadro RTX 5000.
That's it; nothing needs to be done on the software side, and TechPowerUp GPU-Z should show the detected memory size. A "Resident Evil 4" benchmark run with a performance overlay shows the game utilizing almost 9.8 GB of video memory, compared to 7.7 GB on the original card, and posting a 7.82% performance increase, from an average of 64 FPS to 69 FPS. This is greater than the kind of performance delta seen between the 8 GB and 16 GB variants of the RTX 4060 Ti. Besides gaming performance gains, the 16 GB of memory should significantly improve the generative AI performance of the RTX 2080.
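The quoted uplift roughly checks out against the overlay numbers; the rounded averages give 7.81%, so the source's 7.82% presumably comes from unrounded FPS values:

```python
# Uplift from the rounded FPS averages shown in the overlay.
fps_stock, fps_modded = 64, 69
uplift_pct = (fps_modded - fps_stock) / fps_stock * 100
print(f"{uplift_pct:.2f}%")  # 7.81% from the rounded figures
```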
Sources: Paulo Gomes (YouTube), VideoCardz

16 Comments on Modders Pull Off 16GB GeForce RTX 2080 Upgrade, Modded Card Posts 8% Performance Boost

#1
delshay
The question is, where did they get the schematic from, as you need to know which jumpers to move.

This is a successful mod, unlike the last one posted in the news on TechPowerUp, which ended up with the card artifacting after the mod. Well done to those involved in the mod.
Posted on Reply
#2
wolf
Better Than Native
I also thought that the drivers would not work or would not recognise the extra memory, but from what I gather it's linked to the Quadro cards on the same silicon allowing the higher memory amount to be accessed?
Posted on Reply
#3
delshay
wolfI also thought that the drivers would not work or not recognise the extra, but from what I gather it's linked to the quadro cards of the same silicon allowing for the higher memory amount to be accessed?
Interesting you should say that, because I would have thought a change of firmware would also be required.
Posted on Reply
#4
Hyderz
that is a cool mod... and it shows the gpu architecture, clock speed etc can utilize the extra ram for a better gameplay experience in today's games
Posted on Reply
#5
DarkDreams
I have my doubts about the "performance uplift" being the result of the memory mod, at least not solely:
First off, the GPU-Z screenshots were taken on different Windows installs (Win 11 vs. Win 10).
Secondly, in the gaming screenshots the 16 GB version is 11 °C (!!!) cooler.
Finally, and imo most importantly, the in-game scene is different, with the modded version just staring straight at a wall while the other screenshot is in motion.
Posted on Reply
#6
wolf
Better Than Native
Hyderzthat is a cool mod... and it shows the gpu architecture, clock speed etc can utilize the extra ram for a better gameplay experience in today's games
I'd have thought so too... maybe modded drivers?
DarkDreamsI have my doubts about the "performance uplift" being the result of the memory mod, at least not solely:
First off the GPU-Z screenshots were taken on different windows installs (win 11 vs win 10).
Secondly in the gaming screenshots the 16 GB version is 11 °C (!!!) cooler.
Finally and imo most importantly the ingame scene is different with the modded version just staring straight at a wall, while the other screenshot is in motion.
Very good pickups. I forget the increments, but a few architectures in a row now have scaled speed directly with temperature: two identical cards, one at 45 °C load and one at 75 °C load, will be separated by a good 30-90 MHz. It's something like 15 MHz every 8 °C with Ampere? I'm not certain of the specific amount, but assuming Turing works the same, the temperature could easily account for some of that performance delta, along with the other variables you mentioned and perhaps more.
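As a rough sketch of what that temperature gap alone could be worth, using the uncertain ~15 MHz per 8 °C figure above (an estimate, not an official NVIDIA number):

```python
def boost_offset_mhz(delta_temp_c, mhz_per_step=15.0, step_c=8.0):
    # Rough GPU Boost clock gain from running cooler, using the
    # ~15 MHz per 8 °C figure quoted above (a rough estimate,
    # not an official NVIDIA specification).
    return delta_temp_c / step_c * mhz_per_step

gain = boost_offset_mhz(11)  # the 11 °C gap noted in comment #5
print(round(gain, 1))        # 20.6 (MHz)
```

Roughly 20 MHz on a boost clock in the ~1,800 MHz region is only about 1%, so by this estimate the temperature gap alone would explain little of the ~8% delta, though it does muddy the comparison.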
Posted on Reply
#8
Denver
Hmm, I believe the driver isn't utilizing the additional VRAM; it likely remains in conservative mode, constantly attempting to restrict itself to the original 8GB of VRAM.

I'm curious whether the GPU might become unstable if faster memory chips (18Gbps) were installed.
Posted on Reply
#9
dereckesanches
DarkDreamsI have my doubts about the "performance uplift" being the result of the memory mod, at least not solely:
First off the GPU-Z screenshots were taken on different windows installs (win 11 vs win 10).
Secondly in the gaming screenshots the 16 GB version is 11 °C (!!!) cooler.
Finally and imo most importantly the ingame scene is different with the modded version just staring straight at a wall, while the other screenshot is in motion.
Watch the video
DenverHmm, I believe the driver isn't utilizing the additional VRAM; it likely remains in conservative mode, constantly attempting to restrict itself to the original 8 GB of VRAM.

I'm curious whether the GPU might become unstable if faster memory chips (18 Gbps) were installed.
Watch the video
Posted on Reply
#10
SOAREVERSOR
delshayThe question is, where did they get the schematic from as you need to know which jumpers to move.

This is a successful mod, unlike the last one posted in the news on TechPowerUp, which ended up with the card artifacting after the mod. Well done to those involved in the mod.
The catch is, if one is an actual engineer, it's not that hard to figure out.
Posted on Reply
#11
Ruru
S.T.A.R.S.
Just wondering..... why? Wouldn't it be wiser to mod newer cards, or are those so locked that it's practically impossible?
delshayThe question is, where did they get the schematic from as you need to know which jumpers to move.
My wild guess is pure trial and error.
Posted on Reply
#12
Luke357
I wonder if I could make my 12GB card into a 24GB then perhaps install a 3090 BIOS.
Posted on Reply
#13
mechtech
mad soldering skillz right there
Posted on Reply
#14
The Von Matrices
Luke357I wonder if I could make my 12GB card into a 24GB then perhaps install a 3090 BIOS.
Even though both use GA102 silicon, they aren't exactly the same. The 3080 has more shaders fused off than the 3090, so you couldn't just add more memory chips and then flash the 3090 BIOS, since it isn't the exact same chip configuration.
Posted on Reply
#15
JAB Creations
What was the first Nvidia graphics card with more than 12 GB (frigging, 11 GB) of VRAM? Nvidia fanboys love to complain about AMD, though at least AMD doesn't go out of their way to intentionally screw customers on VRAM in a bid to force people to upgrade within the same generation now. I max out 16 GB often enough; the memory interfaces on those 4070s must be where all the mileage is on those cards. Nvidia could release cards with enough VRAM, but because people keep buying garbage, they'll happily keep selling it. :kookoo:
Posted on Reply
#16
stimpy88
nGreedia is watching this situation. Wanna bet they'll start artificially limiting things through the drivers, refusing to recognise cards that have been modded, if this catches on!
Posted on Reply