Tuesday, February 11th 2025

NVIDIA GeForce RTX 5060 16 GB Variants Deemed Fake, Insiders Insist SKU is 8 GB Only

According to early February reportage, Team Green's GeForce RTX 5060 and RTX 5060 Ti graphics cards are expected to launch next month. Only very basic technical information has leaked online; insiders reckon that both product tiers will utilize NVIDIA's "Blackwell" GB206 GPU. Rumors have swirled regarding intended VRAM configurations; loose online declarations point to variants being prepared with 8 or 16 GB of GDDR7 VRAM on a 128-bit bus. Regulatory filings indicate two different configurations arriving with GeForce RTX 5060 Ti models, but certain industry watchdogs insist that the GeForce RTX 5060 SKU will be an 8 GB-only product.

A curious-looking ZOTAC Trinity OC White Edition GeForce RTX 5060 16 GB variant surfaced via a TikTok video—post-analysis, expert eyes declared that the upload contained doctored material. A BenchLife.info report pointed to a notable inconsistency on the offending item's retail packaging: "DLSS 3 should not appear on the GeForce RTX 50 series box, because the Blackwell GPU architecture focuses on DLSS 4." The publication presented evidence of ZOTAC RTX 4070 Ti SUPER Trinity OC White Edition box art being repurposed in the TikToker's video. Hardware soothsayer MEGAsizeGPU added their two cents: "this is fake. There is no plan for a GeForce RTX 5060 16 GB, and the box is photoshopped from the last-gen ZOTAC box." At the end of their report, BenchLife reckons that NVIDIA has not sent a "GeForce RTX 5060 color box template" to its board partners.
Sources: BenchLife, Tom's Hardware

22 Comments on NVIDIA GeForce RTX 5060 16 GB Variants Deemed Fake, Insiders Insist SKU is 8 GB Only

#1
mtosev
8 GB isn't much VRAM for 2025.
Posted on Reply
#2
Darmok N Jalad
8GB is getting pretty tight these days. I only play at 1080p or 1440p, and some games exceed 8GB without hesitation.
Posted on Reply
#3
_roman_
It's an entry-level graphics card for basic tasks.
Posted on Reply
#4
Quicks
_roman_It's an entry-level graphics card for basic tasks.
Please tell that to NVIDIA, who wants to charge $400+ for a basic card doing basic tasks.
Posted on Reply
#5
InVasMani
More VRAM and rasterization performance would make more sense on a $400 GPU overall than pushing RT, but what do I know. I mean, it's got plenty of VRAM at 8 GB if you just render in black and white; who needs color? It worked in the 1950s, so it should be acceptable for 2025.
Posted on Reply
#6
Darmok N Jalad
_roman_It's an entry-level graphics card for basic tasks.
You say that like it’s an iGPU. This card will be bought up by gamers in droves.
Posted on Reply
#7
GodisanAtheist
The argument always revolves around price. If NV was offering the xx60 class cards at $200 with 8GB of VRAM, there wouldn't be much room to complain.

But they're not, they're asking for $300 for starters, and still offering 8GB at the $400+ level. When you're turning down basic texture settings (which do a lot to improve visual fidelity with a very low computational cost when there is enough VRAM) or cannot even use all the VRAM hungry features of your card at $400 there is a problem.
Posted on Reply
#8
Darmok N Jalad
GodisanAtheistThe argument always revolves around price. If NV was offering the xx60 class cards at $200 with 8GB of VRAM, there wouldn't be much room to complain.

But they're not, they're asking for $300 for starters, and still offering 8GB at the $400+ level. When you're turning down basic texture settings (which do a lot to improve visual fidelity with a very low computational cost when there is enough VRAM) or cannot even use all the VRAM hungry features of your card at $400 there is a problem.
They want to upsell you. They know they have customers by the snarglies, or they wouldn’t ask so much for what once was a mid-grade gaming line.
Posted on Reply
#9
InVasMani
It seems like this is just going to make the B580 look that much better in terms of overall value. A lot of people will settle on that GPU instead, I imagine.
Posted on Reply
#10
dgianstefani
TPU Proofreader
Reality check time.

That 0.1 higher FPS from double the VRAM, while the 8 GB card still beats its 12 and 16 GB competition.

This is before you factor in RTX cards having a competent upscaler (compared to the competition).

Surely worth $50-100 more for that 0.1 FPS /s

The only reasonable argument for the 16 GB xx60 variants was professional workloads, but let's face it: if you're doing "professional" work, you can probably afford more than $400 for your GPU.

Posted on Reply
#11
Darmok N Jalad
dgianstefaniReality check time.

That 0.1 higher FPS from double the VRAM, while the 8 GB card still beats its 12 and 16 GB competition.

This is before you factor in RTX cards having a competent upscaler (compared to the competition).

Surely worth $50-100 more for that 0.1 FPS /s

The only reasonable argument for the 16 GB xx60 variants was professional workloads, but let's face it: if you're doing "professional" work, you can probably afford more than $400 for your GPU.

So I’m just imagining the 10-11GB of VRAM usage I see in games at 1080/1440, or is that Windows hallucinating?
Posted on Reply
#12
Tomgang
If the RTX 5060 is only 8 GB of VRAM, avoid it.

I just tried playing STALKER 2 with my RTX 4060, and it was not a pleasant playthrough. I had to go down to 1080p medium settings before it was somewhat playable. Not because the GPU didn't have enough grunt; it did. But when I ran out of VRAM, I went from 150 FPS with DLSS and frame generation down to a measly 30-40 FPS, and responding to inputs and entering the menu was slow as hell.

Avoid 8 GB VRAM GPUs at all costs. Gaming on new titles with them is not pleasant. Yes, I have only tried one game so far, but I think more games will run terribly on 8 GB cards even when the GPU itself has enough grunt.

8 GB VRAM cards are a dying breed of GPU. Avoid them and don't buy one. You will probably regret it.
Posted on Reply
#13
dgianstefani
TPU Proofreader
Darmok N JaladSo I’m just imagining the 10-11GB of VRAM usage I see in games at 1080/1440, or is that Windows hallucinating?
Allocation ≠ usage.
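[A toy illustration of the distinction, with hypothetical names; real drivers and engines are far more complex. Many engines reserve a large VRAM pool up front, so monitoring tools report the whole reservation even though only part of it holds live data:]

```python
class VRAMPool:
    """Toy model: an engine reserves a big VRAM pool up front,
    then sub-allocates from it as assets load. Monitoring tools
    typically see only the outer reservation."""

    def __init__(self, reserve_mb):
        self.allocated_mb = reserve_mb  # what overlays/tools report
        self.used_mb = 0                # what actually holds live assets

    def load_asset(self, size_mb):
        if self.used_mb + size_mb > self.allocated_mb:
            raise MemoryError("pool exhausted")
        self.used_mb += size_mb


pool = VRAMPool(reserve_mb=8192)   # tool would show ~8 GB "in use"
for asset_mb in (1500, 900, 600):  # textures, geometry, buffers
    pool.load_asset(asset_mb)

print(pool.allocated_mb)  # 8192 -> reported allocation
print(pool.used_mb)       # 3000 -> actual working set
```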
Posted on Reply
#14
Mrgravia
From what I understand, doubling the VRAM only really makes a difference if the memory bus is fast enough to populate it.
Posted on Reply
#15
dgianstefani
TPU Proofreader
MrgraviaFrom what I understand, doubling the Vram only really makes a difference if the memory bus is fast enough to populate it.
The GPU itself also needs to be fast enough to make use of it for games.
Posted on Reply
#16
SirB
_roman_It's an entry-level graphics card for basic tasks.
So is my RTX 3060 12 GB. See that? 12 GB of VRAM. 8 is a joke in 2025.
dgianstefaniThe GPU itself also needs to be fast enough to make use of it for games.
My 3060 handles 12 GB just fine. Halo uses all of it on big maps @3440x1440. MSFS as well.
Posted on Reply
#17
sudothelinuxwizard
It will make zero sense to buy this over a used 3060 unless you use it for Folding@Home and light gaming. You'll probably save 200 euros, and you can even try to nab an EVGA one, which is much better than the current offerings, most of which are engineering trainwrecks with shitty warranties.
Posted on Reply
#18
Darc Requiem
8GB is beyond absurd at this point. The RX 480 launched with 8GB of VRAM for $239 in July of 2016.
Posted on Reply
#19
InVasMani
I sure as hell am not buying a new 8 GB GPU in 2025 or beyond, unless someone can prove that past VRAM limitations are an obsolete consideration today and into the future.
Posted on Reply
#20
GodisanAtheist
dgianstefaniReality check time.

That 0.1 higher FPS from double the VRAM, while the 8 GB card still beats its 12 and 16 GB competition.

This is before you factor in RTX cards having a competent upscaler (compared to the competition).

Surely worth $50-100 more for that 0.1 FPS /s

The only reasonable argument for the 16 GB xx60 variants was professional workloads, but let's face it: if you're doing "professional" work, you can probably afford more than $400 for your GPU.

The issue with these graphs is that they struggle to capture IQ degradation from texture swapping and placeholder textures. New game engines work around huge performance hits by simply keeping an ultra-low-res texture in place until the higher-quality asset can be streamed in. You won't see much change in overall game performance, but it makes the experience worse.

It's really up to the individual at that point whether the IQ hit bothers them or not.
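[The fallback behavior described above can be sketched as a toy texture cache; all names here are illustrative, not any real engine's API. When the VRAM budget is exhausted, the engine serves a tiny low-res stand-in instead of stalling the frame, so FPS stays high while image quality silently drops:]

```python
class TextureStreamer:
    """Toy model of placeholder-based texture streaming: requests
    that fit the VRAM budget get the full-res asset; the rest get
    a small low-res placeholder so the frame never stalls."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident_mb = 0

    def request(self, name, full_mb, placeholder_mb=1):
        if self.resident_mb + full_mb <= self.budget_mb:
            self.resident_mb += full_mb
            return (name, "full")        # high-res asset streamed in
        self.resident_mb += placeholder_mb
        return (name, "placeholder")     # FPS steady, IQ degraded


streamer = TextureStreamer(budget_mb=8192)  # an "8 GB card"
results = [streamer.request(f"tex{i}", full_mb=2000) for i in range(6)]
print([quality for _, quality in results])
# the first four 2 GB textures fit; the rest fall back to placeholders
```

A benchmark only measures the frame rate, which barely moves in this scenario; the placeholder count is what quietly grows on the smaller card.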
Posted on Reply
#21
alwayssts
Darmok N JaladThey want to upsell you. They know they have customers by the snarglies, or they wouldn’t ask so much for what once was a mid-grade gaming line.
This is literally applicable at every step, just for different use-cases.

Literally none of their cards makes sense except the '90', which is priced the way it is because they know they can't get you to upgrade next gen.
8 GB: limited to basic console-level gaming.
12 GB/<45 TF: runs out at 1440p or 1080p RT.
The 4080 runs out of compute at 1440p RT/4K; the 5080 runs out of RAM, and 24 GB on a 5080 isn't enough raster to keep up with a 4090, which is actually built for 4K native or 1440p RT upscaled to 4K.
The 4090 makes sense, but costs a fortune. The 5090 makes almost no sense and costs even more, though I guess it works for people who need the best humanly possible and for whom money is no object.

Point is, literally EVERYTHING is an upsell. And that upsell will be outdated by the next thing almost immediately.
GodisanAtheistThe issue with these graphs is that they struggle to capture IQ degradation from texture swapping and placeholder textures. New game engines work around huge performance hits by simply keeping an ultra-low-res texture in place until the higher-quality asset can be streamed in. You won't see much change in overall game performance, but it makes the experience worse.

It's really up to the individual at that point whether the IQ hit bothers them or not.
This is SO true, and something I've been meaning to bring up. It's a dirty trick around RAM limitations, and I'd love for people to understand it... but it's so hard to quantify in a way people grasp beyond MH polys.
It doesn't have to be that way, but you also need to buy a card that isn't limited by VRAM. Some people think it's the game... :p It is, but it's not a bug, it's a feature (because your graphics card is VRAM-limited)!

IDK how to get people to understand all these things... there are so many underhanded things going on now that people have to dig through so much crap to understand it all correctly that they just don't.
It's like nVIDIA wins by information paralysis, with people never able to parse it all (largely because much of the important information, such as the real RAM allocation needed for high-res textures, is hidden).

I really do think the best way to show it is open-world games like MH.
While they once were the most glaring at showing hitching from loads and swaps, now they show the lag of a low-res asset being swapped for a high-res one due to the buffer (even if FPS and RAM usage look steady).

This is why I laugh at the 'stutter struggle' or the 'nVIDIA uses less RAM' arguments. Yeah, they do all of that... because of this very exact thing. Fast RAM helps for a swap, but not if there isn't enough to hold everything!
Which is increasingly the case. MH needs 16-18 GB to keep the textures loaded all the time. I expect a lot of 12 GB users complaining about this very thing: 'but it only uses [x] GB and/or gets good FPS!'
And ofc they'll say 'it's the game not being optimized' and/or 'broken', or worse yet they'll believe the game is supposed to be that way (and blame the dev placeholder assets for looking bad).
Certainly not nvidia skimping on ram for the capability of your GPU (that can apparently run the actual simulation just fine but not keep decent textures loaded).
There will be more and more of this, most more subtle, even for 16GB users. It's a bummer; goes to show nVIDIA really has caught on to a lot of our testing methods and will do literally anything to save a buck.
This is why you should *ALWAYS* trust allocation, never usage: for this exact reason. A lot of people just don't get it (or don't have the time/interest to learn), and I truly do think nVIDIA relies on that.

Do you think I like writing bazillion-word essays? No. Does nVIDIA expect you to watch hour-long videos of people investigating how their stuff *actually* works (which they do their best to hide)? Probably not.

But I hope people read and/or watch them, because so much corner-cutting, penny-pinching, and planned obsolescence occurs, and many don't even understand how or why it happens, or sometimes that it is happening at all.

Many people saw or now see through the '5070 is a 4090' nonsense because either they could understand 1080p-to-4K upscaling and 4x frame generation (1 of 16 pixels rendered), or people explained it to them, which is good.
But there are MANY more instances of stuff like that, and some of them are incredibly difficult to explain and/or show people, like this exact issue, which is tough unless they are already using the card.

Gold star for bringing this up; inform whom you can when you can so they are not tricked and/or confused by this. :toast:
Posted on Reply
#22
_roman_
We live in our tech bubble. I would not be surprised if the buying decision is made like this:
Nvidia, because AMD graphics cards are not cool.
Is it the latest generation? Buy.
Not so much money? Buy. Then the buyer ends up with a 5060 graphics card, although another one would have been the better choice.

Just check out those talk videos that get a lot of views. I do not want to bash anyone; sometimes it's very hard to listen, but that person gets a lot of views. And maybe people will make their decision because of that, just because he has a job at a gaming/tech magazine, makes YouTube videos, and writes on the gaming tech website and of course the forum. I do not want to post the video. A 25-minute YouTube video for one fact: 8 GB of VRAM is not enough. Or headlines on how to tune "fill in the newest hyped processor": buy our paper magazine.
Posted on Reply