
32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Joined
Jun 16, 2019
Messages
381 (0.19/day)
System Name Cyberdyne Systems Core
Processor AMD Sceptre 9 3950x Quantum neural processor (384 nodes)
Motherboard Cyberdyne X1470
Memory 128TB QRAM
Video Card(s) CDS Render Accelerator 4TB
Storage SK 16EB NVMe PCI-E 9.0 x8
Display(s) LG C19 3D Environment Projection System
Power Supply Compact Fusion Cell
Software Skysoft Skynet
I'm not going to argue endlessly over points you have 100% lost. By the arguments provided, by the links, by the proof I was asking for from the beginning, you were firmly wrong; now you're only deflecting, and I have no time for this senseless and endless debate. For you this is just about "being right" now, not about technology, interest and technical facts. 8 GB is 100% enough for even most 1440p games today, so it will easily be enough for 1080p for the foreseeable future. End of story. Also, Nvidia would 100% not risk an 8 GB card if you were right, but oh surprise, you're obviously not right. -> Otherwise debate this with @W1zzard perhaps. It's his data. Maybe you'd listen to him.

edit: to make this a bit clearer, I'm going to explain what the game review data shows for the RTX 4060 - and what it does not show:

- games are all tested at Ultra settings, so in a lot of cases *beyond* the sweet spot of the 4060 - settings that shouldn't be used on a 4060, in other words, or only with DLSS active. Worst-case scenario, you could say.
- despite this beyond-sweet-spot usage, the card never had terrible fps (< 30, let alone < 10), which would be the usual, very obvious sign of the VRAM buffer running short.
- the only "problem" the card had was that in some games at 1080p (and I'm only talking about 1080p for this card) it was under 60 fps.
- the min fps scaled with the average fps in each game, so if the avg was already under 60 fps, obviously the min fps wouldn't be great either - not related to VRAM.
- there are also games like Starfield which generally have min-fps issues visible in that review, and not only on the RTX 4060. Also not related to VRAM.
- the card generally behaved as it should; it was *not* hampered by 8 GB of VRAM in any of the games. Which proves what I said all along.
- furthermore, 8 GB of VRAM is also proven to be mostly stable even at 1440p and above, as VRAM requirements barely increase with resolution alone. A variety of other parameters drive VRAM usage up: world size, ray tracing, texture complexity, optimisation issues. A lot of games don't even have problems at 4K (!) with 8 GB of VRAM. That is the funny thing here. :) The problems start when the parameters get more exotic, so to say.
- so saying 8 GB of VRAM isn't enough today for 1080p, or even 1440p, is just wrong. Can it have isolated issues? Yes. It's not perfect; if you go beyond 1080p it will have more issues, but it will still mostly be fine. At 1080p, which this discussion was about, it's 99.9% fine, making this discussion unnecessary.
- and as it's still easily enough for 1080p, it will also be enough for a new low-end card, for the foreseeable future.
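The "resolution alone barely increases VRAM" point can be sanity-checked with napkin math: resolution-dependent render targets grow with pixel count, but the texture pool, usually the biggest consumer, does not. A rough Python sketch - the buffer formats and the 5 GB texture pool are illustrative assumptions, not measured numbers from any game:

```python
# Rough estimate of render-target memory vs. a fixed texture pool.
# Buffer list is an illustrative assumption for a typical deferred renderer.
BYTES_PER_PIXEL = {
    "color (RGBA16F)": 8,
    "depth/stencil": 4,
    "G-buffer (3x RGBA8)": 12,
    "motion vectors (RG16F)": 4,
}

def render_target_mb(width: int, height: int) -> float:
    """Memory for all per-pixel buffers at the given resolution, in MB."""
    return width * height * sum(BYTES_PER_PIXEL.values()) / 1024**2

TEXTURE_POOL_MB = 5000  # illustrative: streamed textures, meshes, shaders

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    rt = render_target_mb(w, h)
    print(f"{name}: render targets ~{rt:.0f} MB, "
          f"total ~{(rt + TEXTURE_POOL_MB) / 1024:.1f} GB")
```

In this model, going from 1080p to 4K adds only ~170 MB of render targets, which is small next to a multi-GB texture pool; what actually blows the budget is texture quality, RT structures and similar settings, not the output resolution by itself.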

haha, now let's stay in reality. What I can tell you is this: if you have a ton of VRAM it can be used, and it can be useful even in games that don't *need* it, so your PC never has to reload into VRAM again. It's basically luxury: less stutter than cards with 16 GB, for example. I experienced this when switching from my older GPU to the new one; Diablo 4 ran that much smoother, with basically no reloading anymore while zoning.

The issue here is that you're comparing two different companies. AMD used to do the "I give you extra VRAM" thing as marketing against Nvidia. The fact is that, historically, Nvidia's upper-midrange GTX 1070 and semi-high-end GTX 1080 used 8 GB back then. Whether AMD put a ton of VRAM on a midrange card does not change that fact. The same AMD chip originally started with 4 GB, per the link you provided yourself, as everyone can see. AMD also used marketing tactics like this on the R9 390X, so you can go even further back to GPUs that never needed 8 GB within their relevant lifetime. By the time the R9 390X "needed" 8 GB, it was already too slow to really utilise it, making 4 GB the sweet spot for that GPU, not 8 GB (averaged over its lifetime, of course). :) And as multiple people have already said, not just me, Nvidia's VRAM management is simply better than AMD's; this has been true for a very long time, making an AMD vs Nvidia comparison kind of moot. AMD will historically always need a bit more VRAM to do the same things Nvidia does (without lag or stutter), probably due to some software optimisation.


Really VRAM-hungry games are indeed very rare. That doesn't mean they don't matter. People have been having issues with Indiana Jones on 8GB cards at 1080p. That's right now, not the future, and it's going to become more common. I'm sure there are ways of dealing with it, like lowering textures, shadows and whatever else. But the issue is there now. If all the common games don't have it, great. But if it's a game someone wants to play on their new card and they discover it runs badly because it's hitting the ceiling, then... well, that just sucks.

And people are too forgiving of a massively rich company skimping on VRAM. The 5060 should really launch with about 12GB, but will no doubt be 8GB. 12GB would at least let it run games for the next few years without running into any issues. Nvidia is just greedy and stingy. The fact that my 3080 came with only 10GB, and an empty slot on the PCB where another 2GB module could fit, shows their greed physically. My 6700 XT cost far less and came with 12GB. The 6800 XT and 6900 XT both had 16GB. NV are lagging behind and people excuse it with the "most games work fine" argument. If a new card is coming out now, I should expect it to work with any game just fine; a new product shouldn't have an issue right away, rare as it may be.
 

k0vasz

New Member
Joined
Feb 3, 2024
Messages
9 (0.03/day)
3060ti user here

8GB is still enough for 1080p @ high/max settings (I can't confirm this year's AAA titles, as I haven't played any of them yet).
But even if it weren't enough, you can still turn the graphics settings down a notch, and you'll get similar quality to a PS5/Series X.

So, 8GB is still enough?
Yes
Would I buy an 50xx card with 8GB for 1080p?
No (except if I were on a budget, but in that case, I'd go for a 40xx card/AMD/Intel/used card instead)
 
Joined
Jul 24, 2024
Messages
298 (2.00/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
They only play triple-A games on Ultra settings, with exactly the same one VRAM-heavy level over and over again, just to prove the dramatubers right. /jk :)))))

It does, afaik; DLSS lowers VRAM usage. But the way I used the DLSS argument was general, about performance, not only VRAM. With Frame Gen, I'm not sure whether it keeps VRAM usage flat or increases it.
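Whether DLSS saves VRAM in practice is disputed in this thread; what's uncontroversial is that it renders at a lower internal resolution, which shrinks the resolution-dependent buffers while the upscaler adds overhead of its own, so the net effect varies by game. A small sketch of the internal resolutions, using the commonly cited per-axis scale factors for each mode:

```python
# DLSS renders internally below output resolution, then upscales.
# Scale factors below are the commonly cited per-axis ratios per mode.
DLSS_SCALE = {"Quality": 1 / 1.5, "Balanced": 1 / 1.72, "Performance": 1 / 2.0}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output and mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So 4K Performance mode renders the same pixel count as native 1080p; the per-pixel buffers shrink accordingly, but the texture pool and the upscaler's own buffers don't, which is why measured VRAM savings can be small.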

Edge cases, and things normal users (so most people on the planet) barely care about as long as the game is mostly fine and they have no problems. You're basically citing a luxury problem here to try to make the point that 8 GB isn't enough. With your argument you can only say 8 GB is "suboptimal", but this topic was about *not enough*, not "suboptimal", so your argument is firmly beside the point. Also, you're citing the 4060 Ti, which isn't the main point of this discussion; this is about the 5060, 4060 and other low-end cards that will 100% be fine with 8 GB of VRAM.

Strawman argument that I have already refuted in #29.

And those were lower-midrange / midrange cards from 2016, with 6 GB / 3 GB of VRAM, yes, not 8 GB. If a mid or semi-high-end card from 2016 has 8 GB, it just proves that the high end of 2016 is now the low end. Natural evolution, very normal. Nothing special.

Otherwise read my other posts; nothing you said hasn't already been answered multiple times.
While I think further arguing in this thread is pointless, I shall add one more comment of mine, for fun.

DLSS does not lower VRAM usage, and if by some miracle it does, it's within the margin of error. Go check the internet.

Even Intel realized that 8 GB is not enough at the low end.
The smaller the cache capacity, the more bandwidth there must be to compensate for it. And bandwidth is not just about the GPU's memory bus width; think about where graphics assets are stored, and from where, and through which other components, they are pulled into VRAM.

Sometimes you need space, not bandwidth. Bandwidth is of no use when there is not enough space to fit things into.

As soon as devs move on to higher-quality textures, 8 GB will become insufficient even for 1080p-targeted GPUs.
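The space-versus-bandwidth point can be put into rough numbers: assets that don't fit in VRAM have to be fetched over the PCIe bus, which is an order of magnitude slower than local memory. A sketch with illustrative round figures, not measurements of any specific card:

```python
# Why capacity can't be fully traded for bandwidth: assets that miss VRAM
# come over PCIe, which is an order of magnitude slower than local memory.
# Bandwidth figures are illustrative round numbers.
VRAM_GBPS = 272   # e.g. a 128-bit GDDR6 card, ~272 GB/s
PCIE_GBPS = 16    # e.g. PCIe 4.0 x8, ~16 GB/s each way

def access_time_ms(megabytes: float, gbps: float) -> float:
    """Time to move the given amount of data at the given bandwidth."""
    return megabytes / 1024 / gbps * 1000

spill_mb = 500  # assets touched this frame that didn't fit in VRAM
print(f"from VRAM: {access_time_ms(spill_mb, VRAM_GBPS):.2f} ms")
print(f"over PCIe: {access_time_ms(spill_mb, PCIE_GBPS):.2f} ms")
```

Touching 500 MB of spilled assets costs roughly 30 ms over a 16 GB/s link versus under 2 ms from VRAM; a 60 fps frame budget is 16.7 ms, which is exactly why running out of space shows up as choppiness rather than a uniformly lower average.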
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
While I think further arguing in this thread is pointless, I shall add one more comment of mine, for fun.

DLSS does not lower VRAM usage, and if by some miracle it does, it's within the margin of error. Go check the internet.

But it goes hand in hand: if you're already lowering your resolution and upscaling, you might as well lower the graphics settings. There is no game on the market that can't somehow be enjoyed even on integrated graphics, so users of expensive discrete graphics cards should be silent! (Never mind that you have to resort to that at a relatively low resolution, with ray tracing and other fancy settings off, on a card that costs half a monthly salary.)
 
Joined
Nov 13, 2024
Messages
88 (2.38/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
As soon as devs move on to higher-quality textures, 8 GB will become insufficient even for 1080p-targeted GPUs.
Yep. Also curious how the 4060 Ti 16 GB version ages. Maybe it becomes relevant in a few years.
 

AcE

Joined
Dec 3, 2024
Messages
263 (15.47/day)
People have been having issues with Indiana Jones on 8GB cards in 1080p. That's right now, not the future.
Already said it multiple times in this thread: if you have an opinion, provide proof, otherwise it's just talk, and we've had more than enough talk here already. :)
DLSS does not lower VRAM utilization amount and if it by miracle does, it will be within margin of error. Go check the internet.
Your "argument", so I'm not going to check anything. If you want to make an argument, provide data; talk really isn't worth much in technical discussions, not at this point at least. :) Afaik DLSS does lower VRAM usage, but if it doesn't, well, it wasn't really part of my argument anyway. The data is also without DLSS; I only used DLSS as a side point, not in the main argument.
Even Intel realized that 8 GB is not enough in lowend.
Speculation. Intel never said anything like that; otherwise, again, provide proof. We are not in bible study here, where you see "oh, Intel's lowest card has 12 GB now, Intel must be of the opinion that 8 GB isn't enough!" That's called *interpretation*; be honest about it and don't pretend it's fact. (edit: the B580 isn't even Intel's lowest card this generation, as the name already implies, so your argument will never make sense to begin with.) The most obvious interpretation here isn't yours anyway: a 192-bit bus takes either 6 GB (which is too little) or 12 GB, and they chose 12 (just like Nvidia did with the 3060 back then). It's a very basic engineering decision that has nothing to do with whether 8 GB is enough. :) But again, it's been proven numerous times that 8 GB *IS* enough for low-end gaming and a low-end card, so there's that. :) I'm slowly going to quit this discussion too, and will probably soon ignore "@"s that are just talk without proper arguments containing facts.

1) it's proven by data that 8 GB is enough; anyone of a different opinion seems to be fishing for engagement (youtubers), or looking for worst-case scenarios (not practical, not a good argument to have)

2) the companies that together hold 100% of the dGPU market currently still sell 8 GB cards (Intel's A750 / A770 are also 8 GB and have no issues worth talking about), so they confirm that it is enough, otherwise they wouldn't do it

3) only a cynic would reduce their technical decision to "omg, it's just about money! They're saving money by cutting VRAM" or "it's planned obsolescence". I don't envy people that cynical, and cynicism leads down roads that aren't worth traveling.

4) let's be real and honest here for a second: Nvidia is a pretty great company known for quality, not nonsense. If they ship 8 GB video cards in 2022, and probably still in 2025, they have good reasons for it beyond "omg I have 1 trillion dollars, I need 50 million more". When exactly was the last time an Nvidia video card really didn't work out because the VRAM amount wasn't enough? Right: never. Good to know. We have a ton of baseless talk here, and the few data points provided point in the direction that 8 GB is enough today, and for the foreseeable future, for 1080p at least. I'm not talking about 1440p; I give no guarantees there. It's well known that the 3070 is (or was?) a 1440p card with some VRAM issues, but that's another topic; it's not a low-end card and not about 1080p, which is what this is about. Nobody buys a 4060 for 1440p, at least not without accepting compromises, not while thinking "that will be perfectly fine!" Anyone who buys a 4060, or soon a 5060, will think to themselves: "this is a low-end card, it is what it is, I'll accept compromises, or I'll stay firmly at 1080p, accepting compromises even there if necessary, if all I play is AAA games." As soon as the discussion isn't about the worst-case scenarios anymore, which are the moot points the opponents of 8 GB of VRAM brought, but about AA or even A games, VRAM isn't even a topic. The use of smart settings also eviscerates the opponents' points, and smart settings are what a low-end card owner should use anyway. :) "Ultra Ultra Ultra" is usually for people who own more expensive, higher-end cards. Honestly, Ultra is an enthusiast setting, so why should I force it on a low-end card? I'll use it if I can, and if the game is the slightest bit laggy, it's lowered to "High" after 5 seconds. That's the reality.
 
Joined
Jul 7, 2019
Messages
931 (0.47/day)
Most of the issues with modern games stem from poor optimization, and games can only be optimized so much for PC given the wide array of possible system configs. We see that most well-optimized games can run at higher settings with lower VRAM, while poorly optimized games rely on GPU brute force plus larger VRAM to reach the same quality an optimized game achieves with less. It seems likely that game companies will continue to shift to the easier brute-force method, which will also necessitate more VRAM to make up for the shortcomings of chasing "ultra-realistic" settings on stock game engines, as those work "well enough".

As well, many games are still made to the lowest common denominator, the consoles, which are generally only equal to midrange GPUs, so in theory most games should work with limited VRAM (but of course, they don't always). In this generation, the XSS is the limiting factor with 8GB allocated to its GPU, while the XSX allocates up to 10GB and the PS5 lets most of its 16GB be used by the GPU depending on game optimization. Of those, the XSS has shown actual issues with its allocated 8GB, given that most games were designed for the XSX rather than the XSS, so additional tweaks had to be made to offer settings that run on the XSS at lower quality than on the XSX. The 2 extra GB on the XSX make a lot of difference, and the PS5 uses its larger available memory to make up for slightly inferior GPU specs relative to the XSX.
 
Joined
Oct 30, 2020
Messages
268 (0.18/day)
I think the 8GB argument is pointless and we are going in circles. One person seems adamant that it's enough while a few others say it isn't. I'm firmly in the latter camp, simply because there are instances with the 4060 where the 8GB framebuffer is maxed out, especially with RT, at which point lowering graphics settings is a necessity, otherwise framerates are terrible and choppy. The 5060 will only make this worse because it's (hopefully) faster.

Now if the 5060 is at the level of, say, 4060Ti, this is what we're looking at:

[attached chart: 1734546577364.png]

Not insignificant. So essentially the games where the 4060 was bottlenecked by its framebuffer will impose the same upper limit on graphics settings on the 5060.
 

AcE

Joined
Dec 3, 2024
Messages
263 (15.47/day)
I think the 8GB argument is pointless and we are going in circles. One person seems adamant
It's especially pointless because the opponents are just talking, and are unable to give proof / data that supports their claims. :) "One person"? Sadly no, multiple people are on my side, and two or three companies: AMD, Nvidia and Intel all still sell 8 GB video cards.

The funny thing was when one guy tried to use my own data against me; I studied the data again, quickly pointed out that it supports all my claims, and thanked him for it.
there are instances with the 4060 where 8GB framebuffer is maxed out especially with RT
This supports multiple of my claims, thanks. The 8 GB opponents are just fishing for edge cases and exceptions instead of looking at the broad rule: the card works with zero issues when used normally. :)

You guys generally have one thing in common: edge cases, low-percentage usage cases, impractical usage of a low-end card. This all just makes it easier for me to put this to bed. :)

You could also say a Fiat will break down if you always drive it at 180 km/h. That's the argument you guys are basically making: a very unrealistic scenario that makes no sense and will never happen.
 
Joined
Dec 7, 2022
Messages
54 (0.07/day)
The only proof in this thread is that you may see an ace, but it is really not an ace by any means. :(

It is like witnessing a wrong-way crash...
 
Joined
Oct 30, 2020
Messages
268 (0.18/day)
This supports multiple of my claims, thanks. The 8 GB opponents are just fishing for edge cases and exceptions instead of looking at the broad rule: the card works with zero issues when used normally. :)

You guys generally have one thing in common: edge cases, low-percentage usage cases, impractical usage of a low-end card. This all just makes it easier for me to put this to bed. :)

You could also say a Fiat will break down if you always drive it at 180 km/h. That's the argument you guys are basically making: a very unrealistic scenario that makes no sense and will never happen.

Well, not all games can max out the 8GB framebuffer of a card with the performance characteristics of a 4060 while framerates are still playable. But there are already many instances where 8GB is maxed out while the framerates would otherwise be playable, and because of that limitation it becomes a choppy mess. It'll just be worse on the 5060, as the 4060Ti 16GB has a greater than 30% performance uplift on average over the 8GB version at 1440p with RT. That's not an edge case at all, and very much playable in many games, just not on the 8GB version unless you lower textures and such.

You say others aren't providing proof, whereas I see a lot of people posting other reviews; but you state you only trust TPU, so that's the review I'm basing my statement on.

You see, for some people that might absolutely be a deal breaker, but for you it seems perfectly fine, since you're justifying Nvidia's position on 8GB cards. Opinions can obviously differ, but seemingly most people are in the latter camp and absolutely don't see 8GB as enough, because it's only going to get worse down the line. Maybe you're fine when a random texture goes missing and you realise you need to turn down the settings, or turn them down altogether when enabling RT, watching your framebuffer, etc. But for most, it's a major no-no.

By now you should just agree to disagree and move on. There's enough evidence in the 8GB vs 16GB debate on numerous websites; HUB for one did a lot of testing, and the results speak for themselves.
 
Joined
Jun 16, 2019
Messages
381 (0.19/day)
Already said it multiple times in this thread, if you have a opinion provide proof, otherwise it's just talk, and we had more than enough talk here already. :)

Go tell Alex he's wrong then.
 
Joined
Mar 11, 2008
Messages
979 (0.16/day)
Location
Hungary / Budapest
System Name Kincsem
Processor AMD Ryzen 9 9950X
Motherboard ASUS ProArt X870E-CREATOR WIFI
Cooling Be Quiet Dark Rock Pro 5
Memory Kingston Fury KF560C32RSK2-96 (2×48GB 6GHz)
Video Card(s) Sapphire AMD RX 7900 XT Pulse
Storage Samsung 970PRO 500GB + Samsung 980PRO 2TB + FURY Renegade 2TB+ Adata 2TB + WD Ultrastar HC550 16TB
Display(s) Acer QHD 27"@144Hz 1ms + UHD 27"@60Hz
Case Cooler Master CM 690 III
Power Supply Seasonic 1300W 80+ Gold Prime
Mouse Logitech G502 Hero
Keyboard HyperX Alloy Elite RGB
Software Windows 10-64
Benchmark Scores https://valid.x86.fr/cq8xu2 https://valid.x86.fr/4d8n02 X570 https://www.techpowerup.com/gpuz/g46uc
The 5090 comes with 32GB of GDDR7?!
Wish I could get it soon after launch!
Would be fun to run "AI" stuff on it.
 
Joined
Dec 25, 2020
Messages
6,992 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
The 5060 8GB is manufactured e-waste. How is it acceptable that the 3060 had 12GB while the 4060 and the 5060 have only 8?

The only reason the 3060 has 12 GB is that 6 GB really wasn't enough, and its 192-bit memory bus can only do 6 or 12.
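The 6-or-12 constraint follows from how a GPU memory bus is populated: one memory chip per 32-bit channel, with common GDDR6/6X chip sizes of 1 GB or 2 GB (leaving aside clamshell designs, which put two chips per channel and double the capacity). A quick sketch of that arithmetic:

```python
# A GPU memory bus is built from 32-bit channels, one memory chip per
# channel (ignoring clamshell mode, which doubles capacity).
# Common GDDR6/6X chip densities are 1 GB and 2 GB.
def capacity_options(bus_width_bits: int, chip_sizes_gb=(1, 2)) -> list[int]:
    """Possible VRAM capacities for a given bus width, in GB."""
    channels = bus_width_bits // 32
    return [channels * size for size in chip_sizes_gb]

print(capacity_options(192))  # [6, 12] -> the RTX 3060's choice
print(capacity_options(128))  # [4, 8]
```

The same math is why 128-bit cards like the 4060 land on either 4 GB or 8 GB: any capacity in between would need mismatched chips or a different bus.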
 

AcE

Joined
Dec 3, 2024
Messages
263 (15.47/day)
Who? Go tell W1zzard that he's wrong. :) Data > any person on the planet. I don't care to bring personal things into this. Logic > *
But there are many instances already where 8GB is maxed out while the framerates are still playable but because of that limitation it becomes a choppy mess.
Impossible; that would show in the average FPS. And your argument is solely concentrated on worst-case-scenario Ultra settings, which are only a small part of the general argument and already debunked; maybe consider more than just one setting? Also, the outcry of the masses would have arisen, which it never did, not even for the RX 7600 with its worse VRAM management than Nvidia's. You guys are simply wrong; accept it and move on. :) Endless deflecting will certainly never impress me. On the contrary, the people who agreed with me all seemed chilled, well-balanced people, while the guys who didn't more often than not behaved panicked and angry. There's an interesting pattern. I guess being positive is just better than being sceptical and negative and coloring the world with it (and then seeing too many things not as they really are; but above everything else, there will be *balance*, the zen mantra). But I digress; the discussion is just too long in the tooth now.
 
Joined
Jul 24, 2024
Messages
298 (2.00/day)
A man can give you ANY fact and you'll just deny it and continue mumbling your own words. Look, there's a reason most people don't agree with you.

Btw, do also check other reviews when something new is released, then form an opinion. For instance, W1zzard reviewed the 9800X3D and found it not much better than its predecessor, while other sites reported higher inter-generational performance gains, and not by a small amount (5% vs. 10% is a huge difference). I'm not trying to say W1zzard's reviews are bad; it's just that he uses his own metrics and techniques, and other sites have their own: different software tested, different games tested. These days a significant performance loss/gain can easily come from a different Windows 11 version, which is frankly ridiculous, because we're not talking about generational OS releases (like Win10 vs. Win11) but half-year feature updates.
 
Joined
Oct 30, 2020
Messages
268 (0.18/day)
Impossible, that would show in the average FPS - and your argument is solely concentrated on worst case scenario Ultra settings, which are only a small part of the general argument and already debunked [...]

You replied to what I said calling it impossible. Well, here's your trusted TPU chart for one game - note that there are others:

[attached chart: 1734623809076.png]


That's a 20% performance uplift for the 16 GB version. Note that it's not a flat 20% uplift: the 8 GB version is fine and playable until it isn't, in the areas where 8 GB is maxed out, where it becomes an awful, choppy mess.

I don't know why you keep repeating "you guys are just wrong". There's no right or wrong here: when someone says 8 GB is absolutely not enough, he's right - there are many cases where 8 GB shows itself to be a limiting factor. But someone who says 8 GB is enough isn't necessarily wrong either - maybe his games don't max out 8 GB, or he's willing to reduce settings in the edge cases.

All I'm trying to say is that 8 GB is already maxed out in a few instances today, and it'll get worse down the line. There's no argument to be had here.
 

AcE

Joined
Dec 3, 2024
Messages
263 (15.47/day)
You replied to what I said calling it impossible. Well, here's your trusted TPU chart for one game - note that there are others:
You ignore 80% of my arguments and think you've got a point; maybe read more carefully, or be more honest?

You guys pick extreme cases - Ultra settings on a low-end card, worst-case scenarios. This makes your point moot at best, and completely irrelevant and out of touch at worst. Nobody has problems with the 4060, because nobody uses the card the way you think they would. Where are the million unhappy customers?

I literally said this the last time I responded to you - no acknowledgement from your side; you seem completely oblivious to arguments in general. Stay in your bubble then. :) This discussion is over anyway. The opponents just use extreme cases and think that's an argument to make about a low-end card that nearly nobody uses like that (or really, nobody). Again, there is more to PC usage than just Ultra settings; try to be realistic for once. And even in the worst-case scenarios the card works flawlessly 99% of the time. You've got no point at all.

If you need extreme edge cases as an argument, it just proves you don't have real arguments anyway. General rule in life.
All i'm trying to say is 8GB is already maxed out in a few instances today and it'll be worse down the line. There's no argument to be had here.
This was already debunked as well, so many times. Guessing the future isn't an argument you can realistically make, because you don't know it. That's another entirely moot point. If Nvidia brings out a new 8 GB card soon, it will probably work for 2-4 years, easily. I'd bet on it, and you would lose that bet. After that, the card is overstretched anyway and details will be reduced - within the lifetime of the card, the VRAM will never become the problem before the card runs out of compute power anyway. This has already happened countless times in the history of GPUs, btw.
 
Last edited:
Joined
Oct 30, 2020
Messages
268 (0.18/day)
You ignore 80% of my arguments and think you've got a point; maybe read more carefully, or be more honest?

You guys pick extreme cases - Ultra settings on a low-end card, worst-case scenarios. This makes your point moot at best, and completely irrelevant and out of touch at worst. Nobody has problems with the 4060, because nobody uses the card the way you think they would. Where are the million unhappy customers?

I literally said this the last time I responded to you - no acknowledgement from your side; you seem completely oblivious to arguments in general. Stay in your bubble then. :) This discussion is over anyway. The opponents just use extreme cases and think that's an argument to make about a low-end card that nearly nobody uses like that (or really, nobody). Again, there is more to PC usage than just Ultra settings; try to be realistic for once. And even in the worst-case scenarios the card works flawlessly 99% of the time. You've got no point at all.

If you need extreme edge cases as an argument, it just proves you don't have real arguments anyway. General rule in life.

This was already debunked as well, so many times. Guessing the future isn't an argument you can realistically make, because you don't know it. That's another entirely moot point. If Nvidia brings out a new 8 GB card soon, it will probably work for 2-4 years, easily. I'd bet on it, and you would lose that bet. After that, the card is overstretched anyway and details will be reduced - within the lifetime of the card, the VRAM will never become the problem before the card runs out of compute power anyway. This has already happened countless times in the history of GPUs, btw.

I wasn't even being combative, yet you seem to be the one pointing fingers here. Never mind that. A few things before I leave this "discussion":

1) I'm not sure where you're getting the 80% number from, or the "stay in your bubble" part, because I thought I covered your points of substance. Do you mean your one other point about the "unrealistic Ultra settings" scenario? I was literally countering that very point, saying these aren't unrealistic settings because the framerates are playable, as I showed in the Ratchet & Clank graph I posted. How is that an extreme edge case? You know what, that might subjectively be an extreme edge case for you, so how about this: surely over 100 FPS isn't an extreme edge case, yet the 8 GB card is unplayable here?

[attached chart: 1734631309964.png]



What was the other point I ignored? All I saw was you saying that people who disagree with you are panicked and angry while people who agree with you seem like chill, well-balanced people. That's a given, considering people who agree with you will not argue with you, and vice versa. Not that I've seen that in this thread anyway, but just saying.

2) There aren't millions of unhappy customers because most games were fine until recent months. See the side-by-side graphs from when the 4060 Ti 16 GB was released, and the latest one:

[attached chart: 1734632534861.png]


See how a negligible 3% difference turned into over 30% in a year? It's easy to see the trajectory this is on.
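To make the arithmetic behind that gap explicit, here's a quick sketch. The FPS figures below are illustrative placeholders, not TPU's actual numbers; the point is only how the relative uplift is computed:

```python
def uplift_pct(fps_8gb: float, fps_16gb: float) -> float:
    """Percent advantage of the 16 GB card over the 8 GB card,
    i.e. (fps_16 / fps_8 - 1) * 100."""
    return (fps_16gb / fps_8gb - 1.0) * 100.0

# Hypothetical launch-day numbers: the two cards nearly tied.
print(round(uplift_pct(100.0, 103.0), 1))  # -> 3.0

# Hypothetical numbers a year later: the 8 GB card falls behind
# once its buffer is maxed out in newer titles.
print(round(uplift_pct(45.0, 59.0), 1))    # -> 31.1
```

The same formula explains why the gap looks so dramatic: a small drop in the 8 GB card's denominator inflates the percentage quickly.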

What's the last point you're "debunking"? My point was that 8 GB is being maxed out in a few instances today, and I literally provided graphs (two now) of those instances. I also just showed in the last graph that it is getting worse down the line - memory requirements for games have never decreased over the years, nor will they anytime soon. This isn't predicting the future; it's an educated assumption based on historical trends. How are you debunking that, by typing "debunked"?

Sure, we do not know the future 5060's performance, but it's certainly going to be higher than the 4060's, which will only expose the 8 GB limitation further. Don't count on Nvidia bringing some magic VRAM-management technique, if that's what you're implying - and if they do, more power to them. History says otherwise, though. And speaking of history, since you went down this road already: the GTX 680 and Fury X were easily VRAM-limited a year or two down the line. I owned both, so I know, and I'm sure there are other owners (and other cases) too.
 

Joined
Jul 24, 2024
Messages
298 (2.00/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Who's next in queue to reason with unreasonable one?
 

AcE

Joined
Dec 3, 2024
Messages
263 (15.47/day)
How is that an extreme edge case?
You picked the one (1) game that has notoriously heavy VRAM usage, and everyone knows it. It's also the one (1) game that magically makes the 7600 XT a worthwhile buy. :D
That's a given, considering people who agree with you will not argue with you and vice versa.
No, they just posted their own posts; they did not quote me. Some of them posted good real-life examples of how 8 GB video cards are not an issue.
What was the other point I ignored?
1) I was talking about the 4060 only - low-end cards, not the 4060 Ti and higher. I never said the 4060 Ti is fine with 8 GB; that's a different matter, as it's a lower-mid-range GPU. 2) I said Ultra shouldn't be treated as the main setting a low-end user will play at. 3) I said not everyone with a 4060 is a triple-A gamer - most are not (!), I'm pretty sure. So these Ultra benchmarks across all triple-A games aren't that relevant; they show worst-case scenarios, and still the 4060 works through them with almost zero issues.
See how a negligible 3% difference turned into over 30% in a year? It's easy to predict the trajectory at which this is going.
Irrelevant to the discussion, because I was only talking about low-end cards like the 4060 and 7600, not the 4060 Ti. I never said the 4060 Ti is great; the 7700 XT is better IMO, and for multiple reasons.
Sure, we do not know the future 5060's performance but its certainly going to be higher than the 4060 which will only expose the 8GB limitation further.
Unlikely; there's currently no problem at 1080p, and it will stay that way for the foreseeable future. Low-end buyers are also far less often triple-A gamers; they are mostly competitive-game or sub-AAA gamers, and those games are much less demanding.
 
Last edited: