I'm not going to keep arguing endlessly over points you have 100% lost. By the arguments provided, by the links and the proof (which is what I was asking for from the beginning), you were firmly wrong; now you're only deflecting, and I have no time for this senseless, endless debate. For you this is just about "being right" now, not about technology, interest, or technical facts. 8 GB is 100% enough for most 1440p games even today, so it will easily be enough for 1080p for the foreseeable future. End of story. Also, Nvidia would 100% not risk an 8 GB card if you were right, but surprise: you're obviously not right. -> Otherwise debate this with @W1zzard perhaps. It's his data. Maybe you'd listen to him.
edit: to make this a bit clearer, I'm gonna explain what the game review data shows for the RTX 4060 - and what it does not show:
- games are all tested at Ultra settings, so in many cases *beyond* the sweet spot of the 4060 - in other words, at settings that shouldn't be used on a 4060, or only with DLSS enabled. A worst-case scenario, you could say.
- despite this beyond-sweet-spot usage, the card never hit terrible fps (< 30 or even < 10 fps), which would be the usual, very obvious sign that the vram buffer is running short.
- the only "problem" the card had was in some games in 1080p (and I only talk about 1080p here for this card), it had less than 60 fps.
- the min fps scaled with the average fps in each game, so if the avg fps was already under 60, obviously the min fps wasn't going to be great either - not related to vram.
- there are also games like Starfield that generally have min fps issues, visible in that review and not only on the RTX 4060. Also not related to vram.
- the card generally behaved like it should; it was *not* hampered by 8 GB of vram in any of the games. Which just proves that what I said all along is true.
- furthermore, 8 GB of vram is also proven to be mostly sufficient even at 1440p and above, because vram requirements barely increase with resolution alone (see the back-of-envelope sketch after this list). There is a variety of parameters that will increase vram usage - world size, ray tracing, texture complexity, optimisation issues. A lot of games don't even have problems at 4K (!) with 8 GB of vram. That is the funny thing here. The problems start when the parameters get more exotic, so to speak.
- so saying 8 GB of vram isn't enough today for 1080p or even 1440p is just wrong. Can it have select issues? Yes. It's not perfect, and if you go beyond 1080p it will have more issues, but it will still mostly be fine. At 1080p, which is what this discussion was about, it's 99.9% fine, making this discussion unnecessary.
- and as it's still easily enough for 1080p, it will also be enough for a new low-end card, for the foreseeable future.
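To put a rough number on the "resolution alone barely moves vram" point above, here is a minimal back-of-envelope sketch. The G-buffer layout and byte counts are a hypothetical generic deferred-renderer setup I'm assuming for illustration, not measured from any specific game:

```python
# Rough sketch: how much VRAM the resolution-dependent render targets take.
# The layout below (G-buffer planes, HDR scene color, depth, post buffers)
# is an assumed generic deferred-renderer setup, not any real game's.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

RENDER_TARGETS = [  # (name, bytes per pixel)
    ("gbuffer (4x RGBA8)",        4 * 4),
    ("hdr scene color (RGBA16F)", 8),
    ("depth/stencil (D32S8)",     5),
    ("post-process (2x RGBA16F)", 2 * 8),
]

for label, (w, h) in RESOLUTIONS.items():
    total = sum(bpp * w * h for _, bpp in RENDER_TARGETS)
    print(f"{label:>5}: {total / 2**20:6.1f} MiB of render targets")
```

That prints roughly 89 MiB at 1080p, 158 MiB at 1440p and 356 MiB at 4K - even the 4K case is under 5% of an 8 GB card. The rest of the budget goes to textures, geometry, and ray tracing structures, which is exactly why those parameters, not resolution, decide when 8 GB runs out.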
haha, now let's stay in reality. What I can tell you is this: if you have a ton of vram, it can be used; it can be useful even in games that don't *need* it, simply so your PC never has to reload vram again. It's basically a luxury - less stutter than cards with 16 GB, for example. I experienced this when switching from my older GPU to the new one: Diablo 4 ran that much smoother, there was basically no reloading anymore while zoning.
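A toy model of that effect: treat vram as an LRU cache of zone assets. The zone names, pack sizes, and eviction policy here are all invented for illustration - real engines use much smarter residency management - but the capacity math is the point:

```python
# Minimal sketch of the "surplus VRAM = less reloading" effect: model VRAM
# as an LRU cache of zone assets. Zone names and sizes are made up for
# illustration; real engines use far more sophisticated residency managers.
from collections import OrderedDict

class VramCache:
    def __init__(self, capacity_gb: int):
        self.capacity = capacity_gb
        self.resident = OrderedDict()   # asset -> size in GB, in LRU order
        self.reloads = 0

    def use(self, asset: str, size_gb: int = 1):
        if asset in self.resident:
            self.resident.move_to_end(asset)   # hit: nothing to stream
            return
        self.reloads += 1                      # miss: stream from disk/RAM
        while sum(self.resident.values()) + size_gb > self.capacity:
            self.resident.popitem(last=False)  # evict least recently used
        self.resident[asset] = size_gb

# Zoning back and forth between two areas that need ~10 GB of assets each:
for cap in (8, 16, 24):
    cache = VramCache(cap)
    for zone in ["zone_a", "zone_b"] * 4:
        for i in range(10):
            cache.use(f"{zone}/pack_{i}")
    print(f"{cap} GB card: {cache.reloads} reloads")

# Prints 80, 80 and 20 reloads: once the combined working set (20 GB)
# exceeds capacity, LRU thrashes and every zone change streams assets again;
# only the card that fits everything reloads nothing after the first lap.
```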
The issue here is that you're comparing two different companies. AMD used to do an "I give you extra vram" play for marketing against Nvidia. The fact is that, historically, Nvidia's upper mid-range GTX 1070 and semi-high-end GTX 1080 used 8 GB back then. Whether AMD put a ton of vram on a mid-range card does not change that fact. The same AMD chip originally started with 4 GB, as per the link you provided yourself, as everyone can see. AMD also used marketing tactics like this on the R9 390X, so you can go even further back to GPUs that never needed 8 GB within their relevant lifetime. By the time the R9 390X "needed" 8 GB, it was already too slow to really utilise it, making 4 GB the sweet spot for that GPU, not 8 GB (averaged over its lifetime, of course). And as multiple people, not just me, have already said, Nvidia's vram management is simply better than AMD's - this has been true for a very long time now, which makes an AMD vs Nvidia comparison kinda moot. AMD has historically always needed a bit more vram to do the same things Nvidia does (without lag or stutter). As someone said, this is probably down to software optimisation.
Really VRAM-hungry games are indeed very rare. That doesn't mean they don't matter. People have been having issues with Indiana Jones on 8GB cards at 1080p. That's right now, not the future. This is gonna become more common. I'm sure there are ways of dealing with it, like lowering textures, shadows, and whatever else. But the issue is there now. If all the common games don't have it, great. But if it's a game someone wants to play on their new card and they discover it runs like crap because it's hitting the ceiling, then... well, that just sucks.
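For a sense of why the texture slider is such an effective pressure valve, here's some rough arithmetic. It assumes BC7-style block compression at 1 byte per texel plus about a third extra for mipmaps, and the count of 300 resident textures is a made-up illustrative number:

```python
# Rough arithmetic for the "lower the texture setting" workaround: halving
# texture resolution roughly quarters its memory footprint. Assumes BC7
# block compression (1 byte per texel) plus ~1/3 extra for the mip chain;
# the count of 300 resident textures is an invented illustrative figure.
def texture_mb(side_px: int, bytes_per_texel: float = 1.0) -> float:
    base = side_px * side_px * bytes_per_texel
    return base * 4 / 3 / 2**20   # +1/3 for mipmaps, converted to MB

for side in (4096, 2048, 1024):
    one = texture_mb(side)
    print(f"{side}px: {one:6.1f} MB each, 300 resident: {300 * one / 1024:4.1f} GB")

# 4096px: ~21.3 MB each -> ~6.2 GB for 300 textures
# 2048px:  ~5.3 MB each -> ~1.6 GB
# 1024px:  ~1.3 MB each -> ~0.4 GB
```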
And people are too forgiving of a massive, rich company skimping on VRAM. The 5060 should really launch with about 12GB, but will no doubt be 8GB. 12GB would at least let it run games for the next few years without running into any issues. Nvidia is just greedy and stingy. The fact that my 3080 came with only 10GB, and an empty slot on the PCB where another 2GB module could fit, shows their greed physically. My 6700 XT cost far less and came with 12GB. The 6800 XT and 6900 XT both had 16GB. NV is lagging behind, and people excuse it with the "most games work fine" argument. If a new card is coming out now, I should expect it to work with any game just fine; a new product shouldn't have an issue right away, rare as it may be.