Processor | 7800X3D |
---|---|
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | Thermalright Peerless Assassin |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
But I thought having 12GB of VRAM on a weaker card was better than having 8GB on a faster card?
I'm so confused. CP2077 can utilize up to 10GB of VRAM when RT is enabled (upwards of 7GB without RT), according to TPU and their benchmarking. Why is a 3060Ti with only 8GB outpacing a 3060 with 12GB? Last I checked 12 > 8.
Guru3D shows the 3060Ti, even with its limited VRAM, is 30-35% faster than the 3060 that has more VRAM.
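For anyone taking the question at face value: as long as the game actually fits in VRAM, capacity is irrelevant and the bigger shader array plus the wider bus decide it. A rough sketch from the published specs, with peak bandwidth approximated as bus width times per-pin data rate:

```python
# Why the 3060 Ti outruns the 3060 despite less VRAM: once the working
# set fits, capacity stops mattering and bandwidth/compute take over.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# name: (VRAM in GB, bus width in bits, GDDR6 data rate in Gbps)
cards = {
    "RTX 3060":    (12, 192, 15.0),
    "RTX 3060 Ti": (8,  256, 14.0),
}

for name, (vram, bus, rate) in cards.items():
    print(f"{name}: {vram}GB VRAM, {bandwidth_gbs(bus, rate):.0f} GB/s")
# RTX 3060:    12GB VRAM, 360 GB/s
# RTX 3060 Ti:  8GB VRAM, 448 GB/s
```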
Maybe it's just that game... what other game can use lots of VRAM...?
Oh, maybe Far Cry 6 will show how much better 12GB is than 8GB.
At 1440p almost 8GB is used.
At 4K over 9GB can be used.
No... still seeing the 3060Ti with less VRAM come out 30-35% faster than the 3060.
I just don't understand. What am I not getting? Maybe folks are saying that 4 years down the road the weaker 3060 with 12GB will give better performance than a 3060Ti with only a measly 8GB. Man, I'm just going to have to wait a while to see if this is true. Shucks, by then no one will care because we'll all be talking/complaining about whatever current gen AMD/Nvidia/Intel are providing.
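If you'd rather measure than argue: here's a minimal polling sketch that logs what a game actually allocates while it runs, assuming an NVIDIA card at index 0 and the nvidia-ml-py bindings (`pip install nvidia-ml-py`). Keep in mind this reports *allocation*, which games pad with caches; it's not the minimum they need to run well.

```python
# Sketch: poll VRAM allocation once per second while a game runs.
# Assumes NVIDIA GPU at index 0 and the nvidia-ml-py package.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```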
Maybe the comparison now should focus on 6GB versus 8GB to see the gap; that's where FC6 shows its struggle. Older Ubisoft titles had similar issues. It's an engine quirk, but since the engine is used a lot, it's there anyway. Pop-in has always been a thing in Far Cry since part 3 as well.
That's why I'm saying: look back, because history repeats, and higher-VRAM cards just tend to last longer and resell for more on the second-hand market.
The cards that stood out from midrange to high end over the last ten years have all been higher-VRAM cards. Most notably the 7970, which would eclipse everything even years down the line thanks to the combo of 3GB and a wide bus. And the 980 Ti, which is still relevant with its 6GB, while any Fury X with 4GB (despite superb bandwidth!!) is utterly pointless right now. The 1080 Ti is in a similar place: it will be relevant for quite a while still even at 5-6 years of age. With its 11GB it will eat any older, texture-modded game at any res even ten years from now.
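To put spec-sheet numbers on that, capacity next to peak bandwidth for those cards. The Fury X had by far the most bandwidth of the lot and still aged the worst, which is the point: capacity, not bandwidth, set the shelf life.

```python
# Spec-sheet capacity vs peak memory bandwidth for the cards above.
cards = [
    # (name, VRAM in GB, bandwidth in GB/s)
    ("HD 7970",     3,  264),
    ("GTX 980 Ti",  6,  336),
    ("R9 Fury X",   4,  512),  # huge bandwidth, tiny capacity: aged worst
    ("GTX 1080 Ti", 11, 484),
]
for name, vram, bw in cards:
    print(f"{name:<12} {vram:>2}GB  {bw} GB/s")
```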
These differences don't show up in the first couple of years post launch. But in years 3-5, they do. It's surprising even to me that we're already hearing about issues with 10GB in certain titles. At the same time, the consoles with 16GB are pushing that mainstream bottom end up rapidly. It's a perfect storm for low-VRAM cards to fall off faster than usual, or at least to require post-launch dev/Nvidia TLC through patching or drivers. It's for that reason Nvidia has been shitting game-ready drivers out of every hole for so long now.
And we can complain then that 'AMD is pushing useless stuff to fill 16 gigs', but what really happens is that devs use the resources because it saves them effort. As they always have: once hardware goes mainstream, it gets used.