- Joined: Sep 17, 2014
- Messages: 23,011 (6.08/day)
- Location: The Washing Machine
| System Name | Tiny the White Yeti |
|---|---|
| Processor | 7800X3D |
| Motherboard | MSI MAG Mortar b650m wifi |
| Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
| Memory | 32GB Corsair Vengeance 30CL6000 |
| Video Card(s) | ASRock RX7900XT Phantom Gaming |
| Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
| Display(s) | Gigabyte G34QWC (3440x1440) |
| Case | Lian Li A3 mATX White |
| Audio Device(s) | Harman Kardon AVR137 + 2.1 |
| Power Supply | EVGA Supernova G2 750W |
| Mouse | Steelseries Aerox 5 |
| Keyboard | Lenovo Thinkpad Trackpoint II |
| VR HMD | HD 420 - Green Edition ;) |
| Software | W11 IoT Enterprise LTSC |
| Benchmark Scores | Over 9000 |
The miserable 10 GB on the RTX 3080 already decreases performance, forcing Nvidia to pull shenanigans in the driver that force lower-resolution textures in games.
RTX 3080 VRAM usage warnings and the issue with VRAM pool sizes: the compromise of 4K gaming | ResetEra
Far Cry 6 needs more VRAM than the Nvidia RTX 3080 has to load HD textures | PCGamesN
Exactly, and those shenanigans don't show up in reviews at launch. Win-win, right? '10GB is enough', and people can happily live in that ignorance. Unless they frequent TPU, where I'll keep hammering that point home, since lots of buyers were convinced all was fine: 'oh, the card's performance isn't going to push those details anyway' or 'I'll just lower my textures'. On GPUs that are hitting the 1K MSRP mark, less than a year post-launch. It's utterly absurd.
To each their own; this was easy to predict.
So no, Nvidia won't do hardware workarounds for 'sufficient' VRAM anymore. After they got burned on Maxwell, they design around it and sell you the argument that all is well with half the VRAM of Pascal relative to core power. Meanwhile, they still release double-VRAM versions across the entire stack. Early-adopter heaven: cash in twice on fools with money. It's much better than being forced to settle with customers over your 3.5GB.
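For a rough sense of what 'less VRAM relative to core power' means, here's a back-of-envelope sketch. The spec-sheet numbers are my assumptions for illustration (GTX 1080: 8 GB / ~8.9 TFLOPS FP32; RTX 3080: 10 GB / ~29.8 TFLOPS FP32), not anything from the posts above:

```python
# Back-of-envelope VRAM-per-compute comparison.
# Numbers below are assumed spec-sheet figures, used only for illustration.
cards = {
    "GTX 1080 (Pascal)": {"vram_gb": 8, "tflops_fp32": 8.9},
    "RTX 3080 (Ampere)": {"vram_gb": 10, "tflops_fp32": 29.8},
}

for name, spec in cards.items():
    ratio = spec["vram_gb"] / spec["tflops_fp32"]
    print(f"{name}: {ratio:.2f} GB of VRAM per TFLOP")

# Rough output:
#   GTX 1080 (Pascal): 0.90 GB of VRAM per TFLOP
#   RTX 3080 (Ampere): 0.34 GB of VRAM per TFLOP
# i.e. VRAM per unit of shader throughput dropped by well over half from Pascal to Ampere.
```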
The newest argument to stave off buyer's remorse is 'muh, but we have DLSS, so it looks great anyway'. Okay, enjoy being on Nvidia's DLSS leash for your gaming: what used to be waiting for SLI profiles is now waiting for DLSS profiles. But! This time it will all be different, right?
It's a piss-poor gen, this one. Even the 450W on this 3090 Ti underlines it: they can't make a full die to save their lives without pulling out all the stops on voltage. Samsung 8nm. Fan-tas-tic node.