> One could apply the same argument to things like slapping on RT cores, spending loads of money to expand rendering performance. Keeping on slapping more and more gobs of VRAM on, generation after generation, isn't a complete solution either, because that VRAM costs money. It raises the price of a card at a time when prices are already ridiculous. IMO there needs to be an increase in the VRAM offered, but not letting VRAM requirements get absurd is also part of the solution.
The difference with VRAM, though, despite what Nvidia like to present, is that it's not an expensive component. It's just given the feel of being expensive because Nvidia are so stingy with it, and of course VRAM is not modular, so one has to buy an entire new GPU to upgrade it.
If you look at it from the point of view of TFLOPS versus VRAM capacity on Nvidia cards, it's regressed; if you look at what Nvidia provide versus the console market, it's regressed. At the very least I think mid range and high end cards should get a bump to at least what the consoles have.
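To illustrate that ratio, here is a minimal Python sketch using approximate paper FP32 figures from public spec sheets; the exact numbers are assumptions (and Ampere onward counts dual-issue FP32, which inflates them), so treat the ratios as rough trend lines only:

```python
# Rough sketch: paper FP32 TFLOPS per GB of VRAM across a few Nvidia
# generations. Spec figures are approximate public numbers; Ampere and
# later count dual-issue FP32, which flatters the newer cards.
cards = {
    # name: (FP32 TFLOPS, VRAM in GB)
    "GTX 1080 (2016)":    (8.9,  8),
    "RTX 2080 (2018)":    (10.1, 8),
    "RTX 3080 (2020)":    (29.8, 10),
    "RTX 4070 Ti (2023)": (40.1, 12),
}

for name, (tflops, vram) in cards.items():
    print(f"{name}: {tflops / vram:.2f} TFLOPS per GB")
```

The ratio climbs from roughly 1.1 to over 3 TFLOPS per GB, i.e. compute has scaled far faster than the VRAM hung off it.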
Now if Nvidia add, e.g., 8 gigs of VRAM and then add $400 to the price, that's them being greedy; it's nowhere near what it costs them.
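As a back-of-envelope sketch of that gap (the per-GB figure is an assumption based on reported 2022-2023 GDDR6 spot prices; GDDR6X and contract pricing differ, and a wider memory bus would add some board cost on top):

```python
# Back-of-envelope only: GDDR6 spot prices were reported roughly in the
# $3-4 per GB range in 2022-2023, so $4/GB is an assumed, generous figure.
price_per_gb = 4.0   # USD per GB, assumed
extra_gb = 8         # the hypothetical extra VRAM

print(f"BOM cost of +{extra_gb} GB: ~${extra_gb * price_per_gb:.0f}")
print("versus a hypothetical $400 price bump")
```

Even doubling the assumed chip price leaves the memory itself an order of magnitude below a $400 markup.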
I think it's OK to apply minimalism (to a degree) on the budget cards whilst charging an appropriate price, so 8 gigs on a 4060 for $300. But it shouldn't be that aggressive on a 4070 Ti, and the same goes for the 3080.
Remember Intel stuck 8 gigs on a £250 card.
In my opinion the tensor cores at least have more value than the RT cores, as DLSS can be used in many games now thanks to DLDSR (or whatever it's called), whilst RT is only in a tiny fraction of games and of questionable value there. With that said, I expect Nvidia could probably implement DLDSR using normal shaders if they wanted.

> Isn't the only other use of tensor cores for DLSS? If they can find another use for them, all power to Nvidia -- for games that don't implement DLSS at all!
If I had a choice between, say, a 16 gig 4070 Ti "GTX" and a 12 gig 4070 Ti RTX at the same price, I'd absolutely, 100% be picking the former. I suspect I wouldn't be alone either.