Definitely not. What you tested only shows VRAM allocation, not VRAM usage. For example, a game can fill an 8 GB card's VRAM to the brim, yet you can still get a similar experience on a card of equal processing power with only 4 GB (not that I'm recommending it). When the game starts to stutter at random points, especially while loading assets, that's when you know you've hit a VRAM wall.
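If you want to watch the allocation number yourself, here's a quick sketch (assuming an Nvidia card with the standard `nvidia-smi` tool on PATH; the one-second interval is just my choice). Keep in mind it logs what the driver has allocated, not what the game actually needs:

```python
#!/usr/bin/env python3
# Quick sketch: poll nvidia-smi once a second and log reported VRAM.
# Note: this is driver-reported *allocation*, not the game's true working set.
import subprocess
import time

def vram_used_mib() -> int:
    # memory.used is a standard nvidia-smi query field, reported in MiB
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line corresponds to GPU 0 on a single-card system
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        print(f"{time.strftime('%H:%M:%S')}  {vram_used_mib()} MiB allocated")
        time.sleep(1)
```

Run it in a terminal while the game is going; the number climbing to near the card's capacity doesn't mean you're out of VRAM, but stutter plus a pegged number is a decent hint.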
With the above logic, I'd say 8 GB is more than enough for 1080p, 16 GB for 4K.