- Joined: Jun 14, 2020
- Messages: 3,332 (2.07/day)
System Name | Mean machine |
---|---|
Processor | 12900k |
Motherboard | MSI Unify X |
Cooling | Noctua U12A |
Memory | 7600c34 |
Video Card(s) | 4090 Gamerock oc |
Storage | 980 pro 2tb |
Display(s) | Samsung crg90 |
Case | Fractal Torrent |
Audio Device(s) | Hifiman Arya / a30 - d30 pro stack |
Power Supply | Be quiet dark power pro 1200 |
Mouse | Viper ultimate |
Keyboard | Blackwidow 65% |
His videos lately are just BS. He chooses settings that hog the 3070's VRAM, and then acts surprised it stutters like crazy. The actual question is: what is the image-quality impact in those games if you drop textures from ultra to high? Not a lot, I'd imagine, and that's why he isn't testing it. It won't generate as many clicks as just pooping on Nvidia.

They've somewhat already done this, comparing the 3070 to its pro counterpart. If you missed it, you might want to watch it:
It's quite the terrifying difference, and I bet it's had a lot to do with people's sudden change of heart about 16 GB RAM + 8 GB GPU PCs in recent months.
Why isn't he, for example, testing AMD vs Nvidia on the new path-traced Cyberpunk upgrade? I really wonder.
Generally speaking, sure, but in Cyberpunk specifically, for some freaking reason, 40 fps is a horrible experience for me. I've played games at 30 fps and it doesn't bother me that much, but 40 in Cyberpunk feels atrocious.

Contrary to popular belief, 80 fps at native settings isn't necessary for a game to be considered playable.
I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40-ish in games like that. FSR would sacrifice some image quality to keep you at the upper end of that range, closer to 60 really.