"These results echo what we've seen in other recent games."

"So all devs are bad now and you are right, is that it?"

If needing a 4090 to run 1440p at not even 120 FPS is a quality release to you, you might wanna get a reality check.
It's okay to have an opinion, but the numbers don't lie... and neither do the VRAM requirements.
60 FPS is a fine bar to have, and it was actually the bar we judged things by for a long time. It's not because people bought high refresh monitors that games can suddenly max them out; that has never been the case. There was a sweet spot before RT where GPUs would just slaughter anything DX11 rasterized, the Pascal > Turing days. Oh yeah, btw, those new monitors also run 1440p or 4K now.
Those days are over. And regardless, you can still get 60 FPS at maxed raster settings with most cards that aren't sub-midrange or ancient.