System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> I see your point, but I still don't think one should need a $400 GPU just to run the game at 1080p with RT off.

I don't get the hate. The first "next-gen" games are always like this. Devs eventually learn to optimize better, and cards get more powerful. This has always happened in gaming.
Visuals are getting harder to improve meaningfully without a large performance cost; that's how diminishing returns work. I also wouldn't expect AAA, story-driven games to err on the side of performance over visuals.
On the bright side, as long as the Low preset gives me 95% of Ultra's visual quality at 3x the FPS, I'm fine with it.
On the other bright side, I remember when I bought my 6750 XT: some people suggested I upgrade my monitor because 1080p is so old-school and the 6750 XT can do so much more. I said "nah, it'll be fine." Who was right?