"It's the best possible graphics quality at the best available screen resolution. The developer lists other options too, because not everyone can game at such settings."

Bang on.
Not sure I follow this logic. The creator makes games with standard resolution options, generally from 720p-ish or even lower all the way up to 4K and beyond, sometimes (hopefully, even) with ultrawide options and VR too. So which one of those is intended by the creator? Is playing at native 720p more the way the creator imagined it than 1440p? What about 1440p upscaled to 4K? Or is native 4K what's intended, and anything lower isn't? I'm interested to hear from you how the end user's arbitrary resolution choice relates to the creator's vision.
Bonus question: if the developer lists upscaling across all the system specs, target resolutions and frame rates, as we've seen lately, is playing with upscaling the way it's imagined by the creator?
"That's a completely different topic altogether. Supersampling is a higher-resolution image scaled down to fit your monitor. Upscaling is a lower-resolution image scaled up to look acceptable on your monitor. It should be plainly obvious why and how one gives much better quality than the other."

If rendering at native panel resolution is to be touted as the best, then why does supersampling exist?
Also, not many games use supersampling these days, which is why I'm not using it as my example.
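To make the supersampling-versus-upscaling distinction concrete, here's a minimal sketch using Pillow. It's purely illustrative and not from the thread: "scene.png", the 2x and half-res factors, and the resampling filters are all placeholder assumptions, and the resize calls are only stand-ins for what a game engine would actually render.

```python
# Illustrative only: simulates the two pipelines discussed above with Pillow.
# Assumes a placeholder image "scene.png" at the panel's native resolution.
from PIL import Image

native = Image.open("scene.png")
w, h = native.size

# Supersampling: render at a higher resolution, then scale DOWN to the panel.
# (Resizing up here is just a stand-in for a real 2x render; it can't add detail.)
supersampled = native.resize((w * 2, h * 2), Image.LANCZOS)
ss_output = supersampled.resize((w, h), Image.LANCZOS)   # downscale to native

# Upscaling: render at a lower resolution, then scale UP to the panel.
low_res = native.resize((w // 2, h // 2), Image.LANCZOS)  # stand-in for a half-res render
us_output = low_res.resize((w, h), Image.BILINEAR)        # stretch back to native

# ss_output starts with more pixel information than the panel can display;
# us_output has to reconstruct detail that was never rendered in the first place.
```

The point of the sketch is only the direction of the scaling: one path throws away excess detail, the other tries to fill in detail that isn't there.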