| Processor | 13th Gen Intel Core i9-13900KS |
| --- | --- |
| Motherboard | ASUS ROG Maximus Z790 Apex Encore |
| Cooling | Pichau Lunara ARGB 360 + Honeywell PTM7950 |
| Memory | 32 GB G.Skill Trident Z5 RGB @ 7600 MT/s |
| Video Card(s) | Palit GameRock GeForce RTX 5090 32 GB |
| Storage | 500 GB WD Black SN750 + 4x 300 GB WD VelociRaptor WD3000HLFS HDDs |
| Display(s) | 55-inch LG G3 OLED |
| Case | Cooler Master MasterFrame 700 benchtable |
| Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
| Mouse | Microsoft Classic IntelliMouse |
| Keyboard | IBM Model M type 1391405 |
| Software | Windows 10 Pro 22H2 |
No. Many people claim that they don't see the difference, either because they support Nvidia so strongly that they don't want to see it, or simply because they use old 1K screens, which have horrendous image quality anyway.
The situation is such that some people even claim they can't see a difference between 1K and 4K. How can we discuss anything further?
That's because there's no difference, not because "they can't tell". A decade-plus ago, Nvidia defaulted HDMI output to a limited color range for compatibility reasons (older HDMI TVs expect a limited-range signal; even the PS3 has an option to toggle between limited and full RGB), and that made the image look washed out and compressed. It hasn't been the case for years upon years now. Nothing was "proven": you don't feel comfortable when a GeForce is plugged into your screen because of personal bias, not because "the image looks worse".
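For anyone curious what "limited range" actually does to the picture: it's just a linear remap of 8-bit values from 0–255 into 16–235, so if the display then treats that signal as full range, black turns grey and white turns dull. A minimal Python sketch of the arithmetic (the function names are mine, purely to illustrate the mapping):

```python
def full_to_limited(value: int) -> int:
    """Map a full-range (0-255) 8-bit value into limited/'TV' range (16-235).

    If a full-range display shows this signal as-is, black (0) becomes 16
    (dark grey) and white (255) becomes 235 (dull white) -- the 'washed out'
    look associated with the old limited-range default.
    """
    return round(16 + value * 219 / 255)


def limited_to_full(value: int) -> int:
    """Inverse mapping: expand limited range (16-235) back to full range (0-255)."""
    return round((value - 16) * 255 / 219)


if __name__ == "__main__":
    # Show how black, mid-grey, and white shift when remapped to limited range.
    for v in (0, 128, 255):
        print(f"full {v:3d} -> limited {full_to_limited(v):3d}")
```

When both ends agree on the range (as they have for years now, by default or via one driver toggle), the mapping is undone at the display and the picture is identical.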