> No. Many people claim that they don't see the difference, either because they favor Nvidia so strongly that they don't want to, or simply because they use old 1K screens that have horrendous image quality anyway.
> The situation is such that some people even claim they can't see a difference between 1K and 4K. How can we discuss anything further?
That's because there's no difference, not because "they can't tell". More than a decade ago, Nvidia defaulted HDMI output to a limited color range for compatibility reasons (full-range/deep color isn't supported on earlier HDMI TVs; even the PS3 has an option to toggle this), which made the image look compressed. That hasn't been the case for years upon years now. Nothing was "proven": you don't feel comfortable when a GeForce is plugged into your screen because of personal bias, not because "the image looks worse".
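For anyone curious what "limited range" actually did to the picture, here's a minimal sketch of my own (not from the post above), assuming the standard 8-bit studio-swing mapping of 0-255 onto 16-235 that limited RGB over HDMI uses:

```python
# Illustrative sketch: how "limited" HDMI RGB squeezes full-range 0-255
# values into the 16-235 studio range, which is why a range mismatch
# between GPU and display looks washed out or crushed.

def full_to_limited(v: int) -> int:
    """Map a full-range 8-bit value (0-255) to limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Expand a limited-range value (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / 219)

if __name__ == "__main__":
    # Black and white survive the round trip, but the usable steps shrink
    # from 256 to 220, and a display expecting full range will render
    # limited-range 16 as dark grey instead of true black.
    for sample in (0, 16, 128, 235, 255):
        lim = full_to_limited(sample)
        print(f"full {sample:3d} -> limited {lim:3d} -> back {limited_to_full(lim):3d}")
```

The mismatch (GPU sending limited range to a display expecting full range, or the reverse) is what produced the washed-out or crushed look people used to blame on the card itself; once the setting matches the display, the output is identical.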