No. Many people claim they don't see the difference, either because they're so invested in Nvidia that they don't want to see it, or simply because they use old 1080p screens with poor image quality anyway.
It's gotten to the point where some people even claim they can't see a difference between 1080p and 4K. How can we discuss anything further?
That's because there's no difference, not because "they can't tell". Over a decade ago, Nvidia defaulted HDMI output to a limited color range for compatibility reasons (deep color isn't supported on earlier HDMI TVs; even the PS3 has an option to toggle this), and that caused the image to look compressed. That hasn't been the case for years now. Nothing was "proven": you don't feel comfortable when a GeForce is plugged into your screen due to personal bias, not because "the image looks worse".
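For anyone curious why limited range made the image look "compressed": full-range RGB uses all 256 levels per channel (0-255), while limited range only uses 16-235. A minimal Python sketch of the standard mapping (the helper names here are my own, not from any driver) shows that squeezing 256 levels into 220 necessarily collapses some distinct values together:

```python
def full_to_limited(v):
    """Map a full-range level (0-255) onto limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a limited-range level (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / 219)

# Black and white map to 16 and 235 instead of 0 and 255.
print(full_to_limited(0), full_to_limited(255))   # 16 235

# 256 input levels land on only 220 output levels, so some
# neighboring shades become indistinguishable.
levels = {full_to_limited(v) for v in range(256)}
print(len(levels))  # 220
```

The visible problem back then wasn't the mapping itself but the mismatch: a monitor expecting full range that receives limited-range output shows blacks as dark gray and whites as dim, which is the washed-out look people remember.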