> Hate to break it to you, but Nvidia is screwing you. You're not getting 10-bit from your gaming cards; for that you have to be using a Quadro. For Adobe you also have to enable 30-bit display in the preferences under advanced settings, and the driver has to be set to 10-bit output in the control panel. Then there's the fact that your "10-bit" monitor is probably 8-bit + FRC, which is fake.
>
> I wonder how they'll handle this scam they've been running when people try to play HDR content on the $2000 HDR monitors that go on sale this month. On the desktop your gaming cards only do 8-bit; they only do 10-bit in DirectX games.
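To make the quoted "10-bit only in DirectX" claim concrete: a Direct3D application opts into deep colour by creating its swap chain with a 10-bits-per-channel format such as DXGI_FORMAT_R10G10B10A2_UNORM, while the standard desktop was composited at 8 bits per channel at the time. A minimal probe of whether the driver even reports display support for that format might look like the sketch below (Windows/Direct3D 11 assumed; this is an illustration, not something from the thread):

```cpp
// check_10bit_d3d.cpp -- illustrative sketch, build with: cl check_10bit_d3d.cpp d3d11.lib
// Asks the Direct3D 11 runtime whether the default adapter reports that the
// 10-bit-per-channel format R10G10B10A2 can be used for display scan-out.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;

    // Create a bare hardware device on the default adapter; no swap chain or
    // window is needed just to query format support.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, nullptr);
    if (FAILED(hr)) {
        std::printf("D3D11CreateDevice failed (0x%08lx)\n", static_cast<unsigned long>(hr));
        return 1;
    }

    UINT support = 0;
    device->CheckFormatSupport(DXGI_FORMAT_R10G10B10A2_UNORM, &support);
    std::printf("R10G10B10A2 usable for display: %s\n",
                (support & D3D11_FORMAT_SUPPORT_DISPLAY) ? "yes" : "no");

    device->Release();
    return 0;
}
```

A fullscreen game that actually wants 10-bit output would then create its swap chain with that format; whether it is really scanned out at 10 bpc still depends on the driver, connection and panel.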
Can you make up your mind?
The only thing that requires a Quadro is 10-bit for OpenGL (which kind of sucks for Linux, but Linux has more pressing issues anyway).
And yes, 10-bit requires an end-to-end chain (application, driver, GPU output, cable and monitor) to work; that's in no way particular to Nvidia.
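For reference, the OpenGL restriction being discussed shows up at pixel-format/framebuffer-config selection: the application asks the driver for 10 bits per colour channel, which (per the point above) the consumer driver declined while the Quadro driver accepted. A rough probe on the Linux/GLX side might look like this (illustrative sketch only; assumes an X11 session with development headers, and the result depends on driver and desktop depth):

```cpp
// glx_10bit_probe.cpp -- illustrative sketch, build with: g++ glx_10bit_probe.cpp -lX11 -lGL
// Asks GLX how many framebuffer configs offer 10 bits per colour channel.
#include <cstdio>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main() {
    Display* dpy = XOpenDisplay(nullptr);
    if (!dpy) {
        std::fprintf(stderr, "cannot open X display\n");
        return 1;
    }

    // Request a window-renderable RGBA config with at least 10 bits
    // per colour channel (the usual 30-bit "deep colour" layout).
    const int attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RED_SIZE,      10,
        GLX_GREEN_SIZE,    10,
        GLX_BLUE_SIZE,     10,
        GLX_DOUBLEBUFFER,  True,
        None
    };

    int count = 0;
    GLXFBConfig* configs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);
    std::printf("10-bit GLX framebuffer configs available: %d\n", count);

    if (configs) XFree(configs);
    XCloseDisplay(dpy);
    return 0;
}
```

On a stock 24-bit desktop this will usually report zero regardless of GPU; any Quadro-vs-GeForce difference only shows up once the whole display chain is configured for 30-bit colour, which is exactly the end-to-end point above.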