# Do 10-bit monitors require a Quadro/Firepro card?



## Atnevon (Feb 19, 2014)

I cannot seem to find a simple answer to this question!

Anyway. In my latest display search I found some debate brewing over whether a monitor is true 8-bit or true 10-bit. Take these Dell panels, for instance:

Dell U2713HM
http://www.newegg.com/Product/Product.aspx?Item=9SIA0ZX1AJ7781

Dell U3014
http://www.newegg.com/Product/Product.aspx?Item=N82E16824260132

From what I gather, 10-bit displays need a Quadro or FirePro card. But are there instances where that isn't the case with the right GeForce card?!?

I also saw that for 2560x1440 it does not matter whether you use HDMI (rev 1.3 or higher) or DisplayPort.

(Also for AdobeRGB support you need HDMI 1.4 or DisplayPort)

I just wish this were laid out a bit simpler. Does anyone have good experience with these? I know either way a display upgrade from my TN Samsung will be fine. I was lucky enough to splurge on a 780 Ti recently, so I'm wondering if it's worth the extra leap in the panel for my photo/video work, in addition to the wind-down headshot sessions.


----------



## FordGT90Concept (Feb 19, 2014)

Require?  Most likely not; however, the most you can send it is 8-bit if you don't have a card that supports 10-bit color.

NVIDIA and AMD intentionally leave >8-bit support out of their consumer graphics cards because they want you to pay the premium for the workstation cards.

I'd look at a good IPS monitor with 8 bits per color.  The only way one can justify spending that kind of money on a >8-bit monitor is if it is coupled with a workstation card that can drive that increased palette.  Graphic artists might be able to justify it, as can the medical imaging industry, but it's really kind of moot outside of those instances.


----------



## Blín D'ñero (Feb 19, 2014)

No. Radeon cards have no problem sending 10-bits through DP or HDMI.
http://www.luminous-landscape.com/reviews/accessories/10bit.shtml


----------



## FX-GMC (Feb 19, 2014)

Blín D'ñero said:


> No. Radeon cards have no problem sending 10-bits through DP or HDMI.
> http://www.luminous-landscape.com/reviews/accessories/10bit.shtml



I haven't been able to find solid evidence that this is true.  Just a few articles that are supposed to show you how to enable 10-bit color and people replying that it didn't work.


----------



## Blín D'ñero (Feb 19, 2014)

http://www.rage3d.com/board/showpost.php?p=1337428124&postcount=5


----------



## FX-GMC (Feb 19, 2014)

Blín D'ñero said:


> http://www.rage3d.com/board/showpost.php?p=1337428124&postcount=5



Then we can safely assume that it works for all Radeon cards.  Good call.


----------



## Steevo (Feb 19, 2014)

Mine works fine.


----------



## repman244 (Feb 19, 2014)

So why do they put an "Enable 10-bit" option in the FirePro drivers then? It has to have some effect, no?

A thing to consider here is that very few monitors have NATIVE 10-bit support; most are 8-bit but use interpolation to "achieve" 10-bit.

EDIT: The only monitor I know of that is true native 10-bit is the HP LP2480zx, and it shows in the price.


----------



## Easy Rhino (Feb 19, 2014)

The easiest and best way to find out is to call the manufacturer and ask...


----------



## FX-GMC (Feb 19, 2014)

Easy Rhino said:


> The easiest and best way to find out is to call the manufacturer and ask...



lmao.  I do hope you are joking.


----------



## Easy Rhino (Feb 19, 2014)

FX-GMC said:


> lmao.  I do hope you are joking.



Not joking. It is very easy to call Dell support and ask them.


----------



## FX-GMC (Feb 19, 2014)

Easy Rhino said:


> Not joking. It is very easy to call Dell support and ask them.



I see.


----------



## Blín D'ñero (Feb 20, 2014)

FX-GMC said:


> Then we can safely assume that it works for all Radeon cards.  Good call.


I don't think so, and I didn't say that.

http://forums.anandtech.com/showthread.php?t=2358239
Of course, that thread also has some who claim it does not or cannot work.
But you could try the suggestions in its original post.

If it works, you should see no banding in the 10-bit test ramp picture (Google for it; I have it here somewhere) when it's opened in Photoshop (CS4 and up, I believe, so CS6 should definitely work) in Windows 7 or 8.

I haven't tried it myself because, alas, my U2410s (on my 5870 CF system) turned out not to truly support 10-bit; they only emulate it.  I seem to remember that my U3011 (on my 7970 CF system) is 10-bit, but that's my gaming PC (with no Photoshop or any 10-bit application installed), so I haven't tried there either and won't bother.

My guess is that even though it is not encouraged in the drivers, it is supported by the hardware, from the Radeon 4870 up to the latest Radeons. But I haven't tested this myself.
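If you can't find the ramp image, you can generate your own. Below is a minimal sketch (plain Python, standard library only; the filename and dimensions are my own choices, not from any standard test image). It writes a grayscale gradient with one column per 10-bit level as a 16-bit PGM, which Photoshop can open:

```python
# Sketch: generate a 10-bit grayscale test ramp as a 16-bit binary PGM.
# A true 10-bit pipeline should show a smooth gradient; an 8-bit pipeline
# collapses the 1024 levels into roughly 256 visible bands.

WIDTH, HEIGHT = 1024, 128

def make_ramp():
    # One 10-bit level (0..1023) per column, scaled into the 16-bit range
    row = [min(1023, x * 1024 // WIDTH) * 64 for x in range(WIDTH)]
    return [row] * HEIGHT

def write_pgm(path, pixels):
    # Binary PGM: "P5" header, maxval 65535, big-endian 16-bit samples
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n65535\n" % (WIDTH, HEIGHT))
        for row in pixels:
            for v in row:
                f.write(v.to_bytes(2, "big"))

ramp = make_ramp()
write_pgm("ramp_10bit.pgm", ramp)
```

Open the file in a 10-bit-aware application; if the gradient shows distinct vertical bands, you're only getting 8 bits end to end.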


----------



## Mussels (Feb 20, 2014)

my Sony HDTV says it's running 10-bit over HDMI every time it turns on connected to my 7970


----------



## 4ghz (Feb 20, 2014)

Can the average Joe see the difference between 8-bit color and 10-bit color?  I didn't think anyone could see the difference among a few million shades of color.


----------



## Steevo (Feb 20, 2014)

http://www.avsforum.com/t/1381724/official-4-4-4-chroma-subsampling-thread

http://documentation.apple.com/en/color/usermanual/index.html#chapter=1&section=3&tasks=true

HDMI 1.3:


- *Higher speed:* HDMI 1.3 increases its single-link bandwidth to 340 MHz (10.2 Gbps) to support the demands of future HD display devices, such as higher resolutions, Deep Color and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds.
- *Deep Color:* HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
- *Broader color space:* HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.
- *New mini connector:* With small portable devices such as HD camcorders and still cameras demanding seamless connectivity to HDTVs, HDMI 1.3 offers a new, smaller form factor connector option.
- *Lip Sync:* Because consumer electronics devices are using increasingly complex digital signal processing to enhance the clarity and detail of the content, synchronization of video and audio in user devices has become a greater challenge and could potentially require complex end-user adjustments. HDMI 1.3 incorporates automatic audio synching capabilities that allows devices to perform this synchronization automatically with total accuracy.
- *New HD lossless audio formats:* In addition to HDMI’s current ability to support high-bandwidth uncompressed digital audio and all currently-available compressed formats (such as Dolby® Digital and DTS®), HDMI 1.3 adds additional support for new lossless compressed digital audio formats Dolby TrueHD and DTS-HD Master Audio™.


----------



## Atnevon (Feb 20, 2014)

4ghz said:


> Can an average Joe see the difference between 8 bits color and 10 bits color?  I didn't think anyone can see the difference in a few million shades of color.



Average Joe? Not a bit. I'm a graphic designer/photographer, so being color-anal is a thing (when it needs to be).

I'll be honest and say that for designers or photographers who don't have a complete AdobeRGB workflow, with their work consistently going in tandem to precise printers, 10-bit isn't for them. This is why I've decided on a Dell U2713HM. It's still a MUCH better panel that lets me have a more accurate workflow, but also still quick enough that I can get a good round of CS:GO or Borderlands 2 in during downtime.



Easy Rhino said:


> Not joking. It is very easy to call Dell support and ask them.


In ordering my panel, the people on the phone told me I could not apply more than a 25%-off coupon (and I had a 30%-off one), so I had to use the online support chat. It took me nearly an hour and a half total to order this B. I did manage to get one (with some extra coupons I had) for $345, so I am damn happy.



Mussels said:


> my sony HDTV says its running 10 bit over HDMI every time it turns on connected to my 7970


Can you show a screenie of your connection? I'm curious what it's saying when it's connected. (Not doubting or calling you out at all; I'm just extremely curious.)




FordGT90Concept said:


> Graphic artists might be able to justify it as can the medical imaging industry but it's really kind of moot outside of those instances.


Yes and no, I think, in many contexts. Since my work now does not go to presses or magazines, color accuracy (to a SUPER level) isn't as important. In my research, a great IPS is WAY better than any TN any day. I found many 10-bit displays, but they all START at $1K!

I would imagine if the stakes were higher, then yes, I can foresee it. In my own little world, though, the only time I could justify a full 10-bit, Quadro, calibrated, detailed setup would be in an isolated room for full proofing and matching right before the work went to press.

Quadro cards themselves are affordable. The ones that can keep a good framerate up in an FPS though... ca...CHING!

The past while has been fun to research in all this nonetheless. Knowledge is power!


----------



## Mussels (Feb 20, 2014)

crap quality since it popped up so fast i had to use burst mode to catch the text

as requested: 10 bit HDMI from an AMD card


----------



## Aquinus (Feb 20, 2014)

Mussels said:


> crap quality since it popped up so fast i had to use burst mode to catch the text
> 
> as requested: 10 bit HDMI from an AMD card



Could you show us where in CCC you can set the color depth if the option is available or does it just default to 10bit and doesn't let you change it?


----------



## Kaynar (Feb 20, 2014)

The real question here is: do you NEED Adobe RGB (i.e., do you need 10-bit)? If you are just going to use sRGB, then you don't need a 10-bit panel; good 8-bit and 10-bit panels will both produce excellent sRGB performance, especially if you own a calibrator. That said, from my own experience I did find the Asus PA246Q's sRGB colours to be slightly oversaturated (which could be bad for someone who actually wants to do photography work in sRGB). I had a 10-bit Asus PA246Q over DP on my HD 7970, but was using sRGB. Worked fine.

And yes, the average Joe CAN see the difference between 8-bit and 10-bit, *when true 10-bit is actually being used and the picture was made for that colour format*.


----------



## Mussels (Feb 20, 2014)

Aquinus said:


> Could you show us where in CCC you can set the color depth if the option is available or does it just default to 10bit and doesn't let you change it?



no settings, that just shows up on the TV.


----------



## Steevo (Feb 21, 2014)

Windows does not run 10-bit natively, even though programs and the driver are capable of rendering it, so short of running the 10-bit gradient test in a 10-bit-capable program it means absolutely nothing. Even viewing Blu-ray at its 4:2:0 chroma subsampling is crap, yet many claim to have life-altering experiences with its clarity. 

In short, 10-bit is useful for exacting perfection in still photos when dealing with minute gradients, or for anal-retentive attention to colors with a hard splattering of OCD and too much time on your hands. 

Rendering an 8-bit color in 10-bit still results in the same color being displayed.
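That last point can be sketched numerically: the usual way to put 8-bit content on a 10-bit pipeline is bit replication, so the 256 input levels map onto exactly 256 of the 1024 available output levels and no new shades appear. (A minimal Python sketch; the function name is mine, not from any driver API.)

```python
# Why an 8-bit source gains nothing on a 10-bit display: depth expansion
# by bit replication preserves the endpoints but adds no in-between shades.

def expand_8_to_10(v8: int) -> int:
    """Bit-replication depth expansion: 0..255 -> 0..1023."""
    return (v8 << 2) | (v8 >> 6)

levels = {expand_8_to_10(v) for v in range(256)}
# Still only 256 distinct shades; the other 768 10-bit levels go unused.
```

Pure black and pure white stay pure (0 maps to 0, 255 maps to 1023), which is why this expansion is preferred over a plain multiply by 4.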


----------



## Blín D'ñero (Feb 21, 2014)

Steevo said:


> Windows does not run 10 bit natively even though programs and the driver are capable of rendering it, so short of running the 10 bit gradient test in a 10 bit capable program it means absolutely nothing. Viewing even Blu-Ray at its 4:2:0 color/chroma is crap, yet many claim to have life altering experiences with its clarity.
> 
> In short 10 bit is useful for exactly perfection in still photo when dealing with minute gradients or anal retentive attention to colors with a hard splattering of OCD and too much time on your hands.
> 
> Rendering a 8 bit color in 10 bit still results in the same color being displayed.




8 bits per color channel yield no more than 16.7 million colors (a fraction of the colors we perceive in the real world).  10 bits are needed to at least reach Adobe RGB without banding. Banding is so obvious everybody sees it.
Plus, cameras and printers already have a way larger color space than average (non-pro) displays, so yes, it's high time 10-bit displays became the standard.
So, if the hardware supports it, enjoy it!
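The arithmetic behind those figures, as a quick check (levels per channel, cubed across R, G and B):

```python
# Total displayable colors = (levels per channel) ** 3 for RGB.
shades_8bit = 2 ** 8            # 256 levels per channel
shades_10bit = 2 ** 10          # 1024 levels per channel

colors_8bit = shades_8bit ** 3    # 16,777,216  (~16.7 million)
colors_10bit = shades_10bit ** 3  # 1,073,741,824  (~1.07 billion)
```

So 10-bit gives 64 times as many colors, which is where the "over one billion colors" marketing line for Deep Color comes from.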


----------



## Steevo (Feb 23, 2014)

Blín D'ñero said:


> 8 bits per color channel result in no more 16 million colors (a fraction of the colors we perceive in the real world).  10 bits is needed to at least reach Adobe RGB without banding. Banding is so obvious everybody sees it.
> Plus, camera and printers already have a way larger color space than average (non-pro) displays so yes it's high time 10 bit displays become the standard.
> So, if the hardware is supporting it, enjoy it!


Due to the alteration in chroma, the banding is reduced significantly by dithering the two colors together. Will it truly matter to you or me, and will we notice that it's being done? Probably not.


----------



## Chitz (Feb 26, 2014)

This reminds me of those 60Hz vs 120Hz discussions on YouTube... good old days.


----------



## TheoneandonlyMrK (Feb 26, 2014)

AMD drivers allow you the option, but not where you are expecting it. 
In the pixel format menu of Catalyst I have, since 5870 days, been able to choose full (studio) RGB 10-bit, normal, or YCbCr whatever.
It's that option, and it is almost always there unless a really low-grade monitor is attached with no options in its EDID list.


----------



## Mussels (Feb 26, 2014)

theoneandonlymrk said:


> Amd drivers allow you the option but not as you are expecting.
> In the panel config menu of catalyst I have since 5870 days been able to choose full (studio) rgb10bit , ,normal or ygcn weva.
> It's that option and is almost always there unless a really low grade monitor is attached with no options via its edid list




oh, so the 'full' option here is 10 bit? or the 4:4:4 ones?


----------



## TheoneandonlyMrK (Feb 26, 2014)

Yeah, man. Most monitors I've used would work with it, but likely just emulate it.


----------



## handsome_stan (Jan 21, 2015)

From my understanding, if you want 10-bit color output from video editing programs, then you will most likely need a Quadro, because GeForce cards, despite being capable of 10-bit color in DirectX, don't support 10-bit in OpenGL (or CL... my memory is hazy).
That being said, a Quadro K620 is cheap but enough to drive the GUI monitor. The GeForce can then be used for processing the video in most edit/color/VFX applications that support GPU processing/rendering.



handsome_stan said:


> From my understanding if you want 10 bit color output from video editing programs then you will most likely need a quadro because geforce, despite being capable of 10 bit color support in directx, don't support 10bit in opengl or cl, my memory is poor.
> That being said a quadro k620 is cheap but enough to drive the GUI monitor. The geforce can be used for processing the video in most edit/color/vfx applications that support GPU processing/rendering.


Holy poop, how did I... I didn't even look at the date of the last post. My apologies.


----------



## Mussels (Jan 21, 2015)

Old thread, but we did confirm in another thread that AMD drivers support up to 12-bit colour now, at least on my 7970 and newer.

Its ok man, thread necros happen from time to time.


----------



## PanzerIV (Nov 17, 2016)

So after all, as of *November 2016*, what is the official word? I keep finding only old threads, and half the people say yes, others no, or they don't know... some think they have 10-bit when it's actually a simulated 10-bit. So do we NEED a FireGL/Quadro card to handle a native 10-bit monitor, and does Windows 10 64-bit support a 10-bit environment, or only some software like Adobe?

It would be quite useless if I were able to edit photos in 10-bit but could then only view them in 8-bit from, let's say, Windows Live Photo Gallery through Windows 10... (-_-). Those 10-bit monitors start at $1000 on sale, while we can have a really good used Korean 2560x1440 glossy IPS starting at only $200, or a higher-end model at about $350, at most $400.

*EDIT:*  Holy sh*t, what a coincidence: I haven't posted in a very long time, and I see that I just posted on the same date that I joined this forum, Nov 16th, 2009, lol.


----------



## TheoneandonlyMrK (Nov 17, 2016)

Mussels said:


> old thread, but we did confirm in another thread that AMD driver support upto 12 bit colour now, at least on my 7970 and newer
> 
> Its ok man, thread necros happen from time to time.


This answer still applies; my 4K does full RGB 12-bit from an RX 480 too, so I'll vouch for it.
Though I did also install full monitor drivers and mess with Windows colour profiles to actually get it to look right.


----------



## cdawall (Nov 17, 2016)

Mussels said:


> my sony HDTV says its running 10 bit over HDMI every time it turns on connected to my 7970



That's how the 10bit sony I use at work is. 23" 1080P 10bit little guy.


----------



## FordGT90Concept (Nov 17, 2016)

As for NVIDIA...
http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus

Windows 10-bit: yes, in the GTX 2## series; OpenGL 10-bit: no.

Not finding anything that suggests that has changed since.  If you're considering an NVIDIA card, best to contact NVIDIA directly.


----------



## Directrix (Nov 19, 2018)

My experience with 10 bit color was:
As a landscape photographer I bought a Dell U2410 10-bit screen for the extra gamut, and I installed the supplied color profiles for the screen into Windows.
Dual DVI or DisplayPort input was required to use the screen at full resolution.
The photos I edited on it looked good. However, they printed too dark and with color casts, and the white balance looked wrong when other people viewed the images on their computers, e.g. via Flickr.
Another issue: editing in Adobe software looked OK, but other software (such as Windows Photo Viewer) showed the images with strong color casts.
Five years later I bought an NVIDIA Quadro, installed the drivers, and used it with the Dell U2410. The colors have noticeably better gradation and better handling of high saturation.
My new images now have correct color balance when I export them for other people to view. Everything works correctly and the screen looks great; plus, the Quadro speeds up some image processing.

My advice is to only buy a 10-bit monitor if you have a Quadro (or FirePro) card. Trying to mix and match gives worse results than a standard monitor.
I spent a hundred hours correcting the colors of five years of photos. For some, I had deleted the raw files and could not correct the color without degrading the images. :-(


----------



## SoNic67 (Nov 20, 2018)

Yeah, "gaming" people always tend to put down the pro-level cards, with statements like "you don't need them, the gaming cards are so much better."
But those are mostly sour-grapes statements, made because they never actually used professional software that makes use of those drivers and cards.



theoneandonlymrk said:


> my 4k does full rgb 12 bit from an Rx 480 too so I'll vouch for it


Yeah, I am sure that's not true.


----------



## cdawall (Nov 20, 2018)

I have a Vega FE, which uses a pro driver and in theory should work just like a FirePro card for color reproduction, and a 1080 Ti. I can't tell the difference in color between them on my 10-bit Dell; I would be curious how that applies to accuracy on prints.

Both cards show an astonishing difference between 8 bit and 10 bit color however. That is night and day.


----------



## MrGenius (Nov 20, 2018)

cdawall said:


> I have a vega FE which uses a pro driver and in theory would work just like a firepro card for color reproduction and my 1080ti, can't tell the difference in color on my 10 bit dell would be curious how that applied to accuracy on prints.
> 
> Both cards show an astonishing difference between 8 bit and 10 bit color however. That is night and day.


I flashed a Vega FE 8GB BIOS on my Vega 64 and used the pro drivers to enable 10bpc. So yeah...no theory about it. You can do 10bpc with a Vega FE for sure.

I'm having trouble understanding what you mean by "can't tell the difference in color on my 10 bit dell" but "Both cards show an astonishing difference between 8 bit and 10 bit color". Can't tell the difference? Astonishing difference? So which is it? WTF? 



SoNic67 said:


> Yeah, "gaming" people always tend to put down the pro level cards, with statements like "you don't need them the gaming are so much better".
> But those are mostly statements made just because "grapes are sour" and because they never actually used a professional software that makes use of those drivers and cards.


You too. What the hell do you mean by that? You think pro cards are just better at everything?


----------



## qubit (Nov 20, 2018)

I remember 10-bit colour being touted for my bottom end FX5200 way back in 2003, but I never found a way to make it display it, so I doubt it was actually supported. I reckon the GPU physically had the capability though.


----------



## Jetster (Nov 20, 2018)

No


----------



## Mussels (Nov 20, 2018)

Closed the thread due to its age/necro


----------

