I think that's supported, but it's kind of a mess? Or at least, the interface itself isn't really meant to be used in 10-bit color modes (I've read complaints about everything looking washed out, that sort of thing). Kind of makes sense, since the vast majority of displays out there are 8-bit panels at best (or worse, like 6-bit+FRC), but it's annoying if you actually have the hardware for it.
Others have thrown around the idea that the color-management code (WDDM, DWM, and/or whatever else) would need a rewrite, which might affect drivers too, but that's just speculation; I can't say whether it holds.
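For what it's worth, an application can at least ask DXGI which display modes an output advertises in a 10-bit format. Here's a minimal sketch (my own example, not anything official; assumes Windows with the DXGI headers and dxgi.lib) that just counts the modes exposed as DXGI_FORMAT_R10G10B10A2_UNORM:

```cpp
// Sketch: count display modes advertised in a 10-bit-per-channel format.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            UINT count = 0;
            // Passing nullptr for the mode array just retrieves the count.
            output->GetDisplayModeList(DXGI_FORMAT_R10G10B10A2_UNORM, 0, &count, nullptr);
            printf("Adapter %u, output %u: %u modes advertised as 10-bit\n", a, o, count);
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

Whether the desktop compositor actually handles those modes gracefully end-to-end is a separate question, which is kind of the whole complaint above.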
Also, the two main GPU vendors used to limit 10-bit output to their professional models, I think? (Nvidia with Quadros, AMD with FirePro/Radeon Pro/whatever they call them now.)
EDIT: AMD only supports it over Dual-Link DVI or DisplayPort; they don't mention HDMI.