Thanks. But it's not only down to your eyes; it also depends on the monitor/panel you are using. For example, I can see the line between 252 and 253 on MVA; on the TN panel I can see all 3 lines, but only at absurd angles ... looking at it straight on there are no lines at all and it looks like one solid color. On IPS I can likewise only see the 1 line between 252 and 253. Then again, since you chose the bright end of the palette, the lack of visual difference might also be caused by your panel's brightness limitations... which is where the brighter light of HDR comes in?
For comparison I took the liberty of copying your idea with darker colors, and here I can see all 4 colors clearly on my 3 different displays (Dell IPS, Dell TN and BenQ MVA):
IMHO that example already illustrates quite clearly that, in the average scenario, the brightest and possibly also the darkest colors are simply clipped on ordinary cheap 8-bit monitors; NOT that you wouldn't see the difference if it were actually there!
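For anyone who wants to reproduce this kind of test on their own panel, here's a minimal sketch (assuming the Pillow library is installed; the exact gray values and filenames are just illustrative) that renders adjacent bands of near-identical values at both the bright and dark ends:

```python
# Minimal sketch of a banding test image generator, assuming Pillow (PIL).
# The gray values and output filenames are illustrative only.
from PIL import Image, ImageDraw

def make_band_test(values, band_width=200, height=300, path="bands.png"):
    """Render vertical bands of the given 8-bit gray values side by side."""
    img = Image.new("RGB", (band_width * len(values), height))
    draw = ImageDraw.Draw(img)
    for i, v in enumerate(values):
        draw.rectangle([i * band_width, 0, (i + 1) * band_width - 1, height - 1],
                       fill=(v, v, v))
    img.save(path)

# Bright end (the original test) and a darker copy of the same idea.
make_band_test([250, 251, 252, 253], path="bands_bright.png")
make_band_test([50, 51, 52, 53], path="bands_dark.png")
```

View the two files full-screen and from different angles; if your panel clips near white, the bright version collapses into one block while the dark one still shows four.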
I took your picture. I cut out one of the four colors. Tell me, which one is it?
Tired of waiting, it's 128.
Here's another image. Tell me if it's the same color or a different one (no cheating; I uploaded 3 different versions and files). Based on that, what number is it (from your spectrum)?
Last one: is this the same as the two above, the same as one of the two above, or completely different from the two above? What number is it?
I can't really assume that you're answering honestly, because you could use poor viewing angles to see differences in the colors. I can't even assume you'll be honest about not reading ahead. As such, I'll give you the benefit of the doubt and assume that you got all of these right. The answers, by the way, are 128-128-126.
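If anyone wants to run the same blind test on themselves, here's a rough sketch (again assuming Pillow; the filenames are arbitrary) that writes out the three solid swatches so you can shuffle and compare them:

```python
# Rough sketch of the blind-swatch test above, assuming Pillow (PIL).
# Shuffle or rename the files before viewing to keep yourself honest.
from PIL import Image

for name, gray in [("swatch_a.png", 128),
                   ("swatch_b.png", 128),
                   ("swatch_c.png", 126)]:
    Image.new("RGB", (400, 400), (gray, gray, gray)).save(name)
```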
What have we proven? In a static environment you can alter luminosity, and because of the underlying technology you might be able to tell the difference in a continuous spectrum. Now I'm going to need you to be honest with yourself here: how often do you actually see a continuous spectrum of color? While I'm waiting for that honesty, let's tackle the whole fallacy of the sunset image.
Sunsets are a BS test. They're a continuous spectrum, and by the nature of digital storage we're never going to have enough colors available to reproduce that continuous spectrum exactly. The absolute best we can ever hope for is a spectrum whose steps are indistinguishable. What has been demonstrated isn't that; what has been demonstrated is an image artifacted to hell by compression. Decompression can't recover a lot of the differences between the colors, leading to blockiness where neighboring spectrum values round to the same producible number. 10-bit won't fix that. 10-bit can't save an image compressed to hell. What 10-bit can do is make gaming colors more continuous, but there again I have to ask whether you can actually tell the difference between the above colors when they might appear on screen together for fractions of a second.
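To make the "steps in a spectrum" point concrete, here's a small sketch (plain Python; it models only the rounding to producible values, not JPEG-style compression artifacts) that quantizes an ideal 0..1 ramp at 8 and 10 bits and reports how many screen pixels end up sharing the same value:

```python
# Small sketch: how a continuous ramp quantizes at different bit depths.
# This models only the rounding step, not lossy compression artifacts.

def quantize_ramp(pixels, bits):
    """Quantize an ideal 0..1 ramp spread across `pixels` columns."""
    levels = (1 << bits) - 1
    return [round(i / (pixels - 1) * levels) for i in range(pixels)]

for bits in (8, 10):
    ramp = quantize_ramp(1920, bits)
    distinct = len(set(ramp))
    widest_block = max(ramp.count(v) for v in set(ramp))
    print(f"{bits}-bit: {distinct} distinct values across 1920 px, "
          f"widest flat block = {widest_block} px")
```

Across a 1920-pixel-wide ramp, 8-bit gives flat blocks several pixels wide while 10-bit narrows them to a pixel or two; neither rescues detail the codec already threw away.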
Let's keep all of this simple. Your example, as stated previously, is BS. The 3-to-8-bit difference being the same as 8-to-10 is laughable. The sunsets can be reproduced right now if my static image is compressed to hell first. You've shown one continuous spectrum, which is compressed to fit onto a screen. Tell me, how does one slider that's supposed to hold 256^3 colors fit on a screen with fewer than 1920 vertical pixels without being compressed?
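Just to put numbers on that, a rough back-of-the-envelope sketch (nothing more than arithmetic) of how many distinct colors such a slider has to collapse into each pixel column:

```python
# Back-of-the-envelope: how many distinct 8-bit RGB colors a gradient
# slider would have to pack into each pixel column of a 1920-px-wide screen.
total_colors = 256 ** 3            # every 8-bit RGB triplet
columns = 1920                     # pixel columns available to the slider
print(total_colors)                # 16777216
print(total_colors // columns)     # ~8738 colors per column
```

Even a single gray ramp (256 values) spread over a full-width slider has to repeat each value across several pixels; the full gamut simply can't be shown one color per pixel.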
I will state this again: 10-bit color matters less than increased frame rate, HDR color, and the number of pixels driven. If you'd like to argue that 10-bit is somehow a gigantic leap forward, which is what you did and continue to do, you've got some hurdles. You're going to have to justify said "improvement" with thousands of dollars in hardware, your software is going to have to support 10-bit, and you're going to have to do all of this while the competition just plugged a new card in and immediately saw improvements. Do you sell a new video card on the fact that you could invest thousands in new hardware to see mathematically and biologically insignificant gains, or do you buy the card because the next game you want to play will run buttery smooth? Please keep in mind that what you are arguing is that more colors are somehow better, before you respond with "I can tell the difference on my current monitor (which I would have to replace to actually see any benefit from 10-bit video data)."
This isn't a personal attack. This is me asking whether you understand that what you are arguing is silly in the mathematical and biological sense. It made sense to go from 5 bits per color to 8 bits per color. It doesn't make the same sense to go from 8 bits per color to 10 bits per color (as derived mathematically above), despite actually producing a lot more colors: you're just dividing the same spectrum further. HDR would increase the size of the spectrum. Pixel count (or, more specifically, density) would allow more color changes without incongruity. Going to 10 bits of color information doesn't do either, and its costs are huge (right now).
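For a quick sense of what "dividing the same spectrum further" means, here's a sketch of the per-step size at each depth as a fraction of the same 0..1 range (gamma/EOTF curves are ignored; they reshape the spacing but not the comparison):

```python
# Sketch: how finely each bit depth slices the same fixed range.
# Gamma/EOTF curves are ignored; they change the exact perceptual spacing,
# not the relative comparison between depths.
for bits in (5, 8, 10):
    levels = 1 << bits
    step_pct = 100 / (levels - 1)
    print(f"{bits:>2}-bit: {levels:>4} levels, step = {step_pct:.3f}% of the range")
```

The 5-to-8-bit jump shrinks the step from roughly 3.2% of the range to about 0.4%; the 8-to-10-bit jump only takes it from about 0.4% to about 0.1% of the same range.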
You're welcome to continue arguing the point; I've lost the drive to care. You're welcome to let your wallet talk, and I'll let mine talk. If Nvidia's offering isn't better, Polaris will be the first time I've seen a reason to spend money since the 7xxx series from AMD. It won't be for the 10-bit colors. It won't be for the HBM. It'll be because I can finally turn up the eye candy in my games, have them play back smoothly, and do all of this with less power draw. That's what sells me a card, and what sells most people their cards. If 10-bit is what you need, more power to you. It's a feature I won't even care about until Polaris demonstrates that it plays games better than what I've already got.