Monday, January 4th 2016
AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture
AMD provided customers with a glimpse of its upcoming 2016 Polaris GPU architecture, highlighting a wide range of significant architectural improvements, including HDR monitor support and industry-leading performance per watt. AMD expects shipments of Polaris architecture-based GPUs to begin in mid-2016.
AMD's Polaris architecture-based 14nm FinFET GPUs deliver a remarkable generational jump in power efficiency. Polaris-based GPUs are designed for fluid frame rates in graphics, gaming, VR and multimedia applications running on compelling small form-factor thin and light computer designs.
"Our new Polaris architecture showcases significant advances in performance, power efficiency and features," said Lisa Su, president and CEO, AMD. "2016 will be a very exciting year for Radeon fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group."
The Polaris architecture features AMD's 4th-generation Graphics Core Next (GCN) architecture, a next-generation display engine with support for HDMI 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K H.265 encoding and decoding.
AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020.
88 Comments on AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture
For comparison, I took the liberty of copying your idea with darker colors, and here I can see all 4 colors clearly on my 3 different displays (Dell IPS, Dell TN and BenQ MVA):
That example, IMHO, already illustrates quite clearly that in the average scenario the brightest, and possibly also the darkest, colors are simply cut off on ordinary cheap 8-bit monitors; NOT that you wouldn't see the difference if it were actually there!
I took your picture. I cut out one of the four colors. Tell me, which one is it?
Tired of waiting, it's 128.
Here's another image. Tell me if it's the same color, or a different one (no cheating, I uploaded 3 different versions and files). Based upon that, what number is it (off of your spectrum)?
Last one: is this the same as the two above, the same as one of the two above, or completely different from the two above? What number is it?
I can't really assume that you're answering honestly, because you could utilize poor viewing angles to see differences in the colors. I can't even assume you'll be honest about not reading ahead. As such, I'll give you the benefit of the doubt and assume that you got all of these right. The answers, by the way, are 128-128-126.
What have we proven? In a static environment you can alter luminosity, and because of the underlying technology you might be able to tell the difference in a continuous spectrum. Now I'm going to need you to be honest with yourself here: how often do you see a continuous spectrum of color? While I'm waiting for that honesty, let's tackle the whole fallacy of the sunset image.
Sunsets are a BS test. They're taking a continuous spectrum, and by the nature of digital storage, they're never going to have the same amount of colors available to reproduce that continuous spectrum. The absolute best we can ever hope for is producing a spectrum whose steps are indistinguishable. What has been demonstrated isn't that; what has been demonstrated is an image artifacted to hell by compression. Decompression can't account for a lot of the differences between the colors, leading to blockiness where multiple spectrum values round to the same producible number. 10-bit won't fix that. 10-bit can't save an image compressed to hell. What 10 bit can do is make gaming colors more contiguous, but there again I have to ask you whether you can actually tell the difference between the colors above when they might appear on screen together for fractions of a second.
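If you want to see the compression point in numbers, here's a rough sketch (Python/NumPy; the 6-bit re-quantization is only a crude stand-in for what real lossy compression does):

```python
# Rough sketch: quantizing a gradient that has already lost precision.
# The point: once "compression" has collapsed neighbouring values together,
# re-encoding at 10 bits cannot bring the detail back.
import numpy as np

ramp = np.linspace(0.0, 1.0, 1920)  # one scanline of a smooth sunset gradient

def quantize(signal, bits):
    """Round a 0..1 signal to the nearest level representable at `bits` depth."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

# Crude stand-in for lossy compression: throw away precision first.
compressed = quantize(ramp, 6)  # only 64 distinct levels survive

print(len(np.unique(quantize(ramp, 8))))         # 256 steps: visible banding
print(len(np.unique(quantize(ramp, 10))))        # 1024 steps: finer banding
print(len(np.unique(quantize(compressed, 10))))  # still 64: 10 bit can't recover it
```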
Let's keep all of this simple. Your example, as stated previously, is BS. The claim that the 3-to-8-bit difference is the same as 8-to-10 is laughable. The sunsets can be reproduced right now if my static image is compressed to hell first. You've shown one continuous spectrum, which is compressed to fit onto a screen. Tell me, how does one slider that's supposed to have 256^3 colors fit on a screen fewer than 1920 pixels across without being compressed?
I will state this again. 10 bit colors matter less than increased frame rate, HDR colors, and the number of pixels driven. If you'd like to argue that 10 bit is somehow a gigantic leap forward, which is what you did and continue to do, you've got some hurdles. You're going to have to justify said "improvement" with thousands of dollars in hardware, your software is going to have to support 10 bit, and you're going to have to do all of this when the competition just plugged a new card in and immediately saw improvements. Do you sell a new video card on the fact that you could invest thousands on new hardware to see mathematically and biologically insignificant gains, or do you buy the card because the next game you want to play will run buttery smooth? Please mind that what you are arguing is that more colors are somehow better, before you respond with "I can tell the difference on my current monitor (which I would have to replace to actually see benefits from 10 bit video data)."
This isn't a personal attack. This is me asking if you understand that what you are arguing is silly in the mathematical and biological sense. It made sense to go from 5 bits per color to 8 bits per color. It doesn't make the same sense to go from 8 bits per color to 10 bits per color (mathematically derived above), despite actually producing a lot more colors. You're just dividing the same spectrum further. HDR would increase the spectrum size. Pixel count (or more specifically, density) would allow more color changes without incongruity. Increasing to 10 bits of color information doesn't do the same, and its costs are huge (right now).
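The quick arithmetic, for anyone who wants it (a sketch, nothing authoritative):

```python
# Step size per channel over the same fixed (non-HDR) 0..1 range.
# 5 -> 8 bits shrinks the step roughly 8x; 8 -> 10 bits only another ~4x.
for bits in (5, 8, 10):
    levels = 2 ** bits
    step = 1.0 / (levels - 1)  # normalized distance between adjacent levels
    print(f"{bits:>2} bits/channel: {levels:>4} levels, step = {step:.5f}, "
          f"{levels ** 3:,} total colors")
```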
You're welcome to continue arguing the point. I've lost the drive to care. You're welcome to let your wallet talk, and I'll let mine talk. If Nvidia's offering isn't better, Polaris will be the first time I've seen a reason to spend money since the 7xxx series from AMD. It won't be for the 10-bit colors. It won't be for the HBM. It'll be because I can finally turn up the eye candy in my games, have them play back smoothly, and do all of this with less power draw. That's what sells me a card, and what sells most people their cards. If 10-bit is what you need, power to you. It's a feature I don't even care about until Polaris demonstrates playing games better than what I've already got.
The 380 and 390 are strong products as well.
HDR (the contrast ratio between black and white) is far more important than the color depth (8-bit or 10-bit) as far as I'm concerned. You can clearly see the benefits of HDR; you can't clearly see the benefits of 10-bit under most circumstances.
HDR doesn't require 10 bit. HDR functionally makes the scale for color bigger. @Xzibit's graph does an excellent job showing it. Mathematically, 64,64,64 on an HDR monitor might equate to 128,128,128 on a standard monitor. This will increase the range between different values, which will make 10 bit relevant because the colors will be separated by more than on an 8-bit monitor. At the same time, if you need x to make y relevant, then y on its own is significantly less relevant. This is even more true when x- and y-enabled devices are still perched well out of the reasonable consumer pricing zone.
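To illustrate (and only illustrate) that last point, here's a toy model: the same 8-bit grey step, stretched over a wider brightness range. The gamma 2.2 curve and the 100/1000 nit peaks are assumptions of mine for the sake of the example; real HDR signalling (PQ/ST 2084) is considerably more involved.

```python
# Toy model: the same code-value step covers more brightness on a wider-range panel.
SDR_PEAK_NITS = 100.0    # assumed peak for an ordinary monitor
HDR_PEAK_NITS = 1000.0   # assumed peak for an HDR monitor

def nits(code, bits, peak):
    """Approximate luminance of a grey code value on a simple gamma 2.2 display."""
    return peak * (code / (2 ** bits - 1)) ** 2.2

print(nits(128, 8, SDR_PEAK_NITS) - nits(127, 8, SDR_PEAK_NITS))    # ~0.4 nits
print(nits(128, 8, HDR_PEAK_NITS) - nits(127, 8, HDR_PEAK_NITS))    # ~3.7 nits: 10x coarser
print(nits(512, 10, HDR_PEAK_NITS) - nits(511, 10, HDR_PEAK_NITS))  # ~0.9 nits: 10 bit helps here
```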
No, just no. You're telling me that despite having my current 7970 GPU on the same process node as the Fury and the 980, they are worth buying. You're saying that a purchase of $500+ is warranted to get marginally better performance. You're of course glossing over the near release of Pascal and Polaris (in a thread about Polaris, no less) to tell me that both of these cards represent a worthwhile delta in performance for their substantial price tag. Either you don't understand the value of money, or you have a definition of a worthwhile reason to upgrade that allowed you to upgrade through each and every one of Intel's recent processors, despite the popular consensus being that Skylake is really the first time we've had a CPU worth upgrading to since Sandy Bridge. Let me make this clear though: buying a new card today, I'd go with your selections. New versus upgrading is an entirely different consideration though.
I think you've adequately demonstrated both a personal lack of effort by not reading my comment, and an understanding of the situation that I can't connect to any rational basis. As such, I'm going to ask just one thing from you: show me any proof of your statements and logic. deemon put forward the effort to find an infographic. If you're going to make a claim, support it. Math, graphics, infographics, and the like make your points salient. You've come to the discussion believing that you don't even have to try to understand the opposite argument or fully comprehend what they are saying.
In contrast, I believe I understand where deemon is coming from. It's a technically correct point, more color variations is better. What it isn't is a point which I feel should be used to sell monitors, or demonstrated with misleading material. This is where we differ, and it seems to be an irreconcilable point. There isn't a right answer here, but there are plenty of things that can be contradicted if said without comprehending their meaning.
Supporting it normally is more likely. The EDID of a monitor/TV can tell the GPU/device what it's capable of. The user would have the option to enable/disable it.
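For what it's worth, a rough sketch of what reading that capability could look like; the byte-20 layout below is from my reading of the EDID 1.4 spec, so treat the offsets and bit fields as an assumption to verify rather than gospel:

```python
# Sketch: pull the declared bits-per-colour out of an EDID 1.4 blob.
# Byte 20 is the "video input parameters" byte; for digital displays,
# bits 6-4 encode the panel bit depth (per my reading of the spec).
BIT_DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def panel_bit_depth(edid: bytes):
    """Return the declared bits per colour of a digital display, or None."""
    video_input = edid[20]
    if not video_input & 0x80:      # bit 7 clear -> analog input, no depth field
        return None
    return BIT_DEPTHS.get((video_input >> 4) & 0b111)  # None if undefined/reserved

# e.g. on Linux: panel_bit_depth(open("/sys/class/drm/card0-DP-1/edid", "rb").read())
```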
Are you seriously saying that Fury cards are only "marginally" faster than the 7970? That was buried in a wall of text, when the main point could have been expressed in a single sentence: compression will eat up those extra bits in 10 bit.
Well, we'll see about that. It depends on what one means by "require". In my book, if you have a 10 bit panel (and all HDR panels I've heard about are 10 bit, and to my knowledge it will stay like that for quite a while), 10 bit support in your graphics card is required.
You continue to be ignorant. Technically the 290 was an improvement on the 7970. The question isn't what is better, but where the price to performance increase is reasonable. Going from a 7970 (with a healthy overclock) to a Fury isn't reasonable. This is the same argument you seem to be constantly making, under the assumption that there's no such thing as a cost to benefit weighting on money spent. Heck, HDR and 10-bit together are objectively better than what we have now, but I'm not rushing out to spend thousands of dollars on new hardware. If you really wanted to continue this logic you should own the Intel enthusiast platform's X offering, because everything else compromises core count and thus is worse.
You now try to summarize my entire point in a sentence, and utterly fail. Congratulations, you can't see the other side and it is therefore wrong; third-grade-level reasoning there. What I've been saying is that 10 bit offers more distinction between already hard-to-differentiate colors. Your monitor is probably 1920x1080 (judging from sales figures). There are therefore more colors in the spectrum than can adequately fit on a horizontal line. How do you believe that's done? All you need to do is remove a ton of them and round the remaining color values so that (2^8)^3 = 256^3 = 16,777,216 colors fit into 1920 pixels. Do you not see the hypocrisy there? Better yet, I explicitly stated that HDR would change the limitations of colors (pure colors would be beyond the traditional 256-value range) such that HDR would make 10-bit more reasonable, yet you've somehow glossed over that conclusion.
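The arithmetic behind that, if you want it spelled out (just a sketch):

```python
# A full-gamut strip across a 1920-pixel-wide screen can only show 1920 of the
# 16,777,216 possible 8-bit colors, so thousands collapse onto each pixel column
# before 10 bit even enters the picture.
total_colors = (2 ** 8) ** 3        # 16,777,216
screen_width = 1920

print(total_colors)                  # 16777216
print(total_colors // screen_width)  # 8738 colors per pixel column, roughly
```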
I'd like you to ask yourself one question. @deemon is not an idiot, and has a point with some foundation. Why then did they follow up the spectrum with 4 color blotches? I'll give you a hint: it was the most intelligent move I could have imagined, and while I don't agree with the point, the argument was a laudable response to me. What was demonstrated was an argument that our current spectrum doesn't do enough to differentiate color values. It didn't rely on compression or crappy math. It displayed the point simply and concisely. If you saw a marked difference between the four colors, you have to admit that 10 bit color values would have erased that distinction by having so many more in between them that the colors wouldn't be distinguishable.
I don't buy the argument because I don't watch the solid color polygon network. Motion is what our eyes are better at discerning, which is why frame rate and pixel count are more important. Deemon is free to let their money push for 10 bit color. My money is going to push for higher pixel counts and frame rates. Right now, 10 bit carries a huge price premium on the hardware side, which is another big hit against it being particularly useful. Whenever 10 bit and HDR capable monitors are reasonably priced (i.e., priced reasonably enough that most monitors/TVs use the technology), then 10 bit will be viable. At that point, @deemon will be 100% correct. For now, with Polaris a few months from release, it's not a concern. Unsurprisingly, I only care about now, because if I forever lived in the future I'd never be able to buy anything and be happy with its performance.
I asked a simple question, "Are you seriously saying that Fury cards are only 'marginally' faster than the 7970?", and I want to see a simple answer to it, like YES (look, it's merely X% faster) or NO.
I read your wall of text and didn't find the answer to that.
Now to the newly introduced "reasonably priced" argument.
MSRP of the 7970 was $550.
MSRP of the Fury Nano (though any would do) is $499.
Fury STOMPS all over the 7970 performance-wise; it's nowhere near a "marginal" improvement. The 290X was +30-40% over the 7970 on average, and Furys are faster than that. Among other things.
To me, this is the least compelling benefit it brings.
Much more interesting is the wider gamut. And why do I need to address your own objections to your own arguments? I don't need to see bars to get that point.
Now that I've taken the effort to say that, let's define why. In a single game, the 290 was capable of going from 70 FPS to 95 FPS (gpuboss.com/gpus/Radeon-R9-290X-vs-Radeon-HD-7970). That "30-40%" improvement means nothing when the difference is based upon setting everything to maximum and playing at a constant resolution. You've pulled numbers that mean very little when I'm literally already playing with all of the eye candy set to maximum. If the 7970 could only get 70 FPS at moderate settings, and the 290X could get 95 at high settings, I'd gladly agree with you. The problem here is that numbers aren't useful without context, which you are either incapable of providing or too lazy to provide. I'll say this much: the reviewers here said that the 290X was a waste at any resolution less than 2560x1600, and I'd have to agree.
I'd like to follow that up with the comment that you are either being intentionally obtuse, or revision of history is acceptable to you. MSRP today and MSRP at release aren't the same thing. You quote the release price of the 7970, but the newly discounted price of the Fury is on display (release was $550 or $650). Tell me, isn't the best bang for your buck then going to be the oldest high-end card out there? If I were to follow that logic, a 7970 can be had today for a couple hundred dollars. The Fury is now $500. That means the price difference is 250%. Can the Fury perform at 250% of the output of a 7970? Nope.
Finally, you don't get 10-bit at all. Please stop talking like you do. The wider range of colors is all HDR. Let's make the example simple. You've got a red LED. Said LED can be signaled between 0 and 2 volts. The steps between on and off (0-2 volts) in 8 bit would be 0.0078 volts wide. This means that 1 volt would be 128. Now we've got 10 bit color. The LED is powered the same way, 0-2 volts. The difference is that each step is now 0.0020 volts wide. That means 1 volt would now be 512. In both cases, for 1 volt we've got different numbers, but the color is the same exact value. That's the difference between 10 bit and 8 bit color encoding (grossly simplified).
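In numbers (same toy LED, same assumptions as above):

```python
# Same 0-2 V drive range; only the granularity of the control value changes,
# not the brightest or darkest output the LED can produce.
V_MAX = 2.0

step_8bit = V_MAX / 256      # ~0.0078 V per code, as above
step_10bit = V_MAX / 1024    # ~0.0020 V per code

print(step_8bit, step_10bit)
print(1.0 / step_8bit)       # 1 volt lands at code 128 in 8 bit
print(1.0 / step_10bit)      # 1 volt lands at code 512 in 10 bit
```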
You quote a "wider gamut" of colors. The earlier post by Xzibit explains that is what HDR does. Let me give you a clue, HDR is High Dynamic Range; what you are talking about is a greater range in colors. I'm not sure how that eluded your detection, but I think we can agree that the logic is easy to comprehend now.
Why are you continuing to argue the point? I can only assume that you're one of the people who upgraded to the Fury or the 2xx or 3xx series cards: cards that were developed on the same node as the 7970, but pushed further from the factory to help differentiate them when performance gains from minor structural changes proved insignificant. I will say that the 9xx series from Nvidia is excellent for DX11 performance. The 3xx series from AMD is doing a very good job driving huge monitors. The problem with both is that their improvements aren't noticeable in gaming, despite benchmarks telling us otherwise. Polaris and Pascal are setting themselves up as a huge leap in performance that will be appreciable. If you'd like to argue otherwise, I implore you to go spend the money. The entire point here was to discuss Polaris, but you've made it about why I should support an insanely expensive feature and buy a GPU today. I can't understand why, but you seem hell-bent.