Monday, January 4th 2016
AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture
AMD provided customers with a glimpse of its upcoming 2016 Polaris GPU architecture, highlighting a wide range of significant architectural improvements, including HDR monitor support and industry-leading performance per watt. AMD expects shipments of Polaris architecture-based GPUs to begin in mid-2016.
AMD's Polaris architecture-based 14nm FinFET GPUs deliver a remarkable generational jump in power efficiency. Polaris-based GPUs are designed for fluid frame rates in graphics, gaming, VR and multimedia applications running on compelling small form-factor thin and light computer designs.
"Our new Polaris architecture showcases significant advances in performance, power efficiency and features," said Lisa Su, president and CEO, AMD. "2016 will be a very exciting year for Radeon fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group."
The Polaris architecture features AMD's 4th generation Graphics Core Next (GCN) architecture, a next-generation display engine with support for HDMI 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K h.265 encoding and decoding.
AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020.
88 Comments on AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture
It shows 850E before the voltage, perhaps meaning an 850 MHz core clock in an energy-saving or efficiency mode?
Considering the 950 in one review, with some overclock, was pulling roughly 100 W by itself, and they are saying the whole tower in the video was pulling 150-160 W for the Nvidia 950 system, compared to their 86 W total... that means the Polaris GPU itself was only pulling around 30-40 W.
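A rough sanity check of that math, assuming the rest of the platform (CPU, board, drives, PSU losses) draws about the same in both systems; the wattage figures are the ones quoted above, not independent measurements:

```python
# Back-of-the-envelope estimate of Polaris GPU draw from the demo's
# system-level numbers (figures quoted in the comments, not measured here).
nvidia_system_w = (150, 160)   # whole-tower draw of the GTX 950 system
gtx950_alone_w = 100           # reviewed GTX 950 draw with a mild overclock
amd_system_w = 86              # whole-tower draw of the Polaris system

# Platform draw implied by the Nvidia system (total minus the card itself)
platform_w = tuple(total - gtx950_alone_w for total in nvidia_system_w)  # ~50-60 W

# Subtract that platform draw from the AMD total to isolate the Polaris card
polaris_w = tuple(amd_system_w - p for p in platform_w)  # ~26-36 W
print(f"Implied Polaris draw: {min(polaris_w)}-{max(polaris_w)} W")
```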
Can they keep up what seems like a three-week cadence (or quicker) until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better pixels / HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it just seems they're letting stuff go too rapidly, and will there be enough to keep this beat going for 4-6 months? I mean, I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, and AMD is either left plugging holes (denying) or has to bring things forward to cover it.
It sounds like they wanted this released now, freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit before its time. They could have covered Polaris and how it differentiates from current GCN, the display engine's support for HDMI 2.0a and DP 1.3, and the multimedia features for 4K h.265, without an actual (system-to-system) perf/W comparison against the competition; that seemed to offer a little too much at this stage.
If you'll note, the power consumption figures are for the entire system driving a single 1080p monitor. That means the theoretical savings from the GPU should be 154 - 86 = 68 Watts. That seems a little bit high, especially considering that none of the new features of that GPU are being utilized. Given those numbers, the extra 6/8-pin power connector is dang near not needed. What I find even funnier is that instead of showing their own progress (say, a 3xx-series card versus this new one), they compare against Nvidia tech that will be outdated before they come to market.
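For context on the connector remark, a quick sketch against the standard PCI Express power budget (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin); the 68 W delta is the system-level figure from this comment, not a measured GPU draw:

```python
# Standard PCI Express power budget per source (spec limits, not measurements)
PCIE_SLOT_W = 75     # x16 slot can deliver up to 75 W
SIX_PIN_W = 75       # one 6-pin connector adds up to 75 W
EIGHT_PIN_W = 150    # one 8-pin connector adds up to 150 W

implied_gpu_delta_w = 154 - 86  # the ~68 W system-level difference cited above

# A card drawing on the order of 68 W fits within the slot budget alone,
# which is why the extra 6/8-pin connector looks barely necessary here.
print(f"Slot-only headroom: {PCIE_SLOT_W - implied_gpu_delta_w} W")
print(f"Slot + 6-pin ceiling: {PCIE_SLOT_W + SIX_PIN_W} W")
```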
This is depressing fluff. Instead of showing what they've got, they're measuring against an old stick. It could well be fear that Nvidia will release truly amazing cards with Pascal, but I'd hazard that this is more smoke screen than outright fear. Say that your cards are great before the competition can respond, and then, when they bring out numbers, you can target your lineup and its pricing to compete well. I'm saddened to think that AMD PR thinks this is necessary.
If AMD came forward with a demonstration where we could see two or three monitors running games, I might be happier. I would even accept 4K video as a reasonable demonstration (especially over a single cable, rather than the painful setups we have to make now). What they've shown us right now is that an unknown card, in an unknown segment, can compete well against a relatively low-end card from the competitor that has been available for some time.
Sorry, but you don't buy a 950 for gaming unless you've got a very tight budget. I'd like to see two cards that we can say are direct price competitors in the $200-300 range square off. That's where the computer enthusiast looks to spend their money, not on something just adequate for current gaming.
Expect most news to be beneficial for TV and Mobile. Both will save their in-depth reveals for their own shows later on.
4K HDR TVs will be limited to 30 Hz over HDMI 2.0a for full support, but manufacturers will likely dumb it down to 8-bit, as they currently do with 4K, until HDMI gets better. DisplayPort 1.3 should have no problem with full 4K HDR support at 60 Hz.
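To put some numbers behind that, a rough bandwidth sketch using the standard CTA 4K timing (4400 x 2250 total pixels per frame) and nominal post-encoding link rates; real limits also depend on audio and protocol overhead that this ignores:

```python
# Rough link-bandwidth check for the 4K HDR claim above.
# Effective (post-8b/10b) payload rates, in Gbit/s:
HDMI_2_0 = 18.0 * 0.8      # ~14.4 Gbit/s
DP_1_3   = 32.4 * 0.8      # ~25.9 Gbit/s (HBR3, 4 lanes)

def video_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Raw video payload including blanking, in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# Standard CTA 4K timing is 4400 x 2250 total (3840 x 2160 active).
fourk60_10bit = video_gbps(4400, 2250, 60, 30)  # ~17.8, 10-bit RGB/4:4:4
fourk30_10bit = video_gbps(4400, 2250, 30, 30)  # ~8.9
fourk60_8bit  = video_gbps(4400, 2250, 60, 24)  # ~14.3

print(f"4K60 10-bit fits HDMI 2.0?  {fourk60_10bit <= HDMI_2_0}")  # False
print(f"4K30 10-bit fits HDMI 2.0?  {fourk30_10bit <= HDMI_2_0}")  # True
print(f"4K60 10-bit fits DP 1.3?    {fourk60_10bit <= DP_1_3}")    # True
```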
HDMI 2.0a
DisplayPort 1.3
4K h.265 encode/decode
YAY! :D I want it NOW! :D
I was looking forward to owning "Greenland," awesome place. Not that it's really relevant, but oh well.
EDIT: NVM, according to another news link it seems this is simply the name for the 4th-gen GCN architecture. They still have Arctic Islands.
What it technically means is 10/12 bits per color channel instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, you get 1024*1024*1024 = 1,073M colors in the 10-bit case and 4096*4096*4096 = 68,719M colors in the 12-bit case.
Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D ), and if you made such a gradient across your 1080p screen from left to right, each shade would be a 7.5-pixel-wide stripe. With 10-bit color you would have 1024 shades and each stripe would be 1.875 pixels wide, so a much, much smoother gradient. With 12 bits you could draw that gradient across a 4K screen and every pixel column would get its own shade.
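A small sketch of that gradient arithmetic, using the same screen widths and bit depths as above:

```python
# Gradient-banding arithmetic from the comment: how wide is each distinct
# shade if you sweep black-to-white across the screen horizontally?
def gradient_stripe_px(screen_width_px, bits_per_channel):
    shades = 2 ** bits_per_channel            # distinct grey levels available
    total_colors = shades ** 3                # full RGB palette size
    stripe = screen_width_px / shades         # width of each band, in pixels
    return shades, total_colors, stripe

for width, label in [(1920, "1080p"), (3840, "4K")]:
    for bpc in (8, 10, 12):
        shades, colors, stripe = gradient_stripe_px(width, bpc)
        print(f"{label} @ {bpc}-bit: {shades} shades, "
              f"{colors:,} colors, {stripe:.3f} px per band")
```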
However, I must say that those "HDR photos" I have seen on the interwebs and in journals, for example
stuckincustoms.smugmug.com/Portfolio/i-8mFWsjn/1/900x591/953669278_349a6a9897_o-900x591.jpg
www.imgbase.info/images/safe-wallpapers/photography/hdr/41108_hdr_hdr_landscape.jpg
although they look beautiful, don't exactly look natural... so I am a bit puzzled about what this AMD HDR stuff would mean for the picture itself.
If anyone has a better explanation, please correct me :)
HDR on TVs seems to be a reference to the 4K standards plus improved contrast ratio on the panels.
I think TN monitors should just disappear, or only be used in entry-level products, much like has already happened with phones. Most decent phones have IPS/AMOLED or similar tech.
On the performance side, I can't wait to see what FinFET brings to the table. There should be an amazing improvement over the last generation. And I really like that they are continuing with GCN, which means most GCN cards will still get support.
Source: I own a 144Hz TN gaming monitor...
Hopefully this new generation of AMD GPUs will bring new displays to market (AMD hinted at cooperation with different display manufacturers on "HDR displays") with 10/12-bit support that don't cost an arm and a leg. And have FreeSync support. And are IPS... and are OLED... and are 21:9... and are 144+ Hz... and are curved. Too many things to look for when shopping for displays.
I obviously have an IPS at 60 Hz, and a pretty old one as well, but I still like it a lot despite its pitfalls, which are mostly the lack of 120 Hz and FreeSync/G-Sync.
I don't know what to say. I occasionally visit electronics stores, and every time I pass the shelf with monitors I can instantly tell which is TN and which is IPS just from the viewing angles, and my guess is that they sell quite new monitors there; in fact, some of the IPS panels on display are those crazy-wide monitors, which are a very new gimmick.
I would also like more hertz and a tearing-free experience, but not at the expense of color fidelity and viewing angles. If I cannot have both, then I prefer IPS.
I think companies should stop investing in TN and focus more on making better technology affordable and available to everybody.
BTW, prices are not as you say, from what I see. If you compare the same brand, a high-refresh TN monitor is around 70% of the price of a good 144 Hz IPS (MG279Q), and some TN gaming monitors, like the PG278Q, are even more expensive than the IPS.
For sure you will not notice the viewing angles during gaming. For me, I don't care about anti-aliasing, for example, and I game with it disabled most of the time, although in some games I could easily enable it without dropping under 60 frames. If you look carefully at a static image you will notice it, but during movement and action... not really.
I'm also pretty sure that the low response time does make a difference, and I do plan to move to 120 Hz myself, but somehow I still find it hard to let go of my old monitor, which still works perfectly and has served me well for so many years.