
AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,306 (7.52/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
AMD provided customers with a glimpse of its upcoming 2016 Polaris GPU architecture, highlighting a wide range of significant architectural improvements, including HDR monitor support and industry-leading performance per watt. AMD expects shipments of Polaris architecture-based GPUs to begin in mid-2016.

AMD's Polaris architecture-based 14nm FinFET GPUs deliver a remarkable generational jump in power efficiency. Polaris-based GPUs are designed for fluid frame rates in graphics, gaming, VR and multimedia applications running on compelling small form-factor thin and light computer designs.

"Our new Polaris architecture showcases significant advances in performance, power efficiency and features," said Lisa Su, president and CEO, AMD. "2016 will be a very exciting year for Radeon fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group."

The Polaris architecture features AMD's 4th generation Graphics Core Next (GCN) architecture, a next-generation display engine with support for HDMI 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K h.265 encoding and decoding.


AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020.

 
Joined
Nov 4, 2005
Messages
12,016 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Compared to a 950, with no mention of price, but at only 0.85 V according to the specs, so that is good I suppose, unless it's a very cherry-picked sample with a very custom-tuned BIOS for power consumption.

It shows 850E before the voltage, perhaps meaning an 850 MHz core clock in an energy-saving or efficiency mode?

Considering a 950 with some overclock was pulling roughly 100 W by itself in a review, and they are saying the whole tower in the video was pulling 150-160 W for the Nvidia 950 system, compared to their 86 W total... that means the GPU alone was only pulling 30-40 W.
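A rough back-of-the-envelope version of that estimate (my own arithmetic; it assumes both demo systems were otherwise identical and that the quoted figures are accurate, which AMD hasn't confirmed):

```python
# Assumed figures from the demo coverage and a typical GTX 950 review, not official data.
nvidia_system_w = (150, 160)   # reported wall draw of the GTX 950 system
polaris_system_w = 86          # reported wall draw of the Polaris system
gtx950_gpu_w = 100             # rough GPU-only draw for an overclocked GTX 950

for total in nvidia_system_w:
    rest_of_system = total - gtx950_gpu_w            # CPU, board, drives, fans...
    polaris_gpu = polaris_system_w - rest_of_system  # whatever is left for the Polaris card
    print(f"950 system at {total} W -> Polaris GPU roughly {polaris_gpu} W")
# Prints roughly 26-36 W, i.e. the 30-40 W ballpark above.
```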
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,121 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
Too early for anything but PR speculation. HBM2 and the halved node size will bring massive efficiency gains regardless. But happily, it seems the head man is doing the talking, not some marketing type, so this could be a very good portent of things to come.
 
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
It sure seems RTG has started a constant drum beat... and it feels a little too early.

Can they keep this seemingly three-week cadence (or quicker) going until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better-pixels ~ HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it just seems they're letting stuff go too rapidly, and will there be enough to keep this beat going for 4-6 months? I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, leaving AMD either having to plug holes (deny) or to pull things forward to cover it.

It sounds like they wanted this released now and are freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit "before its time". They could've covered Polaris and how it differentiates from earlier GCN, with the display engine supporting HDMI 2.0a and DP 1.3 and the multimedia features for 4K H.265, without coming out with an actual (system-to-system) perf/W comparison against the competition; that seemed like offering a little too much at this stage.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,121 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
It sure seems RTG has started a constant drum beat... and it feels a little too early.

Can they keep this seemingly three-week cadence (or quicker) going until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better-pixels ~ HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it just seems they're letting stuff go too rapidly, and will there be enough to keep this beat going for 4-6 months? I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, leaving AMD either having to plug holes (deny) or to pull things forward to cover it.

It sounds like they wanted this released now and are freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit "before its time". They could've covered Polaris and how it differentiates from earlier GCN, with the display engine supporting HDMI 2.0a and DP 1.3 and the multimedia features for 4K H.265, without coming out with an actual (system-to-system) perf/W comparison against the competition; that seemed like offering a little too much at this stage.

Well, Nvidia have an event tomorrow so perhaps AMD jumped first for the PR? 2016 could be very dog eat dog.
 
Joined
Apr 2, 2011
Messages
2,850 (0.57/day)
I'm too wary to take this at face value.

If you'll note, the power consumption figures are for a single 1080p monitor and the entire system. That means the theoretical savings from the GPU should be 154 - 86 = 68 watts. That seems a little high, especially considering that none of the new features of that GPU are being utilized. Given those numbers, the extra 6/8-pin power connector is dang near not needed. What I find even funnier is that instead of showing their progress (say, a 3xx-series card versus this new one) they compare against what will be outdated Nvidia tech by the time they come to market.
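For what it's worth, here is that connector argument spelled out; the PCIe limits (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin) are spec values, while the ~90 W GTX 950 board power is just a typical review figure, not anything AMD showed:

```python
# Sketch only: the 68 W system delta applied to a rough GTX 950 board-power figure.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150  # PCIe power budgets

system_delta_w = 154 - 86     # wall-power difference quoted above
gtx950_board_w = 90           # assumed typical GTX 950 board power
polaris_estimate_w = gtx950_board_w - system_delta_w

print(f"Implied Polaris board power: ~{polaris_estimate_w} W")
print(f"Slot power alone would cover it: {polaris_estimate_w <= SLOT_W}")
# ~22 W, comfortably under the 75 W slot budget, hence "the 6/8-pin is dang near not needed".
```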

This is depressing fluff. Instead of showing what they've got, they're measuring against an old stick. It could well be fear that Nvidia will release truly amazing cards with Pascal, but I'd hazard that this is more smokescreen than outright fear. Say your cards are great before the competition can respond, and then when they bring out numbers you can target your lineup and its pricing to compete well. I'm saddened to think that AMD PR thinks this is necessary.


If AMD came forward with a demonstration where we could see 2 or 3 monitors, running games, I might be happier. I would even accept 4k video as a reasonable demonstration (especially over a single cable, rather than the painful setups we have to make now). What they've shown us right now is that an unknown card, in an unknown segment, can compete well against a relatively low end card from the competitor that has been available for some time.

Sorry, but you don't buy a 950 for gaming unless you've got a very tight budget. I'd like to see two cards that we can say are direct price competitors in the $200-300 range square off. That's where the computer enthusiast looks to spend their money, not on something just adequate for current gaming.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
It sure seems RTG has started a constant drum beat... and it feels a little too early.

Can they keep this seemingly three-week cadence (or quicker) going until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better-pixels ~ HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it just seems they're letting stuff go too rapidly, and will there be enough to keep this beat going for 4-6 months? I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, leaving AMD either having to plug holes (deny) or to pull things forward to cover it.

It sounds like they wanted this released now and are freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit "before its time". They could've covered Polaris and how it differentiates from earlier GCN, with the display engine supporting HDMI 2.0a and DP 1.3 and the multimedia features for 4K H.265, without coming out with an actual (system-to-system) perf/W comparison against the competition; that seemed like offering a little too much at this stage.

CES is going to be heavy on HDR over HDMI 2.0a, with all the TV manufacturers pushing it this year. PC folks will finally have DP 1.3. All that is left is product support from TVs and monitors once the GPUs are made available. HDMI 2.0 was short-lived and horribly supported, by only a limited number of expensive TVs.

Expect most news to be beneficial for TV and Mobile. Both will save their in-depth reveals for their own shows later on.
 
Last edited:
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
From AnandTech:

As for RTG’s FinFET manufacturing plans, the fact that RTG only mentions “FinFET” and not a specific FinFET process (e.g. TSMC 16nm) is intentional. The group has confirmed that they will be utilizing both traditional partner TSMC’s 16nm process and AMD fab spin-off (and Samsung licensee) GlobalFoundries’ 14nm process, making this the first time that AMD’s graphics group has used more than a single fab. To be clear here there’s no expectation that RTG will be dual-sourcing – having both fabs produce the same GPU – but rather the implication is that designs will be split between the two fabs. To that end we know that the small Polaris GPU that RTG previewed will be produced by GlobalFoundries on their 14nm process, meanwhile it remains to be seen how the rest of RTG’s Polaris GPUs will be split between the fabs.

Unfortunately what’s not clear at this time is why RTG is splitting designs like this. Even without dual sourcing any specific GPU, RTG will still incur some extra costs to develop common logic blocks for both fabs. Meanwhile it's also not clear right now whether any single process/fab is better or worse for GPUs, and what die sizes are viable, so until RTG discloses more information about the split order, it's open to speculation what the technical reasons may be. However it should be noted that on the financial side of matters, as AMD continues to execute a wafer share agreement with GlobalFoundries, it’s likely that this split helps AMD to fulfill their wafer obligations by giving GlobalFoundries more of AMD's chip orders.

Seems that TSMC still has some secret GPU sauce.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,121 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
CES is going to be heavy on HDR over HDMI 2.0a, with all the TV manufacturers pushing it this year. PC folks will finally have DP 1.3. All that is left is product support from TVs and monitors once the GPUs are made available. HDMI 2.0 was short-lived and horribly supported, by only a limited number of expensive TVs.

Expect most news to be beneficial for TV and Mobile. Both will save their in-depth reveals for their own shows later on.

HDR a go-go. I was looking at 4K OLED TVs but read that the current crop aren't HDR compatible, which is what the next (4K) Blu-ray will use. So yeah, I think that will be a huge focus this year, as you say.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
HDR a go-go. I was looking at 4K OLED TVs but read that the current crop aren't HDR compatible, which is what the next (4K) Blu-ray will use. So yeah, I think that will be a huge focus this year, as you say.

Another thing to look out for is that HDR requires higher bandwidth; HDMI 2.0a doesn't look to be capable of handling 4K Rec. 2020 10-bit HDR @ 60 Hz. I suspect all of this year's models (with the exception of some $12,000+ models) will suffer the same drawback almost all 4K TVs still have of just being a resolution increase with no proper support.

4K HDR TVs will be limited to 30 Hz on HDMI 2.0a for full support, but manufacturers will likely dumb it down to 8-bit, as they currently do with 4K, until HDMI gets better. DisplayPort 1.3 should have no problem with full 4K HDR support at 60 Hz.
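A rough bandwidth check of that claim (treat it as a sketch: the usable rates assume 8b/10b overhead, roughly 14.4 Gbps for HDMI 2.0/2.0a and 25.92 Gbps for DP 1.3 HBR3, and standard CTA 4K timing of 4400x2250 total pixels including blanking):

```python
# Uncompressed RGB / 4:4:4 video bit rate, blanking included.
def video_gbps(h_total, v_total, refresh_hz, bits_per_channel, channels=3):
    return h_total * v_total * refresh_hz * bits_per_channel * channels / 1e9

HDMI20_USABLE = 14.4   # Gbps after 8b/10b (18 Gbps raw TMDS)
DP13_USABLE   = 25.92  # Gbps after 8b/10b (32.4 Gbps raw HBR3)

for hz, bpc in [(60, 8), (60, 10), (30, 10)]:
    need = video_gbps(4400, 2250, hz, bpc)
    print(f"4K {hz} Hz {bpc}-bit: ~{need:.1f} Gbps | "
          f"HDMI 2.0a ok: {need <= HDMI20_USABLE} | DP 1.3 ok: {need <= DP13_USABLE}")
# 4K60 10-bit needs ~17.8 Gbps: too much for HDMI 2.0a, fine for DP 1.3.
# Dropping to 30 Hz (or to 8-bit / chroma subsampling) brings it back under HDMI's limit.
```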
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000MHz
Video Card(s) Asus Dual GeForce RTX 4070 Super (2800MHz @ 1.0 V, ~60MHz overclock, -0.1 V)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
It shows 850E before the voltage, perhaps meaning an 850 MHz core clock in an energy-saving or efficiency mode?

Considering a 950 with some overclock was pulling roughly 100 W by itself in a review, and they are saying the whole tower in the video was pulling 150-160 W for the Nvidia 950 system, compared to their 86 W total... that means the GPU alone was only pulling 30-40 W.
It's AMD, so you can't take anything they say at 100% face value until it's proven by independent reviews.
It sure seems RTG has started a constant drum beat... and it feels a little too early.
Yeah, typical AMD starting up the hype train, which could come to a screeching halt if they poke the green monster too much.
What I find even funnier is that instead of showing their progress (say, a 3xx-series card versus this new one) they compare against what will be outdated Nvidia tech by the time they come to market.
This is depressing fluff. Instead of showing what they've got, they're measuring against an old stick. It could well be fear that Nvidia will release truly amazing cards with Pascal,
I think AMD should stick to comparing against their own cards instead of Nvidia's. If you look on Nvidia's site, they just compare their cards to their own cards, not AMD's.
 
Last edited:
Joined
Aug 15, 2008
Messages
5,941 (0.99/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
Blah blah blah, beat Nvidia so I can play with team Red again. Please.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
0:39

HDMI 2.0a
DisplayPort 1.3
4K h.265 encode/decode

YAY! :D I want it NOW! :D
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
0:39

HDMI 2.0a
DisplayPort 1.3
4K h.265 encode/decode

YAY! :D I want it NOW! :D

Now if they can only give us a single card that can pull off 4K HDR @ 60+ fps at ultra settings in any game, in a Fury Nano-sized card. That would be something! :)
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I think 4K HDR @ 60+ FPS is entirely plausible. I think the PCB will be nano-sized (because HBM) but the air coolers on it would have to be larger to prevent thermal throttling. A little more than Fury X performance from a 380-sized card is likely.
 
Joined
Aug 20, 2007
Messages
21,552 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
So much for "Arctic Islands" naming scheme?

I was looking forward to owning "Greenland," awesome place. Not that it's really relevant, but oh well.

EDIT: NVM, it seems according to another newslink, this is simply the name for the 4th gen GCN architecture. They still have arctic islands.
 
Last edited:
Joined
Nov 21, 2007
Messages
3,688 (0.59/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
I'm not super TV/monitor savvy beyond the basics. I keep reading about HDR... the only HDR I know of is the lighting setting in games.
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
I'm not super TV/monitor savvy beyond the basics. I keep reading about HDR... the only HDR I know of is the lighting setting in games.

https://en.wikipedia.org/wiki/High-dynamic-range_imaging

What it technically means is 10/12 bits per color channel, instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, with 10 bits you get 1024*1024*1024 = 1,074M colors, and with 12 bits 4096*4096*4096 = 68,719M colors.

Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D ), and if you drew such a gradient across your 1080p screen from left to right, each shade would get a stripe 7.5 pixels wide. With 10-bit color you would have 1024 different shades and each stripe would be 1.875 pixels wide => a much, much smoother gradient. With 12 bits you could draw such a gradient across a 4K screen and every pixel column would have its own shade.
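A minimal sketch of that arithmetic (purely illustrative, nothing AMD-specific):

```python
# Distinct levels per channel, total RGB colors, and grey-step width in a full-width gradient.
def shades(bits):
    return 2 ** bits

def total_colors(bits):
    return shades(bits) ** 3

for bits, width_px in [(8, 1920), (10, 1920), (12, 3840)]:
    stripe = width_px / shades(bits)
    print(f"{bits}-bit: {total_colors(bits):,} colors, "
          f"{shades(bits)} grey steps -> {stripe:.3f} px per step across {width_px} px")
# 8-bit : 16,777,216 colors, 7.5 px per step on a 1080p-wide gradient
# 10-bit: 1,073,741,824 colors, 1.875 px per step
# 12-bit: 68,719,476,736 colors, under 1 px per step even across a 4K-wide screen
```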

However, I must say that those "HDR photos" I have seen on the interwebs and in journals, for example
http://stuckincustoms.smugmug.com/Portfolio/i-8mFWsjn/1/900x591/953669278_349a6a9897_o-900x591.jpg
http://www.imgbase.info/images/safe-wallpapers/photography/hdr/41108_hdr_hdr_landscape.jpg

although they look beautiful, don't exactly look natural... so I am a bit puzzled what this AMD HDR stuff would mean for the picture itself.

If anyone has a better explanation, please correct me :)
 
Last edited:
Joined
Nov 21, 2007
Messages
3,688 (0.59/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
~
What it technically means is 10/12 bits per color channel, instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, with 10 bits you get 1024*1024*1024 = 1,074M colors, and with 12 bits 4096*4096*4096 = 68,719M colors.

Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D ), and if you drew such a gradient across your 1080p screen from left to right, each shade would get a stripe 7.5 pixels wide. With 10-bit color you would have 1024 different shades and each stripe would be 1.875 pixels wide => a much, much smoother gradient. With 12 bits you could draw such a gradient across a 4K screen and every pixel column would have its own shade.
~
If anyone has a better explanation, please correct me :)

Ahh thanks, so that would explain why in really dark scenes my TV tends to have somewhat noticeable black "shade" lines, as I call them, where one area of black on the screen is distinguishable from another instead of a smooth transition. Of course, my TV is from 2012. Here I thought they called 16.7 million colors "true color" because that was all the colors recognizable to the human eye.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
https://en.wikipedia.org/wiki/High-dynamic-range_imaging

What it technically means is 10/12 bits per color channel, instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, with 10 bits you get 1024*1024*1024 = 1,074M colors, and with 12 bits 4096*4096*4096 = 68,719M colors.

Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D ), and if you drew such a gradient across your 1080p screen from left to right, each shade would get a stripe 7.5 pixels wide. With 10-bit color you would have 1024 different shades and each stripe would be 1.875 pixels wide => a much, much smoother gradient. With 12 bits you could draw such a gradient across a 4K screen and every pixel column would have its own shade.

However, I must say that those "HDR photos" I have seen on the interwebs and in journals, for example
http://stuckincustoms.smugmug.com/Portfolio/i-8mFWsjn/1/900x591/953669278_349a6a9897_o-900x591.jpg
http://www.imgbase.info/images/safe-wallpapers/photography/hdr/41108_hdr_hdr_landscape.jpg

although they look beautiful, don't exactly look natural... so I am a bit puzzled what this AMD HDR stuff would mean for the picture itself.

If anyone has a better explanation, please correct me :)

HDR on cameras is multiple exposures merged into one image.
HDR on TVs seems to be a reference to the 4K standards plus improved contrast ratio on panels.
 
Last edited:
Joined
Apr 18, 2015
Messages
234 (0.07/day)
although they look beautiful, don't exactly look natural... so I am a bit puzzled what this AMD HDR stuff would mean for the picture itself.

Another question is how those HDR 10-bit-per-channel pictures would look on a "gaming" TN panel, which in most cases has 6 bits per channel.

I think these TN monitors should just disappear, or only be used in entry-level products, much like what has already happened with phones. Most decent phones have IPS/AMOLED or similar tech.

On the performance side, I can't wait to see what FinFET brings to the table. There should be an amazing improvement over the last generation. And I really like that they're continuing with GCN, which means most existing GCN cards should still get support.
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Yet another one bashing TN panels because reasons. Good luck finding a 144 Hz IPS screen that doesn't cost a fucking 1k €. But you can get TN ones with otherwise the same specs for a quarter of the price. And color/angle-wise they aren't that much worse. Stop thinking of TN panels from 2005 and comparing them to those released in 2015...

Source: I own a 144Hz TN gaming monitor...
 
Last edited:
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Another question is how those HDR 10-bit-per-channel pictures would look on a "gaming" TN panel, which in most cases has 6 bits per channel.

99% of currently available monitors don't benefit from this, be they TN, IPS or MVA. There are a few uber-expensive "pro" displays... that you can use with Nvidia Tegra/FirePro GPUs, I believe.

Hopefully this new generation of AMD GPUs will bring new displays to market (AMD has hinted at cooperation with various display manufacturers on "HDR displays") with 10/12-bit support that don't cost an arm and a leg. And have FreeSync support. And are IPS... and are OLED... and are 21:9... and are 144+ Hz... and are curved. Too many things to look for when shopping for displays.
 
Last edited:
Joined
Apr 18, 2015
Messages
234 (0.07/day)
Yet another one bashing TN panels because reasons. Good luck finding a 144 Hz IPS screen that doesn't cost a fucking 1k €. But you can get TN ones with otherwise the same specs for a quarter of the price. And color/angle-wise they aren't that much worse. Stop thinking of TN panels from 2005 and comparing them to those released in 2015...

Source: I own a 144Hz TN gaming monitor...

:)

I obviously have an IPS at 60 Hz, and a pretty old one at that, but I still like it very much despite its pitfalls, which are mostly the lack of 120 Hz and FreeSync/G-Sync.

I don't know what to say; I occasionally visit electronics stores, and every time I pass the monitor shelf I can instantly tell which is TN and which is IPS just from the viewing angles, and my guess is that they sell fairly new monitors there. In fact, some of the IPS panels on display are those crazy-wide monitors, which are a very new gimmick.

I would also like more hertz and a tearing-free experience, but not at the expense of color fidelity and viewing angles. If I cannot have both, then I prefer IPS.
I think companies should stop investing in TN and focus more on making the better technology affordable and available to everybody.

BTW, prices are not as you say from what I see: if you compare the same brand, a high-refresh TN monitor is about 70% of the price of a good 144 Hz IPS (MG279Q), and there are some TN gaming monitors that are even more expensive than the IPS, like the PG278Q.
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Viewing angles are meaningless for gaming, imo. For comfortable gaming you're facing the monitor dead on anyway. Besides, even if you lean a bit to either side, trust me, in the heat of battle you'll NEVER notice tiny gradients of color that are a bit off. And with pixel response times of 1 ms (TN) compared to 5 ms (IPS), there's zero shadowing. When I first brought it home, the image was so sharp in motion it was weird to look at, because it stayed sharp even during insane motion (Natural Selection 2). Or the road in NFS Hot Pursuit 2010: I could actually see the road texture sharply where on my old monitor it was just a blurry mess, and that was a 2 ms 75 Hz gaming screen. But it was an older TN panel and it showed its age a bit.
 