
AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Viewing angles are meaningless for gaming imo. For comfortable gaming you're facing the monitor dead on anyway.

http://www.lagom.nl/lcd-test/viewing_angle.php
Put your browser into fullscreen (F11) and scroll down a bit so your entire display is covered with the "lagom" text on a grey background.
I see huge color distortion on my TN panel even when viewing "dead on anyway", so much so that at the top of the screen the red text is already cyan. On my IPS and MVA panels I do not see this color distortion in the same picture.
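For anyone curious what such a test pattern is actually doing, here's a minimal sketch (Python with numpy and Pillow; the filename and sizes are my own placeholders, and it's inspired by lagom-style gamma tests rather than being a copy of that page). It builds a dithered half and a solid half that should only match when the panel's effective gamma is correct, and effective gamma is exactly what shifts with viewing angle on TN panels:

```python
import numpy as np
from PIL import Image

W, H = 800, 400
img = np.zeros((H, W, 3), dtype=np.uint8)

# Left half: 1-pixel alternating black/white rows -> averages to 50% linear luminance.
img[0::2, : W // 2, :] = 255

# Right half: a solid gray that encodes ~50% linear luminance at gamma 2.2.
gray = round(255 * 0.5 ** (1 / 2.2))  # ~186
img[:, W // 2 :, :] = gray

Image.fromarray(img, "RGB").save("viewing_angle_test.png")  # assumed output name
print("View at 100% zoom, fullscreen; the halves should only match head-on.")
```

View the result at 100% zoom, since any scaling blurs the dithered rows and spoils the comparison; doing the same thing per color channel is what reveals the red-to-cyan tint people describe.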
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Wishing I hadn't looked at that. Man, my monitor is crap! :laugh:

On the viewing angle and gamma test, it's red at the bottom, gets choppy about 2/3 of the way up, and turns cyan above that. On the viewing angle and brightness test, I see shades of violet instead of a solid color.


Yeah, I don't know if they're just keeping it a secret or if they've got issues. But keeping it a secret is more likely; if there were issues we'd probably have heard.
I doubt there are "issues" with the TSMC process; they are just way behind Samsung, which is why AMD can demo (and likely deliver) the next generation of GPUs first.
 
Joined
Jun 13, 2012
Messages
1,412 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
I doubt there are "issues" with the TSMC process; they are just way behind Samsung, which is why AMD can demo (and likely deliver) the next generation of GPUs first.
That, or AMD is feeling the pressure of the rumored April release of Pascal? If it does release in under 3 months, AMD likely won't follow for another 3 after that, so they're trying to create some buzz and get people to hold out.
 
Joined
Aug 15, 2008
Messages
5,941 (0.99/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
http://www.lagom.nl/lcd-test/viewing_angle.php
Put your browser into fullscreen (F11) and scroll down a bit so your entire display is covered with the "lagom" text on a grey background.
I see huge color distortion on my TN panel even when viewing "dead on anyway", so much so that at the top of the screen the red text is already cyan. On my IPS and MVA panels I do not see this color distortion in the same picture.
That's a neat test.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I'm just going to say this: 14 nm FinFET is already sort of a letdown, and has already been shown to consume more power than TSMC's 16 nm process, in literally the most Apples to Apples comparison possible... see what I did there?

So I'm not holding my breath about power consumption; AMD has already kind of shot themselves in the foot by picking a less power-efficient process.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
http://www.lagom.nl/lcd-test/viewing_angle.php
Put your browser into fullscreen (F11) and scroll down a bit so your entire display is covered with the "lagom" text on a grey background.
I see huge color distortion on my TN panel even when viewing "dead on anyway", so much so that at the top of the screen the red text is already cyan. On my IPS and MVA panels I do not see this color distortion in the same picture.

The text looks exactly the same even from very extreme left and right angles. It also looks the same from a very extreme top-down angle. The only time the text becomes cyan is when looking at it from an extreme bottom angle, meaning I'd have to have my beard touching the desk. Let me just say I NEVER look at the display from that angle. From the other three extreme angles there's only a slight brightness change across the entire image.

In a normal position, maybe 1-2 cm of the top-left corner turns slightly cyan. When you factor in the fact that you never have a monochrome image like this in games or in Windows, it becomes an entirely irrelevant "problem".

My monitor is an ASUS VG248QE with a TN panel, 144 Hz refresh and 1 ms pixel response. Yeah, that's how far TNs have come. Stop bloody comparing them to TNs from 2005 already.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
The text looks exactly the same even from very extreme left and right angles. It also looks the same from a very extreme top-down angle. The only time the text becomes cyan is when looking at it from an extreme bottom angle, meaning I'd have to have my beard touching the desk. Let me just say I NEVER look at the display from that angle. From the other three extreme angles there's only a slight brightness change across the entire image.

In a normal position, maybe 1-2 cm of the top-left corner turns slightly cyan. When you factor in the fact that you never have a monochrome image like this in games or in Windows, it becomes an entirely irrelevant "problem".

My monitor is an ASUS VG248QE with a TN panel, 144 Hz refresh and 1 ms pixel response. Yeah, that's how far TNs have come. Stop bloody comparing them to TNs from 2005 already.

Actually, in-game the color shifting is worse because games ignore any such color calibration profiles and settings. People just seem to tolerate it because they buy these as gaming or budget monitors. VA panels have slight color shifts too, but nowhere near the degree you'll find on TNs.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I'm just going to say this: 14 nm FinFET is already sort of a letdown, and has already been shown to consume more power than TSMC's 16 nm process, in literally the most Apples to Apples comparison possible... see what I did there?

So I'm not holding my breath about power consumption; AMD has already kind of shot themselves in the foot by picking a less power-efficient process.
Got linky?
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Got linky?

All over the net. Apple used TSMC's 16 nm FinFET and Samsung's 14 nm FinFET to make the A9 chips, then put those chips into identical phones. The phones with Samsung chips get 20-30% less battery life.

http://arstechnica.com/apple/2015/10/samsung-vs-tsmc-comparing-the-battery-life-of-two-apple-a9s/

The interesting thing is that the real difference seems to come out in very heavy, CPU-intensive tasks. So, if you make a GPU, there is a good chance the load power consumption will be higher using Samsung's 14 nm than TSMC's 16 nm.
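As a rough sanity check on what a 20-30% battery-life gap implies for power draw, here's a back-of-envelope sketch (my own illustrative arithmetic, assuming identical battery capacity and a steady load for the whole test):

```python
# Same battery, same steady load: a runtime shorter by fraction x implies an
# average power draw higher by 1 / (1 - x) - 1.
for shortfall in (0.20, 0.30):
    extra_power = 1 / (1 - shortfall) - 1
    print(f"{shortfall:.0%} shorter runtime -> ~{extra_power:.0%} higher average draw")
# Prints roughly: 20% shorter -> ~25% higher draw, 30% shorter -> ~43% higher draw.
```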
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Last edited:

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Samsung never made a Cortex-A9. Well, they did, but the smallest process it was on was 32 nm. :confused:

Is there a specific model I should be looking at? The newer Samsungs (e.g. Galaxy S6) run on Cortex-A53 or Cortex-A57.

Considering their operating systems are completely different and Apple does its own thing for GPUs (for Metal), I don't think it's really a 1:1 comparison.


Edit: The Samsung Galaxy S6 has 8 cores in a big.LITTLE arrangement--nothing like Apple's. Actually, it appears the Apple A9 is only a dual core. Samsung's processor should have substantially more performance when power saving and when under heavy load.

Edit: It would be best to compare APL0898 to APL1022. Here we go: http://bgr.com/2015/10/08/iphone-6s-a9-processor-samsung-tsmc-batterygate/
 
Last edited:

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Edit: It would be best to compare APL0898 to APL1022.

The Apple A9 made by Samsung, to the Apple A9 made by TSMC...

The identical chip, in identical phones; the only difference is 14 nm FinFET vs. 16 nm FinFET. The Samsung chip clearly uses more power. It doesn't get more identical than that.

It's a 3 watt difference.

It could be that the reviewer they keep quoting got a high-leakage one. Bad luck.

It isn't just one reviewer; the results have been confirmed all over the net. Even Apple confirmed there is a 2-3% difference in normal use.

It also isn't a 3 Watt difference; not sure where that came from... I don't even think these chips use 3 Watts at full load.
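A quick back-of-envelope check supports that. The numbers below are my own assumed round figures (a roughly iPhone 6s-sized battery and about five hours of runtime in a sustained heavy-load battery test), not measurements:

```python
battery_wh = 6.5   # assumed pack energy, roughly an iPhone 6s battery
runtime_h = 5.0    # assumed runtime under a sustained heavy-load battery test
avg_phone_power_w = battery_wh / runtime_h
print(f"Average whole-phone draw under that load: ~{avg_phone_power_w:.1f} W")
# ~1.3 W for the entire phone, so a 3 W difference in the SoC alone can't be right.
```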
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
It isn't just one reviewer; the results have been confirmed all over the net. Even Apple confirmed there is a 2-3% difference in normal use.

It also isn't a 3 Watt difference; not sure where that came from... I don't even think these chips use 3 Watts at full load.

I was wrong; I was looking at something else. Serves me right for playing around with 4K scaling on a 24" monitor. It was tolerable on a 27", though.
 
Last edited:

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
If the chips are truly identical in design, you'd think the only way they'd get ~2% different performance is if the Samsung part is running at a ~2% lower clock (e.g. 1.813 GHz). Even so, that's not good on the power consumption side of things.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
So AMD going with Samsung isn't really confirmed yet.

Remember that in the case of Apple's chips, 16 nm TSMC is significantly better (I recall 20%-ish less power consumption; correct me if I'm wrong) than 14 nm Samsung.

I think AMD should stick to comparing to their own cards instead of Nvidia's. If you look at Nvidia's site, they just compare their cards to their own cards, not AMD's.
Logical reasoning is strong within you.

AMD's power consumption was badmouthed into oblivion even though the mainstream cards were nowhere near as bad; e.g. the 380X consumed about 20%-ish more total power while also being faster.
If they compared only to their own cards, the uneducated public wouldn't figure out that it is much better than nVidia's.

I've got to ask a fundamental question here. Do more colors really matter?
It's also brightness/contrast, so, yep, it does.
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
I've got to ask a fundamental question here. Do more colors really matter?

(attached: simulated comparison images illustrating the difference between 8-bit and 10-bit gradients)
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64

Except it isn't actually that bad; those are extreme exaggerations, not what it really looks like. The funny thing is, every single one of those images is an 8-bit image...
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Except it isn't actually that bad; those are extreme exaggerations, not what it really looks like. The funny thing is, every single one of those images is an 8-bit image...

Those are simulated images to illustrate the difference.

A 10-bit image will show banding on an 8-bit monitor. Most people, if they haven't recently bought a monitor, are likely on a 6-bit+FRC panel and don't even know it.
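If you want to see where that banding comes from on your own machine, here's a small sketch (numpy/Pillow; the output filenames and the crude FRC model are my own stand-ins, not how any particular panel actually dithers). It renders the same smooth ramp at 8 bits, at 6 bits, and with a rough FRC-style temporal dither:

```python
import numpy as np
from PIL import Image

W, H = 1024, 128
ramp = np.tile(np.linspace(0.0, 1.0, W), (H, 1))  # ideal smooth horizontal ramp, 0..1

def quantize(x, bits):
    """Snap values to the nearest representable level at the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

ramp8 = quantize(ramp, 8)   # 256 levels per channel: faint banding
ramp6 = quantize(ramp, 6)   # 64 levels per channel: obvious banding

# Very crude FRC stand-in: add sub-step noise before 6-bit quantization and average
# several "frames", the way a 6-bit+FRC panel flickers between adjacent levels.
frames = [quantize(ramp + (np.random.rand(H, W) - 0.5) / 63, 6) for _ in range(8)]
ramp6_frc = np.mean(frames, axis=0)

for name, data in (("ramp_8bit", ramp8), ("ramp_6bit", ramp6), ("ramp_6bit_frc", ramp6_frc)):
    rgb = np.repeat((data * 255).astype(np.uint8)[:, :, None], 3, axis=2)
    Image.fromarray(rgb, "RGB").save(f"{name}.png")  # assumed output names
    print(name, "distinct gray levels in one row:", len(np.unique(rgb[0, :, 0])))
```

The 6-bit ramp shows clear steps, and the averaged FRC version recovers most of the in-between levels, which is why 6-bit+FRC panels can pass for 8-bit in casual use.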
 
Last edited:
Joined
Apr 2, 2011
Messages
2,850 (0.57/day)

Help me understand here.

First, I'm going to give you the fact that the monitor I'm looking at right now isn't 10-bit, so the premise on its face is silly. If it can render the "better" image, then 8-bit is capable of it now.

Now, what you've shown is banding. The color bands you've shown represent only 26 distinct colors (and that's really pushing it). 26 distinct color values would be 3 bit color (approximately) if my math isn't off. 26/3 = 8.67, two cubed is 8. The difference between 8 bit and 10 bit above is then akin to the difference between 3 bit and 8 bit. I'm having an immensely difficult time buying that little bit of math.

Moreover, you've started equating grayscale with color scale. Sorry, but that's silly too. If you're including grayscale as a measure then your 8-bit colors are now (2^8)^3 * 2^8 = (2^8)^4. Can you seriously tell that many distinct colors apart? I'll give you a hint: biology says you can't.


So what I'm seeing in the "10 bit" color spectrum is the 8 bit spectrum I'd already be able to see. There is some minor banding, but not really enough to notice unless intensely focused upon. If I already can't tell the difference between color values, what exactly is the difference between having more of them to utilize?


What you've demonstrated is marketing BS that people would use to sell me a TV. This is like arguing that a TV is better because it supports a refresh rate of 61 Hz, above a TV with 60 Hz. While technically correct, the difference is entirely unappreciable with the optical hardware we were born with. I can see the difference between 24 and 48 Hz resolutions (thank you Peter Jackson). I can't tell the difference between the RGB color of 1.240.220 and 1.239.220. I don't see how adding an extra couple of values between colors I already can't differentiate is particularly useful.


This is why I'm asking why 10-bit color is useful. It's another specification that makes precious little sense when you understand the math, and even less when you understand the biology. Tell me, did you buy it when people said adding a yellow LED to the TV screen produced "better" yellows (Sharp's Aquos)? Did you suddenly rush out and get a new data standard that included an extra value to accommodate that new LED? I'd say no. I'd say that it was a cheap tactic to sell a TV on a feature that was impossible to discern. That's largely what 10-bit is to me: a feature that can't reasonably improve my experience, but that I will be told is why I should buy this new thing.

Please sell me a GPU that can run two 1080p monitors with all of the eye candy on high, not a chunk of silicon which requires a couple of thousand dollars of monitor replacement to potentially be slightly better than what I've got now. Especially not when the potential improvement is functionally impossible for my eyeballs to actually see.
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Help me understand here.

First, I'm going to give you the fact that the monitor I'm looking at right now isn't 10-bit, so the premise on its face is silly. If it can render the "better" image, then 8-bit is capable of it now.

Now, what you've shown is banding. The color bands you've shown represent only 26 distinct colors (and that's really pushing it). 26 distinct color values would be 3 bit color (approximately) if my math isn't off. 26/3 = 8.67, two cubed is 8. The difference between 8 bit and 10 bit above is then akin to the difference between 3 bit and 8 bit. I'm having an immensely difficult time buying that little bit of math.

Moreover, you've started equating grayscale with color scale. Sorry, but that's silly too. If you're including grayscale as a measure then your 8-bit colors are now (2^8)^3 * 2^8 = (2^8)^4. Can you seriously tell that many distinct colors apart? I'll give you a hint: biology says you can't.


So what I'm seeing in the "10 bit" color spectrum is the 8 bit spectrum I'd already be able to see. There is some minor banding, but not really enough to notice unless intensely focused upon. If I already can't tell the difference between color values, what exactly is the difference between having more of them to utilize?


What you've demonstrated is marketing BS that people would use to sell me a TV. This is like arguing that a TV is better because it supports a refresh rate of 61 Hz, above a TV with 60 Hz. While technically correct, the difference is entirely unappreciable with the optical hardware we were born with. I can see the difference between 24 and 48 Hz resolutions (thank you Peter Jackson). I can't tell the difference between the RGB color of 1.240.220 and 1.239.220. I don't see how adding an extra couple of values between colors I already can't differentiate is particularly useful.


This is why I'm asking why 10-bit color is useful. It's another specification that makes precious little sense when you understand the math, and even less when you understand the biology. Tell me, did you buy it when people said adding a yellow LED to the TV screen produced "better" yellows (Sharp's Aquos)? Did you suddenly rush out and get a new data standard that included an extra value to accommodate that new LED? I'd say no. I'd say that it was a cheap tactic to sell a TV on a feature that was impossible to discern. That's largely what 10-bit is to me: a feature that can't reasonably improve my experience, but that I will be told is why I should buy this new thing.

Please sell me a GPU that can run two 1080p monitors with all of the eye candy on high, not a chunk of silicon which requires a couple of thousand dollars of monitor replacement to potentially be slightly better than what I've got now. Especially not when the potential improvement is functionally impossible for my eyeballs to actually see.

Are you really so blunt that you cannot understand that those pictures, although OBVIOUSLY being 8-bit in nature, are only meant to illustrate what kind of difference an 8-bit and a 10-bit picture would show if you put a 10-bit monitor next to your 8-bit one and looked at a 10-bit picture on it (vs. an 8-bit picture on your 8-bit monitor)? Really?

(24Hz and 48Hz are not resolutions, but refresh rates. Yes, I cannot see a difference between 1,240,220 and 1,239,220 either, but I can see a clear difference between 0,128,0 and 0,129,0, for example - also, when I make a gradient from 0,0,0 to 255,255,255 on my monitor, I do see distinct lines between colors - true, some transitions are smoother than others, but there are still too many visible lines in the picture and the gradient is not smooth. I don't buy a TV because it has better stats on paper - I have to see the difference myself, and it has to be big enough to convince me - quite often it has been clearly visible. In the same way, I want to see a 10-bit picture with my own eyes in a shop, compared to 8-bit, before I buy anything (unless of course the price difference is so small that it doesn't matter anyway, but I doubt that will be the case when consumer 10-bit displays launch). You mistake me for a hardware vendor - I am not (read: I don't sell you anything; no new displays, no new GPUs, nor new eyeballs that can actually see different colors).)
 
Last edited:

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,675 (2.86/day)
Location
w
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance @ 6000Mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Are you really so blunt that you cannot understand that those pictures, although OBVIOUSLY being 8-bit in nature, are only meant to illustrate what kind of difference an 8-bit and a 10-bit picture would show if you put a 10-bit monitor next to your 8-bit one and looked at a 10-bit picture on it (vs. an 8-bit picture on your 8-bit monitor)? Really?

(24Hz and 48Hz are not resolutions, but refresh rates. Yes, I cannot see a difference between 1,240,220 and 1,239,220 either, but I can see a clear difference between 0,128,0 and 0,129,0, for example - also, when I make a gradient from 0,0,0 to 255,255,255 on my monitor, I do see distinct lines between colors - true, some transitions are smoother than others, but there are still too many visible lines in the picture and the gradient is not smooth. I don't buy a TV because it has better stats on paper - I have to see the difference myself, and it has to be big enough to convince me - quite often it has been clearly visible. In the same way, I want to see a 10-bit picture with my own eyes in a shop, compared to 8-bit, before I buy anything (unless of course the price difference is so small that it doesn't matter anyway, but I doubt that will be the case when consumer 10-bit displays launch). You mistake me for a hardware vendor - I am not (read: I don't sell you anything; no new displays, no new GPUs, nor new eyeballs that can actually see different colors).)

Obviously he isn't; he's questioning whether there is such a huge difference between 8 and 10 bits, and the answer, as you hint, will depend on the person.
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Obviously he isn't; he's questioning whether there is such a huge difference between 8 and 10 bits, and the answer, as you hint, will depend on the person.

Time will tell. I did see a clear difference when going from monochrome (1-bit) to CGA (2-bit), then again when I moved to EGA (4-bit)... then again when I moved to VGA (8-bit)... then again when I moved to "TrueColor" (8-bit x 3)... I don't see why this would be any different this time.
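For reference, the raw color counts at each of those steps (a trivial sketch of my own; the first few entries are whole-image palette sizes, the last two are per-channel depths):

```python
steps = [
    ("monochrome", 1, 2 ** 1),
    ("CGA", 2, 2 ** 2),
    ("EGA", 4, 2 ** 4),
    ("VGA (palette)", 8, 2 ** 8),
    ("TrueColor, 8 bit/channel", 24, (2 ** 8) ** 3),
    ("deep color, 10 bit/channel", 30, (2 ** 10) ** 3),
]
for name, bits, colors in steps:
    print(f"{name:<28} {bits:>2} bits -> {colors:,} colors")
# 10 bits per channel takes the total from ~16.8 million to ~1.07 billion colors.
```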
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Obviously he isn't; he's questioning whether there is such a huge difference between 8 and 10 bits, and the answer, as you hint, will depend on the person.

There are more variables to it. Web images are 8-bit. Monitor gamut coverage varies a lot, ranging from 65/72/75/92/98/99/100+ percent. The average monitor is between 65-75%. As an example, Newtekie's is 72%.
 
Last edited:
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
As I said earlier, it's more about contrast than color banding, and, yep, it does make one hell of a difference.

High-dynamic-range imaging (HDRI or HDR) is a technique used in imaging and photography to reproduce a greater dynamic range of luminosity than is possible with standard digital imaging or photographic techniques.

The aim is to present the human eye with a similar range of luminance as that which, through the visual system, is familiar in everyday life. The human eye, through adaptation of the iris (and other methods) adjusts constantly to the broad dynamic changes ubiquitous in our environment. The brain continuously interprets this information so that most of us can see in a wide range of light conditions. Most cameras, on the other hand, cannot.

HDR images can represent a greater range of luminance levels than can be achieved using more 'traditional' methods, such as many real-world scenes containing very bright, direct sunlight to extreme shade, or very faint nebulae.
In other words:
  • Expanded color gamut ("no banding" is only about color accuracy!!!)
  • Higher contrast
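To put the "higher contrast" bullet in numbers, dynamic range can be expressed in photographic stops as log2(peak luminance / black level). A tiny sketch with illustrative figures (the nit values below are my own assumptions, not measurements of any specific panel):

```python
from math import log2

panels = {
    "typical SDR LCD (~250 nits peak, 0.25 nit black)": (250, 0.25),
    "HDR-capable panel (~1000 nits peak, 0.05 nit black)": (1000, 0.05),
}
for name, (peak_nits, black_nits) in panels.items():
    print(f"{name}: {log2(peak_nits / black_nits):.1f} stops of dynamic range")
# Roughly 10 stops vs ~14.3 stops with these example numbers.
```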
 
Last edited:
Joined
Apr 2, 2011
Messages
2,850 (0.57/day)
Are you really so blunt that you cannot understand that those pictures, although OBVIOUSLY being 8-bit in nature, are only meant to illustrate what kind of difference an 8-bit and a 10-bit picture would show if you put a 10-bit monitor next to your 8-bit one and looked at a 10-bit picture on it (vs. an 8-bit picture on your 8-bit monitor)? Really?

(24Hz and 48Hz are not resolutions, but refresh rates. Yes, I cannot see a difference between 1,240,220 and 1,239,220 either, but I can see a clear difference between 0,128,0 and 0,129,0, for example - also, when I make a gradient from 0,0,0 to 255,255,255 on my monitor, I do see distinct lines between colors - true, some transitions are smoother than others, but there are still too many visible lines in the picture and the gradient is not smooth. I don't buy a TV because it has better stats on paper - I have to see the difference myself, and it has to be big enough to convince me - quite often it has been clearly visible. In the same way, I want to see a 10-bit picture with my own eyes in a shop, compared to 8-bit, before I buy anything (unless of course the price difference is so small that it doesn't matter anyway, but I doubt that will be the case when consumer 10-bit displays launch). You mistake me for a hardware vendor - I am not (read: I don't sell you anything; no new displays, no new GPUs, nor new eyeballs that can actually see different colors).)


Allow me to quote what I said to you.
"Now, what you've shown is banding. The color bands you've shown represent only 26 distinct colors (and that's really pushing it). 26 distinct color values would be 3 bit color (approximately) if my math isn't off. 26/3 = 8.67, two cubed is 8. The difference between 8 bit and 10 bit above is then akin to the difference between 3 bit and 8 bit. I'm having an immensely difficult time buying that little bit of math."

Let's do the math here. The spectrum is functionally continuous and adding extra bits to color information doesn't produce stronger base colors, so we're going to say what each bit represents of the spectrum:
2^1 = 1 bit, 50%
2^2 = 2 bit, 25%
2^3 = 3 bit, 12.5%
2^4 = 4 bit, 6.25%
2^5 = 5 bit, 3.125%
2^6 = 6 bit, 1.5625%
2^7 = 7 bit, 0.78125%
2^8 = 8 bit, 0.390625%
2^9 = 9 bit, 0.1953125%
2^10 = 10 bit, 0.0976562%

That means, according to your cited differences, that a step down from 12.5% to 0.390625% (delta = 12.109) is the same as a step down from 0.39% to 0.09% (delta = 0.2929). Those numbers don't lie, and I call bullshit on them. Your infographic is attempting to convey differences that aren't in any way representative of reality. If you'd like to contest that, please point out exactly where the error in my mathematics lies. I'll gladly change my opinion if I've somehow missed the point somewhere.
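For what it's worth, the deltas quoted there do check out arithmetically; here's a tiny sketch of my own to verify them:

```python
def step_percent(bits):
    """Quantization step as a percentage of the full range at a given bit depth."""
    return 100 / 2 ** bits

print(step_percent(3) - step_percent(8))    # 12.109375  (the 12.109 delta above)
print(step_percent(8) - step_percent(10))   # 0.29296875 (the 0.2929 delta above)
```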


In short, what I'm telling you is that your example is crap, which is what I thought I said clearly above. Everything I've observed from 10-bit is functionally useless, though you're free to claim otherwise. What I have observed is variations in intensity and increases in frame rate making a picture objectively better. HDR (intensity of colors being pushed out), pixel count (the standard "more is better" argument), and frame rate (smoother motion) are therefore demonstrably what cards should be selling themselves on (if they want my money), not making a rainbow slightly more continuous in its color spectrum. Unless you missed the title of the thread, this is still about Polaris.


So we're clear, I have no personal malice here. I hate it when companies market new technology on BS promises, like the yellow LED or 10-bit being an enormous improvement over 8-bit. In the case of yellow LEDs, I've literally never been able to determine a difference. In the case of 10-bit color, I've yet to see a single instance where slightly "off" colors, due to not having an extra couple of shades of one color, would influence me more than dropping from a 30 Hz refresh to a 25 Hz refresh. If you are of the opposite opinion, I'm glad to allow you to entertain it. I just hope you understand how much of an investment you're going to have to make for that slight improvement, while the other side of it requires significantly less cost to see appreciable improvement.
 