
AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

btarunr

Editor & Senior Moderator
Staff member
High-dynamic range (HDR) is all the rage these days as the next big thing in display output, now that hardware has had time to catch up with ever-increasing display resolutions such as 4K Ultra HD, 5K, and the various ultra-wide formats. Hardware-accelerated HDR is getting a push from both AMD and NVIDIA in this round of GPUs. While games with HDR rendering date back to Half-Life 2, hardware-accelerated formats that minimize work for game developers, in which the hardware makes sense of an image and adjusts its output range, are new and require substantial compute power. They also require additional interface bandwidth between the GPU and the display, since GPUs rely on wider color palettes such as 10 bpc (1.07 billion colors) to generate HDR images. AMD Radeon GPUs are facing difficulties in this area.

German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per channel (16.7 million colors, i.e., 24-bit color) if your display (e.g., a 4K HDR-ready TV) is connected over HDMI 2.0 rather than DisplayPort 1.2 (and above). The desired 10 bits per channel (1.07 billion colors) palette is available only when your HDR display runs over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de observes that AMD GPUs reduce output sampling from full YCbCr 4:4:4 to 4:2:2 or 4:2:0 (chroma subsampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.
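As an aside on what those chroma-subsampling modes cost in bandwidth, here's a small sketch (my own illustration, not from Heise's article) of bits per pixel for each mode. In Y'CbCr, 4:4:4 keeps one chroma pair per pixel, 4:2:2 one pair per two pixels, and 4:2:0 one pair per four:

```python
# Effective luma+chroma samples per pixel for each subsampling mode.
# 4:4:4 = 3 full channels, 4:2:2 = 2 on average, 4:2:0 = 1.5 on average.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def bits_per_pixel(bpc: int, subsampling: str) -> float:
    """Average bits per pixel at a given bit depth and subsampling mode."""
    return bpc * SAMPLES_PER_PIXEL[subsampling]

for mode in SAMPLES_PER_PIXEL:
    print(f"10 bpc {mode}: {bits_per_pixel(10, mode):.1f} bits/pixel")
print(f" 8 bpc 4:4:4: {bits_per_pixel(8, '4:4:4'):.1f} bits/pixel")
```

Notably, 10-bit 4:2:2 (20 bits/pixel) actually needs less link bandwidth than plain 8-bit 4:4:4 (24 bits/pixel), which is why subsampling is the escape hatch when the link runs out of headroom.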



 
There must be a technical reason behind this, considering Polaris is a new GPU that finally has HDMI 2.0 support (unlike the R9 Fury, which was still HDMI 1.4). The question is, why? It can't be cheating or a GPU processing saving, since DisplayPort does work with HDR in full-fat mode. So, what is it? Unless the HDMI 2.0 support isn't actually HDMI 2.0. Somehow. That would kinda suck.
 

bug

This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?
 
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
 

TheLostSwede

News Editor
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?

The point is HDR, which the human eye can see.
 
zlatan
Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: doing 8-bit with 4:4:4 (no chroma subsampling), or doing 10-bit with 4:2:2. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals; the industry doesn't have enough experience with this yet, so the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until games' tonemapping pipelines mature enough.
 
German tech publication Heise.de..........
.........The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.

Something you forgot to mention:

In a test at Heise, they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed visual results similar to a GeForce GTX 1080. So it seems this is the case for NVIDIA as well, and NVIDIA is likely using a similar trick at 8-bit too. NVIDIA has not yet shared info on this, though. According to Heise, they did see a decrease in performance with NVIDIA, whereas the RX 480's performance remained the same.
 
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?

Just like the human eye can't see more than 24 fps, right? Things are not so simple for computers, especially before the image actually reaches the output display device. Those 24 bits are just the number of solid colors; you need an additional 8 bits to represent transparency, which is why we address it as 32-bit while the color space is actually 24-bit (+8-bit alpha).

I've been working with 3D raytracing, and the tool had an option for 40-bit image rendering, which significantly increased rendering time but decreased or even eliminated color banding in the output image (especially when it later got scaled down to be stored as an image file, since mainstream formats like JPG, PNG, or BMP don't support 40 bits).
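A quick sketch of the 24-bit colour + 8-bit alpha = 32-bit point above (my own illustration; the 0xAARRGGBB byte order used here is one common convention, not the only one):

```python
# Pack four 8-bit channels (R, G, B, alpha) into a single 32-bit word,
# which is why "32-bit colour" is really 24-bit colour plus transparency.
def pack_rgba(r: int, g: int, b: int, a: int = 255) -> int:
    """Pack channels into one 32-bit integer, laid out as 0xAARRGGBB."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgba(pixel: int):
    """Recover (r, g, b, a) from a packed 0xAARRGGBB word."""
    return ((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF,
            pixel & 0xFF, (pixel >> 24) & 0xFF)

pixel = pack_rgba(200, 120, 40)          # an opaque orange
assert unpack_rgba(pixel) == (200, 120, 40, 255)
print(hex(pixel))  # 0xffc87828
```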
 
[irony on]

Of course the human eye can see a 10-bit palette accurately, just as the human ear is capable of hearing cat and dog frequencies and clearly distinguishing the debilitating difference between 22.1 kHz and 24 kHz (let alone 48 kHz!). Also, all displays in use accurately show 100% of the visible spectrum, having no technological limitations whatsoever, so this is of utmost importance! All of humanity's vision and hearing are flawless, too... And we ALL need 40 MP cameras for pictures later reproduced on FHD or, occasionally, 4K displays... especially on mobile phone displays...

[irony off]

I've tried to fight this kind of blind belief in ridiculous claims for many years, but basically it's not worth it...
 

bug

I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
The human eye can actually distinguish fewer than a million colours, but the point is: which colours?
On one side we have dynamic range, the difference between the darkest and brightest colour. The human eye has a certain dynamic range, but by adjusting the pupil it can shift this range up and down.
On the other hand, we have the computer, which handles colour in a discrete world. Discrete computing inevitably alters the data, so we need the computer to work at a level of detail the human eye doesn't see; otherwise we end up with wrong colours and/or banding.

In short, for various reasons, computers need more info to work with than the naked eye can see.
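The banding argument above can be shown with a toy sketch (my own, purely illustrative): quantize the same smooth dark gradient at 8 and 10 bits per channel and count how many distinct output levels survive. Fewer levels across the same range means visible bands.

```python
def quantize(value: float, bits: int) -> int:
    """Map a 0.0-1.0 intensity to the nearest n-bit integer level."""
    levels = (1 << bits) - 1
    return round(value * levels)

# A narrow slice of the tonal range, e.g. a dark sky gradient (0.0 to 0.1).
gradient = [i / 9999 * 0.1 for i in range(10000)]

for bits in (8, 10):
    distinct = len({quantize(v, bits) for v in gradient})
    print(f"{bits}-bit: {distinct} distinct levels across the slice")
```

The 10-bit version resolves roughly four times as many steps over the same slice, which is exactly the headroom that hides banding in dark scenes.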
 
As zlatan has mentioned already, this is an HDMI 2.0 limitation, not an AMD/NVIDIA one.
 
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
Actually the human eye can distinguish between 2 and 10 million colors, depending on age, eyesight quality, etc... However, HDR is supposedly good at removing color banding in games/movies...
 
HDR reminds me of NFS: Most Wanted (2005); it had an HDR setting way back when.
 
The key here is that we are now getting displays that can output HDR images, which means better contrast and more visible detail in dark areas, with small, very bright points at the same time as dark areas, etc. In other words, the entire pipeline needs to support HDR to get the "full effect".
 

bug

HDR reminds me of NFS: Most Wanted (2005); it had an HDR setting way back when.
Eh, it's not the same thing. That was tone mapping (see: https://en.wikipedia.org/wiki/Tone_mapping ). This time we're talking real HDR. Which is nice and has been around forever, but it needs to shift huge amounts of data, which video cards are only now starting to be able to handle. Even so, be prepared for the same content-availability issues that have plagued 3D and 4K.
 

FordGT90Concept

"I go fast!1!11!1!"
3840 × 2160 × 60 Hz × 30 bit = 14,929,920,000 b/s, or ~14.9 Gb/s
HDMI 2.0 maximum effective bandwidth (18 Gb/s raw, after 8b/10b encoding): 14.4 Gb/s

I said it before and I'll say it again: HDMI sucks. They're working on the HDMI 2.1 spec, likely to increase the bandwidth. Since HDMI runs off the DVI backbone that was created over a decade ago, the only way to achieve more bandwidth is shorter cables. HDMI is digging its own grave, and has been for a long time.
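The arithmetic above can be sanity-checked with a few lines (my own sketch; the link figures are my assumptions: effective data rates after 8b/10b encoding, 14.4 Gb/s for HDMI 2.0 and 17.28 Gb/s for DisplayPort 1.2 HBR2, with overheads like blanking intervals ignored):

```python
def required_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Raw pixel-data rate in Gb/s for an uncompressed video mode."""
    return width * height * hz * bits_per_pixel / 1e9

# Assumed effective (post-encoding) link capacities, in Gb/s.
LINKS = {"HDMI 2.0": 14.4, "DP 1.2 (HBR2)": 17.28}

need = required_gbps(3840, 2160, 60, 30)  # 4K60 at 10 bpc RGB/4:4:4
print(f"4K60 10 bpc 4:4:4 needs {need:.2f} Gb/s")
for name, cap in LINKS.items():
    print(f"  {name} ({cap} Gb/s): {'fits' if need <= cap else 'does not fit'}")
```

Under these assumptions, 4K60 at 10 bpc 4:4:4 squeaks past DP 1.2's effective rate but overshoots HDMI 2.0's, which lines up with the behaviour Heise observed.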
 
This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?

Nope. For 10-bit, 4K@60 Hz and 4:4:4 you need at least DP 1.3, far above HDMI 2.0; it supports 30/36/48-bit RGB colour (10/12/16 bits per channel) at 4K.

Edit: Oh, I found this:

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx

Look at the chart:
4K@60, 10-bit, 4:2:2: Pass
4K@60, 10-bit, 4:4:4: Fail
 
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
You don't "remember" anything, because what you just said is bullshit. It's like saying the human eye can't see more than 30 FPS.

Also, this is a limitation of the HDMI spec. The entire purpose of HDMI 2.0a's existence is to add the necessary metadata stream for UHD HDR to work.
 

bug

Nope. For 10-bit, 4K@60 Hz and 4:4:4 you need at least DP 1.3, far above HDMI 2.0; it supports 30/36/48-bit RGB colour (10/12/16 bits per channel) at 4K.

Edit: Oh, I found this:

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx

Look at the chart:
4K@60, 10-bit, 4:2:2: Pass
4K@60, 10-bit, 4:4:4: Fail
I was commenting on this:
German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games ... at a reduced color depth of 8 bits per channel (16.7 million colors) ... if your display (e.g., a 4K HDR-ready TV) is connected over HDMI 2.0 and not DisplayPort 1.2 (and above).

The assertion is that it only reduces the colour depth over HDMI 2.0, while all is peachy on DP 1.2. Which, as I have said, is weird, because both HDMI 2.0 and DP 1.2 have the same bandwidth. Still, it could be HDMI's overhead that makes the difference once again.
 

eidairaman1

The Exiled Airman
Not a fan of HDMI anyway, such a weaksauce connector standard.
 
This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?
So you are in the driver. Sorry, couldn't resist it. :p

Something you forgot to mention
In a test at Heise, they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed visual results similar to a GeForce GTX 1080. So it seems this is the case for NVIDIA as well, and NVIDIA is likely using a similar trick at 8-bit too. NVIDIA has not yet shared info on this, though. According to Heise, they did see a decrease in performance with NVIDIA, whereas the RX 480's performance remained the same.
That's quite the thing he missed.
 


qubit

Overclocked quantum bit
Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: doing 8-bit with 4:4:4 (no chroma subsampling), or doing 10-bit with 4:2:2. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals; the industry doesn't have enough experience with this yet, so the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until games' tonemapping pipelines mature enough.
This would mean that NVIDIA has the same limitation, then. I personally have no idea, as I'm not familiar with the intimate details of the HDMI spec and am taking your word for it.

@btarunr Do you want to check zlatan's point and update the article if he's right?
 
This time it's not AMD's fault, but the HDMI standard's. NVIDIA has THE SAME issue with HDMI 2.0, btw.
Also, I'm not sure why those 4K HDR TVs aren't provided with at least DP 1.2 interfaces as well...
 
Hence why I prefer DisplayPort...
 