Monday, July 23rd 2018

Performance Penalty from Enabling HDR at 4K Lower on AMD Hardware Versus NVIDIA

The folks over at Computerbase.de have taken it upon themselves to study exactly how much of an impact (if any) activating HDR on a 4K panel has on performance across different hardware configurations. Supposedly, HDR shouldn't impose any performance penalty on GPUs that were designed with that output in mind at the hardware level; however, as we know, expectations can sometimes be wrong.
Comparing an AMD Vega 64 graphics card against an NVIDIA GeForce GTX 1080, the folks at Computerbase arrived at some pretty interesting results: AMD's hardware doesn't incur as big a performance penalty (up to 2%) as NVIDIA's graphics card (10% on average) when going from standard SDR rendering to HDR rendering. Whether this is due to driver-level issues or not is unclear; however, it could also have something to do with the way NVIDIA's graphics cards process 4K RGB signals, compressing the color information down to reduced-chroma YCbCr 4:2:2 when HDR is enabled - an extra amount of per-frame work that could reduce rendering performance. It's interesting to note, however, that Mass Effect: Andromeda, one of the games NVIDIA gave a big marketing push and whose HDR implementation it showcased, sees no performance differential on the green team's hardware. I also seem to remember some issues regarding AMD's frame-time performance being abysmal - looking at Computerbase's results, however, those times seem to be behind us.
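For a rough sense of the numbers involved, here is a back-of-the-envelope sketch of the raw signal payload at 4K60 (blanking intervals, link encoding and NVIDIA's actual display pipeline are not modelled, so treat the figures as estimates only):

```python
# Rough payload estimate for a 3840x2160 stream at 60 Hz.
# Blanking and link-encoding overhead are ignored, so real cable
# requirements are somewhat higher than these figures.

WIDTH, HEIGHT, REFRESH = 3840, 2160, 60
PIXELS_PER_SECOND = WIDTH * HEIGHT * REFRESH

def gbps(bits_per_pixel):
    """Raw pixel payload in gigabits per second for a given format."""
    return PIXELS_PER_SECOND * bits_per_pixel / 1e9

formats = {
    "RGB 4:4:4, 8-bit (SDR)":  3 * 8,    # 24 bits per pixel
    "RGB 4:4:4, 10-bit (HDR)": 3 * 10,   # 30 bits per pixel
    "YCbCr 4:2:2, 10-bit":     2 * 10,   # 20 bits per pixel (chroma halved horizontally)
}

for name, bits in formats.items():
    print(f"{name}: {gbps(bits):.1f} Gbit/s")
```

Dropping from 10-bit RGB to 10-bit 4:2:2 trims the payload by roughly a third, but the RGB-to-YCbCr conversion and chroma resampling are extra per-frame work for the GPU, which is the suspected source of the penalty.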
Source: Computerbase

26 Comments on Performance Penalty from Enabling HDR at 4K Lower on AMD Hardware Versus NVIDIA

#1
StrayKAT
Nvidia doesn't have an option to just display RGB instead of YCbCr? Am I reading that correctly? Now that I think about it, I don't even remember if I had the option.
Posted on Reply
#2
Lauen
The folks ate Computerbase
:laugh:
Posted on Reply
#3
DeathtoGnomes
the folks ate Computerbase
ketchup and mustard? maybe some Dijon?

I'm guessing that's the difference between HBM and GDDR.
Posted on Reply
#4
Steevo
Something something, users won't notice if we strip a few colors in order to make more FPS, cause eye candy is all about FPS and not about better rendering....


I wonder if this will fix the Sprite/sparkles seen on their hardware.
Posted on Reply
#6
bug
So, in titles where Nvidia has worked with the developers, the performance is as expected. In others, not so much.
Either programmers are really sloppy or (more likely) there's something not obvious about Nvidia's drivers/APIs that isn't readily visible to developers. Whatever it is, let's hope it gets fixed/ironed out before HDR monitors become mainstream.
Posted on Reply
#7
ExV6k
bugSo, in titles where Nvidia has worked with the developers, the performance is as expected. In others, not so much.
Either programmers are really sloppy or (more likely) there's something not obvious about Nvidia's drivers/APIs that isn't readily visible to developers. Whatever it is, let's hope it gets fixed/ironed out before HDR monitors become mainstream.
In my opinion, it's the second option that's more likely.
Posted on Reply
#8
TheoneandonlyMrK
So Nvidia's lossy colour compression makes a difference, what a surprise.

I am of the opinion that The Way It's Meant to be Played is optimised more than people are aware of. HDR and high-bit-rate RGB limit some of their optimisation, until they re-optimise, that is.
Posted on Reply
#9
Space Lynx
Astronaut
theoneandonlymrkSo Nvidia's lossy colour compression makes a difference, what a surprise.

I am of the opinion that The Way It's Meant to be Played is optimised more than people are aware of. HDR and high-bit-rate RGB limit some of their optimisation, until they re-optimise, that is.
they will throw that 70% market share muscle around and come out on top of this. I fully expect money to win, as it always does. gg life
Posted on Reply
#10
Fluffmeister
Neither are 4K cards anyway, but then most monitors aren't really HDR... so hey ho.
Posted on Reply
#11
R-T-B
theoneandonlymrkSo Nvidia's lossy colour compression makes a difference, what a surprise.
Yep, only in this case it actually limits their performance. Its only "good trait" is that it saves line bandwidth.

Thank god their SDR color techniques are at least lossless.
Posted on Reply
#12
Unregistered
SteevoSomething something, users won't notice if we strip a few colors in order to make more FPS, cause eye candy is all about FPS and not about better rendering....


I wonder if this will fix the Sprite/sparkles seen on their hardware.
I remember having almost the same discussion at least 7 years ago on how AMD/ATI was slower because they used a better color gamut... And now this kinda validates that it's still happening... Lol

Don't get me wrong people I still like Nvidia...
But in reality Nvidia cheats for the numbers... Always have and probably always will... At least they make it look good enough for most not to notice.
But now in the age of HDR... We get to see performance leveled and AMD is looking good..... Just in time to get stomped again..
Posted on Edit | Reply
#13
R0H1T
So what happened to the 1080 Ti? Did Computerbase not have one for the review, or would the drop-off in performance have been even more accentuated?
Posted on Reply
#14
Vya Domus
theoneandonlymrkSo Nvidia's lossy colour compression makes a difference, what a surprise.
I don't think it's that, I reckon it's because of the higher 16-bit FP performance.
Posted on Reply
#15
cucker tarlson
Vya DomusI don't think it's that, I reckon it's because of the higher 16-bit FP performance.
Is that connected to HDR somehow?
Posted on Reply
#16
mohammed2006
I have Nvidia and a 4K HDR monitor. It is not only 4:2:2. I can choose RGB full or 4:4:4 limited, unless Nvidia are lying in the control panel.
Posted on Reply
#17
Niarod
"Have took it..."

Take-took-taken
Posted on Reply
#18
Smartcom5
cucker tarlsonWeird this is coming out only now.
Only thing is, it doesn't. Not even remotely.

Why? The German site GameStar.de also featured such tests on HDR (link, in German), looking at what impact on performance should be expected from using it. And that was already back in '16.

The results? They were virtually identical, at least pattern-wise (nVidia took serious hits while AMD's were pretty much negligible). The outcome was literally the very same as the one Computerbase.de has come up with now in '18.
SteevoSomething something, users won't notice if we strip a few colors in order to make more FPS, cause eye candy is all about FPS and not about better rendering....
theoneandonlymrkSo Nvidia's lossy colour compression makes a difference, what a surprise.

I am of the opinion that „The Way it's Meant to be Played“ is optimised more than people are aware of. HDR and high-bit-rate RGB limit some of their optimisation, until they re-optimise, that is.
The thing just is, nVidia was made aware of these results back then, and they have yet to bring any fix driver-wise. It seems to be an architectural problem for them, a rather undesirable side effect of their infamous texture compression, and as such it can't easily be avoided without losing a good chunk of their performance … and so nVidia would rather avoid talking much about it.

The obvious fact that they haven't fixed it since 2016 implies it's an architectural problem which can't be solved easily. It has been a known phenomenon for a while now, and every scenario that has been thought through arrives at the very same result – most likely it's nVidia's texture-compression algorithm that causes this … which isn't something that gets magically fixed overnight by some mighty wonder driver, at least not without a rather huge impact on performance in the first place.
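To put rough numbers on why a compression shortfall would surface exactly when HDR is switched on, here is a simple sketch of colour-buffer traffic at 4K; the render-target formats and compression ratios are illustrative assumptions, not measured values for any GPU:

```python
# Back-of-the-envelope colour-buffer traffic at 3840x2160.
# Formats and compression ratios are assumptions chosen purely for illustration.

PIXELS = 3840 * 2160

def frame_mb(bytes_per_pixel, compression_ratio=1.0):
    """Megabytes of colour-buffer traffic for one full-screen write."""
    return PIXELS * bytes_per_pixel / compression_ratio / 1e6

print(f"SDR raw (RGBA8):           {frame_mb(4):5.1f} MB/frame")
print(f"HDR raw (RGBA16F):         {frame_mb(8):5.1f} MB/frame")
print(f"SDR, assumed 2.0:1 ratio:  {frame_mb(4, 2.0):5.1f} MB/frame")
print(f"HDR, assumed 1.3:1 ratio:  {frame_mb(8, 1.3):5.1f} MB/frame")
```

If the compressor is tuned for narrow SDR formats and achieves less on wide HDR ones, the per-frame bandwidth bill grows on two fronts at once, which would fit the pattern described above.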

Did I mention that I love blowbacks and boomerangs? Their paths are so awesomely easy to predict! ♥
R0H1TSo what happened to the 1080 Ti? Did Computerbase not have one for the review, or would the drop-off in performance have been even more accentuated?
Computerbase.de is known for not featuring a GTX 1080 Ti in collective tests like this, and so are many other reviewers.
The reason is that you can estimate where a Ti would place by adding a fixed uplift to the reference GTX 1080's result (I believe around 20%, correct me if I'm wrong here). The placing of a Ti is literally calculable if a given test features a reference-clocked GTX 1080. So benching a dedicated Ti when you already have the results of a reference-clocked 1080 is largely a waste of time and thus often skipped (for obvious reasons).
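In other words, the Ti figure is usually extrapolated rather than re-benched. A trivial sketch of that (the ~20% uplift is the rough figure mentioned above, not a measured constant):

```python
# Estimate a GTX 1080 Ti result from a reference GTX 1080 result.
# The 1.20 uplift is the ballpark figure quoted above, not a measured value.

def estimate_ti_fps(gtx1080_fps, uplift=1.20):
    return gtx1080_fps * uplift

print(estimate_ti_fps(60.0))  # a 60 fps GTX 1080 result suggests roughly 72 fps for the Ti
```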
Posted on Reply
#20
FordGT90Concept
"I go fast!1!11!1!"
That's mostly about rendering techniques to work around the fact that the sun is really bright but monitors aren't. Really don't know why all that information about DirectX is on there.


As for performance difference, the only thing I can think of is that GeForce has to complete some shader operations in two clocks instead of one because its hardware can't do it in one pass.
Posted on Reply
#21
TheoneandonlyMrK
FluffmeisterNeither are 4K cards anyway, but then most monitors aren't really HDR... so hey ho.
Yes mate, they are. I play any games I solo-play at 4K. Could it be better? Couldn't it always?

@R-T-B are you sure about SDR? I thought different, and there is a visible difference between brands imho.
Posted on Reply
#22
bug
theoneandonlymrkYes mate, they are. I play any games I solo-play at 4K. Could it be better? Couldn't it always?
No, there aren't: displayhdr.org/certified-products/
That's 15 monitors if you include the DisplayHDR 400 ones, which are really just standard monitors (i.e. not bottom of the barrel crap).

Plus, how do you think you look when you reply to "most monitors aren't really HDR" with "Yes mate, they are, I play any games I solo-play at 4K"?
Posted on Reply
#23
TheoneandonlyMrK
bugNo, there aren't: displayhdr.org/certified-products/
That's 15 monitors if you include the DisplayHDR 400 ones, which are really just standard monitors (i.e. not bottom of the barrel crap).

Plus, how do you think you look when you reply to "most monitors aren't really HDR" with "Yes mate, they are, I play any games I solo-play at 4K"?
I have both a 4K monitor and a 4K HDR TV (yes, a crap one :() and my point was that you can game at 4K with these cards.
Also quote my other sentence: could it be better, couldn't it always.
Ah you did:( apologies.
Posted on Reply
#24
wiak
nvidia, "the way its meant to be reduced"
they have done this before, when hdmi 2.0 was new, they would reduce to signal to 4:2:0 to push it to televisions/monitors, yes it works in 4k, but looks like crap
Posted on Reply
#25
bug
wiaknvidia, "the way its meant to be reduced"
they have done this before, when hdmi 2.0 was new, they would reduce to signal to 4:2:0 to push it to televisions/monitors, yes it works in 4k, but looks like crap
That was a limitation of HDMI that applied across the board: it didn't have enough bandwidth for both the signal and HDCP 2.2 at the same time. Moron.
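For what it's worth, the raw arithmetic both posts are gesturing at looks roughly like this (payload only; blanking, audio and HDCP overhead are ignored, so it's a sketch rather than a verdict on whose recollection is right):

```python
# Compare raw 4K60 payloads against commonly cited effective HDMI data rates
# (after 8b/10b encoding). Blanking is ignored, so the format figures slightly
# underestimate what the link actually has to carry.

PIXELS_PER_SECOND = 3840 * 2160 * 60

def payload_gbps(bits_per_pixel):
    return PIXELS_PER_SECOND * bits_per_pixel / 1e9

links = {"HDMI 1.4 (approx. effective)": 8.16, "HDMI 2.0 (approx. effective)": 14.4}
formats = {"RGB 4:4:4, 8-bit": 24, "YCbCr 4:2:0, 8-bit": 12}

for fmt, bits in formats.items():
    need = payload_gbps(bits)
    for link, capacity in links.items():
        verdict = "fits" if need <= capacity else "does not fit"
        print(f"{fmt} (~{need:.1f} Gbit/s) {verdict} within {link} at {capacity} Gbit/s")
```

Under these simplified numbers, full 8-bit RGB at 4K60 overshoots HDMI 1.4-class bandwidth but fits within HDMI 2.0's, while 4:2:0 fits either, which is why subsampling was the go-to workaround on bandwidth-starved links.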
Posted on Reply