Thursday, February 28th 2019

AMD Showcases FreeSync 2 HDR Technology With Oasis Demo

AMD is looking to further push the adoption of FreeSync with the release of FreeSync 2 HDR Technology. The primary goal of the new standard is to take what FreeSync already offered, including wide variable refresh rate ranges and low framerate compensation, and pair it with HDR for a truly immersive experience. To show off what FreeSync 2 can do while also pushing for broader adoption, AMD has created its new Oasis Demo. Following the familiar principle that seeing is believing, AMD will be comparing its FreeSync 2 monitors against their non-HDR counterparts with this demo at retail locations, letting consumers see the difference for themselves in a way static images and YouTube videos cannot convey. The demo itself has been built in Unreal Engine 4 and has full support for the HDR10 and FreeSync 2 HDR transport protocols. On the settings side it packs numerous options, including FPS limits with various presets or a custom value, vertical sync on/off, FreeSync on/off, content modes, and more. You can view AMD's overview of the demo in the video below.
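AMD has not published how the demo implements those options, but the idea behind an FPS cap with presets or a custom value is simple enough to sketch. Below is a hypothetical, minimal render loop in that spirit (render_frame and the 60 FPS preset are illustrative, not AMD's code):

import time

def run_capped(render_frame, fps_cap=None, frames=600):
    # Render a fixed number of frames, sleeping as needed to hold a target FPS cap.
    budget = 1.0 / fps_cap if fps_cap else 0.0
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                      # draw one frame
        spent = time.perf_counter() - start
        if budget > spent:
            time.sleep(budget - spent)      # burn off the rest of the frame budget

# e.g. run_capped(draw_scene, fps_cap=60) would mimic a "60 FPS" preset,
# while fps_cap=None leaves the framerate uncapped.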

Source: AMD

29 Comments on AMD Showcases FreeSync 2 HDR Technology With Oasis Demo

#1
las
After trying out the Samsung C27HG70 for a week, I can say that HDR on PC is still a joke. This monitor is DisplayHDR 600 certified. I wouldn't even want to see 400-certified monitors... Basically useless for HDR. Like all those fake HDR 4K TVs.

It will take years before PC has proper HDR gaming, like consoles have now (in some games) - the software and hardware simply aren't there yet.

HDR has only impressed me on OLED so far. I have an LG C7. I can't wait till Micro LED replaces LCD as the standard.
I doubt we'll see proper HDR for PC till then (unless you are going to use an OLED TV maybe). Software will mature too.

Tbh I think HDR is overrated. Tons of movies/series and games claim HDR support, when in reality there's a big difference in how well that HDR is implemented. It's still the early days...
Posted on Reply
#2
cucker tarlson
I had a chance to see and compare quite a few TV sets in a big electronics shop, and a good non-HDR TV looks miles better than an entry-level one with crappy HDR. Once you've got a good implementation of HDR on a quality TV it looks amazing, but that's what it takes. Stay away from those budget HDR monitors, please.
Posted on Reply
#3
las
Just look for Ultra HD Premium certification when buying an HDR TV :p You'll get a native 10-bit panel and 1000 nits then.. (for LCD)

There are TONS of 4K HDR TVs with 8-bit panels. They can RECEIVE HDR content, but not show it. They are still allowed to sell them as 4K HDR TVs. This is how most PC monitors do HDR, I guess :p
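Roughly what that means in practice, ignoring the FRC dithering many of these sets use to fake the missing bits: neighbouring 10-bit HDR10 code values simply collapse onto the same native panel level. A toy sketch (panel_bits is just a parameter for illustration):

def to_panel_level(code10, panel_bits=8):
    # Map a 10-bit HDR10 code value onto the panel's nearest native level.
    shift = 10 - panel_bits
    return (code10 >> shift) << shift

# Four adjacent 10-bit codes land on one and the same 8-bit step:
print([to_panel_level(c) for c in (512, 513, 514, 515)])  # [512, 512, 512, 512]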

I'm fine with SDR on PC for a few more years..
Posted on Reply
#4
MrGenius
Does HDR help with, or better yet eliminate, color banding? I was just looking at one of these FreeSync 2 HDR400 monitors, and it says "HDR400 specifies true 8-bit image quality". And I'm like... you lost me at 8-bit... see ya! Already made that mistake and am deeply regretting it. If you can call it a mistake, as it's ALMOST NEVER advertised whether a specific monitor is 8-bit or not. So I had no clue, and no way of knowing ahead of time, what I was getting. Not only that though: I didn't even know what it was, or why I should care. How come that's not pretty much the most important specification that's given, or even a major topic of discussion? I mean... Jesus... is there anything worse than color banding? I have to avoid playing games where it shows up now... or I feel like throwing this $450 POS out the window. Sickening...
Posted on Reply
#5
Zubasa
lasAfter trying out the Samsung C27HG70 for a week, I can say that HDR on PC is still a joke. This monitor is DisplayHDR 600 certified. I wouldn't even want to see 400-certified monitors... Basically useless for HDR. Like all those fake HDR 4K TVs.

It will take years before PC has proper HDR gaming, like consoles have now (in some games) - the software and hardware simply aren't there yet.

HDR has only impressed me on OLED so far. I have an LG C7. I can't wait till Micro LED replaces LCD as the standard.
I doubt we'll see proper HDR for PC till then (unless you are going to use an OLED TV maybe). Software will mature too.

Tbh I think HDR is overrated. Tons of movies/series and games claim HDR support, when in reality there's a big difference in how well that HDR is implemented. It's still the early days...
For PC monitors the CHG70 is already miles ahead of most "gaming" monitors.
Many of those don't even run true 8-bit, instead running 6-bit + FRC rubbish.
Another issue with HDR on PC is color mapping; if not done properly, HDR mode will look WORSE than SDR.
It also doesn't help that in order to watch HDR on YouTube you need to turn on some flags in M$ Edge that are off by default.
Few games actually run FreeSync 2 HDR, and the color mapping on nVidia GPUs for HDR is IMO crap.
Especially after seeing the rip-off 4K 144 Hz "HDR1000" G-Sync monitor from Asus in person - the HDR with that on BF5 (which is an RTX game) is absolute garbage.
Oh, and don't get me started on the HDR in Anthem; it often bugs out and the gamma goes through the roof instead of 2.2~2.4.
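For context, HDR10 doesn't use a 2.2-2.4 power gamma at all; it uses the absolute PQ (SMPTE ST 2084) curve, so a game that applies the wrong transfer function will blow the image out exactly like that. A minimal comparison of the two decodes (the 300-nit SDR white level is just an assumed figure):

def pq_eotf(code, peak_nits=10000.0):
    # SMPTE ST 2084 (PQ) decode: normalized code value -> absolute luminance in nits.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code ** (1 / m2)
    return peak_nits * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

def gamma_eotf(code, gamma=2.2, white_nits=300.0):
    # Plain power-law decode, as used for SDR content.
    return white_nits * code ** gamma

print(pq_eotf(0.5))     # ~92 nits: a 50% HDR10 signal is fairly dark in absolute terms
print(gamma_eotf(0.5))  # ~65 nits on the assumed 300-nit SDR panel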
Posted on Reply
#6
Blueberries
Is this video a joke? It looks like every 6th grader's first PowerPoint presentation when they discover transition effects.
Posted on Reply
#7
mtcn77
MrGeniusDoes HDR help with, or better yet eliminate, color banding? I was just looking at one of these FreeSync 2 HDR400 monitors, and it says "HDR400 specifies true 8-bit image quality". And I'm like... you lost me at 8-bit... see ya! Already made that mistake and am deeply regretting it. If you can call it a mistake, as it's ALMOST NEVER advertised whether a specific monitor is 8-bit or not. So I had no clue, and no way of knowing ahead of time, what I was getting. Not only that though: I didn't even know what it was, or why I should care. How come that's not pretty much the most important specification that's given, or even a major topic of discussion? I mean... Jesus... is there anything worse than color banding? I have to avoid playing games where it shows up now... or I feel like throwing this $450 POS out the window. Sickening...
The DisplayHDR specification as a whole, built on HDR10, rests on 8-bit displays; Dolby Vision means 10-bit displays.
Posted on Reply
#8
las
ZubasaFor PC monitors the CHG70 is already miles ahead of most "gaming" monitors.
Many of those don't even run true 8-bit, instead running 6-bit + FRC rubbish.
Another issue with HDR on PC is color mapping; if not done properly, HDR mode will look WORSE than SDR.
It also doesn't help that in order to watch HDR on YouTube you need to turn on some flags in M$ Edge that are off by default.
Few games actually run FreeSync 2 HDR, and the color mapping on nVidia GPUs for HDR is IMO crap.
Especially after seeing the rip-off 4K 144 Hz "HDR1000" G-Sync monitor from Asus in person - the HDR with that on BF5 (which is an RTX game) is absolute garbage.
Oh, and don't get me started on the HDR in Anthem; it often bugs out and the gamma goes through the roof instead of 2.2~2.4.
Miles ahead of most gaming monitors in HDR, maybe - it's still bad HDR tho. Everything else about it is mediocre if you ask me; not much of a gaming monitor IMO. I have tested tons of VA, IPS and TN gaming monitors over the past 8-10 years. VA has never impressed me in faster-paced games. The slow pixel response time continues to be a problem for this tech.

The blur on the CHG70, even in its fastest mode with low input lag enabled, is way too much compared to even IPS/AHVA high refresh rate panels.
Pixel response is simply too slow, especially in dark scenes. Too much smearing. Playing fast-paced games is still a problem.

I was playing Apex Legends side-by-side with a PG279Q and the difference in smoothness and responsiveness was simply huge.
Oh, I still have the CHG70 standing on the floor here. I will test it some more over the next few days, vs Dell's new S2719DGF and the Asus PG278QR, two 8-bit TN 1440p monitors.

So far tho, the CHG70 does not have the nice black levels good VA monitors usually have. I remember testing the Eizo FG2421; it had much better blacks and colors in general (also 5000:1 contrast). It failed in other aspects tho (pixel response time especially... once again, plus the small size/res).

I like VA panels for TVs, if you don't need good viewing angles, but for PC gaming I would not recommend them unless you are playing slower-paced games or can accept some blur/smearing. They're good enough for RPGs, MMOs etc. They have some issues with text sharpness tho (because of how the subpixels are arranged); this is true for all Samsung and AUO VA panels.

In the end there's no perfect monitor. This is why I like to keep testing new monitors.

6-bit + FRC is typically low-end TN. There are many true 8-bit TN panels (mostly 1440p); there are also a lot of true 8-bit high refresh rate IPS/AHVA panels, and some 8-bit + FRC.
These 8-bit 1440p TN panels have way better image quality than those cheap 6-bit + FRC 1080p TN panels.
Posted on Reply
#9
SoNic67
lasThis monitor is DisplayHDR 600 certified. I wouldn't even want to see 400-certified monitors... Basically useless for HDR. Like all those fake HDR 4K TVs.
....
It will take years before PC has proper HDR gaming, like consoles have now (in some games)
I don't get the logic. Consoles came with better monitors? I think this is a case of placebo effect...
Zubasamonitor from Asus
I have bought a cheap-ish Acer monitor myself. Bad colors, broke down under warranty (had to pay shipping to get it fixed), came back still looking yucky. It didn't help that I was running it dual-monitor next to a 24" AOC HDTV display with much better colors.
I saw an Asus laptop display (my son's). Yuck again.

Right now I am using an AOC U2879VF. Specs say 4K FreeSync, 1 ms, with 10-bit (8-bit + FRC). But since it maxes out at 300 cd/m2, it doesn't qualify for HDR 400. It came with a color calibration profile file, and that's important to me. Even though I have a Spyder calibration tool, I have found that if the manufacturer doesn't provide that file, it usually means you can't actually fully calibrate that monitor.

IMO, HDR is over-hyped. Another way to convince us to spend money.
Basically, if you don't run an HDR monitor at its highest luminosity, it doesn't meet the "HDR" specifications. Try that at night!
Posted on Reply
#10
las
SoNic67I don't get the logic. Consoles came with better monitors? I think this is a case of placebo effect...


I have bought an Asus monitor myself. Bad colors, broke down under warranty (had to pay shipping to get it fixed), came back still looking yucky. It didn't help that I was running it dual-monitor next to a 24" HDTV display with much better colors.
I saw an Asus laptop display at a friend's. Yuck again.
It's easier for devs to get HDR working right on consoles. They don't have to support thousands of different software and hardware configs. This is pretty much fact.

For proper HDR on consoles you still need an Ultra HD Premium certified TV or, better, an OLED like the one I've got. These can show HDR properly. HDR on the CHG70 looks terrible compared to OLED HDR. It does not even have FALD; edge LED with very few zones = bad HDR. Again, facts. Go ask in an AV forum ;)

I wonder why Asus gaming monitors generally score very high in reviews if that's the case.
"Asus laptop display" - it's not like Asus made the panel. Your friend probably cheaped out and got a 6-bit TN panel. Asus has plenty of laptops with good panels, as in IPS.

HDR is not overrated; come again when you have seen proper HDR content on a 4K OLED TV.
HDR on LCD, on the other hand, is kind of a joke. You need FALD on an LCD to make it decent, and it's still much worse than OLED. Backlight < self-emitting pixels.
Pretty much no PC monitors use FALD. There are a few tho, with a 2000-dollar price tag or so.

"HDR is over-hyped" is what people who have never seen proper HDR always claim.
Posted on Reply
#11
MrGenius
mtcn77The DisplayHDR specification as a whole, built on HDR10, rests on 8-bit displays; Dolby Vision means 10-bit displays.
Any idea what the answers to my questions would be? Or should I ask the questions differently, or ask different questions? Let me try one more time.

I recently bought a 1440p 144Hz FreeSync "gaming" monitor. I was under no impression that it was HDR capable. Which it isn't. But I was also not aware that it was 8-bit. Nor did I even know why being 8-bit mattered, or made any noticeable difference. I came to find out later that it does. I'm extremely satisfied with my monitor other than the color banding. Which I'm hearing is due to it being 8-bit. And that 10-bit is the solution to it. I came from gaming on CRTs and LED TVs. Which I had never noticed color banding on. So this was all news to me.

Questions
  • It sounds like HDR would help with, or eliminate, color banding by producing a "wider color gamut". Is this true?
  • If it is true, to what extent is it true?
  • Or, if all HDR10 is effectively 10-bit (which I'm reading it is, or it would be called HDR8, making the fact that the display itself is 8-bit irrelevant), is it equivalent to SDR 10-bit in terms of helping with or eliminating color banding?
  • Basically, what I'm asking in a nutshell is... if I hate color banding, do I need HDR or SDR 10-bit to get rid of it? Either one? Neither one? Dolby Vision (which in fact is 12-bit capable)?
Posted on Reply
#12
mtcn77
MrGeniusAny idea what the answers to my questions would be? Or should I ask the questions differently, or ask different questions? Let me try one more time.

I recently bought a 1440p 144Hz FreeSync "gaming" monitor. I was under no impression that it was HDR capable. Which it isn't. But I was also not aware that it was 8-bit. Nor did I even know why being 8-bit mattered, or made any noticeable difference. I came to find out later that it does. I'm extremely satisfied with my monitor other than the color banding. Which I'm hearing is due to it being 8-bit. And that 10-bit is the solution to it. I came from gaming on CRTs and LED TVs. Which I had never noticed color banding on. So this was all news to me.

Questions
  • It sounds like HDR would help with, or eliminate, color banding by producing a "wider color gamut". Is this true?
  • If it is true, to what extent is it true?
  • Or, if all HDR10 is effectively 10-bit (which I'm reading it is, or it would be called HDR8, making the fact that the display itself is 8-bit irrelevant), is it equivalent to SDR 10-bit in terms of helping with or eliminating color banding?
  • Basically, what I'm asking in a nutshell is... if I hate color banding, do I need HDR or SDR 10-bit to get rid of it? Either one? Neither one? Dolby Vision (which in fact is 12-bit capable)?
The thing is, 8-bit is the desktop limit; 10-bit, however, is activated with Dolby Vision. If you do have an issue with banding, though, you could try lowering gamma, or in other words screen brightness.
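To put rough numbers on the banding itself: each extra bit doubles the shades per channel, so the same gray ramp gets sliced into finer, less visible steps. A back-of-the-envelope sketch (the 2560 px ramp width is just an example):

def ramp_bands(bit_depth, ramp_width_px=2560):
    # Shades per channel, and how wide each band of a full-width gray ramp would be.
    levels = 2 ** bit_depth
    return levels, ramp_width_px / levels

for bits in (6, 8, 10):
    levels, band_px = ramp_bands(bits)
    print(f"{bits}-bit: {levels} levels, ~{band_px:.1f} px per band")

# 6-bit: 64 levels (~40 px bands), 8-bit: 256 (~10 px), 10-bit: 1024 (~2.5 px)

Note that HDR and a wider gamut stretch those same steps over a larger brightness and color range, which is why 8-bit HDR can actually band more, not less; it's a 10-bit signal on a 10-bit panel that really fights banding.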
Posted on Reply
#13
MrGenius
mtcn77...you could try lowering gamma, or in other words screen brightness.
Thanks. But I'd rather not have to try anything. It looks perfect to me with the factory default monitor settings and standard Windows settings. Everything I've done to mess with them ended up looking worse. And, like I said, I'm actually very pleased with how it looks. Except the goddamn color banding in low light/dark areas of games that's driving me fricken crazy! As I'd imagine it would do to anybody. I don't know why they don't take special care to make sure it never happens. Man...I just want to go back to CRT. Nothing beats CRT image quality...PERIOD. It's just the matters of screen size and/or resolution that's poor with them. Other than that they can't be beat. All this LED, LCD, OLED, QLED, Micro LED, VA, TN, IPS bullshit is just that. BULLSHIT!!!
Posted on Reply
#14
Zubasa
las6-bit + FRC is typically low-end TN. There are many true 8-bit TN panels (mostly 1440p); there are also a lot of true 8-bit high refresh rate IPS/AHVA panels, and some 8-bit + FRC.
These 8-bit 1440p TN panels have way better image quality than those cheap 6-bit + FRC 1080p TN panels.
I am mainly talking about color reproduction and static contrast.
I had the MG278Q so I know what an actual 8-bit TN looks like; it's okay, but not nearly as good as the CHG70.
Monitors are usually rated at their max brightness to cheat the contrast ratio; in the case of the Samsung VAs, even the cheaper 1080p CFG70 managed 2800:1 of its rated 3000:1 even at 130 nits (stock is 300 nits).
Some of the other monitors, especially the IPS ones, only get anywhere near their rated contrast ratio at eye-blinding brightness.
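Since static contrast is just white luminance divided by black luminance, the black level those figures imply falls straight out. A quick sketch (the 1000:1 IPS figure is only an illustrative comparison):

def black_level_nits(white_nits, contrast_ratio):
    # Static contrast = white / black, so the black level is white / contrast.
    return white_nits / contrast_ratio

print(black_level_nits(130, 2800))   # ~0.046 nits for the CFG70 example above
print(black_level_nits(350, 1000))   # ~0.35 nits for a typical 1000:1 IPS run bright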

HDR and Wide Color Gamut signals take longer to process, so in general HDR monitors have higher input lag.
The best e-sports monitors are from BenQ and manage 9 ms true input lag, but of course those look as bad as cheap TN, and they intentionally induce flicker via backlight strobing to give you better motion clarity.
The PG27UQ on the other hand has over 20 ms true input lag, so there is that.

Now, on the issues of the CHG70: there is overshoot / reverse ghosting when pixel overdrive is on.
This is especially true on older firmware for this monitor, so try updating the firmware if you haven't already done so.
Posted on Reply
#15
mtcn77
MrGeniusThanks. But I'd rather not have to try anything. It looks perfect to me with the factory default monitor settings and standard Windows settings. Everything I've done to mess with them ended up looking worse. And, like I said, I'm actually very pleased with how it looks. Except the goddamn color banding in low light/dark areas of games that's driving me fricken crazy! As I'd imagine it would do to anybody. I don't know why they don't take special care to make sure it never happens. Man...I just want to go back to CRT. Nothing beats CRT image quality...PERIOD. It's just the matters of screen size and/or resolution that's poor with them. Other than that they can't be beat. All this LED, LCD, OLED, QLED, Micro LED, VA, TN, IPS bullshit is just that. BULLSHIT!!!
Try black equalizer, maybe?
Posted on Reply
#16
las
ZubasaI am mainly talking about color reproduction and static contrast.
I had the MG278Q so I know what an actual 8-bit TN looks like; it's okay, but not nearly as good as the CHG70.
Monitors are usually rated at their max brightness to cheat the contrast ratio; in the case of the Samsung VAs, even the cheaper 1080p CFG70 managed 2800:1 of its rated 3000:1 even at 130 nits (stock is 300 nits).
Some of the other monitors, especially the IPS ones, only get anywhere near their rated contrast ratio at eye-blinding brightness.

HDR and Wide Color Gamut signals take longer to process, so in general HDR monitors have higher input lag.
The best e-sports monitors are from BenQ and manage 9 ms true input lag, but of course those look as bad as cheap TN, and they intentionally induce flicker via backlight strobing to give you better motion clarity.
The PG27UQ on the other hand has over 20 ms true input lag, so there is that.

Now, on the issues of the CHG70: there is overshoot / reverse ghosting when pixel overdrive is on.
This is especially true on older firmware for this monitor, so try updating the firmware if you haven't already done so.
The monitor is brand new and came with newest firmware.

I'm not saying it's a bad monitor. It's just that HDR is not impressive and it's not suitable for fast-paced games (if you ask me - this is true for all VA monitors tho).

Both Asus and Dell have monitors with sub-5 ms total input lag. The PG278QR and Dell S2716DG (and the new S2719DGF) are all very low.
Even the PG279Q (AHVA/IPS) I'm using now has ~5 ms on avg (low 4.1 and high 6.5) - www.tftcentral.co.uk/reviews/asus_rog_swift_pg279q.htm#response_times

This is why VA is a problem for fast-paced games: www.tftcentral.co.uk/reviews/samsung_c32hg70.htm#response
The high end is 40-50 and sometimes 60 ms.

Conclusion: "Some slow response times resulting in dark smearing on some content. Fairly typical of VA panels"


You are right about HDR increasing total input lag. But if the monitor runs in SDR mode, it should be the same. I don't know of any fast-paced MP games that support HDR on PC, so you should get the "low" input lag.

I have not tried the PG27UQ because I have no interest in 2160p/4K at only 27". IMO you need 32 inches for this resolution.
Posted on Reply
#17
mtcn77
How is HDR increasing lag? I think we discussed this, AMD GPU vs Nvidia GPU.
Posted on Reply
#18
las
mtcn77How is HDR increasing lag? I think we discussed this, AMD gpu vs Nvidia gpu.
More processing etc.
Posted on Reply
#19
mtcn77
lasMore processing etc.
Nope, ComputerBase checked it.
  • These aren't affected:
    • Assassin's Creed Origins – 3840 × 2160: FPS, average
    • Battlefield 1 – 3840 × 2160: FPS, average
    • F1 2017 – 3840 × 2160: FPS, average
    • Final Fantasy XV – 3840 × 2160: FPS, average
    • Mass Effect: Andromeda – 3840 × 2160: FPS, average
    • Shadow Warrior 2: FPS, average
    • Star Wars: Battlefront 2 – 3840 × 2160: FPS, average
  • These are affected by >1%:
    • Call of Duty: WWII – 3840 × 2160: FPS, average
    • Destiny 2 – 3840 × 2160: FPS, average
    • Far Cry 5 – 3840 × 2160: FPS, average
    • Middle-earth: Shadow of War – 3840 × 2160: FPS, average
    • Resident Evil 7 – Biohazard: FPS, average
Posted on Reply
#20
las
mtcn77Nope, ComputerBase checked it.
  • These aren't affected:
    • Assassin's Creed Origins – 3840 × 2160: FPS, average
    • Battlefield 1 – 3840 × 2160: FPS, average
    • F1 2017 – 3840 × 2160: FPS, average
    • Final Fantasy XV – 3840 × 2160: FPS, average
    • Mass Effect: Andromeda – 3840 × 2160: FPS, average
    • Shadow Warrior 2: FPS, average
    • Star Wars: Battlefront 2 – 3840 × 2160: FPS, average
  • These are affected by >1%:
    • Call of Duty: WWII – 3840 × 2160: FPS, average
    • Destiny 2 – 3840 × 2160: FPS, average
    • Far Cry 5 – 3840 × 2160: FPS, average
    • Middle-earth: Shadow of War – 3840 × 2160: FPS, average
    • Resident Evil 7 – Biohazard: FPS, average
We're talking input lag, not fps
Posted on Reply
#21
mtcn77
lasWe're talking input lag, not fps
The processing is done on the GPU.
Posted on Reply
#22
las
mtcn77The processing is done on the gpu.
No.

Enabling HDR on a TV or monitor increases total input lag. Has nothing to do with framerate.
Posted on Reply
#23
mtcn77
lasNo.

Enabling HDR on a TV or monitor increases total input lag. Has nothing to do with framerate.
I checked. No, it isn't.
Posted on Reply
#24
las
mtcn77I checked. No, it isn't.
Yes. This is why TV reviews contain input lag testing for both SDR and HDR.
Posted on Reply
#25
Zubasa
lasThe monitor is brand new and came with newest firmware.

I'm not saying it's a bad monitor. It's just that HDR is not impressive and it's not suitable for fast-paced games (if you ask me - this is true for all VA monitors tho).

Both Asus and Dell have monitors with sub-5 ms total input lag. The PG278QR and Dell S2716DG (and the new S2719DGF) are all very low.
Even the PG279Q (AHVA/IPS) I'm using now has ~5 ms on avg (low 4.1 and high 6.5) - www.tftcentral.co.uk/reviews/asus_rog_swift_pg279q.htm#response_times

This is why VA is a problem for fast-paced games: www.tftcentral.co.uk/reviews/samsung_c32hg70.htm#response
The high end is 40-50 and sometimes 60 ms.

Conclusion: "Some slow response times resulting in dark smearing on some content. Fairly typical of VA panels"


You are right about HDR increasing total input lag. But if the monitor runs in SDR mode, it should be the same. I don't know of any fast-paced MP games that support HDR on PC, so you should get the "low" input lag.

I have not tried the PG27UQ because I have no interest in 2160p/4K at only 27". IMO you need 32 inches for this resolution.
VA in general is not great for response time, I agree.
But IPS and TN monitors are not always better.
For example, the TN PG278Q on average is actually slower than the IPS PG279Q, and the non-HDR 3440x1440 PG348Q has worse input lag than the 4K PG27UQ.
Also, at least for average input lag, the C32HG70, which is the 32-inch version, is actually not the slowest, compared to the TN XG35VQ.
www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm#lag
Posted on Reply