Monday, June 25th 2018

NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used in first-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model number is Arria 10 GX 480, a high-performance 20-nanometer FPGA that provides enough bandwidth and LVDS pins to process the data stream.
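
For a sense of why such a large FPGA is needed, the raw pixel bandwidth at this spec is easy to estimate (a back-of-the-envelope sketch; blanking intervals, link-encoding overhead, and the chroma subsampling these monitors use at 144 Hz all change the exact figure):

# Uncompressed video bandwidth at 4K / 144 Hz / 10-bit RGB (HDR).
# Back-of-the-envelope only: ignores blanking, link encoding overhead,
# and chroma subsampling.
width, height, refresh_hz = 3840, 2160, 144
bits_per_pixel = 3 * 10  # 10 bits per RGB channel
gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gbit/s of pixel data")  # ~35.8 Gbit/s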

In low quantities, the FPGA sells for around $2,000 at Digikey and Mouser. Assuming NVIDIA buys in the thousands, PCPer suggests that this chip alone adds roughly $500 to the monitor's cost. The bill of materials is further increased by 3 GB of DDR4 memory on the module. Add licensing fees for G-SYNC, and it becomes clear why these monitors are so expensive.
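
As a rough sanity check on that $500 figure, here is the arithmetic with the volume discount made explicit (the discount and DDR4 price are illustrative assumptions; only the $2,000 list price and the 3 GB figure come from the article):

# Back-of-the-envelope BOM for the G-Sync HDR module. Only the $2,000
# low-quantity FPGA price and the 3 GB DDR4 figure are from the article;
# the discount and memory price are illustrative assumptions.
fpga_list_price = 2000.0        # Arria 10 GX 480 at Digikey/Mouser, low quantity
assumed_discount = 0.75         # hypothetical 75% volume discount for NVIDIA
fpga_cost = fpga_list_price * (1 - assumed_discount)  # -> $500

ddr4_gb = 3
assumed_price_per_gb = 8.0      # hypothetical DDR4 $/GB
memory_cost = ddr4_gb * assumed_price_per_gb

print(f"FPGA ~${fpga_cost:.0f} + DDR4 ~${memory_cost:.0f} "
      f"= ~${fpga_cost + memory_cost:.0f} before licensing")
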
Sources: PCPer Review, Altera Product Page

94 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

#1
TheinsanegamerN
The price you pay for high resolution and HDR. Between this module, the panel, and the GPU needed to drive it, the total cost dwarfs a 1440p/144 Hz setup.
#2
TheLostSwede
News Editor
Yet G-Sync feels like an overrated technology on the whole. I'm not overly impressed by my own screen; I'm just glad I didn't pay full price for it.
The fact that they're using an FPGA suggests NVIDIA doesn't expect to sell these screens in the kind of volume where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-Sync is.
#3
TheinsanegamerN
TheLostSwede: Yet G-Sync feels like an overrated technology on the whole. I'm not overly impressed by my own screen; I'm just glad I didn't pay full price for it.
The fact that they're using an FPGA suggests NVIDIA doesn't expect to sell these screens in the kind of volume where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-Sync is.
I think it depends on both the games you play and your setup. On a 144 Hz setup with fast-paced FPS games, it makes a huge difference in smoothness and usability vs. using v-sync. But in slower games like RTS titles, or at lower refresh rates, G-Sync's usefulness is diminished.

Its biggest advantage is not having to use v-sync, which helps in any reaction-based game.
#4
las
G-Sync and FreeSync are a joke unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps on a 120+ Hz monitor.

No serious gamer should use VSYNC. It adds input lag.

I have G-Sync. I use ULMB instead; it's way better. Any gaming LCD should use black frame insertion. Much less blur in fast-paced games.
#5
Vya Domus
las: No serious gamer should use VSYNC. It adds input lag.
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks a time frame of 16 ms or less makes a perceivable difference is being delusional.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
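
For reference, the numbers being argued over look like this (a simplified sketch: actual v-sync latency depends on buffering and the game, and reaction time varies from person to person):

# Frame interval vs. the ~250 ms average reaction time cited above.
# Simplified: double-buffered v-sync can add up to roughly two frames
# of latency; real render pipelines vary.
reaction_ms = 250.0
for hz in (60, 144):
    frame_ms = 1000.0 / hz
    worst_vsync_ms = 2 * frame_ms
    print(f"{hz:>3} Hz: {frame_ms:.1f} ms/frame, v-sync worst case "
          f"~{worst_vsync_ms:.1f} ms ({worst_vsync_ms / reaction_ms:.0%} of reaction time)")
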
#6
las
Vya Domus: I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks a time frame of 16 ms or less makes a perceivable difference is being delusional.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
Delusional? Haha, I can feel it instantly. You sound like a casual gamer, and you probably are with that CPU. Your 60 Hz TV has tons of input lag too. No wonder you can't tell the difference between VSYNC on and off.

Go try a low-input-lag 120-240 Hz monitor at 120+ fps...
#7
atomicus
TheinsanegamerN: I think it depends on both the games you play and your setup. On a 144 Hz setup with fast-paced FPS games, it makes a huge difference in smoothness and usability vs. using v-sync. But in slower games like RTS titles, or at lower refresh rates, G-Sync's usefulness is diminished.

Its biggest advantage is not having to use v-sync, which helps in any reaction-based game.
You can certainly still see tearing at 144 Hz, despite what some people say, and G-Sync can of course help with this. But a tear at that refresh rate doesn't last as long as it does at lower refresh rates, so it's wrong to say G-Sync makes a "huge" difference once you're reaching that kind of frame rate. Yes, it will help, but it's more beneficial and more appreciated at lower frame rates, where you'd be far more aware of the tearing without it.

I'm all for G-Sync though, and yes, it means you don't need v-sync; ultimately, 144 Hz with G-Sync is definitely going to give the smoothest gaming experience. Of course, if your GPU can't get anywhere close to pushing the monitor that high, it's a moot point. Let's also not forget that input lag and certain panel characteristics (smearing, ghosting, gamma shift, etc.) factor heavily into the overall gaming experience, with some people far more sensitive to these things than others.
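
The refresh-rate point is easy to quantify: a tear line drawn during one scanout is overwritten on the next refresh, so it persists for at most one refresh period (a simplified model that ignores pixel response time):

# Upper bound on how long a single tear line stays on screen, assuming
# it is overwritten on the next refresh (ignores pixel response and
# where in the scanout the tear lands).
for hz in (60, 144):
    print(f"{hz:>3} Hz: tear visible for at most {1000.0 / hz:.1f} ms")
# ~16.7 ms at 60 Hz vs. ~6.9 ms at 144 Hz: less than half as long,
# which is why tearing is less objectionable at high refresh, not absent.
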
#9
Space Lynx
Astronaut
Gaming feels smoother to me when I use G-Sync, so I don't know what to tell the naysayers. I guess turn it off and don't enjoy it. I personally notice that my overall gaming experience is smoother.
#10
Vya Domus
lynx29: Gaming feels smoother to me when I use G-Sync
Of course you do: variable refresh rate ensures you only see unique frames, with minimal frame-time variation. That's a huge chunk of what constitutes a smooth gaming experience, not the absence of 10 ms or whatever of response time.
#11
Gasaraki
Vya Domus: Of course you do: variable refresh rate ensures you only see unique frames, with minimal frame-time variation. That's a huge chunk of what constitutes a smooth gaming experience, not the absence of 10 ms or whatever of response time.
Yeah, people have no idea what G-Sync and FreeSync do. Running a monitor at 120 Hz+ without VRR technology doesn't make tearing go away; you need to sync up the frames between the video card and the monitor.
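
A toy model makes the point concrete: with no synchronization, a buffer swap can land mid-scanout, and that refresh then mixes two frames, which is the tear. Raising the refresh rate only shortens how long the tear stays on screen (illustrative timings, not measurements):

# Toy model: frames delivered at ~200 fps, unsynced, into a 144 Hz scanout.
# Any swap that lands mid-scanout splits that refresh across two frames.
# VRR avoids this by starting each scanout only when a full frame is ready.
refresh_ms = 1000.0 / 144   # scanout period
render_ms = 1000.0 / 200    # time between buffer swaps

torn, t_swap = 0, render_ms
for refresh in range(1000):
    start = refresh * refresh_ms
    tear_here = False
    while t_swap < start + refresh_ms:
        if t_swap > start:  # swap landed mid-scanout
            tear_here = True
        t_swap += render_ms
    torn += tear_here
print(f"{torn} of 1000 refreshes show a tear line")  # essentially all of them
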
#12
Liviu Cojocaru
It is kind of weird to have a small PC powering a monitor to connect it to an actual computer :D
#13
GamerNerves
FreeSync and G-Sync are very welcome technologies. It is completely up to the individual whether they notice the advantages or not. These adaptive synchronization technologies can eliminate microstutter and tearing, at any refresh rate the monitor supports, with none of the added input lag that regular vertical synchronization induces.

My personal opinion is that tearing is still noticeable above 120 Hz, but it doesn't bother me there, unlike at 60 Hz, where it certainly does. I can definitely enjoy games with such faint tearing. However, depending on the game being played, microstutter can occur for various reasons, and adaptive sync is often able to reduce or completely eliminate it.
#14
qubit
Overclocked quantum bit
I'd love to know what G-SYNC does so much better than FreeSync, to achieve a similar result, that it takes so much more engineering and money. I can only think the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive-G-SYNC vs. free-FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll eventually end up making GeForce cards that support FreeSync. That would be no bad thing for the consumer, presumably.
#15
las
Gasaraki: Yeah, people have no idea what G-Sync and FreeSync do. Running a monitor at 120 Hz+ without VRR technology doesn't make tearing go away; you need to sync up the frames between the video card and the monitor.
It does not matter when frames are replaced that fast. No pro gamers use VSYNC or adaptive sync. Wonder why...
VRR is mainly for low-fps gaming. Both G-Sync and FreeSync have been shown to add input lag vs. VSYNC OFF; how much depends on the game.

Once again, ULMB is far superior to G-Sync. I have G-Sync and it's a joke at high fps. ULMB delivers CRT-like motion with no blur. Why on earth would I use G-Sync when I can have buttery-smooth motion?

Most people here don't have a clue how smooth games CAN run: 120+ fps at 120+ Hz with ULMB... Come again when you have tried it.
#16
CrAsHnBuRnXp
las: You sound like a casual gamer, and you probably are with that CPU.
Always with the personal attacks.
#17
Liviu Cojocaru
qubit: I'd love to know what G-SYNC does so much better than FreeSync, to achieve a similar result, that it takes so much more engineering and money. I can only think the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive-G-SYNC vs. free-FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll eventually end up making GeForce cards that support FreeSync. That would be no bad thing for the consumer, presumably.
I agree. I think they need to step up their game and use the hardware in the graphics card to do most of the work.
#18
kastriot
Ahahahaahahahahahahah how pathetic...
#19
TheinsanegamerN
qubit: I'd love to know what G-SYNC does so much better than FreeSync, to achieve a similar result, that it takes so much more engineering and money. I can only think the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive-G-SYNC vs. free-FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll eventually end up making GeForce cards that support FreeSync. That would be no bad thing for the consumer, presumably.
One simple answer: the backing of NVIDIA GPUs.

You want to use adaptive sync with a 1080 Ti? You must use G-Sync. The 144 Hz FreeSync limit doesn't matter all that much, because AMD doesn't make a GPU that can actually push that frame rate at any reasonable detail level. When FreeSync was being pushed hard, the 480 was the best AMD had to offer.

I imagine there is a lot more you can do with hardware-based monitor expansions vs. the FreeSync standard, but I doubt NVIDIA will pursue that path until AMD can be bothered to compete.
#20
qubit
Overclocked quantum bit
TheinsanegamerN: One simple answer: the backing of NVIDIA GPUs.

You want to use adaptive sync with a 1080 Ti? You must use G-Sync. The 144 Hz FreeSync limit doesn't matter all that much, because AMD doesn't make a GPU that can actually push that frame rate at any reasonable detail level. When FreeSync was being pushed hard, the 480 was the best AMD had to offer.

I imagine there is a lot more you can do with hardware-based monitor expansions vs. the FreeSync standard, but I doubt NVIDIA will pursue that path until AMD can be bothered to compete.
Thing is, attempting consumer or business lock-in is something most companies try at some point, but it can easily backfire on them when they're undercut by the cheaper rival, regardless of technical merit. I think that's happening here between G-SYNC and FreeSync.
#21
TheinsanegamerN
qubit: Thing is, attempting consumer or business lock-in is something most companies try at some point, but it can easily backfire on them when they're undercut by the cheaper rival, regardless of technical merit. I think that's happening here between G-SYNC and FreeSync.
I want to agree, but I don't see FreeSync winning any victories here, as NVIDIA GPUs dominate Steam's numbers and AMD faffs around with Vega.

The NVIDIA lock-in will absolutely backfire on them the moment AMD gets its act together.
#22
phanbuey
I mean, is that because of NVIDIA, or just because HDR is still teething...?

How many of these modules are out there at the moment?

Seems like an issue that will fix itself with economies of scale and less bloated implementations of HDR G-Sync boards.
#23
cucker tarlson
Vya Domus is just so clueless, just like in most of his posts. Thinks he knows best. Plays on a TV - says HRR/ULMB is a gimmick. Plays on an FX-6300 - says his CPU doesn't bottleneck GPUs in new games. Fights against NVIDIA's anti-consumer practices - runs a 1060. This man is a walking casserole of contradiction and nonsense.
Vya Domus: Of course you do: variable refresh rate ensures you only see unique frames, with minimal frame-time variation. That's a huge chunk of what constitutes a smooth gaming experience, not the absence of 10 ms or whatever of response time.
I agree with the first sentence 110%, and I think you hit the nail on the head there, but what you wrote at the end of the second one is so stupid it cancels the right part out. :laugh: Of course low lag is important, as is fast pixel response. That's why I'll never buy a VA panel for FPS games.

@las
ULMB is better than G-Sync, but it requires A LOT more CPU and GPU horsepower to run. Strobing is very hard on my eyes at anything less than 120 Hz. Plus, running ULMB with a wide fps swing (say 90-130 fps) and v-sync off produces less fluid animation, unless you play with Fast Sync, which in my experience only produces the desired effect at very high frame rates (~200 fps or higher for me). ULMB with v-sync on feels very fluid and has very, very little lag. G-Sync is incredible for making animation look smooth at lower fps, with very little added lag and no tearing. Of course there's more blur than with ULMB, but games still feel very, very fluid.

I'd say for me the hierarchy goes like this:

1. ULMB at a locked 120 fps with v-sync on - but that's just impossible to run in most modern games.
2. G-Sync at an average of 90 fps or higher - the one I use most often, given the insane requirements for no. 1.
3. ULMB at an average of 100+ fps with Fast Sync.

I prefer fluid animation with no stutter. I pick up on it instantly when I play with v-sync off, even at 165 Hz and 150 fps. It's not even about tearing, though I see that at 165 Hz too; I notice a lack of frame synchronization immediately. That's just me. That's why, to me, G-Sync is the best thing I've seen implemented in gaming in recent years, by a mile.
#24
Vya Domus
cucker tarlson: Vya Domus is just so clueless, just like in most of his posts. Thinks he knows best. Plays on a TV - says HRR/ULMB is a gimmick. Plays on an FX-6300 - says his CPU doesn't bottleneck GPUs in new games. Fights against NVIDIA's anti-consumer practices - runs a 1060. This man is a walking contradiction.
Honestly, I am tired of reading your nonsense everywhere. I am pretty tolerant of this sort of stuff, but I genuinely feel like I am reading spam. You always pick a fight and try to insult me before you even argue with what I say, and however unsuccessful you are at that, you can't contain yourself from showing how toxic you are at every opportunity. You've proven to me numerous times that you can't read and interpret information properly, so I have no need to waste my time with you anymore. You don't need to waste yours either, since you'll be on ignore from now on.
#25
cucker tarlson
You just can't take whatever you find written on the internet by someone who calls themselves an expert and hold it up against the personal experience of just about anyone who has jumped from 60 Hz to 120 Hz or higher. This is ridiculous. But I know you'd rather take French leave when people point out the obvious flaws in your thinking and the gaps in your experience and knowledge than admit that even an expert can be 100% wrong, and therefore you can be too.

And why is it that people's first reaction, when someone calls them out on their statements, is to block or ignore? I've seen dozens of people point out my bad thinking, and among hundreds of things I don't agree with, I can still find a lot that changed my perspective. One can't be 100% right at all times, but blocking whatever you see that you don't like is cowardly.