Monday, June 25th 2018

NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that its G-Sync module is a newer version than the one used in first-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is built around an FPGA made by Altera (Intel-owned since 2015). The exact model is the Arria 10 GX 480, a high-performance 20-nanometer FPGA that provides enough bandwidth and LVDS pins to process the data stream.
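
For scale, the raw pixel stream such a panel produces can be sketched with simple arithmetic; the 10-bit-per-channel RGB figure below is our assumption (for HDR), and blanking intervals are ignored:

# Rough, hypothetical estimate of the raw video bandwidth the module must handle.
# 10 bits per channel and "no blanking" are assumptions, not figures from the article.
width, height, refresh_hz = 3840, 2160, 144
bits_per_pixel = 3 * 10                                     # RGB, 10 bits per channel
bandwidth_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{bandwidth_gbps:.1f} Gbit/s of raw pixel data")    # roughly 35.8 Gbit/s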

The FPGA sells in low quantities for around $2,000 at Digikey and Mouser. Assuming NVIDIA buys in the thousands, PCPer estimates that this chip alone adds roughly $500 to the monitor's cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. Add G-SYNC licensing fees on top, and it becomes clear why these monitors are so expensive.
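
As a back-of-envelope illustration of that estimate: only the $2,000 low-quantity list price and the ~$500 per-monitor figure come from the article; the volume-discount factor below is purely a hypothetical assumption.

# Sketch of the FPGA cost estimate. The discount is a hypothetical assumption,
# not a figure from PCPer or NVIDIA.
fpga_list_price = 2000.0            # USD, low-quantity price at Digikey/Mouser
assumed_volume_discount = 0.75      # hypothetical bulk discount
cost_per_monitor = fpga_list_price * (1 - assumed_volume_discount)
print(f"Estimated FPGA cost per monitor: ${cost_per_monitor:.0f}")   # ~$500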
Sources: PCPer Review, Altera Product Page

94 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

#52
Fluffmeister
Is it true HDR? And is it ironic that Freesync is a trademark?

The reality is that a lot of this display nonsense is expensive and it will get cheaper over time; I just can't get invested in the salty tears of the dodgy supply chain.
Posted on Reply
#53
Xzibit
I think Intel is missing an opportunity here. Intel owns Altera.

All G-Sync monitors should have an "Intel Inside" sticker.
Posted on Reply
#54
Fluffmeister
Overpriced Intel/Altera tech and DRAM companies price-fixing. We spend years waiting for the competition to turn up, then suddenly some get into bed together with some weird Kaby G Vega/Polaris M Frankenstein lark.

We all have Intel inside, God bless them.
Posted on Reply
#55
Vya Domus
FordGT90ConceptPersistence of vision just being *slightly* off can cause motion sickness or dizziness.
And single-digit ms variations can cause that? We are talking a few 1/1000ths of a second. I was unable to find any study or piece of information that can confirm humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You are telling me something like 16 ms can have a measurable effect?
Posted on Reply
#56
qubit
Overclocked quantum bit
B-RealJust check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.
Yeah, that's what I'm saying, FreeSync is becoming the de facto standard. NVIDIA better wake up and smell the roses before it's too late.
Posted on Reply
#57
TheGuruStud
lasGsync and Freesync are a joke, unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps using 120+ Hz.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast-paced games.
I used to play exclusively at 120 fps / 120 Hz on a huge CRT back in the day and even then I needed vsync. Get a clue, tearing is monumentally distracting and drives me nuts at any rate. You must be blind. Adaptive sync is awesome and the future for all gaming. Go get some more milk from mommy.
Posted on Reply
#58
Jism
Vya DomusI laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta: using V-sync at 60 fps and a shitty wireless mouse, I was pretty much always in the top 3. With all that "unbearable lag", go figure.
Yeah, and why do you think there is a professional gaming market playing on systems at 144 FPS?

I can feel a difference between Vsync on (capped at 72 Hz) or 72 Hz without Vsync, and 110 FPS produced by my GPU. The difference is that I can literally see the input lag when sweeping left to right in PUBG, for example. Your GPU is being constrained to your refresh rate, and this constraint adds input lag. Maybe you or your neighbour doesn't notice this, but anyone playing an FPS game will notice / feel the difference.

Play long enough and you'll understand that at some point 60 Hz / 60 FPS becomes a limitation in any fast-paced FPS game.
Posted on Reply
#59
Prima.Vera
TheLostSwedeYet G-Sync feels like an overrated technology on the whole. Not overly impressed by my own screen, just glad I didn't pay full price for it.
The fact that they're using an FPGA suggests Nvidia doesn't expect to sell the kind of volume of these screens where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-Sync is.
I completely agree. I also have a G-Spot monitor, but to be honest, playing at 1440p with a GTX 1080 I cannot go past 100 fps anyway, and it's always averaging between 40-80 fps. So I never get any tearing whether those x-SYNCs are enabled or not... Plus fluidity... hmmm
Posted on Reply
#60
FordGT90Concept
"I go fast!1!11!1!"
Vya DomusAnd single-digit ms variations can cause that? We are talking a few 1/1000ths of a second. I was unable to find any study or piece of information that can confirm humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You are telling me something like 16 ms can have a measurable effect?
en.wikipedia.org/wiki/Subliminal_stimuli

Just because the higher orders of the brain ignore something doesn't mean it didn't register with lower orders of brain function.

A higher refresh rate and more frames translate to a smoother, clearer picture. Example: get in a vehicle and drive the speed limit, then focus on the grass just beyond the road out the passenger window. Your brain will take all of that data, force your eyes to hook on to a specific reference point, and take a snapshot of it. In that instant, you'll have clarity. Try to do the same thing with video and the clarity simply isn't there. There are huge gaps in the data between frames. Your brain will end up hooking on to a single frame and recalling that picture.

Again, this has nothing to do with reaction time and everything to do with how the brain handles eyesight:
en.wikipedia.org/wiki/Persistence_of_vision
Posted on Reply
#61
cucker tarlson
B-RealJust check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.
Freesync equivalents lack strobing, which is probably the best technology for gaming ATM as long as you can keep a constant, high fps.
Vya DomusAnd single-digit ms variations can cause that? We are talking a few 1/1000ths of a second. I was unable to find any study or piece of information that can confirm humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You are telling me something like 16 ms can have a measurable effect?
You're confusing reaction time with what one can observe. Still confused. And why do you need a study? There are a million people who can tell you this and, better yet, you can get a high-refresh display yourself instead of relying on some digits taken out of context. What "human reaction" varies by 50 ms? Certainly not players' muscle memory when they play shooters. You're taking stuff out of context again.
Posted on Reply
#63
cucker tarlson
FordGT90Concept

"Surprisingly, there’s no overdrive setting in the OSD menu. Instead, the ‘Response Time’ setting offers Standard, Faster, and Fastest options. Both Faster and Fastest modes enable backlight strobing which delivers the specified 1ms response time speed."

G-SYNC vs FreeSync is like SLI versus Crossfire. The former is hardware, the latter is software.
Ah, good find.
A ‘1 ms MPRT’ (Moving Picture Response Time) is specified, achieved using the ‘Impulsive Scanning’ strobe backlight mode.
Though I wonder how dark a VA panel gets in strobing mode.
Posted on Reply
#64
las
TheGuruStudI used to play exclusively at 120 fps / 120 Hz on a huge CRT back in the day and even then I needed vsync. Get a clue, tearing is monumentally distracting and drives me nuts at any rate. You must be blind. Adaptive sync is awesome and the future for all gaming. Go get some more milk from mommy.
You used VSYNC? Hahaha... You like input lag? Casual gamer detected. Case closed.
Posted on Reply
#66
las
TheGuruStudBan this retard, please. He knows nothing. He's never even used a CRT nor even knows the amount of input lag incurred from anything. And thinks that you can actually see anything while gaming with tearing. Mommy should have aborted this mouth breather.
Read and learn - www.blurbusters.com/faq/motion-blur-reduction/

Professional gamers don't use Gsync or Freesync. Why do you think that is?

Btw, how old are you? Talking about CRTs yet acting like a teen :laugh:
Posted on Reply
#67
Pure Wop
lasGsync and Freesync are a joke, unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps using 120+ Hz.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast-paced games.
The problem is that even on my 3440×1440 monitor, 120+ fps requires SLI and good game support for SLI, not to mention on 4K ones.

To my surprise I can't find ULMB ultrawides either.
Posted on Reply
#68
las
TheGuruStudDo you think you're a pro gamer? Lolololololololololololololololololol. I bet you also think you're a world class driver in mommy's Civic, dumb ass. Tearing ruins gaming a lot more than 10ms.

And it's funny, vsync didn't stop me from ROFL stomping wannabes like yourself online in every FPS.
You are clueless. Pro gamers use what's best, and they don't use Gsync or Freesync.
Nah I'm not pro, but I know how games are supposed to run. No motion blur and lowest possible input lag.

Seriously, how old are you? Hahaha. You act like a mad teen. Ragekid?
Posted on Reply
#70
bug
lasYou are clueless. Pro gamers use what's best, and they don't use Gsync or Freesync.
That is where you're wrong. In the absence of statistics showing that gamers using fast-refresh screens consistently win more than the others, the more likely explanation is that gamers use whatever the sponsors make available to them.
If refresh were paramount, there'd be a plethora of CRTs at every competition.

Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to see what car you need to buy next.
Posted on Reply
#71
las
bugThat is where you're wrong. In the absence of statistics showing that gamers using fast-refresh screens consistently win more than the others, the more likely explanation is that gamers use whatever the sponsors make available to them.
If refresh were paramount, there'd be a plethora of CRTs at every competition.

Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to see what car you need to buy next.
Not really the same. They are not superhumans. Anyone will benefit from no motion blur. LCD/OLED has tons of motion blur without a proper blur-reduction mode.
Have you tried gaming with a motion-blur-reduction mode? It's like playing on a CRT again.

Posted on Reply
#72
bug
lasNot really the same. They are not superhumans. Anyone will benefit from no motion blur. LCD/OLED has tons of motion blur without a proper blur-reduction mode.
Have you tried gaming with a motion-blur-reduction mode? It's like playing on a CRT again.
You continue to assume gaming can only happen on the very high end...
Posted on Reply
#73
lexluthermiester
TheGuruStudGo get some more milk from mommy.
You were making a good point up until that.
TheGuruStudMommy should have aborted this mouth breather.
Really?

Maybe you're the one that needs to grow up, eh?

I do agree with adaptive-sync being very useful, but whether or not it's the future remains to be seen.
lasIt's like playing on a CRT again.
CRTs had motion blur as well. It was less pronounced at lower refresh rates, but it was still there.
Posted on Reply
#74
las
bugYou continue to assume gaming can only happen on the very high end...
Well, I have Gsync and I choose ULMB over it. It's simply a much better experience. I rarely experience tearing, so why would I accept motion blur all the time when I can have no/low motion blur and lower input lag?

Gsync and Freesync are good for high-res gaming on high settings, where fps can dip way below 100. I sometimes play single-player games with it. But fast-paced multiplayer games? Never. ULMB all the way. It's much easier to track enemies with no motion blur. It's a day-and-night difference.

I'm not telling people how to play their games, but I have tried tons of gaming monitors, and Gsync and Freesync have never really impressed me. Mostly because I very rarely settle for less than 100 fps, and tearing is not a big issue at high fps.
Posted on Reply
#75
bug
lasWell, I have Gsync and I choose ULMB over it. It's simply a much better experience. I rarely experience tearing, so why would I accept motion blur all the time when I can have no/low motion blur and lower input lag?

Gsync and Freesync are good for high-res gaming on high settings, where fps can dip way below 100. I sometimes play single-player games with it. But fast-paced multiplayer games? Never. ULMB all the way. It's much easier to track enemies with no motion blur. It's a day-and-night difference.

I'm not telling people how to play their games, but I have tried tons of gaming monitors, and Gsync and Freesync have never really impressed me. Mostly because I very rarely settle for less than 100 fps, and tearing is not a big issue at high fps.
So all this is you telling us that at high FPS ULMB is better and at low FPS GSync/FreeSync is better? We already knew that.

And about the lag that syncing adds: it means your mouse cursor will lag at most until the next refresh. At 60 fps that's about 16 ms. At 100 fps it's 10 ms. That's still lag, but well into negligible territory.
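
As a rough sketch of that arithmetic, the worst-case added delay is simply one full refresh interval:

# Worst case: the update waits one full refresh interval before being displayed.
def max_sync_lag_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 100, 144):
    print(f"{hz} Hz -> up to {max_sync_lag_ms(hz):.1f} ms of added lag")
# 60 Hz -> ~16.7 ms, 100 Hz -> 10.0 ms, 144 Hz -> ~6.9 ms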
Posted on Reply