Monday, June 25th 2018
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing
PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used in 1st-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model number is Arria 10 GX 480, a high-performance 20-nanometer FPGA that provides enough bandwidth and LVDS pins to process the data stream.
The FPGA sells for $2,000 in low quantities at Digikey and Mouser. Assuming that NVIDIA buys thousands, PCPer suggests that this chip alone adds around $500 to the monitor's cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. With added licensing fees for G-SYNC, this explains why these monitors are so expensive.
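For rough context, here is a minimal back-of-the-envelope sketch of how a ~$500 per-unit figure could follow from the $2,000 list price. The 75% volume discount is purely an assumed value for illustration; only the list price and PCPer's estimate come from the review.

# Back-of-the-envelope estimate of the FPGA's contribution to monitor cost (Python).
# Only the $2,000 low-quantity list price and the ~$500 outcome come from the article;
# the volume-discount factor is an assumption made for this sketch.
LIST_PRICE_USD = 2000            # Arria 10 GX 480 single-unit price at Digikey/Mouser
ASSUMED_VOLUME_DISCOUNT = 0.75   # hypothetical discount for buying in the thousands

unit_cost = LIST_PRICE_USD * (1 - ASSUMED_VOLUME_DISCOUNT)
print(f"Estimated per-unit FPGA cost at volume: ${unit_cost:.0f}")   # ~$500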
Sources:
PCPer Review, Altera Product Page
94 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing
The reality is that a lot of this display nonsense is expensive and will get cheaper over time; I just can't get invested in the salty tears of the dodgy supply chain.
All G-Sync monitors should have
VegaPolaris M frankenstein lark. We all have Intel inside, God bless them.
I can feel a difference between Vsync on (capped at 72 Hz) or 72 Hz without Vsync, and 110 FPS produced by my GPU. The difference is that I can literally see the input lag when sweeping left to right in PUBG, for example. Your GPU is being constrained to your refresh rate, and that constraint adds input lag. Maybe you or your neighbour doesn't notice it, but anyone playing an FPS game will note / feel the difference.
Play long enough and you'll understand that at some point 60 Hz / 60 FPS becomes a limitation in any fast-paced FPS game.
Just because the higher orders of the brain ignore something doesn't mean it didn't register with lower orders of brain function.
Higher refresh rates and more frames translate to a smoother, clearer picture. Example: get in a vehicle and drive the speed limit, then focus on the grass just beyond the road out the passenger window. Your brain will take all of that data, force your eyes to hook on to a specific reference point, and take a snapshot of it. In that instant, you'll have clarity. Try to do the same thing with video and the clarity simply isn't there. There are huge gaps in the data between frames. Your brain will end up hooking on to a single frame and recalling that picture.
Again, this has nothing to do with reaction time and everything to do with how the brain handles eye sight:
en.wikipedia.org/wiki/Persistence_of_vision
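To put a rough number on the clarity argument above: on a sample-and-hold LCD, the perceived blur while your eyes track a moving object is roughly the tracking speed multiplied by how long each frame stays on screen, so doubling the refresh rate roughly halves the blur. A minimal sketch of that approximation (the 1000 px/s tracking speed is just an assumed example value):

# Rough sample-and-hold motion-blur approximation (Python).
# Perceived blur (in pixels) ~ eye-tracking speed * frame persistence.
TRACKING_SPEED_PX_PER_S = 1000   # assumed speed of the tracked object across the screen

for refresh_hz in (60, 100, 144, 240):
    persistence_s = 1 / refresh_hz                     # full-persistence frame time
    blur_px = TRACKING_SPEED_PX_PER_S * persistence_s  # approximate blur trail length
    print(f"{refresh_hz:>3} Hz: ~{blur_px:.1f} px of blur")

Backlight strobing (ULMB and the "Faster" / "Fastest" modes quoted below) attacks the same problem by shortening the persistence instead of raising the refresh rate.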
"Surprisingly, there’s no overdrive setting in the OSD menu. Instead, the ‘Response Time’ setting offers Standard, Faster, and Fastest options. Both Faster and Fastest modes enable backlight strobing which delivers the specified 1ms response time speed."
G-SYNC vs FreeSync is like SLI versus Crossfire. The former is hardware, the latter is software.
Professional gamers don't use Gsync or Freesync. Why do you think?
Btw, how old are you? Talking about CRTs yet acting like a teen :laugh:
To my surprise I can't find ULMB ultrawides either.
Nah I'm not pro, but I know how games are supposed to run. No motion blur and lowest possible input lag.
Seriously, how old are you? Hahaha. You act like a mad teen. Ragekid?
AOC G2460PQU and BenQ XL2730Z are FreeSync panels.
AMD recommends using Enhanced Sync when FreeSync is enabled.
If refresh was paramount, there'd be a plethora of CRTs at every competition.
Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to decide what car you need to buy next.
Have you tried gaming with motion blur reduction mode? It's like playing on CRT again.
Maybe you're the one that needs growing up, eh?
I do agree with adaptive sync being very useful, but whether or not it's the future remains to be seen. CRTs had motion blur as well. It was less pronounced at lower refresh rates, but it was still there.
Gsync and Freesync are good for high-res gaming on high settings, where fps can dip way below 100. I sometimes play single-player games with them. But fast-paced multiplayer games? Never. ULMB all the way. It's much easier to track enemies with no motion blur. It's a day and night difference.
I'm not telling people how to play their games, but I have tried tons of gaming monitors and Gsync and Freesync have never really impressed me. Mostly because I very rarely settle for less than 100 fps, and tearing is not a big issue at high fps.
And about the lag that syncing adds: it means your mouse cursor will lag at most until the next refresh. At 60 fps that's about 16 ms; at 100 fps it's 10 ms. That's still lag, but well into negligible territory.
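A quick sketch of the arithmetic behind that last point: the worst-case added delay from waiting for the next refresh is one full refresh interval, i.e. 1000 / refresh-rate milliseconds (the refresh rates below are just example values):

# Worst-case extra delay from waiting for the next refresh: one frame time (Python).
for refresh_hz in (60, 100, 144):
    frame_time_ms = 1000 / refresh_hz
    print(f"{refresh_hz:>3} Hz: up to ~{frame_time_ms:.1f} ms until the next refresh")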