Monday, June 25th 2018
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing
PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used in first-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model is the Arria 10 GX 480, a high-performance 20-nanometer FPGA that provides enough bandwidth and LVDS pins to process the data stream.
The FPGA sells for around $2,000 in low quantities at Digikey and Mouser. Assuming NVIDIA buys in the thousands, PCPer estimates that this chip alone adds roughly $500 to the monitor's cost. The BOM is further increased by 3 GB of DDR4 memory on the module, and with G-SYNC licensing fees on top, this goes a long way toward explaining why these monitors are so expensive.
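As a rough illustration of how such an estimate might break down, here is a back-of-envelope sketch. Only the ~$2,000 low-quantity list price and PCPer's ~$500 chip figure come from the article; the volume discount, memory, and licensing numbers below are assumptions for illustration, not reported figures.

```python
# Back-of-envelope sketch of the cost reasoning (illustrative assumptions only).

list_price_low_qty = 2000.0      # approx. Digikey/Mouser low-quantity price (USD)
assumed_volume_discount = 0.75   # assumption: ~75% off when buying in the thousands
fpga_cost = list_price_low_qty * (1 - assumed_volume_discount)   # ~$500, PCPer's figure

assumed_ddr4_cost = 25.0         # assumption: 3 GB of DDR4 on the module
assumed_licensing_misc = 50.0    # assumption: G-SYNC licensing, PCB, assembly

module_adder = fpga_cost + assumed_ddr4_cost + assumed_licensing_misc
print(f"Estimated per-monitor adder: ${module_adder:.0f}")
```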
Sources:
PCPer Review, Altera Product Page
94 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing
The fact that they're using an FPGA suggests NVIDIA doesn't expect to sell these screens in the kind of volume where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-Sync is.
Its biggest advantage is not having to use vsync, which helps in any reaction-based game.
No serious gamer should use VSYNC. Adds input lag.
I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast-paced games.
I remember playing the COD WWII beta using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
Go try a low-input-lag 120-240 Hz monitor at 120+ fps...
www.visualexpert.com/Resources/realprt.html
My personal opinion is that at refresh rates above 120 Hz tearing is still noticeable, but it doesn't bother me, unlike at 60 Hz, where it certainly does. I can definitely enjoy games with such barely noticeable tearing. However, depending on the game being played, microstutter can occur for various reasons, and adaptive sync is often able to reduce or completely eliminate it.
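For rough context, the refresh-interval arithmetic behind that observation looks like this; it is a simple illustration of how long a tear line can persist on screen, not a measurement.

```python
# A torn frame is overwritten at the next scanout, so a tear line is visible
# for at most roughly one refresh interval; higher refresh rates shorten it.
for hz in (60, 120, 144, 165, 240):
    refresh_interval_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> tear visible for up to ~{refresh_interval_ms:.1f} ms")
```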
In the end, the difference between expensive G-SYNC and free FreeSync will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is already happening. NVIDIA really needs to work on the pricing of this technology, or they'll eventually end up making GeForce cards that support FreeSync. That would be no bad thing for the consumer, presumably.
VRR is mainly for low-fps gaming. Both G-Sync and FreeSync have been shown to add input lag versus VSYNC OFF; how much depends on the game.
Once again, ULMB is far superior to G-Sync. I have G-Sync and it's a joke at high fps. ULMB delivers CRT-like motion with no blur. Why on earth would I use G-Sync when I can have buttery-smooth motion?
Most people here don't have a clue about how smooth games CAN run. 120+ fps at 120+ Hz with ULMB... Come back when you've tried it.
You want to use adaptive sync with a 1080 Ti? You must use G-Sync. The 144 Hz FreeSync limit doesn't matter all that much because AMD doesn't make a GPU that can actually push that frame rate at any reasonable detail level. When FreeSync was being pushed hard, the RX 480 was the best AMD had to offer.
I imagine there is a lot more you can do with hardware-based monitor expansions versus the FreeSync standard, but I doubt NVIDIA will pursue that path until AMD bothers to compete.
The NVIDIA lock-in will absolutely backfire on them the moment AMD gets their act together.
How many of these modules are out there at the moment?
Seems like that's an issue that will fix itself with economies of scale and less bloated implementations of G-Sync HDR boards.
@las
ULMB is better than G-Sync, but requires A LOT more CPU and GPU horsepower to run. Strobing is very hard on my eyes at anything less than 120 Hz. Plus, running ULMB with a wide fps range (let's say 90-130 fps) and vsync off produces less fluid animation, unless you play with fast sync, but from my experience that only produces the desired effect at very high framerates (~200 fps or higher for me). ULMB with v-sync on feels very fluid and has very, very little lag. G-Sync is incredible for making the animation look smooth at lower fps, with very little added lag and no tearing. Of course there's more blur than with ULMB, but the game still feels very, very fluid.
I'd say for me the hierarchy goes like this:
1. ULMB @ 120 fps locked, vsync on - but that's just impossible to run in most modern games.
2. G-Sync at an avg. of 90 fps or higher - this is the one I most often use due to the insane requirements of no. 1.
3. ULMB at an avg. of 100+ fps with fast sync.
I prefer fluid animation with no stutter. I pick that up instantly when I play with v-sync off, even at 165 Hz and 150 fps. It's not even about tearing, though I see it at 165 Hz too. I notice the lack of frame synchronisation immediately; that's just me. That's why, to me, G-Sync is the best thing I've seen implemented in gaming in recent years, by a mile.
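Here is a toy sketch of that synchronisation point, under the simplifying assumption that a fixed-refresh panel shows each frame at the next scanout while variable refresh shows it as soon as it is ready; the numbers are made up for illustration (vsync off tears mid-scanout instead, but the unevenness argument is similar).

```python
import random

random.seed(0)
refresh_ms = 1000.0 / 165.0                 # fixed 165 Hz panel
# Render times hovering around 150 fps, with a little jitter.
frame_times = [1000.0 / 150.0 + random.uniform(-2.0, 2.0) for _ in range(10)]

# Fixed refresh: each finished frame waits for the next scanout boundary.
t, fixed_presents = 0.0, []
for ft in frame_times:
    t += ft                                 # frame finishes rendering at time t
    fixed_presents.append((int(t / refresh_ms) + 1) * refresh_ms)

# Variable refresh (G-Sync/FreeSync): the panel refreshes when the frame is ready.
t, vrr_presents = 0.0, []
for ft in frame_times:
    t += ft
    vrr_presents.append(t)

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# Fixed-refresh intervals are quantized to multiples of ~6.1 ms (0, 6.1, 12.1, ...),
# while variable-refresh intervals simply track the actual frame times.
print("fixed refresh:   ", intervals(fixed_presents))
print("variable refresh:", intervals(vrr_presents))
```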
And why is it that people's first reaction when they see someone calls them out on their statements is to block or ignore. I've seen dozens of people point out my bad thinking, and among hundreds of things I don't agree with I still can find a lot of things that changed my perspective. One can't be 100% right at all times, but blocking whatever you see that you don't like is cowardly.