Monday, June 25th 2018

NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27-inch 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used on first-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model is the Arria 10 GX 480, a high-performance 20-nanometer FPGA that provides enough bandwidth and LVDS pins to process the data stream.
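For a sense of scale, here is a back-of-the-envelope sketch of the data rates involved (assumed figures; blanking overhead is ignored). Roughly 36 Gbit/s of full-chroma pixel data exceeds even DisplayPort 1.4's usable payload, which is why this class of monitor reportedly falls back to 4:2:2 chroma subsampling at 144 Hz, and why a large FPGA is needed to process the stream:

```python
# Back-of-the-envelope data-rate check for 4K / 144 Hz / HDR
# (illustrative figures only; blanking overhead is ignored).

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 10-bit-per-channel HDR at full 4:4:4 chroma = 30 bits per pixel
full_chroma = data_rate_gbps(3840, 2160, 144, 30)  # ~35.8 Gbit/s
# The same signal with 4:2:2 chroma subsampling = 20 bits per pixel
subsampled = data_rate_gbps(3840, 2160, 144, 20)   # ~23.9 Gbit/s

DP14_PAYLOAD = 25.92  # Gbit/s usable payload of DisplayPort 1.4 (HBR3)
print(f"4:4:4: {full_chroma:.1f} Gbit/s (exceeds DP 1.4's {DP14_PAYLOAD} Gbit/s)")
print(f"4:2:2: {subsampled:.1f} Gbit/s (fits)")
```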

The FPGA sells for around $2,000 in low quantities at Digi-Key and Mouser. Assuming that NVIDIA buys thousands, PCPer suggests the price of this chip alone adds $500 to the monitor's cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. Together with licensing fees for G-SYNC, this explains why these monitors are so expensive.
Sources: PCPer Review, Altera Product Page

94 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

#76
las
bug: So all this is you telling us that at high FPS ULMB is better and at low FPS G-Sync/FreeSync is better? We already knew that.

And about the lag that syncing adds: it means your mouse cursor will be lagging at most until the next refresh. At 60 fps that's 16 ms. At 100 fps it's 10 ms. That's still lag, but well into negligible territory.
I'm saying that high-fps ULMB is better than high-fps G-Sync/FreeSync.

www.blurbusters.com/persistence-vs-motion-blur/

It all adds up. I'm going for the lowest possible input lag.
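For reference, the worst-case figures quoted above generalize to any refresh rate; a minimal sketch, assuming the added sync lag is bounded by one full refresh interval:

```python
# Worst-case added lag from syncing: at most one refresh interval.
for hz in (60, 100, 120, 144, 240):
    print(f"{hz:>3} Hz -> up to {1000 / hz:.2f} ms of added lag")
```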
#77
lexluthermiester
las: It all adds up. I'm going for the lowest possible input lag.
You're missing something very important. There is a point where the pursuit of gaming perfection actually gets in the way of gaming enjoyment. I speak from experience. 120 Hz is a very good high bar that isn't out of reach of the average PC gamer. However, 99% of all games are still very enjoyable at 60 Hz, which is one of the reasons it's a standard across the industry.
#78
las
lexluthermiester: You're missing something very important. There is a point where the pursuit of gaming perfection actually gets in the way of gaming enjoyment. I speak from experience. 120 Hz is a very good high bar that isn't out of reach of the average PC gamer. However, 99% of all games are still very enjoyable at 60 Hz, which is one of the reasons it's a standard across the industry.
30-60 fps and 60 Hz are more of a compromise born of hardware limitations.

High-end TVs, smartphones, tablets, PC monitors, etc.: you see 120 Hz support on tons of devices these days. 60 Hz was never enough. Yes, it's "acceptable", but EVERYONE can see the difference between 60 fps/Hz and 120 fps/Hz when set up properly. (Yep, I have seen people buy 120-240 Hz monitors and use them at 60 Hz...) Even many stores that showcase high-refresh-rate monitors run them at 60 Hz. No wonder some people think it's all marketing.

It won't take many years before 120 Hz is the new 60 Hz. At least on high-end stuff.
#79
lexluthermiester
las: It won't take many years before 120 Hz is the new 60 Hz. At least on high-end stuff.
People have been saying that for the better part of two decades, and yet here we are, still very much in a 60 Hz world. The reason is simple: science and the limitations of human visual perception. The human eye can only perceive between 20 and 30 individual frames per second. Above that, it simply becomes a perception of smoothness. While we can tell that there is a difference between 30→60 Hz and 60→120 Hz, we will never see the individual frames. For the vast majority of the human race, 60 Hz will always be seen as fluid and smooth. For elitists like myself, 120 Hz is the standard. Anything above 120 Hz is a waste of time and resources because of the limitations of the human eye. This is factual science, and the display-building industry knows it.
#80
bug
We've had this very conversation back in the CRT days: is 100 Hz enough, or do you really need 240 Hz?
This time, though, the problem is a little different: fast-refresh screens were expensive back then as well, but today, because of higher resolutions, the video cards that can push them are prohibitively expensive.

Edit: this is in reply to post #81
#81
las
I can easily tell that 240 fps/240 Hz is smoother than 120 fps/120 Hz, but 1080p TN is a dealbreaker for me at the moment. 1440p/120-165 Hz IPS is "fine" for now.
It's not a huge difference like 60 fps/60 Hz to 120/120, but it's definitely noticeable and provides even better smoothness.


In the CRT days, 100 Hz was the bare minimum for me. Just because 60 Hz isn't flickering on LCD/OLED doesn't mean it's smooth. It's just as terrible in terms of motion.
lexluthermiester: People have been saying that for the better part of two decades, and yet here we are, still very much in a 60 Hz world.
They might have said that, but it's happening now.

Pretty much all 2018 high-end TVs support 120 Hz native.
The iPad Pro is 120 Hz native. Several Android phones have 90-120 Hz native.
More and more PC monitors and laptops have 120-240 Hz native.

Every single person can see the difference between 60 and 120 Hz. Most just don't know that 60 Hz is crap. It was chosen because of bandwidth and hardware limitations.
#82
bug
@lexluthermiester 60 fps may be fine, but most panels can't even refresh that fast. They need overdrive to keep all transitions at 16 ms or less, and overdrive can, and most of the time does, introduce artifacts. That doesn't make a 60 Hz panel unusable, but it does mean there's room for improvement.
las's 120 Hz with ULMB is actually 60 Hz, but with a black frame inserted after each rendered frame. It's a trick that makes overdrive unnecessary, but it needs panels that can refresh fast.
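A rough sketch of the persistence arithmetic behind that trade-off, assuming the common rule of thumb that perceived motion blur scales with how long each frame stays lit (the duty-cycle values below are illustrative, not measured):

```python
# Persistence: how long each frame remains visible per refresh.
# Perceived motion blur scales roughly linearly with this value.

def persistence_ms(refresh_hz, duty_cycle=1.0):
    """duty_cycle is the fraction of each refresh the frame is lit."""
    return 1000 / refresh_hz * duty_cycle

print(persistence_ms(60))        # 16.67 ms: 60 Hz sample-and-hold
print(persistence_ms(120))       #  8.33 ms: 120 Hz sample-and-hold
print(persistence_ms(120, 0.5))  #  4.17 ms: 120 Hz with black frame insertion
print(persistence_ms(120, 0.25)) #  2.08 ms: strobed backlight (ULMB-like)
```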
#83
lexluthermiester
bug: @lexluthermiester 60 fps may be fine, but most panels can't even refresh that fast. They need overdrive to keep all transitions at 16 ms or less, and overdrive can, and most of the time does, introduce artifacts. That doesn't make a 60 Hz panel unusable, but it does mean there's room for improvement.
las's 120 Hz with ULMB is actually 60 Hz, but with a black frame inserted after each rendered frame. It's a trick that makes overdrive unnecessary, but it needs panels that can refresh fast.
Ah ok, I gotcha. Not all panels are like that.
#84
Steve3p0
Yeah, 1440p HDR FreeSync looks good.
#85
JoniISkandar
TheinsanegamerN: I want to agree, but I don't see FreeSync winning any victories here, as Nvidia GPUs dominate Steam's numbers and AMD faffs around with Vega.

The Nvidia lock-in will absolutely backfire on them the moment AMD gets their act together.
Nvidia's dominance grows because they sponsor more games and add more software that only they understand.

Nvidia's dominance started back in the PhysX era. AMD made the best graphics card, the HD 5870, better than the GTX 480, yet people still bought Nvidia because of MINDSET, not performance. That's what happens when the Nvidia logo is always flying around in every game.
#86
John Naylor
Steevo: It's running security checks to ensure only Nvidia cards are connected; that requires a VM, which needs RAM and a fast processor...
I see it the other way around... the number of FreeSync monitors offering any type of MBR technology is shrinking fast. FreeSync and G-Sync will drop to insignificance for the enthusiast gamer, as 144+ Hz will be the order of the day.
B-Real: Just check how many FreeSync and how many G-Sync monitors are on the market. Even the second-biggest TV manufacturer allowed FreeSync support on some of their 2018 models. People can taunt AMD, but getting a much cheaper monitor with the same specifications is good for everyone.
1. Too bad no such thing exists when you compare "apples to apples" and the price is the same. FreeSync monitors with manufacturer-supplied Motion Blur Reduction (MBR) were roughly the same price as G-Sync monitors when both broke onto the scene. Over time, said manufacturers have included this feature less and less, and the quality of these solutions is variable, as each manufacturer's implementation is different.

2. One of the reasons for the above is the marketplace. nVidia sold more than twice as many GTX 970s as all 20+ AMD 200- and 300-series cards combined. Back then, AMD had no answer to the 970 on up... today, AMD has no answer to the 1060 on up.

3. Let's look at the list of "best gaming monitors":

www.blurbusters.com/faq/120hz-monitors/

Of the 29 monitors ....

.... all 29 have some form of MBR technology
.... 9 are Freesync
.... 20 are G-Sync

Depending on budget (incl. peripherals) and resolution, we recommend:

2160p w/ $3k+ budget ... wait for the 11xx series and 144 Hz IPS HDR G-Sync monitors w/ ULMB
1440p w/ $2k+ budget ... GTX 1070 series and 144 Hz IPS G-Sync monitors w/ ULMB
1080p w/ ~$1.5k budget ... GTX 1060 w/ G-Sync, or at least an MBR-capable (classic LightBoost blur-reduction technology) monitor
1080p w/ <$1.5k budget ... RX 580 w/ FreeSync and/or an MBR-capable (classic LightBoost blur-reduction technology) monitor
JoniISkandar: Nvidia's dominance started back in the PhysX era. AMD made the best graphics card, the HD 5870, better than the GTX 480, yet people still bought Nvidia because of MINDSET, not performance. That's what happens when the Nvidia logo is always flying around in every game.
This reads like Faux News... it doesn't agree with any published test numbers. I don't see any support for that position in TPU reviews, at least in the upper tiers.

www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/31.html

"It seems the GTX 1060 is everything AMD wanted the RX 480 to be"

While that comment was for the reference model, it's the AIB cards that really matter to gaming enthusiasts. Let's give AMD an advantage by comparing the 1060 with the later-generation 580.



We see that the AIB 580 is 1.064 times as fast as the reference 1060 and that, when overclocked, it gains another 7.1%:
www.techpowerup.com/reviews/Gigabyte/AORUS_RX_580_XTR/33.html

So the card is 1.14 (1.064 x 1.071) times as fast as the reference 1060.

The AIB 1060 is 1.041 times as fast as the reference 1060, but it overclocks by 12.1%, making it 1.17 (1.041 x 1.121) times as fast as the reference 1060.

www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/26.html
www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/29.html

GTX 1060 power consumption / noise / OC temps: 138 W / 28 dBA / 63°C
RX 580 power consumption / noise / OC temps: 243 W / 34 dBA / 74°C

Desired AIB 1060s (6 GB) are going for $289 to $329 on Newegg... while desired RX 580s are running about $10 cheaper. Performance-wise, I'd call it almost a wash and worthy of a $10 premium. But the 580 uses 76% more power, requiring the expense of a PSU that is 100 watts larger, which easily erases the cost advantage. It's 50% louder and adds 11°C in GPU temps. And let's not forget that the 580 is a generation later than the 1060, which originally competed against the 480.
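Spelling out the compounding arithmetic above as a quick sketch (figures taken from the quoted TPU reviews):

```python
# Stock AIB advantage times overclocking headroom gives each card's
# total lead over a reference GTX 1060 (figures from the TPU reviews).
rx580_total = 1.064 * (1 + 0.071)    # AIB stock x OC gain -> ~1.14
gtx1060_total = 1.041 * (1 + 0.121)  # AIB stock x OC gain -> ~1.17

print(f"RX 580 (OC):   {rx580_total:.2f}x reference 1060")
print(f"GTX 1060 (OC): {gtx1060_total:.2f}x reference 1060")

# The power gap from the quoted OC numbers: 243 W vs. 138 W
print(f"RX 580 draws {243 / 138 - 1:.0%} more power")  # ~76%
```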

I'd love it if AMD could muster some competition for nVidia, but despite loads of "[insert AMD's new tech here] is going to change everything"... it just hasn't happened. Pre-launch excitement fizzled post-launch as the 200, 300, 400, 500, Fury, and Vega series failed to live up to the hype. We last saw competition at the high end (when both cards were overclocked) with the 780 vs. the 290X, and the 780 Ti rendered that battle irrelevant. But much more than the top-tier battle, I'm worried that nVidia has extended its dominance down another tier with each generation. With the 9xx series, the domination dropped down two tiers to the x70... with the 10xx, it dropped to the x60. Personally, most of the users we have built for are what I call "hardware whores"... total lack of loyalty, jumping on whatever has the best numbers... overclocked.

So no... we PC enthusiasts are not buying anything because of a "mindset", at least at the high end. I will agree, however, that at the low end, "mindset" has value. The best example I can give you here is IBM laptops. IBM made the A20 series, which, back in the days when print media dominated, was awarded best laptop every year. It was very expensive and could easily run upwards of $3K. And if you wanted "the best", you'd have to pay that and get an A20, because no one was offering anything comparable. At some point, some bean counter decided that IBM didn't sell enough of the A20 and discontinued the line. Soon after, IBM lost laptop dominance and eventually spun the division off to Lenovo. Without making those magazine covers, the shine was off IBM.

Just like every junior high school kid needed Air Jordans to make sure he "made the team", every business exec wanted that IBM logo visible when he or she entered a business meeting. But for those buying individual components, it's going to be all about the numbers. So when a teenager goes shopping with mom for that new PC as little Johnny transitions to junior high school, that nVidia logo will draw attention because little Johnny read on the internet that "the nVidia 1080 Ti is the best". He wants to tell his friends that he has an nVidia card... he'll also want water cooling and lots of RGB, all to impress his friends, regardless of whether any of those choices give him less than AMD components and air coolers would. But again, while the uninformed consumer may be swayed by this mindset, I think anyone who is spending time reading TPU forums, and who has read TPU reviews, is making the choice "by the numbers".
#87
FordGT90Concept
"I go fast!1!11!1!"
John Naylor: 2. One of the reasons for the above is the marketplace. nVidia sold more than twice as many GTX 970s as all 20+ AMD 200- and 300-series cards combined. Back then, AMD had no answer to the 970 on up... today, AMD has no answer to the 1060 on up.
R9 290X is still, to this day, faster than GTX 970. R9 290X/R9 390/RX 580 are all faster than GTX 1060. Vega 56 is a great deal faster than GTX 1070, and Vega 64 is in GTX 1070 Ti and GTX 1080 territory.
John Naylor: 3. Let's look at the list of "best gaming monitors":

www.blurbusters.com/faq/120hz-monitors/

Of the 29 monitors ....

.... all 29 have some form of MBR technology
.... 9 are Freesync
.... 20 are G-Sync
You're only looking at the top list, which is panels with >144 Hz. The best panels on the market are in the "Other Brands of Blur Reduction" list, namely the FreeSync 2 Samsungs.
#88
ToxicTaZ
$500 extra sucks!

But that's the price you pay for performance... some people are happy with 60 Hz; others, like myself, prefer higher Hz and no screen tearing, judder, or input lag, plus better benchmarking!

I myself am waiting for ASUS's new 200 Hz monitor to come out, the ROG SWIFT PG35VQ!

www.asus.com/ca-en/Monitors/ROG-SWIFT-PG35VQ/

35" 21:9 "3K/HDR/200Hz" ….should be able to run all new games 100+FPS Ultra Mode with up coming next gen 1180/1180Ti

4K Ultra mode is still very hard to run at high FPS. You need 1080 Ti SLI just to get 60 to 90 fps in new games.
#89
Xzibit
ToxicTaZ: $500 extra sucks!

But that's the price you pay for performance... some people are happy with 60 Hz; others, like myself, prefer higher Hz and no screen tearing, judder, or input lag, plus better benchmarking!

I myself am waiting for ASUS's new 200 Hz monitor to come out, the ROG SWIFT PG35VQ!

www.asus.com/ca-en/Monitors/ROG-SWIFT-PG35VQ/

35" 21:9 "3K/HDR/200 Hz"... it should be able to run all new games at 100+ FPS in Ultra mode with the upcoming next-gen 1180/1180 Ti.

4K Ultra mode is still very hard to run at high FPS. You need 1080 Ti SLI just to get 60 to 90 fps in new games.
Why are you waiting for it if you already have it in your system specs?

It's another monitor that's taking its time to come out (over a year already), and it's an AUO AHVA screen as well. Expect 120 Hz, with OC to 200 Hz.
#90
hat
Enthusiast
I never looked until today... and damn, the cheapest G-Sync monitors seem to cost as much as a high-end graphics card! Still not sure what's wrong with regular V-Sync. Latency and all that... I never really experienced it myself. It seems I can also use "Fast" V-Sync with any monitor, which supposedly gives you the best of both worlds anyway... But yeah, G-Sync is way too expensive to be worth it.

I'm stuck using a 60 Hz 1920x1080 screen now... but years ago I had a 120 Hz monitor. It was nice until it died on me. No fancy G-Sync at the time, either...
#91
ToxicTaZ
Xzibit: Why are you waiting for it if you already have it in your system specs?

It's another monitor that's taking its time to come out (over a year already), and it's an AUO AHVA screen as well. Expect 120 Hz, with OC to 200 Hz.
They have had 240 Hz monitors for years, too, but only TN 1080p.

"UWQHD" 3K HDR 200 Hz is a new happy medium with VA technology. Better than 1080p and 1440p, but less than 4K.

I'll probably only reach 200 fps in older games like Diablo III, which maxes out at 165 fps and which I still play quite a bit.

I'm building a second rig right now... my older one becomes my wife's...

And yes, Nvidia G-Sync HDR is ridiculously expensive.
#92
Slizzo
hat: I never looked until today... and damn, the cheapest G-Sync monitors seem to cost as much as a high-end graphics card! Still not sure what's wrong with regular V-Sync. Latency and all that... I never really experienced it myself. It seems I can also use "Fast" V-Sync with any monitor, which supposedly gives you the best of both worlds anyway... But yeah, G-Sync is way too expensive to be worth it.

I'm stuck using a 60 Hz 1920x1080 screen now... but years ago I had a 120 Hz monitor. It was nice until it died on me. No fancy G-Sync at the time, either...
It's really hard to explain just how much of a difference G-Sync makes in games where you can't constantly hit the monitor's maximum refresh rate. It's night and day. Once you see it with your own eyeballs, you'll be a true believer.

I too am waiting for a 3440x1440 screen with a refresh rate higher than 120 Hz. 120 Hz ones are on the market right now and are pretty good, but I fear going under 144 Hz for regular desktop use will hurt.