Monday, June 25th 2018
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing
PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used in 1st-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model number is Arria 10 GX 480, a high-performance 20-nanometer chip that provides enough bandwidth and LVDS pins to process the data stream.
The FPGA sells in low quantities for $2,000 at Digikey and Mouser. Assuming that NVIDIA buys thousands, PCPer suggests that this chip alone adds $500 to the monitor's cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. Add licensing fees for G-SYNC, and it becomes clear why these monitors are so expensive.
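As a rough illustration of how that kind of bill of materials adds up, here is a back-of-the-envelope sketch. Only the $2,000 low-quantity FPGA price comes from the article; the volume discount, DDR4 cost, and licensing fee are assumed placeholder figures, not reported numbers.

```python
# Rough, illustrative estimate of the G-Sync HDR module's per-unit cost.
# Only the $2,000 low-quantity FPGA list price is from the article; the
# volume discount, memory cost, and licensing fee below are guesses.

fpga_list_price = 2000.0          # Arria 10 GX 480, single-unit price at Digikey/Mouser
assumed_volume_discount = 0.75    # assumption: NVIDIA pays ~25% of list at volume
assumed_ddr4_cost = 30.0          # assumption: 3 GB of DDR4 on the module
assumed_license = 50.0            # assumption: per-unit G-SYNC licensing fee

module_cost = (fpga_list_price * (1 - assumed_volume_discount)
               + assumed_ddr4_cost + assumed_license)
print(f"Estimated module cost: ${module_cost:.0f}")
# -> roughly $580, in the same ballpark as PCPer's ~$500 figure
```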
Sources:
PCPer Review, Altera Product Page
94 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing
www.blurbusters.com/persistence-vs-motion-blur/
It all adds up. I'm going for the lowest possible input lag.
High-end TVs, smartphones, tablets, PC monitors, etc. You see 120 Hz support on tons of devices these days. 60 Hz was never enough. Yes, it's "acceptable", but EVERYONE can see the difference between 60 fps/Hz and 120 fps/Hz when set up properly. (Yep, I have seen people buy 120-240 Hz monitors and use them at 60 Hz...) Even many stores that showcase high refresh rate monitors run them at 60 Hz. No wonder some people think it's all marketing.
It won't take many years before 120 Hz is the new 60 Hz. At least on high-end stuff.
This time, though, the problem is a little different: fast-refresh screens were expensive back then as well, but today, because of higher resolutions, the video cards that can push them are prohibitively expensive.
Edit: this is in reply to post #81
It's not a huge difference like 60 fps/60 Hz to 120/120, but it's definitely noticeable and provides even better smoothness.
In the CRT days, 100 Hz was the bare minimum for me. Just because 60 Hz isn't flickering on LCD/OLED doesn't mean it's smooth; it's just as terrible in terms of motion. They might have said that, but it's happening now.
Pretty much all 2018 high-end TVs support 120 Hz native.
The iPad Pro is 120 Hz native. Several Android phones have 90-120 Hz native.
More and more PC monitors and laptops have 120-240 Hz native.
Every single person can see the difference between 60 and 120 Hz. Most just don't know that 60 Hz is crap; it was chosen because of bandwidth and hardware limitations.
las's 120 Hz with ULMB is actually 60 Hz, but with a black frame inserted after each rendered frame. It's a trick that makes overdrive unnecessary, but it needs panels that can refresh fast.
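For anyone following the blurbusters link above, here is a minimal sketch of the persistence arithmetic behind sample-and-hold versus strobing/black frame insertion. The 1 ms lit time and the panning speed are assumed example values, not measured figures.

```python
# Minimal sketch: perceived motion blur scales with how long each frame
# stays lit (persistence). Strobing/BFI shortens that lit time.

def persistence_ms(refresh_hz, lit_time_ms=None):
    """Full frame time for sample-and-hold, or just the lit portion when
    strobing/black frame insertion is used."""
    frame_time_ms = 1000.0 / refresh_hz
    return lit_time_ms if lit_time_ms is not None else frame_time_ms

def motion_blur_px(persistence, speed_px_per_s=960):
    """Approximate blur trail length for an object panning across the screen."""
    return persistence / 1000.0 * speed_px_per_s

for label, p in [
    ("60 Hz sample-and-hold", persistence_ms(60)),
    ("120 Hz sample-and-hold", persistence_ms(120)),
    ("120 Hz strobed/BFI (1 ms lit)", persistence_ms(120, 1.0)),
]:
    print(f"{label}: {p:.1f} ms persistence, ~{motion_blur_px(p):.0f} px of blur")
# 60 Hz -> ~16 px of blur, 120 Hz -> ~8 px, strobed -> ~1 px
```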
NVIDIA's dominance started back in the PhysX era. AMD made the best graphics card, the HD 5870, which was better than the GTX 480, yet people still bought NVIDIA because of MINDSET, not performance. That's what happens when the NVIDIA logo is flying around in every game.
2. One of the reasons for the above is the marketplace. nVidia sold more than twice as many GTX 970s as all 20+ AMD 200- and 300-series cards combined. Back then, AMD had no answer to the 970 on up... today, AMD has no answer to the 1060 on up.
3. Let's look at the list of "best gaming monitors":
www.blurbusters.com/faq/120hz-monitors/
Of the 29 monitors ....
.... all 29 have some form of MBR technology
.... 9 are Freesync
.... 20 are G-Sync
Depending on budget (incl. peripherals) and resolution, we recommend:
2160p w/ $3k+ budget ... wait for the 11xx series and 144 Hz IPS HDR G-Sync monitors w/ ULMB
1440p w/ $2k+ budget ... GTX 1070 series and 144 Hz IPS G-Sync monitors w/ ULMB
1080p w/ ~$1.5k budget ... GTX 1060 w/ G-Sync or at least an MBR-capable (classic LightBoost blur reduction technology) monitor
1080p w/ <$1.5k budget ... RX 580 w/ FreeSync and/or an MBR-capable (classic LightBoost blur reduction technology) monitor

This reads like Faux News ... it doesn't agree with any published test numbers. I don't see any support for that position in TPU reviews, at least in the upper tiers.
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/31.html
"It seems the GTX 1060 is everything AMD wanted the RX 480 to be"
While that comment was for the reference model, it's the AIB cards that really matter to gaming enthusiasts. Let's give AMD an advantage by comparing the 1060 with the later-generation 580.
We see that the AIB 580 is 1.064 times as fast as the reference 1060, and that overclocking adds another 7.1%.
www.techpowerup.com/reviews/Gigabyte/AORUS_RX_580_XTR/33.html
So the card is 1.14 (1.064 x 1.071) times as fast as the reference 1060.
The AIB 1060 is 1.041 times as fast as the reference 1060, but it overclocks by 12.1%, making it 1.17 times as fast as the reference 1060.
www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/26.html
www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/29.html
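The arithmetic above can be checked quickly; here is the same chain of multipliers, using only the figures quoted from the TPU reviews:

```python
# Sanity check on the relative-performance multipliers quoted above.
ref_1060 = 1.0

aib_580_oc = ref_1060 * 1.064 * 1.071    # AIB RX 580, then +7.1% from OC
aib_1060_oc = ref_1060 * 1.041 * 1.121   # AIB GTX 1060, then +12.1% from OC

print(f"OC'd AIB RX 580 : {aib_580_oc:.2f}x the reference 1060")   # ~1.14x
print(f"OC'd AIB GTX 1060: {aib_1060_oc:.2f}x the reference 1060")  # ~1.17x
```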
The GTX 1060's power consumption / noise / OC temps are 138 W / 28 dBA / 63°C.
The RX 580's power consumption / noise / OC temps are 243 W / 34 dBA / 74°C.
Desired AIB 1060s (6 GB) are going for $289 to $329 on Newegg, while desired RX 580s are running about $10 cheaper. Performance-wise I'd call it almost a wash and worthy of a $10 premium. But the 580 uses 76% more power, requiring the expense of a PSU that is 100 watts larger, which easily erases the cost advantage. It's 50% louder and adds 11°C in GPU temps. And let's not forget that the 580 is a generation later than the 1060, which originally competed against the 480.
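A quick sanity check on those power deltas (the electricity rate and daily usage in the sketch are my own assumptions for illustration, not numbers from the reviews):

```python
# Power/thermal figures quoted from the TPU reviews above.
gtx_1060 = {"power_w": 138, "noise_dba": 28, "temp_c": 63}
rx_580 = {"power_w": 243, "noise_dba": 34, "temp_c": 74}

power_ratio = rx_580["power_w"] / gtx_1060["power_w"]
print(f"RX 580 draws {power_ratio - 1:.0%} more power")   # ~76% more

# Assumed usage for illustration: 3 hours/day of gaming at $0.12/kWh.
extra_kwh = (rx_580["power_w"] - gtx_1060["power_w"]) / 1000 * 3 * 365
print(f"~{extra_kwh:.0f} kWh/year extra, about ${extra_kwh * 0.12:.0f}/year in electricity")
```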
I'd love it if AMD could muster some competition for nVidia, but despite loads of "[insert AMD's new tech here] is going to change everything" ... it just hasn't happened. Pre-launch excitement fizzled post-launch as the 200, 300, 400, 500, Fury and Vega series failed to live up to the hype. We last saw competition at the high end (when both cards were overclocked) with the 780 vs 290X, and the 780 Ti rendered that battle irrelevant. But much more so than the top-tier battle, I'm worried that nVidia has pushed the crown down another tier with each generation. With the 9xx series, the domination dropped down two tiers to the x70... with the 10xx, it dropped to the xx60. Personally, most of the users we have built for are what I call "hardware whores": total lack of loyalty, jumping on whatever has the best numbers... overclocked.
So no... we PC enthusiasts are not buying anything because of a "mindset", at least at the high end. I will agree, however, that at the low end, "mindset" has value. The best example I can give you here is IBM laptops. IBM made the A20 series, which, back in the days when print media dominated, was awarded best laptop every year. It was very expensive and could easily run upwards of $3K, and if you wanted "the best", you'd have to pay that and get an A20 because no one was offering anything comparable. At some point, some bean counter decided that IBM didn't sell enough of the A20 and discontinued the line. Soon after, IBM lost laptop dominance and eventually spun the division off to Lenovo. Without those magazine covers, the shine was off IBM. Just like every junior high school kid needed 'Air Jordans' to make sure he "made the team", every business exec wanted that IBM logo visible when he or she entered a business meeting.

But for those buying individual components, it's going to be all about the numbers. So when a teenager goes shopping with mom for that new PC as little Johnny transitions to junior high school, the nVidia logo will draw attention because little Johnny read on the internet that "the nVidia 1080 Ti is the best". He wants to tell his friends that he has an nVidia card... he'll also want water cooling and lots of RGB, all so he can impress his friends, regardless of whether any of those choices give him less than AMD components or air coolers would. But again, while the uninformed consumer may be fooled by this mindset, I think anyone who is spending time reading TPU forums, and who has read TPU reviews, is making the choice "by the numbers".
But that's the price you pay for performance... some people are happy with 60 Hz; others, like myself, prefer higher refresh rates with no screen tearing, judder, or input lag, and better benchmarking!
I myself am waiting for ASUS's new 200 Hz monitor to come out: the "ROG SWIFT PG35VQ"!
www.asus.com/ca-en/Monitors/ROG-SWIFT-PG35VQ/
35" 21:9 "3K/HDR/200Hz" ….should be able to run all new games 100+FPS Ultra Mode with up coming next gen 1180/1180Ti
4K Ultra Mode is still very hard to run high FPS. Need 1080Ti SLI just to get 60fps to 90fps in new games.
Another monitor that's taking its time to come out (over a year already), and it's an AUO AHVA screen as well. Expect 120 Hz, with OC to 200 Hz.
I'm stuck using a 60 Hz 1920x1080 screen now... but years ago I had a 120 Hz monitor. It was nice until it died on me. No fancy G-Sync back then either...
"UWQHD" 3K HDR 200Hz is a new happy medium VA Technology. Better than 1080p and 1440p but less than 4K.
Probably only reach 200fps on older games like Diablo III witch Max's out @165fps and I still play quite a bit.
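For context on the resolution trade-off, a rough pixel-count comparison (pixel count alone; actual GPU load also depends on settings, engine, and HDR):

```python
# Pixel counts behind the "3K is a happy medium" point.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "UWQHD (3440x1440)": 3440 * 1440,
    "4K (3840x2160)":    3840 * 2160,
}

uwqhd = resolutions["UWQHD (3440x1440)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} MP, {px / uwqhd:.2f}x the UWQHD pixel load")
# 4K pushes ~67% more pixels than 3440x1440, so sustaining 100+ fps is
# much harder there.
```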
Building a second rig right now... my older one becomes my wife's...
And yes, NVIDIA G-Sync HDR is ridiculously expensive.
I too am waiting for a 3440x1440 screen with higher than 120 Hz refresh. 120 Hz ones are on the market right now and are pretty good, but I fear that going under 144 Hz for regular desktop use will hurt.