Wednesday, September 24th 2014

NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties
NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on a technology standardized by the Video Electronics Standards Association (VESA) under the name Adaptive-Sync. The technology lets GPUs and monitors keep the display's refresh rate in sync with the GPU's frame rate, so the resulting output appears fluid. VESA's technology does not require special hardware inside standards-compliant monitors and is royalty-free, unlike NVIDIA G-SYNC, which relies on specialized hardware that display makers have to source from NVIDIA, making it a de facto royalty.
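To illustrate the idea in the abstract, here is a minimal, purely conceptual Python sketch, not any vendor's API, with made-up frame times, comparing when frames would appear on a fixed 60 Hz display (v-sync on) versus a display whose refresh follows the GPU's frame pacing:

```python
# Toy comparison of fixed-refresh (v-sync) vs. adaptive-refresh frame presentation.
# Purely illustrative: real G-SYNC / Adaptive-Sync logic lives in the driver and
# display controller, not in application code. Frame times below are invented,
# and the model assumes the next frame starts rendering without back-pressure.

FIXED_REFRESH_HZ = 60
FIXED_INTERVAL_MS = 1000.0 / FIXED_REFRESH_HZ  # ~16.7 ms between scanouts

frame_times_ms = [16.0, 22.0, 18.5, 30.0, 17.0, 25.0]  # a game missing 60 fps

def fixed_refresh_display_times(frame_times):
    """Each finished frame waits for the next fixed scanout tick."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft                                          # frame finishes rendering
        next_tick = (int(t // FIXED_INTERVAL_MS) + 1) * FIXED_INTERVAL_MS
        shown.append(next_tick)                          # displayed at next refresh
    return shown

def adaptive_refresh_display_times(frame_times):
    """The monitor refreshes as soon as each frame is ready."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft
        shown.append(t)                                  # displayed immediately
    return shown

if __name__ == "__main__":
    fixed = fixed_refresh_display_times(frame_times_ms)
    adaptive = adaptive_refresh_display_times(frame_times_ms)
    for i, (f, a) in enumerate(zip(fixed, adaptive)):
        print(f"frame {i}: fixed shows at {f:6.1f} ms, "
              f"adaptive shows at {a:6.1f} ms (saved {f - a:4.1f} ms)")
```

The point of the sketch is only the timing difference: with a fixed refresh, frames that miss a scanout are delayed to the next one (visible as judder), while an adaptive display presents each frame as it completes.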
When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, not DisplayPort 1.2a, a requirement of VESA's new technology. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow at catching up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?
Source:
Expreview
114 Comments on NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties
give me give me give me....for nothing...world doesn't work like that.
That is from AMD's news release (link below). The way it reads to me is that the new scaler chips needed to run Adaptive-Sync, which AMD said wouldn't be needed, won't be ready until near the end of the year, so that tells me mid-to-late Q1 before monitors will be out, i.e. March.
ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1969277
I hope the upper management at Nvidia opens their eyes and sees the light; I mean, I understand they spent millions on researching and developing G-SYNC, but why not just support what's basically an industry standard?
Well, I guess we all know why (profits), but still, not a good call, Nvidia. Not cool at all.
I'm just saying: should companies make you pay for every little feature or improvement that amounts to what might be considered the "natural evolution" of the PC gaming experience? I see the whole sync issue as just a shortcoming whose time to be resolved has come. Like cars finally getting disc brakes by the '70s: at first many optioned them or paid more to have them, but now we look back and see a standard that fixed an inadequacy.
Sure, if a company develops a technology they see it as a revenue stream, and some folks are willing to pay and buy into the hardware ecosystem for the "latest and greatest"; that's what some do. The folks who wait until it's less bleeding-edge have that choice, and once it finally evolves into the standard they can get it with their next panel purchase; that's how it works for most of the mainstream.
I'm not arguing G-Sync vs. FreeSync, but for the time being folks will need to choose a path/ecosystem. It's not an issue for NVIDIA to lock their cards to G-SYNC only... it's their prerogative. It's just that those who see a new monitor purchase in their immediate future can: A) consider whether there will be G-SYNC panels in their price range coming to market, and whether the card they presently have (at least a 7xx) supports it; or B) hold out for a monitor that includes the VESA standard in their price range, which then means they'll need an AMD card (basically the R9/R7 series will support the dynamic refresh for gaming).
Someone at NVIDIA is doing the job they're paid for.
Second, so far we can only speculate about AMD, so it's a waste of time here...
Does it work? Not really, because they gained some relief in the operation of the GPU. Cheat!
Not to speak of the salted prices of NVIDIA products, which in a quiet and sneaky way put roughly 30% more of the card's cost onto the buyer. And I have not seen this in 4K monitors, which only do 60 Hz max refresh.
With this deception NVIDIA has been obtaining higher scores than AMD for some time already. That we would also be billed for it, I did not expect. And as usual the middle class again pays as much as the high end, and not on 20 nm but still within 28 nm, plus the G-SYNC card. :( I miss the announced CPU element in the GPU; only optimization of the core is in the GM204. Bring on the next cheat, please, NVIDIA :shadedshu: and pay for it :wtf:
www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-plans-to-support-adaptive-sync-technology/
Now everyone can stop complaining.
As for spanking... All in all, I'd say that G-Sync has probably paid for itself in marketing. Reviews and user feedback have been positive, with the only downside being the added cost of ownership. It is also extremely short-sighted to think that this came as ANY surprise to NVIDIA. You do realise that NVIDIA's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors?
It constantly amazes me that the tech industry seems to be viewed by otherwise intelligent people as some kind of real-life Looney Tunes-Keystone Cops mashup.
Second... well, ordering an additional piece of silicon, as seen in the hardware implementation of G-SYNC, isn't just leftovers from the Tegra project. All Tegras are unique by their stepping numbers; for example, even the old Tegra 2 chip in the Optimus 2X is different from the Tegra 2 in the Motorola Atrix... their GPIO and address spaces are a completely different mess, i.e. different customized silicon, and someone spent time on that and ordered the G-SYNC one to be manufactured.
So far I cannot foresee how a few models of G-SYNC-enabled monitors can justify the cost of producing such hardware... we cannot even speak of mass batches like those I mentioned for the phones... it's a niche product...
The ASUS Swift monitor is $999.00 in Australia right now, and apparently as much as $150 of that, if not more, is for the G-SYNC device in the monitor. I might have to buy it anyway, but I think it's an unnecessary cost.
..... I'll walk myself out.....
lol