Wednesday, September 24th 2014

NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on a technology standardized by the Video Electronics Standards Association (VESA) under the name Adaptive-Sync. The technology lets GPUs and monitors keep display refresh rates in sync with GPU frame-rates, so the resulting output appears fluid. VESA's technology does not require special hardware inside standards-compliant monitors, and is royalty-free. NVIDIA G-SYNC, by contrast, is based on specialized hardware that display makers have to source from NVIDIA, which amounts to a royalty of sorts.
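To illustrate the concept, here is a minimal sketch of the general idea, not any vendor's actual implementation; the panel timings and per-frame render times below are made up for the example:

```python
# Why syncing refresh to the GPU matters: with a fixed 60 Hz refresh, a
# finished frame waits for the next refresh boundary, so uneven GPU frame
# times turn into visible judder; with adaptive sync, the display refreshes
# as soon as the frame is ready, within the panel's supported range.

FIXED_INTERVAL = 1000.0 / 60      # fixed 60 Hz refresh period, in ms
MIN_INTERVAL = 1000.0 / 144       # hypothetical panel's fastest refresh, in ms

render_times = [14.0, 21.0, 17.5, 25.0, 15.5]   # made-up per-frame GPU times, ms

def fixed_vsync(frames):
    """Each frame is shown at the next fixed refresh tick after it finishes."""
    t, shown = 0.0, []
    for ft in frames:
        t += ft                                  # frame finishes rendering
        ticks = -(-t // FIXED_INTERVAL)          # ceiling: next refresh boundary
        shown.append(ticks * FIXED_INTERVAL)
    return shown

def adaptive_sync(frames):
    """The display refreshes when the frame is ready (panel permitting)."""
    t, last, shown = 0.0, 0.0, []
    for ft in frames:
        t += ft
        last = max(t, last + MIN_INTERVAL)       # no faster than the panel allows
        shown.append(last)
    return shown

print("fixed 60 Hz:", [round(x, 1) for x in fixed_vsync(render_times)])
print("adaptive   :", [round(x, 1) for x in adaptive_sync(render_times)])
```

With the fixed refresh, every frame snaps to a 16.7 ms grid regardless of when it actually finished; with adaptive sync, display times track the GPU's real cadence.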

When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, and not DisplayPort 1.2a, a requirement of VESA's new technology. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow to catch up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?
Source: Expreview

114 Comments on NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

#51
GhostRyder
In all honesty, this is not as surprising as you would think. They do not want to immediately undercut their own technology and implementations with a free alternative; it just would not make sense.
Posted on Reply
#52
arbiter
astrix_auWhat do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they announced they were focusing on other areas; Mantle forced their hand on the issue.
MS stated they were working on DirectX 12 three years before AMD's Mantle.
RCoonJesus, gamers these days think they deserve to get everything for free.
Yeah, R&D costs money and Nvidia is not UNICEF.
astrix_auMe neither, and I guess we can thank Nvidia for the extra cost on the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable. If you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.
What is the difference between using RTSS and v-sync at that point? It's pretty much the same thing.
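For illustration, here is a minimal sketch of a frame-cap loop; this is not RTSS's actual code, and TARGET_FPS and render_frame are placeholders. A limiter paces frames on the CPU so they arrive at an even cadence, while v-sync instead blocks the buffer swap until the display's next refresh; the two feel similar at the cap but are enforced at different points:

```python
import time

TARGET_FPS = 120
FRAME_BUDGET = 1.0 / TARGET_FPS          # seconds per frame at the cap

def render_frame():
    """Stand-in for the real GPU work of one frame."""
    time.sleep(0.004)

def capped_loop(num_frames=10):
    """RTSS-style cap: sleep out the remainder of each frame's time budget."""
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                    # do the frame's work
        deadline += FRAME_BUDGET
        delay = deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)             # hold the frame until its slot
        # With v-sync, a buffer-swap call would block here instead, tying
        # presentation to the monitor's refresh rather than a CPU timer.
        print(f"frame presented at {time.perf_counter():.4f} s")

capped_loop()
```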
NaitoLastly, if their hardware does support 1.2a, it'll only require a firmware update.
Would think that G-Sync and Adaptive-Sync are pretty similar in how they signal the monitor, just that G-Sync requires the monitor to talk back. So it could be as simple as a software update in the driver. Nvidia probably won't enable it until monitors are there to support it. Right now, pushing G-Sync adoption means more sales; when 1.2a monitors are out, enable it so the end user has the option of both.
Solidstate89So is nVidia just not going to support newer versions of DP? 1.3 is already available and with 4K becoming more and more the norm they can't hope to just not update their DP standard in the hopes of not supporting VESA Adaptive Sync.
We don't know what the process is to certify the card for 1.3. On top of that, the 1.3 spec was only finalized the week of the 900 series release, so certification could take weeks or even months. It could even be like the 1.2a support: just a software fix to enable it when it does get certified. It's just something we won't know.
Posted on Reply
#53
HM_Actua1
Animalpakfirst of all, we must see if FreeSync works as well as or better than G-Sync; then we can talk...

For now, G-Sync is already a reality and is proven to work flawlessly.
DING DING DING
Posted on Reply
#54
HM_Actua1
GreiverBladeTHANKS RCoon!
Exactly. The entitled elite.

Give me, give me, give me... for nothing... the world doesn't work like that.
Posted on Reply
#55
arbiter
Animalpakfirst of all, we must see if FreeSync works as well as or better than G-Sync; then we can talk...

For now, G-Sync is already a reality and is proven to work flawlessly.
"Today, AMD announced collaborations with scaler vendors MStar, Novatek and Realtek to build scaler units ready with DisplayPort™ Adaptive-Sync and AMD's Project FreeSync by year end."

That is from AMD's news release (link below). How that reads to me is that NEW scaler chips are needed to run Adaptive-Sync, which AMD said wouldn't be needed. They won't be ready until near the end of the year, so that tells me it will be mid-to-late Q1, around March, before monitors are out.

ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1969277
Posted on Reply
#56
eidairaman1
The Exiled Airman
VESA are standards builders; you're better off with industry standards than with proprietary equipment.
Posted on Reply
#57
heydan83
Well, if the battle of G-Sync and Adaptive-Sync is going to continue, at least I hope they release some G-Sync/Adaptive-Sync monitors (yes, with both technologies) at a reasonable price; that way you will have the liberty to choose your GPU based on other stuff. Damn Nvidia, I hope they drop the G-Sync project ASAP so this entire technology can be standardized.
Posted on Reply
#58
Ferrum Master
arbiter"

That is off AMD news release(link below), how that reads to me is that NEW scaler chips that are needed to run adaptive-sync that amd said wouldn't need.
Wrong... Actually, they at least need chips that support the DP 1.2a/1.3 bus, and those need to be packed with any mainstream display.
Posted on Reply
#59
15th Warlock
As an owner of both AMD and Nvidia based systems, and a G-Sync monitor, all I can say is: if FreeSync is as good as G-Sync, Nvidia is doing owners of their cards a big disservice by not supporting DP 1.2a.

I hope the upper management at Nvidia opens their eyes and sees the light; I mean, I understand they spent millions on researching and developing G-Sync, but why not just support what's basically an industry standard?

Well, I guess we all know why (profits), but still, not a good call, Nvidia, not cool at all.
Posted on Reply
#60
Casecutter
RCoonPeople keep talking about Dx12 in a Freesync/Gsync thread.
I don't understand why.
In response to your saying "gamers want everything for free": the DX12 comment was more of an example... what would you think if MS said DX12 was going the way of a "yearly subscription" upgrade (like Office), not just tied to Win9?

I'm just saying: "should" companies have you pay for every little feature or improvement that bestows what might be considered the "natural evolution" of the PC gaming experience? I see the whole sync issue as just a shortcoming whose moment to be resolved has come. Like cars finally getting disc brakes by the '70s: sure, many optioned for them or paid more to have them at first, but now we look back and say it's a standard that fixed an inadequacy.

Sure, if a company develops a technology they see it as a revenue stream, and some folk are willing to pay and buy into a hardware ecosystem for the "latest and greatest"; that's what some do. The folk that wait until it's less bleeding-edge have a choice too: once it finally evolves into the standard, they get it with their next panel purchase. That's how it works for most of the mainstream.

I'm not arguing G-Sync vs. FreeSync, but for the time being folks will need to choose a path/ecosystem. It's no issue for Nvidia to lock their cards to G-Sync only... it's their prerogative. It's just that folks who see a new monitor purchase in their immediate future can either: A) consider whether G-Sync panels in their price range are coming to market, and whether the card they presently own (at least a 7xx) supports it; or B) hold out for a monitor with the VESA standard in their price range, which then means they'll need an AMD card (basically, the R9/R7 series will support dynamic refresh for gaming).
Posted on Reply
#61
Ferrum Master
Calm down... It's just trolling...

Someone at Nvidia is doing the job he is paid for.

Second, so far we can only speculate about AMD, so it's a waste of time here...
Posted on Reply
#62
bogami
Vertical sync is nothing new, but Nvidia has always avoided it; now they are forcing a new product that is expensive and present in few monitors. They are virtually forcing a new standard.
Does it even work as claimed? Not really, because what they gained is relief in the operation of the GPU. CHEAT!
Not to speak of the salted prices of Nvidia products, which in a quiet and sneaky way plant 30% more of the card's cost on the buyer. And I have not seen this in 4K monitors; those are 60 Hz max framerate only.
With this deception NVIDIA has obtained higher scores than AMD for some time already. That we would be billed for it, I did not expect. And as usual the middle class again pays as much as the highest, and not on 20 nm but still on 28 nm, plus the G-Sync card.:( I miss the announced CPU element in the GPU; only optimization of the core is in the GM204. Dread the next cheat, nVidia, :shadedshu:and pay for it:wtf:
Posted on Reply
#66
HumanSmoke
Casecutter"it will also eventually support the industry-standard Adaptive-Sync"... as who here thought it would never.
Well, given that both Nvidia and the OEMs/ODMs likely have a significant amount of time, effort, and inventory built up around G-Sync, it isn't overly surprising that they'd push the tech, especially when Adaptive-Sync isn't competition at the moment. Why Osborne yourself unnecessarily? Announcing to the world that you're introducing a cheaper alternative to the solution you're presently selling (and making a profit from) when actually under no pressure to do so smacks of business suicide.
Posted on Reply
#67
Ferrum Master
HumanSmokeWell, given that both Nvidia and the OEMs/ODMs likely have a significant amount of time, effort, and inventory built up around G-Sync, it isn't overly surprising that they'd push the tech, especially when Adaptive-Sync isn't competition at the moment. Why Osborne yourself unnecessarily? Announcing to the world that you're introducing a cheaper alternative to the solution you're presently selling (and making a profit from) when actually under no pressure to do so smacks of business suicide.
I think they actually got burned; the R&D cost to pull off such a stunt isn't pocket money. I think they actually underestimated AMD, not expecting that they would make a deal with VESA and all the major scaler makers. Someone will get spanked by Jen-Hsun either way...
Posted on Reply
#68
HumanSmoke
Ferrum MasterI think they actually got burned; the R&D cost to pull off such a stunt isn't pocket money. I think they actually underestimated AMD, not expecting that they would make a deal with VESA and all the major scaler makers. Someone will get spanked by Jen-Hsun either way...
Not sure about that. I'm guessing that G-Sync is just an extension of the research Nvidia did with Adaptive VSync, and I'm also guessing that Tegra chips aren't all that expensive (I'm also certain that Nvidia loved the chance to offload them and create another batch of "design wins"). I'm also guessing that the monitor OEMs are shouldering the greatest financial burden, since their inventory needs to move before the VESA-specced monitors gain traction.
As for spanking... All in all, I'd say that G-Sync has probably paid for itself in marketing. Reviews and user feedback have been positive, with the only downside being the added cost of ownership. It is also extremely short-sighted to think that this came as ANY surprise to Nvidia. You do realise that Nvidia's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors?

It constantly amazes me that the tech industry seems to be viewed by otherwise intelligent people as some kind of real life version of a hybrid Looney Tunes- Keystone Cops mashup.
Posted on Reply
#69
Ferrum Master
HumanSmokeNvidia's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors?
Well, it ain't an argument... And we are not disputing the loyalty of a certain person... that is way too nasty...

Second... well, ordering an additional critter of silicon, as seen in the hardware implementation of G-Sync, ain't no leftovers of the Tegra project. All Tegras are unique by their stepping numbers; for example, even the old Tegra 2 chip in the Optimus 2X is different from the Tegra 2 in the Motorola Atrix... their GPIO and address space are a completely different mess, i.e. differently customized silicon; someone spent time on that and ordered the G-Sync one manufactured.

So far I cannot see how a few models of G-Sync-enabled monitors can justify the costs of producing such hardware... we cannot even speak of mass batches like the ones I mentioned for those phones... it's a niche product...
Posted on Reply
#70
HumanSmoke
Ferrum MasterWell, it ain't an argument... And we are not disputing the loyalty of a certain person... that is way too nasty...
Don't flagellate yourself too much.
Ferrum MasterSecond... well, ordering an additional critter of silicon, as seen in the hardware implementation of G-Sync, ain't no leftovers of the Tegra project.
"Leftovers" ? No. The G-Sync Tegra is likely a semi-custom ASIC
Ferrum MasterAll Tegras are unique by their stepping numbers; for example, even the old Tegra 2 chip in the Optimus 2X is different from the Tegra 2 in the Motorola Atrix... their GPIO and address space are a completely different mess, i.e. differently customized silicon; someone spent time on that and ordered the G-Sync one manufactured.
Yes. That's what semi-custom ASIC means. Two points: 1. You don't know how much the silicon floor plan has been reworked, and 2. You don't know if the mask/fabbing cost outweighs the gain to the company. What I know is that the G-Sync Tegra chip is around the same size as an entry-level GPU (a GPU that, when attached to a PCB with voltage regulation, I/O, power plugs, and a heatsink, retails for around $30), and the add-in G-Sync board sells for ~$200. Obviously the company isn't losing out, or they wouldn't be making it (or they assume the expenditure is worth it to the company in other terms) and OEMs/ODMs wouldn't be using the module. Now, I know the forums are full of people who know more than the actual people who run these businesses, so while I list a few self-evident facts, you need not address them, even if you were able, since you obviously have more pressing business in hiring out your services as the pre-eminent business strategist of this era.
Ferrum MasterSo far I cannot see how a few models of G-Sync-enabled monitors can justify the costs of producing such hardware...
It's no different from the R&D AIBs expend on low-volume esoteric products. You think Asus's custom limited-edition Mars and Ares turn a direct monetary profit for the company?
Posted on Reply
#71
15th Warlock
HumanSmokeNot sure about that. I'm guessing that G-Sync is just an extension of the research Nvidia did with Adaptive VSync, and I'm also guessing that Tegra chips aren't all that expensive (I'm also certain that Nvidia loved the chance to offload them and create another batch of "design wins"). I'm also guessing that the monitor OEMs are shouldering the greatest financial burden, since their inventory needs to move before the VESA-specced monitors gain traction.
As for spanking... All in all, I'd say that G-Sync has probably paid for itself in marketing. Reviews and user feedback have been positive, with the only downside being the added cost of ownership. It is also extremely short-sighted to think that this came as ANY surprise to Nvidia. You do realise that Nvidia's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors?

It constantly amazes me that the tech industry seems to be viewed by otherwise intelligent people as some kind of real life version of a hybrid Looney Tunes- Keystone Cops mashup.
The Tegra theory would make sense, except for the fact that in its current iteration Nvidia is using an FPGA chip and programming it with proprietary software to enable G-Sync.
Posted on Reply
#72
xenocide
Factoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.
Posted on Reply
#73
astrix_au
GhostRyderIn all honesty, this is not as surprising as you would think. They do not want to immediately undercut their own technology and implementations with a free alternative; it just would not make sense.
Yeah, fair enough, that is their right, but someone with a great monitor who will miss out just because of this won't feel so great about it. They should look at other avenues when doing R&D; maybe the fact that they could make more money from hardware was too appealing. If that is the case, it sets a bad precedent where they will only put R&D where they think they can force people to pay extra for it, instead of just having it included like all the other technologies in CPUs and GPUs.
The ASUS Swift monitor is $999.00 in Australia right now, and apparently as much as $150 or more of that cost is for the G-Sync device in the monitor. I might have to buy it anyway, but I think it's an unnecessary cost.
Posted on Reply
#74
astrix_au
xenocideFactoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.
Why is that?
Posted on Reply
#75
Scrizz
wickedcricketI swear I wouldn't put ANYTHING in your mouth :)
so where would you put it?

..... I'll walk myself out.....
lol
Posted on Reply