# NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties



## btarunr (Sep 25, 2014)

NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on a technology standardized by the Video Electronics Standards Association (VESA) as Adaptive-Sync. The technology lets GPUs and monitors keep display refresh rates in sync with GPU frame rates, so the resulting output appears fluid. VESA's technology requires no special hardware inside standards-compliant monitors and is royalty-free, unlike NVIDIA's G-SYNC, which relies on specialized hardware that display makers must source from NVIDIA, amounting to a royalty of sorts.
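To make the mechanism concrete, here is a toy sketch (illustrative Python, not how any driver or scaler actually implements it; the numbers are made up for illustration) of what synchronizing the refresh to the GPU buys you: on a fixed-rate panel a finished frame waits for the next refresh boundary, while an adaptive-sync panel refreshes the moment the frame is ready.

```python
import math

def fixed_refresh_display(frame_ready_ms, refresh_hz=60.0):
    # With a fixed-rate panel (v-sync on), a finished frame is held back
    # until the next refresh boundary, i.e. the next multiple of the
    # refresh period, so on-screen times quantize to ~16.7 ms steps.
    period = 1000.0 / refresh_hz
    return math.ceil(frame_ready_ms / period) * period

def adaptive_sync_display(frame_ready_ms):
    # With Adaptive-Sync / G-SYNC, the panel refreshes the moment the GPU
    # delivers the frame (within the panel's supported refresh range).
    return frame_ready_ms

# A frame that took 20 ms to render (50 fps) on a 60 Hz panel:
print(fixed_refresh_display(20.0))  # held until the next boundary (~33.3 ms)
print(adaptive_sync_display(20.0))  # shown at 20 ms
```

That quantization to refresh boundaries is what shows up as judder whenever the frame rate drifts below the panel's fixed rate, and it is the thing both technologies eliminate.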

When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to the specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, not DisplayPort 1.2*a*, a requirement of VESA's new technology. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow to catch up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?





*View at TechPowerUp Main Site*


----------



## v12dock (Sep 25, 2014)

Gotta love Nvidia


----------



## INSTG8R (Sep 25, 2014)




----------



## Cheeseball (Sep 25, 2014)

From the article, it sounds like NVIDIA is *actively* blocking FreeSync, when in fact the 900 series simply lacks DisplayPort 1.2a support, which FreeSync requires.


----------



## CookieMonsta (Sep 25, 2014)

Utterly stupid move by NVIDIA if it proves to be true. History has not been kind to proprietary technology; NVIDIA does not want its competitors uniting over a common standard, lest it become marginalized by the LCD manufacturers.


----------



## The Von Matrices (Sep 25, 2014)

I wouldn't jump to conclusions until there actually are monitors supporting adaptive refresh on the market.  It's completely normal for companies to avoid mentioning, or outright deny, upcoming features/products in order to avoid cannibalizing sales of current products.


----------



## HM_Actua1 (Sep 25, 2014)

G-Sync works,

G-Sync is awesome,

G-Sync is here now.


Nvidia is pushing technology forward with their ingenuity and innovations.

AMD needs to start making some leaps and strides if they expect to survive.

I wish AMD were pushing technology more; then there would be actual competition between red and green, pushing performance and technology at a faster rate instead of the snail-like pace of the past 6+ years.


----------



## ISI300 (Sep 25, 2014)

The fact that they support HDMI 2.0 (the first GPUs to support that standard, which is the latest HDMI revision), yet refuse to implement the latest DisplayPort standard, makes the whole thing stink. Mother****ers!


----------



## arbiter (Sep 25, 2014)

I read somewhere that in a closed-door meeting it was said the 900 series can do 1.2a; all it needs is a software update.


----------



## RejZoR (Sep 25, 2014)

We change graphics cards far more often than monitors, so being stuck with a single graphics card brand because your monitor supports only G-Sync is dumb. If you have a FreeSync-enabled monitor, you are free to choose whichever graphics card brand suits your price/performance needs best. But G-Sync monitor users are stuck with NVIDIA whether they like it or not (if they want HW adaptive sync).


----------



## RejZoR (Sep 25, 2014)

arbiter said:


> I read somewhere that in a closed-door meeting it was said the 900 series can do 1.2a; all it needs is a software update.



Sure, and all I need is a software update to convert my VGA port into DisplayPort... Things don't work that way with connectors and the standards associated with them.


----------



## jigar2speed (Sep 25, 2014)

Hitman_Actual said:


> G-Sync works,
> 
> G-Sync is awesome,
> 
> ...



I totally agree, AMD are light years behind Nvidia and should kiss Nvidia's feet for coming up with G-Sync. Nvidia has every right to charge everyone for this tech, and by the way, how dare AMD copy G-Sync and call it FreeSync; such dumb asses. (If I have to put a sarcasm tag here, I seriously feel bad for you.)


----------



## dansergiu (Sep 25, 2014)

This doesn't say that Nvidia decided not to support Adaptive Sync; it simply says that they are focusing on G-Sync. It's also unlikely that they will never support it, since it's part of the standard; most likely they will market G-Sync as the better solution, at least for a while.

Anyway, my point is that the title of the article is slightly misleading; to be honest, it looks more like clickbait from a tabloid than an article in a tech publication.


----------



## semantics (Sep 25, 2014)

dansergiu said:


> This doesn't say that Nvidia decided not to support Adaptive Sync; it simply says that they are focusing on G-Sync. It's also unlikely that they will never support it, since it's part of the standard; most likely they will market G-Sync as the better solution, at least for a while.
> 
> Anyway, my point is that the title of the article is slightly misleading; to be honest, it looks more like clickbait from a tabloid than an article in a tech publication.


G-Sync is currently the better product, given it's actually a buyable product; it can even do QHD at 144 Hz right now, and it works pretty reliably. These are all things we'll have to assess with FreeSync: they are not the same technology, they are two different approaches, and we have yet to see FreeSync in the erratic frame-rate environment of games. That said, it's perfect for watching video, which is unlikely to see the frame-rate drops and rises a game would.


----------



## Animalpak (Sep 25, 2014)

First of all we must see whether FreeSync works as well as or better than G-Sync; then we can talk...

For now, G-Sync is already a reality and is proven to work flawlessly.


----------



## arbiter (Sep 25, 2014)

RejZoR said:


> Sure, and all i need is a software update to convert my VGA port into DisplayPort... Things don't work that way with connectors and standards associated with them.



It could be just as simple as that. Why say you support Adaptive-Sync when there are zero monitors out and, according to the press release, there won't be until at least Q1 for some?
"Today, AMD announced collaborations with scaler vendors MStar, Novatek and Realtek to build scaler units ready with DisplayPort™ Adaptive-Sync and AMD's Project FreeSync by year end."
http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1969277

That is from AMD's press release. If they say end of year, it sounds like no monitors until near the end of Q1. As others said, there's no reason to say you support something that isn't out when you have a working product now, just to kill off your own sales, when we don't even know whether Adaptive-Sync will give any benefit to games or whether that is limited to AMD's proprietary code.




Animalpak said:


> First of all we must see whether FreeSync works as well as or better than G-Sync; then we can talk...
> 
> For now, G-Sync is already a reality and is proven to work flawlessly.



I agree with that.



semantics said:


> G-Sync is currently the better product, given it's actually a buyable product; it can even do QHD at 144 Hz right now, and it works pretty reliably. These are all things we'll have to assess with FreeSync: they are not the same technology, they are two different approaches, and we have yet to see FreeSync in the erratic frame-rate environment of games. That said, it's perfect for watching video, which is unlikely to see the frame-rate drops and rises a game would.



They are two different approaches; we know well that G-Sync works, but we have yet to see how AMD's works. I've said this before: I don't take AMD's claims at face value. Prove it works like you say, and then I'll give them the credit they're due.


----------



## john_ (Sep 25, 2014)

When I posted about this one week ago, I was either ignored or treated as someone talking conspiracy theories from a very negative perspective. Whether we like or hate Nvidia, we know that they stick with their proprietary standards, especially when they have the upper hand in the market. There is nothing strange about them implementing 1.2 and not 1.2a; it's what they have been doing for years. OpenCL is another example, with even the 900 series cards supporting only OpenCL 1.1, if I am not mistaken, while AMD has supported 1.2 for years. OpenCL 1.2 is 3 years old.

This is not negative posting, or bashing, or conspiracy theories, and there is nothing strange here. It is business as usual for Nvidia.


----------



## the54thvoid (Sep 25, 2014)

It's very simple. Nvidia have worked on G-Sync; whether you want it or not, they have a hardware implementation that addresses frame-rate delivery from GPU to monitor.
If you want it, you buy Nvidia's product. If you don't want it, you buy AMD's product. Nvidia have pushed a business model to increase profit, to please shareholders.
Nvidia is a business, not a charity; it has zero obligation to work along 'free' business models. It has arguably spent a lot on R&D and builds a very capable vanilla card.
If you do not like what they do, you have no need to buy their products. Buy AMD instead.
By all review accounts, on the whole, G-Sync works like a dream. Why, as a private company, would they support a free or cheaper version?

If people start seeing Nvidia and AMD as businesses and not charities, a lot of arguments and misplaced anger venting could be avoided.


----------



## astrix_au (Sep 25, 2014)

Hitman_Actual said:


> G-Sync works,
> 
> G-Sync is awesome,
> 
> ...



What do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they had announced they were focusing on other areas; Mantle forced their hand on the issue.

Nvidia is forcing people to buy monitors with G-Sync; if they implemented DP 1.2a, sales of G-Sync monitors would collapse once FreeSync became available. I'm not loyal or a fanboy. I was going to get 2x 780 Tis, but Mantle's 290X CrossFire performance in BF4 won me over: the 780 Tis would sit at 75% each, wasting GPU performance. When DX12 comes out and they can run at 100% all the time, I might switch over, but moves like this make you think twice.

AMD has planned to make Mantle open, so we will wait and see whether that happens when it gets released at the end of this year.


----------



## astrix_au (Sep 25, 2014)

semantics said:


> G-Sync is currently the better product, given it's actually a buyable product; it can even do QHD at 144 Hz right now, and it works pretty reliably. These are all things we'll have to assess with FreeSync: they are not the same technology, they are two different approaches, and we have yet to see FreeSync in the erratic frame-rate environment of games. That said, it's perfect for watching video, which is unlikely to see the frame-rate drops and rises a game would.



You'd better hope it's not better than G-Sync, since if you have that card you're shit out of luck. Being on AMD, I don't need to care about spending an extra $150 on something you probably won't even notice; I'd rather have it for free. I don't see the jittering on my 120 Hz display that they show in those demos... lol, as they say, a sucker is born every day. Those tests are probably worst-case scenarios, if not purposely exaggerated.


----------



## RCoon (Sep 25, 2014)

The real crime here is that I don't care about FreeSync or G-Sync, and have no intention of buying into either.

Jesus, gamers these days think they deserve to get everything for free.


----------



## astrix_au (Sep 25, 2014)

RCoon said:


> The real crime here is that I don't care about Freesync or Gsync, and have no intention on buying into either.
> 
> Jesus, gamers these days think they deserve to get everything for free.


Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable, and if you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.


----------



## RCoon (Sep 25, 2014)

astrix_au said:


> Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable, and if you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.



Maybe we're bad examples, we have decent systems and don't see sub 45FPS instances. I imagine GSync and Freesync are more important for people with low end systems, or midrange systems on 4K ridiculousness. I would assume that's where the sync magic comes in handy for the low FPS ranges and dips. Either way, I don't know why people with high end systems care, they wouldn't see much improvement with gsync or freesync anyway.


----------



## the54thvoid (Sep 25, 2014)

astrix_au said:


> Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable, and if you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.



The problem is you need to experience the sensation of G-Sync; you cannot gauge it through a video or YouTube. The reviews are exceptionally positive about it, for the most part.
I too don't care about either product, but if FreeSync works as well as Mantle, Nvidia might adapt their business model to compete. It might just remain peripheral technology though, much like Mantle.
FTR, one of my BF4 mates went Mantle with a 290 and truly appreciated the smoothness, though he did come from a 2 GB GTX 680. For my part, I only game on one 780 Ti, but our perf.render display outputs were identical. But I do use a decent CPU, so Mantle is of limited use for my GPU needs.


----------



## wickedcricket (Sep 25, 2014)

astrix_au said:


> What do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they had announced they were focusing on other areas; Mantle forced their hand on the issue.
> 
> Nvidia is forcing people to buy monitors with G-Sync; if they implemented DP 1.2a, sales of G-Sync monitors would collapse once FreeSync became available. I'm not loyal or a fanboy. I was going to get 2x 780 Tis, but Mantle's 290X CrossFire performance in BF4 won me over: the 780 Tis would sit at 75% each, wasting GPU performance. When DX12 comes out and they can run at 100% all the time, I might switch over, but moves like this make you think twice.
> 
> AMD has planned to make Mantle open, so we will wait and see whether that happens when it gets released at the end of this year.



Well, following your line of thought here: isn't AMD "forcing" you to buy new Microsoft software, i.e. Windows 9?

I agree with above "foreworders":



> Nvidia is pushing technology forward with their ingenuity and innovations.





> For now, G-Sync is already a reality and is proven to work flawlessly.



Exactly; people need a solution right here, right now. I am having a blast using it; it is incredible and fun! As a consumer I am not going to wait who-knows-how-long for something that hasn't even been implemented/tested yet.

I also read that it's using 1.2a and all it needs is a software update, so we are talking about a simple software solution to be able to use "freesync" on Nvidia cards. Why all this doom, gloom, and ranting?



> Nvidia has every right to charge everyone for this tech



Umm, yes they do if there are people willing to pay for it, duuuh???


----------



## astrix_au (Sep 25, 2014)

the54thvoid said:


> FTR one of my BF4 mates went mantle with a 290 and truly appreciated the smoothness, though he did come from a 2 GB gtx680. On my part, I only game on one 780ti but our perf.render display outputs were identical. But, I do use a decent CPU so mantle is limited in purpose for my GPU needs.



Yeah, Mantle runs awesome on the 290X; other cards are a different story, especially older GCN cards. I almost bought 2 GTX 780 Tis, but I liked the idea of a low-level API. I'm thinking of possibly going to Nvidia one day, maybe 2x 980 Tis; they seem well priced.


----------



## Relayer (Sep 25, 2014)

From the article:

> Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive Sync?



They need to give you some reason besides 10% faster next gen cards to upgrade.


----------



## astrix_au (Sep 25, 2014)

wickedcricket said:


> Well following your line of though here. Isn't AMD "forcing" you to buy new Microsoft Software i.e. Windows 9 ?


I don't get that analogy.

You multi-quoted with me in the first quote but failed to show who you quoted in the other quotations you replied to; try including the poster with the quote to avoid confusion, or else you're putting words in my mouth lol.


----------



## Animalpak (Sep 25, 2014)

RCoon said:


> The real crime here is that I don't care about Freesync or Gsync, and have no intention on buying into either.
> 
> Jesus, gamers these days think they deserve to get everything for free.





astrix_au said:


> Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable, and if you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.



For both of you.

You limit your gaming experience (it's really sad) and ignore all the effort and technology that developers put into making games look better, to give us something awesome to play and to look at.

All this graphics card advancement, and you don't care about G-Sync or DirectX and many other innovations?

So you guys can play Counter-Strike for another decade?


----------



## Vlada011 (Sep 25, 2014)

It's still too early for conclusions.
We must compare FreeSync and G-Sync in real life.
That's what is most important: checking which works better during gaming.


----------



## RCoon (Sep 25, 2014)

Animalpak said:


> For both of you.
> 
> You limit your gaming experience (it's really sad) and ignore all the effort and technology that developers put into making games look better, to give us something awesome to play and to look at.
> 
> ...



No, I don't really care that much about G-Sync; it costs money, and I enjoy myself while playing games currently (yes, I do play CS:GO competitive actually!). I have no intention whatsoever of paying £350 for a monitor with G-Sync when I have a perfectly good 1440p monitor, and I'm not "struggling" to play games as is.
As for FreeSync, I've not personally seen it working with my own two eyes, so it doesn't bother me either. For all you know it might not work that well; you haven't seen it in action.

In every case, it involves buying a new GPU and a new monitor. I play indie games for christ's sake, and a very select few "AAA" games. It makes no difference to my ability to play games enjoyably.

DirectX has absolutely nothing to do with this, so I'm not sure why you mentioned it.


----------



## wickedcricket (Sep 25, 2014)

astrix_au said:


> I don't get that analogy.
> 
> You multi-quoted with me in the first quote but failed to show who you quoted in the other quotations you replied to; try including the poster with the quote to avoid confusion, or else you're putting words in my mouth lol.




Well, what part of what you wrote do you not understand:



> Nvidia is forcing people to buy monitors with Gsync



I am sorry, everyone, that I didn't put the names of those I quoted; I hope you didn't feel offended. They are: Hitman_Actual, jigar2speed, Animalpak.

I swear I wouldn't put ANYTHING in your mouth


----------



## GreiverBlade (Sep 25, 2014)

RCoon said:


> The real crime here is that I don't care about Freesync or Gsync, and have no intention on buying into either.
> 
> Jesus, gamers these days think they deserve to get everything for free.


THANKS RCoon!


----------



## john_ (Sep 25, 2014)

Animalpak said:


> You limit your gaming experience (it's really sad)



What is sad is that, year after year, more and more people believe that a better gaming experience doesn't mean "better gameplay" but "better graphics".


----------



## ne6togadno (Sep 25, 2014)

john_ said:


> What is sad is that, year after year, more and more believe that better gaming experience doesn't mean "better gameplay" but "better graphics".


sad but true


----------



## Naito (Sep 25, 2014)

ISI300 said:


> The fact that they support HDMI 2.0 (the first gpu to support that standard which is the latest HDMI revision), yet refuse to implement the latest displayport standard, makes the whole thing stink. Mother****ers!



Or it could be that the DisplayPort 1.3 spec was finalized later than HDMI 2.0, and/or it would have taken too long to get the product certified. Another possible explanation is that Nvidia saw it as pointless to implement a 'bridge' specification like 1.2a and chose to wait for 1.3 (which of course goes back to the first sentence). Lastly, if their hardware does support 1.2a, it'll only require a firmware update.


----------



## astrix_au (Sep 25, 2014)

RCoon said:


> Maybe we're bad examples, we have decent systems and don't see sub 45FPS instances. I imagine GSync and Freesync are more important for people with low end systems, or midrange systems on 4K ridiculousness. I would assume that's where the sync magic comes in handy for the low FPS ranges and dips. Either way, I don't know why people with high end systems care, they wouldn't see much improvement with gsync or freesync anyway.



I think you nailed it right there!


----------



## TheoneandonlyMrK (Sep 25, 2014)

astrix_au said:


> I think you nailed it right there!


And it's those lower-end gamers that are never going to be buying G-Sync monitors, so I can't see them doing all that well.
Well, this kills the "will Nvidia support FreeSync" thread.

It is a bit odd though, because I thought supporting all VESA standards was a good idea. Ah well.


----------



## Breit (Sep 25, 2014)

NVidia does have HDMI 2.0 on their 9xx cards, but for unknown reasons they lack HDCP 2.2 support, which is needed for 4K Blu-ray playback...

So much for being quick to catch up with new standards.


----------



## Sasqui (Sep 25, 2014)

Who's surprised?  This is similar to the SLI story years back: SLI would only work on motherboards with an NV chipset, and they sued several entities for making workarounds.

Their corporate creed is to monetize proprietary technology, and to keep it that way as long as possible.

PCs at my house will never see an NV card, so long as I have a choice.


----------



## Solidstate89 (Sep 25, 2014)

So is nVidia just not going to support newer versions of DP? 1.3 is already available, and with 4K becoming more and more the norm, they can't hope to simply never update their DP support in the hope of avoiding VESA Adaptive Sync.

They're going to have to support it eventually, whether they like it or not.


----------



## phanbuey (Sep 25, 2014)

Honestly, I would be more surprised if they DID support it.  This is par for the course for them; with SLI, PhysX, 3D Vision, and CUDA, Nvidia is always pushing proprietary tech over open solutions.  That being said, I don't really care and will still buy G-Sync, as it is here, ready to go, and will probably work better in the long run anyway.


----------



## the54thvoid (Sep 25, 2014)

Sasqui said:


> Who's surprised?  This is similar to the SLI story years back: SLI would only work on motherboards with an NV chipset, and they sued several entities for making workarounds.
> 
> Their corporate creed is to monetize proprietary technology, and to keep it that way as long as possible.
> 
> PCs at my house will never see an NV card, so long as I have a choice.



And this is how people with strong views on companies should react.  Don't buy it.

Hey Sasqui, given those views, I certainly hope you don't have any Mac or Apple products.  They invented proprietary!


----------



## astrix_au (Sep 25, 2014)

phanbuey said:


> Honestly, I would be more surprised if they DID support it.  This is par for the course for them.  I mean - SLI, PhysX, 3D Vision, CUDA, nvidia is always pushing proprietary tech over the Open Source solutions.  That being said, I don't really care and will still buy G-Sync as it is here, ready to go, and will probably work better in the long run anyways.


It's just one more thing that can go wrong, or the drivers for it; at least you can turn it off, right? I wouldn't bother with it unless you've tested it and think it's worth the extra price.


----------



## ensabrenoir (Sep 25, 2014)

........Nvidia is not going to support an AMD-led standard that would help its competition.....  those monsters.......

*Business | Define Business at Dictionary.com* (dictionary.reference.com/browse/*business*): "the purchase and sale of goods in an attempt to make a profit."

I would close by saying something about AMD's promises vs. Nvidia's cost for actually delivering, but..... well, we know...

Instead I shall close with this: proprietary standards always fail in the face of cheaper, simpler solutions....
unless massive support makes the proprietary standard the norm...... Inferior products sold cheaply don't equal success either......in short, we decide; we vote with our wallets.


----------



## Breit (Sep 25, 2014)

nVidia could've done what AMD did in the first place regarding FreeSync/G-Sync, but they opted for proprietary, like always...
In the end I'm not really sure it was worth the trouble for nVidia. I mean, do they really make heaps of money from G-Sync? I doubt it... They still use FPGAs on these G-Sync boards, and those aren't exactly cheap. I'd guess they aren't in the profit zone with this right now.


----------



## Sasqui (Sep 25, 2014)

the54thvoid said:


> And this is how people with strong views on companies should react.  Don't buy it.
> 
> Hey Sasqui, given those views, I certainly hope you don't have any Mac or Apple products.  They invented proprietary!



How'd you know?    As for Apple, I mostly can't stand iTunes, not to mention the sheep that follow Apple.  That said, I do think they have some great products (that I would never own).  I digress...


----------



## Fourstaff (Sep 25, 2014)

I don't see either being mass adopted any time soon. Non-issue for most, for the select few AMD provide what they want. Everyone is free to vote with their wallet if Nvidia is providing the inferior good.


----------



## wickedcricket (Sep 25, 2014)

ensabrenoir said:


> ........Nvidia is not going to support an AMD-led standard that would help its competition.....  those monsters.......
> 
> *Business | Define Business at Dictionary.com*
> 
> ...



What ensabrenoir said.

I don't get how people still fail to understand that simple rule.


----------



## Casecutter (Sep 25, 2014)

RCoon said:


> Jesus, gamers these days think they deserve to get everything for free.


Err... next thing you know, DX12 will be available… as a yearly subscription download!


----------



## RCoon (Sep 25, 2014)

Casecutter said:


> Err... Dx12 will be available… as yearly subscription download!



People keep talking about Dx12 in a Freesync/Gsync thread.
I don't understand why.


----------



## GhostRyder (Sep 25, 2014)

In all honesty, this is not as surprising as you might think.  They do not want to immediately undercut their own technology and implementation with a free alternative; it just would not make sense.


----------



## arbiter (Sep 25, 2014)

astrix_au said:


> What do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they had announced they were focusing on other areas; Mantle forced their hand on the issue.



MS stated they were working on DirectX 12 three years before AMD's Mantle.



RCoon said:


> Jesus, gamers these days think they deserve to get everything for free.



Yeah, R&D costs money, and Nvidia is not UNICEF.



astrix_au said:


> Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable, and if you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.



What is the difference between using RTSS vs. v-sync at that point? It's pretty much the same thing.
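For what it's worth, the two can be contrasted in a toy model (illustrative Python, not how RTSS or any driver actually works): a frame limiter only enforces a minimum interval between presented frames, while v-sync snaps each present to a refresh boundary.

```python
import math

REFRESH_HZ = 120.0
PERIOD_MS = 1000.0 / REFRESH_HZ  # ~8.33 ms per refresh on a 120 Hz panel

def vsync_present(ready_ms):
    # V-sync: the buffer swap waits for the next refresh boundary.
    # No tearing, but a just-missed boundary costs up to a full period.
    return math.ceil(ready_ms / PERIOD_MS) * PERIOD_MS

def capped_present(ready_ms, prev_present_ms, cap_fps=120.0):
    # RTSS-style limiter: only enforces a minimum interval between
    # presents; the swap itself is immediate, so frames are evenly
    # paced but can still land mid-scanout and tear.
    return max(ready_ms, prev_present_ms + 1000.0 / cap_fps)

print(vsync_present(10.0))        # pushed to the next refresh boundary (~16.7 ms)
print(capped_present(10.0, 0.0))  # shown immediately at 10 ms
```

So a cap gives similar pacing when frame rates are steady, but only v-sync (or an adaptive refresh) ties the frame to the panel's scanout.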




Naito said:


> Lastly, if their hardware does support 1.2a, it'll only require a firmware update.



I would think that G-Sync and Adaptive-Sync are pretty similar in how they signal the monitor, except that G-Sync requires the monitor to talk back. So it could be as simple as a software update in the driver. Nvidia probably won't enable it until monitors are there to support it; right now, getting G-Sync adopted means more sales, and when 1.2a monitors are out they can enable it to give end users the option of both.



Solidstate89 said:


> So is nVidia just not going to support newer versions of DP? 1.3 is already available and with 4K becoming more and more the norm they can't hope to just not update their DP standard in the hopes of not supporting VESA Adaptive Sync.



We don't know what the process is to certify the card for 1.3. On top of that, the 1.3 spec was only finalized the week of the 900 series release, so the certification process could take weeks or even months. It could even be like the 1.2a support: just a software fix to enable it once it does get certified. It's just something we won't know.


----------



## HM_Actua1 (Sep 25, 2014)

Animalpak said:


> First of all we must see whether FreeSync works as well as or better than G-Sync; then we can talk...
> 
> For now, G-Sync is already a reality and is proven to work flawlessly.



DING DING DING


----------



## HM_Actua1 (Sep 25, 2014)

GreiverBlade said:


> THANKS RCoon!


Exactly. The entitled elite.

Gimme, gimme, gimme... for nothing... the world doesn't work like that.


----------



## arbiter (Sep 25, 2014)

Animalpak said:


> First of all we must see whether FreeSync works as well as or better than G-Sync; then we can talk...
> 
> For now, G-Sync is already a reality and is proven to work flawlessly.



"Today, AMD announced collaborations with scaler vendors MStar, Novatek and Realtek to build scaler units ready with DisplayPort™ Adaptive-Sync and AMD's Project FreeSync by year end."

That is from AMD's news release (link below). How that reads to me: NEW scaler chips are needed to run Adaptive-Sync, which AMD said wouldn't be needed, and they won't be ready until near the end of the year. That tells me mid-to-late Q1, i.e. March, before monitors are out.

http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1969277


----------



## eidairaman1 (Sep 25, 2014)

VESA are standards builders; you're better off with industry standards than proprietary equipment.


----------



## heydan83 (Sep 25, 2014)

Well, if the battle between G-Sync and Adaptive Sync is going to continue, at least I hope they release some G-Sync/Adaptive Sync monitors (yes, with both technologies) at a reasonable price. That way you'd have the liberty to choose your GPU based on other things. Damn Nvidia; I hope they drop the G-Sync project ASAP so this entire technology can be standardized.


----------



## Ferrum Master (Sep 25, 2014)

arbiter said:


> That is from AMD's news release (link below). How that reads to me is that NEW scaler chips are needed to run Adaptive-Sync, chips that AMD said wouldn't be needed.



Wrong... Actually, they at least need chips that support the DP 1.2a/1.3 bus, ready to be packed with any mainstream display.


----------



## 15th Warlock (Sep 25, 2014)

As an owner of both AMD- and Nvidia-based systems, and a G-Sync monitor, all I can say is: if FreeSync is as good as G-Sync, Nvidia is doing owners of their cards a big disservice by not supporting DP 1.2a.

I hope the upper management at Nvidia open their eyes and see the light; I mean, I understand they spent millions on researching and developing g-sync, but why not just support what's basically an industry standard?

Well, I guess we all know why, (profits) but still, not a good call Nvidia, not cool at all


----------



## Casecutter (Sep 25, 2014)

RCoon said:


> People keep talking about Dx12 in a Freesync/Gsync thread.
> I don't understand why.


In response to your saying "Gamers want everything for free": the DX12 comment was more of an example... what would you think if MS said DX12 was going the way of a yearly-subscription upgrade (like Office), not just tied to Win9?

I'm just saying: should companies have you pay for every little feature or improvement that amounts to the natural evolution of the PC gaming experience? I see the whole sync issue as just a shortcoming whose moment to be resolved has come, like cars finally getting disc brakes by the '70s. Sure, at first many optioned for them or paid more to have them, but now we look back and call them a standard that fixed an inadequacy.

Sure, if a company develops a technology they see it as a revenue stream, and some folks are willing to pay and buy into a hardware ecosystem for the "latest and greatest"; that's what some do. The folks who wait until it's less bleeding-edge have that choice too: once it finally evolves into the standard, they get it with the next panel purchase. That's how it works for most of the mainstream.

I'm not arguing G-Sync vs. FreeSync, but for the time being folks will need to choose a path/ecosystem. It's not an issue for Nvidia to lock their cards to G-Sync only; it's their prerogative. It's just that folks who see a new monitor purchase in their immediate future can: A) consider whether there will be G-Sync panels in their price range coming to market, and whether the card they presently have (at least 7xx) supports it; or B) hold out for a monitor that includes the VESA standard in their price range, which then means they'll need an AMD card (basically, R9/R7 will support dynamic refresh for gaming).


----------



## Ferrum Master (Sep 25, 2014)

Calm down... It's just trolling...

Someone at Nvidia is doing the job he is paid for.  

Second, so far we can only speculate about AMD, so it's a waste of time here...


----------



## bogami (Sep 25, 2014)

Vertical sync is nothing new, but Nvidia has always avoided it. Now they are pushing a new product that is expensive and available in few monitors, virtually forcing a new standard.
Does it work? Not really the point, because what they gained is relief in GPU load. CHEAT!
Not to speak of the salted prices of Nvidia products, which quietly and sneakily plant an extra 30% of the card's cost on the buyer. And I have not seen this in 4K monitors, which max out at 60 Hz anyway.
With this deception Nvidia has obtained higher scores than AMD for some time already. That we would be billed for it, I did not expect. And as usual the middle class pays as much as the highest tier, and it's not even 20 nm but still 28 nm, plus the G-Sync card. I miss the announced CPU element in the GPU; GM204 is only an optimization of the core. Prepare the next cheat please, Nvidia, and pay for it.


----------



## RCoon (Sep 25, 2014)

NVidia WILL support adaptive sync on display port

http://www.kitguru.net/components/g...ia-plans-to-support-adaptive-sync-technology/

Now everyone can stop complaining.


----------



## Ferrum Master (Sep 25, 2014)

RCoon said:


> Now everyone can stop complaining.


----------



## Casecutter (Sep 25, 2014)

RCoon said:


> NVidia WILL support adaptive sync on display port
> 
> http://www.kitguru.net/components/g...ia-plans-to-support-adaptive-sync-technology/
> 
> Now everyone can stop complaining.


"it will also *eventually* support the industry-standard Adaptive-Sync"... as if anyone here thought it never would.


----------



## HumanSmoke (Sep 25, 2014)

Casecutter said:


> "it will also *eventually* support the industry-standard Adaptive-Sync"... as if anyone here thought it never would.


Well, given that both Nvidia and the OEM/ODM's have likely a significant amount of time, effort, and inventory built up around G-Sync it isn't overly surprising that they'd push the tech- especially when Adaptive-Sync isn't competition at the moment. Why Osborne yourself unnecessarily? Announcing to the world that you're introducing a cheaper alternative to the solution you're presently selling (and making profit from) when actually under no pressure to do so smacks of business suicide.


----------



## Ferrum Master (Sep 25, 2014)

HumanSmoke said:


> Well, given that both Nvidia and the OEM/ODM's have likely a significant amount of time, effort, and inventory built up around G-Sync it isn't overly surprising that they'd push the tech- especially when Adaptive-Sync isn't competition at the moment. Why Osborne yourself unnecessarily? Announcing to the world that you're introducing a cheaper alternative to the solution you're presently selling (and making profit from) when actually under no pressure to do so smacks of business suicide.



I think they actually got burned; the R&D cost to pull off such a stunt isn't pocket money. I think they actually underestimated AMD, and that AMD would make a deal with VESA and all the major scaler makers. Someone will get spanked by Jen-Hsun either way...


----------



## HumanSmoke (Sep 25, 2014)

Ferrum Master said:


> I think they actually got burned; the R&D cost to pull off such a stunt isn't pocket money. I think they actually underestimated AMD, and that AMD would make a deal with VESA and all the major scaler makers. Someone will get spanked by Jen-Hsun either way...


Not sure about that. I'm guessing that G-Sync is just an extension of the research Nvidia did with Adaptive VSync, and I'm also guessing that Tegra chips aren't all that expensive (I'm also certain that Nvidia loved the chance to offload them and create another batch of "design wins"). I'm also guessing that the monitor OEMs are shouldering the greatest financial burden since their inventory needs to move before the VESA specced monitors gain traction.
As for spanking.... All in all, I'd say that G-Sync has probably paid for itself in marketing. Reviews and user feedback have been positive, with the only downside being the added cost of ownership. It is also extremely short-sighted to think that this came as ANY surprise to Nvidia - you do realise that Nvidia's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors?

It constantly amazes me that the tech industry seems to be viewed by otherwise intelligent people as some kind of real life version of a hybrid Looney Tunes- Keystone Cops mashup.


----------



## Ferrum Master (Sep 25, 2014)

HumanSmoke said:


> Nvidia's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors ?



Well, it ain't an argument... And we are not disputing the loyalty of a certain person... that would be way too nasty...

Second: ordering an additional piece of silicon, as seen in the hardware implementations of G-Sync, is no leftover from the Tegra project. All Tegras are unique by their stepping numbers; even the old Tegra 2 chip in the Optimus 2X is different from the Tegra 2 in the Motorola Atrix. Their GPIO and address spaces are a completely different mess, i.e. differently customized silicon, and someone spent time on that and ordered the G-Sync one manufactured.

So far I cannot see how a few models of G-Sync-enabled monitors can justify the cost of producing such hardware... we cannot even speak of mass batches like those phones I mentioned... a niche product...


----------



## HumanSmoke (Sep 25, 2014)

Ferrum Master said:


> Well It ain't an argument... And we are not disputing the loyalty of a certain person... it is way too nasty...


Don't flagellate yourself too much.


Ferrum Master said:


> Second: ordering an additional piece of silicon, as seen in the hardware implementations of G-Sync, is no leftover from the Tegra project.


"Leftovers"? No. The G-Sync Tegra is likely a semi-custom ASIC.


Ferrum Master said:


> All Tegras are unique by their stepping numbers; even the old Tegra 2 chip in the Optimus 2X is different from the Tegra 2 in the Motorola Atrix. Their GPIO and address spaces are a completely different mess, i.e. differently customized silicon, and someone spent time on that and ordered the G-Sync one manufactured.


Yes. That's what semi-custom ASIC means. Two points: 1. You don't know how much the silicon floor plan has been reworked, and 2. You don't know if the mask/fabbing cost outweighs the gain to the company. What I do know is that the G-Sync Tegra chip is around the same size as an entry-level GPU (a GPU that, when attached to a PCB with voltage regulation, I/O, power plugs and a heatsink, retails for around $30), and the add-in G-Sync board sells for ~$200. Obviously the company isn't losing out or they wouldn't be making it (or they assume the expenditure is worth it to the company in other terms), and OEM/ODMs wouldn't be using the module. Now, I know the forums are full of people who know more than the actual people who run these businesses, so while I list a few self-evident facts, you need not address them - even if you were able - since you obviously have more pressing business in hiring out your services as the pre-eminent business strategist of this era.


Ferrum Master said:


> So far I cannot see how a few models of G-Sync-enabled monitors can justify the cost of producing such hardware...


It's no different from the R&D AIBs expend on low-volume esoteric products. You think Asus's custom limited-edition Mars and Ares turn a direct monetary profit for the company?


----------



## 15th Warlock (Sep 26, 2014)

HumanSmoke said:


> Not sure about that. I'm guessing that G-Sync is just an extension of the research Nvidia did with Adaptive VSync, and I'm also guessing that Tegra chips aren't all that expensive (I'm also certain that Nvidia loved the chance to offload them and create another batch of "design wins"). I'm also guessing that the monitor OEMs are shouldering the greatest financial burden since their inventory needs to move before the VESA specced monitors gain traction.
> As for spanking....All in all, I'd say that G-Sync has probably paid for itself in marketing. Reviews and user feedback have been positive with the only downside being the added cost of ownership. It is also extremely short sighted to think that this came as ANY surprise to Nvidia - You do realise that the Nvidia's Display Technical Marketing Manager, Pablo Ortega, is on the VESA board of directors ?
> 
> It constantly amazes me that the tech industry seems to be viewed by otherwise intelligent people as some kind of real life version of a hybrid Looney Tunes- Keystone Cops mashup.



The Tegra theory would make sense, except for the fact that in its current iteration Nvidia is using an FPGA and programming it with proprietary software to enable G-Sync.


----------



## xenocide (Sep 26, 2014)

Factoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.


----------



## astrix_au (Sep 26, 2014)

GhostRyder said:


> In all honesty this is not as surprising as you would think.  They do not want to immediately undercut their own technology and implementations with a free alternative, just would not make sense.


Yeah, fair enough, that is their right, but someone with a great monitor who will miss out just because of this won't feel so great about it. They should look at other avenues when doing R&D; maybe the fact that they could make more money from hardware was too appealing. If that is the case, it sets a bad precedent where they will only put R&D where they think they can force people to pay extra for it, instead of just having it included like all other technologies in CPUs and GPUs.
The ASUS Swift monitor is $999.00 in Australia right now, and apparently as much as $150 or more of that cost is for the G-Sync device in the monitor. I might have to buy it anyway, but I think it's an unnecessary cost.


----------



## astrix_au (Sep 26, 2014)

xenocide said:


> Factoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.


Why is that?


----------



## Scrizz (Sep 26, 2014)

wickedcricket said:


> I swear I wouldn't put *ANYTHING* in your mouth


so where would you put it?






..... I'll walk myself out.....
lol


----------



## HumanSmoke (Sep 26, 2014)

15th Warlock said:


> The Tegra theory would make sense, except for the fact that in its current iteration Nvidia is using an FPGA and programming it with proprietary software to enable G-Sync.


Much cheaper than Tegra then. Seems to add to the weight of evidence that G-Sync, if not paying its way directly for Nvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.


----------



## arbiter (Sep 26, 2014)

HumanSmoke said:


> Much cheaper than Tegra then. Seems to add to the weight of evidence that G-Sync, if not paying its way directly for Nvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.



New tech is always expensive; when there are, say, a half dozen different monitors on the market with G-Sync, costs will come down.


----------



## Strider (Sep 26, 2014)

I am sure this has been said somewhere in all of the comments, but what's one more time. lol

Adaptive Sync has been around far longer than G-Sync, just in laptops, and now that it's part of the DisplayPort 1.2a+ standard, it will be open to ALL desktop hardware.

Nvidia had no reason to create G-Sync; they could have done exactly what AMD did and pushed for adaptive sync, but they chose to create a completely separate proprietary technology. At a hardware level, G-Sync has no real advantages over Adaptive Sync or FreeSync.

Nvidia only did what they did because they are like Apple: they go to great lengths to maintain a closed ecosystem and are dead set against open anything in many aspects of their business. It's just how they operate.

PhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia. In fact, you used to be able to run PhysX on an ATI GPU via a modified driver. However, Nvidia went to great lengths to prevent this, and now if you want to run PhysX on anything but a pure Nvidia system, you need a hybrid AMD/Nvidia setup and a modified driver. The only reason this is not blocked yet is that it's a software-level block and there is little Nvidia can do to stop it.

The thing is, there is no point; by locking down PhysX Nvidia has come really close to killing it. The number of games that use it at the GPU level is minuscule, and dropping rapidly, compared to Havok or engine-specific physics, both of which can do anything PhysX can and are not hardware-locked or limited.

More recently, Nvidia has gone so far as to lock the libraries used with GameWorks to actually hinder the performance of non-Nvidia GPUs.

I am not trying to come off as a hater, or fanboy, just pointing out facts.

In my opinion, if this is true, it's a bad move for Nvidia. Hardware and software are moving more toward open standards, and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing. In the end, this will only hurt Nvidia's business. There will be little to no reason to buy G-Sync over an adaptive sync capable display. There will be fewer displays and models that will support G-Sync over adaptive since it's a standard and G-Sync is not. The G-Sync displays will likely cost more, since the hardware is proprietary, and you will get no real added benefit other than the opportunity to wear your green-team tag with pride.

=]


----------



## Naito (Sep 26, 2014)

I still see G-Sync kicking around for a while. Why? Because of a little feature that now seems to be exclusively paired with G-Sync, called ULMB (Ultra Low Motion Blur). It might not be long before Nvidia finds a way to have both technologies enabled at once (or at least to a degree). This would give an obvious advantage over Adaptive Sync, especially when high FPS is key. Adaptive Sync/G-Sync are pretty much useless at high frame rates, so in a game like CS, hardcore gamers will probably go for a G-Sync monitor that comes with ULMB (basically the LightBoost trick of yesteryear) to get the competitive edge. As far as I know, except for a one-off Samsung feature (which wasn't as good as the LightBoost hack), there are no other competing technologies to LightBoost/ULMB in the PC market.

So to sum up, if you want something like LightBoost or ULMB in the future, you'll most likely have to buy a G-Sync monitor as I'm sure ULMB will remain an exclusive feature.


----------



## Relayer (Sep 26, 2014)

Solidstate89 said:


> So is nVidia just not going to support newer versions of DP? 1.3 is already available and with 4K becoming more and more the norm they can't hope to just not update their DP standard in the hopes of not supporting VESA Adaptive Sync.
> 
> They're going to have to support it eventually, whether they like it not.


Adaptive-Sync is an optional feature. Even then, Adaptive-Sync is not FreeSync. FreeSync is AMD's way of offering real-time dynamic refresh rates (syncing the monitor's refresh rate with the card's output); it uses the capabilities of Adaptive-Sync to accomplish it. nVidia can use other features of Adaptive-Sync (lowering the refresh rate to fixed values for video playback, for example) and not offer the real-time refresh-rate adjustment that would interfere with G-Sync.

Another possibility is they are going to offer it, but are just not saying so because they don't want to hurt current Gsync sales.


----------



## HumanSmoke (Sep 26, 2014)

Strider said:


> I am not trying to come off as a hater, or fanboy, just pointing out facts.


If that's the case, you're doing a piss poor job


Strider said:


> and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing.


Meanwhile in the real world....*Nvidia holds 62% of the discrete graphics market*. These are facts - even when AMD/ATI have had a dominant product they haven't translated that into market share.









Strider said:


> PhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia.


That's right - shortly after Nvidia *PAID $150 million for Ageia*, which was shortly after *AMD turned down the opportunity to buy the same company*.
Wanting PhysX but not wanting to pay for it...well, that's like waiting for your neighbour to buy a lawnmower rather than buy one yourself, then turning up on his doorstep to borrow it....and expecting him to provide the gas for it.


Strider said:


> There will be little to no reason to buy G-Sync over an adaptive sync capable display.


You mean aside from the fact that you can't buy an Adaptive-Sync monitor at the moment?


Strider said:


> There will be fewer displays and models that will support G-Sync over adaptive since it's a standard and G-Sync is not.


And Nvidia will most likely adopt Adaptive-Sync once it does become mainstream. At the moment Adaptive Sync is a specification - Nvidia makes no money off a VESA specification, it does however derive benefit from current G-Sync sales.


----------



## Ferrum Master (Sep 26, 2014)

HumanSmoke said:


> Nvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.



I guess you are still bad at math. This product can be profitable only if it sells many hundreds of thousands...

Tegra 4 caused over a 400-million operating loss in the Tegra division over two years, as the same R&D team is incapable of efficiency. I cannot understand what kind of numbers roll around in your head, but pulling off any kind of beta silicon, ironing it out, and feeding the binary-blob coders and advertising monkeys costs millions...

It won't be profitable ever... Snake oil.


----------



## astrix_au (Sep 26, 2014)

Naito said:


> I still see G-Sync kicking around for a while. Why? Because of a little feature that now seems to be exclusively paired with G-Sync, called ULMB (Ultra Low Motion Blur).


I read somewhere that you can't use both at the same time. Here are a couple links I just found with those statements.

http://www.blurbusters.com/lightboost-sequel-ultra-low-motion-blur-ulmb/

http://hardforum.com/showthread.php?t=1812458

I don't know if they changed that yet though.


----------



## HumanSmoke (Sep 26, 2014)

Ferrum Master said:


> I guess you are still bad at math. This product can be profitable only if it sells many hundreds of thousands...


And I guess that you don't understand how a business built around a brand works. There is more to profit than sales of individual SKUs. You think AMD launched the 295X2 to deliberately lose money (I'm pretty sure it won't sell "hundreds of thousands")?
Most other people who understand how the business actually works realise that halo and peripheral products are tools to enable further sales of mainstream products. Know what else doesn't turn a monetary profit? Gaming development SDKs, software utilities, Mantle(piece), PhysX, NVAPI, and most limited-edition hardware. The profit comes through furtherance of the brand.
Why else do you think AMD poured R&D into their gaming program, Mantle(piece), and resources to bring analogues of ShadowPlay and GeForce Experience into being?

For some self-professed business genius you don't seem very astute in the strategy of marketing and selling a brand.


----------



## Ferrum Master (Sep 26, 2014)

HumanSmoke said:


> And I guess that you don't understand how a business built around a brand works. There is more to profit than sales of individual SKUs. You think AMD launched the 295X2 to deliberately lose money



Apples and oranges; a dual card costs nothing silicon-wise, just a new PCB. It's not building a whole new ASIC.

Mantle is just a natural side product of the Xbox One/PS4 SDK development. It also doesn't require new ASICs...

Boney... You act like a cheap car salesman...


----------



## RCoon (Sep 26, 2014)

HumanSmoke said:


> Mantle(piece)



I think I may need to patent that after mentioning it in the gpu release thread. Sounds like it's catching on.


----------



## HumanSmoke (Sep 26, 2014)

Ferrum Master said:


> Apples and oranges; a dual card costs nothing silicon-wise, just a new PCB. It's not building a whole new ASIC.


Yeah right, an AIO costs nothing! A 16-layer PCB costs nothing! A PLX chip costs nothing!


Ferrum Master said:


> Mantle is just a natural side product of the Xbox One/PS4 SDK development. It also doesn't require new ASICs...


So Mantle hasn't cost AMD anything? What about the game bundles AMD give away with their cards? AMD don't turn a profit on them. The $5-8 million they gave EA/DICE for BF4 sponsorship? AMD don't sell BF4, they give it away - IT COSTS THEM - Why? Because it furthers the brand to have AAA titles associated with the company's graphics.

Could you make your argument any weaker? (Don't answer that as I'm sure you'll outdo yourself )



RCoon said:


> I think I may need to patent that after mentioning it in the gpu release thread. Sounds like it's catching on.



Just doing my bit for the proud tradition of internet speculation and my love of the running gag - also an appreciative nod toward your theory on AMD's Forward Thinking™ Future


----------



## Relayer (Sep 26, 2014)

xenocide said:


> Factoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.


How G-Sync works is irrelevant. How FreeSync works is all that matters.

In theory, it's a really simple process. The card polls the monitor to find out what its refresh-rate range is; say it's 30-60 Hz. Any frame that falls within that range is released immediately and the screen is instructed to refresh with it. If the card is running too slow, it will resend the previous frame; too fast, and it will buffer the frame until the monitor is ready.

With that said, we'll have to wait until samples are available for testing before we know for sure. It could be worse, or it could be better.
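That three-way decision can be sketched in a few lines. This is purely a hypothetical illustration of the process as described, not AMD's actual implementation; the 30-60 Hz range and the `pace_frame` name are made up, and a real driver works at the display-timing level rather than returning labels:

```python
# Hypothetical sketch of the adaptive-sync pacing decision described above.
# The monitor advertises a refresh range (e.g. 30-60 Hz); for each frame the
# GPU decides whether to present it now, repeat the last frame, or hold it.

def pace_frame(frame_interval_ms, min_hz=30, max_hz=60):
    """Decide what to do with a frame that took frame_interval_ms to render."""
    max_interval = 1000.0 / min_hz  # slowest allowed refresh: 33.3 ms at 30 Hz
    min_interval = 1000.0 / max_hz  # fastest allowed refresh: 16.7 ms at 60 Hz
    if frame_interval_ms > max_interval:
        # Card is too slow: resend the previous frame to keep the panel alive
        return "repeat-previous"
    if frame_interval_ms < min_interval:
        # Card is too fast: buffer the frame until the panel is ready
        return "buffer"
    # Within the panel's range: refresh with the new frame immediately
    return "present-now"

print(pace_frame(20))  # a 50 fps frame falls inside 30-60 Hz
```

So a 20 ms frame is presented immediately, a 40 ms frame causes a repeat, and a 10 ms frame gets buffered.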



arbiter said:


> New tech is always expensive; when there are, say, a half dozen different monitors on the market with G-Sync, costs will come down.



When there is competition in the marketplace, the price will come down. As long as nVidia is the only one producing the needed hardware, they have no reason to lower prices, because they are selling to multiple OEMs.


----------



## Ferrum Master (Sep 26, 2014)

HumanSmoke said:


> Yeah right, an AIO costs nothing! 16 layer PCB costs nothing! PLX chip costs nothing!
> 
> So Mantle hasn't cost AMD anything? What about the game bundles AMD give away with their car
> 
> Could you make your argument any weaker?



Neither the PCB nor the PLX chip costs anything. First of all, the bridge ain't theirs. The PCB is usually already drawn for the mobile solution and you just have to stack them together; the VRM part is also designed from the manufacturer's reference examples... It ain't a new ASIC...

The water-cooling solution is also not designed from zero; they use Asetek, and again it's not a new ASIC built from scratch.

Okay, the Battlefield PR... I see you have gone even further into cheap car sales. And how is that connected to G-Sync? They both invest in game companies.

How is pulling your nose into AMD's PR business connected to G-Sync R&D costs and the profitability of this program...
Boney, you left your pelvic bone somewhere...


----------



## Naito (Sep 26, 2014)

astrix_au said:


> I read somewhere that you can't use both at the same time. Here are a couple links I just found with those statements.
> 
> http://www.blurbusters.com/lightboost-sequel-ultra-low-motion-blur-ulmb/
> 
> ...



Yeah, it can't work in tandem (though it may in the future?), but currently one only exists where the other does; I have not seen a monitor that supports only G-Sync without ULMB, or vice versa.


----------



## Relayer (Sep 26, 2014)

HumanSmoke said:


> If that's the case, you're doing a piss poor job
> 
> Meanwhile in the real world....*Nvidia holds 62% of the discrete graphics market*. These are facts - even when AMD/ATI have had a dominant product they haven't translated that into market share.
> 
> ...


Why would anyone care about nVidia, their market share, financials, etc? Unless you work for them, or own stock. I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited for my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.


----------



## Naito (Sep 26, 2014)

Relayer said:


> Why would anyone care about nVidia, their market share, financials, etc? Unless you work for them, or own stock. I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited for my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.



That wasn't the original argument. It was in reply to:



Strider said:


> and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing



HumanSmoke was just giving the actual statistics to prove that statement wrong.


----------



## Relayer (Sep 26, 2014)

Naito said:


> That wasn't the original argument. It was in reply to:
> 
> 
> 
> HumanSmoke was just giving the actual statistics to prove that statement wrong.


38% is a large share, and that's just discrete. Add in APUs and the consoles and AMD does have major influence.


----------



## Naito (Sep 26, 2014)

Relayer said:


> 38% is a large share, and that's just discrete. Add in APUs and the consoles and AMD does have major influence.



Consoles don't influence the PC market that much. And factoring in APUs, you'll then have to bring Intel IGPs into the equation.

Besides the statement again, says "Nvidia no longer rules the *discrete GPU world*", not other markets.


----------



## HumanSmoke (Sep 26, 2014)

Ferrum Master said:


> Nor pcb nor plx chip costs. First of all the bridge ain't theirs. The PCB is already drawn usualy for mobile solution and you have to stack them toghether, the VRM part is designed also by their manufacturer examples... It ain't a new ASIC...


Design costs money. Testing costs money. Validation costs money. The AIO costs money whether it is already designed or not (you think Asetek said to AMD, "Hey, we made this from one of our existing products, so you can have it for free"?). The PCB cost is higher and it still needs to be laid out, which costs money.


Ferrum Master said:


> The water-cooling solution is also not designed from zero; they use Asetek, and again it's not a new ASIC built from scratch.


Asetek have a contract to SELL AIOs to AMD, not give them away to the needy. The AIO adds to the bill of materials (BoM), as do the PCB and the PLX chip.


Ferrum Master said:


> Okay the battlefield PR... I see you have gone even further in cheap car sales. And how that is connected with gsync? They both invest into game companies.


Which goes back exactly to what I was saying about profit not being inextricably linked to the direct sale of any particular part.


HumanSmoke said:


> Obviously the company isn't losing out or they wouldn't be making it (*or they assume the expenditure is worth it to the company in other terms*) and OEM/ODM's wouldn't be using the module.





Ferrum Master said:


> How is pulling your nose into AMD's PR business connected to gsync R/D costs and profitability of this program...


_Pulling my nose_??? Also, I never mentioned G-Sync's R&D costs, other than the fact that G-Sync is a means to elevate, publicize, and market the Nvidia brand. Just as allying with a AAA game title does exactly the same thing, just as producing a halo hardware part does exactly the same thing, just as investing money and resources into a software ecosystem to support the hardware does exactly the same thing.

This is obviously beyond you, or you just fail to see how business uses brand strategy to maintain and build market share. It's not rocket science. Please feel free to do whatever you're doing, but your audience just decreased by at least one.


Ferrum Master said:


> Boney you left your pelvic bone somewhere...


Is that another weird Latvian saying, like "pulling my nose"?


----------



## NC37 (Sep 26, 2014)

ZOMG! nVidia not supporting new standards that are clearly better but for some reason they just don't want to...thats....thats....not new news one bit. Moving along, nothing to see here.


----------



## Naito (Sep 26, 2014)

Seems to be a bit too much of ye olde _noise pulling_ going on in this thread.


----------



## Ferrum Master (Sep 26, 2014)

Boney, you are talking about BoM costs that the final buyer covers, not the R&D cost...

You are comparing pocket-money costs, like designing an AIO... not making a brand new product....

The saying is just as weird as how you operate with facts... Just goofing around with numbers like a cheap car salesman...


----------



## Naito (Sep 26, 2014)

NC37 said:


> ZOMG! nVidia not supporting new standards that are clearly better but for some reason they just don't want to...thats....thats....not new news one bit. Moving along, nothing to see here.



And yet we have not seen Adaptive Sync/FreeSync in action or commercially available. How can you make such a claim?


----------



## HumanSmoke (Sep 26, 2014)

Relayer said:


> Why would anyone care about nVidia, their market share, financials, etc.? Unless you work for them, or own stock.


Well, firstly, all I was doing was pointing out an incorrect "statement of fact" from Strider, and secondly...
Because market share has a direct bearing upon revenue, and revenue has a direct bearing on R&D, and R&D has a direct bearing on future products, and future products have a direct bearing upon market share. You see where it is going? You really only have to look at the histories of 3dfx, S3, Rendition, Cirrus Logic, Trident, SGI, Tseng Labs, 3DLabs and a host of other IHVs to see what happens when the cycle fails to provide uplift.


Relayer said:


> I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance.  If Matrox offered a card that was better suited for my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.


Likewise, if enough people think the same way then their market share increases, they have more funds for development, and they stay competitive. Matrox won't offer you a card better suited to your needs because they failed to evolve as fast as ATI and Nvidia. The G400 was a great series of cards, but the writing was on the wall when they failed to develop a relevant successor. Everyone (should) buy on features, reliability, and performance - group all those individual buyers together and you have market share.


----------



## Naito (Sep 26, 2014)

HumanSmoke said:


> Everyone (should) buy on features, reliability, and performance - group all those individual buyers together and you have market share.



Marketing may muddle that a fair bit for the technologically un-savvy.


----------



## HumanSmoke (Sep 26, 2014)

Naito said:


> Marketing may muddle that a fair bit for the technologically un-savvy.


Yep, and that also plays as a subset of OEM sales. OEMs are by far the largest market for discrete graphics, so their marketing, pricing, competition, and whatever features they throw into the mix (freebies, financing, add-ons, discounts) have a large part to play. I know of a few people that won't stray from the old guard such as Hewlett-Packard, even though their telephone support rivals Guantanamo Bay and American Idol in the pantheon of "cruel and unusual punishment".


----------



## bwat47 (Sep 26, 2014)

astrix_au said:


> Yeah, Mantle runs awesome on the 290X; other cards are a different story, especially older GCN cards. I almost bought 2 GTX 780 Tis, but I liked the idea of a low-level API. I'm thinking of possibly going to Nvidia one day, possibly 2x 980 Tis; they seem well priced.



I have an older GCN 1.0 card (amd 280x) and bf4 with mantle runs absolutely butter smooth on it.


----------



## arbiter (Sep 26, 2014)

Strider said:


> PhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia. In fact, you used to be able to run PhysX on an ATI GPU via a modified driver. However, Nvidia went to great lengths to prevent this, and now if you want to run PhysX on anything but a pure Nvidia system, you need a hybrid AMD/Nvidia setup and a modified driver. The only reason this is not blocked yet is because it's a software-level block and there is little Nvidia can do to stop it.



Um, Nvidia was WILLING to license PhysX to AMD, but AMD refused to license it, so that puts it on AMD.


----------



## HumanSmoke (Sep 26, 2014)

arbiter said:


> Um, Nvidia was WILLING to license PhysX to AMD, but AMD refused to license it, so that puts it on AMD.


Correct. Nvidia made a number of overtures to AMD, including what amounted to a come-and-get-it offer that was picked up by most of the tech sites at the time.


> _ Jason Paul, GeForce Product Manager: We are open to licensing PhysX, and have done so on a variety of platforms (PS3, Xbox, Nintendo Wii, and iPhone to name a few). We would be willing to work with AMD, if they approached us. We can’t really give PhysX away for “free” for the same reason why a Havok license or x86 license isn’t free—the technology is very costly to develop and support. In short, we are open to licensing PhysX to any company who approaches us with a serious proposal.  _


Then of course there was the Radeon PhysX episode that AMD fanboy revisionists conveniently sweep under the carpet. You can understand AMD not wanting to be at the mercy of Nvidia owned tech going forward (just as Nvidia wouldn't support Mantle while AMD are in sole charge of its direction), but certain people seem intent on rewriting history to fit their own fairy tale ideal of good and evil.


----------



## arbiter (Sep 26, 2014)

HumanSmoke said:


> Correct. Nvidia made a number of overtures to AMD, including what amounted to a come-and-get-it offer that was picked up by most of the tech sites at the time.
> 
> Then of course there was the Radeon PhysX episode that AMD fanboy revisionists conveniently sweep under the carpet. You can understand AMD not wanting to be at the mercy of Nvidia owned tech going forward (just as Nvidia wouldn't support Mantle while AMD are in sole charge of its direction), but certain people seem intent on rewriting history to fit their own fairy tale ideal of good and evil.



AMD wants it for free so they get free R&D from Nvidia. I don't know what world AMD lives in to think Nvidia is gonna foot the bill for all the R&D work while AMD gets to use it for free and make a profit from it.

Probably the reason AMD pushed for Adaptive Sync as a standard: it's a cheap, easy way to get their tech into monitors. If they tried going about it Nvidia's way, they wouldn't have the weight with most companies to pull it off.


----------



## HisDivineOrder (Sep 26, 2014)

I wouldn't worry about nVidia supporting or not supporting a technology that so far has no monitors even announced.

Perhaps when a monitor capable of using the new VESA spec actually is announced then nVidia can test to see if they can support it with a patch or an update to their cards or if they need a new line of cards to support it.

As it is right now, they'd be guessing, since there are only G-Sync-capable monitors on the market.

If I read that quote exactly the way it's stated, I'm reading it as the guy saying, "(Today) we're focusing only on gsync (because there aren't any Freesync/Adaptive Sync (VESA's)-capable monitors out right now)."

And until they're even announced, it's going to be hard for nVidia to fully test said monitors to be sure they can guarantee compatibility.  Meanwhile, you want them to, what?  Take sales away from G-Sync because of monitors that might be available in six months, but probably will be out in 12?

Heh.


----------



## arbiter (Sep 27, 2014)

HisDivineOrder said:


> I wouldn't worry about nVidia supporting or not supporting a technology that so far has no monitors even announced.
> 
> Perhaps when a monitor capable of using the new VESA spec actually is announced then nVidia can test to see if they can support it with a patch or an update to their cards or if they need a new line of cards to support it.



Yeah, it looks like no monitors using it will be out until mid-to-late Q1, since the chips won't ship until the end of the year. It does seem to give a certain group of people something to complain about when in reality there is nothing out that uses it.


----------



## Relayer (Sep 27, 2014)

arbiter said:


> Um Nvidia was WILLING to license Physx to AMD but they refused to license it, so that puts it on AMD.


GPU-accelerated PhysX has zero value to AMD, which is why they won't pay for it. It would be like AMD charging for Mantle and expecting nVidia to buy it. Especially when you consider they've said they aren't interested in it even for free. lol

Likely the last feature that would make me choose a card would be PhysX.


----------



## Deadlyraver (Sep 27, 2014)

NVIDIA is taking advantage of its position; they have the power to do this as long as consumers will be consumers.


----------



## Fluffmeister (Sep 29, 2014)

When is Mantle going to be open?


----------



## arbiter (Sep 30, 2014)

Fluffmeister said:


> When is Mantle going to be open?



The way it's going? Never. The longer it takes, the fewer viable game devs will consider it.


----------



## Relayer (Oct 3, 2014)

arbiter said:


> The way it's going? Never. The longer it takes, the fewer viable game devs will consider it.


They said the end of the year. What do you mean by "the way it's going"?


----------



## arbiter (Oct 3, 2014)

Relayer said:


> They said the end of the year. What do you mean by, the way it is going?



MS announced on their blog that DirectX 12 will be shipping with Windows 10. Not in an update later; with it. So ....

http://blogs.msdn.com/b/directx/archive/2014/10/01/directx-12-and-windows-10.aspx


----------

