# NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology



## btarunr (Jan 7, 2019)

NVIDIA finally got around to realizing that monitors with VESA Adaptive-Sync overwhelmingly outnumber those supporting NVIDIA G-SYNC, and is going ahead with adding support for Adaptive-Sync monitors. This, however, comes with a big rider: NVIDIA is not immediately going to unlock Adaptive-Sync for all monitors, just the ones it has tested and found to work "perfectly" with its hardware. NVIDIA announced that it has found a handful of the 550+ monitor models on the market that support Adaptive-Sync, and has enabled support for them. Over time, as it tests more monitors, support for additional "certified" monitors will be added through GeForce driver updates.

At its CES event, the company provided a list of monitors it has already tested and that fulfill all requirements. G-Sync support for these models from Acer, ASUS, AOC, Agon and BenQ will be automatically enabled with a driver update on January 15th.

*Update*: We received word from NVIDIA that you can manually enable G-SYNC on all Adaptive-Sync monitors, even non-certified ones: "For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too."

*Update 2*: NVIDIA has released the new Adaptive-Sync-capable drivers; we tested G-SYNC on a FreeSync monitor.









----------



## OneMoar (Jan 7, 2019)

HOLY SHIT I know it's winter, but did hell freeze over?


----------



## ShurikN (Jan 7, 2019)

Hey g-sync, Physx says hi


----------



## xkm1948 (Jan 7, 2019)

DAYMMMN NICE!



ShurikN said:


> Hey g-sync, Physx says hi



You do know PhysX is used in games all the time right?


----------



## ShurikN (Jan 7, 2019)

xkm1948 said:


> DAYMMMN NICE!
> 
> 
> 
> You do know PhysX is used in games all the time right?


You do know I'm talkin about proprietary hardware, right? 
And if by all the time you mean a handful of games, then sure...


----------



## xkm1948 (Jan 7, 2019)

Reading the slides carefully, i think you can even turn on ASync on non official supported monitors. It just not gonna give best ASync performance. Welp still better than no ASync support at all.


----------



## Cybrshrk (Jan 7, 2019)

It was good for them to get the ball rolling for VRR with G-Sync, but the time has come for an open standard to be the norm, and good on Nvidia for realizing this sooner rather than later.

With the announcement of HDMI 2.1 on LG OLEDs and this, I'll be happy to upgrade my 2016 OLED to this year's model and finally have the full, complete 4K living-room PC experience I've been trying to build for like 5 years.


----------



## epiqpnwage (Jan 7, 2019)

xkm1948 said:


> DAYMMMN NICE!
> 
> 
> 
> You do know PhysX is used in games all the time right?



You do know that he is talking about the PhysX cards Nvidia tried to sell to sheep like you who don't know better, right?


----------



## kastriot (Jan 7, 2019)

G-Sync monitors down toilet.


----------



## Fatalfury (Jan 7, 2019)

So its all over for AMD...
The only advantage AMD had ..........aaaandd  its ..........Gone!!!


----------



## xkm1948 (Jan 7, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!



Well, GCN GPUs can still mine crypto coins like a madman, if it ever takes off again


----------



## Candor (Jan 7, 2019)

So support for current monitors on the market now?

I'm assuming there will be no love for older monitors?

I wish I was an optimist


----------



## TheLostSwede (Jan 7, 2019)

xkm1948 said:


> Reading the slides carefully, i think you can even turn on ASync on non official supported monitors. It just not gonna give best ASync performance. Welp still better than no ASync support at all.



For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on - it may work, it may work partly, or it may not work at all. To be sure, only purchase a monitor listed as “G-SYNC Compatible” on our site. 
From https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/


----------



## MuhammedAbdo (Jan 7, 2019)

The news piece is wrong, GSync will be enabled on ANY freesync monitor



> For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.


https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/


----------



## sam_86314 (Jan 7, 2019)

Perfect timing, since I just got a QHD 144Hz FreeSync monitor.

Makes me wonder how no one figured out how to force FreeSync support on Nvidia cards since it was just a software switch.


----------



## Candor (Jan 7, 2019)

400 tested and 12 passed?

I'm not sure what to think about this.


----------



## Nkd (Jan 7, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!



Not really. I doubt that ever made much of a difference. Price/performance is what's gonna sell. This is just going to sell more Adaptive-Sync monitors, if anything.



Candor said:


> 400 tested and 12 passed?
> 
> I'm not sure what to think about this.



I think that is bull. It seems more like they are trying to push select monitors from manufacturers that make Gsync monitors. Maybe they have some deal worked out lol.



sam_86314 said:


> Perfect timing, since I just got a QHD 144Hz FreeSync monitor.
> 
> Makes me wonder how no one figured out how to force FreeSync support on Nvidia cards since it was just a software switch.



It wasn't that easy. You could have a workaround in Windows by having the Nvidia card do the rendering and an AMD GPU handle adaptive sync. But if Nvidia doesn't have any code whatsoever to support adaptive sync, there is nothing really for anyone to mess around with in the first place. Now that they do, they know that if they lock it to tested monitors, people will unlock it for other FreeSync monitors anyway, so they allow you to force it on any FreeSync monitor.


----------



## steen (Jan 7, 2019)

sam_86314 said:


> Makes me wonder how no one figured out how to force FreeSync support on Nvidia cards since it was just a software switch.



Desktop driver locked to only support variable refresh with Gsync module. New drivers will support both, but Freesync won't be as snappy on Nvidia. Choose your reasons.  (There are technical reasons too).



Candor said:


> 400 tested and 12 passed?
> 
> I'm not sure what to think about this.



PR. On Nvidia HW. Support will improve over time. Freesync2+ should be fine.


----------



## Viruzz (Jan 7, 2019)

ShurikN said:


> You do know I'm talkin about proprietary hardware, right?
> And if by all the time you mean a handful of games, then sure...



You have no idea! It's used on consoles all the time, and it was open-sourced a long time ago. What drugs are you on?


----------



## R0H1T (Jan 7, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!


Is that so? Must be why RTX cards are now gonna fly off the shelves... oh wait


----------



## SIGSEGV (Jan 7, 2019)

so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?

(0_0)

It's like M$ vs Linux. This is not good at all.


----------



## ISI300 (Jan 7, 2019)

Thanks Nvidia, for finally coming to your senses.
So will any cards that support G-Sync work? What about 700-series cards?


----------



## Space Lynx (Jan 7, 2019)

This is great news.


----------



## davideneco (Jan 7, 2019)

SIGSEGV said:


> so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?
> 
> (0_0)



Do you know what is gsync ???? Apparently not


----------



## SIGSEGV (Jan 7, 2019)

davideneco said:


> Do you know what is gsync ???? Apparently not



enlighten me pls ;-)


----------



## HornetMaX (Jan 7, 2019)

Not long ago I bought a 21:9 35" G-Sync monitor (AOC ag352ugc): I feel somebody has just robbed me of 200 dollars/euros.

That said, it's obviously a good decision. Just 5 years too late.


----------



## Space Lynx (Jan 7, 2019)

SIGSEGV said:


> enlighten me pls ;-)



Google it.


----------



## Fatalfury (Jan 7, 2019)

R0H1T said:


> Is that so, must be why RTX cards are now gonna fly off the shelves oh wait




It will. Once the RTX 2060 & RTX 2050 Ti are on the shelves.


----------



## Mistral (Jan 7, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!



By the same level of logic, with this nVidia practically admits that it's been ripping clients off for years.



Fatalfury said:


> it will. Once RTX 2060 & rtx 2050 ti are on the shelves.



Can't wait for all the RTX enabled games that an RTX 2050 will be able to run...


----------



## FordGT90Concept (Jan 7, 2019)

Looks like NVIDIA finally pulled their head out of their ass.  I suspect their stock price tumbling had something to do with it.

NVIDIA says FreeSync may not 100% work with G-Sync and this statement is true because drivers have to be optimized for it to work right on most monitors.  NVIDIA's drivers naturally lack the optimizations AMD did so...support is iffy...but if you have an NVIDIA card and a FreeSync monitor, no harm in trying it but your mileage will vary.


----------



## john_ (Jan 7, 2019)

Well, this is something really positive. Unfortunately that "GSync compatible" logo, will drive prices up. Nvidia found a way, not only to make money, but also to narrow the price difference between good FreeSync monitors and Nvidia monitors. I only hope manufacturers keep two monitors in the market. One with the Nvidia Gsync compatible badge and the higher price to cover Nvidia's royalties and one at the same quality standards, only FreeSync logo and the correct, lower, price.


----------



## FordGT90Concept (Jan 7, 2019)

GSync compatible should be offered at no additional cost like FreeSync compatible is.


----------



## beautyless (Jan 7, 2019)

Glad to hear it; now I'm willing to buy an NVIDIA graphics card.


----------



## CheapMeat (Jan 7, 2019)

ShurikN said:


> Hey g-sync, Physx says hi




Hell did freeze over: https://www.techpowerup.com/forums/threads/nvidia-physx-now-open-source.250193/


----------



## john_ (Jan 7, 2019)

FordGT90Concept said:


> GSync compatible should be offered at no additional cost like FreeSync compatible is.


It's Nvidia. It will have a cost. And manufacturers will be willing to pay that extra cost, because it will be an extra assurance to the buyer that, not only this monitor is compatible with their card, but also that it offers top quality.


----------



## Vayra86 (Jan 7, 2019)

OneMoar said:


> HOLY SHIT I know it winter but did hell freeze over ?



The moment Intel announced they were going dedicated GPU and supporting VESA adaptive sync, I kinda knew this was going to happen.

In fact, ever since FreeSync got launched people said it would happen at some point.



john_ said:


> It's Nvidia. It will have a cost. And manufacturers will be willing to pay that extra cost, because it will be an extra assurance to the buyer that, not only this monitor is compatible with their card, but also that it offers top quality.



No, they are mostly certifying and testing older monitors. It's as free as FreeSync is. And the article says they will also enable the option on non-certified FreeSync monitors...


----------



## londiste (Jan 7, 2019)

Candor said:


> 400 tested and 12 passed?
> I'm not sure what to think about this.


Nvidia is trying to maintain some type of minimum acceptable quality on these. I am willing to bet they will only go after anything that has LFC support.
They are not in a rush and probably will slowly go over potential candidates for GSync Compatible status over time. 
The monitors from the announcement are not new and some of these are from years ago so this will not be just new monitors that have a chance.

The lists for both manufacturers are:
- https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/ - 62 GSync monitors, plus 12 GSync Compatible from the announcement.
- https://www.amd.com/en/products/freesync-monitors - 568 FreeSync monitors

Looking closer at the FreeSync monitors: filtering by LFC = Yes cuts the list down to 330, and there are a lot of bullshit-spec monitors listed. 48-75 Hz is not a valid LFC range, not even with the newer, reduced AMD spec that has proven to be problematic. The original requirement was a FreeSync range of 2.5x; the updated one is 2.0x (which AMD acknowledges in some places but not others).
Among the FreeSync monitors with LFC, there are 40 with DisplayPort input, 118 with HDMI input and 172 with both. It's unknown whether Nvidia will be able to support FreeSync over HDMI, but they probably will not even if they can, at least not officially - note the DisplayPort Adaptive-Sync messaging all over the announcement.
- All 40 of the DisplayPort-only monitors have proper LFC support.
- Of the 172 monitors with both inputs, almost all are OK by spec; 11 (17 by HDMI input) are in the likely problematic 2.0-2.5x range.
- Of the 118 HDMI-only monitors, just 8 have proper LFC support.

Altogether that's 201 potential candidates for Nvidia to make GSync Compatible.
That is honestly less than I expected.
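The 2.0x (or original 2.5x) LFC rule above boils down to a one-line check. A quick illustrative sketch, not any official tool (the function name and defaults are mine):

```python
# LFC (Low Framerate Compensation) needs the panel's maximum refresh to be
# at least ~2x its minimum, so the driver can double frames when FPS drops
# below the VRR floor. AMD's original guidance used 2.5x; the relaxed spec
# discussed above uses 2.0x.

def supports_lfc(vrr_min_hz: float, vrr_max_hz: float,
                 ratio: float = 2.0) -> bool:
    """True if the VRR range is wide enough for frame doubling."""
    return vrr_max_hz >= ratio * vrr_min_hz

print(supports_lfc(48, 75))        # the 48-75 Hz range fails: 75 < 96
print(supports_lfc(48, 144))       # a 48-144 Hz panel passes: 144 >= 96
print(supports_lfc(40, 60, 2.5))   # 40-60 Hz fails the original 2.5x rule
```

By this check, a 48-75 Hz monitor cannot do LFC under either version of the spec, which is exactly the kind of "bullshit spec" entry the filtering above weeds out.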


----------



## OneMoar (Jan 7, 2019)

Unless your framerates have wild swings, G-Sync really does you no good over VESA adaptive sync/FreeSync, especially at <85 Hz.
I expected this to happen *eventually*, but this is kind of out of left field.


----------



## Vayra86 (Jan 7, 2019)

I'm quietly laughing as well at all those people paying a premium for Gsync right now... especially those I've warned not to who did it anyway 



Fatalfury said:


> it will. Once RTX 2060 & rtx 2050 ti are on the shelves.



At 360-370 EUR a pop... yeah right


----------



## stimpy88 (Jan 7, 2019)

A customer friendly move from nGreedia...  A customer backlash, and VESA calling time on your tricks can really be a good thing, aye?  Now try and drop those insane prices, and you might have a serious PR win on your hands.

I'm in the market for a new monitor, and was looking at a G-Sync monitor, but did not care much for paying the nGreedia tax...  I wonder if we will see a drop in the price of these monitors, before G-Sync is completely gone from the market?

So, thank you AMD, for successfully lobbying VESA to include support for variable refresh rates in its standards. nGreedia must really hate you right about now...



sam_86314 said:


> Perfect timing, since I just got a QHD 144Hz FreeSync monitor.
> 
> Makes me wonder how no one figured out how to force FreeSync support on Nvidia cards since it was just a software switch.


It was blocked in the nGreedia driver.


----------



## ZoneDymo (Jan 7, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!



obvious troll comment, still there is one other advantage, and that is that AMD is AMD and not Nvidia, which is a plus in my book.


----------



## stimpy88 (Jan 7, 2019)

john_ said:


> Well, this is something really positive. Unfortunately that "GSync compatible" logo, will drive prices up. Nvidia found a way, not only to make money, but also to narrow the price difference between good FreeSync monitors and Nvidia monitors. I only hope manufacturers keep two monitors in the market. One with the Nvidia Gsync compatible badge and the higher price to cover Nvidia's royalties and one at the same quality standards, only FreeSync logo and the correct, lower, price.



I'm not really sure what you're saying here. Are you actually saying that you still want to pay more money for a logo on a monitor that performs exactly the same function as one without the logo that is $100-$200 cheaper? You do realise that G-Sync is now doing exactly the same job, just with the nGreedia chip and its royalties enabling/allowing variable refresh rate support in the nGreedia driver? The writing was on the wall for Greed-Sync ever since AMD successfully lobbied VESA to include it as a standard in the latest DisplayPort specifications... A fool and his money, and all that...

Any logical/sane person would be thanking AMD for saving them hundreds of dollars and removing a false, proprietary "standard" from the market that only still exists due to greed. It just would not have been possible for nGreedia to have kept G-Sync going after DisplayPort 1.4 and HDMI 2.1 were released, which has happened, and monitors are already for sale on the market. You also have to remember that the monitor manufacturers have to pay nGreedia for that shiny G-Sync logo, so why should they, and why should I, when it literally does nothing on the latest tech?


----------



## Metroid (Jan 7, 2019)

I have a Samsung 28-inch UE590, which is FreeSync compatible, and it is not on Nvidia's list of compatible monitors; as a matter of fact, only 12 monitors out of 400 tested are. What bull is that?

That is a way of saying: "hey, we support FreeSync, but only on the monitors we have chosen, from manufacturers that gave us some money back."


----------



## john_ (Jan 7, 2019)

Vayra86 said:


> No, they are certifying and testing older monitors mostly. Its as much free as FreeSync is. And the article says they will also enable the option on non-certified FreeSync monitors...


It's in Nvidia's best interest to offer it at a cost. It will make them money, and I bet they think they deserve that money, being the ones who brought Adaptive Sync to PCs, and it will also somewhat narrow the price difference between top FreeSync monitors and GSync monitors. And again, this is Nvidia. They never lose the chance to make money. Of course we disagree, but that's just my opinion. Manufacturers, on the other hand, will gladly pay a small amount if it can move their models to the top of the list of possible monitors to buy that Nvidia owners have in their minds.


stimpy88 said:


> Are you actually saying that you still want to pay more money for a logo on a monitor that performs exactly the same function as one without the logo, and $100-$200 cheaper?


Never said that.


----------



## kieguru (Jan 7, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!



Not really, I'd say AMD's biggest advantage currently is pricing. I've seen an RX 570 for £130 new, fantastic price for 1080p with pretty close, if not the same scores as a 3gb 1060 @ £200.


----------



## BigDaddy_Jeff (Jan 7, 2019)

Acer XFA240 and Acer XF240H are the same monitor, right? I did the research and the specs are exactly the same.

FYI, I own the XF240H.


----------



## londiste (Jan 7, 2019)

Metroid said:


> I have a Samsung 28 inch ue590 freesynch compatible and it is not on nvidia compatible monitors, matter of fact only 12 monitors from 400 tested, what a bull is that?
> 
> That is a way of saying, "hey we support freesynch but only the monitors we have got some money back from manufactures we have chosen so.


U28E590D? Freesync range 40-60Hz. Sorry but when we look at variable refresh rate, this is a bullshit monitor. Samsung added FreeSync on this just for the nice logo and so they could say it supports it.


----------



## FordGT90Concept (Jan 7, 2019)

londiste said:


> Nvidia is trying to maintain some type of minimum acceptable quality on these. I am willing to bet they will only go after anything that has LFC support.
> They are not in a rush and probably will slowly go over potential candidates for GSync Compatible status over time.
> The monitors from the announcement are not new and some of these are from years ago so this will not be just new monitors that have a chance.
> 
> ...


LFC has nothing to do with it.  Most likely NVIDIA phoned FreeSync monitor manufacturers and asked them to send whatever monitors they were willing to send for adaptive sync compatibility testing, and those manufacturers only sent 12 monitors.  These are likely all relatively new monitors that the manufacturers are still producing, so getting them G-Sync compatibility tested gives them another selling point against the competition.



stimpy88 said:


> I wonder if we will see a drop in the price of these monitors, before G-Sync is completely gone from the market?


Not likely unless NVIDIA is abandoning G-Sync modules altogether.  G-Sync modules do have technical advantages over VESA VRR so they will likely be able to sell the remaining stock at MSRP even with VESA VRR options available.



stimpy88 said:


> So, thank you AMD, for successfully lobbying VESA to include support for variable refresh rates in its standards. nGreedia must really hate you right about now...


NVIDIA is a VESA member too.  I wouldn't be surprised if they were supportive of the adaptive sync standard knowing how pathetic G-Sync monitor sales were.  Their hardware didn't support VRR until now because they were trying to recoup R&D costs.  They knew it would eventually happen though--especially where TVs are concerned.



londiste said:


> U28E590D? Freesync range 40-60Hz. Sorry but when we look at variable refresh rate, this is a bullshit monitor. Samsung added FreeSync on this just for the nice logo and so they could say it supports it.


4K is difficult to drive for video cards and, if you do some searching, there's a lot of non-LFC 4K monitors out there with 30~60 Hz range because GPUs can solidly deliver that range at 4K.  Manufacturers wouldn't bother certifying it if there wasn't any demand for it.


----------



## renz496 (Jan 7, 2019)

ShurikN said:


> You do know I'm talkin about proprietary hardware, right?
> And if by all the time you mean a handful of games, then sure...



lol, handful of games? The market share between PhysX and Havok is pretty much even right now, with a slight edge towards Havok. That is not a handful.


----------



## stimpy88 (Jan 7, 2019)

john_ said:


> Never said that.



OK, but what does this mean then...?
"*I only hope manufacturers keep two monitors in the market*. One with the Nvidia Gsync compatible badge and the *higher price* to cover Nvidia's royalties and one at the same quality standards, only FreeSync logo and the correct, *lower, price*."

Are you saying that you want people "confused" into spending more money for a badge that is totally meaningless on the latest monitors?  If not, then I simply do not understand what you are trying to say.


----------



## Vayra86 (Jan 7, 2019)

john_ said:


> It's in Nvidia's best interest to offer it with a cost. It will make them money and I bet they think they diserve that money being those who brought Adaptive Sync to PCs and also will narrow somehow the price difference between top FreeSync monitors and GSync monitors. And again, this is Nvidia. They never lose the chance to make money. Of course we disagree, but that's just my opinion. Manufacturers on the other hand will gladly pay a small amount if that can move their models to the top of the list of possible monirtors to buy, that Nvidia owners have in their minds.



You can think whatever you prefer to think, but please just read the damn article because it says _existing monitors have been certified. _Do you really believe those will now get a price increase pushed by Nvidia? You realize that would hit the FreeSync part of the deal as well?

You're not making any sense.


----------



## Nxodus (Jan 7, 2019)

B-but this forum told me Nvidia is evil and greedy


----------



## john_ (Jan 7, 2019)

stimpy88 said:


> OK, but what does this mean then...?
> "*I only hope manufacturers keep two monitors in the market*. One with the Nvidia Gsync compatible badge and the *higher price* to cover Nvidia's royalties and one at the same quality standards, only FreeSync logo and the correct, *lower, price*."
> 
> Are you saying that you want people "confused" in to spending more money for a badge that is totally meaningless on the latest monitors?  If not, then I simply do not understand what you are trying to say?


 Yes, you obviously do not understand. The easiest thing for a manufacturer would be to have only one "GSYNC compatible" monitor in the market. That means, EVERYONE pays Nvidia's tax, even those who have AMD cards.



Vayra86 said:


> You can think whatever you prefer to think, but please just read the damn article because it says _existing monitors have been certified. _Do you really believe those will now get a price increase pushed by Nvidia? You realize that would hit the FreeSync part of the deal as well?
> 
> You're not making any sense.


You are also free to believe whatever you prefer to think. You can also choose to read what the damn article actually says and not what you want it to say. If you could do that, you would understand that Nvidia specifically pointed out that out of 500+ monitors, only 12 were good enough for them. That's a deliberate attempt to create the image that over 90% of FreeSync monitors out there are at least subpar in some areas. That's a deliberate attempt to give extra value to their "GSync compatible" badge. That's a clear indication that the badge will come at a cost. Also, if you would be kind enough to use logic, you could at least understand that I am talking about the prices of future monitors. And if you read the damn article again, you will understand that, except for those 12 monitors, Nvidia does not consider the others GSync compatible, even if they are. They didn't pass their quality tests. They work, but Nvidia does not consider them good enough for the task. As for that "_existing monitors have been certified_", let me repeat myself: read the damn article that is written, not the article that you have in your mind.


----------



## Vayra86 (Jan 7, 2019)

Nxodus said:


> B-but this forum told me Nvidia is evil and greedy



You were asking for more Gsync-enabled monitor offerings in another topic a few days back.

There you go. Free of charge



Mistral said:


> By the same level of logic, with this nVidia practically admits that it's been ripping clients off for years.



Wasn't that blatantly obvious from the get-go? Gsync was always a ripoff.


----------



## Axaion (Jan 7, 2019)

Friggin, finally, bye g-sync tax


----------



## Nxodus (Jan 7, 2019)

Vayra86 said:


> You were asking for more Gsync enabled monitor offerings in another topic a few days back.
> 
> There you go  Free of charge
> 
> ...



I didn't expect getting hundreds of viable monitors in a single day 
G-Sync wasn't a complete ripoff, come on mate. It's not black and white.


----------



## Vayra86 (Jan 7, 2019)

Nxodus said:


> I didn't expect getting hundreds of viable monitors in a single day
> G-snyc wasn't a complete ripoff, come on mate. It's not black and white.



Today's article underlines that it was... Nvidia just certified monitors without a G-Sync module as capable. I don't see how you can get a clearer definition of ripoff, honestly.


----------



## Metroid (Jan 7, 2019)

londiste said:


> U28E590D? Freesync range 40-60Hz. Sorry but when we look at variable refresh rate, this is a bullshit monitor. Samsung added FreeSync on this just for the nice logo and so they could say it supports it.



It's a 4K monitor, hence the 40-60 range. I overclocked it to 72 Hz and made it a nice 2560x1440. They should have supported it, since it is a 4K monitor.



Axaion said:


> Friggin, finally, bye g-sync tax



There is no "bye g-sync tax", since they choose which monitors they will allow, meaning there is a devil's deal around it.

The bye g-sync tax would only be true if all FreeSync monitors were allowed, which is not the case.


----------



## Vayra86 (Jan 7, 2019)

Metroid said:


> The bye g-sync tax would be true only if all freesync monitors would be allowed which is not the case.



It is.

*Update: We received word from NVIDIA that you can manually enable G-SYNC on all Adaptive-Sync monitors, even non-certified ones: "For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too."*



Nxodus said:


> Ripoff means you get nothing for your money. You still got something out of g-sync... for a tax



I guess, if that makes people feel better, by all means


----------



## Nxodus (Jan 7, 2019)

Vayra86 said:


> Todays' article underlines that it was... Nvidia just certified monitors without Gsync module as capable. I don't see how you can get a clearer definition of ripoff honestly.



Ripoff means you get nothing for your money. You still got something out of g-sync... for a tax


----------



## PanicLake (Jan 7, 2019)

SIGSEGV said:


> so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?
> 
> (0_0)
> 
> It's like M$ vs Linux. This is not good at all.


You do realize that G-Sync monitors are destined to disappear, right? Who will buy a monitor that adds $100-200 to the price just to have G-Sync now?
Also, a monitor with only G-Sync will lock you in with nVidia, limiting your future video card choices.



Nxodus said:


> B-but this forum told me Nvidia is evil and greedy


It still is, don't worry. The plan didn't go as they planned it yet again.


----------



## Turmania (Jan 7, 2019)

Finally a proper move from the green camp. Makes me wonder, are they scared of the red camp's upcoming GPUs?


----------



## Metroid (Jan 7, 2019)

Vayra86 said:


> It is.
> 
> *Update*_: We received word from NVIDIA that you can manually enable G-SYNC on all Adaptive-Sync monitors, even non-certified ones: "For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too." _



Good, now we just need a driver update. Wait, we don't have that yet.


```
GeForce Game Ready Driver                                  

Version:  417.35  WHQL  Release Date:  2018.12.12  Operating System:  Windows 10 64-bit Language: English (US)  File Size: 543.95 MB
```



PanicLake said:


> You do realize that G-Sync monitors are destined to disappear right? Who will buy a monitor that add 100-200$ to the price just to have G-Sync now?



Thank AMD for this. We all knew this had been coming; we just did not know when. Finally.


----------



## Rahmat Sofyan (Jan 7, 2019)

I feel sorry for "G-Synced" monitor owners... they paid too much...


----------



## john_ (Jan 7, 2019)

Turmania said:


> Finally a proper move from the green camp. Makes me wonder, are they scared of the red camp's upcoming GPUs?



This could be an indication that Navi/7nm Vega/7nm Polaris, whatever AMD is going to announce, is not that bad. Considering that Nvidia is making the same mistake it made 10 years ago with PhysX, I mean concentrating on promoting a specific feature and asking extra money for it, but seeing that people don't bite, letting FreeSync monitors work with Nvidia cards was probably a necessary move.


----------



## Metroid (Jan 7, 2019)

Rahmat Sofyan said:


> I feel sorry for "gsynced" monitor owner .. they paying too much ..



They got their e-penis self-confidence enlarged for some time; for them it was a good deal nonetheless.


----------



## Aquinus (Jan 7, 2019)

btarunr said:


> NVIDIA is not immediately going to unlock adaptive-sync to all monitors, just the ones it has tested and found to work "perfectly" with their hardware.


That reeks of desperation to not make G-Sync look like such a waste of money. Watch them only approve the most expensive displays just to make G-Sync look worth it.

Honest question: if you could use a FreeSync display on an nVidia card (any FreeSync display,) would you *ever* buy a G-Sync screen with the nVidia tax tacked on to it? I sure as hell wouldn't.


----------



## londiste (Jan 7, 2019)

Turmania said:


> Finally a proper move from the green camp. Makes me wonder are they scared off red camp`s upcoming GPU`s?


VRR (variable refresh rate) is coming up in a big way.
- DP Adaptive-Sync, which (half of) FreeSync is based on, has been an established thing for several years now.
- HDMI 2.1 is coming; just look at the announcements at this CES. It has VRR in the standard, and it is coming to TVs first, with monitors probably not far behind.
Nvidia had a desperate need to get on this VRR standards train. G-Sync worked for them when there were no widely adopted alternatives, but with HDMI now joining DP in having standard VRR functionality... they had no choice.



Metroid said:


> It's a 4k monitor, hence why 40-60, I overclocked it to 72hz and made a nice 2560x1440. They should have supported it since is a 4k monitor.


The thing is, FreeSync will only work on it while your FPS is between 40 and 60. With LFC, it would also work when FPS drops below that. This is one of the things GSync got right from the start.
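The LFC behaviour described here can be sketched in a few lines. This is a toy model of the frame-multiplication idea only, not any vendor's actual implementation; the 40-60 Hz range is taken from the monitor discussed above.

```python
# Illustrative sketch of Low Framerate Compensation (LFC): when FPS
# falls below the panel's minimum VRR refresh, the driver presents
# each frame multiple times so the effective refresh rate lands back
# inside the supported window.

def lfc_refresh(fps, vrr_min, vrr_max):
    """Refresh rate the panel is driven at for a given frame rate."""
    if fps >= vrr_min:
        return fps  # inside the window: refresh simply tracks FPS
    # Repeat each frame enough times to re-enter the window.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    driven = fps * multiplier
    if driven > vrr_max:
        # No multiple of this FPS fits inside the window; this is why
        # panels with a narrow (< 2:1) VRR range cannot support LFC.
        raise ValueError(f"{fps} FPS cannot be mapped into {vrr_min}-{vrr_max} Hz")
    return driven

print(lfc_refresh(25, 40, 60))  # each frame shown twice -> 50
```

Note how 25 FPS maps cleanly to 50 Hz, while 35 FPS has no multiple inside a 40-60 Hz window at all; that is the practical reason LFC needs a VRR range of roughly 2:1 or wider.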


----------



## Vayra86 (Jan 7, 2019)

john_ said:


> This could be an indication that Navi/7nm Vega/7nm Polaris, whatever AMD is going to announce, is not that bad. Considering that Nvidia is doing the same mistake that it did 10 years ago with PhysX, I mean concentrating into promoting a specific feature and asking extra money for that, but seeing that people don't bite, letting FreeSync monitors work with Nvidia cards probably was a necessary move.



Possibly, but I don't consider that very likely. Nvidia has lots of wiggle room in their line up either in performance or in price. That is why they can be arrogant with Turing and launch it as they did, stating they will single handedly move gaming towards realtime RT (crystal ball says it ain't happenin').

I think what's more likely is that G-Sync sales were not all that much to begin with (it's an overpriced niche) and this counters the AMD FreeSync offerings that are consistently better deals in the midrange. AMD doesn't really even need new GPUs to make G-Sync look 'meh'.

One thing is absolutely true, and that is 'Thank you AMD'. They followed suit after G-Sync with their own open technology, and this time it stuck, it got better, and it effected a change in the marketplace. It's not always like that, and I think it's fair to admit this was a fantastic move on their part. Not so much Nvidia; for them it's just reality catching up, and they should have done this years ago.


----------



## stimpy88 (Jan 7, 2019)

Turmania said:


> Finally a proper move from the green camp. Makes me wonder are they scared off red camp`s upcoming GPU`s?


Nope, I don't think AMD have anything coming that nGreedia can't easily better, unfortunately.  However, nGreedia may well be aware of just how low the average customer regards their shady shit.

I wonder what the next item of "goodwill" nGreedia will bestow upon us mere plebs, after the Phys-X and Greed-Sync announcements...?


----------



## Metroid (Jan 7, 2019)

LG OLED TVs will be coming with variable refresh rates, and because of that and some other things nvidia decided to do this. Nvidia has been cashing in on G-Sync since 2013 anyway; lots of money was made, and it lasted until now, I guess.


----------



## Rahmat Sofyan (Jan 7, 2019)

Metroid said:


> They got their e-penis self confidence larger for sometime, for them it was a good deal nonetheless.



LOL, but yeah, from the first launch until now I have still wondered: is the G-Sync module really needed, or is it just driver and software to lock and unlock the feature?


----------



## TheGuruStud (Jan 7, 2019)

Nxodus said:


> B-but this forum told me Nvidia is evil and greedy



They want to sell more Turdings and the jig is up. G-Sync sales must be abysmal, too.

Nvidia told me adaptive sync was so terrible that you needed their hardware solution with a fat markup.


----------



## Chazragg (Jan 7, 2019)

Was happy to see the MG278Q appear on the list; bought it not long back. Bring on the 15th!


----------



## londiste (Jan 7, 2019)

Rahmat Sofyan said:


> LOL, but yeah at first launch till now I still wonder, is it gsync module really needed or just driver and software to lock and unlock the feature ..


It was needed back in 2013 but now has widespread alternatives. The G-Sync module is basically a scaler with variable refresh rate support.


----------



## Metroid (Jan 7, 2019)

Also, nvidia had nothing to announce other than the RTX 2060; they needed to step up to AMD since it looks like AMD will be showing off 7nm GPUs and CPUs. Intel will also have to show something interesting; if not, their shares will crash hard.


----------



## Aquinus (Jan 7, 2019)

I find it amusing that this coincided with the open source amdgpu driver gaining vrr support as of 5.0-rc1 that was released yesterday.


----------



## Legacy-ZA (Jan 7, 2019)

Finally.

I was about to go and enable this in my nvidia driver when I realized I am still using my old 120Hz Samsung monitor from before FreeSync and G-Sync were a thing. Ah, the days when people fought back feebly with arguments like "you can't see more than 30fps" ^_^

I was in the market to buy a new monitor quite a while ago, then I saw the price tags of the monitors that came with G-Sync; suffice to say, I didn't buy one. I will have to save up again for a monitor; until then, fast sync will have to do.

nVidia, take this to heart: some people will never support such greed, me being one of them. That being said, you dodged a bullet, nVidia. I was going to buy an AMD graphics card with a FreeSync monitor this year; now you get to keep me as a customer, for now... as I will just buy a monitor. If your greed continues, however, I will just buy an AMD GPU in the future, just like how I am going to give Intel the finger for good when AMD releases Zen 2 this year. Just how I already gave Razer the finger.


----------



## Nxodus (Jan 7, 2019)

TheGuruStud said:


> They want to sell more Turdings and the jig is up.



Or, maybe, if we don't view this from an "Nvidia hating" angle, monitor manufacturers might have thrown a hissy fit about the eternal struggle to pick between two standards. I'm just theorycrafting. I know, Nvidia is a soulless megacorp, but still, occasionally, they might do something for free, to boost the industry and innovation and whatnot. I mean, they have the power to keep this g-sync misery going for eternity, so why retreat?


----------



## FordGT90Concept (Jan 7, 2019)

Aquinus said:


> I find it amusing that this coincided with the open source amdgpu driver gaining vrr support as of 5.0-rc1 that was released yesterday.


NVIDIA couldn't be that quick to do their testing and make this announcement.


The driver is releasing on the 15th.


----------



## Aquinus (Jan 7, 2019)

FordGT90Concept said:


> NVIDIA couldn't be that quick to do their testing and make this announcement.


This has been staged for release for over a month in the Linux kernel. It only just now made it into mainline. They had time since it was announced late in November.


----------



## FordGT90Concept (Jan 7, 2019)

That makes more sense then.

I still think NVDA stock tumbling had something to do with it too.  It's not clear that NVIDIA is even making money on G-Sync modules because of their low volumes.  It may make financial sense to discontinue production.

Put the two together, along with CES as the perfect launch venue, and you get this announcement.


----------



## Metroid (Jan 7, 2019)

> There are hundreds of monitor models available capable of variable refresh rates (VRR) using the VESA DisplayPort Adaptive-Sync protocol. However, the VRR gaming experience can vary widely.
> To improve the experience for gamers, NVIDIA will test monitors. Those that pass our validation tests will be G-SYNC Compatible and enabled by default in the GeForce driver.
> G-SYNC Compatible tests will identify monitors that deliver a baseline VRR experience on GeForce RTX 20-series and GeForce GTX 10-series graphics cards, and activate their VRR features automatically.
> Support for G-SYNC Compatible monitors will begin Jan. 15 with the launch of our first 2019 Game Ready driver. Already, 12 monitors have been validated as G-SYNC Compatible (from the 400 we have tested so far). We’ll continue to test monitors and update our support list. For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.
> *For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on – it may work, it may work partly, or it may not work at all.*



https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

So there we have it, it may not work at all.


----------



## FordGT90Concept (Jan 7, 2019)

I know for a fact that AMD has to tweak their drivers to support some FreeSync monitors.  I do not know what that entails or why it is necessary when the GPU should be sending exactly what the monitor expects.

That reason is why NVIDIA gave that caveat: some monitors need extra TLC to work with VESA adaptive sync and if they don't have that TLC, your mileage may vary.  They aren't going to make a blanket statement that it works because they can't guarantee it will unless the monitor was certified by NVIDIA to work (same as AMD).

You'd think it would be a one-size-fits-all thing but it isn't.


----------



## oxidized (Jan 7, 2019)

Legacy-ZA said:


> Finally.
> 
> I was about to go and enable this in my nvidia driver, when I realized. I am still using my old 120Hz Samsung monitor before Freesync and Gsync was a thing. Ah the days when people fought back feebly with arguments like "you can't see more than 30fps" ^_^
> 
> ...



They'd be so sad to see you go 
/s


----------



## londiste (Jan 7, 2019)

FordGT90Concept said:


> I know for a fact that AMD has to tweak their drivers to support some FreeSync monitors.  I do not know what that entails or why it is necessary when the GPU should be sending exactly what the monitor expects.


Because monitor manufacturers are morons and configure their monitors wrong. At least this has been the case in some widely publicized instances; I believe including the Samsung monitors that were part of the AMD combo deal.


----------



## Robcostyle (Jan 7, 2019)

Damn u, nvidia - even the inferior MG278 got support, but the 279 didn't.. just as always


----------



## vahn3565 (Jan 7, 2019)

epiqpnwage said:


> You do know that he is talking about the physyx cards nvidia tried to sell sheep like you who don't know better right?



You do know that was Ageia and not Nvidia right? Nvidia bought the technology and integrated it into their GPUs, they didn't sell proprietary hardware for physx. FFS


----------



## oxidized (Jan 7, 2019)

Robcostyle said:


> Damn u, nvidia - even inferior mg278 got support, but 279 didn’t.. just as always



They'll probably keep adding models over time, there's no point in supporting something inferior and not something superior.


----------



## robal (Jan 7, 2019)

Very good news! Knowing it's coming from Nvidia, I expect some kind of catch.


----------



## wolf (Jan 7, 2019)

stimpy88 said:


> nGreedia may well be aware of just how low the average customer regards their shady shit.



I don't think those sentiments are an accurate reflection of the average consumer at all. But hey, I'm basing that off my own small sample size too. 

Basically of the gamers and tech heads I know and talk to, about 1 in 10 or so is one of these 'ngreedia' bandwagoners, the rest happily use either their products or AMD's without being so tilted about it.


----------



## Vayra86 (Jan 7, 2019)

wolf said:


> I don't think those sentiments are an accurate reflection of the average consumer at all. But hey, I'm basing that off my own small sample size too.
> 
> Basically of the gamers and tech heads I know and talk to, about 1 in 10 or so is one of these 'ngreedia' bandwagoners, the rest happily use either their products or AMD's without being so tilted about it.



The salt in this topic is absolutely stunning, isn't it? All consumers now officially lost their vendor lock-in for both FreeSync and G-Sync, and people complain.


----------



## jabbadap (Jan 7, 2019)

Do they support HDMI too, or is this only for DP?


----------



## Old Ladies (Jan 7, 2019)

This comment section is toxic.

This is good news for everyone. I currently have a 1080 Ti and the original ROG swift at 2560x1440 144hz gsync. It has served me well and has been easily the best monitor I have ever owned.

The main reason for going with G-Sync is that G-Sync monitors ALL support VRR from 1 Hz up to their maximum refresh rate. This is why most FreeSync monitors won't be G-Sync Compatible: most were shitty monitors with a small VRR range, so if your frames dropped, VRR turned off.

No matter what, if you are playing most of the latest games you will have frame dips, and this is why I loved G-Sync: it didn't matter, G-Sync was always on and I could tell.

Though I was thinking about switching to AMD in the future if they ever compete in the high end again, as the number of FreeSync monitors and their lower prices were appealing to me. I also don't like being locked in with only one company. This is great news for Nvidia users and will probably hurt AMD if Nvidia ever comes back to reality with their GPU prices.


----------



## Sasqui (Jan 7, 2019)

Didn't see this coming.  And no, they are not doing it out of the kindness of their black hearts; they must be losing sales by refusing to support FreeSync on their GPUs.

That said, to buy a G-Sync monitor, you're still going to have to pay the premium for the proprietary hardware and technology.  It'll be just a little more attractive now.


----------



## GoldenX (Jan 7, 2019)

Finally some good news from Nvidia.
This is great.


----------



## Ruyki (Jan 7, 2019)

This move makes sense for nVidia. G-Sync monitors are around 100-200 USD more expensive than an otherwise identical adaptive sync/FreeSync model, making them a hard sell. Even many nVidia users probably purchased an adaptive sync/FreeSync monitor. Such users may consider an AMD GPU when it's time to upgrade, which is something nVidia should avoid.


----------



## medi01 (Jan 7, 2019)

SIGSEGV said:


> so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?


Nope.

It's "so, FreeSync monitors can be used with AMD, nVidia and later Intel, while G-Sync monitors are overpriced and only work with nGreedia?"



Nxodus said:


> Or, maybe, if we don't view this from an "Nvidia hating" angle, monitor manufacturers might have thrown a hissy fit about the eternal struggle to pick between two standards


BS.
G-Sync not only "was not free to use", unlike the FreeSync-based standard, it was also "not for anyone to license" because, you know, the "investments" of nVidia.
As VRR-ish tech was right there in notebooks for years already, and as a VESA standard, no less, the main point of the mentioned "investment" was to grab market share using proprietary tech AND charge a hefty premium for Huang-branded chips.

AMD's version, on the other hand, was so cool that most scaler chips on the market included it out of the box.

And even today, with greedia's dominance in the GPU market, they have failed so miserably.

Two standards talk would apply if GSync was available for others to use.


----------



## OneMoar (Jan 7, 2019)

good lord, the AMD fanboy is strong here

once again, if you wanna blame somebody for the state of the GPU market, blame AMD
it's their fault for once again not being competitive

nvidia gets free rein because they have no competition; you want that to change, bitch at amd


----------



## kings (Jan 7, 2019)

Every thread with Nvidia in the title has turned into Nvidia bashing nowadays, even when it´s good news for consumers!

It´s pretty sad really; it seems no one cares about games, hardware and tech anymore, the only important thing is bashing Nvidia in every topic.

It feels like some people need to bash Nvidia every day to feel some kind of joy in their lives.


----------



## INSTG8R (Jan 7, 2019)

OneMoar said:


> good lord the AMD fanboy is strong here
> 
> once again if you wanna blame somebody for state of the GPU market blame AMD
> its there fault  for once again not being competitive
> ...


Sorry bud, this is about monitors, not GPUs. A 460 can use FreeSync just as well as a Vega, and a 1050 can use G-Sync just as well as a 2080 Ti. Swing and a miss.


----------



## Gasaraki (Jan 7, 2019)

Candor said:


> 400 tested and 12 passed?
> 
> I'm not sure what to think about this.



Most freesync monitors are shit?


----------



## GoldenX (Jan 7, 2019)

Most monitors are shit, period.


----------



## junglist724 (Jan 7, 2019)

epiqpnwage said:


> You do know that he is talking about the physyx cards nvidia tried to sell sheep like you who don't know better right?



Nvidia never made PhysX cards. They killed Ageia's hardware division the moment they acquired the company.


----------



## HD64G (Jan 7, 2019)

So, customers won this time, as most didn't pay the nvidia tax called G-Sync. RTX sales are lower than they expected, so most likely they will lower prices. And once new GPUs from AMD are on sale, even better for us, at least if they are as competitive in the value-for-money factor as they used to be (Vega was a bad product mainly due to HBM2 production and price problems, whereas the Vega 56 is still a *great* GPU when found close to $350). And people, remember that FreeSync is a free tech made by AMD. So, give some credit where it belongs...


----------



## jabbadap (Jan 7, 2019)

Gasaraki said:


> Most freesync monitors are shit?



Well, for compatibility the VRR range ratio must be at least 2.44:1, which rules out every 30-60Hz/40-75Hz/40-90Hz VRR FreeSync monitor out there.
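For illustration, that range check is a one-liner. The 2.44:1 figure is the one quoted in this thread, not an official NVIDIA specification, and the ranges below are just the common examples mentioned.

```python
# Sketch: check which FreeSync VRR ranges clear the (reported) 2.44:1
# ratio said to be required for "G-SYNC Compatible" validation.
REQUIRED_RATIO = 2.44  # figure quoted in this thread, not an official spec

def passes_validation(min_hz, max_hz):
    """True if the VRR range is wide enough (max/min >= 2.44)."""
    return max_hz / min_hz >= REQUIRED_RATIO

for lo, hi in [(30, 60), (40, 75), (40, 90), (48, 144), (30, 144)]:
    verdict = "pass" if passes_validation(lo, hi) else "fail"
    print(f"{lo}-{hi} Hz: ratio {hi / lo:.2f} -> {verdict}")
```

Running it shows 30-60 (2.00), 40-75 (1.88) and 40-90 (2.25) all failing, while wide ranges like 48-144 (3.00) clear the bar.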


----------



## deu (Jan 7, 2019)

To be honest, I suspect the issues and cost of implementing G-Sync in 4K 144Hz displays may have a hand in this. Adding 100-150 dollars is one thing, but adding 500$ to a screen that is supposed to dip to... 500 dollars at some point is not a viable price: https://www.techpowerup.com/245463/nvidia-g-sync-hdr-module-adds-usd-500-to-monitor-pricing.


----------



## EarthDog (Jan 7, 2019)

SIGSEGV said:


> so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?
> 
> (0_0)
> 
> It's like M$ vs Linux. This is not good at all.


Happysync..... interesting....

....or are you using that word in a discriminatory and derogatory manner?

How is this ok TPU?


----------



## Robotics (Jan 7, 2019)

Nvidia stays behind AMD in this race. Look at this chart; it is weird to support only some monitors.


----------



## moproblems99 (Jan 7, 2019)

Well, this will certainly make me look at NV 20/30 series when I replace my Vega.


----------



## GoldenX (Jan 7, 2019)

I still have an analog display, but now I'm sure the next one is going to be freesync, hopefully with a fair priced Nvidia card by then.


----------



## moproblems99 (Jan 7, 2019)

OneMoar said:


> once again if you wanna blame somebody for state of the GPU market blame AMD
> its there fault for once again not being competitive



Actually, it is consumers' fault.  AMD has had better products multiple times and the sales figures were still in NV's favor.  In my opinion, this led AMD to where they are now.  Focusing on CPUs and GPU segments where the money is.  Unfortunately, those GPU segments don't always align with the most vocal and complain-y.


----------



## medi01 (Jan 7, 2019)

OneMoar said:


> nvidia gets free rein because they have no competition you want that to change



I don't want to change that.
I want to watch green brains bending backwards further and further.
I want to see them pay $250+ for gsync label and wonder WTF later, more of that please.
We already got to perf/$ not going up with next gens (thank you, Huang), but want to see perf/$ drop as we "progress".

Heck, between AMD trouncing market in Q3 with cheap 7nm cards and Huang pwning greenboi more, while I'd choose the former, the latter is not far behind.


----------



## GoldenX (Jan 7, 2019)

moproblems99 said:


> Actually, it is consumers' fault.  AMD has had better products multiple times and the sales figures were still in NV's favor.  In my opinion, this led AMD to where they are now.  Focusing on CPUs and GPU segments where the money is.  Unfortunately, those GPU segments don't always align with the most vocal and complain-y.


True.
I would welcome some more money on driver development (not on gaming functions or GUI, on proper driver development). AMD has the worst OpenGL driver, even Intel stomps them there.


----------



## xkm1948 (Jan 7, 2019)

Vayra86 said:


> The salt in this topic is absolutely stunning isn't it. All consumers now officially lost their vendor-lock in for both FreeSync and Gsync and people complain.



It just shows how toxic some of the fans here are. They don't have two working brain neurons to rub together, and all they care about is brand loyalty to the death.



kings said:


> Every thread with Nvidia on the title, has become into a Nvidia bashing nowadays, even if it´s good news for consumers!
> 
> It´s pretty sad really, seems no one cares about games, hardware and tech anymore, the only important thing is bashing Nvidia in every topic.
> 
> It feels that some people need to bash Nvidia everyday, to feel some kind of joy in their lives.




Yep. Too sad or angry IRL, so they have to invest their emotions purely in hating something. What a pathetic life.

To know who I am referring to, just look at who has been downvoting all the pro-consumer comments. It is these guys who don't wanna see progress, even when it benefits average consumers. Truly pathetic and disgusting.


----------



## medi01 (Jan 7, 2019)

Vayra86 said:


> vendor-lock in for both FreeSync


"FreeSync vendor-lock", brought to you by creators of "$350 for low range GPU is fine".


----------



## moproblems99 (Jan 7, 2019)

Now, if only the monitor I wanted wasn't $1200....


----------



## londiste (Jan 7, 2019)

medi01 said:


> "FreeSync vendor-lock", brought to you by creators of "$350 for low range GPU is fine".


You are referring to the newly released RTX 2060, right? OK. It is a TU106-based product, which should normally be a mid-range (and a cut-down mid-range) product. However, you might want to scour some recent discussion and recommendation threads; you will find a lot of happiness and recommendations about awesome deals for Vega 56/64 around the $350 price range. This is the *highest* performance the competition offers at this point. You were talking about low range GPUs? Come on...



jabbadap said:


> Well for compatibility VRR range ratio must be 2.44:1, which rules out every 30-60Hz/40-75Hz/40-90Hz VRR freesync monitors out there.


Wrote a little post about that a few pages ago:
https://www.techpowerup.com/forums/...ve-sync-technology.251237/page-2#post-3971286
tl;dr - out of 568 monitors that are on AMD's Freesync monitors list today this will leave 201 as potential candidates.

I honestly feel this type of testing and validation is something AMD should have done a year or two ago. The FreeSync 2 announcement was a good opportunity for this, but they just didn't.


----------



## Vayra86 (Jan 7, 2019)

medi01 said:


> "FreeSync vendor-lock", brought to you by creators of "$350 for low range GPU is fine".



How are these related? Once again I struggle to find the reason to your madness...

Both AMD and Nvidia ran with their own implementation of variable refresh... it's really that simple. The only thing missing is AMD support for G-Sync monitors. This is a pretty smart move on Nvidia's part, and for consumers it is only win-win: AMD users are likely to get access to more, higher quality (broader refresh ranges) variable refresh monitors, and Nvidia users can access the technology for free like everyone else.



moproblems99 said:


> Now, if only the monitor I wanted wasn't $1200....



Less money moproblems?


----------



## londiste (Jan 7, 2019)

Vayra86 said:


> for consumers it is only win-win: AMD users are likely to get access to more, higher quality (broader refresh ranges) variable refresh monitors


Oh, you are actually very much right. Whether you are looking to find a FreeSync monitor for AMD or Nvidia cards, G-Sync Compatible will serve well as an indication of a reasonably good VRR monitor.


----------



## Fluffmeister (Jan 7, 2019)

londiste said:


> Oh, you are actually very much right. Whether you are looking to find a FreeSync monitor for AMD or Nvidia cards G-Sync Compatible will serve well as an indication of reasonably good VRR monitor



Hehe yeah just thinking that, the good FreeSync monitors should be G-Sync Certified.

I'm sure there will just be a list somewhere which shouldn't hurt too many feelings.


----------



## Semel (Jan 7, 2019)

It's actually a pretty good move.

1) Huang was milking GTX users for years with all this proprietary gsync scam => Profit.

2) Huang announces that Pascal and Turing can now work with FreeSync => more gamers who don't want to deal with G-Sync will buy Nvidia GPUs instead of getting AMD's GPUs.

It's a preventive strike against whatever AMD is about to announce at CES 2019 (new GPUs or whatever)

Does it make Nvidia look kinda bad? Yeah, it does. However, the long-term financial benefits of this announcement outweigh all this.

PS: Fanatics will just keep blindly praising their godfather figure, of course, ignoring the fact that Huang was screwing them over all these years just to milk more money from them, heh


----------



## phanbuey (Jan 7, 2019)

Semel said:


> It's actually a pretty good move.
> 
> 1) Huang was milking GTX users for years with all this proprietary gsync scam => Profit.
> 
> ...



At $1200+ a card, I'm not sure how many fanatics he really has left.  Once AMD comes up with something, the exodus will commence.


----------



## moproblems99 (Jan 7, 2019)

Vayra86 said:


> Less money moproblems?



It's not about the money as much as it is the principle.  I just can't spend that for a monitor...much like a gpu...absurd prices.



phanbuey said:


> At $1200+ a card Im not sure how many fanatics he really has left.  Once AMD comes up with something the exodus will commence.



Don't count on that.


----------



## R-T-B (Jan 7, 2019)

Vayra86 said:


> Wasn't that blatantly obvious from the get-go? Gsync was always a ripoff.



Agreed.  If only this had come a few weeks sooner.  My old monitor died and I was forced to source a rapid replacement during the Christmas shopping season.  Went with G-Sync for the no-tear; now I kinda regret it.

Oh well, the monitor still works, right? And she works well.  So pretty...



Semel said:


> Does it make Nvidia look kinda bad? Yeah, it does.



lol in what universe does this look bad?


----------



## moproblems99 (Jan 7, 2019)

R-T-B said:


> Agreed. If only this had come a few weeks sooner. My old monitor died and I was forced to source a rapid replacement during christmas. Went with gsync for the notear, now kinda regret it.
> 
> Oh well, monitor still works correct? And she works well. So pretty...



Outside of the 30 day window?


----------



## R-T-B (Jan 7, 2019)

EarthDog said:


> How is this ok TPU?



Sadly, they totally let the dogs loose there, man.  There is no enforcement of that.



moproblems99 said:


> Outside of the 30 day window?



Yeah, sadly.  It was early November.


----------



## RichF (Jan 7, 2019)

SIGSEGV said:

> so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?





SIGSEGV said:


> enlighten me pls ;-)


It has nothing to do with trolling gay people, their friends, their families, and/or anyone who has benefited from their work.

Google returns 1,450,000 results for G-Sync so you can enlighten yourself about that particular topic. Here's the Wikipedia entry:

https://en.wikipedia.org/wiki/Nvidia_G-Sync


----------



## Fluffmeister (Jan 7, 2019)

Cunning move by Nvidia, launch G-SYNC, AMD are like OMG nice idea.... eh how about "FreeSync" and let the monitor manufacturers do all the work. AMD fans rejoice, market expands, Nvidia drop the mic and steal their thunder.

Superb.


----------



## RichF (Jan 7, 2019)

moproblems99 said:


> Actually, it is consumers' fault.


This is reductionism taken too far. Consumers don't exist and operate in a bubble. There are a lot of things to blame for the situation we're in:

1) Lack of antitrust enforcement. Duopolies and monopolies are common in tech.

2) The corporation. Corporations are not designed to benefit humanity. They're, as Ambrose Bierce said (quoted in Civilization), an ingenious device for obtaining individual profit without individual responsibility. Put more simply, they're about wealth consolidation. Wealth consolidation means giving a smaller number of people more of the resources pie so they can have more privileged lives. The sales pitch for this kind of social planning is that their privilege trickles down.

3) Marketing. Corporate/political marketing is designed to confuse people with emotion to get them to part with more of their money than they should. Money is essentially a person's life, the currency of a person's time, energy, and ability. 

4) Tribalism indoctrination. People are generally trained to think tribally, in an us vs. them dichotomy (like football/soccer and the "two-party" system). This makes it easy to substitute duopoly, for example, in lieu of having an actually competitive marketplace.

I found it droll to see the claim that we have a really crowded and competitive GPU market in the same article that argued that Vega is so overpriced that it's not competitive enough to be recommended. link



Dave James said:

> Without the price drop the Vega cards are prohibitively expensive, especially compared with the $350 (£329) RTX 2060.
> 
> As it is, it looks like Sapphire was just trying to make the highest-performing AMD gaming cards as relevant as possible in light of the latest Nvidia release.
> 
> Unfortunately Sapphire is no longer looking to give its RX Vega cards any help in the crowded, competitive graphics card market.


Claim A = Without a price drop Vega is not competitive, meaning the only "competition" in the market involves Nvidia with itself. 
Claim B = We have a crowded/competitive graphics card market.

While it's possible to "yeah, but" my point with the practice of releasing lots of barely different 3rd-party cards, I don't consider the market nearly competitive enough. Duopolies aren't good enough and the argument he made is that AMD isn't even competing at the current pricing, therefore we're talking about monopoly which is even worse.


----------



## xkm1948 (Jan 7, 2019)

Fluffmeister said:


> Cunning move by Nvidia, launch G-SYNC, AMD are like OMG nice idea.... eh how about "FreeSync" and let the monitor manufacturers do all the work. AMD fans rejoice, market expands, Nvidia drop the mic and steal their thunder.
> 
> Superb.



Annnd thumbs down coming your way for stating the truth, from angry red men in 3, 2, 1


----------



## moproblems99 (Jan 7, 2019)

RichF said:


> This is reductionism taken too far. Consumers don't exist and operate in a bubble. There are a lot of things to blame for the situation we're in:




So, how would you propose to fix the duopolies we have in GPUs and CPUs?
What does this have to do with anything?
Whose fault is it when they pull out a credit card after being duped by a marketing team?


----------



## Fluffmeister (Jan 7, 2019)

xkm1948 said:


> Annnd thumbs down coming your way for stating the truth  from angry red men in 3,2,1



It's OK my friend, Nvidia aren't moving the market forward apparently, yet team red happily waited 15 months for Vega... to get the same performance as a GTX 1080, albeit with worse power consumption of course.

I love them really.


----------



## moproblems99 (Jan 7, 2019)

xkm1948 said:


> angry red men



You should be ashamed of yourself for being racist towards Native Americans...


----------



## RichF (Jan 7, 2019)

moproblems99 said:


> So, how would you propose to fix the duopolies we have in GPUs and CPUs?


The first step is to recognize/understand the situation. For the most part, we're not at that point yet as a tech community. People complain about the symptoms but don't put much, if any, effort into going beyond those to the systemic causes.

Large-scale changes don't happen without large-scale citizen involvement so it's important for people to communicate with one another about the situation we're in and develop solutions. Nothing will improve when consumers abandon their role in the battle between consumer need and corporate desire. Consumer need is to get value for one's life/money. Corporate desire is to give them as little as possible in exchange for it.

As for the other two questions, the second is a troll and the third was already answered.


----------



## Totally (Jan 7, 2019)

OneMoar said:


> HOLY SHIT I know it winter but did hell freeze over ?



The skeptic in me says that in the future, when things are once again going their way, they'll discover an issue/bug and the "fix" they'll implement will effectively revert this, either outright or from that point on.


----------



## moproblems99 (Jan 7, 2019)

RichF said:


> The first step is to recognize/understand the situation. For the most part, we're not at that point yet as a tech community. People complain about the symptoms but don't put much, if any, effort into going beyond those to the systemic causes.
> 
> Large-scale changes don't happen without large-scale citizen involvement so it's important for people to communicate with one another about the situation we're in and develop solutions. Nothing will improve when consumers abandon their role in the battle between consumer need and corporate desire. Consumer need is to get value for one's life/money. Corporate desire is to give them as little as possible in exchange for it.
> 
> As for the other two questions, the second is a troll and the third was already answered.



You must be in politics because there are no answers in there.  It is so easy to solve the problems with GPUs and CPUs that the answer borders on silly.  If it doesn't fit your budget, or your value...don't buy it.  If the products don't sell...prices will inevitably go down.  High-end  PCs are nothing more than a luxury.  However, if the products sell....the prices aren't too high.

For things like high-end PCs, the consumer has complete control.  Much like government, consumers have the PC industry they deserve.


----------



## RichF (Jan 7, 2019)

moproblems99 said:


> You must be in politics because there are no answers in there.


This is an _ad hominem_ as well as avoidance of the substance of my posts. I have a policy of ignoring people who start response posts with ad homs.


----------



## moproblems99 (Jan 7, 2019)

RichF said:


> This is an _ad hominem_ as well as avoidance of the substance of my posts. I have a policy of ignoring people who start response posts with ad homs.



Well, I guess you can add me to that list, because this is the second post without any answers.


----------



## RichF (Jan 7, 2019)

Avoiding the substance of a person's arguments and information with the "answers demand" is a classic rhetorical trick.

(It's a bit better than raw trolling with statements like "What does this have to do with anything?" but not much.)

The claim is that a person can't legitimately expose and discuss problems without then taking another step — providing solutions to those problems. It means journalists are all corrupt/useless unless everything they write is an editorial.

It means teachers can't teach students about any problems without solving them for them. If they don't solve them then it's proof that they don't know anything about the problems.

This rhetorical strategy is typically adopted as a way to avoid the issues. It's generally seen as easier to attack suggested solutions to problems than to discuss the problems with any depth. It's a way of steering the discussion away from facts (analysis of existing problems) toward opinion (suggested solutions, which are necessarily more speculative), as opinion is more difficult to prove. It makes it easier to defend one's "point" while offering little substance to back it up.

The truth is that a person can, and should, discuss reality without being expected to solve all of its problems as well. Those are two separate things. The discussion is valuable because it can give others tools to help them to solve their problems. Also, there isn't a huge separation between the two things. One can't solve anything unless one understands the reality. It's a process and it can be done as a team.

Teamwork doesn't happen when others respond with ad homs and trolls like "What does that have to do with anything?"


----------



## moproblems99 (Jan 8, 2019)

The problem is that the 'substance' of your posts has nothing to do with the questions, so I'm not sure what you are looking for.


----------



## FordGT90Concept (Jan 8, 2019)

Fluffmeister said:


> Cunning move by Nvidia, launch G-SYNC, AMD are like OMG nice idea.... eh how about "FreeSync" and let the monitor manufacturers do all the work. AMD fans rejoice, market expands, Nvidia drop the mic and steal their thunder.
> 
> Superb.


The original idea was in the VESA embedded DisplayPort (eDP) standard as a power saving feature.  NVIDIA basically made an external eDP chip that functions over DP.  AMD looked at both, saw NVIDIA was doing it expensively when it should be done cheaply (like eDP), made a proof of concept (first DisplayPort, then HDMI) and got it ratified by VESA and the HDMI Forum.  Now NVIDIA's expensive approach has stabbed them in the back because they lost exclusivity and didn't hold any patents to stop AMD/VESA/HDMI Forum, so NVIDIA kept the charade going as long as they could.

The only "cunning" thing NVIDIA ever did in regards to adaptive sync was expanding the idea of eDP to external DP panels.  After that, it was full of stupid.  Namely, it follows NVIDIA's line of thinking where step "b" is always "profits," not "accessibility."  As pointed out many times in this thread, NVIDIA's desire for PhysX profits sidelined the technology's intent (hardware accelerated physics simulations).  NVIDIA could have beat AMD to the punch by finding a driver solution to the problem but there are no profits in drivers.


----------



## xkm1948 (Jan 8, 2019)

moproblems99 said:


> You should be ashamed of yourself for being racist towards Native Americans...



You should be ashamed of yourself for making such associations


----------



## moproblems99 (Jan 8, 2019)

FordGT90Concept said:


> Namely it follows NVIDIA's line of thinking where step "b" is always "profits," not "accessibility."



I always like to take a jab at NV when I can but I can't for the above.  Either fools are parted with their money or the price ain't too damn high...


----------



## terroralpha (Jan 8, 2019)

epiqpnwage said:


> You do know that he is talking about the physyx cards nvidia tried to sell sheep like you who don't know better right?



nvidia never sold physx cards. physx sold physx cards until nvidia bought them out.


----------



## jigar2speed (Jan 8, 2019)

Fatalfury said:


> So its all over for AMD...
> The only advantage AMD had ..........aaaandd  its ..........Gone!!!


Stop having fatalfury with your brain. (Why do i feel i have typed this before to your other comment as well)



terroralpha said:


> nvidia never sold physx cards. physx sold physx cards until nvidia bought them out.


Actually, they did sell a standalone PhysX card for some time, then introduced PhysX in their GPUs.


----------



## Mussels (Jan 8, 2019)

Already beaten to it by the edit to the post, but it sounds like they're doing a few tiers of G-Sync/FreeSync:


1. Entry level VESA standard
2. Gsync compatible (Works, but no official branding/support)
3. Officially supported/advertised (the current type)
4. New ultra super premium variant with HDR support


I am so cucking excited by this, although I still can't understand how FreeSync/G-Sync can look better than plain old high refresh in the first place.


----------



## FordGT90Concept (Jan 8, 2019)

FTFY:


Mussels said:


> 1. VESA adaptive sync implementation (it knows how to signal FreeSync monitors but compatibility is not guaranteed)
> 2. GSYNC Compatible (same as #1 but compatibility is tested/guaranteed)
> 3. GSYNC (all GSYNC module equipped monitors)
> 4. GSYNC Ultimate (equivalent to FreeSync 2, I think GSYNC module is required)


The question is, can GSYNC module equipped monitors handle an adaptive sync signal from an AMD card? Especially GSYNC Ultimate?


----------



## medi01 (Jan 8, 2019)

Vayra86 said:


> Both AMD and Nvidia ran with their own implementation of variable refresh... its really that simple.


Let me state something very apparent, but since you seem to have reading comprehension problems, let me highlight it a bit:  *Only nVidia played vendor lock in game. AMD did not.*



Vayra86 said:


> I fail to see how bending over backwards is related to bending over backwards.


Yep.


----------



## Vayra86 (Jan 8, 2019)

RichF said:


> Avoiding the substance of a person's arguments and information with the "answers demand" is a classic rhetorical trick.
> 
> (It's a bit better than raw trolling with statements like "What does this have to do with anything?" but not much.)
> 
> ...



So, please, dear god... *talk about substance then*. What solutions do you propose? @moproblems99 proposed one, and I think it's the only realistic one: don't buy. Wait. But even that is a questionable tactic because it takes an awful lot of people to get in line.

The idea that consumers have some direct form of control on a free market is slowly but surely dying off. And not just the idea - companies/corporations are getting way too powerful. You can blame the internet for that. Look at Blizzard - one of the finest PC game devs just told its entire, most loyal community they'd start focusing on mobile games. They even laughed about it on stage - don't you have a phone? What they actually said was: screw you, hardcore fanbase, we're going to dumb down everything and you can take it or leave it. And no, we have nothing for you, target audience. You can use your smartphone or you can sod off.

You made a good comment about the tech community and about providing tools and information to others to create a movement. That is the 'how'... what I am missing though is the 'what'. 'What' are we really going to do? You can look at Turing, you can search my post history, and you can see what happens in our 'tech community' when solid arguments are presented to _not buy into it._ The same guy that liked your post (@xkm1948, talking about you... and you're shitposting here again while adding nothing substantial to any discussion) is the guy that insta-replies to those arguments with 'haters gonna hate'. Think about that one for a bit.


----------



## FordGT90Concept (Jan 8, 2019)

medi01 said:


> Let me state something very apparent, but since you seem to have reading comprehension problems, let me highlight it a bit:  *Only nVidia played vendor lock in game. AMD did not.*


Yup, this NVIDIA announcement is only possible because AMD's ecosystem created all of these FreeSync monitors that don't care what GPU they are connected to.  NVIDIA is jumping on AMD's bandwagon, not the other way around.  AMD ~= VESA in this regard.  Soon (I hope) Intel will launch their own adaptive sync implementation.


----------



## Vayra86 (Jan 8, 2019)

medi01 said:


> Let me state something very apparent, but since you seem to have reading comprehension problems, let me highlight it a bit:  *Only nVidia played vendor lock in game. AMD did not.*
> 
> 
> Yep.



I reported your post. You are quoting me with different text, which is unacceptable. It's fine if you're all emotional, just keep it civil and straight.



FordGT90Concept said:


> Yup, this NVIDIA announcement is only possibile because AMD's ecosystem created all of these FreeSync monitors that don't care what GPU they are connected to.  NVIDIA is jumping on AMD's bandwagon, not the other way around.  AMD ~= VESA in this regard.  Soon (I hope) Intel will launch their own adaptive sync implementation.



Absolutely, but in practice, both FreeSync and G-Sync resulted in a vendor lock-in for each camp's GPUs. There's no way around that... This isn't about pointing fingers; it's about reality for a consumer. And ironically, AMD users are still locked to their FreeSync option now, while Nvidia users are not.


----------



## moproblems99 (Jan 8, 2019)

Vayra86 said:


> The idea that consumers have some direct form of control on a free market is slowly but surely dying off.



You are correct in the sense that consumers' choices are dwindling, but the fact remains that (for now) consumers are in control of their wallets and truly shape luxury markets.



Vayra86 said:


> You can use your smartphone or you can sod off.



Hopefully, they choose the latter.  While it is tough, if they give in....it will only get worse.  Give them what they asked for, nothing.


----------



## FordGT90Concept (Jan 8, 2019)

Vayra86 said:


> Absolutely, but in practice, both FreeSync and Gsync resulted in a vendor lock-in for each camps' GPUs. There's no way around that... This isn't about pointing fingers; its about reality for a consumer. And ironically, AMD users are still locked to their FreeSync option now, while Nvidia users are not.


NVIDIA locked FreeSync out.  NVIDIA could have enabled AMD cards to drive GSYNC monitors and NVIDIA cards to drive FreeSync monitors years ago, but didn't.  AMD is 100% blameless here.  It is extremely likely NVIDIA will never allow AMD cards to drive GSYNC module-equipped monitors at adaptive refresh rates.  NVIDIA loves exclusivity (see GPP), consumers be damned.

AMD meant it when they called it *Free*Sync.


----------



## moproblems99 (Jan 8, 2019)

Now NV users will be able to drive UltraWides at 144Hz instead of the 120Hz cap that G-Sync provided.


----------



## Vayra86 (Jan 8, 2019)

FordGT90Concept said:


> NVIDIA locked FreeSync out.  NVIDIA could have enabled AMD cards to drive GSYNC monitors and NVIDIA cards to drive FreeSync monitors years ago, but didn't.  AMD is 100% blameless here.  It is extremely likely NVIDIA will never allow AMD cards to drive GSYNC module-equipped monitors at adaptive refresh rates.  NVIDIA loves exclusivity (see GPP), consumers be damned.
> 
> AMD meant it when they called it *Free*Sync.



No need to explain this to me, I know. You're completely missing my point and the toxic response from medi01 also underlines that. It happens a lot on this forum and should be something to reflect on... take off the tinted glasses. You're not getting paid for being pro-anybody.

What I said, while _explicitly noting it's not about pointing fingers_, was that Nvidia users are now in the ironic situation that they DO have full access to all variable refresh monitors while AMD users do not - and that even FreeSync was a lock-in, no matter AMD's intent with their approach. The harsh business reality is that AMD is now once again left with a less interesting proposition.

In terms of doing business you might have to wonder whether FreeSync was a _smart_ move. A good one, yes - I can only agree, and I applaud them putting the 'consumer first' relative to Nvidia's move. But not a smart one from a business perspective. You are a smart man, surely you can see the paradox here. This company struggles to make a profit, yet consistently drops the ball when it gets a chance to do so. G-Sync was paid; nothing stopped AMD from making their version a little bit less costly and still earning money on it. Money that could have gone to the R&D needed to actually keep playing in the GPU field, for example...

It's a pattern with AMD - good intentions with a touch of naiveté, and a lack of insight into how the market will respond and what the bottom line will be as a result. Good intentions don't make you rich, unfortunately. Most consumers aren't brand loyal at all, but susceptible to marketing and 'good deals'.


----------



## moproblems99 (Jan 8, 2019)

Vayra86 said:


> In terms of doing business you might have to wonder whether FreeSync was a _smart_ move.



Ultimately it was, especially in the short term.  However, Nvidia was again able to maneuver themselves into a good spot - although I believe it was purely by accident this time.


----------



## FordGT90Concept (Jan 8, 2019)

Vayra86 said:


> What I said was, even while _explicitly saying its not about pointing fingers_ that Nvidia users are now in the ironical situation that they DO have full access to all variable refresh monitors while AMD users do not...


Pray tell me what AMD user cares about GSYNC?  There's far more people running NVIDIA cards and FreeSync monitors (without using adaptive sync) than AMD cards and GSYNC monitors.  Why? Cost.  You would have had an argument here if GSYNC monitors were roughly the same cost as FreeSync monitors but they aren't by design.



Vayra86 said:


> ...and that even FreeSync was a lock-in no matter AMD's intent with their approach.


There was no "lock-in" ever.  NVIDIA is proving that now.  FreeSync is available to all with the only barrier being implementation.  NVIDIA chose to *lock out* FreeSync on NVIDIA cards.



Vayra86 said:


> In terms of doing business you might have to wonder whether FreeSync was a _smart_ move.


Yes, it was.  Fixed refresh rates have been the norm for decades.  The only way to change the status quo was to eliminate barriers (like GSYNC module) to adaptive refresh rates which, technical problems aside, is a superior solution to both GPU and monitor design.  From the business perspective, the program tightened the relationship between AMD and monitor/TV manufacturers.  This is why the market is flooding with FreeSync-branded displays and NVIDIA tapped out.



moproblems99 said:


> Ultimately it was, especially in the short term.  However, Nvidia was again able to maneuver themselves into a good spot - although I believe it was purely by accident this time.


We don't know how rough NVIDIA will have it (how many did they test to only get 12 working?).  AMD had a lot of growing pains with FreeSync that NVIDIA is now taking on.


----------



## Vayra86 (Jan 8, 2019)

FordGT90Concept said:


> There was no "lock-in" ever.



Is the plank really that thick? Or do you just want to say no today?

https://en.wikipedia.org/wiki/Vendor_lock-in

In economics, *vendor lock-in*, also known as *proprietary lock-in* or *customer lock-in*, makes a customer dependent on a vendor for products and services, unable to use another vendor without substantial switching costs. 

Let's see: AMD GPU needed FreeSync monitor. FreeSync monitor needed AMD GPU to use the tech... Nvidia GPU needed Gsync monitor. In both cases, switching brands while maintaining variable refresh would have incurred a 'substantial switching cost' - a cost higher than the cost of just a new GPU, or the cost of just a new monitor.

I mean... it's not thát complicated, is it?



FordGT90Concept said:


> Pray tell me what AMD user cares about GSYNC?  There's far more people running NVIDIA cards and FreeSync monitors (without using adaptive sync) than AMD cards and GSYNC monitors.  Why? Cost.  You would have had an argument here if GSYNC monitors were roughly the same cost as FreeSync monitors but they aren't by design.



What? FreeSync is not even a nice to have for an Nvidia card user - up until today. They simply didn't care - or they paid for Gsync.


----------



## medi01 (Jan 8, 2019)

Vayra86 said:


> FreeSync was a lock-in...



Stating that kind of nonsense in THIS VERY THREAD requires skills only certain #teamgreen folks possess.


----------



## FordGT90Concept (Jan 8, 2019)

Vayra86 said:


> AMD GPU needed FreeSync monitor.


If you want to use VESA adaptive sync, yes; otherwise, no.



Vayra86 said:


> FreeSync monitor needed AMD GPU to use the tech...


False.  AMD just happened to be the only implementation of VESA adaptive sync which is no longer the case.



Vayra86 said:


> Nvidia GPU needed Gsync monitor.


If you want to use GSYNC, yes; otherwise, no.



Vayra86 said:


> In both cases, switching brands while maintaining variable refresh would have incurred a 'substantial switching cost' - a cost higher than the cost of just a new GPU, or the cost of just a new monitor.


The cost of GPU and monitor is implied (these items are going to cost you a significant amount regardless of who you buy it from).  The "switching cost" is exclusively the GSYNC module which represents the NVIDIA lock-in.  If you have a GSYNC monitor, you're not very likely to consider an AMD card to replace a dead NVIDIA card because of how much you spent on that GSYNC module-equipped monitor.  FreeSync monitor on NVIDIA card?  You really didn't pay a premium for the monitor.  Losing adaptive sync sucks but many people do it, especially lately, because Vega 64 can't keep up to cards like the GTX 1080 Ti.  So you buy a faster video card, pushing into framerates where adaptive sync matters less.  AMD is doing nothing to compel you not to buy an NVIDIA card.



Vayra86 said:


> What? FreeSync is not even a nice to have for an Nvidia card user - up until today. They simply didn't care - or they paid for Gsync.


How many people bought Vega + FreeSync because of the GSYNC fee?  How many people bought GeForce + GSYNC because of the non-existent FreeSync fee?


----------



## Vayra86 (Jan 8, 2019)

medi01 said:


> Stating that kind of nonsense in THIS VERY THREAD requires skills only certain #teamgreen folks possess.



The definition of vendor lock-in disagrees with that. Try reading for a change.


----------



## moproblems99 (Jan 8, 2019)

FordGT90Concept said:


> We don't know how rough NVIDIA will have it (how many did they test to only get 12 working?). AMD had a lot of growing pains with FreeSync that NVIDIA is now taking on.



From where I sit, they needed some good publicity and they got it.  It can't be viewed as anything but a good thing for them to open up Freesync capabilities.  There may be growing pains but looking at recent history and the rabid fanbase, I don't think it matters.


----------



## Slizzo (Jan 8, 2019)

moproblems99 said:


> Now NV users will be able to drive UltraWides at 144Hz instead of the 120Hz cap that G-Sync provided.



Well, there's the 2nd gen G-Sync module out there that has yet to be put into an ultrawide. You know, the one that needs a fan but is in those 144Hz 4K displays? That's why the LG 950G is handicapped to 120Hz while the LG 950F runs at its native 144Hz.

I can see why they rolled out without the 2nd gen module, it's more expensive and there probably aren't a lot on hand. But boy is it nice for the consumers that they can now buy the 144Hz 950F anyway if they have an NVIDIA card.


----------



## Rahnak (Jan 8, 2019)

FordGT90Concept said:


> False.  AMD just happened to be the only implementation of VESA adaptive sync which is no longer the case.



Uh.. That's not how it works. If AMD is the only one implementing it, then you're locked to AMD.

I have a FreeSync monitor and AMD GPU. Before this announcement, if I wanted to upgrade my GPU and keep using FreeSync I had to buy another AMD GPU thus locking me in to AMD products only.


----------



## Vayra86 (Jan 8, 2019)

FordGT90Concept said:


> If you want to use VESA adaptive sync, yes; otherwise, no.
> 
> 
> False.  AMD just happened to be the only implementation of VESA adaptive sync which is no longer the case.
> ...



So what happens as an Nvidia user - would you buy an AMD card to enjoy on your G-Sync monitor when you upgrade? Or would you rather buy another Nvidia GPU? That is a lock-in.

And with an AMD Freesync setup, would you have bought an Nvidia card? Of course not, but now you can and lose nothing in the process.

Get it?


----------



## FordGT90Concept (Jan 8, 2019)

Vayra86 said:


> And with an AMD Freesync setup, would you have bought an Nvidia card? Of course not, but now you can and lose nothing in the process.


Quite common.  A friend of mine did exactly that.  He has a GTX 1070 and I recommended to him a Nixeus FreeSync capable 27" 1440p gaming monitor.  He bought it and used it at fixed 144 Hz.  Why? Because FreeSync capabilities cost nothing extra and the monitor otherwise is an excellent gaming panel.  So now he might get the added benefit of being able to use FreeSync (Nixeus already said they're looking into it) with his NVIDIA card.  He was not locked in to anything because neither the cost component nor the exclusivity component of FreeSync was ever there.  He got exactly what he paid for originally and this announcement is just icing on the cake.

Prior to this announcement, the only way to get adaptive sync on NVIDIA hardware was by paying the GSYNC fee.  Look at it this way (assuming NVIDIA card):
1) pay GSYNC fee to get adaptive sync support.
2) pay for a fixed sync monitor without adaptive sync support.
3) pay roughly the same as #2 for an adaptive sync monitor but no GSYNC support.

#1 is vendor lock-in
#3 makes the most financial sense

AMD cards only ever had #2 and #3 as options.


----------



## Vayra86 (Jan 8, 2019)

FordGT90Concept said:


> Quite common.  A friend of mine did exactly that.  He has a GTX 1070 and I recommended to him a Nixeus FreeSync capable 27" 1440p gaming monitor.  He bought it and used it at fixed 144 Hz.  Why? because FreeSync capabilities cost nothing extra and the monitor otherwise is an excellent gaming panel.  So now he might get the added benefit of being able to use FreeSync (Nixeus already said they're looking into it) with his NVIDIA card.  He was not locked in to anything because that cost component nor exclusivity component of FreeSync was ever there.  He got exactly what he paid for originally and this announcement is just icing on the cake.
> 
> Prior to this announcement, the only way to get adaptive sync on NVIDIA hardware was by paying the GSYNC fee.  Look at it this way (assuming NVIDIA card):
> 1) pay GSYNC fee to get adaptive sync support.
> ...



You turned the example around. Monitors tend to last longer than GPUs, so when you upgrade a GPU why would you not get the better combo? 

Nonetheless I get what you are saying about the cost aspect wrt Freesync offerings and vendor lock in in a general sense.


----------



## londiste (Jan 8, 2019)

Guys, could we please just get along? Regardless of the reasoning behind Nvidia's announcement, getting a more standard VRR environment is a *good* thing.


----------



## FordGT90Concept (Jan 8, 2019)

Vayra86 said:


> You turned the example around. Monitors tend to last longer than GPUs, so when you upgrade a GPU why would you not get the better combo?
> 
> Nonetheless I get what you are saying about the cost aspect wrt Freesync offerings and vendor lock in in a general sense.


You answered your own question: cost.  FreeSync versus fixed sync has very little difference in cost (for the manufacturer, literally just time, the value of a single monitor, and S&H).


Vendor lock-in requires a proprietary *or* customer component.  VESA adaptive sync is an open standard (for members) and AMD has open sourced their VESA adaptive sync implementation (trademarked as FreeSync) through the GPUOpen initiative.  You're accusing AMD of customer lock-in (NVIDIA proved they didn't with this announcement) when NVIDIA is provably guilty of both proprietary *and* customer lock-in.

Judging by the reply quoted above, yeah, I think you finally "get it."


----------



## Vayra86 (Jan 8, 2019)

londiste said:


> Guys, could we please just get along. Regardless of the reasoning behind Nvidia's announcement, getting a more standard VRR environment is a *good* thing



Good idea, going in circles now.


----------



## FordGT90Concept (Jan 8, 2019)

Looking at VRR support in DP and HDMI specs and cross referencing NVIDIA architectures, only Maxwell, Pascal, and Turing cards likely support VESA adaptive sync over DisplayPort and none of them (unless they mimic AMD's proprietary HDMI signaling) support VRR over HDMI.  DP 1.2a was created specifically for AMD VRR which NVIDIA did not implement.  VRR became a broad standard in DP 1.3 which Maxwell supports.  VRR was not standardized at all in HDMI spec until 2.1.  Turing has 2.0b support.

Researching this, I rediscovered NVIDIA's DP firmware update back in June...I wonder if it is related to this news:
https://www.techspot.com/news/74994...splayport-issues-maxwell-pascal-graphics.html
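The spec cross-referencing above can be condensed into a tiny lookup. This is only a sketch of the claims made in this post (the architecture names and version mappings are my reading of the text, not an official NVIDIA support list):

```python
# Rough lookup of the VRR support claims in the post above (an assumption,
# not official NVIDIA data).

# Per the post: Maxwell, Pascal, and Turing likely support VESA adaptive
# sync over DisplayPort; earlier architectures (e.g. Kepler) do not.
VESA_ADAPTIVE_SYNC_OVER_DP = {
    "Kepler": False,
    "Maxwell": True,
    "Pascal": True,
    "Turing": True,
}

# HDMI VRR was not standardized until HDMI 2.1; these cards top out at 2.0b.
HDMI_VERSION = {
    "Maxwell": "2.0",
    "Pascal": "2.0b",
    "Turing": "2.0b",
}

def supports_vrr(arch: str, link: str) -> bool:
    """Whether `arch` likely supports standardized VRR over `link`, per the post."""
    if link == "DP":
        return VESA_ADAPTIVE_SYNC_OVER_DP.get(arch, False)
    if link == "HDMI":
        # A lexical compare is fine here since all listed versions are 2.0x.
        return HDMI_VERSION.get(arch, "0") >= "2.1"
    return False
```

So under these assumptions, `supports_vrr("Pascal", "DP")` is true while every HDMI combination comes back false, which matches the post's conclusion that none of these cards do VRR over HDMI.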


----------



## Rahnak (Jan 8, 2019)

FordGT90Concept said:


> Looking at VRR support in DP and HDMI specs and cross referencing NVIDIA architectures, only Maxwell, Pascal, and Turing cards likely support VESA adaptive sync over DisplayPort and none of them (unless they mimic AMD's proprietary HDMI signaling) support VRR over HDMI.  DP 1.2a was created specifically for AMD VRR which NVIDIA did not implement.  VRR became a broad standard in DP 1.3 which Maxwell supports.  VRR was not standardized at all in HDMI spec until 2.1.  Turing has 2.0b support.
> 
> Researching this, I rediscovered NVIDIA's DP firmware update back in June...I wonder if it is related to this news:
> https://www.techspot.com/news/74994...splayport-issues-maxwell-pascal-graphics.html



From an article on AnandTech, the new driver will enable VESA Adaptive Sync only on Pascal and newer cards.


----------



## FordGT90Concept (Jan 8, 2019)

Wow, they really dropped the ball there.


----------



## jabbadap (Jan 8, 2019)

Uhm, I'm not really sure Maxwell has DisplayPort above 1.2 (at least every Maxwell card's specs say DisplayPort 1.2, e.g. the GTX 980 Ti, and the maximum supported resolution is lower than what's possible with 1.3/1.4). Adaptive sync was included in the 1.2a version. That Techspot article says there are problems when connected to monitors that support 1.3/1.4, not that Maxwell has a DisplayPort 1.3/1.4 connector.


----------



## londiste (Jan 8, 2019)

Nvidia says the G-Sync requirement is the 600 series (650 Ti and up) or newer. So, basically starting with Kepler. There is no reason to assume they will reduce support in some way. Older GPUs like the 500 series (Fermi) did not have DisplayPort. Also, the 600 series is as far back as their current driver support goes.


----------



## FordGT90Concept (Jan 8, 2019)

That's for GSYNC as in monitors equipped with the GSYNC module.  We're talking about "GSYNC Compatible", which is what NVIDIA is calling their adaptation of the VESA adaptive sync standard.  As Rahnak pointed out, "GSYNC Compatible" is only on Pascal and newer cards.


----------



## londiste (Jan 8, 2019)

FordGT90Concept said:


> That's for GSYNC as in monitors equipped with the GSYNC module.  We're talking about "GSYNC Compatible" which is what NVIDIA is calling their adaption of VESA adaptive sync standard.  As Rahnak pointed out, "GSYNC Compatible" is only on Pascal and newer cards.


We'll have to wait and see. Nvidia does not say anywhere that it is limited to Pascal/Turing and there is no technical reason for anything like that. Just the opposite, there are good reasons for Nvidia to keep the list of GPUs supporting G-Sync stable.


----------



## FordGT90Concept (Jan 8, 2019)

I suspect that last monitor shown doesn't even have FreeSync certification.


----------



## Xzibit (Jan 8, 2019)

FordGT90Concept said:


> I suspect that last monitor shown doesn't even have FreeSync certification.



It does. The last one is an LG G-Series UW-C; the second to last is a Samsung C or K series, if I remember the letters correctly.

Funny how, on the last monitor (which looks like an LG G-Series UW-C), the blinking goes away when he moves the mouse at the end.

We would have heard by now if LG had defective monitors. Those models have been out for 3 years. The newer ones have a different base.


----------



## FordGT90Concept (Jan 8, 2019)

It is entirely possible that NVIDIA is trying to mislead the media too.  Perhaps he should have pointed at the fourth monitor and said "this is your FreeSync monitor without AMD driver optimizations."  If 400+ monitors are certified and only 12 work with NVIDIA cards without driver optimizations, NVIDIA has a lot of work ahead of them that I suspect they're going to go about lackadaisically.


----------



## Xzibit (Jan 8, 2019)

FordGT90Concept said:


> *it is entirely possible that NVIDIA is trying to mislead the media too*.  Perhaps he should have pointed at the fourth monitor and said "this is your FreeSync monitor without AMD driver optimizations."  If 400+ monitors are certified and only 12 work with NVIDIA cards without driver optimizations, NVIDIA has a lot of work ahead of them that I suspect they're going to go about lackadaisically.



Well, it's Gordon. He doesn't hide his bias in favor of Nvidia.  On his podcast it's like an ongoing joke: the rest of the guys and gal roll their eyes while Gordon tries to convince them of what a great deal you're getting. Like Tom's Hardware's "just buy it", but in podcast form.  It's entertaining.

In his defense, he did say "Nvidia told me" and "This is what they told me".  He never says he tested it to verify.


----------



## FordGT90Concept (Jan 8, 2019)

He's just reading the label: "Non-validated" (by NVIDIA).  I hope we'll see testing comparing AMD vs NVIDIA cards across several FreeSync certified monitors.  It'll also be interesting to look at where things stand a year down the road: did NVIDIA/monitor manufacturers put in an honest effort to make them work, or did NVIDIA phone it in, only slapping their label on FreeSync monitors that work out of the box?


----------



## Xzibit (Jan 8, 2019)

FordGT90Concept said:


> He's just reading the label: "Non-validated" (by NVIDIA).  I hope we'll see testing comparing AMD vs NVIDIA cards across several FreeSync certified monitors.  It'll also be interesting to look at where things stand a year down the road: did NVIDIA/monitor manufacturers put in an honest effort to make them work or did *NVIDIA phone it in, only slapping their label on FreeSync monitors that work out of the box.*



Probably that.  Remember, Nvidia's Tom Petersen said the G-Sync module has what he called a "look-aside buffer" for synchronizing.


----------



## FordGT90Concept (Jan 8, 2019)

Yeah, whereas with FreeSync the GPU's VRAM is the buffer.  The GPU/driver does all of the tricks the GSYNC module does... and NVIDIA's drivers seem to be lacking a lot of that.


----------



## londiste (Jan 9, 2019)

FordGT90Concept said:


> it is entirely possible that NVIDIA is trying to mislead the media too.  Perhaps he should have pointed at the fourth monitor and said "this is your FreeSync monitor without AMD driver optimizations."


What is being demonstrated is LFC running on a monitor that does not support it.
The LG 34UM69G-B has a 40-75 Hz frequency range.

Nvidia has said they want a 2.4 range ratio to even consider a monitor G-Sync Compatible. This is an example of a monitor that does not meet that requirement.
He probably should have pointed at that monitor and said "This is a crappy VRR monitor".



Xzibit said:


> Just FYI - The LG 34UM69G-B is flat the one in the video has a curv to it.


Oh, you are right. It was mentioned in the YouTube comments and I did not check very carefully. The point remains, though: it is more than likely a monitor with too small a VRR range.

There is a very good technical reason for requiring a wide enough frequency range. The ratio needs to be at least 2 to be able to double frames when FPS falls below the frequency range. Exactly 2 is too small because the frequency needs headroom to stay dynamic, and that causes pretty much exactly what is demonstrated in the video. So manufacturers use a higher requirement: AMD uses 2.5 for LFC in FreeSync, and Nvidia now says 2.4 for G-Sync Compatible. The unofficial solution to these problems for FreeSync monitors has generally been to manually widen the monitor's range definition and hope the monitor works fine with it. This is effectively monitor overclocking and not guaranteed.

This frame doubling is the crux of AMD's LFC (Low Framerate Compensation) and has been part of Nvidia's basic G-Sync spec from the start. A monitor can (or rather, is tested, specced and guaranteed to) work within a certain frequency range. The minimum refresh rate is usually 30-40 Hz, while the maximum varies a lot; 75, 100, 120, 144, 165 and 240 Hz are the most common.

Variable Refresh Rate (VRR) uses this entire range, as opposed to a fixed refresh rate, but it still cannot go beyond it. When FPS drops below the minimum supported refresh rate, the simple VRR method of matching the refresh rate to the current FPS (yes, technically the GPU triggers each refresh, but for a high-level explanation this is close enough) no longer works, because the monitor cannot refresh at too low a rate. The solution was to start doubling frames: for every frame coming from the GPU, the monitor gets refreshed twice. For example, when a game runs at 20 FPS, the monitor refreshes at 40 Hz and each frame from the GPU is shown twice. This doubling may be repeated if necessary; for example, 10 FPS on a monitor with a 40 Hz minimum refresh rate gets each frame shown 4 times.
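For anyone who wants to see the doubling logic concretely, here is a toy Python sketch of the multiplier selection (illustrative only; `lfc_refresh` is a made-up name, not anything from an actual driver):

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Pick the smallest frame multiplier that lifts the refresh rate
    back into the monitor's supported [vrr_min, vrr_max] window.
    Assumes fps is not above vrr_max (that case is handled elsewhere)."""
    if fps >= vrr_min:
        return fps, 1              # in range: refresh simply tracks FPS
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier *= 2            # double again: 2x, 4x, 8x, ...
    if fps * multiplier > vrr_max:
        raise ValueError("range too narrow for frame doubling at this FPS")
    return fps * multiplier, multiplier

# 20 FPS on a 40-144 Hz panel: refresh at 40 Hz, each frame shown twice
print(lfc_refresh(20, 40, 144))    # (40, 2)
# 10 FPS on the same panel: refresh at 40 Hz, each frame shown 4 times
print(lfc_refresh(10, 40, 144))    # (40, 4)
```

Note how it falls apart on a narrow-range panel: doubling 40 FPS on a 48-75 Hz monitor would ask for 80 Hz, above the panel's maximum, which is exactly why a range ratio comfortably above 2 is needed.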

This is a simple and elegant solution that is not really a problem on a gaming monitor with a genuinely wide frequency range; for example, the initial G-Sync requirement was 30-144 Hz, with a properly low minimum refresh rate and a wide range (the maximum is 4.8 times the minimum). It does become a problem on monitors with a high minimum refresh rate and/or a narrow frequency range. There have been a lot of FreeSync monitors with ranges like 48-75 Hz, which AMD never bothered to tackle in any way.

In practice, such a monitor with a 48-75 Hz range will work well and do VRR in the 48-75 FPS range, but not outside of it. Given that these are less expensive monitors and are likely to be paired with less expensive GPUs, drops below this range will be noticeable and will not benefit from VRR.
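The eligibility check itself is one division; the ratios below are the publicly stated ones (2.5 for AMD's LFC, reportedly 2.4 for G-Sync Compatible), and `lfc_capable` is just an illustrative name:

```python
def lfc_capable(vrr_min, vrr_max, required_ratio=2.5):
    """Frame doubling only works reliably when the VRR window is wide
    enough: max/min must reach the vendor's required ratio."""
    return vrr_max / vrr_min >= required_ratio

print(lfc_capable(30, 144))        # True: the classic 30-144 Hz G-Sync range
print(lfc_capable(48, 75))         # False: the cheap 48-75 Hz FreeSync range
print(lfc_capable(40, 75, 2.4))    # False even by Nvidia's reported 2.4 bar
```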


----------



## FordGT90Concept (Jan 9, 2019)

londiste said:


> There have been a lot of Freesync monitors with ranges like 48-75 Hz which AMD never bothered to tackle in any way.


What is Enhanced Sync (mimics what G-Sync does < monitor refresh rate) and Frame Rate Target Control (minimal input lag cap not only saving power but also preventing tearing)? 

I have to assume the panels they demo'd were running above the monitor's refresh range.  If all it needed was FRTC set, then NVIDIA went full retard.


----------



## londiste (Jan 9, 2019)

FordGT90Concept said:


> What is Enhanced Sync (mimics what G-Sync does < monitor refresh rate) and Frame Rate Target Control (minimal input lag cap not only saving power but also preventing tearing)?


Enhanced Sync (as well as its counterpart, Fast Sync) is absolutely not what G-Sync/FreeSync does. Enhanced/Fast Sync is useful when the FPS you are getting is larger than the monitor frequency (much larger; for real use it has to be 2+ times). With these, monitor refreshes still happen at a fixed rate, but the frame shown at the point of refresh is the latest frame the GPU generated. The main intent is reducing input lag compared to Vsync. For example, when the monitor refreshes at 60 Hz and a game runs at 120 FPS, the GPU generates two frames per refresh period and the last one is shown. Compare this to the Vsynced situation, where the first frame is shown and that frame is 8 ms older. This is a little simplified, but that is the idea. The other part is Vsync basically cutting the framerate in half when FPS drops below the monitor frequency, but that is a different discussion.
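A toy model of that "latest completed frame wins" behaviour, using integer milliseconds and a 16 ms refresh period standing in for 60 Hz (purely illustrative, not any real driver logic):

```python
def fast_sync_scanout(frame_done_ms, refresh_ms, duration_ms):
    """At each fixed refresh tick, scan out the most recently completed
    frame; older frames that were never shown are simply dropped."""
    shown = []
    for tick in range(refresh_ms, duration_ms + 1, refresh_ms):
        ready = [i for i, t in enumerate(frame_done_ms) if t <= tick]
        shown.append(ready[-1] if ready else None)  # newest completed frame
    return shown

# ~120 FPS (a frame every 8 ms) on a "60 Hz" panel (a refresh every 16 ms):
# of each pair of frames, only the newer one ever reaches the screen.
frames = [8 * (i + 1) for i in range(12)]       # frames finish at 8, 16, ... 96 ms
print(fast_sync_scanout(frames, 16, 96))        # [1, 3, 5, 7, 9, 11]
```

With Vsync instead, each refresh would show the *older* frame of the pair, which is where the roughly 8 ms of extra input lag in the example above comes from.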

Frame Rate Target Control limits the maximum frame rate. This is useful depending on circumstances, but is not really directly related to Variable Refresh Rate. And it has no effect on what I described above, because these issues occur at the refresh rate minimum, not the maximum.



FordGT90Concept said:


> I have to assume the panels they demo'd were running above the monitor's refresh range.  If all it needed was FRTC set, then NVIDIA went full retard.


I have not seen exact details anywhere, but the problem is probably primarily with running below the refresh range. You are kind of right, though, in that frame doubling as the solution to this problem would lead to trying to run above the range. It has nothing to do with going full retard. This demonstrates - and very much correctly - this specific problem.


----------



## FordGT90Concept (Jan 9, 2019)

londiste said:


> Enhanced Sync (as well as its counterpart Fast Sync) is absolutely not what G-Sync/Freesync does. Enhanced/Fast Sync is useful when FPS you are getting is (much, for real use it's have to be 2+ times) larger than monitor frequency. With these monitor refresh will still be happening a fixed rate but the frame being shown at the point of refresh is the latest frame that GPU generated. The main intent is reducing input lag compared to Vsync. For example, when monitor refreshes at 60 Hz and game is running at 120 FPS GPU will be generating two frames for each monitor refresh period with the last one being shown. This should be compared to Vsynced situation where the first frame is shown and the first frame is 8ms older. This is a little simplified but this is the idea. The other part is Vsync basically cutting framerate to half when FPS drops below monitor frequency but this is a different discussion.


5:30








TL;DW: Enhanced Sync allows tearing in situations where it is preferable (like at low frame rates) and attempts to eliminate tearing where it is not (high frame rates).

Enhanced Sync fills in the edge cases in FreeSync--they're meant to complement each other.

If Fast Sync truly works like Enhanced Sync does, then Fast Sync should be enabled when driving any FreeSync monitor.



londiste said:


> Frame Rate target Control limits the maximum frame rate. This is useful depending on circumstances but is not really directly related to Variable Refresh Rate things. And it has no effect on what I described above because these issues occur at refresh rate minimum, not the maximum.


Not "limits" (that is v-sync); it paces the card so it produces approximately as many frames as are needed.  60 fps = targets a new frame every 16.67 ms; 144 fps = targets a new frame every 6.94 ms.  Because of this targeting, it has less stutter than v-sync, since the graphics card isn't sitting on a frame for potentially 16+ ms.
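For what it's worth, a pacing-style limiter as described here would look something like this sleep-based sketch (an assumption about the mechanism, not AMD's actual implementation):

```python
import time

def run_paced(render_frame, target_fps, num_frames):
    """Pace frame production: aim to start a new frame every
    1/target_fps seconds instead of rendering flat-out."""
    interval = 1.0 / target_fps         # 60 fps -> 16.67 ms, 144 fps -> 6.94 ms
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                  # stands in for the actual GPU work
        deadline += interval
        spare = deadline - time.perf_counter()
        if spare > 0:
            time.sleep(spare)           # GPU idles here, saving power

start = time.perf_counter()
run_paced(lambda: None, 100, 10)        # 10 frames paced at a 100 FPS target
print(time.perf_counter() - start)      # roughly 0.1 s
```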



londiste said:


> I have not seen exact details anywhere but the problem is probably primarily with running below the refresh range. You are kind of right though in that frame doubling as the solution to this problem would lead to trying to run above the range. It has nothing to do with going full retard. This demonstrates - and very much correctly - this specific problem.


AMD fixed it in time.  The question is will NVIDIA?


----------



## londiste (Jan 9, 2019)

FordGT90Concept said:


> Enhanced Sync allows tearing in situations where it preferable (like at low frame rate) and attempts to eliminate tearing where it is not (high frame rate).


VRR does not have tearing. If it does, it is a crappy VRR solution.



FordGT90Concept said:


> Not "limits" (that is v-sync), it paces the card so the card is producing approximately as many frames as is needed.  60 fps = targets a new frame every 16.67 ms. 144 fps = targets a new frame every 6.94 ms.  Because of targeting, it has less stutter than v-sync.


Frame Rate Target Control is a frame limiter, pure and simple. That is 100% of what it does. It has less stutter than Vsync but also has tearing (which is what Vsync is intended to prevent).
There are more appropriate solutions for running with Vsync while the framerate is high and disabling Vsync when the framerate drops below the monitor refresh rate: Dynamic Vsync (AMD) or Adaptive Vsync (Nvidia).
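Boiled down, the decision those two features make is a single comparison per frame (toy sketch, names made up):

```python
def adaptive_vsync(fps, refresh_hz):
    """Adaptive Vsync (Nvidia) / Dynamic Vsync (AMD) in essence:
    sync when the GPU keeps up, and allow a tear rather than let
    Vsync halve the framerate when it doesn't."""
    return "vsync on" if fps >= refresh_hz else "vsync off"

print(adaptive_vsync(90, 60))   # vsync on: no tearing
print(adaptive_vsync(45, 60))   # vsync off: a tear beats dropping to 30 FPS
```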



FordGT90Concept said:


> AMD fixed it in time.  The question is will NVIDIA?


Fixed what? This particular monitor? It is a crappy VRR monitor and should not be used as such.
Nvidia decided from day one that a VRR solution has to work from 0 to the monitor's max refresh rate and made this a requirement, so the VRR FPS range for G-Sync monitors has always started from 0.
AMD eventually provided a method for this with LFC, but does not require it (well, it does for FreeSync 2, which is a different story).


----------



## FordGT90Concept (Jan 9, 2019)

londiste said:


> VRR does not have tearing. If it does, it is a crappy VRR solution.





londiste said:


> There are more appropriate solutions to run with Vsync while framerate is high and disabling Vsync when framerate drops below monitor refresh rate - Dynamic Vsync (AMD) or Adaptive Vsync (Nvidia).


Obviously didn't absorb the wisdom of Scott Wasson. 



londiste said:


> Frame Rate Target Control is a frame limiter, pure and simple. That is 100% of what it does. It has less stutter than Vsync but also has tearing (which is what VSync is intended to prevent).


Unrelated technologies.  FRTC paces the production of frames.  You need v-sync or enhanced sync to address tearing.



londiste said:


> Fixed what? This particular monitor? It is a crappy VRR monitor and should not be used as such.


Reviewers disagree (assuming the alleged model is correct):
https://www.amazon.com/LG-34UM69G-B-34-Inch-UltraWide-Reduction/dp/B06XFXX5JH
https://www.newegg.com/Product/Product.aspx?Item=N82E16824025514



londiste said:


> Nvidia decided from day one that VRR solution has to work from 0 to max refresh rate of monitor and made this a requirement. So the VRR range for G-Sync monitors have always started from 0.
> AMD did provide a method for this eventually with LFC but does not require it (well, does for FreeSync2 which is a different story).


With Enhanced Sync enabled, there should be a single tear at most. Yet what does the guy in the video complain about? "Blinking."  Go look at the reviews for that monitor again.  How many complaints of blinking are there?  None?





It's clear these monitors are fine by FreeSync spec.  NVIDIA's just not ready to drive them.


Edit: Found another wrench to throw into the mix: the LG monitor is DisplayPort 1.2, where most of those marked "GSYNC Compatible" are DisplayPort 1.4.  Might have something to do with NVIDIA struggling to drive it.


----------



## londiste (Jan 9, 2019)

FordGT90Concept said:


> Obviously didn't absorb the wisdom of Scott Wasson.


I have no idea who Scott Wasson is or the relevance here.


			
https://www.amd.com/en/technologies/frtc said:

> Frame Rate Target Control (FRTC) is a new feature we’re introducing with the AMD Radeon™ Fury X graphics card, enabling users to set a target maximum frame rate when playing an application in full screen mode; the benefit being that FRTC can reduce GPU power consumption (great for games running at frame rates much higher than the display refresh rate) and therefore reduce heat generation and fan speeds/noise on the graphics card.


Even AMD is not claiming this is anything other than a frame limiter. You are right about needing something else for tearing, and that is what I said.


FordGT90Concept said:


> Reviewers disagree (assuming the alleged model is correct):
> https://www.amazon.com/LG-34UM69G-B-34-Inch-UltraWide-Reduction/dp/B06XFXX5JH
> https://www.newegg.com/Product/Product.aspx?Item=N82E16824025514


I said crappy VRR monitor and I will stand by my statement.


FordGT90Concept said:


> It's clear these monitors are fine by FreeSync spec.


Oh, I agree. It's just that this stupidly small range is OK by the FreeSync spec.

Edit:
Nvidia is not struggling to drive this monitor. They are unwilling to put their mark - 'G-Sync Compatible' in this case - on what they do not think is a good VRR monitor.
The monitor in the video is a showcase of what happens if they drive it the way they would drive one with a wide enough refresh rate range. Probably.


----------



## FordGT90Concept (Jan 9, 2019)

londiste said:


> I have no idea who Scott Wasson is or the relevance here.


The video you didn't watch. He's a product manager at AMD and co-founded Ars Technica and Tech Report.


----------



## londiste (Jan 9, 2019)

FordGT90Concept said:


> What is Enhanced Sync (mimics what G-Sync does < monitor refresh rate)





FordGT90Concept said:


> TL;DW: Enhanced Sync allows tearing in situations where it preferable (like at low frame rate) and attempts to eliminate tearing where it is not (high frame rate).


Do you see the contradiction there?

FreeSync/G-Sync address situations up to the monitor refresh rate. Enhanced/Fast Sync address situations above the monitor refresh rate, in both cases with the specific goal of minimizing input lag. Both Enhanced and Fast Sync have downsides, microstutter being the main one. This is why a common recommendation for both is to have FPS at least 2 times as high as the monitor refresh rate.


----------



## Deleted member 158293 (Jan 9, 2019)

If you're looking for a high-end FreeSync monitor, then you should probably buy sooner rather than later.  I'd expect the price of high-end FreeSync monitors to increase substantially over the next year or more as manufacturers jockey, and pay, for the new G-Sync Compatible branding.  Branding is not free.

Hopefully it won't affect mid-range Freesync monitor pricing too much...


----------



## FordGT90Concept (Jan 9, 2019)

londiste said:


> Do you see the contradiction there?


If you don't want tearing below minimum refresh rate, enable v-sync.



londiste said:


> FreeSync/G-Sync address situations up to the monitor refresh rate. Enhanced/Fast Sync address situations above monitor refresh rate, in case of both these technologies with the specific goal of minimizing input lag.


Enhanced Sync:
FPS > Hz: sends the most recent completed frame to the monitor
FPS < Hz: sends whatever it has (at most one tear between old frame and new frame)
It does both.  Watch the damn video.



londiste said:


> Both Enhanced and Fast Sync are not without downsides, microstutter being the main problem.


Not true of Enhanced Sync.



londiste said:


> This is why a common recommendation for both Enhanced and Fast Sync is to have FPS at least 2 times as high as monitor refresh rate.


By whom?  AMD doesn't give any recommendations for Enhanced Sync because it is designed to deal with all frame rates, fixed sync, and FreeSync.



yakk said:


> If you're looking for a high end Freesync monitor then you probably should buy sooner rather than later.  I'd expect the price on high end Freesync monitors to increase substantially over the course of the next 1+ year as manufacturers jockey, and pay, for new g-sync compatible branding.  Branding is not free.


If NVIDIA demands money, they'll be sold as separate models with separate price structures (not unlike GSYNC now).  It only takes one FreeSync monitor maintaining its low cost to make all the rest fall in line.  The monitor market is extremely competitive.


----------



## londiste (Jan 9, 2019)

FordGT90Concept said:


> If you don't want tearing below minimum refresh rate, enable v-sync.


I would rather use some VRR thing.
But on topic: you said Enhanced Sync mimics what G-Sync does, which is simply wrong. And your later comment, while correct, directly contradicts the first one.



FordGT90Concept said:


> Enhanced Sync:
> FPS > Hz: sends the most recent completed frame to the monitor
> FPS < Hz: sends whatever it has (at most one tear between old frame and new frame)
> It does both.  Watch the damn video.


Enhanced Sync does effectively nothing when FPS < Hz.
It has little effect when one frame is completed per refresh period, which can introduce some microstutter depending on exact timing.
It works great when FPS >> Hz, as then the latest frame really is the latest.


----------



## matar (Jan 9, 2019)

Great move, Nvidia. Now the next move: drop the prices. RTX Titan $1199, RTX 2080 Ti $649, RTX 2080 $499, RTX 2070 $349, RTX 2060 $249.


----------



## danado (Jan 10, 2019)

londiste said:


> Guys, could we please just get along. Regardless of the reasoning behind Nvidia's announcement, getting a more standard VRR environment is a *good* thing



That is a correct approach. 
You are right



Gasaraki said:


> Most freesync monitors are shit?


It is strange, but the AG241QG that is on the short list already has G-Sync hardware inside.


----------



## Mussels (Jan 14, 2019)

15th of jan here already, wheres my driiiveeeeeeers


----------



## FordGT90Concept (Jan 15, 2019)

It's 4:30 PM January 14 in Sunnyvale, CA.  Wait at least 17 more hours.


----------



## Candor (Jan 15, 2019)

Mussels said:


> 15th of jan here already, wheres my driiiveeeeeeers



It's hard for those of us living in the future


----------



## wolf (Jan 15, 2019)

Almost 2pm on the 15th here in Perth Australia, I want this driver dammit!

Well since owning a Freesync monitor it's already been 18 months, another 18 hours wont hurt.


----------



## Mussels (Jan 15, 2019)

I'm le tired and want naptime, so they need to hurry up


----------



## danado (Jan 15, 2019)

Mussels said:


> I'm le tired and want naptime, so they need to hurry up


Here they are!! They are available right now.



wolf said:


> Almost 2pm on the 15th here in Perth Australia, I want this driver dammit!
> 
> Well since owning a Freesync monitor it's already been 18 months, another 18 hours wont hurt.


Are you there? The driver is ready now!!!


----------



## Mussels (Jan 15, 2019)

yeeeee boooooi


----------



## Candor (Jan 15, 2019)

GTX 1080 here running on an AG271QX monitor (freesync range 30-144hz).

G-Sync was enabled by default. No bad stuff happening, no blanking of the screen or pulsing.

Honestly I can't tell if it's working.

The old G-SYNC Pendulum Demo does not work. (Obviously it's outdated - released in 2015).


----------



## danado (Jan 15, 2019)

Is anybody with news


Candor said:


> GTX 1080 here running on an AG271QX monitor (freesync range 30-144hz).
> 
> G-Sync was enabled by default. No bad stuff happening, no blanking of the screen or pulsing.
> 
> ...



Thank you. I have the same configuration at home as yours, but at the moment I am at the office.
This is good news, and in a way expected.


----------



## Mussels (Jan 15, 2019)

enabled and working fine - I can go into my monitor's settings menu and see res/refresh info in realtime, and it shows the refresh rate fluctuating all over the place, so it's a green light for me

In the one game I've tested (Supreme Commander: Forged Alliance, which has a 100 FPS cap) I'm seeing some very faint flickering of brightness - but it's only noticeable if I look for it really hard in dim background areas


----------



## danado (Jan 15, 2019)

Mussels said:


> enabled and working fine - i can go into the settings menu of my monitor and see res/refresh info in realtime and it shows the refresh flickering all over the place, so its a green light for me
> 
> In the one game i've tested (supreme commander: forged alliance, which has a 100FPS cap) i'm seeing some very faint flickering of brightness - but its only noticeable if i look for it really hard in dim background areas



Is the HP 14402 yours?


----------



## danado (Jan 15, 2019)

Candor said:


> Blur Buster's "Variable Refresh Rate Simulation" seems to confirm mine is working.
> 
> Haven't tried windowed mode yet.



Try For Honor in windowed mode if you can.


----------



## Candor (Jan 15, 2019)

Hmmm. Still no real confirmation it's working yet.

At least I have no negative issues.

Everything is smooth, but it looked that way before


----------



## FordGT90Concept (Jan 15, 2019)

Candor said:


> The old G-SYNC Pendulum Demo does not work. (Obviously it's outdated - released in 2015).


Try AMD Windmill demo:
https://drive.google.com/drive/folders/0B0RkAW7Y4oRSd1gtSkFPcXB6RGM


----------



## xpredator_13 (Jan 15, 2019)

Flickering issues on LG 34UC79G...sadly...


----------



## danado (Jan 16, 2019)

Candor said:


> Hmmm. Still no real confirmation it's working yet.
> 
> At least I have no negative issues.
> 
> ...


I have checked the same monitor and it does switch into FreeSync mode. The G-Sync Pendulum demo also works, in G-Sync mode.


----------



## John Naylor (Jan 16, 2019)

Well, it will increase sales of nVidia cards, since AMD has no horse in the race in the top 4 tiers.  However, with no alternative to MBR tech (ULMB), anyone looking to stay above 60 fps won't care.


----------



## c2DDragon (Jan 16, 2019)

Candor said:


> Hmmm. Still no real confirmation it's working yet.
> 
> At least I have no negative issues.
> 
> ...


It seems there is an indicator:




Not tested yet with my XG270HU but for sure I will.

Edit : It's working like a charm !


----------



## Animalpak (Jan 16, 2019)

G-Sync is a dedicated hardware module (with some characteristics similar to a GPU) installed behind the LCD screen, so explain to me why monitors that don't have it can now do the same thing?


----------



## c2DDragon (Jan 17, 2019)

Animalpak said:


> G-Sync is a dedicated hardware ( with some similar characteristics of a GPU ) module installed behind the LCD screen, so explain to me why now monitors that dont have it now can do the same thing ?


G-Sync just adds:
_less input lag
_an overdrive (well, hmm, ok lol)
_a refresh range that starts lower than FreeSync or Adaptive-Sync
_profits? (for nVidia, of course)
It's marketed as high end, but it's just marketing stuff that adds about $200 for something you may not be able to notice compared to FreeSync screens.
The thing is, G-Sync screens may have less (to no) ghosting compared to some FreeSync 1 monitors; I don't know about Adaptive-Sync screens. It can be unnoticeable for some outside of a benchmark, by the way.

Edit: I just want to say I'm not a fanboy or anything.
The only ATI card (before AMD, for the young people) I owned was an X800GTO, and I bought my screen just for 1440p @ 144Hz; I never planned to use FreeSync on it. Now that nVidia permits using G-Sync on it, I'm happy, because there is no way I would have bought a G-Sync screen for this kind of small feature.
Small feature because I think it's only good when you drop below 60 fps, to keep the experience smooth. Right now I have tried it in The Division and AC Origins, but I only drop below 60 in The Division with nVidia's custom shadows in some areas. I used those shadows just for the test, and the sync worked well; that said, anything under 70 fps is a no-go for me (I went back to ultra shadows instead of the special ones), so I don't really feel the need for the FreeSync/G-Sync feature. I enabled it because, why not? It will be good for unoptimized console ports, I guess... before I switch the 1080Ti for something more powerful.


----------



## Animalpak (Jan 17, 2019)

c2DDragon said:


> G-sync just adds :
> _less input lag
> _an overdrive (well hmm ok lol)
> _refresh rate starts lower than freesync or adaptative sync
> ...



If FreeSync monitors do the same job as G-Sync, then Nvidia has to give me a reason why I should pay for that extra G-Sync module in the future.


----------



## FordGT90Concept (Jan 17, 2019)

c2DDragon said:


> _less input lag










Enhanced Sync + FreeSync < V-Sync + G-Sync

Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself).  There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster.  That's no fault of the technology though.


----------



## GoldenX (Jan 17, 2019)

Doesn't G-Sync save a bit of VRAM compared to Freesync? You use the RAM in the G-Sync module for frame buffering instead of your GPU's.


----------



## FordGT90Concept (Jan 17, 2019)

AMD cards are not short on VRAM, so it's not an issue. Also, you're talking about at most one frame. At 10 bits per color, that's at most around 45 MiB at 4K--a pittance.


----------



## c2DDragon (Jan 17, 2019)

FordGT90Concept said:


> Enhanced Sync + FreeSync < V-Sync + G-Sync
> 
> Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself).  There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster.  That's no fault of the technology though.


Well, it's marketing, from what I've read.  A scam it was, then.
Those tests make people think. I never trusted this G-Sync tech anyway.
(Real) competitive players will certainly not play at 4K/60fps/60Hz with G-Sync and Vsync on, so the little bit of input lag will never be noticeable with a recent monitor in games where a graphics card might struggle to keep the fps high. I say this because input lag is only really an issue in competitive games. I don't know anybody who can notice it even on a TV screen, as with ghosting, but cyborgs might exist.
No way can somebody play a competitive game with ultra graphics if it kills the min/max fps, but lies have certainly been told about the G-Sync module.

By the way, I have read that it's better to leave nVidia's Vsync at its default in the nVidia control panel BUT turn it OFF in games, so G-Sync will do its job as intended.



Animalpak said:


> If i know that freesync monitors do the same job as g-sync now Nvidia has to give me a reason why I should pay that extra G-Sync module in the future ?


It appears there is a sort of certification; they said the screens pass something like 300 quality tests, so you would pay roughly $200 extra to be sure the "thing" works.






Now that they have tested the FreeSync panels, I see no reason not to, if the monitor you want passes the nVidia tests (maybe not 300 of them; I don't know, I didn't read much about those). They report the Acer XG270HU as a 2560x1080 panel... it's 2560x1440, so you can see a lack of professionalism.
I do understand why they locked G-Sync on monitors without the module: 1. money / 2. fewer bugs to fix.
Now let's remember that ATI, hmm, AMD said you could pair different cards in your computer for computation (games or anything else), like a 980Ti + an RX 560X for example; like SLI without any synchronization. It's all about drivers. Imagine you could pair your 2080Ti with a 1080Ti; I don't see why you could not. Drivers just let you use one of the cards for PhysX, but think about it: it's a driver lock.


----------



## RichF (Jan 19, 2019)

Vayra86 said:


> So, please, dear god... *talk about substance then*. What solutions do you propose?


Thanks for merely recycling the argument I rebutted, ignoring the substantive rebuttal.

My entire previous post, since you didn't notice, was a rebuttal to the "What solutions do you propose?" bit. Recycling a distraction instead of rebutting a post is another tired rhetorical strategy.

Also, trying to condemn me because of who decides to like my posts is really cheesy, along with language like "dear god".


----------



## Vayra86 (Jan 19, 2019)

RichF said:


> Thanks for merely recycling the argument I rebutted, ignoring the substantive rebuttal.
> 
> My entire previous post, since you didn't notice, was a rebuttal to the "What solutions do you propose?" bit. Recycling a distraction instead of rebutting a post is another tired rhetorical strategy.
> 
> Also, trying to condemn me because of who decides to like my posts is really cheesy, along with language like "dear god".



That is a rather complicated way of saying 'I was actually adding the same nonsense to this topic as the guy I was correcting'... and this post is another one. Your walls of text are pointless to read if there's nothing in them.


----------



## RichF (Jan 19, 2019)

Vayra86 said:


> That is a rather complicated way of saying 'I was actually adding the same nonsense to this topic as the guy I was correcting'... and this post is another one.


Still no rebuttal or substantive response to my posts, then.

Maybe a few more posturing bits like "dear god" and bolded text will increase the relevance.

Oh, I see you've added one: "walls of text are pointless". Since it's clear you have nothing on-topic to discuss I'm out.


----------



## Vayra86 (Jan 19, 2019)

RichF said:


> Still no rebuttal or substantive response to my posts, then.
> 
> Maybe a few more posturing bits like "dear god" and bolded text will increase the relevance.



This is what I and others were wondering about:

https://www.techpowerup.com/forums/...ve-sync-technology.251237/page-6#post-3971901

There is no substance here. Just a lot of words to convey the fact you don't like the tone of this discussion. OK. We got the memo - except everyone responding to you was already asking for that substance, including myself. We were _listening_ - _waiting_ - for you to make your point or drive it home. My 'dear god' got in there because, after a few responses from others, we were still not clear on the point of the post I linked above.

You can leave whenever you want to... but all this was is a bit of miscommunication. Literally nobody responded to your post's content; it's up to you to figure out why. Or you can turn around and leave. All the same to me...


----------



## Mussels (Jan 19, 2019)

After more extensive testing on my screen:

Everything works well and games tend to feel smoother, but under 30? 45? FPS (it's hard to be sure, I can't find the FreeSync range advertised for my screen) I get frames with different brightness. They're super fast and hard to see, some lighter, some darker.

Overall it's actually encouraged me to look into a G-Sync screen in the future, as I've been shown the potential G-Sync has to offer over regular Vsync.
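(For context: the brightness-pulsing described above is the classic symptom of dropping below the panel's VRR floor. Drivers normally hide it with low framerate compensation, i.e. scanning each frame out multiple times so the effective refresh rate stays inside the window. A rough sketch of that idea, with an illustrative 48-144 Hz range rather than this monitor's actual specs:)

```python
# Hypothetical sketch of low framerate compensation (LFC), the technique
# FreeSync/G-SYNC drivers use to avoid artifacts (like brightness flicker)
# when a game's frame rate drops below the panel's VRR floor.
# The 48-144 Hz window is an assumed example, not any specific monitor's spec.

def lfc_refresh_rate(fps, vrr_min=48, vrr_max=144):
    """Return the refresh rate the driver would run the panel at.

    Inside the VRR window the panel refreshes once per frame.
    Below the floor, each frame is repeated enough times that the
    effective refresh rate lands back inside the window.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)
    # Smallest frame multiplier that lifts fps back above the floor.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

# e.g. at 30 FPS each frame is scanned out twice, so the panel runs at 60 Hz
```

Panels without enough headroom between their floor and ceiling can't do this, which is one reason some Adaptive-Sync monitors flicker at low FPS while certified ones don't.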


----------



## bajs11 (Jan 22, 2019)

wow
this shows Nvidia is actually capable of making their loyal customers happy and not just screwing them over and over

I've been wanting to upgrade to a 1440p IPS monitor for a while, and maybe it's time to get one with FreeSync


----------



## goodeedidid (Jan 28, 2019)

xkm1948 said:


> DAYMMMN NICE!
> 
> 
> 
> You do know PhysX is used in games all the time right?


No it's not


----------



## INSTG8R (Jan 28, 2019)

Mussels said:


> After more extensive testing on my screen:
> 
> Everything works well and games tend to feel smoother, but under 30? 45? FPS (it's hard to be sure, I can't find the FreeSync range advertised for my screen) I get frames with different brightness. They're super fast and hard to see, some lighter, some darker.
> 
> Overall it's actually encouraged me to look into a G-Sync screen in the future, as I've been shown the potential G-Sync has to offer over regular Vsync.



Here's the "Master List":
https://www.amd.com/en/products/freesync-monitors


----------

