
AMD's FreeSync ?

I was more alluding to the fact that if you had a G-Sync monitor and wanted to update it but keep your Kepler card, the usability and selling pool is slim. Nvidia's reference designs from Fermi and below didn't include DisplayPort; it was left up to AIB partners to include it. AMD had mini-DP on their reference boards.
 

I don't think the incompatibility with Fermi is due to the lack of DP. Nvidia straight up said that it doesn't support G-Sync on anything less than a GTX 650 Ti Boost, even if the lesser card does have DisplayPort. It's either an artificial software limitation, or G-Sync is using some logic only present in the Kepler architecture.

That said, the resale market is only expanding, since I would keep the monitor for at least 1-2 years and by that point there will be two generations of Nvidia cards on the market that support it. Plus, even if there is a limited G-Sync market, the loss is limited since the monitor will never be worth less than the model without the G-Sync board.
 
on anything less than a GTX 650 Ti Boost
NVIDIA said they are using some special logic from SLI for G-Sync, which isn't available in earlier cards. No way to know for sure.
 
Well the only sure thing is this is NOT a good time to buy a PC display.... in a year from now all monitors will either have new DP with Freesync support or a not-so-overpriced-anymore G-sync chip...
 

I think he's talking about G-Sync usability; I'm just talking monitor usability in general. A G-Sync monitor will just have a DisplayPort. A "FreeSync" one would have all the normal inputs plus DP v1.3, since it's a standard.

Back on topic...
G-Sync doesn't have the ability to control the backlight like DP v1.3 panels will be able to do. That could render G-Sync obsolete and force Nvidia to accept "FreeSync".
LightBoost not needed.
MotionBlur gone.
Everything can be done through the driver on the fly if needed.

Well the only sure thing is this is NOT a good time to buy a PC display.... in a year from now all monitors will either have new DP with Freesync support or a not-so-overpriced-anymore G-sync chip...

Ain't that the truth.
 
I would like to see AMD's FreeSync win and open the eyes of the NVIDIA fans who have had an explanation and a good reason for every NVIDIA move, even though their moves in the last two years have been terrible.
It would be interesting if FreeSync ended up working with previous series while NVIDIA's G-Sync only supported Kepler and newer cards... Imagine FreeSync working with the GTX 480/GTX 580 but NVIDIA not wanting to support it in drivers. :) That would be nice for everyone who forgives everything NVIDIA does, when its main purpose is to earn money and its relationship with customers, improving technology, and better gaming for all come last...
 
GSync is "GPU drives VBI," whereas FreeSync is "driver speculates VBI." The outcome can be close, but one is superior to the other.

FreeSync uses a variable VBI, meaning the driver needs to set up the proper VBI for the next frame, and therefore requires the driver to predict the future.

If the app isn't running at a constant FPS, then FreeSync will fail when the FPS changes, and you will still see stuttering. Also, you need to enable VSYNC, so you still have the lag issue that GSync solves by working without VSYNC.

Sure you will have a better experience, but not as good as GSync. With FreeSync you will have software overhead. If you predict conservatively you lose FPS; if you predict aggressively you might end up with more stuttering than plain VSYNC.

GSync solves the problems by holding VBI until the next frame is drawn, therefore there is no speculation, so it works under all circumstances. You simply can't do that in software, because software runs on the computer, not the monitor. You have to have a monitor smart enough to wait for the next GPU command to do the drawing, and that's why NVIDIA has to do it with a separate board. There is no VESA standard for that.

One person, "purehg," has posted in many of the online forums about this and summarizes the differences perfectly.
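A minimal sketch of the contrast described above, in Python with invented numbers and function names (not anyone's actual driver or module logic), just to make the "speculate vs. hold" distinction concrete:

```python
# A made-up sketch of the two approaches discussed above. Frame times are in
# milliseconds; every name and number here is invented for illustration only.

def speculate_vbi(recent_frame_times_ms, margin_ms=1.0):
    """'FreeSync'-style, as described above: the driver guesses how long the
    next frame will take and programs a vblank interval to match. Guess too
    short and the frame misses its slot (stutter); guess too long (or add a
    big safety margin) and you give up FPS."""
    predicted = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    return predicted + margin_ms

def hold_vblank_until_ready(frame_ready_event):
    """'GSync'-style, as described above: the monitor-side logic simply waits
    in vblank until the GPU signals the next frame is complete, then scans it
    out. No prediction is involved, so irregular frame times don't cause misses."""
    frame_ready_event.wait()  # blocks until the GPU reports the frame is done
    return "scan out now"

# Example: frame times were ~17 ms, but the next frame suddenly takes 25 ms.
print(speculate_vbi([16.5, 16.8, 17.1]))  # ~17.8 ms -> too short, so a stutter
```

Obviously a real driver would be far more sophisticated than a moving average, but the failure mode is the same: the prediction can be wrong.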
 
Monitors, blah, I'm more interested in whether TVs will commonly have DP 1.3. My guess is, good luck with that, unless you buy a huge, ultra expensive TV.

I take it a DP to HDMI adapter or DP to HDMI cable from your GPU to your TV won't suffice?
 
What is free about it?

From the other thread:

It's not like Nvidia is without need for constant improvements. Their drivers have been hit and miss lately, G-Sync is showing 25-30 FPS dropouts, and this is while they are spending much of their time working on features that have nothing to do with needed game performance improvements. Which is what I meant by the Faceworks and Shadowplay "crap". The latter may add racing-game-like replay convenience, but it's not a necessity compared to gameplay performance.

Drivers are complicated things, but they certainly aren't having to deal with the issues AMD are now, having largely addressed them a long time ago. Nvidia integrated lots of great functionality and features into their driver whilst offering great support across the board; can't see a problem, frankly.

And while AMD ARE having trouble with Mantle's launch, mostly due to DICE's problems with coding BF4, they WERE the ones that motivated MS to finally get up off their corporate arses and make a more efficient API, vs stubbornly plodding along with the Metro UI gimmickry that many wanted them to make a toggle-off option for.

That's a fine rant, but I gather DICE helped co-develop Mantle. Besides, I'm not really sure how creating a new API that (still, currently) requires Windows is teaching Microsoft a lesson; they should have focused on Linux first, but then I guess DICE couldn't have sold millions of copies of BF4, hmm.

Meanwhile Nvidia plays MS's lapdog with DX12, while most forget that AMD said from the beginning that their intent with Mantle all along was for the industry to have a more efficient API, regardless of where it came from. They were only responding to developer interest, and at least they DID take those devs seriously in the first place, like MS finally is now after AMD convinced them there could be something to Mantle.

It seems all the hype keeps a lot of people on the net from putting things in perspective. Money (vs goal) driven companies easily cloud people's judgement.

Very noble of them I'm sure, but devs and the publishers want to sell to as many people as possible, AMD need to pull their finger out then. But then seeing as they are fully behind DX12 too, it will be fun to watch how it pans out.
 
WOW, this is scary reading here. Not one person has posted what "FreeSync" is or could be. It's really nothing. The LCD has to have a chip in it, like G-Sync, in order for AMD's FreeSync to even be anything.

Right now there are very few LAPTOP LCDs that have the variable refresh ability.

Just because VESA is going to introduce variable refresh into a standard, the manufacturer still has to build or design a G-Sync-like chip. Do you really think they're going to make that a standard? Could they make that backwards compatible?
 

The G-Sync module is just a T-Con with a large video buffer. There is nothing special in it.

If VESA has accepted AMD's proposal into DP 1.2a, it most certainly will be in DP 1.3. This means the signal used to communicate won't hinder any other standards. One wouldn't have to have all their current monitor inputs and T-Con removed and replaced by a specific one that disables audio and voids the warranty.
 
From my understanding FreeSync differs from G-Sync in that it DOESN'T require a special addon chip, just DP1.3. My only contention there as I mentioned earlier is it may exclude TVs, which are getting pretty common for gaming and tend to have only HDMI ports.

@Fluffmeister,
I wasn't implying AMD is "teaching MS a lesson" with Mantle, just that the dev interest in it created enough of a stir that MS took it seriously and started working on their OWN API improvements, and it's about time. In that sense however AMD are indirectly teaching them to focus rather than dawdle, if nothing else.
 
The G-Sync module is just a T-Con with a large video buffer. There is nothing special in it.

If VESA has accepted AMD's proposal into DP 1.2a, it most certainly will be in DP 1.3. This means the signal used to communicate won't hinder any other standards. One wouldn't have to have all their current monitor inputs and T-Con removed and replaced by a specific one that disables audio and voids the warranty.

you are incorrect again:

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo


important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.

That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."
 
Which is all pretty pointless.

If the monitor manufacturers choose to adhere to the spec, they will reconfigure the internals accordingly.

Do you really think the AMD engineers are ignorant of any of this, or of what the hardware manufacturers will need to do?
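For anyone wondering what "extension of the vblank interval" from the TechReport quote actually means in practice, here's a back-of-the-envelope illustration; the timing numbers are invented for the example, not taken from any real panel:

```python
# Rough illustration: stretching the vertical blanking interval lowers the
# instantaneous refresh rate over an unchanged DisplayPort link. All numbers
# below are made up for the example.

active_lines = 1080          # visible lines per frame
normal_vblank_lines = 45     # typical blanking for a fixed ~60 Hz mode
line_time_us = 14.8          # time to transmit one line, in microseconds

def refresh_hz(vblank_lines):
    frame_time_us = (active_lines + vblank_lines) * line_time_us
    return 1_000_000 / frame_time_us

print(round(refresh_hz(normal_vblank_lines), 1))  # ~60.1 Hz with normal blanking
print(round(refresh_hz(1200), 1))                 # ~29.6 Hz with a stretched vblank
# The cable and signalling are unchanged; only the idle time between frames grows,
# which is the point Petersen is making about not needing a new physical standard.
```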
 
I'll reply back in a few years and ask "hey guys, what happened to your FreeSync?"

Oh, looks like none of the manufacturers went the extra mile to implement that... Not so free or real, is it?
 
Some of you need to actually READ the articles attached to these video demos, because many of you are still implying AMD's method requires cooperation from monitor manufacturers to retrofit new hardware. It only requires DisplayPort 1.3. Lots of GPUs and monitors already have DisplayPort, and version 1.3 will be the new standard in Q2 2014.

"In our review I was pretty pleased with G-Sync. I’d be even more pleased if all panels/systems supported it. AMD’s “FreeSync” seems like a step in that direction (and a sensible one too that doesn’t require any additional hardware)"

Source: http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014
 

You are wrong again!!!!! Are you not able to grasp that laptops have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort)? Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.
 

Scaler chips are needed in standalone monitors because the source image doesn't always match the display's native resolution, so it needs to be resized.

I wonder if a GPU can do that?
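For what it's worth, GPUs can already do exactly that (driver control panels expose a "GPU scaling" option); a toy nearest-neighbour resize in Python, purely to show how simple the operation is, might look like this:

```python
# Toy nearest-neighbour scaler: each output pixel is mapped back to the closest
# source pixel. Real GPUs do this (with much better filtering) in hardware.

def scale_nearest(src, src_w, src_h, dst_w, dst_h):
    """src is a flat row-major list of pixels; returns the resized flat list."""
    dst = []
    for y in range(dst_h):
        sy = y * src_h // dst_h
        for x in range(dst_w):
            sx = x * src_w // dst_w
            dst.append(src[sy * src_w + sx])
    return dst

# Example: stretch a 2x2 image to 4x4.
print(scale_nearest(['a', 'b', 'c', 'd'], 2, 2, 4, 4))
```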
 
I got nothing out of that video. What point is it trying to make?
Exactly, that's what you've got from AMD's "FreeSync" :laugh:
 