Friday, August 31st 2018

Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs

Intel's Chris Hook (it still feels strange to type that) said in a conversation with r/Hardware moderator dylan522p that the company is still planning to add support for VESA Adaptive Sync (the standard AMD markets under its FreeSync brand) to Intel GPUs. To put this in perspective: Intel is the single largest player in the overall graphics market. Its integrated solutions give it the highest graphics accelerator share of any vendor, ahead of both AMD and NVIDIA - and Intel hasn't even entered the discrete graphics market yet.

It makes sense that the blue giant would pursue this option - royalty-free frame syncing beats developing a proprietary alternative. A quick thought exercise points toward NVIDIA's G-Sync being rendered irrelevant by such strong industry support.
Sources: r/Hardware subreddit, via Chris Hook, via Overclock3D

80 Comments on Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs

#51
Arjai
I don't care what any of you say. The guys that have done the work testing these systems have proven to me that G-Sync is not worth the money. In fact, the NVIDIA cards I have (a couple of them) work fine. But, contrary to what so many people wrongly claim, AMD cards also work.

My next card will be a 580. Laugh if you must. I have a 1080p TV and an A-10 build I will be using with the 580 to play games, as I find time to. If any of you think I will not get at least 70 fps in my games, you are stoned or stupid. The complete reality is, we as humans cannot see past 40 fps. Granted, I understand the TV/monitor refresh vs. the GPU FPS. I still think there is too much e-peening about this.

How many here are PRO gamers? And isn't it true that very few of them even bother with 4K? A few days ago, I stood face to screen with a 1080p and a 4K monitor, side by side.

Know what? They looked the same. Both were 42 inches. Both were playing the same 4K stream from the same antenna. No difference!

Granted, 4K is a better-quality signal and has more stuff, but it is barely, if at all, noticeable on a 42-inch screen. So on my TV, at 1080p and 40 inches, it would not make a difference. If I were to buy a 50-inch or bigger screen, one would probably notice the difference.

2 cents.

BTW, I would get a FreeSync monitor. Why pay more, even if you have the money?

The video that Kyle made? What makes any of you not believe what was said? The picks went 50% either way, and 100% said the $200 premium was not worth it.

Linus, who spent all day testing it, showed the lag associated with G-Sync in all but one scenario with VSync off.

I cannot justify the slight difference NVIDIA has. Nor should it matter, unless one is a pro gamer.

Pay more for what? Again? This is like a political argument - too much bias!

:lovetpu:
#52
Fatalfury
I hope Intel's GPU is not on the 14 nm++++ process...
#53
oxidized
R-T-B: Isn't that basically "seems smoother to me, therefore it is"?

Sorry dude, I'm not really buying it either in this instance.
It's not like you have to BUY it; it's like that whether you believe it or not - it doesn't really matter.
Mistral: Dude, FreeSync's maximum supported refresh rate range is 9–240 Hz. How much lower do you want to go? If for whatever reason your game is running at 8 frames per second or less, adjust your bloody settings instead.

en.wikipedia.org/wiki/FreeSync

And this is a gross oversimplification, but it's also where you logically end up if you extrapolate from how NV's and AMD's respective approaches work: if you have two screens with the same specs, and one does directly what the PC tells it while the other relies on an in-between board, which one is likely to produce a better result?

The fact is, for all practical intents and purposes the two are pretty evenly matched, while one costs considerably more.
Oh yeah? Now find me a monitor that has FreeSync working from under 30 fps, please. Actually, most of them start from 45 Hz or higher, leaving frame drops (below a reasonable amount) uncovered. So there's that, if you need stats compared to stats, but as I already said, the difference is only noticeable in person; it's not like you can compare their specs.
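
For context on the sub-30 fps point: the way drivers generally handle framerates below a panel's slowest supported refresh is low framerate compensation (LFC), which resends each frame two or three times so the panel stays inside its variable-refresh window. A minimal sketch of the idea in Python - the panel range, numbers, and function are hypothetical, not any vendor's actual driver logic:

```python
# Illustrative sketch of low framerate compensation (LFC), not any
# vendor's actual driver logic. Hypothetical panel: 40-144 Hz VRR range
# (LFC needs max >= 2x min so a valid multiple always exists).
PANEL_MIN_HZ = 40.0
PANEL_MAX_HZ = 144.0

def effective_refresh(game_fps: float) -> tuple[float, int]:
    """Return (panel refresh in Hz, times each frame is repeated),
    keeping the panel inside its VRR window by doubling/tripling frames."""
    repeats = 1
    hz = game_fps
    while hz < PANEL_MIN_HZ:      # below the panel's slowest refresh?
        repeats += 1              # resend the frame one more time...
        hz = game_fps * repeats   # ...which raises the scanout rate
    return hz, repeats

for fps in (60, 30, 20, 15):
    hz, n = effective_refresh(fps)
    print(f"{fps} fps -> panel refreshes at {hz:.0f} Hz, frame shown {n}x")
```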
#54
Vayra86
oxidized: There's nothing I can show that can prove anything, because you have to own it, or at least see it in person; you can't show differences like these in any video or benchmark. It's not like we're saying videocard X is faster than videocard Y. Also, input lag isn't the only thing to evaluate here; G-Sync works from a lower frequency/framerate and feels smoother. Besides, that is a pretty approximate test, and Crysis 3 isn't even a good game to test this stuff on.
He himself doesn't understand what he tried to prove in the end. So yeah, as I said, there's no video or comparison that tells you which one is best; you just have to try for yourself. And yes, since there's hardware involved, that's most likely what's going to happen in the end.
Oh, and you're starting to sound more and more like a butthurt fanboy.

Was he? Then I guess the community was wrong, or he's very dependent on the amount of money he receives from a company; that would explain why he's totally unreliable, and everyone with a normally functioning brain would see that.
Sorry bud, but the perceivable difference between G-Sync and recent FreeSync (not the earliest implementation) is just not there - it's entirely in the realm of the unnoticeable, and Linus's test shows that quite well. In fact, he also tested several different setting tweaks, forcing VSync on and off, etc., and G-Sync and FreeSync respond differently to each setting - each has its ideal setup, and once that's used, they both work very well and are near identical even in button-to-pixel response and frame pacing. I'm not even a fan of Linus at all, but that was a pretty decent test.

The problems with FreeSync did indeed happen because monitors would not support a wide range of refresh rates, but realistically, if you use a decent monitor, FreeSync is completely comparable. And even on less decent monitors with a tighter FreeSync range, as long as you're in that range and using the right in-game settings, the experience is 100% the same as with G-Sync. The only thing you may not get is monitor strobing, but you can't use that together with adaptive sync anyway.

About sub-30 fps... you can just VSync that (available on any monitor, at no extra cost) and the input lag penalty is nearly the same, because it's SLOW anyway. Not sure why that is even remotely part of this discussion... who the hell games at sub-30 fps on high-end gear? You're grasping at straws here, desperately defending a grossly overpriced adaptive sync. Buyer's remorse?

In the end, budget matters. Spending 300-400 extra on G-Sync is money that cannot go to a higher resolution, a better panel, or a VA/IPS instead of a crappy TN. Those are much more vital elements of a good monitor. You can alleviate most of the problems with fixed-refresh-rate screens through clever tweaking and sync options: frame capping, Fast Sync, and Adaptive Sync from the driver. With the added bonus of being able to add strobing as well, which does a LOT more for gaming quality of life. Hell, you can even spend the 400 bucks on a fatter GPU so you never hit low FPS anyway and remove the need for adaptive sync entirely. G-Sync is literally one of the worst investments to make in a monitor unless your budget is infinite. Never mind the vendor lock-in, because you tend to keep a monitor for several GPUs' worth.
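
On the frame-capping trick mentioned above: the usual approach is to cap a few fps below the panel's maximum refresh so frame delivery never leaves the sync window at the top end. A minimal sketch of such a limiter - the cap value and timings are hypothetical:

```python
import time

# Hypothetical 75 Hz panel: cap a few fps below the maximum refresh
# so we never leave the adaptive-sync window at the top end.
CAP_FPS = 72.0
FRAME_BUDGET = 1.0 / CAP_FPS  # ~13.9 ms per frame

def render() -> None:
    time.sleep(0.005)  # stand-in for ~5 ms of real game work

def run(frames: int) -> None:
    deadline = time.perf_counter()
    for _ in range(frames):
        render()
        deadline += FRAME_BUDGET
        slack = deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)      # burn off the unused frame budget
        else:
            deadline = time.perf_counter()  # running late: resync

run(72)  # roughly one second of capped "gameplay"
```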
#55
dogsbody
hat: While I'm happy to see an open source standard, as opposed to the very expensive alternative, embraced by Intel here... I fail to see what good it does... unless Intel really has something up their sleeve in the discrete graphics card department.
It allows for a stutter-free gameplay experience on iGPUs, where hitting that VSync cap proves difficult more often than not.

I wish my HD 530 supported FreeSync...
#56
hat
Enthusiast
I know that the age-old standard VSync isn't perfect... but if your game already isn't performing well, I'd think it would still be laggy no matter what sync you're using (or none at all). FPS dips that low aren't good no matter what.
#57
Vya Domus
dj-electric: FreeSync and Vulkan are being pushed by Raja and Chris
And it's a good thing - too bad Intel has practically nothing relevant to exert this push upon.
Zubasa: And there are monitors like the LG 43UD79-B, which has a range of 56~61 Hz - absolute garbage.
Meh, still better than nothing and it's cheap.
#58
oxidized
Vayra86: Sorry bud, but the perceivable difference between G-Sync and recent FreeSync (not the earliest implementation) is just not there - it's entirely in the realm of the unnoticeable, and Linus's test shows that quite well. In fact, he also tested several different setting tweaks, forcing VSync on and off, etc., and G-Sync and FreeSync respond differently to each setting - each has its ideal setup, and once that's used, they both work very well and are near identical even in button-to-pixel response and frame pacing. I'm not even a fan of Linus at all, but that was a pretty decent test.

The problems with FreeSync did indeed happen because monitors would not support a wide range of refresh rates, but realistically, if you use a decent monitor, FreeSync is completely comparable. And even on less decent monitors with a tighter FreeSync range, as long as you're in that range and using the right in-game settings, the experience is 100% the same as with G-Sync. The only thing you may not get is monitor strobing, but you can't use that together with adaptive sync anyway.

About sub-30 fps... you can just VSync that (available on any monitor, at no extra cost) and the input lag penalty is nearly the same, because it's SLOW anyway. Not sure why that is even remotely part of this discussion... who the hell games at sub-30 fps on high-end gear? You're grasping at straws here, desperately defending a grossly overpriced adaptive sync. Buyer's remorse?

In the end, budget matters. Spending 300-400 extra on G-Sync is money that cannot go to a higher resolution, a better panel, or a VA/IPS instead of a crappy TN. Those are much more vital elements of a good monitor. You can alleviate most of the problems with fixed-refresh-rate screens through clever tweaking and sync options: frame capping, Fast Sync, and Adaptive Sync from the driver. With the added bonus of being able to add strobing as well, which does a LOT more for gaming quality of life. Hell, you can even spend the 400 bucks on a fatter GPU so you never hit low FPS anyway and remove the need for adaptive sync entirely. G-Sync is literally one of the worst investments to make in a monitor unless your budget is infinite. Never mind the vendor lock-in, because you tend to keep a monitor for several GPUs' worth.
It is there. The fact that you and others here don't see it doesn't make it "not there". The test Linus made proves nothing and isn't even a reliable test. Input lag != low framerate; the two don't feel at all the same, and one isn't always better than the other - it depends on the game and everything else. It's not like you have to play at sub-30 FPS all the time (or rather sub-40, since very few monitors support FreeSync at framerates that low; most of them don't go under 40 Hz), but even with high-end hardware you'll end up with some dips under 40 fps at certain resolutions. 30-75 Hz is where this tech should work best. I'm not defending it, I'm just stating the facts: NVIDIA's tech is better. Then you can say whatever you want about the price, the perf/price ratio, the quality of the monitors the tech comes in, or availability - I'm strictly talking about performance. And no, it's not buyer's remorse, because I didn't buy either of them, but I tried them extensively. If I were to buy something, I'd find a good deal on a G-Sync monitor, not least because I have an NVIDIA card.

Oh, come on, stop with this 300-400-500; everyone here seems to have a different take on these prices. You can find good monitors for like 100-150€ over the FreeSync versions, where a comparison is possible. That said, I'm totally with you that it's probably better to have other techs instead of these.
#59
Vayra86
oxidized: It is there. The fact that you and others here don't see it doesn't make it "not there". The test Linus made proves nothing and isn't even a reliable test. Input lag != low framerate; the two don't feel at all the same, and one isn't always better than the other - it depends on the game and everything else. It's not like you have to play at sub-30 FPS all the time (or rather sub-40, since very few monitors support FreeSync at framerates that low; most of them don't go under 40 Hz), but even with high-end hardware you'll end up with some dips under 40 fps at certain resolutions. 30-75 Hz is where this tech should work best. I'm not defending it, I'm just stating the facts: NVIDIA's tech is better. Then you can say whatever you want about the price, the perf/price ratio, the quality of the monitors the tech comes in, or availability - I'm strictly talking about performance. And no, it's not buyer's remorse, because I didn't buy either of them, but I tried them extensively. If I were to buy something, I'd find a good deal on a G-Sync monitor, not least because I have an NVIDIA card.

Oh, come on, stop with this 300-400-500; everyone here seems to have a different take on these prices. You can find good monitors for like 100-150€ over the FreeSync versions, where a comparison is possible. That said, I'm totally with you that it's probably better to have other techs instead of these.
Something can be technically better and still totally not worth it. It goes like that more often than not: examples being the "megapixel race", the "screen real estate race" on smartphones, or "1 ms at 240 Hz at the expense of everything else", etc., ad infinitum.

My main point is, G-Sync is heavily overrated. There are many other, cheaper ways to get a good - even much better - viewing and gaming experience. That, combined with vendor lock-in and a fat price increase, makes for a pretty questionable use of budget - whereas FreeSync, with its low cost, is inherently much more sensible.
#60
TheOne
Mussels: I'm screwed either way lol, no IGP at all :(
I'm sure NVIDIA will block it long before Intel starts using it anyway.
#61
oxidized
Vayra86: Something can be technically better and still totally not worth it. It goes like that more often than not: examples being the "megapixel race", the "screen real estate race" on smartphones, or "1 ms at 240 Hz at the expense of everything else", etc., ad infinitum.

My main point is, G-Sync is heavily overrated. There are many other, cheaper ways to get a good - even much better - viewing and gaming experience. That, combined with vendor lock-in and a fat price increase, makes for a pretty questionable use of budget - whereas FreeSync, with its low cost, is inherently much more sensible.
Not totally not worth it, no. G-Sync is a better version of adaptive sync than FreeSync, and if one isn't worth it, then neither is the other, money aside. I've already said multiple times that I'm totally with you regarding the price, although it's not $300, 400, or even 500 more compared to FreeSync monitors.
#62
Vya Domus
TheOne: I'm sure NVIDIA will block it long before Intel starts using it anyway.
I am not so sure about that. I don't fully understand how it works, but given the recent AMD FreeSync thing, it seems this is tied more to Windows and how it handles framebuffers - more specifically, how it lets third parties access framebuffers from other devices. It may be up to MS to block this sort of thing, and if they do, that would be some proper corporate ill intent.
#63
TheOne
Vya Domus: I am not so sure about that. I don't fully understand how it works, but given the recent AMD FreeSync thing, it seems this is tied more to Windows and how it handles framebuffers - more specifically, how it lets third parties access framebuffers from other devices. It may be up to MS to block this sort of thing, and if they do, that would be some proper corporate ill intent.
It's possible that it will be too difficult to block to be worth the effort, but NVIDIA will probably still look into it. Of course, it would be nice if NVIDIA would just support it as a budget option. I just hope that if Intel manages a dedicated GPU on par with or better than NVIDIA's, they price it reasonably.
#64
Mistral
oxidized: Oh yeah? Now find me a monitor that has FreeSync working from under 30 fps, please. Actually, most of them start from 45 Hz or higher, leaving frame drops (below a reasonable amount) uncovered. So there's that, if you need stats compared to stats, but as I already said, the difference is only noticeable in person; it's not like you can compare their specs.
You do realize that there are no G-Sync monitors that go under 30Hz either, right? You were criticizing FreeSync, and the tech itself is not the limitation. Manufacturers are free to implement it the way they choose, and you are free to buy the one that suits your needs. The FreeSync monitors that you complain about for not going under 45Hz pretty much all cost less than $300. You will not find a G-Sync display at that range. You can always spend more for a higher quality FreeSync display and it'll still cost less than a comparable G-Sync equivalent. There is no arguing with that.
Arjai: My next card will be a 580. Laugh if you must. I have a 1080p TV and an A-10 build I will be using with the 580 to play games, as I find time to. If any of you think I will not get at least 70 fps in my games, you are stoned or stupid. The complete reality is, we as humans cannot see past 40 fps. Granted, I understand the TV/monitor refresh vs. the GPU FPS. I still think there is too much e-peening about this.

Know what? They looked the same. Both were 42 inches. Both were playing the same 4K stream from the same antenna. No difference!

Granted, 4K is a better-quality signal and has more stuff, but it is barely, if at all, noticeable on a 42-inch screen. So on my TV, at 1080p and 40 inches, it would not make a difference. If I were to buy a 50-inch or bigger screen, one would probably notice the difference.
There's no reason to laugh at a 580; at 1080p that thing runs pretty much any game maxed out if you keep the AA reasonable. That said, it's the A-10 that will be holding you back. Unless you are talking about a Fairchild Republic A-10, in which case I'd love to know how you plug a 580 and a TV into it.

Regarding TV size, 1080p is quite sharp even at 50 inches; it all depends on how far from the TV you sit and what the dot pitch is.

The "we as humans cannot see past 40 fps" part is categorically wrong, just google it. Going from 60Hz to 75Hz alone makes a huge difference, and 120Hz is like a whole another level. That said, trying for higher than 144Hz is useless for all practical reasons.
#65
oxidized
Mistral: You do realize that there are no G-Sync monitors that go under 30Hz either, right? You were criticizing FreeSync, and the tech itself is not the limitation. Manufacturers are free to implement it the way they choose, and you are free to buy the one that suits your needs. The FreeSync monitors that you complain about for not going under 45Hz pretty much all cost less than $300. You will not find a G-Sync display at that range. You can always spend more for a higher quality FreeSync display and it'll still cost less than a comparable G-Sync equivalent. There is no arguing with that.
I'm pretty sure I've seen G-Sync monitors starting from 26 Hz, or maybe 25; if I manage to find one, I'll post it here. Besides that, G-Sync is still better, and you can argue as much as you want.
#66
krykry
oxidized: Besides that, G-Sync is still better, and you can argue as much as you want.
I kek'd at this. Thanks for making my day.
#67
oxidized
krykry: I kek'd at this. Thanks for making my day.
Good, I'm happy for you.
#68
moproblems99
oxidized: 30-75 Hz is where this tech should work best
My monitor has a FreeSync range of 30-75 Hz and cost $475. 32-inch, 3440x1440.
#69
FordGT90Concept
"I go fast!1!11!1!"
oxidized: Besides that, G-Sync is still better, and you can argue as much as you want.
Two things will always be true of G-Sync: it will cost more and have higher input lag (both because of the extra hardware step involved). That first one is a deal breaker for most potential customers.

Let these words sink in for you: there's absolutely nothing the G-Sync module can do that the GPU can't do in regards to adaptive sync.

So how is it "better?" The only thing empirically "better" I can think of is that monitor manufacturers can be a lot more lazy implementing G-Sync than adaptive sync. Consumers pay a premium for their laziness though so...yeah...not really "better."


Back on topic, I really want to know how Intel is going to handle it. AMD created the FreeSync brand for certification purposes. It stands to reason that Intel would create a similar brand also for certification purposes. Even though one monitor could be FreeSync branded and theoretically work fine with Intel's brand, the monitors might end up carrying branding for both. I don't think that was VESA's intent with the standard. It was supposed to be universal and just work like DisplayPort in general. AMD creating FreeSync may have created a snowball effect that was unintended.

On the other hand, maybe Intel will simply jump on the FreeSync bandwagon where any given FreeSync monitor will work powered by an Intel GPU too. If Intel does throw its hat in behind FreeSync entirely, I'm going to laugh at NVIDIA. How long will they beat their dead horse? NVIDIA could seriously lose the home theater PC market if they don't abandon G-Sync over the next decade.
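
To illustrate the earlier point that the timing logic can live entirely on the GPU side: the core of adaptive sync is starting scanout when a frame is ready rather than on a fixed clock, clamped to the window the panel supports. A toy comparison under assumed numbers (a 60 Hz fixed panel versus a hypothetical 40-144 Hz VRR panel), not any vendor's implementation:

```python
import math

# Toy model: a fixed 60 Hz panel vs. a hypothetical VRR panel that
# accepts refresh intervals between 1/144 s and 1/40 s (6.9-25 ms).
FIXED_INTERVAL = 1 / 60
VRR_MIN, VRR_MAX = 1 / 144, 1 / 40

def fixed_refresh(frame_time: float) -> float:
    """With VSync on a fixed panel, a finished frame waits for the
    next 16.7 ms tick, so a 17 ms frame displays after 33.3 ms."""
    return math.ceil(frame_time / FIXED_INTERVAL) * FIXED_INTERVAL

def variable_refresh(frame_time: float) -> float:
    """With VRR, scanout starts when the frame is ready, clamped to
    the interval window the panel supports."""
    return min(max(frame_time, VRR_MIN), VRR_MAX)

for ft in (0.012, 0.017, 0.022):
    print(f"{ft * 1000:.0f} ms frame: fixed panel shows it after "
          f"{fixed_refresh(ft) * 1000:.1f} ms, VRR after "
          f"{variable_refresh(ft) * 1000:.1f} ms")
```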
#70
sepheronx
After seeing my performance in Fallout 4 with a GTX 1070, I don't think I will need any kind of special VSync. My FPS doesn't exceed 30 fps in the downtown area in that game (and it's surprising that my CPU/GPU usage stays low even in the low-fps areas; dunno why).
#71
oxidized
FordGT90Concept: Two things will always be true of G-Sync: it will cost more and have higher input lag (both because of the extra hardware step involved). That first one is a deal breaker for most potential customers.

Let these words sink in for you: there's absolutely nothing the G-Sync module can do that the GPU can't do in regards to adaptive sync.

So how is it "better?" The only thing empirically "better" I can think of is that monitor manufacturers can be a lot more lazy implementing G-Sync than adaptive sync. Consumers pay a premium for their laziness though so...yeah...not really "better."


Back on topic, I really want to know how Intel is going to handle it. AMD created the FreeSync brand for certification purposes. It stands to reason that Intel would create a similar brand also for certification purposes. Even though one monitor could be FreeSync branded and theoretically work fine with Intel's brand, the monitors might end up carrying branding for both. I don't think that was VESA's intent with the standard. It was supposed to be universal and just work like DisplayPort in general. AMD creating FreeSync may have created a snowball effect that was unintended.

On the other hand, maybe Intel will simply jump on the FreeSync bandwagon where any given FreeSync monitor will work powered by an Intel GPU too. If Intel does throw its hat in behind FreeSync entirely, I'm going to laugh at NVIDIA. How long will they beat their dead horse? NVIDIA could seriously lose the home theater PC market if they don't abandon G-Sync over the next decade.
You mean G-Sync has less input lag. "Let these words sink in for you" - yeah, sure, I should completely ignore what I saw, and what people I know saw, because someone on the internet is telling me to? I'll surely do that.
Customers pay a premium for premium performance; in NVIDIA's case it's a premium premium. You're free not to buy it, but not to ignore the fact that it's better, even if only slightly, partly because of the proprietary hardware implementation.
moproblems99: My monitor has a FreeSync range of 30-75 Hz and cost $475. 32-inch, 3440x1440.
I'm sure it's a GREAT monitor. What model are we talking about, exactly?
#72
FordGT90Concept
"I go fast!1!11!1!"
oxidized: You mean G-Sync has less input lag.
GeForce has less input lag than Radeon because of architectural differences. That has naught to do with how G-Sync is implemented.
#73
oxidized
FordGT90Concept: GeForce has less input lag than Radeon because of architectural differences. That has naught to do with how G-Sync is implemented.
Could be, but the result is that overall G-Sync has (or feels like it has) less input lag. Pin it on the card or on the monitor, but that's how it is.
#74
StrayKAT
sepheronx: After seeing my performance in Fallout 4 with a GTX 1070, I don't think I will need any kind of special VSync. My FPS doesn't exceed 30 fps in the downtown area in that game (and it's surprising that my CPU/GPU usage stays low even in the low-fps areas; dunno why).
Seriously? That sounds bad. I'm surprised the 1070 would go so low. What resolution?
#75
sepheronx
StrayKAT: Seriously? That sounds bad. I'm surprised the 1070 would go so low. What resolution?
1080p

For some reason, CPU and GPU usage drops to like 30% when it happens.