Friday, August 31st 2018
Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs
Intel's Chris Hook (yes, the longtime face of AMD's Radeon marketing, so there's something strange about the messenger) said in a conversation with r/Hardware moderator dylan522p that the company is still planning to add support for VESA's Adaptive Sync (the open standard underlying AMD's FreeSync branding) to Intel GPUs. To put this in perspective, Intel is the single largest player in the overall graphics market: its integrated solutions give it the highest graphics accelerator share of any vendor, ahead of both AMD and NVIDIA, and Intel hasn't even entered the discrete graphics market yet.
It makes sense that the blue giant would pursue this option: royalty-free frame syncing beats developing a proprietary alternative. A quick thought exercise could point toward NVIDIA's G-Sync being rendered irrelevant by such strong support from the industry.
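For a feel of why this matters at all, here's a toy model in Python (illustrative numbers only, not measurements): on a fixed 60 Hz display, a frame that just misses a vblank sits idle until the next one, while an adaptive sync panel scans it out as soon as it is ready.

```python
import math

REFRESH_HZ = 60
VBLANK_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms between fixed vblanks

def fixed_sync_wait(render_done_s: float) -> float:
    """Idle time a finished frame spends waiting for the next fixed vblank."""
    next_vblank = math.ceil(render_done_s / VBLANK_INTERVAL) * VBLANK_INTERVAL
    return next_vblank - render_done_s

def adaptive_sync_wait(render_done_s: float) -> float:
    """Idealized adaptive sync: scan-out starts the moment the frame is done
    (within the panel's supported refresh range)."""
    return 0.0

if __name__ == "__main__":
    frame_time = 0.017  # a frame that just misses the 16.7 ms deadline
    print(f"fixed 60 Hz wait: {fixed_sync_wait(frame_time) * 1000:.1f} ms")  # ~16.3 ms
    print(f"adaptive wait:    {adaptive_sync_wait(frame_time) * 1000:.1f} ms")
```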
Sources:
r/Hardware subreddit - via Chris Hook, via Overclock3D
80 Comments on Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs
My next card will be a 580. Laugh if you must. I have a 1080p TV and an A-10 build I will be using with the 580 to play games as I find time to. If any of you think I will not get at least 70 fps in my games, you are stoned or stupid. The complete reality is, we as humans cannot see past 40 fps. Granted, I understand the TV/monitor refresh rate vs. the GPU's FPS. I still think there is too much e-peening about this.
How many here are PRO gamers? And isn't it true that very few of them even bother with 4K? A few days ago, I stood face to screen with a 1080p and a 4K monitor, side by side.
Know what? They looked the same. Both were 42 inches. Both were playing the same thing, from the same antenna, from a 4K stream. No difference!
Granted, 4K is a better-quality signal and has more stuff, but it is barely noticeable, if at all, on a 42-inch screen. So on my TV, at 1080p and 40 inches, it would not make a difference. If I were to buy a 50-inch or bigger screen, one would probably notice the difference.
2 cents.
BTW, I would get a FreeSync monitor. Why pay more, even if you have the money?
The video that Kyle made? What makes any of you not believe what was said? It split 50% either way, and 100% said the $200 premium was not worth it.
Linus, who spent all day testing it, showed the lag associated with G-Sync in all but one scenario with VESA Adaptive Sync off.
I cannot justify the slight difference NVIDIA has, nor should it matter unless one is a pro gamer.
Pay more for what? Again? This is like a political argument: too much bias!
:lovetpu:
The problems with FreeSync did happen because early monitors would not support a wide range of refresh rates, but realistically, if you use a decent monitor, FreeSync is completely comparable. And even with less decent monitors with a tighter FreeSync range, as long as you're inside that range and using the right in-game settings, the experience is 100% the same as with G-Sync. The only thing you may not get is monitor strobing, but you can't use that together with adaptive sync anyway.
About sub-30 fps... you can just use Vsync (available on any monitor at no extra cost), and the input lag penalty is nearly the same, because it's SLOW anyway. Not sure why that is even remotely part of this discussion: who the hell games at sub-30 fps on high-end gear? You're grasping at straws here, desperately defending a grossly overpriced take on adaptive sync. Buyer's remorse?
In the end, budget matters. Spending 300-400 extra on G-Sync is money that cannot go to a higher resolution, a better panel, or a VA/IPS instead of a crappy TN. Those are much more vital elements of a good monitor. You can alleviate most of the problems of fixed refresh rate screens with clever tweaking and sync options: frame capping, Fast Sync, and Adaptive Sync from the driver. With the added bonus of being able to use strobing as well, which does a LOT more for quality of life in gaming. Hell, you can even spend the 400 bucks on a fatter GPU so you never hit low FPS anyway and remove the need for adaptive sync entirely. G-Sync is literally one of the worst investments to make in a monitor unless your budget is infinite. Never mind the vendor lock-in, because you tend to keep a monitor for several GPUs' worth.
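A minimal sketch of that frame-capping idea, in Python (the FreeSync window and the render stub are hypothetical stand-ins; real panels publish their own range): capping a little under the panel's maximum keeps frame delivery inside the variable refresh window.

```python
import time

# Hypothetical FreeSync window for this sketch; check your monitor's spec.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 144
# Cap a few Hz under the panel maximum so frames never arrive faster
# than the window allows.
TARGET_FRAME_TIME = 1.0 / (VRR_MAX_HZ - 3)

def render_frame() -> None:
    """Stand-in for the actual game/render work."""
    time.sleep(0.004)  # pretend a frame takes ~4 ms to render

def capped_loop(frames: int = 300) -> None:
    """Render frames, sleeping off any time left under the frame-time target."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        leftover = TARGET_FRAME_TIME - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # coarse sleep-based cap; real limiters spin-wait

if __name__ == "__main__":
    capped_loop()
```

The same cap also keeps you honest at the bottom end: as long as frame times stay between 1/VRR_MAX_HZ and 1/VRR_MIN_HZ, the monitor tracks the GPU one-to-one.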
I wish my HD530 would support FreeSync ...
Oh c'mon, stop with the 300-400-500 figures; everyone seems to have a different perspective on these prices. You can find good G-Sync monitors for like €100-150 over the FreeSync versions when a comparison is possible. That said, I'm totally with you that it's probably better to spend on other tech instead.
My main point is, G-Sync is heavily overrated. There are many other, cheaper ways to get a good, and even much better, viewing and gaming experience. That, combined with vendor lock-in and a fat price increase, makes for a pretty questionable use of budget, whereas FreeSync with its low cost is inherently much more sensible.
Regarding TV size, 1080p is quite sharp even at 50 inches; it all depends on how far from the TV you sit and what the dot pitch is.
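A quick back-of-the-envelope sketch backs that up (Python; the 60 pixels-per-degree threshold is the common 20/20-acuity rule of thumb, and everything here is derived, not measured):

```python
import math

def ppi(diag_inches: float, w_px: int, h_px: int) -> float:
    """Pixels per inch of a panel given its diagonal and resolution."""
    return math.hypot(w_px, h_px) / diag_inches

def blend_distance_inches(panel_ppi: float, ppd: float = 60.0) -> float:
    """Distance beyond which the panel exceeds `ppd` pixels per degree,
    i.e. individual pixels stop being resolvable to 20/20 vision."""
    # pixels per degree at distance d = panel_ppi * d * tan(1 degree)
    return ppd / (panel_ppi * math.tan(math.radians(1.0)))

for diag, w, h, label in [(42, 1920, 1080, '42" 1080p'),
                          (42, 3840, 2160, '42" 4K')]:
    p = ppi(diag, w, h)
    print(f"{label}: {p:.0f} ppi, pixels blend past ~{blend_distance_inches(p)/12:.1f} ft")
```

At 42 inches this puts the 1080p blend distance around 5.5 ft, so from a normal couch the extra pixels of 4K are below the eye's resolving limit, which matches the side-by-side observation above.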
The "we as humans cannot see past 40 fps" part is categorically wrong, just google it. Going from 60Hz to 75Hz alone makes a huge difference, and 120Hz is like a whole another level. That said, trying for higher than 144Hz is useless for all practical reasons.
Let these words sink in: there's absolutely nothing the G-Sync module can do that the GPU can't do with regard to adaptive sync.
So how is it "better"? The only thing empirically "better" I can think of is that monitor manufacturers can be a lot lazier implementing G-Sync than Adaptive Sync. Consumers pay a premium for that laziness though, so... yeah... not really "better."
Back on topic, I really want to know how Intel is going to handle it. AMD created the FreeSync brand for certification purposes. It stands to reason that Intel would create a similar brand also for certification purposes. Even though one monitor could be FreeSync branded and theoretically work fine with Intel's brand, the monitors might end up carrying branding for both. I don't think that was VESA's intent with the standard. It was supposed to be universal and just work like DisplayPort in general. AMD creating FreeSync may have created a snowball effect that was unintended.
On the other hand, maybe Intel will simply jump on the FreeSync bandwagon where any given FreeSync monitor will work powered by an Intel GPU too. If Intel does throw its hat in behind FreeSync entirely, I'm going to laugh at NVIDIA. How long will they beat their dead horse? NVIDIA could seriously lose the home theater PC market if they don't abandon G-Sync over the next decade.
Customers pay a premium for premium performance; in the case of NVIDIA it's a premium premium. You're free not to buy it, but not to ignore the fact that it's better, even if only slightly, partly because of the proprietary hardware implementation. I'm sure it's a GREAT monitor. What model are we talking about exactly?
For some reason, CPU and GPU usage drops to like 30% when it happens.