Friday, August 31st 2018

Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs

Intel's Chris Hook (yes, formerly of AMD) said in a conversation with r/Hardware moderator dylan522p that the company is still planning to add support for VESA Adaptive Sync (the open standard that underpins AMD's FreeSync branding) to Intel GPUs. To put this in perspective, Intel is the single largest player in the overall graphics market: its integrated solutions give it the highest graphics accelerator share, ahead of both AMD and NVIDIA, and Intel hasn't even entered the discrete graphics market yet.

It makes sense that the blue giant would pursue this option: royalty-free frame syncing beats developing a proprietary alternative. A quick thought exercise even points toward NVIDIA's G-Sync being rendered irrelevant by such broad industry support.
Sources: r/Hardware subreddit, via Chris Hook, via Overclock3D

80 Comments on Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs

#26
R0H1T
TheinsanegamerN: I can't wait for Intel GPUs to be fast enough to make use of FreeSync. See you guys in 2090.
I suppose you'll be sleeping in cryostasis then, for 70 years?
Posted on Reply
#27
EatingDirt
oxidized: Little to no evidence coming from Kyle "gimme your AMD shekels" Bennett? Have you tried the difference between the two in person? Well, I have, and I noticed the difference, and so did a few other people I know.

Now, if we want to talk about how that increase isn't always worth the price tag compared to FreeSync monitors, I can understand and maybe even share the same thoughts, but I'm only talking performance vs. performance.
Again, there is nothing to support the claim that G-Sync is better than FreeSync, other than you saying you've tried them both. That is not evidence. I assume that when you tried the two you didn't try them blindly; have you considered that you may just have had a bias toward Nvidia from the outset, and that's why you preferred G-Sync?
Posted on Reply
#28
FordGT90Concept
"I go fast!1!11!1!"
I just wonder if Intel is going to create its own branding for it.

FreeSync is entirely firmware/software based. There is no dedicated hardware other than what is in the HDMI/DisplayPort standards that support it.
Posted on Reply
#29
Totally
oxidized: G-Sync still gives a better result in the end; I'm not sure any other technology can achieve the same.
I'm not saying that it isn't, but the problem is that it shouldn't be as expensive as it is.
Posted on Reply
#30
oxidized
EatingDirt: Again, there is nothing to support the claim that G-Sync is better than FreeSync, other than you saying you've tried them both. That is not evidence. I assume that when you tried the two you didn't try them blindly; have you considered that you may just have had a bias toward Nvidia from the outset, and that's why you preferred G-Sync?
No, I don't care about brands; you actually sound like someone who does, though. G-Sync is better. Ask anyone genuinely unbiased for their opinion after they have tried both extensively, and they'll all tell you the same. The proprietary hardware for G-Sync gives the feature an edge over FreeSync.
Totally: No one is saying that it isn't, but the problem is that it shouldn't be as expensive as it is.
I am totally with you on that.
Someone is actually saying that it isn't, and using Kyle Bennett as a source! ROFL.
Posted on Reply
#32
EatingDirt
oxidized: No, I don't care about brands; you actually sound like someone who does, though. G-Sync is better. Ask anyone genuinely unbiased for their opinion after they have tried both extensively, and they'll all tell you the same. The proprietary hardware for G-Sync gives the feature an edge over FreeSync.



I am totally with you on that.
Someone is actually saying that it isn't, and using Kyle Bennett as a source! ROFL.
I'm sorry, are you still avoiding showing any evidence that validates the claim that G-Sync is better than FreeSync? No? I thought not.

Just because there's hardware involved doesn't make it better. Until you can come up with any evidence to support your claim that one is better than the other, you should really stop trolling, because that's all you're doing.

I'll just put this here as well:
Posted on Reply
#33
krykry
oxidized: No, I don't care about brands; you actually sound like someone who does, though. G-Sync is better. Ask anyone genuinely unbiased for their opinion after they have tried both extensively, and they'll all tell you the same. The proprietary hardware for G-Sync gives the feature an edge over FreeSync.



I am totally with you on that.
Someone is actually saying that it isn't, and using Kyle Bennett as a source! ROFL.
When Kyle Bennett's FreeSync vs. G-Sync video came out, he was still known in the community as an Nvidia shill.
Posted on Reply
#34
ZeppMan217
FordGT90Concept: I just wonder if Intel is going to create its own branding for it.

FreeSync is entirely firmware/software based. There is no dedicated hardware beyond what is in the HDMI/DisplayPort standards that support it.
They could probably call it Intel Sync and pay monitor makers to list Intel Sync above AMD FreeSync on their feature lists.
Posted on Reply
#35
oxidized
EatingDirt: I'm sorry, are you still avoiding showing any evidence that validates the claim that G-Sync is better than FreeSync? No? I thought not.

Just because there's hardware involved doesn't make it better. Until you can come up with any evidence to support your claim that one is better than the other, you should really stop trolling, because that's all you're doing.

I'll just put this here as well:
There's nothing I can show that would prove anything, because you have to own one, or at least see it in person; you can't show differences like those in any video or benchmark. It's not like we're saying video card X is faster than video card Y. Also, input lag isn't the only thing to evaluate here; G-Sync works from a lower frequency/framerate and feels smoother. Besides, that is a pretty approximate test, and Crysis 3 isn't even a good game to test this stuff on.
He himself doesn't understand what he was trying to prove in the end. So yes, as I said, there's no video or comparison that tells you which one is best; you just have to try them yourself. And yes, since there's hardware involved, that's most likely what's going to happen in the end.
Oh, and you're starting to sound more and more like a butthurt fanboy.
krykry: When Kyle Bennett's FreeSync vs. G-Sync video came out, he was still known in the community as an Nvidia shill.
Was he? Then I guess the community was wrong, or he's very dependent on the amount of money he receives from a company. That would explain why he's totally unreliable, which anyone with a normally functioning brain would see.
Posted on Reply
#36
Nkd
oxidized: FreeSync probably has a better cost/performance ratio, as is usual with AMD products, but we're talking performance here, and that's on G-Sync's side.



I actually have never seen a product that costs $300-400 more just because it has Nvidia's tech instead of AMD's. Sure, Nvidia probably pumped up the price, as usual, knowing they have even a slight advantage, but the additional performance is still there.



Little to no evidence coming from Kyle "gimme your AMD shekels" Bennett? Have you tried the difference between the two in person? Well, I have, and I noticed the difference, and so did a few other people I know.

Now, if we want to talk about how that increase isn't always worth the price tag compared to FreeSync monitors, I can understand and maybe even share the same thoughts, but I'm only talking performance vs. performance.
LOL. What's wrong with G-Sync? Adding $500 to the cost of a monitor; new monitors that are thick as a brick, run hot, expensive and delayed; $2,000 for a 27-inch HDR monitor? Come on! G-Sync may be great, but people have lost their minds and are blinded by personal bias. Nvidia rapes your wallet, bends you over, and then asks for more, lol. Would you grab a 4K TV with FreeSync, or one of Nvidia's new, once again proprietary, expensive Big F'in Display models? Oh wait, those are delayed again and going to cost $5k, lol! Nvidia has lost their minds.

I've got no problem with G-Sync, but really, how long do they expect people to pay top dollar? I think they will slowly start losing market share as soon as Intel joins the fight. That's fine, sell G-Sync, but their refusal to enable FreeSync will probably end badly.
oxidized: There's nothing I can show that would prove anything, because you have to own one, or at least see it in person; you can't show differences like those in any video or benchmark. It's not like we're saying video card X is faster than video card Y. Also, input lag isn't the only thing to evaluate here; G-Sync works from a lower frequency/framerate and feels smoother. Besides, that is a pretty approximate test, and Crysis 3 isn't even a good game to test this stuff on.
He himself doesn't understand what he was trying to prove in the end. So yes, as I said, there's no video or comparison that tells you which one is best; you just have to try them yourself. And yes, since there's hardware involved, that's most likely what's going to happen in the end.
Oh, and you're starting to sound more and more like a butthurt fanboy.



Was he? Then I guess the community was wrong, or he's very dependent on the amount of money he receives from a company. That would explain why he's totally unreliable, which anyone with a normally functioning brain would see.
LOL, what a bunch of haters. Yeah, a journalist who breaks the Nvidia GPP story and spends $4k ordering the new Turing cards when he could have gotten them for free by signing the NDA, which he ran an open poll about at HardForum; 70% of the users voted no, and he didn't sign it. Yeah, Nvidia offered him free cards if he would sign the NDA. Yeah, that Kyle, the one who listens to his viewers, because he loves money? GTFO! He is unreliable? Really? He is an Nvidia shill? Really?

Oh wait, who broke the article about Polaris being a hot mess, back when everyone thought he hated AMD? And then he broke GPP and got cut off by Nvidia after that! Yeah, it wasn't the money; he lost out after he published those stories. If you guys don't know the facts, or don't follow them and just play along fanboy lines, just shut up and move on.

Kyle doesn't give two sh*ts about what a company thinks of him. He does what is right. Go read his article. The guy says it like it is.
Posted on Reply
#37
GoldenX
Good news; there are a lot of people suffering with Intel IGPs for gaming, and this would help a lot.
Posted on Reply
#38
oxidized
Nkd: ...
I'd argue with you if you had read carefully what I wrote.

The guy says it like it is.

ROFL
Posted on Reply
#39
coonbro
I don't worry about it; I just buy the card I need and use whatever monitor I've got around. In 16 years that has always worked great and looked great. Non-gstink/freestink displays tend to do all I ask, without issue, with any cards.
Posted on Reply
#40
TheOne
I wonder, if they enable it on their IGPs, whether you could use the same workaround people have been using with AMD's APUs to run FreeSync while using an NVIDIA GPU.
Posted on Reply
#41
Mussels
Freshwater Moderator
TheOne: I wonder, if they enable it on their IGPs, whether you could use the same workaround people have been using with AMD's APUs to run FreeSync while using an NVIDIA GPU.
I'm screwed either way lol, no IGP at all :(
Posted on Reply
#42
R-T-B
oxidized: I'd argue with you if you had read carefully what I wrote.
Isn't that basically "it seems smoother to me, therefore it is"?

Sorry dude, I'm not really buying it either in this instance.
Posted on Reply
#43
Totally
Nkd: LOL. What's wrong with G-Sync? Adding $500 to the cost of a monitor; new monitors that are thick as a brick, run hot, expensive and delayed; $2,000 for a 27-inch HDR monitor? Come on! G-Sync may be great, but people have lost their minds and are blinded by personal bias. Nvidia rapes your wallet, bends you over, and then asks for more, lol. Would you grab a 4K TV with FreeSync, or one of Nvidia's new, once again proprietary, expensive Big F'in Display models? Oh wait, those are delayed again and going to cost $5k, lol! Nvidia has lost their minds.

I've got no problem with G-Sync, but really, how long do they expect people to pay top dollar? I think they will slowly start losing market share as soon as Intel joins the fight. That's fine, sell G-Sync, but their refusal to enable FreeSync will probably end badly.




LOL, what a bunch of haters. Yeah, a journalist who breaks the Nvidia GPP story and spends $4k ordering the new Turing cards when he could have gotten them for free by signing the NDA, which he ran an open poll about at HardForum; 70% of the users voted no, and he didn't sign it. Yeah, Nvidia offered him free cards if he would sign the NDA. Yeah, that Kyle, the one who listens to his viewers, because he loves money? GTFO! He is unreliable? Really? He is an Nvidia shill? Really?

Oh wait, who broke the article about Polaris being a hot mess, back when everyone thought he hated AMD? And then he broke GPP and got cut off by Nvidia after that! Yeah, it wasn't the money; he lost out after he published those stories. If you guys don't know the facts, or don't follow them and just play along fanboy lines, just shut up and move on.

Kyle doesn't give two sh*ts about what a company thinks of him. He does what is right. Go read his article. The guy says it like it is.
Other than the shit show at launch, which is the new norm for an AMD GPU launch, I don't remember anything about Polaris being a 'hot mess'. Could you be a little more specific?
Posted on Reply
#44
GhostRyder
You know what would be funny: if, like the current Nvidia bug where you can get FreeSync working on an Nvidia GPU through an AMD integrated/discrete GPU, you could do the same thing here. Would be pretty sweet!
Posted on Reply
#45
Mistral
oxidized: Also, input lag isn't the only thing to evaluate here; G-Sync works from a lower frequency/framerate and feels smoother
Dude, FreeSync's supported refresh rate range is 9-240 Hz. How much lower do you want to go? If for whatever reason your game is running at 8 frames per second or less, adjust your bloody settings instead.

en.wikipedia.org/wiki/FreeSync

And this is a gross oversimplification, but it's also where you logically end up if you extrapolate from how NV's and AMD's respective approaches work: if you have two screens with the same specs, and one does directly what the PC tells it while the other relies on an in-between board, which one is likely to produce a better result?

The fact is, for all practical intents and purposes the two are pretty evenly matched, while one costs considerably more.
Posted on Reply
#46
StrayKAT
Mistral: Dude, FreeSync's supported refresh rate range is 9-240 Hz. How much lower do you want to go? If for whatever reason your game is running at 8 frames per second or less, adjust your bloody settings instead.

en.wikipedia.org/wiki/FreeSync

And this is a gross oversimplification, but it's also where you logically end up if you extrapolate from how NV's and AMD's respective approaches work: if you have two screens with the same specs, and one does directly what the PC tells it while the other relies on an in-between board, which one is likely to produce a better result?

The fact is, for all practical intents and purposes the two are pretty evenly matched, while one costs considerably more.
I think it depends on the implementation (my monitor's and my TV's FreeSync low point is 45 Hz, I think?), although the VESA Adaptive-Sync spec itself does go low.
Posted on Reply
#47
Zubasa
StrayKAT: I think it depends on the implementation (my monitor's and my TV's FreeSync low point is 45 Hz, I think?), although the VESA Adaptive-Sync spec itself does go low.
Exactly. The problem with FreeSync is that Low Framerate Compensation (LFC) is not mandatory,
and there are monitors like the LG 43UD79-B, which has a range of 56-61 Hz: absolute garbage.
There are plenty of FreeSync monitors with a much better range, but the problem is that manufacturers can get away with crap.
FreeSync 2, though, does require LFC, so that by itself is a big improvement.
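The LFC behavior described above can be sketched roughly like this: when the game's frame rate falls below the panel's minimum VRR rate, the driver repeats each frame an integer number of times so the effective refresh rate lands back inside the supported window. This is an illustrative sketch only, not AMD's actual driver logic; the function name and thresholds are made up for the example.

```python
# Rough sketch of Low Framerate Compensation (LFC). Illustrative only;
# not AMD's actual driver implementation.

def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Return the refresh rate the panel would be driven at."""
    if fps > vrr_max:
        return vrr_max          # cap at the panel's maximum refresh
    if fps >= vrr_min:
        return fps              # already inside the VRR window
    # Below the window: show each frame n times so n * fps fits in range.
    n = 2
    while fps * n < vrr_min:
        n += 1
    return min(fps * n, vrr_max)

# A 48-144 Hz panel (like the "Ultimate Engine" range mentioned below):
print(lfc_refresh(30, 48, 144))   # frame-doubled to 60 Hz
print(lfc_refresh(100, 48, 144))  # native 100 Hz, no compensation needed
```

This also shows why a 56-61 Hz range can't support LFC at all: frame doubling only fits inside the window when the maximum is at least twice the minimum, which such a narrow range never satisfies.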
Posted on Reply
#48
Mussels
Freshwater Moderator
Zubasa: Exactly. The problem with FreeSync is that Low Framerate Compensation (LFC) is not mandatory,
and there are monitors like the LG 43UD79-B, which has a range of 56-61 Hz: absolute garbage.
There are plenty of FreeSync monitors with a much better range, but the problem is that manufacturers can get away with crap.
FreeSync 2, though, does require LFC, so that by itself is a big improvement.
I had to Google that one just to be sure; that is the worst possible implementation of FreeSync.
Posted on Reply
#49
Zubasa
Mussels: I had to Google that one just to be sure; that is the worst possible implementation of FreeSync.
Oh, even my monitor has a Standard Engine setting for FreeSync, which reports 120-144 Hz; again, pretty much useless.
But it does have an Ultimate Engine setting, which gives 48-144 Hz and is actually very useful, although it uses pixel overdrive etc. to achieve that range.
Posted on Reply
#50
Deeveo
coonbro: Ain't AMD freestink in the supporting card's driver, not the monitor? I'm going to look that up again; that's what I thought AMD claimed and said about it, so if that's true any monitor should work

Simply put, FreeSync allows AMD's video cards and APUs to directly and dynamically control the refresh rate of a connected monitor. Most monitors are locked into refreshing 60 times per second, but quick ones will refresh 75, 120, or 144 times per second. With FreeSync enabled, the monitor will refresh its image in sync with the game that's being played, up to its maximum level, and adjust down when necessary.


Doesn't any monitor have to be able to do that, not just a freestink monitor? Sounds like they're just branding any capable monitor as freestink as sales hype? Then I guess with that, at least, you know it can??

Sounds like monitor overclocking in the end.

www.pcgamer.com/how-to-overclock-your-monitor-to-a-higher-refresh-rate/
coonbro: Well, the NVidia cash cow: you buy a $500 GPU of theirs, then feel the need to buy a monitor at its cost, plus another piece of NVidia hardware built into it at their extra cost

Like I say, AMD freestink is just adaptive monitor overclocking through their driver software, freestink-approved or not, if it's capable of doing so; a freestink-branded monitor just shows you it can overclock, without guessing whether the one you've got/get will [opinion]

Yeah, NVidia could easily do it
coonbro: I like the concept and AMD's way [it works and it's simple]; it's just sad that AMD only applies it to their cards' software. I don't see NVidia coming down off their high horse and implementing it in theirs. I had to move off my AMD cards due to lack of support for things that worked great before. My 7850 was a solid card for what it was, but when later drivers did not support games I run, well, it was time to move on to NVidia, where all my stuff works

one example
steamcommunity.com/app/12140/discussions/0/864961721978159686/

Use the older 12.6 and it works; use later ones and it doesn't. I'm not going to swap drivers all day to do this and that, as it was getting with AMD, plus the black-screening with 14.xx and up
FreeSync/Adaptive-Sync requires that the monitor use a scaler that supports it, so it can't be done just in the graphics driver. All monitors have a scaler, so Adaptive Sync doesn't need extra hardware, just a scaler that is capable.

NVidia's G-Sync module, on the other hand, is a separate piece of hardware that NVidia sells to monitor manufacturers at a set price, and they need to integrate it into the product, which raises costs. The new HDR-capable G-Sync module is said to cost around $400-500 (for the manufacturer), which adds to the cost of the monitor.
Posted on Reply