Saturday, October 19th 2013

ASUS Announces Adoption of NVIDIA G-SYNC Technology

ASUS is excited to announce that it will incorporate NVIDIA G-SYNC technology into its next-generation monitor lineup. NVIDIA G-SYNC, a significant innovation in display technology, provides an excellent gaming experience and stunning visual quality by synchronizing the monitor's refresh rate to the GPU's render rate. Images are displayed the moment they are rendered, which results in smoother gameplay and sharper images by eliminating onscreen tearing and minimizing stutter and input lag.

G-SYNC technology includes an NVIDIA G-SYNC module that is integrated into the monitor, as well as hardware and software incorporated into NVIDIA's Kepler-based GPUs. It removes the need for older technologies such as V-Sync, which can eliminate tearing but can increase latency and stuttering during gameplay. G-SYNC eliminates this trade-off, delivering an uncompromised PC gaming experience.
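For readers curious about the timing, a minimal sketch follows (plain Python; the 60 Hz panel and the frame times are invented for illustration, and the model ignores scanout duration and triple buffering) contrasting fixed-refresh V-Sync with a G-SYNC-style variable refresh:

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval, ~16.67 ms

def vsync_display_times(render_ms):
    """Double-buffered V-Sync: a finished frame waits for the next fixed
    refresh boundary, and the GPU stalls until that swap completes."""
    t, shown = 0.0, []
    for r in render_ms:
        done = t + r
        swap = math.ceil(done / REFRESH_MS) * REFRESH_MS
        shown.append(round(swap, 2))
        t = swap  # the next frame can't start rendering until the swap
    return shown

def gsync_display_times(render_ms):
    """G-SYNC-style variable refresh: the panel scans out each frame the
    moment it is rendered, so display cadence tracks render cadence."""
    t, shown = 0.0, []
    for r in render_ms:
        t += r
        shown.append(round(t, 2))
    return shown

frames = [15.0, 17.0, 15.0, 17.0]  # hypothetical render times in ms
print(vsync_display_times(frames))  # [16.67, 50.0, 66.67, 100.0] -> uneven 16/33 ms steps (stutter)
print(gsync_display_times(frames))  # [15.0, 32.0, 47.0, 64.0]    -> steady ~16 ms steps
```

Note how, in this toy model, a frame that misses the refresh boundary by even a fraction of a millisecond is held a full extra refresh under V-Sync, which is exactly the stutter and latency the release describes.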
ASUS, in a joint effort with NVIDIA, has been working over the past year to bring G-SYNC technology to market and will incorporate it into new monitors beginning in 2014. ASUS plans to release a G-SYNC-enhanced VG248QE gaming monitor in the first half of 2014 with pricing set at $399 USD in North America.

"ASUS strives to provide the best possible gaming experience by being at the forefront of technology and innovation. We are excited to be first to support and embrace NVIDIA's new G-SYSNC technology in upcoming ASUS gaming monitors. Gamers are certain to be impressed with its incredible step-up in smoothness and visual quality." said Vincent Chou, Associate Vice President of Display Business Unit ASUSTek.

66 Comments on ASUS Announces Adoption of NVIDIA G-SYNC Technology

#51
tuklap
XzibitDoesn't seem so, according to Nvidia

Nvidia G-Sync Homepage



Not all Kepler cards are supported either so make sure you have a supported GPU first.
Oww... too bad. Same old proprietary move by Nvidia. But hey, people like it ^_^ :toast:
#52
TheoneandonlyMrK
buggalugsGawd, it's not enough to just pick the green team or the red team anymore... now you need specific cards within their range to access specific features.

I don't like where this is going.

Anyway, I don't think it's really necessary. I don't get any noticeable tearing or stuttering since I got a 120 Hz monitor, and that's with Nvidia and AMD graphics cards.

The only place this could come in handy is with the new 4K monitors; even with a high-end card we're going to struggle to get 30 fps, so this could be worthwhile, but being proprietary ruins it.

I guess AMD will release a similar gizmo and us consumers will be worse off, with less choice and freedom to change graphics cards because it will be tied to an expensive monitor.
Nah, that's mostly the green team. AMD do lock out lower cards, but typically not before three years.

Nvidia try all they can to keep their fans buying new green stuff. My mate's a doozy for green; I am too, just a different type.

He's already got an Nvidia 3D monitor he bought two years ago for way over the odds, and even he isn't daft enough to just keep buying this stuff... well, I think.

I like how NV are working on making slow shit look good, though. WTF, I watched two hours of three sellouts waffling absolute ass smoke to an NV rep to be able to say that seriously...

This is only selling to fans and the geek elite with mega dough, because at 1080p, where most of the Western world games, this isn't selling. A bit like Shield: any sane guy would just buy a better phone, but a fan... ahh.

AMD have also done some good work lately on a new type of display standard (open, unlicensed) to identify and tile 4K+ screens correctly, VESA ID based AFAIK. That's how they roll.
#53
The Von Matrices
theoneandonlymrkNah that's mostly the green team amd do lock out lower cards but not before 3 years nvidia try all that they can to keep there fans buying new green stuff my mates a doosie for green I am too just diff type hes already got an nvidia 3d monitor he bought two years ago for Way over the odds and even he isn't daft enough to just keep buying this stuff ,, well I think.

I like how nv are working on making slow shit look good though wtf and I watched two hrs of three sellouts waffling absolute ass smoke to an nv rep to be able to say seriously,,, this is only selling to fans and the geek elite with mega doe because at 1080p where most of the Western world games this isn't selling , bit like shield I mean any sane guy would just buy a better phone but a fan aahh.
A bit of punctuation would be extremely helpful in discerning exactly what you are trying to say.
#54
Fluffmeister
Does anyone actually understand what the hell theoneandonlymrk is going on about?

I assume Nvidia are evil and AMD are awesome, but beyond that... no clue.
#55
TheoneandonlyMrK
The Von MatricesA bit of punctuation would be extremely helpful in discerning exactly what you are trying to say.
I suggest more practice reading English, then, and return with a better, less bitchy comment. It's late, and phones aren't that great to type on.
#56
TheoneandonlyMrK
FluffmeisterDoes anyone actually understand what the hell theoneandonlymrk is going on about?

I assume Nvidia are evil and AMD are awesome, but beyond that... no clue.
Nvidia? Nah, I'm talking about G-Sync, fool. It's a fail IMHO... simple enough.
#57
Fluffmeister
theoneandonlymrkNvidia nah im talking about gsync fool its a fail imho ..... simple enough.
G-Sync is a fail and AMD are teh awesome! I get it Mr Simples.

But damn, lay off the drugs and for the love of God, try to form a decent sentence at least once in your life.
#58
The Von Matrices
GSG-9I built my computer, and selected my monitors, to be utilised in a variety of ways. You are going to find a short list of people who ONLY do multiplayer FPS gaming with their computer.

If you find that specific person, I think you are correct: they are missing something. For the rest of us in reality, we do a lot with our computers. :)
I agree that all users have a balance of work. My argument is that each panel technology has advantages and disadvantages; IPS is not superior to TN, just different. I get frustrated with users who blindly advocate IPS panels and disparage TN panels without recognizing that they fit different use cases.

If this monitor had a thinner bezel I would seriously consider it for an upgrade.
#59
TheoneandonlyMrK
FluffmeisterG-Sync is a fail and AMD are teh awesome! I get it Mr Simples.

But damn, lay off the drugs and for the love of God, try to form a decent sentence at least once in your life.
Lay off the personal insults and so will I, but as is, you're a stuck-up twat. Now feck off.
#60
Frick
Fishfaced Nincompoop
theoneandonlymrkLay off personal insults and so will I but as is your a stuck up twat now feck off
But it's true. It's not that difficult to understand you, but at least a semblance of punctuation (and using their/there/they're properly) and capital letters would make things so much easier. And you're from the UK, so being bad at English is not an excuse, and neither is typing on a phone or being a lazy ass. :)

On topic, though, I'm not sure you know what you're on about. The thing is, practically no one has seen this tech IRL, and I assume that's what you have to do to understand it. When you've seen it, then you can pass judgement.
#61
Octavean
G-Sync is not something that is easy to demo. From watching the video I linked to, it's obvious that it takes a lot of explaining and examples to make it clear... to some people.

Personally, I find the issues that necessitate V-Sync, and the issues with V-Sync itself, mildly annoying. Anything that abates these issues would be welcome IMO, but not necessarily invaluable. Additional hardware tech that amounts to definitive gamer-oriented products, such as G-Sync-compliant monitors, is IMO additive. Clearly such products will be more expensive and limited in scope (Nvidia proprietary, lower resolution, etc.).

I think there is room in the market for G-Sync, though. The fact that it is proprietary rather than an industry standard doesn't bother me. What bothers me is that the industry didn't address the issue with an industry standard years ago, which is what necessitates the proprietary solution in the first place.

There was some talk that in the future there may be monitors that you can simply add a G-Sync module to.

I often switch between Nvidia and AMD (ATI), so for example I went from an HD 6870 to a GTX 670. People who do this know that they lose or gain proprietary features exclusive to their hardware choices. So having a monitor with G-Sync and a video card that doesn't support it for a generation or two of GPUs isn't really a big deal to me.

Having said that, my next monitor will likely be a 4K display so nuts to this :)
#62
arbiter
OctaveanG-Sync is not something that is easy to demo. From watching the video I linked to, it's obvious that it takes a lot of explaining and examples to make it clear... to some people.
True, it's pretty much impossible to demo over a webcast. Since it throws out the fixed vertical refresh rate that has been in place since the CRT era, it allows the monitor to update its image every time a frame comes out of the video card. So one second you could be looking at 70 fps at 70 Hz, and the next could be 120 fps at 120 Hz.
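As a rough sketch of that behaviour (Python, purely illustrative; the 30 Hz floor is an assumption on my part, and the 144 Hz cap just matches the VG248QE's maximum rather than any published G-SYNC spec):

```python
def instantaneous_refresh_hz(frame_time_ms, panel_min_hz=30.0, panel_max_hz=144.0):
    """Refresh rate a variable-refresh panel would run at for one frame:
    within the panel's supported range, refresh simply tracks the GPU."""
    hz = 1000.0 / frame_time_ms
    return max(panel_min_hz, min(panel_max_hz, hz))

# ~70 fps frame -> ~70 Hz; ~120 fps frame -> ~120 Hz; 40 fps frame -> 40 Hz
for ft in (14.3, 8.3, 25.0):
    print(f"{ft:5.1f} ms frame -> {instantaneous_refresh_hz(ft):6.1f} Hz")
```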
tuklapOww... too bad. Same old proprietary move by Nvidia. But hey, people like it ^_^ :toast:
Um, considering this tech really only benefits higher-end users who want the best-looking game experience, supporting cards that aren't even meant for what the tech is for is pretty pointless.
BiggieShadyWhen the GPU is delivering each frame in less than 16.67 ms, it gets synced to 60 fps and every frame takes 16.67 ms.
When a GPU frame takes longer and misses the monitor's sync even by a nanosecond, the GPU has to wait another 16.67 ms, effectively halving the frame rate.
This is/was all because of how CRT monitors work/worked.
So it's not stupid by design, just the last remains of CRT-monitor-era technology.

About G-Sync, I welcome it as a good start, although this tech would be best as an evolution of DVI.
Yeah, pretty much. It was left in from the old CRT days and causes tearing problems when playing. This tech from Nvidia is probably the most innovative monitor tech of the last 10 years. Die-hard AMD fanboys will do nothing but attack something that is good for the market. Like PhysX, Nvidia will most likely be open to licensing it, but AMD will say they were locked out. (And yes, an Nvidia product manager was asked about it and said they would.)
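BiggieShady's arithmetic is easy to check. With double-buffered V-Sync on a 60 Hz panel, the frame interval is rounded up to the next whole multiple of ~16.67 ms, so missing one refresh by any margin drops you straight to 30 fps (a sketch with hypothetical frame times):

```python
import math

REFRESH_MS = 1000 / 60  # one 60 Hz refresh, ~16.67 ms

def vsync_effective_fps(frame_ms):
    """Frame rate after V-Sync rounds the interval up to a whole refresh."""
    waited_ms = math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS
    return round(1000.0 / waited_ms, 1)

print(vsync_effective_fps(16.0))  # fits in one refresh  -> 60.0 fps
print(vsync_effective_fps(16.8))  # just misses the sync -> 30.0 fps
print(vsync_effective_fps(34.0))  # misses two refreshes -> 20.0 fps
```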
#63
DannibusX
I, for one, wouldn't mind seeing AMD, Nvidia, Intel, and other interested parties form either a company or a partnership to share some areas of technology, whether that's new hardware features like G-Sync or APIs like Mantle, and just let each other use them across the board. They're not really selling points for your hardware, and the more you develop them the less they get used, especially if you make them exclusive.

I'm not saying AMD, Nvidia, and Intel should share CPU/GPU designs, but that they should develop and share software technology that works across the board; maybe flag Valve down to help rein in the open APIs.
#64
TheoneandonlyMrK
DannibusXI, for one, wouldn't mind seeing AMD, Nvidia, Intel, and other interested parties form either a company or a partnership to share some areas of technology, whether that's new hardware features like G-Sync or APIs like Mantle, and just let each other use them across the board. They're not really selling points for your hardware, and the more you develop them the less they get used, especially if you make them exclusive.

I'm not saying AMD, Nvidia, and Intel should share CPU/GPU designs, but that they should develop and share software technology that works across the board; maybe flag Valve down to help rein in the open APIs.
Playing fair, Intel and AMD have cross-licensed in the past; it's only Nvidia that has not. But I agree with what you're saying.
#65
The Von Matrices
DannibusXI, for one, wouldn't mind seeing AMD, Nvidia, Intel, and other interested parties form either a company or a partnership to share some areas of technology, whether that's new hardware features like G-Sync or APIs like Mantle, and just let each other use them across the board. They're not really selling points for your hardware, and the more you develop them the less they get used, especially if you make them exclusive.

I'm not saying AMD, Nvidia, and Intel should share CPU/GPU designs, but that they should develop and share software technology that works across the board; maybe flag Valve down to help rein in the open APIs.
theoneandonlymrkPlaying fair, Intel and AMD have cross-licensed in the past; it's only Nvidia that has not. But I agree with what you're saying.
Well, previously it was Microsoft that reined in proprietary software features by having the DirectX standard be dominant. Essentially, if it wasn't in DirectX, it wasn't used, because developers programmed for DirectX. This is exactly what happened with AMD's tessellator in the HD 2000-4000 series; it didn't get used because it wasn't part of the standard until DX11. Nvidia has always championed CUDA, but it didn't catch on either because it wasn't a DirectX standard. Now, with AMD championing Mantle and attempting to replace the use of DirectX on its own cards, this whole thing could break down, with developers programming two codepaths, one for CUDA and one for Mantle, but this is a discussion for another thread.

Unfortunately, this is what happens with open standards bodies; not just for NVidia or AMD but for all big companies. Big companies join open standards bodies with their own agendas instead of open minds. If they don't like what the open standards body advocates then they leave and use their power to develop competing, proprietary technology that undermines the open standard and fractures the market instead of unifying it.

Closed standards bodies that sell licenses to their standards are the best way to ensure compliance. It does require royalties to be paid, but it also ensures that there is a consistent vision and a strong backer of the standard. Good examples of these closed standards working in the computer field are DirectX and HDMI.
#66
arbiter
The Von MatricesWell, previously it was Microsoft that reined in proprietary software features by having the DirectX standard be dominant. Essentially, if it wasn't in DirectX, it wasn't used, because developers programmed for DirectX. This is exactly what happened with AMD's tessellator in the HD 2000-4000 series; it didn't get used because it wasn't part of the standard until DX11. Nvidia has always championed CUDA, but it didn't catch on either because it wasn't a DirectX standard. Now, with AMD championing Mantle and attempting to replace the use of DirectX on its own cards, this whole thing could break down, with developers programming two codepaths, one for CUDA and one for Mantle, but this is a discussion for another thread.
CUDA doesn't really ditch DirectX; CUDA, as part of PhysX, is just in-game tech and still needs DirectX. Mantle completely ditches DirectX. PhysX has a better chance of surviving since it adds physics; Mantle is going to be like the old OpenGL/DirectX options in games, back when the graphics were mostly the same either way. The problem is how few games will even use it outside the ones AMD writes a check for. I have also said this many times: it's a question of what stability issues Mantle will bring in from a game with small code errors. Will it cause a system-wide OS crash, like in the old Windows 9x days when programs had low-level access to hardware? It's still up for debate.