# A 1 ms GTG monitor vs. a G-Sync enabled monitor



## Deleted member 138597 (Dec 11, 2013)

Is the smoothness of a 1 ms GTG monitor equal to the smoothness of a G-Sync enabled monitor? If it is, or very nearly so, then that's good news for AMD fans. Also, will a 1440p monitor be better or worse than a 1080p monitor for desktop use? The monitor sits quite close to the eyes, so will a 27" monitor be too awkward?


----------



## BiggieShady (Dec 11, 2013)

Grey-to-grey time has nothing to do with syncing the monitor's refresh rate to the game's frame rate, which is what G-Sync does.


----------



## Deleted member 138597 (Dec 11, 2013)

Which is better, anyway?


----------



## BiggieShady (Dec 11, 2013)

If you play games that run at 60 fps all the time on a 60 Hz monitor (or 120 fps on a 120 Hz monitor), then G-Sync will make no difference; you are better off with a faster response time. If the frame rate dips below 60, then G-Sync will provide a much better experience.
In any case, even 6 ms GTG is fine for most games.
... and btw, in order to use G-Sync you'll need an Nvidia card from the 600 series at least, a 650 Ti Boost or better.
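The interaction between frame rate and a fixed refresh rate can be sketched with a toy frame-pacing model (illustrative only; real drivers buffer frames, and the numbers assume an idealized 60 Hz scanout with no other delays):

```python
import math

def displayed_intervals(render_ms, refresh_hz, frames):
    """On a fixed-refresh screen, a finished frame is shown at the next tick."""
    period = 1000.0 / refresh_hz
    shown = []
    t = 0.0
    for _ in range(frames):
        t += render_ms                                # frame finishes rendering here
        shown.append(math.ceil(t / period) * period)  # displayed at the next refresh tick
    return [round(b - a, 2) for a, b in zip(shown, shown[1:])]

# A steady 40 fps renderer (25 ms/frame) on a fixed 60 Hz display:
print(displayed_intervals(25.0, 60, 8))
# -> [16.67, 33.33, 16.67, 33.33, 16.67, 33.33, 16.67]  (visible judder)
# A variable-refresh display would instead show every frame for 25 ms.
```

Even though the renderer is perfectly steady, the fixed tick grid forces alternating 16.67/33.33 ms display intervals, which is exactly the below-refresh-rate case where G-Sync helps.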


----------



## brandonwh64 (Dec 11, 2013)

BiggieShady said:


> Grey-to-grey time has nothing to do with syncing the monitor's refresh rate to the game's frame rate, which is what G-Sync does.



Completely off topic, but how did you get an animated avatar?


----------



## Deleted member 138597 (Dec 11, 2013)

BiggieShady said:


> If you play games that run at 60 fps all the time on a 60 Hz monitor (or 120 fps on a 120 Hz monitor), then G-Sync will make no difference; you are better off with a faster response time. If the frame rate dips below 60, then G-Sync will provide a much better experience.
> In any case, even 6 ms GTG is fine for most games.
> ... and btw, in order to use G-Sync you'll need an Nvidia card from the 600 series at least, a 650 Ti Boost or better.


What if I want to go for an AMD card for its lower price (e.g. an R9 290X) over an Nvidia one? Would G-Sync then be a waste of money? And why should I choose a relatively expensive Nvidia card except for G-Sync? Will AMD bring similar tech later, or will Nvidia be kind enough to open it up to its competitor?


----------



## ne6togadno (Dec 11, 2013)

That g-sync BS. It is as useful as physics.
What you need for gaming is a monitor with low input lag (check this).
A 1440p monitor will give you about 78% more pixels than 1080p, so your desktop area will be about 78% bigger.
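As a quick sanity check on the pixel arithmetic (2560x1440 vs 1920x1080):

```python
# 2560x1440 vs 1920x1080: how many more pixels?
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_1080p = 1920 * 1080   # 2,073,600
extra = pixels_1440p / pixels_1080p - 1
print(f"{extra:.0%}")  # 78% more pixels (about 33% more in each dimension)
```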


----------



## Frick (Dec 11, 2013)

ne6togadno said:


> That g-sync BS. It is as useful as physics.
> What you need for gaming is a monitor with low input lag (check this).
> A 1440p monitor will give you about 78% more pixels than 1080p, so your desktop area will be about 78% bigger.



So you've seen one irl?????


----------



## BiggieShady (Dec 11, 2013)

brandonwh64 said:


> Completely off topic, but how did you get an animated avatar?



I have seen someone with an animated avatar (tatty_one?) and tried some GIFs ... apparently GIFs under 50 kB that don't get automatically resized (under 200x200) work fine ... trial and error.


----------



## W1zzard (Dec 11, 2013)

Frick said:


> So you've seen one irl?????



I have and it's a HUGE difference (when running between 35 and 59 FPS). As others have mentioned before, it has nothing to do with pixel response time, but low response time = less blur. G-Sync = less stuttering.

Not sure if it's worth the price point though.


----------



## ne6togadno (Dec 12, 2013)

Frick said:


> So you've seen one irl?????


Nope, I haven't. I haven't seen physics either.
Oh wait, there was a game that asked me for physics - Two Worlds, if I remember well. I haven't seen anything different in it.
Oh, and I have seen the Nvidia ad about physics in Metro: Last Light. It showed that when you shoot at walls, concrete particles fly around. Very impressive. When I play games, my only goal is to shoot at walls so I can enjoy how cool the flying concrete particles look.



W1zzard said:


> I have and it's a HUGE difference (when running between 35 and 59 FPS). As others have mentioned before, it has nothing to do with pixel response time, but low response time = less blur. G-Sync = less stuttering.
> 
> Not sure if it's worth the price point though.


If I have problems with frame rates in a game, I would rather invest in a better graphics card (which will also let me use my rig longer), or drop video settings if I can't afford an upgrade atm, than pay for something that will limit my choice when upgrading the video card later.


----------



## Deleted member 138597 (Dec 16, 2013)

Thank you, everyone, for helping me figure this out. I really appreciate your suggestions. Thanks again for your kind support.


----------



## Tatty_One (Dec 16, 2013)

brandonwh64 said:


> Completely off topic, but how did you get an animated avatar?


Probably the same way you did


----------



## brandonwh64 (Dec 16, 2013)

Tatty_One said:


> Probably the same way you did



LOL late to the party TATTY hahaha


----------



## FX-GMC (Dec 16, 2013)

ne6togadno said:


> Nope, I haven't. I haven't seen physics either.
> Oh wait, there was a game that asked me for physics - Two Worlds, if I remember well. I haven't seen anything different in it.
> Oh, and I have seen the Nvidia ad about physics in Metro: Last Light. It showed that when you shoot at walls, concrete particles fly around. Very impressive. When I play games, my only goal is to shoot at walls so I can enjoy how cool the flying concrete particles look.



Ah, you mean PhysX.

Physics is a science.


----------



## ne6togadno (Dec 17, 2013)

FX-GMC said:


> Ah, you mean PhysX.
> 
> Physics is a science.


Ya.
Not my fault Nvidia can't spell.


----------



## kn00tcn (Dec 17, 2013)

Ageia... it was PhysX before Nvidia bought it; level up your anti-NV knowledge.

G-Sync is hardly BS; it's the future, if there's a new public standard for DisplayPort. LCDs have spent too long stuck at fixed refresh rates, acting like CRTs to the OS.


----------



## xenocide (Dec 17, 2013)

ne6togadno said:


> Nope, I haven't. I haven't seen physics either.
> Oh wait, there was a game that asked me for physics - Two Worlds, if I remember well. I haven't seen anything different in it.
> Oh, and I have seen the Nvidia ad about physics in Metro: Last Light. It showed that when you shoot at walls, concrete particles fly around. Very impressive. When I play games, my only goal is to shoot at walls so I can enjoy how cool the flying concrete particles look.


 
You clearly miss the point of PhysX. It's not there to make shooting walls pretty; it's there to add realistic physics simulations to games. Anything that can add realism adds depth to the game and makes you less aware that you're playing in a fictional world. Also, you have an AMD video card, which means PhysX doesn't run properly on your system; it's emulated on the CPU using an instruction set created by cavemen.



ne6togadno said:


> If I have problems with frame rates in a game, I would rather invest in a better graphics card (which will also let me use my rig longer), or drop video settings if I can't afford an upgrade atm, than pay for something that will limit my choice when upgrading the video card later.


 
So you can run every game on the market at absolute max settings and maintain a constant 60 fps?  I call BS looking at those system specs.


----------



## Cotton_Cup (Dec 17, 2013)

Well, if you're getting high FPS at 1080p, then I suggest just getting a new monitor, like a 1440p one; that way your FPS will come down a bit. I like IPS more than TN, and so far I'm not sure there are any 120 Hz IPS monitors. But I prefer going to a bigger resolution over 1080p at 120 Hz, since I like big resolutions myself.


----------



## qubit (Dec 17, 2013)

W1zzard said:


> I have and it's a HUGE difference (when running between 35 and 59 FPS). As others have mentioned before, it has nothing to do with pixel response time, but low response time = less blur. G-Sync = less stuttering.
> 
> Not sure if it's worth the price point though.


The way I understand it, the gold standard, as it were, is to have the card locked at 120 Hz vsync with no dropped frames. G-Sync helps massively in situations where the framerate cannot maintain 120 fps. I'll likely end up buying this when they bring out a 27" G-Sync monitor and I have the appropriate graphics card.


----------



## xenocide (Dec 17, 2013)

Cotton_Cup said:


> Well, if you're getting high FPS at 1080p, then I suggest just getting a new monitor, like a 1440p one; that way your FPS will come down a bit. I like IPS more than TN, and so far I'm not sure there are any 120 Hz IPS monitors. But I prefer going to a bigger resolution over 1080p at 120 Hz, since I like big resolutions myself.


 
IPS panels are better for image quality, but they tend to fall a bit short on response time. I know there was a 120 Hz IPS panel floating around (it made the TPU news feed), but I'm not sure it was ever released outside of Korea or China (wherever it was made). 120 Hz isn't all it's cracked up to be; then again, when I play on monitors that only offer 60 Hz it seems less enjoyable...


----------



## ne6togadno (Dec 17, 2013)

kn00tcn said:


> Ageia... it was PhysX before Nvidia bought it; level up your anti-NV knowledge.


?!? 



kn00tcn said:


> G-Sync is hardly BS; it's the future, if there's a new public standard for DisplayPort...


Ugh... what are you trying to say? What is the connection between a new DP standard and G-Sync?



xenocide said:


> You clearly miss the point of PhysX. It's not there to make shooting walls pretty; it's there to add realistic physics simulations to games. Anything that can add realism adds depth to the game and makes you less aware that you're playing in a fictional world.


Really?
I remember Valve said the same thing about "realistic physics simulations" back when Source was revealed, but that still didn't stop Crytek and Epic from making their own engines.
It may work for you, but I'd pity the game that needs to count on "realistic physics simulation" to keep immersion.
One can make realistic physics simulations with any current game engine; it is just a question of willingness and time + money. Physics simulation doesn't start and end with PhysX. PhysX and its SDK make it easier, but it isn't THE ONE AND ONLY condition for having physics simulations.
If you enjoy the flying particles so much, pay extra for your PhysX card. It's your money; you decide how to spend it. That doesn't change the fact that PhysX is the last reason to buy an Nvidia card.
I personally don't care how heads explode, as long as there are enough heads to shoot at, so when I look for a new video card I'd rather look at the price/performance ratio in my price range than at secondary features that are implemented in just a few games.



xenocide said:


> Also, you have an AMD video card, which means PhysX doesn't run properly on your system; it's emulated on the CPU using an instruction set created by cavemen.


Ever thought that I might own (or have owned) Nvidia video cards too?



xenocide said:


> So you can run every game on the market at absolute max settings and maintain a constant 60 fps?  I call BS looking at those system specs.


I couldn't care less what you call it.
I haven't said anything about my current specs so far, so I don't see how my specs bear on G-Sync, or why they should be a deciding factor for buying a video card and/or monitor. Are my system specs and game settings your strongest argument for G-Sync?
I run all the games I like at settings that make them look the way I like, with a frame rate that lets me play without lag, stuttering, or any other negative "effects" you may think of. In fact, I quite often lower max settings intentionally, because those "realistic effects" are implemented in a way that, for me, actually breaks gameplay instead of adding immersion.


 edit:


qubit said:


> The way I understand it, the gold standard as it were is to have the card locked at 120Hz vsync with no dropped frames? G-Sync helps massively with the situation where the framerate cannot maintain 120fps. I'll likely end up buying this when they bring out a 27" G-Sync monitor and I have the appropriate graphics card.


You can't lock a card that can't keep a constant 60+ fps at 120 fps. G-Sync lowers the monitor's refresh rate to match the video card's fps, so the screen isn't re-showing one and the same frame on fixed ticks while the card renders the next one. That creates the illusion of fluid, stutter-free motion. G-Sync won't boost your card's fps, so if your card can't reach a constant 30+ fps, your game will still lag or stutter (call it whatever you like), and the only thing that can fix that is a better video card.
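The "same frame shown again" effect described in the post above can be counted with a small sketch (a toy model; the 25 fps / 60 Hz numbers are made up for illustration):

```python
import math

def ticks_per_frame(render_ms, refresh_hz, frames=6):
    """How many refresh ticks each frame stays on a fixed-refresh screen
    before the next rendered frame replaces it."""
    period = 1000.0 / refresh_hz
    finish = [render_ms * (i + 1) for i in range(frames + 1)]   # render completion times
    first_tick = [math.ceil(t / period) for t in finish]        # first tick showing each frame
    return [b - a for a, b in zip(first_tick, first_tick[1:])]

# 25 fps (40 ms/frame) on a 60 Hz panel: frames linger for 2 or 3 ticks each,
# and that uneven 2/3 cadence is what reads as stutter.
print(ticks_per_frame(40.0, 60))
# A variable-refresh (G-Sync) panel would instead refresh once per rendered frame, evenly.
```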


----------



## Chetkigaming (Dec 17, 2013)

1-2 ms 120 Hz monitors are very sweet; buy one of them, or wait 4-5 months for G-Sync. There's no other way if you have money and love gaming. IPS or some other "1600p bomb" is shit compared to 1 ms / 120 Hz. My advice is not to go for higher resolutions for at least a few years; there's no good reason to lose FPS for some hyper-saturated colors.


----------



## Aithos (Dec 17, 2013)

xenocide said:


> You clearly miss the point of PhysX. It's not there to make shooting walls pretty; it's there to add realistic physics simulations to games. Anything that can add realism adds depth to the game and makes you less aware that you're playing in a fictional world. Also, you have an AMD video card, which means PhysX doesn't run properly on your system; it's emulated on the CPU using an instruction set created by cavemen.
> 
> 
> 
> So you can run every game on the market at absolute max settings and maintain a constant 60 fps?  I call BS looking at those system specs.


 
I can!


----------



## Aithos (Dec 17, 2013)

xenocide said:


> IPS panels are better for image quality, but they tend to fall a bit short on response time. I know there was a 120 Hz IPS panel floating around (it made the TPU news feed), but I'm not sure it was ever released outside of Korea or China (wherever it was made). 120 Hz isn't all it's cracked up to be; then again, when I play on monitors that only offer 60 Hz it seems less enjoyable...


 
There is no 120 Hz IPS panel; the Korean panels are being overclocked, because they use a very basic PCB that is so dumb it doesn't even prevent you from sending a 120 Hz signal it isn't rated for. They also only work as gaming monitors because they lack a scaler and have only a single input, which means that, unlike normal 1440p IPS panels, they don't have input lag.

As for the previous comment about 120 fps and VSYNC: even if you run a solid 120 fps, VSYNC is always a bad idea, because it introduces significant processing lag into your gameplay. It's not a big deal for single-player games, but it's virtually unplayable for multiplayer. If you can't tell the difference, then you're already hopeless; it's really, really noticeable. I recently got my 1440p monitor and overclocked it, and for some reason switching resolutions enabled VSYNC. It took me about 10 minutes to figure out why I was lagging so badly in CS:GO on my beast of a computer; then I noticed VSYNC was on, and the second I turned it off everything was perfect again.

G-Sync also introduces lag; it's just much smarter than VSYNC, so it's a lot less lag. Personally, I'll make sure I'm always running 120 fps and just leave it off, even if I have a monitor capable of using it. Any processing you do introduces lag... more inputs, more lag; more scaling, you guessed it, more lag. Lag = evil. Don't have lag!
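The vsync-lag argument can be put in rough numbers with a back-of-the-envelope model (this only counts the wait for the next refresh tick and ignores driver buffering, scaler delay, etc.):

```python
def vsync_wait_ms(finish_ms, refresh_hz):
    """With vsync on a fixed-refresh display, a frame that finishes rendering
    at finish_ms waits here until the next refresh tick before being shown."""
    period = 1000.0 / refresh_hz
    return (period - finish_ms % period) % period

# Worst case: the frame finishes just after a tick and waits nearly a full
# refresh period; doubling the refresh rate halves that wait.
print(round(vsync_wait_ms(0.1, 60), 2))   # ~16.57 ms at 60 Hz
print(round(vsync_wait_ms(0.1, 120), 2))  # ~8.23 ms at 120 Hz
```

This is one reason a high fixed refresh rate already shrinks the penalty being argued about here, with or without any sync technology.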


----------



## Sasqui (Dec 17, 2013)

qubit said:


> The way I understand it, the gold standard, as it were, is to have the card locked at 120 Hz vsync with no dropped frames. G-Sync helps massively in situations where the framerate cannot maintain 120 fps. I'll likely end up buying this when they bring out a 27" G-Sync monitor and I have the appropriate graphics card.



I abhor the idea of having to purchase an NV product to take advantage of another product made by NV... reminds me of what they did with SLI for so many years... greed. Yes, I'm not arguing that; they do make great products.


----------



## kn00tcn (Dec 17, 2013)

ne6togadno said:


> ?!?



I'm just recommending you get more familiar with what you're going to be debating in general; otherwise it ends up looking foolish if you can't even get the name right, or the reason it exists (PhysX started as an add-on card by an unrelated company; that's quite a big difference to the situation).



ne6togadno said:


> ugh... what are you trying to say. what is connection between new dp standard and g-sync?



The concept of G-Sync is the future; there is no reason for an LCD to run at a fixed refresh rate at all times, so there needs to be pressure on VESA, monitor makers, AMD, and so on to update the display standards to get around the limitations of vsync when the framerate is below the refresh rate (of course, having it Nvidia-exclusive, with only about 3 expensive monitor models, is not a solution; it's just baby steps for now).

I'll come out right now and say that I'm not pro-Nvidia and have not been since early 2008; I just don't like the attitude and business practices. But that doesn't mean G-Sync is a bad idea, or that the engineers/employees/programmers are bad people.

There's also something to note about PhysX... it's simply a library; there aren't that many GPU-accelerated (and thus Nvidia-locked) games out there compared to the hundreds and hundreds of CPU-based and console-based PhysX-enabled games.

That fact alone, that devs don't have to worry about coding a physics engine, that it's built into Unreal 3 for example, that it's multiplatform, makes it just another powerful tool for making games.

Very few issues are clear cut.

For our closing thought of the day: the irony of 'The Way It's Meant to Be Played' launching together with the failure of the GeForce 5 series is pretty hilarious.


----------



## Aithos (Dec 18, 2013)

kn00tcn said:


> I'm just recommending you get more familiar with what you're going to be debating in general; otherwise it ends up looking foolish if you can't even get the name right, or the reason it exists (PhysX started as an add-on card by an unrelated company; that's quite a big difference to the situation).
> 
> 
> 
> ...


 
I disagree with the bolded statement. Any kind of "syncing" technology introduces lag; as long as the refresh rate is fast enough and you can produce enough frames, there is no reason for any kind of "sync" technology. G-Sync has much lower lag than V-Sync, but it's still there, and the recent review from AnandTech does a good job of explaining it. Any time you're doing processing on a signal (which syncing frames is), you're introducing *some* lag. Lag is evil; I don't want lag. A 120 Hz or higher LightBoost IPS would be MUCH better than a 1440p G-Sync monitor...

I don't think G-Sync is a bad idea, but I will always want a way to turn it off and make sure it won't affect my performance. Unless they can completely *eliminate* the lag (which I question is even possible), I don't want it. I'd rather have a powerful system and keep my framerates high...


----------



## kn00tcn (Dec 19, 2013)

Aithos said:


> I disagree with the bolded statement. Any kind of "syncing" technology introduces lag; as long as the refresh rate is fast enough and you can produce enough frames, there is no reason for any kind of "sync" technology. G-Sync has much lower lag than V-Sync, but it's still there, and the recent review from AnandTech does a good job of explaining it. Any time you're doing processing on a signal (which syncing frames is), you're introducing *some* lag. Lag is evil; I don't want lag. A 120 Hz or higher LightBoost IPS would be MUCH better than a 1440p G-Sync monitor...
> 
> I don't think G-Sync is a bad idea, but I will always want a way to turn it off and make sure it won't affect my performance. Unless they can completely *eliminate* the lag (which I question is even possible), I don't want it. I'd rather have a powerful system and keep my framerates high...



Alright, so the future standard should also be optional, as that's another thing I highly value: user control.

This way the user can use a fixed refresh mode or a framerate-based refresh mode, and some new card+cable combo can support older legacy monitors with the sync feature disabled. This is tech standards done right, similar to USB 3, PCIe 3, SATA 3.


----------



## Aithos (Dec 20, 2013)

kn00tcn said:


> Alright, so the future standard should also be optional, as that's another thing I highly value: user control.
> 
> This way the user can use a fixed refresh mode or a framerate-based refresh mode, and some new card+cable combo can support older legacy monitors with the sync feature disabled. This is tech standards done right, similar to USB 3, PCIe 3, SATA 3.


 
Yep, as long as I'm able to turn it off with no performance impact from it being included in the monitor, knock yourself out...

I'm not against something that will help people on the lower end of the FPS curve have a more playable experience. I think it's a great idea, and everything I've heard says it looks AWESOME. Maybe the lag will be unnoticeable and I'd even see a benefit at high FPS; I haven't gotten to see one in person, so I can't say. Unfortunately, the trend is all matte-finish monitors, and I can't stand them... so even when they start launching, it will probably be quite some time before a model I would consider becomes available. I hope I'm wrong, though, since I'm still very interested in seeing it...


----------



## kn00tcn (Dec 21, 2013)

Aithos said:


> Unfortunately, the trend is all matte-finish monitors, and I can't stand them.



Are you saying you prefer glossy? I'm the opposite; I can't stand reflections.


----------



## Aithos (Dec 23, 2013)

kn00tcn said:


> Are you saying you prefer glossy? I'm the opposite; I can't stand reflections.


 
Yes, I prefer glossy.  I get headaches and eye strain from monitors with heavy matte finishes, which is nearly all of them these days.  No one makes a semi-glossy finish anymore and it's really annoying.  Reflections are only an issue if you have a lighting source directly behind your monitor.  I don't even notice reflections on mine, but I do notice it's nice and crisp and doesn't sparkle like the matte IPS I have at work...


----------



## qubit (Dec 23, 2013)

Aithos said:


> Yes, I prefer glossy.  I get headaches and eye strain from monitors with heavy matte finishes, which is nearly all of them these days.  No one makes a semi-glossy finish anymore and it's really annoying.  Reflections are only an issue if you have a lighting source directly behind your monitor.  I don't even notice reflections on mine, but I do notice it's nice and crisp and doesn't sparkle like the matte IPS I have at work...


I hate the sparkly effect too, really detracts from a sharp picture.


----------



## HM_Actua1 (Apr 17, 2014)

qubit said:


> I hate the sparkly effect too, really detracts from a sharp picture.



G-Sync any day of the week. Not to mention that the G-Sync monitor will be 1 ms, at least per the specs I've seen and the Asus G-Sync monitor I have.


----------



## Deleted member 138597 (Apr 24, 2014)

When will that FreeSync from AMD be coming? And I don't see any monitors with G-Sync except ASUS. Did they stop production?


----------



## johnspack (Apr 24, 2014)

1 ms monitors? When did they come out with those?


----------



## Xzibit (Apr 24, 2014)

Hitman_Actual said:


> G-Sync any day of the week. Not to mention that the G-Sync monitor will be 1 ms, at least per the specs I've seen and the Asus G-Sync monitor I have.



Wow, are you on a mission to resurrect every old thread that has to do with G-Sync?

Can't wait for the next thread you spam.



Shamonto Hasan Easha said:


> When will that FreeSync from AMD be coming? And I don't see any monitors with G-Sync except ASUS. Did they stop production?



There was no retail monitor, just a *do-it-yourself kit* made for the ASUS VG248QE, and it isn't being sold any more.
Monitors are supposed to be available in Q2 2014.



johnspack said:


> 1 ms monitors? When did they come out with those?



A couple of companies have them. They're all TN panels.


----------



## johnspack (Apr 24, 2014)

Oh come on, when did irony die? ... What kind of monitor do I have?


----------



## HM_Actua1 (Apr 24, 2014)

I'm running the *G-Sync ASUS VG248QE, and it's 1 ms. The ROG Swift will be 1 ms GTG as well.*


----------

