# Screen tearing? Is it bad?



## Funtoss (Mar 23, 2011)

Is screen tearing bad for the graphics card or monitor?


----------



## DanishDevil (Mar 23, 2011)

It's not bad for anything. It's basically throwing more frames at the monitor than it can display, but it doesn't hurt anything. It's a matter of preference. If you don't mind how it looks, keep vertical sync off. If it bothers you, turn it on.


----------



## Melvis (Mar 23, 2011)

Nope, not bad at all. I don't get any anyway, and in some games I get input lag if I turn vsync on, so mine is off in all games.


----------



## Funtoss (Mar 23, 2011)

Wow, okay, thank you very much.
I didn't really know what it meant, or whether it was good or bad.


----------



## DanishDevil (Mar 23, 2011)

It is a pretty violent-sounding term. It can end up being pretty annoying sometimes, though:


----------



## 2DividedbyZero (Mar 23, 2011)

I didn't think you could take a 'screenshot' of screen tearing (i.e. with vsync off), as it's the monitor that displays the 'tear', not the output from the gfx card. To me that looks like a fault with the card, but I could be wrong.




----------



## Bjorn_Of_Iceland (Mar 23, 2011)

It's bad for your eyes, because it looks bad.


----------



## DanishDevil (Mar 23, 2011)

2DividedbyZero said:


> I didn't think you could take a 'screenshot' of screen tearing (i.e. with vsync off), as it's the monitor that displays the 'tear', not the output from the gfx card. To me that looks like a fault with the card, but I could be wrong.



It very well could be. I just pulled that one from Google Images, but that is what it ends up looking like IRL.


----------



## crazyeyesreaper (Mar 23, 2011)

Oblivion is the easiest game to see screen tearing in. Run down a corridor with any bars, or stand at the arena gate before it drops; just turning the camera slowly shows tearing up the wazoo. It looks horrendous.

Some game engines don't tear that badly; in others it's extremely noticeable. If you don't mind tearing, leave vsync off; if you hate it, turn vsync on. Overall it doesn't matter, and it will never hurt the system.

Although, that said, if you're getting 300 FPS in games that don't need it, using vsync lets you hold 60 FPS on a 60 Hz monitor, for example, and it lowers the load on the GPU. At best it helps with image quality and lowers power consumption; at worst it adds input lag. It simply is what it is.

Example: say I'm playing Oblivion at 1920x1200 with a refresh rate of 60 Hz; that's 60 FPS the monitor can display. Since I hate screen tearing, I leave vsync on. With vsync on, my GPUs hover around 20% usage each (system specs to the left) and never go above 55°C. If I turn vsync off, I get 300+ FPS, my GPUs jump to 99% usage, I get screen tearing, and temps go to 75-85°C with the fans spinning up. Since I feel no input lag, vsync stays on; it's nicer on my power bill and nicer for my eyes.
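The capping effect described above can be sketched with a toy render loop (all numbers here are made up for illustration): with vsync, each finished frame stalls until the next refresh boundary before being presented, so a card that could render 300+ FPS only does about 60 frames' worth of work per second on a 60 Hz panel.

```python
def frames_per_second(render_cost_s, refresh_hz, vsync, duration_s=1.0):
    """Toy render loop: count frames completed in one simulated second.
    With vsync, presenting waits for the next vertical refresh, which
    caps the frame rate (and the work done) at the refresh rate."""
    interval = 1.0 / refresh_hz
    t = 0.0
    frames = 0
    while t < duration_s:
        t += render_cost_s                       # time spent rendering a frame
        if vsync:
            # stall until the next refresh boundary before presenting
            t = (int(t / interval) + 1) * interval
        frames += 1
    return frames

# A card that renders a frame in ~3 ms (over 300 FPS uncapped) on a 60 Hz panel:
uncapped = frames_per_second(0.003, 60, vsync=False)   # well over 300
capped = frames_per_second(0.003, 60, vsync=True)      # close to 60
```

This is why the GPU usage drops so much with vsync on: the card spends most of each refresh interval idle instead of rendering frames nobody will see.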


----------



## 2DividedbyZero (Mar 23, 2011)

DanishDevil said:


> It very well could be. I just google imaged it, but that is what it ends up looking like IRL.



Roger. For demonstration purposes only, gotcha.


----------



## qubit (Mar 23, 2011)

Funtoss said:


> Are screen tearing bad for the graphics card or monitor?



The tearing itself is only bad in that it looks crap, like Danish's screenshot above.

However, it's not that great for the card if it freewheels at a super high frame rate. When that happens, it stresses the power circuitry, which you can clearly hear.

Say you're running Fraps on a high-end system while playing an old game: you can easily hit 800 FPS or more, which will make the graphics card squeal like a banshee. That's your power circuitry being stressed, and it won't last long being run like that.

The moral of the story is to leave vsync on. There's really no point in turning it off for normal gameplay anyway. It must be turned off for benchmarking, though.

Also, I think the replies that DanishDevil and crazyeyesreaper gave you were really good.


----------



## HookeyStreet (Mar 23, 2011)

I hate horizontal tearing!!!!!!!!!!!!!


----------



## Funtoss (Mar 23, 2011)

Wow, that explains a lot!
I might leave vsync on from now on in some games, because I lag in other games if I leave it on (should I lower the video settings?).

Thanks for taking the time to answer my question.


----------



## brandonwh64 (Mar 23, 2011)

Counter-Strike 1.6 does this at 1000 FPS.


----------



## qubit (Mar 23, 2011)

Funtoss said:


> Wow, that explains a lot!
> I might leave vsync on from now on in some games, because I lag in other games if I leave it on (should I lower the video settings?).
> 
> Thanks for taking time out for answering my question



You're welcome. 

If you see lag, try setting the maximum pre-rendered frames to zero in the NVIDIA control panel. I don't know what the equivalent setting is for AMD.


----------



## Spectrum (Mar 23, 2011)

You have to do that in ATI Tray Tools for AMD cards.


----------



## Funtoss (Mar 25, 2011)

Thank you so much, guys! lol, I don't play with vsync now and I get MUCH BETTER PERFORMANCE!


----------



## qubit (Mar 25, 2011)

Funtoss said:


> Thank you so much, guys! lol, I don't play with vsync now and I get MUCH BETTER PERFORMANCE!



You don't get any more performance. :shadedshu The monitor only shows you a fixed frame rate, usually 60 Hz. Having the card unsynced only results in screen tearing and judder. Remember Danish's screenshot?
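The point above fits in one line: the monitor can only show frames up to its refresh rate, so anything the card renders beyond that never reaches you as extra smoothness. A trivial sketch:

```python
def visible_fps(render_fps, refresh_hz):
    """Frames the monitor can actually show each second. Anything the
    card renders above the refresh rate is never displayed in full;
    the surplus only shows up as tearing between partial frames."""
    return min(render_fps, refresh_hz)

# With vsync off on a 60 Hz panel, 300 rendered FPS still displays as 60:
print(visible_fps(300, 60))  # 60
# Below the refresh rate, you see exactly what the card can render:
print(visible_fps(45, 60))   # 45
```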


----------



## johnspack (Mar 25, 2011)

That's weird, I've never seen that tearing effect in 20 years of gaming, and I always force vsync off as well. Guess I've always used the right combo of video card and monitor? If I saw that effect, I'd probably think my card was dying. Glad I know now!


----------



## qubit (Mar 25, 2011)

johnspack said:


> That's weird, I've never seen that tearing effect in 20 years of gaming, and I always force vsync off as well. Guess I've always used the right combo of video card and monitor? If I saw that effect, I'd probably think my card was dying. Glad I know now!



You probably haven't noticed it. Also, it's quite transient and varies significantly. If a card is rendering at something like 150 FPS or more, you tend not to see it. The system is running unsynced, so the exact effect will vary moment by moment.

Judder and tearing tend to happen the most when the monitor refresh is 60 Hz and the card is rendering 70-90 FPS.

Also, I believe the design of the game engine itself can affect how much you see.


----------



## Swamp Monster (Mar 25, 2011)

qubit said:


> You don't get any more performance.



And I thought it was common knowledge that enabling vsync has a negative effect on performance. Even in games' readme files it's often written that if you get too low an FPS, it's recommended to disable vsync.


----------



## crazyeyesreaper (Mar 25, 2011)

Vsync doesn't affect performance, for Christ's sake, people. It just syncs the monitor with the card, nothing else. Sometimes it causes input lag or mouse lag, but otherwise there's no big issue.

40 FPS with vsync is 40 FPS without it; it doesn't matter. Anything over 60 can make things feel smoother in FPS games, but as people well know, while 300 FPS is great, some games get messed up at high frame rates, so it's a balancing act. I for one prefer image quality, so vsync is on in all games but Bad Company 2. When you play games like Oblivion, Fallout 3, New Vegas, and many other titles, the tearing can be so horrendous it makes your eyes bleed, like that image posted above. Imagine that every time you move, look around, or take an action, everything that moves the camera causes tearing. It's insane. Certain games benefit from no vsync for sure, but in most you'll never notice the difference, in terms of performance anyway.

It really depends on the game engine and the game.

Dead Space has mouse lag in menus with vsync, etc., but as mentioned before, Oblivion tears and has issues with the Gamebryo engine. It really depends on the title.


----------



## johnspack (Mar 25, 2011)

Why use vsync to reduce FPS? Crank up the quality settings instead. If the game doesn't go high enough, then use control panel settings to crank up AA or whatever. You'll never see tearing, that's the whole point! And the graphics rock!


----------



## crazyeyesreaper (Mar 25, 2011)

Oblivion still tears for me even at low FPS. And honestly, I have 2x 6970s; even Metro 2033 maxed out gets 70+ FPS for me. There literally isn't a damn game in existence I can't get more than 60 FPS in, lol, so just cranking settings doesn't cut it. Cranking the settings also means I tend to hit a VRAM limit before I hit an FPS drop. Granted, my situation is a bit different than most.


----------



## robal (Mar 25, 2011)

Funtoss said:


> Are screen tearing bad for the graphics card or monitor?



Technically, no.

But you could say it is, to some extent...
It may be straining for your eyes. It varies from person to person, but I hate the effect and play with vsync on whenever possible.

Another good reason for vsync is that, when the game is displaying a simpler scene, your PC won't throw 100% of its resources at rendering 1000 FPS.
Having FPS limited by vsync means your PC wastes less energy and runs quieter and cooler during the game.

Cheers,


----------



## qubit (Mar 25, 2011)

johnspack said:


> Why use vsync to reduce fps,  crank up quality settings instead.  If the game doesn't go high enough,  then add control panel settings to crank up aa or whatever.  You'll never see tearing,  that's the whole point!  And the graphics rock!



hmmm... If you haven't understood what vsync is all about by reading this thread, then I doubt that you'll understand any further explanation.


----------



## Mussels (Mar 25, 2011)

Vsync on can save you lots of power, heat, and noise, and even lengthen the lifespan of your hardware, since it's not being pushed so hard.

edit: robal said that right above me, but I'll say it anyway to repeat it.


----------



## Bjorn_Of_Iceland (Mar 25, 2011)

Mussels said:


> Vsync on can save you lots of power, heat and noise, and even lengthen the lifespan of your hardware since its not being pushed so hard.
> 
> edit: robal said that right above me, but i'll say it anyway to repeat it.



Say I switch to a 120 Hz monitor; will that make my rig consume more power?


----------



## Mussels (Mar 25, 2011)

Bjorn_Of_Iceland said:


> Say that I switch to a 120hz monitor, will that make my rig consume more power?



Yes, but still less than if it were running at 300 FPS with vsync off. The power savings only kick in if you can render an FPS higher than your refresh rate, with vsync on.


A good example is how I can get 300+ FPS in the menus of StarCraft II but dip to 30 in-game. In that case, running in the menus stresses my system far more and builds up a lot of completely unnecessary heat.


----------



## qubit (Mar 25, 2011)

Bjorn_Of_Iceland said:


> Say that I switch to a 120hz monitor, will that make my rig consume more power?



Yes, but the difference will be tiny and insignificant. Letting a high-end graphics card freewheel at 1000 FPS, however, will use much more power and, over time, damage the card.


----------



## Mussels (Mar 25, 2011)

qubit said:


> Yes, but the difference will be tiny and insignificant. Letting a high end graphics card freewheel at 1000fps however, will use much more power and over time, damage the card.



Not insignificant; it depends on the card. If he's using 20% of his GPU for 60 FPS in an old game and 40% for 120, then that's doubled the card's (not the system's!) power consumption. That might be an extra 40 W or something depending on the card (or double that in CrossFire/SLI).
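That arithmetic as a quick back-of-envelope sketch (the 200 W full-load figure is a made-up number for illustration, and it assumes draw scales roughly linearly with reported GPU usage, which is only an approximation):

```python
FULL_LOAD_W = 200.0  # hypothetical single-card full-load draw, for illustration

def card_draw_w(gpu_usage):
    # Crude assumption: power scales linearly with reported GPU usage.
    return gpu_usage * FULL_LOAD_W

# 20% usage at 60 Hz vs 40% usage at 120 Hz:
extra_per_card = card_draw_w(0.40) - card_draw_w(0.20)
print(extra_per_card)        # 40.0 W extra per card
print(2 * extra_per_card)    # 80.0 W with two cards in CrossFire/SLI
```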


Regardless, vsync on is still the best option. Why run your hardware at 100% when it's not needed?


----------



## qubit (Mar 25, 2011)

Mussels said:


> Not insignificant; it depends on the card. If he's using 20% of his GPU for 60 FPS in an old game and 40% for 120, then that's doubled the card's (not the system's!) power consumption. That might be an extra 40 W or something depending on the card (or double that in CrossFire/SLI).



I still have a feeling the difference will be pretty small, but I see where you're coming from with your explanation. I think it would be interesting to measure it, rather than just speculating. Then we'd know properly.



Mussels said:


> Regardless, vsync on is still the best option. Why run your hardware at 100% when it's not needed?



Well, duh! Of course. Why do some people find this so hard to understand, sitting there all stupidly happy with their graphics card freewheeling unsynchronized, with all the drawbacks that has? :shadedshu

It's just the same as the clueless users at work who insist on running their LCD monitors at non-native resolutions and will fight you tooth and nail when you try to set it properly for them. These people are so fail.


----------

