
GSYNC lowering GPU Usage Resulting in Lower FPS?

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.14/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
What's the problem?

I think his issue is that his monitor is 144Hz, so he is under the impression that he should be seeing the maximum possible framerate as long as it is under 144FPS. But that isn't how Gsync works. It still has to do some syncing with the monitor, and that causes an overhead that reduces framerates.

Gsync is designed to minimize the delay from when the frame is rendered and the monitor displays it. This reduces tearing while also reducing input lag.

Compare this to vsync, where the GPU might render a frame, but then have to wait before sending it to the monitor so that it syncs with the static refresh rate of the monitor. This introduces a lot of input lag.
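To put rough numbers on the difference, here's a toy Python sketch. The 60Hz refresh and the frame timing are made-up figures, and this is obviously not how the driver actually works; it just illustrates where the waiting comes from:

```python
# Toy model: delay from "frame finished rendering" to "frame on screen".
# Figures are illustrative only, not measured from real hardware.
import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # fixed refresh interval, ~16.7 ms

def vsync_wait(finish_ms):
    """With vsync, a finished frame waits for the next fixed refresh tick."""
    next_tick = math.ceil(finish_ms / TICK_MS) * TICK_MS
    return next_tick - finish_ms

def gsync_wait(finish_ms):
    """With Gsync, the monitor refreshes as soon as the frame is done
    (ignoring the small sync overhead mentioned above)."""
    return 0.0

finish = 17.7  # a frame that finishes ~1 ms after a refresh tick
print(f"vsync added wait: {vsync_wait(finish):.1f} ms")  # ~15.6 ms
print(f"gsync added wait: {gsync_wait(finish):.1f} ms")  # 0.0 ms
```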

But this whole thread is the perfect practical example of why I'm convinced Gsync is worthless for any high end rig. OP has also concluded that Gsync is effectively pointless right there in the TS. Too bad it took a purchase of it to get there ;)

I disagree, but agree. I think it depends on whether you are sensitive to input lag or not. If you are a person that notices input lag, then Gsync and Freesync are great. But for most people who aren't sensitive to input lag and have a high end rig paired with a high refresh rate monitor, adaptive vsync is the better (and way cheaper) option in my opinion.
 
Joined
Sep 17, 2014
Messages
22,006 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I think his issue is that his monitor is 144Hz, so he is under the impression that he should be seeing the maximum possible framerate as long as it is under 144FPS. But that isn't how Gsync works. It still has to do some syncing with the monitor, and that causes an overhead that reduces framerates.

Gsync is designed to minimize the delay from when the frame is rendered and the monitor displays it. This reduces tearing while also reducing input lag.

Compare this to vsync, where the GPU might render a frame, but then have to wait before sending it to the monitor so that it syncs with the static refresh rate of the monitor. This introduces a lot of input lag.



I disagree. If you are a person that notices input lag, then Gsync and Freesync are great. But for most people who aren't sensitive to input lag and have a high end rig paired with a high refresh rate monitor, adaptive vsync is the better (and way cheaper) option in my opinion.

I feel Adaptive Sync and Fast Sync together pretty much cover all the bases. The real advantage of Gsync or FreeSync is below 60 fps, and if you have a high end rig that can't reliably push 60, just tweak a few settings until it can: a stable end result at no cost. Above all, you don't have to deal with the myriad of issues surrounding Gsync in the first place, and you still get to use monitor strobe functionality to improve motion resolution and reduce blur.
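Here's a minimal sketch of the Fast Sync idea, assuming a GPU holding a steady 200 fps into a 144Hz panel (both figures made up for illustration): the GPU renders flat out into spare buffers, each refresh scans out the newest completed frame, and the surplus is simply dropped.

```python
# Toy model of Fast Sync: the GPU renders as fast as it can, and each
# monitor refresh scans out the newest *completed* frame. Surplus frames
# are discarded, so there is no tearing and the GPU is never made to wait.
GPU_FPS = 200.0      # assumed unthrottled render rate
REFRESH_HZ = 144.0   # monitor refresh rate

# completion times (seconds) of each rendered frame over one second
done = [i / GPU_FPS for i in range(1, int(GPU_FPS) + 1)]

shown, last = 0, -1
for r in range(1, int(REFRESH_HZ) + 1):
    refresh_t = r / REFRESH_HZ
    # index of the newest frame fully rendered before this refresh
    newest = max(i for i, t in enumerate(done) if t <= refresh_t)
    if newest != last:
        shown, last = shown + 1, newest

print(f"frames rendered:  {len(done)}")          # 200
print(f"frames displayed: {shown}")              # 144
print(f"frames discarded: {len(done) - shown}")  # 56
```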

If you are highly sensitive to input lag, you'd be making sure you have as little FPS variance as possible, because that in itself is input lag at the most basic level. You'd also want the highest possible minimum fps. Gsync doesn't really fit in there; its only added purpose is to eliminate tearing, which, on a high refresh monitor at 60+ fps, hardly ever happens anyway.
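If you want to put numbers on that, log frame times and look at the spread and the slowest frames rather than the average; the frame times below are made up for illustration:

```python
# Judge smoothness by frame-time spread and the worst frames ("1% lows"),
# not by average fps. Frame times here are invented sample data.
import statistics

frametimes_ms = [6.9, 7.0, 7.1, 6.8, 7.0, 14.2, 7.1, 6.9, 7.0, 13.8]

avg_fps = 1000 / statistics.mean(frametimes_ms)
spread = statistics.pstdev(frametimes_ms)  # pacing consistency
worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
low_fps = 1000 / statistics.mean(worst)    # "1% low" fps

print(f"average fps: {avg_fps:.0f}")         # ~119, looks fine on paper
print(f"frame-time stdev: {spread:.1f} ms")  # big spread = felt stutter
print(f"1% low fps: {low_fps:.0f}")          # ~70, what you actually feel
```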
 
Joined
Aug 29, 2005
Messages
7,206 (1.03/day)
Location
Stuck somewhere in the 80's Jpop era....
System Name Lynni PS \ Lenowo TwinkPad L14 G2
Processor AMD Ryzen 7 7700 Raphael \ i5-1135G7 Tiger Lake-U
Motherboard ASRock B650M PG Riptide Bios v. 2.02 AMD AGESA 1.1.0.0 \ Lenowo BDPLANAR Bios 1.68
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Asus GeForce RTX™ 4070 Dual OC GPU: 2325-2355 MEM: 1462| Intel® Iris® Xe Graphics
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB
Display(s) LG UltraGear 27GP850-B 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p
Case Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi
Software Win11 Pro 23H2 UK | Win11 LTSC UK / Arch (Fan)
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
Correct me if I am wrong, but G-Sync helps match the frames between the monitor and the GPU so no tearing is shown. Of course it will be a limitation at some point, but when something is helping synchronize the frames sent out from your graphics card with what's shown on your monitor, your GPU will naturally have less work to do, depending on the game you are running.

I have noticed that when playing Overwatch I actually got a better average fps with my setup after getting a G-Sync monitor.
 
Joined
Apr 6, 2017
Messages
8 (0.00/day)
Location
Bakersfield, CA
System Name Protoss V2
Processor Intel i7-7700K @ 4.9 GHz
Motherboard Asus Maximus IX Hero Z270
Cooling NH-D15
Memory Corsair Vengeance LPX 16GB 3 GHz
Video Card(s) Nvidia GTX 1080FE SLI @ 2.1 GHz
Storage Samsung 840 Evo 500 GB; WD Black 1 TB x2 RAID 0
Display(s) Asus ROG Swift PG278Q
Case Corsair Air 540 White
Audio Device(s) N/A
Power Supply EVGA SuperNOVA 850G2
Mouse Corsair M65 RGB
Keyboard Corsair K95 RGB
Software Windows 10 Pro
This is exactly what GSync is supposed to do... lower the framerate (and by proxy, GPU utilization if your GPUs are overpowered) to match the monitor's refresh rate, and conversely, to lower the monitor's refresh rate to match the GPU's framerate if the framerate dips below the monitor's max refresh rate.

What's the problem?

I guess I was just being too paranoid about my GPU utilization being lower than what it should be at my monitor's max refresh rate. As I've stated, with GSYNC off, both cards can achieve 99% utilization. But not when GSYNC is enabled.

If you're being realistic, when you can hit above 100 FPS consistently with any setup, Gsync becomes a problem rather than a solution.

Remove Gsync, use Fast Sync, and you're tear free and maximizing FPS. If your FPS fluctuates too much, use Gsync and accept what it does for you :)

But this whole thread is the perfect practical example of why I'm convinced Gsync is worthless for any high end rig. OP has also concluded that Gsync is effectively pointless right there in the TS. Too bad it took a purchase of it to get there ;)

Yeah, I'm never buying another GSYNC panel again. I'm just going to wait for HDMI 2.1 to arrive and let that fix the problem. I've had this monitor for a year now and I never saw this problem with Maxwell (GTX 980 SLI). But next time I'm thinking of just buying an ultrawide 1440P monitor.

Correct me if I am wrong, but G-Sync helps match the frames between the monitor and the GPU so no tearing is shown. Of course it will be a limitation at some point, but when something is helping synchronize the frames sent out from your graphics card with what's shown on your monitor, your GPU will naturally have less work to do, depending on the game you are running.

I have noticed that when playing Overwatch I actually got a better average fps with my setup after getting a G-Sync monitor.

Therein lies my problem. When I have GSYNC enabled, both my 1080s' utilization is just around 80%-90%, but when GSYNC is disabled, both 1080s are pegged at 99% usage in any game. I mean it would dip to 95%, but that's about it. I just don't understand that part lol. But it seems as though it is perfectly normal.
 
Joined
Mar 11, 2009
Messages
1,778 (0.31/day)
Location
Little Rock, AR
System Name Gamer
Processor AMD Ryzen 3700x
Motherboard AsRock B550 Phantom Gaming ITX/AX
Memory 32GB
Video Card(s) ASRock Radeon RX 6800 XT Phantom Gaming D
Case Phanteks Eclipse P200A D-RGB
Power Supply 800w CM
Mouse Corsair M65 Pro
Software Windows 10 Pro
Therein lies my problem. When I have GSYNC enabled, both my 1080s' utilization is just around 80%-90%, but when GSYNC is disabled, both 1080s are pegged at 99% usage in any game. I mean it would dip to 95%, but that's about it. I just don't understand that part lol. But it seems as though it is perfectly normal.

With Gsync disabled, your graphics card is going to spit out as many frames as possible, and your GPU usage will be at or close to 100%. The problem with this is that your monitor has a max refresh rate. It can only do a maximum framerate, and if your GPU goes above that maximum framerate, you will get screen tearing. VSync and GSync are technologies designed specifically to make sure that this doesn't happen. Your GPU is showing lower utilization because it IS being utilized less, because your framerate is being lowered to whatever the max framerate is. This is the exact same thing that happens with VSync. Both technologies do the same thing. (sort of... nobody get technical in here and crucify me, I'm trying to simplify things...)
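Back-of-the-envelope, with an assumed uncapped rate purely for illustration: if the cards could otherwise manage around 180 fps but the framerate is held at 144, they sit idle for part of every frame interval, and the utilization number lands right around the 80% the OP is seeing.

```python
# Rough arithmetic: GPU utilization when a sync cap holds the framerate
# below what the card could otherwise deliver. The 180 fps uncapped
# figure is an assumption for illustration, not a measured value.
uncapped_fps = 180.0               # what the card could render flat out
cap_fps = 144.0                    # monitor/Gsync ceiling

render_ms = 1000.0 / uncapped_fps  # ~5.6 ms of actual work per frame
budget_ms = 1000.0 / cap_fps       # ~6.9 ms between displayed frames

print(f"approx GPU utilization: {render_ms / budget_ms:.0%}")  # ~80%
```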

Gsync ALSO has the added benefit of doing the opposite... it lowers the screen's refresh rate to match the graphics card's framerate IF the framerate drops below the monitor's max refresh rate. This avoids the added input lag that can be a problem with VSync.
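As a toy sketch of that variable refresh behavior (the VRR window here is an assumption for illustration): within its supported range the monitor just refreshes whenever a frame completes, instead of on a fixed tick.

```python
# Toy model of variable refresh: inside its supported window the monitor
# refreshes whenever a frame completes rather than on a fixed schedule.
VRR_MIN_HZ, VRR_MAX_HZ = 30.0, 144.0  # assumed panel window
MIN_MS = 1000.0 / VRR_MAX_HZ          # ~6.9 ms
MAX_MS = 1000.0 / VRR_MIN_HZ          # ~33.3 ms

def refresh_interval(frame_ms):
    """Clamp the frame time into the monitor's supported refresh window."""
    return min(max(frame_ms, MIN_MS), MAX_MS)

for ft in (5.0, 10.0, 16.7, 40.0):
    print(f"frame every {ft:5.1f} ms -> refresh every {refresh_interval(ft):5.1f} ms")
```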

So your graphics card is doing EXACTLY what it is supposed to do. It is slowing itself down to match the monitor's framerate. That is why you see less utilization. When you turn GSync off, your card is spitting out more frames than your monitor can take, and you NEVER see those extra frames. It's physically impossible for a framerate above your monitor's maximum to help you. If your monitor's max refresh rate is 144Hz, and your graphics card is spitting out 200 fps at 100% utilization, you are NOT seeing those extra frames. They are just resulting in screen tearing. When you turn GSync on, it slows your graphics card down to match the maximum refresh rate of your monitor. This results in the highest possible visible framerate while eliminating screen tearing.

You're seeing a ghost here. There is nothing wrong with the fact that your GPU is only being used 80% while GSync is on. That is a GOOD thing, and it means that your graphics card has more horsepower than your monitor can handle. This means that if you play a more demanding game, or crank up the settings, you've still got that extra horsepower in reserve, and can still meet the maximum framerate of your monitor. It also means that your graphics card is running cooler instead of working hard for no reason, spitting out extra frames that you can't even see because your monitor can't display them. This could potentially extend the graphics card's life.

In short, that extra 20% of graphics power that isn't being used when GSync is on will NOT help you at all. You can't see the extra frames provided by that power, because your monitor can't display them.

Do yourself a favor, and turn GSync on, and never look at that utilization percentage number again. You're completely misunderstanding what is going on behind the scenes, and it's causing you pain. Turn GSync on, enjoy nice smooth framerates at the maximum that you could see with your monitor anyway, and have fun. The utilization percentage means nothing in this case.
 
Joined
Apr 6, 2017
Messages
8 (0.00/day)
Location
Bakersfield, CA
System Name Protoss V2
Processor Intel i7-7700K @ 4.9 GHz
Motherboard Asus Maximus IX Hero Z270
Cooling NH-D15
Memory Corsair Vengeance LPX 16GB 3 GHz
Video Card(s) Nvidia GTX 1080FE SLI @ 2.1 GHz
Storage Samsung 840 Evo 500 GB; WD Black 1 TB x2 RAID 0
Display(s) Asus ROG Swift PG278Q
Case Corsair Air 540 White
Audio Device(s) N/A
Power Supply EVGA SuperNOVA 850G2
Mouse Corsair M65 RGB
Keyboard Corsair K95 RGB
Software Windows 10 Pro
With Gsync disabled, your graphics card is going to spit out as many frames as possible, and your GPU usage will be at or close to 100%. The problem with this is that your monitor has a max refresh rate. It can only do a maximum framerate, and if your GPU goes above that maximum framerate, you will get screen tearing. VSync and GSync are technologies designed specifically to make sure that this doesn't happen. Your GPU is showing lower utilization because it IS being utilized less, because your framerate is being lowered to whatever the max framerate is. This is the exact same thing that happens with VSync. Both technologies do the same thing. (sort of... nobody get technical in here and crucify me, I'm trying to simplify things...)

Gsync ALSO has the added benefit of doing the opposite... it lowers the screen's refresh rate to match the graphics card's framerate IF the framerate drops below the monitor's max refresh rate. This avoids the added input lag that can be a problem with VSync.

So your graphics card is doing EXACTLY what it is supposed to do. It is slowing itself down to match the monitor's framerate. That is why you see less utilization. When you turn GSync off, your card is spitting out more frames than your monitor can take, and you NEVER see those extra frames. It's physically impossible for a framerate above your monitor's maximum to help you. If your monitor's max refresh rate is 144Hz, and your graphics card is spitting out 200 fps at 100% utilization, you are NOT seeing those extra frames. They are just resulting in screen tearing. When you turn GSync on, it slows your graphics card down to match the maximum refresh rate of your monitor. This results in the highest possible visible framerate while eliminating screen tearing.

You're seeing a ghost here. There is nothing wrong with the fact that your GPU is only being used 80% while GSync is on. That is a GOOD thing, and it means that your graphics card has more horsepower than your monitor can handle. This means that if you play a more demanding game, or crank up the settings, you've still got that extra horsepower in reserve, and can still meet the maximum framerate of your monitor. It also means that your graphics card is running cooler instead of working hard for no reason, spitting out extra frames that you can't even see because your monitor can't display them. This could potentially extend the graphics card's life.

In short, that extra 20% of graphics power that isn't being used when GSync is on will NOT help you at all. You can't see the extra frames provided by that power, because your monitor can't display them.

Do yourself a favor, and turn GSync on, and never look at that utilization percentage number again. You're completely misunderstanding what is going on behind the scenes, and it's causing you pain. Turn GSync on, enjoy nice smooth framerates at the maximum that you could see with your monitor anyway, and have fun. The utilization percentage means nothing in this case.

Thanks for this. I have been ignoring that fact actually. I even uninstalled RivaTuner just so I don't get the urge to turn the OSD on.
 