
Screen Tearing free setup with AMD GPU (<450€) + G-Sync monitor

Joined
Sep 26, 2022
Messages
253 (0.28/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
Long Version:
*some time before November 2020*

Happy me, using a GTX 1060 with a 1080p 60Hz monitor (actually a TV, but I don't think it matters) to play old-ish games like Dishonored and Bioshock Infinite at High/Max details.
In the past I had issues with VSync, and discovered I could play without screen tearing by using a frame limiter (RTSS) set to 58 or 59 fps together with Nvidia's "Fast Sync" enabled for 3D applications in the Nvidia Control Panel.
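That cap-just-below-refresh trick can be sketched as a simple sleep-based limiter. This is a hypothetical Python sketch of the idea only; RTSS actually hooks the graphics API's present call rather than running a loop like this, and the function name is mine:

```python
import time

def run_frame_limited(render_frame, fps_cap=59, frames=3):
    """Cap the frame rate slightly below a 60 Hz display's refresh.

    Holding frame delivery just under the refresh rate keeps the
    render queue from backing up, which is why a 58-59 fps cap can
    feel smoother than an exact 60 fps cap on a 60 Hz panel.
    """
    frame_budget = 1.0 / fps_cap                # seconds allowed per frame
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                          # the game's work goes here
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:              # finished early:
            time.sleep(frame_budget - elapsed)  # wait out the rest of the budget

run_frame_limited(lambda: None)                 # pace 3 trivial "frames"
```

A real limiter would busy-wait the last millisecond or two, since `time.sleep` is too coarse for precise frame pacing.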

*November 2020 - monitor (TV) dies...*

Found a good deal on a G-Sync-only 2560×1440/144Hz monitor (LG 32GK850G), and figured that since the 1060 was still able to run most of the games I was playing at the time (like GTA V) at 1080p60, with the help of G-Sync it should be able to play games at 1440p smoothly even below 60fps. So I went for it!

*GPU Crypto mining boom happens AND jump to 2023*

Today the 1060 is starting to show its age, and I'm looking for alternatives below 450€. AMD seems to provide the best value, and I'm willing to sacrifice G-Sync/Adaptive Sync, but not tear-free gaming, "eye candy", or at least a "fixed" 60/70 fps. I hope my GPU doesn't die before the 4060/4060 Ti (or RX 7600/7700 series) come out, but with the current pricing trend I'm worried that even those might still be outside my budget or not provide a big enough performance/€ improvement over the current gen.
I've read that Enhanced Sync is similar to Fast Sync, but also that it works best when the GPU is rendering frames above the monitor's max refresh rate (144Hz), which I know won't happen with a 450€ GPU without severely downgrading visual quality, and I don't want that.
I currently have an option to reduce the monitor's refresh rate from 144 to 85 or 60Hz. Will that option still exist with an AMD card? And will that "trick" Enhanced Sync into "working" at those frame rates?
1673740405241.png

Also, I just learned that Nvidia Control Panel apparently has an option to manually create a profile for a custom resolution/refresh rate. Does it work well (it gave me a bunch of warnings when I clicked on it), and does AMD have something similar?
So what other options are there? Can VSync be configured to work at half the maximum refresh rate (72Hz)? I could adjust the frame limiter according to the "average" frame rate I can achieve in each game.
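For what it's worth, "half refresh rate" V-sync is normally expressed as a swap interval: the GPU presents one frame every N vertical blanks, so the effective rate is refresh/N. A minimal sketch of the arithmetic (the function name is mine, not a driver API):

```python
def effective_vsync_rate(refresh_hz: float, swap_interval: int) -> float:
    """Effective presented frame rate when V-sync waits for
    `swap_interval` vertical blanks between presents.

    swap_interval=1 is ordinary V-sync; 2 is the "1/2 refresh rate"
    style option some drivers expose.
    """
    if swap_interval < 1:
        raise ValueError("swap interval must be >= 1")
    return refresh_hz / swap_interval

print(effective_vsync_rate(144, 2))   # 72.0 - the half-rate case asked about
```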

TL;DR version:
GPU prices are insane.
I plan to upgrade my GTX 1060 and hope to spend less than 450€, so I'm wondering whether changing to AMD is worth it. However, I own a G-Sync-only monitor (LG 32GK850G) which I'm not planning to replace.
My main concern is screen tearing, and I want to know which solutions exist (VSync, Enhanced Sync, etc.) and what compromises have to be made (a fixed frame rate? 60 fps?).

Bottom line
I just want to know if an AMD GPU can theoretically provide a "tear-free" experience by combining some software and settings/tweaks, or if I should just stick with Nvidia even though, at the same price (<450€), they will probably deliver less performance (I don't care about Ray Tracing ATM).

Thanks in advance
 
Joined
Feb 14, 2012
Messages
1,888 (0.40/day)
Location
Romania
I have the same monitor. I would stick with an Nvidia GPU; it's not worth switching to AMD and losing G-Sync just to save some money. I pair it with a 3060 Ti and it's fine for 1440p gaming. Yes, you can run the monitor at 120Hz or 60Hz over DP, but why would you?
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
43,976 (6.81/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Yes, AMD has custom resolutions. I just discovered this with the last driver released for the R7 250X that works on Windows 11.

To prevent tearing, enable VSync and make sure the monitor is running at 60 or 75 Hz...

Grab a 6600XT/6700/XT/6800

No need to be forced into proprietary hardware

You have a PM
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
@tvshacker, for now, if you're limited by the monitor, you have a couple of options for a fixed framerate. Some people hype Radeon Chill, but FRTC works a lot better for me.

If you're familiar with NVCP's framelimiter, FRTC is similar. Chill is a little different and more of a "target" which is probably why it's less consistent.

Both should be found under Global Graphics, though FRTC might be under the Advanced dropdown at the bottom. And Chill can be set per game but FRTC is global only (though you can just set a baseline with FRTC and set per-game settings from there).

Without Freesync/G-sync, as long as you can consistently max out the fixed framerate limit you've set, using V-sync plus a framelimiter like FRTC should still be smooth, barring certain game-engine eccentricities.

The combination of Chill/FRTC and Vsync might add some input lag, but enabling Anti-Lag should mitigate a chunk of that (same Global Graphics/per-game).

I have yet to see Enhanced Sync do anything meaningful or positive for me. It's described as AMD's equivalent of Fast Sync, but it's about equally useless: if a specific game requires Vsync for smoothness, then Fast Sync won't do the trick, and if you don't need Vsync, you wouldn't run Fast Sync anyway.
 
Last edited:
Joined
Sep 26, 2022
Messages
253 (0.28/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
@tabascosauz Just to be clear, all of your solutions require changing the refresh rate of the monitor right?

I have the same monitor. I would stick with an Nvidia GPU; it's not worth switching to AMD and losing G-Sync just to save some money. I pair it with a 3060 Ti and it's fine for 1440p gaming. Yes, you can run the monitor at 120Hz or 60Hz over DP, but why would you?
Well, there are a couple more reasons to consider changing to AMD. The 3060Ti and 4070Ti seem a bit "starved" on the VRAM bus width and amount for their tiers, which might hinder their performance in the long run. And I don't play competitive or fast paced shooters frequently so not playing at 144Hz/fps isn't a major downside, and if the new card manages to stay above 60fps, LFC won't be necessary either, so I think bringing the refresh rate down to 60 or 85Hz won't bother me much.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
@tabascosauz Just to be clear, all of your solutions require changing the refresh rate of the monitor right?

Well, there are a couple more reasons to consider changing to AMD. The 3060Ti and 4070Ti seem a bit "starved" on the VRAM bus width and amount for their tiers, which might hinder their performance in the long run. And I don't play competitive or fast paced shooters frequently so not playing at 144Hz/fps isn't a major downside, and if the new card manages to stay above 60fps, LFC won't be necessary either, so I think bringing the refresh rate down to 60 or 85Hz won't bother me much.

No, none of the 3 (Chill, FRTC, Enhanced Sync) require changing the refresh rate of the monitor. Leave it on 165Hz, you'll appreciate the smoothness doing normal desktop stuff. 60Hz daily use on a big screen is positively insufferable after becoming accustomed to 165Hz. FRTC's framelimiter only affects games.

Chill and Anti-Lag are always mutually exclusive (at least until AMD introduces that Hyper-RX feature they announced), so best bet is probably FRTC + Anti-Lag. Especially since you aren't a hardcore competitive gamer so the extra input lag from Vsync is not such a big deal.

The only annoying thing for me is that FRTC is not available on a per-game basis.

AMD frtc.png
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
FYI, just because it says G-Sync doesn't mean an AMD card can't make use of VRR. I have the AW OLED, which has G-Sync Ultimate (the module), currently hooked up to my new 6950XT and previously to a 6700XT. I have Adaptive Sync available, which is just a plain way of saying it's doing G-Sync/Freesync, or basically "generic" VRR.
I bet if you dug a little deeper (I tried a bit) you'll find the monitor is "G-Sync Compatible", so basically it can do VRR with either side, like almost any monitor.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
FYI, just because it says G-Sync doesn't mean an AMD card can't make use of VRR. I have the AW OLED, which has G-Sync Ultimate (the module), currently hooked up to my new 6950XT and previously to a 6700XT. I have Adaptive Sync available, which is just a plain way of saying it's doing G-Sync/Freesync, or basically "generic" VRR.
I bet if you dug a little deeper (I tried a bit) you'll find the monitor is "G-Sync Compatible", so basically it can do VRR with either side, like almost any monitor.

Sadly, that LG monitor is one of the old G-sync monitors. Nvidia initially promised to update them for compatibility, but it was ultimately the manufacturers' responsibility. Basically, only newer hardware G-sync monitors after a certain cutoff date actually received the firmware/design update for it. It was an old thread though, so maybe I don't remember 100%.

There is a 32GK850F but that is a completely different monitor built around Freesync.

Nowadays Gsync hardware and Gsync Ultimate have no compatibility issues.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
Sadly, that LG monitor is one of the old G-sync monitors. Nvidia initially promised to update them for compatibility, but it was ultimately the manufacturers' responsibility. Basically, only newer hardware G-sync monitors after a certain cutoff date actually received the firmware/design update for it. It was an old thread though, so maybe I don't remember 100%.

Nowadays Gsync hardware and Gsync Ultimate have no compatibility issues.
Which means, like me on my "brand new" OLED with G-Sync Ultimate, on AMD I get Adaptive Sync. You aren't "locked out" of VRR, you just don't get the "brand name" available. My point still stands. That monitor doesn't have a module, so it will be generic VRR that NV just slapped their name on top of what probably used to actually be Freesync. A LOT of monitors had Freesync basically "scrubbed" from the details by NV; even one of my previous monitors that was marketed with Freesync now has zero trace of it in the product description.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Which means, like me on my "brand new" OLED with G-Sync Ultimate, on AMD I get Adaptive Sync. You aren't "locked out" of VRR, you just don't get the "brand name" available. My point still stands.

NVIDIA Open Up Support for Adaptive-sync/FreeSync for Future Native G-sync Module Screens - TFTCentral

2019:
This new firmware is being used now for future Native G-sync screens, and the Acer Predator XB273 X is the first we’ve seen advertised with these new features. We confirmed with NVIDIA that it will NOT be possible to update firmware to any existing Native G-sync screen, or request updates to allow your current G-sync screen to be updated so that it would work with AMD graphics cards. The new firmware will only be applied to future G-sync module displays.

That seems to be as close as we get to an official statement from Nvidia that VRR does not work at all on these old screens. There was also some mention of HDMI VRR being the focus of Nvidia's efforts, which the 850G is incapable of: all VRR on the 850G can only be handled through the G-sync module over Displayport (which itself was not updated to support VRR/Freesync). Last I checked, there was no mention of any update rolled out by LG for the 850G.

I think the original thread might explain better than I can:

 
Last edited:

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
NVIDIA Open Up Support for Adaptive-sync/FreeSync for Future Native G-sync Module Screens - TFTCentral

2019:


That seems to be as close as we get to an official statement from Nvidia that VRR does not work at all on these old screens. There was also some mention of HDMI VRR being the focus of Nvidia's efforts, while the 32GK850G only has a normal 60Hz HDMI port: all VRR can only be handled through the G-sync module over Displayport. Last I checked, there was no mention of any update rolled out by LG for the 850G.

I think the original thread might explain better than I can:

Okay, but the point you're missing here is that unless it has the actual module (it doesn't), it's just using VRR with the name slapped on it by NV, and there's no way to "lock out" generic VRR, module or not, G-Sync name or not. AMD came out with this first, which is why there are so many monitors labeled "G-Sync Compatible" where Freesync just isn't mentioned at all. Now of course there are full-on G-Sync (mine) and Freesync (my previous G5) monitors, but there is no way for either side to stop the other from using VRR.
Edit: This is my "option" for VRR now-
ASync.jpg
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5

Okay, but the point you're missing here is that unless it has the actual module (it doesn't), it's just using VRR with the name slapped on it by NV, and there's no way to "lock out" generic VRR, module or not, G-Sync name or not. AMD came out with this first, which is why there are so many monitors labeled "G-Sync Compatible" where Freesync just isn't mentioned at all. Now of course there are full-on G-Sync (mine) and Freesync (my previous G5) monitors, but there is no way for either side to stop the other from using VRR.

I think you're misunderstanding something. The 850G is a hardware G-sync monitor, with module. It's just an old one that only supports G-sync over Displayport.

The 850F is the newer cousin - no hardware module, with Freesync Premium (by extension, G-sync Compatible) support.

RTINGS tested the 850G and updated the review all the way up to mid-2020. There is no VRR support outside of the original Nvidia module G-sync. The 850G predates the introduction of VESA VRR/Freesync entirely, so without a firmware update there certainly wouldn't be any way to use Freesync on a G-sync module.

There is no reason why the AW3423DW wouldn't be able to do Freesync. It's a new monitor with new firmware after the cutoff date, and RTINGS' review verified Freesync support:

32gk850g vrr.png
aw3423dw vrr.png
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
I think you're misunderstanding something. The 850G is a hardware G-sync monitor, with module. It's just an old one that only supports G-sync over Displayport.
Okay, fair point. It has the original module, though it "should" be labeled "Ultimate" with the module; I guess, as you said, it's an old one.
As it stands today, if it has the module and you're on AMD, you get the option I showed in my previous post. I've lost "Freesync" and just get Adaptive Sync as my "generic" option.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Okay, fair point. It has the original module, though it "should" be labeled "Ultimate" with the module; I guess, as you said, it's an old one.
As it stands today, if it has the module and you're on AMD, you get the option I showed in my previous post. I've lost "Freesync" and just get Adaptive Sync as my "generic" option.

Rather curious that Dell chose to give it Freesync capability but not advertise it officially with AMD. I guess they want to market their cheaper AW3423DWF that ditches the hardware module for official Freesync Premium Pro, and make G-sync sound more exclusive.

For G-sync module/G-sync Ultimate, I have a feeling Freesync doesn't quite work the way it usually does natively (where the panel itself handles VRR). Probably the signal must still be processed separately by the G-sync module. So if that's true, then the firmware update explanation makes sense. After the VRR cutoff in 2019, the physical modules did not change, but were reprogrammed.

As much as I hate to give Nvidia any credit, Freesync running on a G-sync module might still work better. Native Freesync Premium/Pro occasionally has unavoidable flicker under certain conditions (i.e. games inherently framecapped close to the lower VRR bound of 48Hz). All hardware G-sync monitors have VRR working down to 1Hz (at least on G-SYNC, but they still have the capability), so it stands to reason they might avoid the flicker.
 
Last edited:

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
Rather curious that Dell chose to give it Freesync capability but not advertise it officially with AMD. I guess they want to market their cheaper AW3423DWF that ditches the hardware module for official Freesync Premium Pro, and make G-sync sound more exclusive.

For G-sync module/G-sync Ultimate, I have a feeling Freesync doesn't quite work the way it usually does natively (where the panel itself handles VRR). Probably the signal must still be processed separately by the G-sync module. So if that's true, then the firmware update explanation makes sense. After the VRR cutoff in 2019, the physical modules did not change, but were reprogrammed.

As much as I hate to give Nvidia any credit, Freesync running on G-sync module might still work better. Native Freesync Premium/Pro occasionally has unavoidable flicker under certain conditions (ie. games inherently framecapped close to the lower VRR bound of 48Hz). All hardware G-sync monitors have VRR working down to 1Hz, stands to reason they might avoid the flicker.
Well, I have had a Freesync monitor since it was a thing; I was a very early adopter because tearing literally makes me nauseous. Premium adds LFC, and Pro means it's got HDR certification by AMD's standard.
I highly doubt it's using the module for VRR (we're talking about NV here). The fact it has a 1-175Hz range comes down to it being an OLED more than anything, as far as I'm concerned. I haven't looked at the FS spec (I don't want to, I'll just get frustrated buyer's remorse for not waiting a bit longer), but I'd suspect it would also have a full 1-175Hz range, something no Freesync monitor has been able to do.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Well, I have had a Freesync monitor since it was a thing; I was a very early adopter because tearing literally makes me nauseous. Premium adds LFC, and Pro means it's got HDR certification by AMD's standard.
I highly doubt it's using the module for VRR (we're talking about NV here). The fact it has a 1-175Hz range comes down to it being an OLED more than anything, as far as I'm concerned. I haven't looked at the FS spec (I don't want to, I'll just get frustrated buyer's remorse for not waiting a bit longer), but I'd suspect it would also have a full 1-175Hz range, something no Freesync monitor has been able to do.

Not quite because of OLED: a 1Hz lower bound is a requirement for all G-sync module monitors (no exceptions), while 30-48Hz is the lowest any native Freesync monitor can go (no exceptions). The AW3423DWF is pretty much your panel minus the G-sync module; very similar QD-OLED panel, 48Hz minimum.

AMD FreeSync™ Monitors | AMD
GeForce G-SYNC Monitors: Manufacturers & Specs (nvidia.com)

Anyways, I'm pretty sure even G-sync hardware would be running Freesync with a lower limit of say 48Hz, not the 1Hz they're physically capable of. Just food for thought.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
Not quite because of OLED: a 1Hz lower bound is a requirement for all G-sync module monitors (no exceptions), while 30-48Hz is the lowest any native Freesync monitor can go (no exceptions). The AW3423DWF is pretty much your panel minus the G-sync module; very similar QD-OLED panel, 48Hz minimum.

AMD FreeSync™ Monitors | AMD
GeForce G-SYNC Monitors: Manufacturers & Specs (nvidia.com)

Anyways, I'm pretty sure even G-sync hardware would be running Freesync with a lower limit of say 48Hz, not the 1Hz they're physically capable of. Just food for thought.
Actually, now that I have read a review of the F: I knew already it's the identical panel (I mean, Samsung just makes the one), but I did find this a bit interesting, and I'll have to jump on my PC and drop an edit with a pic to show why. The F wasn't even a thing when I ordered mine back in August; it was announced just before I received mine in November…

“In truth, there’s no big difference. The AW3423DW has a bit wider VRR range of 30-175Hz, but you’re not going to notice the 10Hz maximum refresh rate difference, while the AW3423DWF uses LFC below 48FPS (47FPS tripled to 141Hz, for instance) to prevent tearing anyway.”
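The LFC behaviour in that quote (47 fps tripled to 141 Hz) comes down to repeating each frame an integer number of times so the panel's real refresh lands back inside the VRR window. Here is a hypothetical sketch of one plausible policy, picking the largest repeat count that fits under the ceiling so it matches the quoted example; the actual driver logic is not public:

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=175.0):
    """Low Framerate Compensation: map a frame rate below the VRR
    floor to (repeat_multiplier, panel_refresh_hz).

    Each frame is shown `multiplier` times, so the panel refreshes
    at fps * multiplier, which must sit inside [vrr_min, vrr_max].
    """
    if fps >= vrr_min:
        return 1, fps                  # already in range: no LFC needed
    mult = int(vrr_max // fps)         # largest repeat count under the ceiling
    refresh = fps * mult
    if refresh < vrr_min:
        raise ValueError("frame rate too low for this VRR range")
    return mult, refresh

print(lfc_refresh(47))                 # (3, 141): 47 fps shown tripled at 141 Hz
```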

Edit: This is curious, no? I have had an issue in ATS/ETS, with now 2 cards, not clocking up properly in the main menu without doing a "fiddle" in the GFX settings; it ends up at like 3FPS. There is no tearing, but with no LFC it's a slideshow and quite difficult to even correct.
FSrange.jpg


So it seems, IMO, the F actually has an advantage with LFC. I know it was really good on my previous Odyssey G5: at first I was pretty disappointed in its FS range (60-165), but if it hit 59, you bet LFC kicked in and made it pretty much unnoticeable. My previous monitor was a Premium Pro, so it wasn't my first LFC-capable monitor, but that one had a 45-144 range, so it would be quite rare for it to ever kick in, and well, you never really want to get that low to begin with…


Derp
Edit: sorry @tvshacker I missed the bit you already own the monitor. So yeah you are gonna have to stay NV if you want to keep Gsync apparently...

 
Last edited:
Joined
Sep 26, 2022
Messages
253 (0.28/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
Derp
Edit: sorry @tvshacker I missed the bit you already own the monitor. So yeah you are gonna have to stay NV if you want to keep Gsync apparently...
I mentioned on the OP that I'm willing to sacrifice GSync and 144Hz but not willing to tolerate screen tearing and major loss in image quality settings
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,130 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
I mentioned on the OP that I'm willing to sacrifice GSync and 144Hz but not willing to tolerate screen tearing and major loss in image quality settings
Yeah, you absolutely have my support, because tearing literally makes me nauseous; my eyes wig out and I feel ill. You wouldn't lose 144Hz, and well, there's a long-running debate about whether one side has better image quality than the other that will probably never be settled. But as this thread has made clear, you can only have G-Sync. Now, not losing 144Hz would really mean it would have to be a rare situation to actually get any tearing, unless you really pushed it past 144fps. I know of one situation: the new Wolfensteins would tear horribly unless you used VSync. I'd have to install them again to see if that's still true. But having a high refresh monitor really makes tearing rare, with or without VRR.
 
a long running debate about if one side has better image quality then the other that will probably never be settled
That is not what I meant. I meant reducing texture quality/resolution, draw distances, and switching off AA or other details in order to boost framerate.
 

INSTG8R

Vanguard Beta Tester
That is not what I meant. I meant reducing texture quality/resolution, draw distances, and switching off AA or other details in order to boost framerate.
Well, only a GPU upgrade will stop you having to do that, obviously, if you want to avoid compromises. I mean, 1440p lets you run lower AA, for example, but I get where you're coming from. If you want to keep the features you use now, you obviously have to stay with NV and, as you said, wait it out for an upgrade that suits your budget. Not sure how your used market is, but it might get you, say, a higher-level Ampere card for the money versus getting something new but lower end; then again, 1440p isn't that hard to drive with a newer mid-range card either.
 
Joined
Feb 14, 2012
Messages
1,888 (0.40/day)
Location
Romania
LFC won't be necessary either, so I think bringing the refresh rate down to 60 or 85Hz won't bother me much.
You misunderstand these technologies. Your image will be smoother and have less tearing at 144Hz than at 60, with or without VSync, and LFC is great at low framerates. You are worried about vRAM? OK then, I'm not; I can always turn down my textures. Sell your monitor and 1060. Buy a newer monitor that has FreeSync Premium and an AMD GPU. You will lose ray tracing and DLSS but gain 4GB of extra vRAM.
 
You misunderstand these technologies. Your image will be smoother and have less tearing at 144Hz than at 60, with or without VSync, and LFC is great at low framerates. You are worried about vRAM? OK then, I'm not; I can always turn down my textures. Sell your monitor and 1060. Buy a newer monitor that has FreeSync Premium and an AMD GPU. You will lose ray tracing and DLSS but gain 4GB of extra vRAM.
The issue is not "4GB of extra vRAM"; the issue is that Nvidia seems to be crippling their (newest) products on purpose. How else can you explain that the 4070Ti is on average on par with the 3090Ti except at 4K?
To me it's Nvidia's implementation of "planned obsolescence", and that makes me not want to support them until their behaviour changes.

Without Freesync/G-sync as long as you can consistently max out the fixed framerate limit you've set, use V-sync, and a framelimiter like FRTC things should still be smooth, barring certain game engine eccentricities.
No, none of the 3 (Chill, FRTC, Enhanced Sync) require changing the refresh rate of the monitor. Leave it on 165Hz, you'll appreciate the smoothness doing normal desktop stuff. 60Hz daily use on a big screen is positively insufferable after becoming accustomed to 165Hz. FRTC's framelimiter only affects games.
I'm still wrapping my head around this. To my knowledge, VSync is locked to the monitor's refresh rate, not the framerate of 3D applications.
For example: let's say I set FRTC to 85 and switch on VSync, and the monitor refresh rate is still configured to 144Hz. What will VSync do in this case? How will it work?
 
Last edited:

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,229 (2.32/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
The issue is not "4GB of extra vRAM"; the issue is that Nvidia seems to be crippling their (newest) products on purpose. How else can you explain that the 4070Ti is on average on par with the 3090Ti except at 4K?
To me it's Nvidia's implementation of "planned obsolescence", and that makes me not want to support them until their behaviour changes.



I'm still wrapping my head around this. To my knowledge, VSync is locked to the monitor's refresh rate, not the framerate of 3D applications.
For example: let's say I set FRTC to 85 and switch on VSync, and the monitor refresh rate is still configured to 144Hz. What will VSync do in this case? How will it work?

Not 100% sure how VSync behaves when FreeSync is disabled in that scenario, but if FRTC works as most people have suggested (i.e. a driver-level limiter like Nvidia's), VSync usually doesn't override a lower-fps frame limiter.

War Thunder is one game that pretty much requires VSync for smoothness, VRR irrelevant. I will test setting FRTC lower tomorrow, but an Nvidia frame cap at 90fps always worked in spite of VSync.
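For what it's worth, the pacing side of a driver-level frame cap is simple to sketch: the limiter inserts a wait so frames are never submitted faster than the cap, which is why a cap set below the refresh rate tends to take over from VSync (the frame is already late enough for the next scanout, so VSync has nothing left to block on). A toy illustration, with `render_frame` standing in for a game's actual work; real limiters like RTSS or FRTC hook the present call and use higher-precision timing:

```python
import time

def run_capped(render_frame, fps_cap: float, frames: int) -> float:
    """Run `frames` iterations, never submitting faster than `fps_cap`.

    Uses absolute deadlines (start + n * budget) rather than relative
    sleeps, so small sleep overshoots don't accumulate into drift.
    Returns the measured average framerate.
    """
    frame_budget = 1.0 / fps_cap              # e.g. 1/85 s ≈ 11.76 ms per frame
    start = time.perf_counter()
    next_deadline = start
    for _ in range(frames):
        render_frame()                        # the "game" does its work
        next_deadline += frame_budget
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)             # the limiter's inserted wait
    elapsed = time.perf_counter() - start
    return frames / elapsed

# A do-nothing "game" is still held to roughly the 85fps cap
print(f"{run_capped(lambda: None, 85.0, 60):.1f} fps")
```

Under the FRTC-to-85-on-a-144Hz-panel example, each frame arrives every ~11.8ms while scanouts happen every ~6.9ms, so VSync simply holds each finished frame until the next scanout rather than ever pacing the game itself.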
 
Fun week ahead! Around my parts the 4060 should have an MSRP of 335€ and the 4060Ti 449€ (on par with what the "vanilla" 3070 costs now), which is more than I think an 8GB card should cost in 2023, and the 16GB 4060Ti is just ridiculously priced (559€).
I'm starting to consider going for a 12GB 3060 if I can find one new below 300€ (I actually found one, but I've never heard of the store before).
 