
Radeon Chill - does anybody use it?

itomic009

New Member
Joined
Feb 15, 2023
Messages
3 (0.01/day)
I recently upgraded to an RX 6700 XT 12GB and am very satisfied. I have a 1440p 75Hz display, and I got curious about Radeon Chill. I know it has been around for years, but I never bothered to try it much; I had a weaker GPU anyway, so there was no need to restrain it in the first place. Now I've tested it in Uncharted 4, and I like the feature: the GPU doesn't run at 100% (I undervolted it a bit as well), and I get a rock-solid framerate. The best option is to set it one fps lower than the monitor's refresh rate, especially if you have a FreeSync display. Since mine is 75Hz, I set both the minimum and the maximum in Radeon Chill to 74fps, and the gameplay is buttery smooth, with my GPU drawing about 40-50W less than without it. In adventure and other not-so-fast-paced games it works fine; I don't know how it behaves in faster-paced games.
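For readers who haven't played with frame caps before, here is a minimal, hypothetical Python sketch of what "capping at 74fps" means in practice - a simple sleep-based pacing loop, not how AMD's driver actually implements Chill or FRTC:

import time

TARGET_FPS = 74                  # one below the 75Hz refresh, as described above
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~13.5 ms per frame

def render_frame():
    """Stand-in for the real per-frame game/render work."""
    time.sleep(0.005)            # pretend a frame takes 5 ms to produce

for _ in range(TARGET_FPS):      # simulate roughly one second of capped gameplay
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the budget, so no more than ~74 frames
    # are produced per second and the GPU never has to run flat out.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)

The power saving described above comes from the GPU idling for the unused part of each 13.5 ms budget instead of immediately starting the next frame.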

What are your experiences with Radeon Chill? Do you use it?
 

Wildmazay

New Member
Joined
Jan 19, 2023
Messages
8 (0.01/day)
Hello,
Yes, I'm using it, but not in games.
I like my rig to be quiet (so I don't hear its noise), so I turn on every feature that reduces fan noise. But when it comes to gaming, I set it to off, because the sound of the game in my headphones is louder anyway.
 
Joined
Nov 26, 2021
Messages
1,439 (1.46/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I use it in older games that don't need the full power of the GPU. I use the defaults of 75 and 144 for my monitor, and it helps keep both the average power draw and GPU fan speed low.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.66/day)
Location
Kepler-186f
I use it in older games that don't need the full power of the GPU. I use the defaults of 75 and 144 for my monitor, and it helps keep both the average power draw and GPU fan speed low.

If I frame cap all my games at 160 fps on a 165Hz monitor, doesn't this effectively do the same thing, as the GPU only draws as many watts as it needs?
 
Joined
Aug 29, 2005
Messages
7,168 (1.04/day)
Location
Stuck somewhere in the 80's Jpop era....
System Name Lynni PS \ Lenowo TwinkPad L14 G2
Processor AMD Ryzen 7 7700 Raphael \ i5-1135G7 Tiger Lake-U
Motherboard ASRock B650M PG Riptide Bios v. 2.02 AMD AGESA 1.1.0.0 \ Lenowo BDPLANAR Bios 1.68
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Asus GeForce RTX™ 4070 Dual OC GPU: 2325-2355 MEM: 1462| Intel® Iris® Xe Graphics
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB
Display(s) LG UltraGear 27GP850-B 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p
Case Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi
Software Win11 Pro 23H2 UK | Arch (Fan)
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
I use it in both new and older titles; it works fine, no complaints really.

I do it for power savings, because electricity prices can be a bitch.
 
Joined
Feb 16, 2023
Messages
1 (0.00/day)
Processor i5-11600K
Motherboard MSI MPG Z590 gaming force
Memory PNY XLR8 16GBx2 3200MHz
Video Card(s) RX 580 8GB Sapphire nitro, RX 580 8GB MSI gaming X
Storage PNY 2TB XLR8 CS3040
No, I don't. It doesn't work with CrossFire setups. There are other ways to limit framerate for those still using multiple cards (Frame Rate Target Control).
 
Joined
Nov 26, 2021
Messages
1,439 (1.46/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
If I frame cap all my games at 160 fps on a 165Hz monitor, doesn't this effectively do the same thing, as the GPU only draws as many watts as it needs?
It depends upon the title, but since Chill has both a lower limit and an upper limit, it usually is more efficient than a frame rate cap.
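As a rough back-of-the-envelope illustration of that point (hypothetical numbers for a 48-75Hz FreeSync range; real GPU power draw does not scale linearly with frame count, but fewer frames per second does mean less rendering work):

# Hypothetical Chill limits, just to put numbers on the idea.
max_fps = 74   # upper limit while the player is active
idle_fps = 48  # lower limit while idle
reduction = 1 - idle_fps / max_fps
print(f"While idle, roughly {reduction:.0%} fewer frames are rendered each second.")
# -> about 35% less rendering work, assuming a similar cost per frame.

A single frame cap only ever gives you the first number; the lower limit is where the extra savings during quiet moments come from.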
 
Joined
Jun 27, 2019
Messages
1,971 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
I used to, when I had my RX 570 for nearly 3 years with a 75Hz monitor that has a 40-75 Hz FreeSync range, so I also had mine capped at 74.
It worked pretty well in almost every game, save for a few odd ones where it refused to work, but in general it was nice, and tbh I still kinda miss that FreeSync on top.

Now that I've switched to Nvidia, I simply limit my max frames at the driver level or, if need be, sync it down to avoid tearing.
I mean, it works, but I liked my Chill + FreeSync combo more.
 
Joined
Sep 20, 2019
Messages
499 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
Yes, I use Chill solely as an alternative to FRTC (Frame Rate Target Control). I set the min and max values to the same exact value.

FRTC apparently adds (or may add) input lag. So using Chill in this way gives you a pseudo-FRTC alternative without the added input lag.

Also, you can use Chill on a per-game profile basis. FRTC, on the other hand, is a global parameter that you cannot adjust in a game profile. This is handy for me because there are a couple of games I need to drop to either 100Hz or 60Hz from my standard 120Hz display.

My opinion is that you need to be a seriously competitive gamer with highly anal-retentive characteristics to care whether your GPU is outputting, say, 372 FPS (any value over your display's max refresh) even though you're using, say, a 240Hz display... "normal" people are not going to notice a difference if the GPU only outputs 240 FPS.
 
Joined
Oct 26, 2018
Messages
211 (0.10/day)
Processor Intel i5-13600KF
Motherboard ASRock Z790 PG Lightning
Cooling NZXT Kraken 240
Memory Corsair Vengeance DDR5 6400
Video Card(s) XFX RX 7800 XT
Storage Samsung 990 Pro 2 TB + Samsung 860 EVO 1TB
Display(s) Dell S2721DGF 165Hz
Case Fractal Meshify C
Power Supply Seasonic Focus 750
Mouse Logitech G502 HERO
Keyboard Logitech G512
I set both the minimum and the maximum in Radeon Chill to 74fps
I think it's pointless to use it like this; just use "Frame Rate Target Control" if you only care about a smooth frame rate.
The "Min FPS" is more accurately called "Idle FPS". Your power savings are coming from the undervolt.
Radeon Chill constantly checks for motion, looking for an opportunity to lower the FPS, but your idle FPS is the same as the max FPS, so no chill, no power savings.
If you want to save more power, set the "Min FPS" to a lower number, and it will run at a lower FPS when you're AFK.
I used Chill when I played Ark and was AFK a lot, but normally I'm not AFK when I'm gaming, and I don't want the frame rate jumping around.
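To make the "idle FPS" point concrete, here is a purely illustrative Python sketch of what a Chill-style limiter conceptually does. The names IDLE_FPS, MAX_FPS and IDLE_AFTER are made up for this example, and the real driver reacts to how much the input and scene are moving, not just to a timeout:

IDLE_FPS = 48     # lower ("idle") target, e.g. the bottom of a 48-75Hz FreeSync range
MAX_FPS = 74      # upper target, one below the 75Hz refresh
IDLE_AFTER = 0.5  # seconds without meaningful input before dropping the target

def chill_style_target(seconds_since_input: float) -> int:
    """Pick the frame-rate target a Chill-like limiter would aim for:
    full speed while the player is providing input, the idle target once
    input has been absent for a while."""
    if seconds_since_input < IDLE_AFTER:
        return MAX_FPS
    return IDLE_FPS

print(chill_style_target(0.2))  # 74 -- player is active
print(chill_style_target(5.0))  # 48 -- AFK, so far fewer frames get rendered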
If I frame cap all my games at 160 fps on a 165Hz monitor, doesn't this effectively do the same thing, as the GPU only draws as many watts as it needs?
It depends upon the title, but since Chill has both a lower limit and an upper limit, it usually is more efficient than a frame rate cap.
Radeon Chill sets a lower frame rate target when you are idle. It's not a lower limit but an "idle target FPS", and setting a lower value can reduce power consumption when idle.
It's not inherently more efficient, since the power saving comes from reduced frame rates, and in this case, with idle FPS = max FPS, there is no potential for power savings.

FRTC apparently adds (or may add) input lag. So using Chill in this way gives you a pseudo-FRTC alternative without the added input lag.
I think any driver-level FPS limit has this potential, including Chill.
I think it depends on the engine, but in general, for the sake of reducing input lag, an in-game FPS limit is preferable.
 
Joined
Feb 29, 2016
Messages
600 (0.19/day)
Location
Chile
System Name Fran
Processor AMD Ryzen 7 5700X
Motherboard ROG Strix B550-F GAMING WIFI II
Cooling Hyper 212 Turbo ARGB
Memory 32GB DDR4 3600MHz
Video Card(s) XFX RX 6700 10GB
Storage Samsung 970 EVO Plus 1TB, Kingston A1000 480GB
Display(s) Lenovo G27q-20 (1440p, 165Hz)
Case NZXT H510
Audio Device(s) MOONDROP Aria SE
Power Supply SuperFlower Leadex Gold III 850W
Mouse Logitech G302
Keyboard IK75 v3 (QMK version)
If I frame cap all my games at 160 fps on a 165Hz monitor, doesn't this effectively do the same thing, as the GPU only draws as many watts as it needs?
Some frame-capping implementations in older games suck and make frame pacing inconsistent. That is where Chill actually shines (and for saving energy while AFK).
 
Joined
Jan 14, 2019
Messages
10,865 (5.35/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I'm a bit conflicted, to be honest. For now, I use a standard 60 FPS cap, but reading this thread makes me wonder if Chill would be better.

Edit: I've just checked - I cannot enable Chill without disabling Anti-Lag. Now, which one is better? Chill, or Anti-Lag with an FPS cap?
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.66/day)
Location
Kepler-186f
I'm a bit conflicted, to be honest. For now, I use a standard 60 FPS cap, but reading this thread makes me wonder if Chill would be better.

Edit: I've just checked - I cannot enable Chill without disabling Anti-Lag. Now, which one is better? Chill, or Anti-Lag with an FPS cap?

Anti-Lag is useless from what I understand; I always turn it off.
 
Joined
Jan 14, 2019
Messages
10,865 (5.35/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Anti-Lag is useless from what I understand; I always turn it off.
I've just had a quick test in God of War. Anti-Lag + a fixed FPS cap actually works better, in my opinion.

Chill works well when I'm doing something and when I'm not, but there are minor stutters in between. Also, it doesn't seem to reach the upper FPS target ever. The idea is good, but the implementation is a bit meh.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.66/day)
Location
Kepler-186f
I've just had a quick test in God of War. Anti-Lag + a fixed FPS cap actually works better, in my opinion.

Chill works well when I'm doing something and when I'm not, but there are minor stutters in between. Also, it doesn't seem to reach the upper FPS target ever. The idea is good, but the implementation is a bit meh.

I'm willing to give Anti-Lag a go.

What do you cap your FPS at? How many FPS below the max?
 
Joined
Jan 14, 2019
Messages
10,865 (5.35/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I'm willing to give Anti-Lag a go.

What do you cap your FPS at? How many FPS below the max?
I cap it at my monitor's refresh rate, 60 Hz. I do have Freesync, though.
 
Joined
Sep 20, 2019
Messages
499 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
I said what I meant, and meant what I said lol

If you set the min and max FPS values of the Chill slider to the same value, then it acts like a frame limiter as it normally does, BUT, because the min limit is the same, it doesn't do the FPS dip (the "chill" part of the name gimmick) when it detects low/no motion. I thought it sounded funny the first time I saw someone post about this method too, but as far as I'm concerned it works just as well as FRTC.
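A tiny, hypothetical snippet just to make the min = max point explicit (not AMD's code; chill_target is a made-up name):

def chill_target(min_fps: int, max_fps: int, player_active: bool) -> int:
    # The adaptive part of a Chill-style limiter, reduced to its core decision.
    return max_fps if player_active else min_fps

# With min == max the "dip" can never happen, so the limiter degenerates
# into a plain fixed frame cap -- the FRTC-like usage described above.
print(chill_target(74, 74, player_active=True))   # 74
print(chill_target(74, 74, player_active=False))  # 74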


I think any driver-level FPS limit has this potential, including Chill.
I think it depends on the engine, but in general, for the sake of reducing input lag, an in-game FPS limit is preferable.
Makes sense. I really end up using Chill more often because it is customizable per game profile..... because, in all honesty, even playing games that would make most people notice input lag, like Doom Eternal on nightmare difficulty, I cannot tell the difference between my TV being in "game mode" and not. It's supposed to be a difference of anywhere from 5ms to north of 70ms... I cannot tell the difference lol, so my point is that input lag has never been a real deciding factor for me, just some icing on the cake I suppose.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.66/day)
Location
Kepler-186f
I said what I meant, and meant what I said lol

If you set the min and max FPS values of the Chill slider to the same value, then it acts like a frame limiter as it normally does, BUT, because the min limit is the same, it doesn't do the FPS dip (the "chill" part of the name gimmick) when it detects low/no motion. I thought it sounded funny the first time I saw someone post about this method too, but as far as I'm concerned it works just as well as FRTC.



Makes sense. I really end up using Chill more often because it is customizable per game profile..... because, in all honesty, even playing games that would make most people notice input lag, like Doom Eternal on nightmare difficulty, I cannot tell the difference between my TV being in "game mode" and not. It's supposed to be a difference of anywhere from 5ms to north of 70ms... I cannot tell the difference lol, so my point is that input lag has never been a real deciding factor for me, just some icing on the cake I suppose.

ty for clarifying, I was confused. lol
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
7,937 (2.38/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Yes, I use Chill solely as an alternative to FRTC (Frame Rate Target Control). I set the min and max values to the same exact value.

FRTC apparently adds (or may add) input lag. So using Chill in this way gives you a pseudo-FRTC alternative without the added input lag.

Also, you can use Chill on a per-game profile basis. FRTC, on the other hand, is a global parameter that you cannot adjust in a game profile. This is handy for me because there are a couple of games I need to drop to either 100Hz or 60Hz from my standard 120Hz display.

My opinion is that you need to be a seriously competitive gamer with highly anal-retentive characteristics to care whether your GPU is outputting, say, 372 FPS (any value over your display's max refresh) even though you're using, say, a 240Hz display... "normal" people are not going to notice a difference if the GPU only outputs 240 FPS.

I haven't really noticed FRTC adding input lag (because Anti-Lag also exists when not using Chill), but yours was otherwise exactly my experience as well.

I set FRTC globally to start with (e.g. 160fps), then use Chill on a per-game basis where necessary.

FRTC almost always works well. I think there were only a few exceptions where Chill didn't work so well - War Thunder, for example, where Chill does nothing but FRTC smooths everything out.

Using Chill "as intended" and letting clocks drop, I was not a fan. The inherent VRAM (and possibly core) clock fluctuations were already wreaking havoc in games; I didn't need Chill adding to the problems. Maybe it makes a bigger difference on RDNA2, where it doesn't idle as aggressively in-game? Dunno.
 
Joined
Sep 20, 2019
Messages
499 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
I haven't really noticed FRTC adding input lag (because Anti-Lag also exists when not using Chill), but yours was otherwise exactly my experience as well.

I set FRTC globally to start with (e.g. 160fps), then use Chill on a per-game basis where necessary.

FRTC almost always works well. I think there were only a few exceptions where Chill didn't work so well - War Thunder, for example, where Chill does nothing but FRTC smooths everything out.

Using Chill "as intended" and letting clocks drop, I was not a fan. The inherent VRAM (and possibly core) clock fluctuations were already wreaking havoc in games; I didn't need Chill adding to the problems. Maybe it makes a bigger difference on RDNA2, where it doesn't idle as aggressively in-game? Dunno.
Believe me, I cannot notice any added input lag either. I'm kinda just regurgitating the note/warning that's in AMD Adrenalin, or what others have shared through legit investigations. It is measurable, but hell, I cannot tell the difference between game mode and normal mode on my TV, which adds around 70ms lol.

Yeah, the couple of times I tried Chill out as intended, it was jumping all over the place. For example, if you're completely still it should drop, but in one particular outdoor scene it was raining, and just the raindrops moving across the screen were enough to make the chill part stop and ramp back up. The problem is it was doing this pretty inconsistently, so the FPS felt very jerky even on a display with VRR/FreeSync. I think it's a YMMV kind of thing, depending on your expectations, settings, and what game(s) you're playing. It was very "scene dependent" when I gave it a shot.
 

itomic009

New Member
Joined
Feb 15, 2023
Messages
3 (0.01/day)
I will have to test it more in some other games, but for now, Uncharted 4 works great for me. As far as I can see and "feel" (regarding the potential lag), the gameplay is perfect, and my GPU isn't loaded at 100% all the time to maintain it. That is good from my point of view, since I wouldn't get anything extra from a GPU twice as fast as mine; it would not make my gameplay any better or more enjoyable.

As recommended, I set it to 1fps lower than my display's maximum (I have a 48-75Hz FreeSync range). The goal is to keep frame time delivery smooth; that is why I did not put it lower than 74fps. I know I should set it lower to actually "chill," but my point is to have a stable framerate capped just under my monitor's max refresh rate. I also get lower power consumption, because my GPU could otherwise deliver around 90-100 fps on average; by limiting it to 74fps, it does not have to work at 100% all the time. And I do not need the "chill" part on the low end, since I am not AFK a lot when I game.

Anyway, as I mentioned, I had never used the Chill option before, even though I have been on AMD GPUs since they introduced it, but I appreciate it, and it is a cool feature. I will also play with it in other games to see whether it suits me.
 
Joined
Jan 14, 2019
Messages
10,865 (5.35/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I haven't really noticed FRTC adding input lag (because Anti-Lag also exists when not using Chill), but yours was otherwise exactly my experience as well.

I set FRTC globally to start with (e.g. 160fps), then use Chill on a per-game basis where necessary.

FRTC almost always works well. I think there were only a few exceptions where Chill didn't work so well - War Thunder, for example, where Chill does nothing but FRTC smooths everything out.

Using Chill "as intended" and letting clocks drop, I was not a fan. The inherent VRAM (and possibly core) clock fluctuations were already wreaking havoc in games; I didn't need Chill adding to the problems. Maybe it makes a bigger difference on RDNA2, where it doesn't idle as aggressively in-game? Dunno.
I just tried it yesterday with God of War. I compared a 60 FPS lock with Anti-Lag vs Chill at 48-60 FPS (that's my monitor's FreeSync range) with no Anti-Lag. Conclusion: FRTC + Anti-Lag works better.

Chill is okay when I'm standing around doing nothing, but the card clocks so far down that it takes a fraction of a second to wake up when I start moving, which results in a slight hitch. It isn't annoying once or twice, but when it happens every time you start moving the mouse, it gets tiring for your eyes. Also, it never seems to reach the upper 60 FPS limit; it hovers around 56-58, while with FRTC it runs at a constant 60.
 

chrisslyi

New Member
Joined
Apr 28, 2023
Messages
2 (0.00/day)
Hi all,

I just registered here to give a heads-up about FRTC and Chill.

FRTC does bring microstutter with it (at least in my experience). Not every game is affected, though. As skizzo said, enabling Chill does not do that, for example in CS:GO. I almost banged my head against the wall after buying a 7900 XT and XTX, trying every tweak under the sun in BIOS/Windows/driver/hardware (even QVL memory lol) and in the games themselves to get rid of the microstutter. Nothing worked.

After FRTC accidentally deactivated itself yesterday and frames were uncapped, I could see that all games ran perfectly smooth. Couldn't believe my eyes. I restarted the game and FRTC was active again - same issue.
I used Chill just for fun, and it works multiple times better than FRTC - at least in CS:GO and Dead Island 2. I have to say that setting the min/max values the same seemed dumb at first, but it is working as of now, after multiple test runs.

I never had those issues with Nvidia's frame limiter... I was almost ready to return the card at a loss and buy one again. But this... this saved me and hopefully many others.
 
Joined
Jan 14, 2019
Messages
10,865 (5.35/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Hi all,

I just registered here to give a heads-up about FRTC and Chill.

FRTC does bring microstutter with it (at least in my experience). Not every game is affected, though. As skizzo said, enabling Chill does not do that, for example in CS:GO. I almost banged my head against the wall after buying a 7900 XT and XTX, trying every tweak under the sun in BIOS/Windows/driver/hardware (even QVL memory lol) and in the games themselves to get rid of the microstutter. Nothing worked.

After FRTC accidentally deactivated itself yesterday and frames were uncapped, I could see that all games ran perfectly smooth. Couldn't believe my eyes. I restarted the game and FRTC was active again - same issue.
I used Chill just for fun, and it works multiple times better than FRTC - at least in CS:GO and Dead Island 2. I have to say that setting the min/max values the same seemed dumb at first, but it is working as of now, after multiple test runs.

I never had those issues with Nvidia's frame limiter... I was almost ready to return the card at a loss and buy one again. But this... this saved me and hopefully many others.
That's weird... I have the exact opposite experience. Setting Chill to a 60 FPS target never gives me 60 FPS. More like 58 with stutters that I can't explain. FRTC at 60 is butter smooth 99% of the time.

Edit: typo
 