Wednesday, January 14th 2015
NVIDIA GeForce GTX TITAN X To Feature Tweakable Idle Fan-off Mode
Taking advantage of the low TDP of the GeForce GTX 970 and GTX 980, several NVIDIA add-in card (AIC) partners, such as ASUS, MSI, and Palit, designed VGA cooling solutions with an idle fan-off feature. It lets the card turn its fans off completely when the GPU is idle or below a temperature threshold, making the card completely silent when not gaming. NVIDIA plans to standardize this with its next-generation GeForce GTX TITAN X graphics card.
Given that its TITAN family of super high-end graphics cards never gets custom designs from AICs, NVIDIA has decided to standardize an idle fan-off feature of its own. Unlike AICs, which use specialized fan-controller chips that take auxiliary temperature input to decide when to turn the fan off, NVIDIA's approach is driver-based. Future drivers accompanying the GTX TITAN X will offer a new feature that, when enabled, lets you choose between a non-linear fan curve that keeps the fan off and one that runs it at low speeds. This lets the driver keep the fan of a GTX TITAN X completely off until the GPU reaches a temperature threshold, and only then begin to ramp up its speed. It could help not just at idle (desktop), but also in light-gaming scenarios (think League of Legends).
Since it's a driver-based feature, third-party GPU software developers (e.g., EVGA, MSI) will be able to create apps that let users tweak it, such as setting a fan-cutoff threshold appropriate to your environment, or letting the fan spin at low speeds regardless of temperature. You get to choose between complete silence and lower idle temperatures. This ends up being more flexible than the implementations AICs shipped with their GTX 900 series products.
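To illustrate the idea, here is a minimal sketch of how such driver-side idle fan-off logic could work: the fan stays off below a cutoff temperature, then follows a ramp, with hysteresis so it doesn't rapidly toggle on and off around the threshold. All temperature values, speeds, and the linear curve shape are illustrative assumptions, not NVIDIA's actual parameters.

```python
# Hypothetical sketch of idle fan-off fan-curve logic as described in the
# article. Thresholds and the linear ramp are assumptions for illustration.

def fan_speed(temp_c, fan_running, cutoff_on=55, cutoff_off=45,
              min_speed=25, max_speed=100, max_temp=85):
    """Return (speed_percent, fan_running).

    Hysteresis: the fan turns on only above cutoff_on, and turns back
    off only below cutoff_off, avoiding rapid on/off cycling near the
    threshold.
    """
    if fan_running:
        if temp_c < cutoff_off:
            fan_running = False
    else:
        if temp_c >= cutoff_on:
            fan_running = True

    if not fan_running:
        return 0, False  # completely silent below the threshold

    # Linear ramp from min_speed at cutoff_off up to max_speed at max_temp.
    frac = min(1.0, (temp_c - cutoff_off) / (max_temp - cutoff_off))
    return round(min_speed + frac * (max_speed - min_speed)), True
```

A third-party app exposing this feature would essentially let the user move `cutoff_on`/`cutoff_off` (complete silence) or force `min_speed` at all times (lower idle temperatures), which is the trade-off the article describes.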
43 Comments on NVIDIA GeForce GTX TITAN X To Feature Tweakable Idle Fan-off Mode
I think this should be a standard feature especially with people today wanting their rig to be as quiet as possible so this is going to be cool to have it where software can just take advantage of it.
anyway next titan confirm?
I get worried every time nvidia starts messing with fan control software
Marketing will be great. Like the ASUS Strix 0dB.
I didn't even hear any sound from my 670 card at idle (unless your head is only 10 cm from the PC, in which case you'll hear everything, including the case fans). The only time I can hear my GPU fan is when I increase its speed to 60%.
Besides, I believe every desktop case has at least 3 fans, and those alone are much noisier than a small GPU fan at idle. "Silent at idle" is not really a selling point. It should be "low power consumption at idle".
There is no need for this feature as when fans run on low speeds, they are not audible anyways!
At least, it shows that their marketing department is kept busy. :)
xzibit:
"With boost you get several fluctuations a second to peak and low. The 900 series hits peaks of drawing 290w a handful of times in a second. Not enough time for the fan to react. You are going to see more degradation in circuits if they aren't being beefed up to compensate for it. Shorter life span and maybe lowering warranties due to expected higher rate of failure. The feature could always come with a user warning."
overcloker_2001:
capacitors will die extremely fast at ambient temperatures over 85°C (something like 1,000 h or less)
the solder on the ICs will be extremely weak at T > 150°C (the melting temperature of the solder alloy is 180°C)
plastic will get very soft / start to carbonize at T > 120°C (depends on whether the plastic is thermoset or thermoplastic)
electrical resistance depends on temperature; higher temperature means higher resistance, so higher power consumption, and then higher temperature again
anything over 70°C is very hot to the touch and can burn human skin, so it would be very unsafe (the case would be very hot too!)
so nobody wants ambient temperatures over 60°C for longevity or stability (nor under 10°C)
It may prove to be one of those techs that is very niche but at least it will be available to those who want it. I still would prefer something like 10% fan speed no matter what as at least some air would be moving but that's just me.
Sapphire also uses an open, double pitchfork design, as opposed to an enclosed fin block like the NVIDIA TITAN reference models have, which wouldn't benefit from any kind of case airflow if the fan isn't spinning.
Maybe that's it. It just doesn't make sense to buy this expensive card (which is a stop-gap between generations of cards) for so much money only for it to be replaced by the next gen a half-year away.