Sunday, February 5th 2023
Running Discord Lowers NVIDIA GPU Memory Clocks by 200 MHz, Company Posts Workaround
The Windows app of Discord, the popular social-networking software, apparently trims the memory clock of NVIDIA GPUs by a seemingly innocuous 200 MHz, as several gamers have observed. NVIDIA GeForce GPUs dynamically adjust memory clock speeds in response to load as part of their power management. Under gaming workloads, the GPU is supposed to hit its maximum rated memory frequency, but some keen-eyed gamers with monitoring tools noticed that with the Discord app running in the background, the memory clock tops out 200 MHz short (i.e., if it was supposed to reach 7000 MHz, it tops out at 6800 MHz). Even under the stress of Furmark, which is designed to push memory clocks to their maximum rated speeds until the graphics card runs into thermal limits, the memory clock falls 200 MHz short.
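The shortfall described above is easy to check yourself with a monitoring tool. As a rough sketch (assuming `nvidia-smi` is on the PATH; `clocks.mem` and `clocks.max.mem` are documented `--query-gpu` fields, and the sample CSV line mirrors the article's 7000/6800 MHz figures), the following Python snippet parses the tool's CSV output and reports how far the memory clock falls below its rated maximum:

```python
import subprocess

def memory_clock_deficit(csv_line=None):
    """Return (current, rated_max, deficit) memory clocks in MHz.

    If csv_line is None, query the GPU via nvidia-smi; otherwise parse
    the given CSV line (handy for testing on a machine without a GPU).
    """
    if csv_line is None:
        csv_line = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=clocks.mem,clocks.max.mem",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip().splitlines()[0]
    current, rated_max = (int(v.strip()) for v in csv_line.split(","))
    return current, rated_max, rated_max - current

# Hypothetical CSV line using the figures from the article:
cur, mx, deficit = memory_clock_deficit("6800, 7000")
print(f"memory clock {cur} MHz of {mx} MHz rated (short by {deficit} MHz)")
```

Run it with Discord open and closed during a Furmark load; a consistent 200 MHz deficit only while Discord is running reproduces the reported behavior.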
NVIDIA took note of this issue, and assured that a fix is on the way in a future GeForce driver update. In the meantime, it posted a DIY workaround to the problem that involves downloading the GeForce 3D Profile Manager utility, having the utility "export SLI profiles" (applicable even to single-GPU machines), editing the exported SLI profiles file as a plaintext document, and importing the profile back. This basically alters the way the driver behaves with the Discord app running. The NVIDIA 3D Profile Manager utility can be downloaded from here, and step-by-step instructions on using it to fix this issue, here.

Update Feb 6th: NVIDIA released a GeForce driver application profile that automatically downloads to your driver, which should fix this issue. You don't need GeForce Experience to receive the update.
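The manual edit in the workaround amounts to adding one setting line to the Discord entry in the exported profiles file. As an illustrative sketch only (the `Profile "Name" … EndProfile` block layout is an assumption about the export format, and `Setting ID_0xDEADBEEF = 0x0` is a placeholder, not the real setting; NVIDIA's linked instructions give the actual line to add), the edit could be scripted like this:

```python
def add_setting_to_profile(profiles_text, profile_name, setting_line):
    """Insert setting_line just before the EndProfile of the named profile.

    profiles_text: contents of the exported SLI profiles file (plain text).
    """
    out, in_target = [], False
    for line in profiles_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("Profile ") and profile_name in stripped:
            in_target = True
        if in_target and stripped == "EndProfile":
            out.append("    " + setting_line)  # keep the file's indentation style
            in_target = False
        out.append(line)
    return "\n".join(out)

# Minimal assumed profile block; the setting ID below is a placeholder.
sample = 'Profile "Discord"\n    Executable "discord.exe"\nEndProfile'
print(add_setting_to_profile(sample, "Discord", "Setting ID_0xDEADBEEF = 0x0"))
```

After writing the modified text back to the file, import it through the same utility to apply the change.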
Source:
LinusTechTips (forums)
36 Comments on Running Discord Lowers NVIDIA GPU Memory Clocks by 200 MHz, Company Posts Workaround
To me it mostly seems like a weird/interesting issue, and I would love to hear the cause; sadly, I don't think we ever will know. Nvidia tends to be quite tight-lipped about the details.
Discord uses HW acceleration for almost everything (including video playback), plus extensive HW encode support for video/desktop sharing (NVENC). My guess is that the drivers are putting the card in "video decode" mode and not overriding that when you open a game.
Now I'm super curious and will have to test this when I get back home. 200 MHz is not that innocuous; it negates all the overclocking/power limit removal I've done to my GPU.
As far as I know, the same issue should not affect the enterprise GPUs at all.
Did a quick test in Cyberpunk where I'm 100% GPU limited, and it changed nothing in regards to performance, not even a single FPS on my end, so wuteva, I will just wait for an official fix instead of tinkering around.
And no change on my 4090 when using stock settings on it, while closing and opening Discord during a Furmark run.
I have tried them all, and their performance is not as good as accessing the sites directly through a browser, so perhaps this "bug" is in those apps instead of the GPU drivers?
Not that it really affects framerate at all. I already had my memory overclocked by 750 MHz, so it's not like I was running slower than stock anyway.
NVDEC forces GPU into P2 CUDA state -> much higher power consumption than with VDPAU - Graphics / Linux / Linux - NVIDIA Developer Forums
There is more or less a workaround on Windows:
How to turn off Nvidia's Force P2 Power State : nvidia (reddit.com)
But it really boils down to an intentional gimp on GeForce's compute performance, limiting the VRAM bus ever so slightly while CUDA is running. My suspicion that it has something to do with NVENC/NVDEC usage comes from the fact that Discord was recently updated to support AV1 broadcasting when the user has an Ada GPU installed.
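If the P2-state theory above is right, the card should report a performance state below P0 while Discord's encode/decode path is active. As a small sketch (assuming `nvidia-smi` is available; `pstate` is a documented `--query-gpu` field), this checks which state the GPU is in:

```python
import subprocess

def gpu_perf_state(raw=None):
    """Return the GPU performance state string, e.g. 'P0' or 'P2'.

    With no argument, query the first GPU via nvidia-smi; pass raw
    output directly for testing without a GPU. P0 is the maximum
    performance state; a CUDA/NVDEC workload holding the card in P2
    would show up here.
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=pstate", "--format=csv,noheader"],
            text=True,
        )
    return raw.strip()

state = gpu_perf_state("P2\n")  # simulated reading for illustration
print("held below P0" if state != "P0" else "full performance state")
```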
For me, being very anxious and careful with my hardware, those 200 MHz of memory make a lot of difference hehehe