Noise-normalized Cooler Testing
Making an apples-to-apples comparison between graphics card coolers has always been difficult. Simply comparing temperatures won't work, because heat output varies wildly between graphics card models, and even between individual samples of the same card. Another problem is that fan-control settings differ from card to card, which leads to different noise levels. Even noise-normalized testing still leaves each card with its own power consumption, which directly determines heat output, so such a comparison remains inaccurate.
To overcome these problems, we crafted a special game-like graphics load in Unreal Engine 4 that can adjust its GPU workload on demand, instantly—this allows us to dial in an exact power-consumption target. For all testing on this page, the cooler runs at a constant fan speed, carefully selected to emit a noise level of 35 dBA at 50 cm. The card is installed on an open test bench, which differs from our regular thermal testing inside a tower case, so some small differences are expected.
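The "dial in a power target" idea can be sketched as a simple feedback loop: measure board power, compare it to the target, and nudge the workload intensity accordingly. Everything below is a hypothetical illustration—the `simulated_power` model stands in for real power telemetry, and the proportional-control approach is our own assumption, not the actual Unreal Engine 4 implementation used in the test.

```python
# Hypothetical sketch: a proportional controller nudges a workload knob
# (0.0 = idle, 1.0 = full load) until measured power converges on a target.
# simulated_power() is a stand-in for real GPU power telemetry.

def simulated_power(workload: float) -> float:
    """Toy power model: idle floor plus load-dependent draw, in watts."""
    return 30.0 + 270.0 * workload

def dial_in_power(target_w: float, kp: float = 0.002, steps: int = 200) -> float:
    """Adjust the workload knob proportionally to the power error."""
    workload = 0.5
    for _ in range(steps):
        error = target_w - simulated_power(workload)
        # Clamp to the valid workload range after each proportional step.
        workload = min(1.0, max(0.0, workload + kp * error))
    return workload

w = dial_in_power(200.0)
print(round(simulated_power(w)))  # converges near the 200 W target
```

A real implementation would read board power from the driver (for example via vendor telemetry APIs) and scale something like shader iteration count instead of a toy knob, but the control structure would look much the same.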
We are now able to test graphics card coolers at specific heat loads, which makes this testing independent of the graphics card used—the only variable is the heat output applied to the cooler, which we control. This kind of noise-normalized test helps us understand just how effective a cooling solution is, how much thermal output it is designed for, and how much more it can take. The temperature plotted in the chart below is the GPU temperature as reported by the chip's on-die thermal sensors.
Very impressive: the ASUS Noctua cooler beats every NVIDIA Founders Edition cooler except the RTX 3090 FE, and even there it comes quite close.