Noise-Normalized Cooler Testing
Making an apples-to-apples comparison between graphics card coolers has always been difficult. Simply comparing temperatures won't work, because heat output varies wildly between graphics card models, and even across batches of the same card. Fan-control settings also differ from card to card, which leads to different noise levels. And even after normalizing for noise, each card still draws a different amount of power, which directly determines its heat output.
To overcome these problems, we crafted a special game-like graphics load in Unreal Engine 4 whose GPU workload can be adjusted on demand and instantly, which lets us dial in an exact power-consumption target. For all testing on this page, the cooler runs at a constant fan speed, carefully selected so that it emits a noise level of 40 dBA at a distance of 50 cm.
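To illustrate the idea, here is a minimal sketch of such a power-target feedback loop in Python. It assumes NVIDIA's NVML interface (via the pynvml bindings) for power readout; the set_workload_level() hook, the target wattage, and the gain are hypothetical stand-ins for our engine-side controls, not our actual test harness.

```python
import time
import pynvml

TARGET_WATTS = 200.0  # example heat-load target; an assumption, not a value from our tests
GAIN = 0.002          # proportional gain: workload-level change per watt of error

def set_workload_level(level: float) -> None:
    """Hypothetical hook into the UE4 load generator described above:
    0.0 means idle, 1.0 means maximum workload."""
    ...

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

level = 0.5  # start in the middle of the workload range
try:
    while True:
        # NVML reports board power draw in milliwatts.
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
        # Nudge the workload up or down until measured power matches the target.
        level = min(1.0, max(0.0, level + GAIN * (TARGET_WATTS - watts)))
        set_workload_level(level)
        time.sleep(0.5)
finally:
    pynvml.nvmlShutdown()
```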
We are now able to test graphics card coolers at specific heat loads, which makes the results independent of the graphics card used; the only variable is the heat output applied to the cooler, and we control it. This kind of noise-normalized test shows how effective a cooling solution is, how much thermal output it is designed for, and how much headroom it has beyond that. The temperature plotted in the chart below is the GPU temperature as recorded by the chip's on-die thermal sensors.
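The on-die sensor readings themselves are exposed through standard driver interfaces. As a minimal sketch, the following reads the GPU temperature through NVML (pynvml); the one-second sampling interval is an illustrative assumption, not our logging setup.

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        # Reads the on-die GPU temperature sensor, in degrees Celsius.
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"GPU temperature: {temp_c} °C")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```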