Noise-Normalized Cooler Testing
Making an apples-to-apples comparison between graphics card coolers has always been difficult. Simply comparing temperatures won't work because heat output varies wildly between graphics card models, and even from batch to batch of the same card. Another problem is that fan-control settings aren't identical across graphics cards either, which leads to different noise levels. Even after normalizing noise, each card still differs in power consumption, which directly affects heat output.
To overcome these problems, we crafted a special game-like graphics load in Unreal Engine 4 that can adjust its GPU workload on demand, instantly, which lets us dial in an exact power-consumption target. For all testing on this page, the cooler runs at a constant fan speed carefully selected to emit a noise level of 35 dBA.
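Conceptually, hitting an exact power target is a small feedback loop: measure the card's power draw, compare it to the target, and nudge the workload up or down until the two match. The sketch below illustrates that idea in Python; the set_workload_intensity and read_board_power_w hooks are hypothetical stand-ins for the test load's controls and the card's power telemetry, not part of our actual Unreal Engine 4 tool.

```python
import time

def hold_power_target(target_w, set_workload_intensity, read_board_power_w,
                      gain=0.002, tolerance_w=1.0, interval_s=0.5):
    """Simple proportional controller: nudge the workload intensity
    until the measured power draw settles at the target.

    set_workload_intensity(x): hypothetical hook scaling the GPU load, 0.0-1.0.
    read_board_power_w(): hypothetical hook returning measured power in watts.
    """
    intensity = 0.5  # start at half load
    while True:
        set_workload_intensity(intensity)
        time.sleep(interval_s)            # let the power reading settle
        error = target_w - read_board_power_w()
        if abs(error) <= tolerance_w:
            break                         # close enough to the target
        # Raise intensity when under target, lower it when over,
        # clamped to the valid 0..1 range.
        intensity = min(1.0, max(0.0, intensity + gain * error))
    return intensity
```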
We are now able to test graphics card coolers at specific heat loads, which makes this testing independent of the graphics card used; the only variable is the heat output applied to the cooler, which we control. This kind of noise-normalized test helps us understand just how effective a cooling solution is, how much thermal output it is designed for, and how much more it can take. The temperature plotted in the chart below is the GPU temperature as reported by the chip's on-die thermal sensors.
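Building on the power-targeting loop above, a noise-normalized sweep then pins the fan at its fixed 35 dBA speed, steps through a series of heat loads, and records the steady-state GPU temperature at each one. The sketch below assumes the hypothetical hold_power_target helper from earlier (with its hooks pre-bound, e.g. via functools.partial) plus an equally hypothetical read_gpu_temp_c telemetry call.

```python
import time

def sweep_heat_loads(heat_loads_w, hold_power_target, read_gpu_temp_c,
                     soak_s=300):
    """For each heat load, dial in the power target, let temperatures
    plateau, then record the on-die GPU temperature."""
    results = {}
    for watts in heat_loads_w:
        hold_power_target(watts)   # dial in the heat output in watts
        time.sleep(soak_s)         # soak until temperatures stabilize
        results[watts] = read_gpu_temp_c()
    return results

# Example: test the cooler at several fixed heat outputs (in watts)
# while the fan stays at the 35 dBA setting:
# temps = sweep_heat_loads([150, 200, 250, 300], hold, read_temp)
```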
MSI's thermal solution runs at much lower temperatures than even the RTX 3080 FE cooler, which is very impressive.