Test System
| Test System | |
|---|---|
| Processor: | Intel Core i9-9900K @ 3.7 GHz base / 5.0 GHz OC |
| Motherboard: | MSI MEG Z390 ACE (Provided by: MSI) |
| Memory: | 2x 16 GB G.SKILL Trident Z Neo DDR4 @ 3600 MHz 18-22-22-42 (Provided by: G.SKILL) |
| Video Card: | Palit GeForce RTX 3080 Gaming Pro OC |
| Storage: | Samsung 980 Pro 1 TB SSD |
| Power Supply: | EVGA SuperNova 750G2 |
| Case: | Custom test bench |
| Operating System: | Windows 10 64-bit |
| TIM: | Noctua NT-H1 (Provided by: Noctua) |
Test Methodology
The previously used CORSAIR Hydro XD5 pump/reservoir combo, an Aquaero 6 XT controller, and a Black Ice Nemesis GTX 480 radiator with CORSAIR ML120 PRO RGB fans complete the loop. The CPU is kept out of the loop so that the only heat source is the GPU and, thus, the GPU block itself. Average flow rate is set to 1 GPM, and calibrated in-line temperature sensors are used to measure the coolant temperature.
Testing a block for thermal performance is fairly simple once you realize that VRM and VRAM temperatures have to be measured manually. As such, I installed an Omega NTC-type thermistor on NVDD #1 and another on the bottom VRAM module, and connected both to an external display for temperature readouts. TechPowerUp GPU-Z was used to monitor GPU core temperatures. The GPU was overclocked to 2.1 GHz, although with GPU Boost these days being governed by power limits more than anything else, the actual clock did vary slightly. Similarly, with the core voltage nearly impossible to set manually and keep fixed at that point, it is best to compare the results below within this data set rather than against other reviews elsewhere.
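For those who would rather average the GPU-Z readings over the test window than eyeball them, here is a minimal sketch assuming GPU-Z's sensor logging to a CSV-style text file; the file name and the "GPU Temperature" column label are assumptions about a typical log, not part of the procedure above.

```python
# Minimal sketch: average GPU core temperature from a GPU-Z sensor log.
# The file name and column header are assumptions; adjust them to your log.
import csv

temps = []
with open("GPU-Z Sensor Log.txt", newline="", encoding="utf-8") as f:
    reader = csv.DictReader(f)
    for row in reader:
        # GPU-Z logs are comma-separated with padded column names,
        # so match on a substring instead of an exact header.
        for key, value in row.items():
            if key and "GPU Temperature" in key:
                try:
                    temps.append(float(value))
                except (TypeError, ValueError):
                    pass  # skip blank or malformed samples

if temps:
    print(f"Samples: {len(temps)}, average core temp: {sum(temps)/len(temps):.1f} °C")
```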
Everything required was placed inside a hotbox, and the ambient temperature was set to 25 °C. Noctua NT-H1 was used as the thermal paste of choice because not every block comes with TIM included, and cure time was taken into consideration. Three separate mounts/runs were done for statistical accuracy and to remove the chance of any mounting-related anomalies. Each run consisted of a 60-minute loop of 3DMark Time Spy Extreme, with temperatures monitored until a steady state was reached and then recorded. A delta T between GPU core/VRM and loop temperatures was thus calculated for each run, and the average delta T was then obtained across all three runs. This way, the cooling solution is taken out of the picture.
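To make the delta T math explicit, here is a minimal sketch of the per-run calculation and the three-run average described above; the temperature values are illustrative placeholders, not measured results.

```python
# Minimal sketch of the delta T averaging described above.
# Values are illustrative placeholders, not measured data.

# Steady-state readings for the three mounts/runs: (GPU core °C, coolant °C)
runs = [(48.2, 31.1), (47.9, 30.8), (48.5, 31.3)]

# Delta T per run: component temperature minus loop (coolant) temperature,
# which takes the radiator/fan side of the loop out of the comparison.
deltas = [core - coolant for core, coolant in runs]

# Average across all three mounts to smooth out mounting variance.
avg_delta_t = sum(deltas) / len(deltas)
print(f"Average GPU core delta T: {avg_delta_t:.1f} °C")
```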
Test Results
As mentioned on the previous page, I chose to test everything with the backplate in place since the majority of blocks I got in the first round include backplates in the box. This keeps things simple and avoids a bunch of very similar entries from only a few companies.
So yes, the EK-Quantum Vector block is basically identical to the EK-Classic block. This by itself was not much of a surprise considering the main difference is in the design language, with the actual cooling engine basically the same, as you will see in a subsequent review. The differences between these two block and backplate combinations are well within the margin of error, and the two even came out identical on the GPU core. The active backplate does change things somewhat, but let's hold that discussion for another time. Compared to the other two brands here, we see that the tested combination from EK does worse than the CORSAIR offering and bests the Alphacool options on the GPU core, but leaves both behind on VRAM and VRM cooling, even if not by much throughout. It does appear all the companies focused on GPU cooling, where the spread is much tighter than in the other two fields.
I also made the tough call of removing the performance-per-dollar chart this time around because prices vary so much between regions. There is no single consistent trend even when going from EU to US or UK pricing, and active backplates only make it worse. I will instead address pricing in more detail in the conclusion, which we get to now.