Power Consumption
Power Consumption Testing Details
Improving the power efficiency of the GPU architecture has been key to the success of current-generation GPUs. It is also the foundation for low noise levels, because any power consumed turns into heat that has to be moved away from the GPU by its thermal solution. Lower heat output helps reduce cost, too, as smaller, cheaper thermal solutions can be used.
For this test, we measure the power consumption of only the graphics card, via its PCI-Express power connector(s) and the PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, these values reflect only the card's power consumption as measured at its DC inputs, not that of the whole system.
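To illustrate how card-only power follows from such per-rail measurements, here is a minimal sketch in Python; the rail names and sample values are hypothetical and not our actual logging pipeline:

```python
# Minimal sketch: card-only power from per-rail voltage/current samples.
# The card's DC inputs are the PCI-Express slot (12 V and 3.3 V) and any
# external PCIe power connectors; rail names and readings are made up.

def card_power(samples):
    """Sum P = V * I over all DC inputs for one sampling instant."""
    return sum(volts * amps for volts, amps in samples.values())

instant = {
    "slot_12V":  (12.05, 2.10),  # PCIe slot, 12 V supply
    "slot_3.3V": (3.31,  0.45),  # PCIe slot, 3.3 V supply
    "pcie_8pin": (12.08, 7.90),  # external 8-pin connector
}

print(f"{card_power(instant):.1f} W")  # ~122 W with these made-up readings
```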
We use Metro: Last Light as our standard test for typical 3D gaming usage because it offers the following: very high power draw, high repeatability, support on all cards, drivers that are actively tested and optimized for it, support for all multi-GPU configurations, a relatively short test run, and a non-static scene with variable complexity.
Our results are based on the following tests:
- Idle: Windows 10 sitting at the desktop (1920x1080) with all windows closed and drivers installed. The card is left to warm up in idle mode until power draw is stable.
- Multi-monitor: Two monitors are connected to the tested card, and the two use different display timings. Windows 10 is sitting at the desktop (1920x1080 and 1280x1024) with all windows closed and drivers installed. The card is left to warm up in idle mode until power draw is stable. When two identical monitors with the same timings and resolution are used, power consumption will be lower. Our test represents the usage model of many productivity users, who have one big screen and a small monitor on the side.
- Media Playback: We use VLC Media Player to watch a 4K 30 FPS video encoded with H.264 AVC at a 64 Mbps bitrate, which makes it similar enough to many streaming services without adding a dependency on internet bandwidth. This codec should have GPU-accelerated decoding on every modern GPU, so the test covers not only GPU power management, but also the efficiency of the video decoding hardware.
- Average (Gaming): We run Metro: Last Light at 1920x1080 because it is representative of typical gaming power draw. We report the average of all readings (12 per second) while the benchmark is rendering (no title/loading screen); see the sketch after this list for how these figures are derived. In order to heat up the card, the benchmark is run once first without measuring power consumption.
- Peak (Gaming): Same test as Average, but we report the highest single reading during the test.
- Sustained (Furmark): We use Furmark's Stability Test at 1600x900, 0xAA. This results in very high power consumption that games typically cannot reach and that usually requires stress-testing applications. We report the highest single reading after a short startup period; initial bursts during startup are not included, as they are too short to be relevant.
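To make the Average, Peak, and Sustained definitions above concrete, here is a minimal sketch of how such figures could be derived from a stream of readings; the 12-per-second rate matches our setup, while the function names and the assumed startup window are illustrative:

```python
# Minimal sketch: deriving Average, Peak, and Sustained figures from power
# readings taken at 12 samples per second. Not our actual test software.

SAMPLE_RATE_HZ = 12
STARTUP_SKIP_S = 5  # assumed length of the Furmark startup window to discard

def average_and_peak(readings):
    """Gaming tests: mean of all readings, plus the highest single reading."""
    return sum(readings) / len(readings), max(readings)

def sustained(readings):
    """Furmark test: highest reading after the short startup period."""
    return max(readings[STARTUP_SKIP_S * SAMPLE_RATE_HZ:])

# Usage with made-up data (watts, while the benchmark is rendering):
avg, peak = average_and_peak([118.2, 124.9, 131.5, 127.3, 122.8])
print(f"Average: {avg:.0f} W, Peak: {peak:.0f} W")
```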
Power consumption results of other cards on this page are measurements of the respective reference design.
Single-monitor desktop power consumption is very low at only 4 W. Multi-monitor power consumption is almost five times as high, which has been a long-standing AMD problem: when multiple monitors are connected, the memory runs at full 3D speed, which drives up power consumption. Still, 18 W is not nearly as bad as the 35 W on the RX 5700, probably because fewer memory chips are used and they are not clocked as high. What's interesting here is that VRAM size has no effect on multi-monitor power consumption; it's the number of chips that counts (compare the RX 5500 4 GB vs. RX 5500 XT 8 GB).
Media playback power consumption is up quite a bit, though; perhaps the memory makes a difference here.
Gaming power consumption is very reasonable at around 120-130 W, slightly higher than the 4 GB variant's because the additional memory draws some extra power. When I saw the 8-pin power connector, I was worried; after seeing our test results, I'm not even sure why they didn't use a 6-pin. At these power consumption levels, the card will run with any power supply, so people upgrading their OEM systems, which often use PSUs of questionable quality, should have no problems running this card.
Gigabyte's factory overclock increases power draw only marginally, though Furmark can indeed push the card up to 180 W, which somewhat justifies the use of an 8-pin power connector and also ensures there's enough power headroom for when it is needed.
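For context, the connector reasoning follows directly from the PCI-Express power budgets: the slot is specified for 75 W, a 6-pin connector for 75 W, and an 8-pin connector for 150 W. A quick sketch of the arithmetic:

```python
# PCI-Express power budgets (watts) per the specification.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

# Slot + 6-pin would already cover the ~130 W gaming figure ...
print(SLOT + SIX_PIN)    # 150 W available vs. ~130 W measured
# ... while slot + 8-pin leaves comfortable headroom over the 180 W Furmark peak.
print(SLOT + EIGHT_PIN)  # 225 W available vs. 180 W measured
```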
Minimum recommended PSU: 400 W
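As a rough sanity check on that figure, consider the card's 180 W worst case plus an assumed draw for the rest of a typical system; the rest-of-system estimate and safety margin below are assumptions for illustration only:

```python
# Rough PSU sizing sketch; rest-of-system draw and margin are assumptions.
card_peak_w   = 180   # measured Furmark peak for this card
rest_of_sys_w = 130   # assumed CPU, motherboard, drives, and fans under load
margin        = 1.25  # assumed headroom for transients, aging, efficiency

print(f"{(card_peak_w + rest_of_sys_w) * margin:.0f} W")  # ~388 W -> 400 W PSU
```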