Finally, AMD's new Vega architecture is here for gamers. It brings significant architectural improvements over the company's previous "Polaris" and "Fiji" designs. The most talked-about feature is the high-bandwidth cache controller (HBCC), which lets the onboard HBM2 memory act as a cache, so not all models and textures have to be loaded into VRAM anymore. This capability is a huge leap forward for professionals working with large assets, but I have doubts about whether it can provide tangible benefits for gamers. The promise is that you will never again see warnings like "You will need X GB of VRAM for Ultra textures." RX Vega 56 and 64 both come with 8 GB of VRAM, which should be enough for many years to come, at least until consoles increase their memory amounts significantly. Another novel feature is rapid packed math, which lets developers write much more efficient, and thus faster, shader code for effects that don't need the full precision of regular shaders. I have high hopes for this feature, but it will take time to implement and requires game developers to actually spend time and money on these optimizations. AMD has added numerous other optimizations, many of which work under the hood, to provide much-needed improvements over the Fiji architecture.
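To illustrate what rapid packed math actually buys, here is a minimal sketch of the packed-FP16 idea, using CUDA's half2 intrinsics as a stand-in for what a game's shader code would do on Vega; the kernel and its names are hypothetical illustrations, not AMD's API, but the principle is identical: two half-precision operations travel through each 32-bit instruction.

```cuda
// Hypothetical sketch of packed FP16 math. Each __half2 holds two FP16
// values that the hardware processes with a single instruction, doubling
// throughput for work that tolerates reduced precision.
#include <cuda_fp16.h>

// Blend two color buffers: out = src * alpha + dst * (1 - alpha).
// A blend like this tolerates FP16 easily, which is exactly the class
// of effect packed math targets.
__global__ void blend_packed(const __half2 *src, const __half2 *dst,
                             __half2 *out, __half2 alpha, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        __half2 one = __float2half2_rn(1.0f);
        __half2 inv = __hsub2(one, alpha);  // (1 - alpha), both lanes at once
        // Fused multiply-add plus a multiply, each operating on
        // two FP16 values per instruction instead of one FP32 value
        out[i] = __hfma2(src[i], alpha, __hmul2(dst[i], inv));
    }
}
```

The speedup comes from both lanes of each half2 executing in one cycle; getting it requires developers to restructure their shader math around paired values, which is exactly the kind of per-title effort I mentioned above.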
AMD's Radeon RX Vega 64 delivers good performance that, when averaged, roughly matches the GTX 1080 Founders Edition. This makes the card almost 20% faster than the GTX 1070, 30% faster than the R9 Fury X, and 30% slower than the GTX 1080 Ti. What is interesting, though, is that individual benchmarks show huge differences between NVIDIA and AMD. In some games, NVIDIA is far ahead, while AMD is the clear winner in others. The deciding factor here doesn't seem to be DirectX 12 (which was the case with Polaris). For example, in Hitman, despite it being an AMD-sponsored title, the GTX 1080 is quite far ahead. Also worth a look are games that are CPU limited, like Civilization VI. Here, we see AMD's Vega cards bunch up against an invisible wall, one that sits at a lower FPS value than NVIDIA's. This suggests that AMD's drivers consume more CPU time than NVIDIA's to complete the same tasks, leaving less CPU time for the game. Given the numbers in our review, we can't recommend the RX Vega 64 for 4K gaming; it will shine at 1440p, though. AMD needs to promote its new architectural features, such as packed math and primitive shaders, to game developers. A game that utilizes Vega's hardware resources the way AMD intended should run significantly faster on Vega than on the GTX 1080, even when that game also carries optimizations for "Pascal."
Power efficiency has been AMD's weak point for a while, and the RX Vega 64 can't really make that a thing of the past. It seems that in order to achieve the clock speeds required to beat the GTX 1080, the company had to raise voltages to far outside the Vega GPU's comfort range, which results in terrible power consumption. Power consumption alone won't matter to most people, but higher power means more heat, which the cooler has to deal with, which usually means more fan noise. AMD has done what it can to limit noise levels by capping fan RPM and allowing temperatures to reach 85°C, but it's just not enough. At 45 dBA in gaming, the card is very noisy, which almost disqualifies it for people who are noise sensitive. I also miss an idle-fan-off feature that would turn the card's fans off during desktop work, Internet browsing, or even light gaming. Our sample exhibits some audible coil noise at high FPS, which other reviewers have confirmed as well.
AMD has included a dual-BIOS switch on the card, with the second BIOS running at a reduced power limit. Also new are two predefined "overclocking" settings in WattMan, which change the card's allowed power levels. This brings the total number of predefined configurations to six; we tested them all. As expected, the power-save settings reduce power draw considerably, which also means less heat and noise, at only a slight reduction in performance because the GPU now operates at a more power-efficient point. Turbo mode, on the other hand, shows very little benefit in some games and actually reduces performance in others. My guess is that the increased power limit lets the card run into its thermal limit more quickly, which then dials its clocks back down. Efficiency ends up lower than in the default "Balanced" mode, and the extra heat can actually push performance below "Balanced" levels. "Balanced" is the optimal mode in our testing.
Price-wise, the Radeon RX Vega 64 clocks in at $499, which is not unreasonable. It is basically priced the same as the GTX 1080, which does offer much better power, heat, and noise levels. There are pricier "halo" variants of this card. At $599, you get the "Limited Edition" card, which is basically the reference card with a fancy brushed-aluminum cooler body; it looks really cool, but has no performance benefits over the reference design. At $699, you get the Liquid Edition card with an AIO liquid-cooling solution, which enables higher clocks. This card is priced higher than the GTX 1080 Ti, and we doubt it can compete with that card in performance; at least AMD hasn't published any internal performance numbers suggesting that it could. Also, the Liquid Edition card is part of the Radeon Packs, a neato bundle of AMD hardware that could translate into savings. If you are looking for performance in very specific games, like Battlefield 1, then this could sway your buying decision; Vega 64 is roughly 10% faster than the GTX 1080 in that title. Another reason to go Vega could be FreeSync, which, unlike G-Sync, comes at no price increase on monitors. Another major feature is Enhanced Sync, which I personally love since it works on all monitors and provides a near stutter-free experience in the simplest way possible.