After completing over 250 benchmark runs, we have a much better understanding of what PCI-Express 4.0 x8 means for the Radeon RX 6600 XT. Probably the most important question for many of you is "How much performance is lost due to AMD cutting the PCIe interface in half, down to x8 from x16?" While we can't give you an exact answer in percent or FPS, because there is no way to run the card at x16, the data we have collected definitely lets us estimate it.
In the chart above, you see the results from our Relative Performance page presented in a slightly different style than our usual bar charts. On average, at 1080p Full HD, we measured a 2% performance drop going from PCIe 4.0 x8 to PCIe 3.0 x8 (half the bandwidth). Halving the bandwidth yet again, down to PCIe 2.0 x8 levels, cost a total of 7%, and running at the ancient PCI-Express 1.1 interface resulted in a 17% loss of FPS. With these four data points plotted (in purple), we can fit a curve (red dotted line) that goes through them and continues onward, extrapolating to the value we're interested in: PCI-Express 4.0 x16. The resulting performance difference is +1%. Even if we pessimistically assume a 2% difference, the same as the step from PCIe 3.0 to PCIe 4.0 even though returns diminish with additional bandwidth, the performance hit is pretty much negligible. This definitely justifies what AMD did, as long as it brought them sufficient cost savings.
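To make the extrapolation concrete, here is a minimal sketch of one way such a curve fit could be reproduced. The relative-performance figures come from the paragraph above; the nominal x8 link bandwidths, the 1/bandwidth saturation model, and the least-squares fit are our assumptions for illustration, not the exact method behind the chart.

```python
# A minimal sketch of the extrapolation described above, assuming a saturating
# 1/bandwidth model. The performance numbers come from the article; the nominal
# bandwidth figures and the fitting approach are illustrative assumptions.
import numpy as np

# Approximate usable bandwidth of an x8 link per PCIe generation (GB/s)
bandwidth = np.array([2.0, 4.0, 7.9, 15.8])        # PCIe 1.1 / 2.0 / 3.0 / 4.0
performance = np.array([83.0, 93.0, 98.0, 100.0])  # relative 1080p FPS (4.0 x8 = 100%)

# Model: perf = a + b / bandwidth, i.e. a straight line in 1/bandwidth.
# np.polyfit returns [slope, intercept] for a degree-1 fit.
b, a = np.polyfit(1.0 / bandwidth, performance, 1)

# Extrapolate to a PCIe 4.0 x16 link (~31.5 GB/s)
predicted = a + b / 31.5
print(f"Predicted PCIe 4.0 x16 performance: {predicted:.1f}%")
# Prints roughly 101%, in line with the ~+1% estimate in the text.
```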
This reasoning is based on the average across all games, though. If we take a closer look at individual results, the titles with the biggest delta between PCIe 4.0 and PCIe 3.0 are Hitman 3 at 13.5% and Death Stranding at 6.8%. Death Stranding uses the same engine as Horizon Zero Dawn, which was involved in some drama about PCIe speeds last year, too. Such performance differences are quite relevant, even if they only reflect a worst case. The underlying reason seems to be that certain engines copy tons of data between the GPU and CPU, either for some kind of post-processing or to implement features like physics and AI. This effect is much bigger at 1080p than at 4K because the data transfer happens for every single rendered frame: since 1080p runs at much higher FPS than 4K, the same per-frame transfers generate far more bus traffic at lower resolutions. Is such a performance loss in certain titles reasonable? I'm not as sure as I was when we were talking about 1–2%, but considering how rare these engines are and that the vast majority of titles see minimal effects, I'd say this is probably an issue for the game developer to fix.
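To illustrate why frame rate matters here, consider an engine that copies a fixed amount of data across the bus for every frame. The 32 MB per-frame figure and the frame rates below are hypothetical numbers chosen for the sake of the example; the article does not measure actual transfer sizes.

```python
# Back-of-the-envelope illustration: a fixed per-frame CPU<->GPU copy turns
# into sustained bus traffic proportional to FPS. The 32 MB per-frame figure
# is a hypothetical example, not a measured value.
def bus_traffic_gb_s(mb_per_frame: float, fps: float) -> float:
    """Sustained PCIe traffic (GB/s) from copying mb_per_frame megabytes each frame."""
    return mb_per_frame * fps / 1000.0

for label, fps in (("1080p", 140.0), ("4K", 45.0)):
    print(f"{label} at {fps:.0f} FPS: {bus_traffic_gb_s(32.0, fps):.1f} GB/s "
          "of a ~15.8 GB/s PCIe 4.0 x8 link")
# 1080p at 140 FPS: 4.5 GB/s ... 4K at 45 FPS: 1.4 GB/s
# The same engine behavior consumes roughly three times as much of the link
# at 1080p, which is why the FPS penalty shows up at lower resolutions first.
```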
The underlying reason we're seeing these effects in some games is that nearly all titles are developed for consoles first, which have a single pool of memory shared between CPU and GPU. Moving data between the CPU and GPU is thus incredibly fast and doesn't incur the latency and bandwidth penalties we're seeing on the PC platform. Remember, consoles basically use something similar to integrated graphics, with CPU and GPU combined in a single chip, unlike the PC, where discrete GPUs sit on the PCI-Express bus. The onus is now on game developers to make sure their games run well not only on consoles, their cash cow, but on the PC platform, too.
The second big question for this article is what happens to performance when you run the RX 6600 XT on a system that supports only PCIe Gen 3, and the answer is quite similar to that of the previous question: on average, you're looking at a 1–2% performance loss, and much more in some exotic titles. Considering PCIe Gen 3 is still used in millions of PCs, this is just as important a question, but not worth losing sleep over in my opinion. The RX 6600 XT is a huge upgrade over cards like the GTX 1060 or RX 580 and offers excellent performance at 1080p Full HD no matter whether you're on PCIe Gen 4 or Gen 3. Every single title in our test suite ran very well at the highest settings. Even in the case of Hitman 3: sure, it would have been nice to have 120 instead of 105 FPS, but is that really an issue if your alternatives are the RTX 3060 at 100 FPS, RX 5700 XT at 97 FPS, RTX 2080 at 114 FPS, and RTX 2080 Super at 119 FPS? Or much lower, of course, with that old graphics card you replaced with the RX 6600 XT. On the other hand, the GeForce RTX 3060 and RTX 3060 Ti do support the full PCI-Express 4.0 x16 interface (and 3.0 x16, too), so they will see a negligible performance hit when moving from PCIe 4.0 to PCIe 3.0.