# PCI-Express 4.0 Performance Scaling with Radeon RX 5700 XT



## W1zzard (Jul 7, 2019)

PCI-Express 4.0 has been one of the main new features prominently brandished on the product boxes of both the 3rd generation Ryzen desktop processors and Radeon RX 5700 series graphics cards. We examine the performance impact of running these cards on older generations of PCIe.



----------



## jabbadap (Jul 7, 2019)

So, not really a game changer. How about CrossFire? Oh wait, did Navi even support that? Are there any mGPU DX12/Vulkan titles out there?


----------



## Metroid (Jul 7, 2019)

Thanks for the review. As we can see here, there is no difference at all when bandwidth is plentiful; even the archaic PCIe 2.0 still holds up.


----------



## Fatalfury (Jul 7, 2019)

So much for PCIe 4.0...
No wonder PCIe 5.0 will come sooner or later.


----------



## champsilva (Jul 7, 2019)

jabbadap said:


> So, not really a game changer. How about CrossFire? Oh wait, did Navi even support that? Are there any mGPU DX12/Vulkan titles out there?



Navi does not support CrossFire.


----------



## Fluffmeister (Jul 7, 2019)

This was always going to be nothing more than a marketing bullet point, sadly. It will be more interesting in the coming years, when much faster cards hit the market.


----------



## tfdsaf (Jul 7, 2019)

People are so ignorant and uneducated. PCIe 4.0 isn't about GPU performance just yet; it's mostly about AMD's Infinity Fabric on the CPU side being better utilized, and faster connections all round for the CPU. The second benefit is much faster SSDs, in fact double-speed SSDs compared to the current generation.


----------



## Mac2580 (Jul 7, 2019)

tfdsaf said:


> People are so ignorant and uneducated. PCIe 4.0 isn't about GPU performance just yet; it's mostly about AMD's Infinity Fabric on the CPU side being better utilized, and faster connections all round for the CPU. The second benefit is much faster SSDs, in fact double-speed SSDs compared to the current generation.


I guess that's one way to look at it. On the other hand, I personally don't really care either, unless it improves FPS.


----------



## Easo (Jul 7, 2019)

This is going to be huge in the server/storage world, like stupidly huge; as SSDs are just getting bigger, bandwidth has to be able to keep up. Double the bandwidth is amazing.
Gaming results, however, are as everyone else predicted (knew, actually).


----------



## epiqpnwage (Jul 8, 2019)

Mac2580 said:


> I guess that's one way to look at it. On the other hand, I personally don't really care either, unless it improves FPS.



Good thing technology doesn't care about gamers who think the PC is just for gaming.


----------



## Athlonite (Jul 8, 2019)

So no productivity tests, then?


----------



## Zubasa (Jul 8, 2019)

Fatalfury said:


> So much for PCIe 4.0...
> No wonder PCIe 5.0 will come sooner or later.


That is because data centers want more bandwidth for storage/networking, etc. It has nothing to do with gaming GPUs.


----------



## THU31 (Jul 8, 2019)

$100 more for motherboards with completely useless technology. Great stuff!


----------



## ORLY (Jul 8, 2019)

This test indicates nothing.
What is really important is a frametime graph, to see whether there are any annoying frametime spikes.

Guys, seriously: why did you remove the minimum FPS results from your tests? Average FPS is nothing without 0.1% and 1% lows. Just sad.


----------



## Steevo (Jul 8, 2019)

ORLY said:


> This test indicates nothing.
> What is really important is a frametime graph, to see whether there are any annoying frametime spikes.
> 
> Guys, seriously: why did you remove the minimum FPS results from your tests? Average FPS is nothing without 0.1% and 1% lows. Just sad.


What's sad is the number of people who aren't capable of performing reviews posting ideas that **don't matter**. If there is an issue with frame times and lag caused by PCIe 3.0, doubling the bandwidth won't change that: doubling the bandwidth does nothing for latency, or for additional latency from the CPU, the driver, or cache misses.


----------



## danbert2000 (Jul 8, 2019)

The main win for PCIe 4.0 is effectively doubling the lanes, plus faster SSD performance. So yes, the graphics card doesn't really benefit from 4.0 x16, but you could run a graphics card at 4.0 x8 and still have 8 lanes directly wired to the CPU. People always talk about how Intel's consumer processors are limited to too few lanes and have to go over the switched DMI link for some of their devices. Well, here you go: effectively double the lanes of Intel's chips on an X570 now.

The main benefit of these GPUs running 4.0 is that they can utilize x8 links at effectively PCIe 3.0 x16 bandwidth.
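That x8-vs-x16 equivalence is easy to sanity-check from the per-lane numbers (a quick sketch; the transfer rates and encoding overheads below are the published PCIe spec values, nothing measured in this review):

```python
# Usable one-direction PCIe bandwidth: transfer rate (GT/s) x encoding efficiency x lanes / 8 bits
GEN = {  # generation -> (GT/s per lane, encoding efficiency)
    1: (2.5, 8 / 10),     # 8b/10b encoding
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),  # 128b/130b encoding
    4: (16.0, 128 / 130),
}

def bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth (GB/s, one direction) of a PCIe link."""
    rate, eff = GEN[gen]
    return rate * eff * lanes / 8

print(f"3.0 x16: {bandwidth_gbs(3, 16):.2f} GB/s")  # ~15.75 GB/s
print(f"4.0 x8 : {bandwidth_gbs(4, 8):.2f} GB/s")   # ~15.75 GB/s, identical
print(f"2.0 x16: {bandwidth_gbs(2, 16):.2f} GB/s")  # ~8.00 GB/s
```

So a 4.0 x8 link really does carry the same payload bandwidth as 3.0 x16, which is why the review's 3.0 x16 results are a good proxy for 4.0 x8.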


----------



## Steevo (Jul 8, 2019)

danbert2000 said:


> The main win for PCIe 4.0 is effectively doubling the lanes, plus faster SSD performance. So yes, the graphics card doesn't really benefit from 4.0 x16, but you could run a graphics card at 4.0 x8 and still have 8 lanes directly wired to the CPU. People always talk about how Intel's consumer processors are limited to too few lanes and have to go over the switched DMI link for some of their devices. Well, here you go: effectively double the lanes of Intel's chips on an X570 now.
> 
> The main benefit of these GPUs running 4.0 is that they can utilize x8 links at effectively PCIe 3.0 x16 bandwidth.




Exactly. The GPU isn't benefitting in any real or tangible way; peripherals like SSDs, and eventually WAN, will benefit. The comments are almost NPC-level junk about the reality of it, though.


----------



## RainingTacco (Jul 8, 2019)

OK, but what about lows, i.e. the 1% lowest framerates? Does PCIe 4.0 help with frame drops? Average values are important, but lows are too!

As others noted, please test SSD performance using PCIe 4.0!


----------



## Steevo (Jul 9, 2019)

RainingTacco said:


> OK, but what about lows, i.e. the 1% lowest framerates? Does PCIe 4.0 help with frame drops? Average values are important, but lows are too!
> 
> As others noted, please test SSD performance using PCIe 4.0!



PCIe bandwidth shouldn't change anything for the lowest 1% frame rates; that's a question of latency and driver maturity. It's like having a 16-lane superhighway: more cars can fit, but they still have to go the speed limit.


----------



## Darmok N Jalad (Jul 9, 2019)

Thanks for the review. I was wondering if this would be much of a difference maker. My gut told me no, but then AMD demoed the 5700XT on a PCIe 4.0 platform and made it a talking point. Considering the upper-mid-range nature of the card, I don't see how PCIe 4.0 could really make a significant difference. Even if there is a synergy to be had, I just don't see it making it into ver 1.0 motherboard, CPU, and GPU designs. Not enough saturation to make it a selling point at the GPU level.

Could you try some compute workloads and see how that does?


----------



## Prima.Vera (Jul 10, 2019)

So where are the results for PCIe 4.0 x16???


----------



## W1zzard (Jul 10, 2019)

Prima.Vera said:


> So where are the results for PCIe 4.0 x16???


Green bar?


----------



## bobbygamer (Jul 17, 2019)

Awesome review. It would be interesting to do more of a full-system test to see if you could 'clog up' the pipeline with multiple faster GPUs, like 2x 2080 Ti, and M.2 drives in RAID 0, and run a timed sequence of tests; maybe it would compound the differences.


----------



## W1zzard (Jul 17, 2019)

bobbygamer said:


> Awesome review. It would be interesting to do more of a full-system test to see if you could 'clog up' the pipeline with multiple faster GPUs, like 2x 2080 Ti, and M.2 drives in RAID 0, and run a timed sequence of tests; maybe it would compound the differences.


There should be no difference; the lanes are not shared.


----------



## systemBuilder (Jul 19, 2019)

I have a feeling that these games are ALREADY over-optimized (i.e. dumbed down) for NVIDIA cards with their compressed textures. PCIe 4.0 should allow the system to sideload some amazing, vivid, detailed textures when AMD gets to 50% market share, which is never. In the meantime, the 3D experience for everyone will be poorer because NVIDIA profits more.


----------



## SASBehrooz (Sep 18, 2019)

Thanks for the review. I was going to build a PCIe 4.0 system and was choosing between the Radeon RX 5700 XT and the RTX 2070 Super. Now I'm sure I'll go with the RTX 2070 Super.
Thanks!


----------



## tfdsaf (Sep 19, 2019)

SASBehrooz said:


> Thanks for the review. I was going to build a PCIe 4.0 system and was choosing between the Radeon RX 5700 XT and the RTX 2070 Super. Now I'm sure I'll go with the RTX 2070 Super.
> Thanks!


The RX 5700 XT is still a much better value just in terms of raw GPU performance; it's only 6% slower than the 2070 Super, but up to 30% cheaper. Hardware Unboxed did a 39-game test and found the RX 5700 XT just 6% slower on average.

Plus, PCIe 4.0 is more for AMD's CPUs and the Infinity Fabric, as well as much faster SSDs, plus you are future-proofing. But yeah, even an RTX 2080 Ti won't saturate a PCIe 3.0 x16 slot.


----------



## computerdiehard (Nov 20, 2019)

The only true test is to put that video card on an older chipset. It's not a true test when everything is on the same motherboard.


----------



## W1zzard (Nov 20, 2019)

computerdiehard said:


> The only true test is to put that video card on an older chipset. It's not a true test when everything is on the same motherboard.


Why?


----------



## computerdiehard (Nov 20, 2019)

Why? Because the X570 has 44 PCI-Express lanes, and it's not like you can go into the BIOS and set how many PCI-Express lanes you want to dedicate. You also can't set what PCIe version you want in the BIOS.

Correct?


----------



## W1zzard (Nov 20, 2019)

computerdiehard said:


> Why? Because the X570 has 44 PCI-Express lanes, and it's not like you can go into the BIOS and set how many PCI-Express lanes you want to dedicate. You also can't set what PCIe version you want in the BIOS.
> 
> Correct?



Actually, no. The PCIe speed (version) can be set in the BIOS, and the number of lanes can be changed by physically taping off some contacts on the slot; the card will autonegotiate to the correct link width.


----------



## Jo3yization (Dec 20, 2019)

Please test bandwidth at 4K, PCIe 3.0 vs. 4.0, with ReLive recording. I've been able to reproduce consistent, evenly spaced stutter when recording with ReLive under high VRAM usage in games like SotTR & Gears 5, but I don't have a PCIe 4.0 platform to compare against. I'm running a SanDisk Ultra II SSD and tried multiple drivers to see if that may have been the issue, but it makes no difference. If I bring the VRAM usage down by lowering texture quality, or simply turn ReLive off, the stutter disappears, so it's definitely connected to VRAM & ReLive.


----------



## Athlonite (Dec 21, 2019)

Jo3yization said:


> Please test bandwidth at 4K, PCIe 3.0 vs. 4.0, with ReLive recording. I've been able to reproduce consistent, evenly spaced stutter when recording with ReLive under high VRAM usage in games like SotTR & Gears 5, but I don't have a PCIe 4.0 platform to compare against. I'm running a SanDisk Ultra II SSD and tried multiple drivers to see if that may have been the issue, but it makes no difference. If I bring the VRAM usage down by lowering texture quality, or simply turn ReLive off, the stutter disappears, so it's definitely connected to VRAM & ReLive.



What CPU are you using?


----------



## Jo3yization (Dec 21, 2019)

Athlonite said:


> What CPU are you using?


i7-6700K
Z170A Gaming Pro
2x 8 GB DDR4-3200 CL16 (double-checked in CPU-Z, it's in dual channel, passes Memtest with no problems; I don't get any crashes or BSODs)
Sapphire Pulse RX 5700 (non-XT)

Here's what the stutter looks like:

For the first ~26 minutes of the video the recording is smooth, but after a few cutscenes at the timestamp, watch the VRAM usage + frametimes: you can see it climb to about 7.3 GB when the stutter starts. I knew what was happening straight away, so I lowered texture streaming to eliminate it. It's happened in a bunch of other titles over different driver revisions too (19.9.1 & 2, 19.11.2-3, and 19.12.1-2). If ReLive is off, it doesn't happen at all, even at just under 8 GB of usage. I also noticed it tends to be more prevalent in the DX12 API; if I use DX11 in Borderlands 3, for example, at 3440x1440 with the same settings, the VRAM usage was a good 2 GB lower, so the problem didn't occur.

Here's Borderlands 3 (this was on 19.11 drivers):

_The first benchmark run is in the DX11 API. There was a bug with 19.11 that reports incorrect VRAM usage in some utilities, so the numbers aren't reliable, but they do show higher usage in DX12 at the very least. You can see me run the benchmark smoothly in DX11 at first, then swap to DX12 and get immediate stutter, which I resolve by lowering settings to reduce the VRAM usage._

So as you can see, I can consistently reproduce the problem, and I've sent the details to AMD, though recording at 3440x1440 is a bit of a niche use of the RX 5700, so obviously not many people have encountered it yet. Seeing as ReLive *should* be GPU-encoding with only a small load on the CPU, for it to happen only when VRAM usage is high, I'm guessing there's either some kind of memory leak or texture swapping going on around a certain level of usage (looks like about 7.3 GB in Gears 5), as ReLive itself must need a certain amount for encoding, and it's somehow conflicting with the game.

So far I've reproduced it in:
Borderlands 3
Jedi: Fallen Order
Escape from Tarkov
Red Dead Redemption 2
Need for Speed Heat
& Gears 5.

I did theorize that maybe it's a disk issue, so I tried changing the recording location to my other SSD, dropping the bitrate from 50 MB/s to 5 MB/s, and swapping from the HEVC I normally use to plain AVC, but none of it made any difference; recording is smooth until the game's VRAM usage gets to a certain point. I also tried running completely stock (as you can tell, I'm running a core OC on the RX 5700), but that did not resolve the problem either. There's also plenty of free space on all 3 of my SSDs, and they all record smoothly at the same bitrates before the stutter occurs, or if I simply turn texture quality / other settings down to keep the VRAM usage from hitting whatever amount triggers it.


For now I've just learned to live with it, and I put it down to either driver or ReLive/GPU limitations when recording at high resolution. But I thought maybe PCIe 4.0 could help, after seeing the behaviour of the 5500 XT driven into its bandwidth limit at Igor's Lab: https://www.igorslab.de/en/amd-rade...h-the-morepowertool-and-into-bandwidth-limit/ Even though that GPU's issue is caused by a much smaller bus width, maybe under certain high-VRAM-usage scenarios it could occur with Navi 10 GPUs under PCIe 3.0 as well...

Or it's all happening exclusively on the GPU, with the ReLive encoder fighting the game's textures for VRAM, which makes the most sense, and it might be fixable via driver updates that prioritize or 'balance' VRAM usage better, e.g. by dynamically reducing bitrate at high levels of usage to prevent texture/encoder swapping, for lack of a better term.


----------



## Athlonite (Dec 22, 2019)

Jo3yization said:


> Or it's all happening exclusively on the GPU, with the ReLive encoder fighting the game's textures for VRAM, which makes the most sense



I'd say you just hit the proverbial nail on the head right there. By turning down the textures from Ultra to High, you don't get the stutter; that points directly at what you were getting at: it's a resource fight between ReLive and the game engine for VRAM. Personally, for recording I'd just turn the textures down to High, but leave them on Ultra for pure gameplay time.


----------



## gphantom (Apr 12, 2020)

No one who benchmarks video cards seems to use X-Plane or other flight-sim software in their tests. X-Plane 11 especially doesn't rely on CPU core count, but rather on how fast the CPU and graphics card are. It does not rely on linked cards either, so to show a graphics card's true speed and bandwidth capability, use X-Plane 11 as a testing platform. THEN we can truly judge how capable a graphics card, and even a motherboard/CPU/memory combination, really is.

Phil
Application/Software engineer since 1976


----------



## EarthDog (Apr 12, 2020)

gphantom said:


> No one who benchmarks video cards seems to use X-Plane or other flight-sim software in their tests. X-Plane 11 especially doesn't rely on CPU core count, but rather on how fast the CPU and graphics card are. It does not rely on linked cards either, so to show a graphics card's true speed and bandwidth capability, use X-Plane 11 as a testing platform. THEN we can truly judge how capable a graphics card, and even a motherboard/CPU/memory combination, really is.
> 
> Phil
> Application/Software engineer since 1976


what?

The point here is to see if games are affected by available slot bandwidth. While a single-core game may rely on clock speed, what makes you think there is more data going through the PCIe slot that makes 'all flight sims' better for this type of test? How do you think it would differ from the other, mostly negligible, results?


----------



## Jo3yization (Apr 14, 2020)

So.. This thread popped up in my emails and I figured I'd post an update.

The stutter is definitely VRAM-usage related, and it can occur even *without* ReLive running in certain titles that are very resource-hungry. One such title is CoD: Warzone, which many users are reporting stutter and FPS-drop problems with; I narrowed the issue down to the amount of VRAM the game uses, as opposed to its 'in-game' calculator showing less than half the estimate.

Here's a vid pretty much demonstrating the cause of the stutter many are experiencing with CoD: Warzone. It's basically texture-swapping stutter, but the big factor is that Warzone uses enough VRAM to cause swapping at only 1440p/1080p on lower-VRAM cards; it'll hit 4 GB of usage very easily. I even tested at a very low 720p and was still able to hit 4 GB of VRAM usage.

Just sharing, since it's interesting: even though the cause of the high VRAM usage is game-related, the symptom is basically the same as the ReLive/VRAM-usage stutter, which confirms it definitely is some texture-swapping issue.


----------



## ProDigit (May 8, 2020)

In other words, what you should have done as additional testing is PCIe 4.0 x16, x8, x4, and x1, to see the performance differences there.


----------



## Jo3yization (Aug 21, 2020)

Oh, I would if I had the hardware to test, but I'm stuck on a 6th-gen, PCIe 3.0 system for the moment. I also encountered the same high-VRAM/swapping issue in Horizon Zero Dawn, and was able to eliminate the 'periodic' stutter on my 8 GB RX 5700 using the same method that works with Warzone: lowering VRAM usage. It seems that once the game has to start swapping data between video memory and system RAM, stutter occurs. But given the bandwidth of both VRAM and dual-channel DDR4 @ 3200 MHz (~51 GB/s), the limitation in this path would actually be PCIe 3.0 bandwidth (~16 GB/s per direction), so in theory, PCIe 4.0 with double the bandwidth would scale better and could possibly eliminate the stutter at some point. I wish I had a PCIe gen 4 setup to test.
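A back-of-the-envelope sketch of why that swapping shows up as stutter (the 256 MB spill size is an illustrative assumption, and the ideal link rates ignore latency and contention, so treat the numbers as rough bounds):

```python
# If a game spills textures to system RAM, frames that touch the spilled data
# must pull it back across the PCIe link. Compare that transfer time to the
# ~16.7 ms frame budget of 60 FPS.
PCIE3_X16_GBS = 15.75  # usable one-direction bandwidth, PCIe 3.0 x16
PCIE4_X16_GBS = 31.51  # PCIe 4.0 x16 (double the per-lane rate)

def transfer_ms(megabytes: float, link_gbs: float) -> float:
    """Milliseconds to move `megabytes` over a link of `link_gbs` GB/s."""
    return megabytes / 1024 / link_gbs * 1000

spill_mb = 256  # hypothetical texture data re-fetched in one frame
print(f"PCIe 3.0 x16: {transfer_ms(spill_mb, PCIE3_X16_GBS):.1f} ms")  # ~15.9 ms
print(f"PCIe 4.0 x16: {transfer_ms(spill_mb, PCIE4_X16_GBS):.1f} ms")  # ~7.9 ms
```

On those assumptions, a single 256 MB re-fetch nearly consumes a whole 60 FPS frame over PCIe 3.0 x16, which is consistent with swap-induced stutter; PCIe 4.0 halves the cost but doesn't remove it.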

This VRAM 'buffer' performance loss when VRAM limits are reached also lines up with the Horizon Zero Dawn PCIe 3.0 vs. 4.0 test done by TechSpot, shown in the benchmarks listed here: *https://www.techspot.com/review/2084-amd-or-intel-for-gaming-benchmarking/*. You can actually see that at the higher usage of 1440p and 4K, the PCIe 4.0 system performs better in HZD with an 8 GB GPU than the otherwise faster Intel system limited to Gen 3.0, which dominates pretty much all the other benchmarks.

Here's a link to my own high-VRAM-usage stutter test in Horizon Zero Dawn, which anyone running a Navi GPU can likely replicate too:


----------



## msroadkill612 (Sep 4, 2020)

The point here is that we sheep have (since PCIe 2 GPUs, anyway) for many years been overspending on GPUs and wasting 8 PCIe lanes.

Even on PCIe 3, 16 lanes rarely helps.

The upshot is you can have a cheaper-to-build, just-as-good x8 PCIe 4 GPU, yet still be as future-proof as a 16-lane PCIe 3 GPU for when games inevitably do start finding more uses for evolving, faster, bigger system resources.


----------



## Jo3yization (Sep 5, 2020)

We can? I think it really depends on the use case. Some situations and games will definitely benefit, especially newer titles, even at 1080p.


----------



## B_Bang (Jan 22, 2022)

Any chance you could do this on the 6600 non-XT card? Given its lower performance, I think it might be more affected by PCIe version...


----------



## ExalyThor (Apr 23, 2022)

Thanks for the testing! I still use an i7-2600K, and I wondered if PCIe 2.0 would be a bottleneck.


----------

