
NVIDIA GeForce RTX 4090 PCI-Express Scaling with Core i9-13900K

Joined
Mar 5, 2023
Messages
49 (0.08/day)
This comparison makes no sense.
Alder Lake and Raptor Lake use PCI Express Gen5, not Gen4.
What's the point of this comparison?
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Excuse me? It was a simple question. I'm not, as you say, whining about it. I just want to know why AMD was excluded from what many might refer to as one of the most important series of benchmarks to be featured on the Internet.

While it might've been true in the past, Intel is no longer the top dog in the industry. They have competition yet it seems nearly every publication and YouTube influencer uses Intel chips as their base in many of their benchmark rigs. Why? I'm not just calling out Wizzard here, I'm calling out... everyone in the benchmark space. Why always Intel?
Because you didn't read the article.

W1zzard said:
Roughly every one to two years we update our test system, so we took the opportunity to revisit this PCI-Express performance scaling topic on our latest 2023 VGA Test Bench.

Upgrading the graphics card review test-bed is no small feat here at TechPowerUp, it involves testing 40 graphics cards across 25 game tests in rasterization, plus nine with ray tracing, all of those at three resolutions, with additional time spent to retest to correct testing errors due to suspicious results. The whole exercise typically takes up to several weeks. We are finally done with our upgrade, and our latest machine rocks an Intel Core i9-13900K "Raptor Lake" processor, an EVGA Z790 DARK motherboard, 32 GB of DDR5-6000 memory, and an ATX 3.0 power supply that natively supports 12VHPWR. This is a significant uplift in not just CPU compute muscle, but also IPC from the eight "Raptor Cove" P-cores.

Why did W1zz choose the 13900K? Because at the time he updated the test system it was the fastest gaming CPU in the world, since the X3D Zen 4 CPUs hadn't been released yet.

Thank you for the article. I have a Z690 board and am installing a new heatsink on my Gen 4 Samsung SSD, and at the same time was considering moving it to the Gen 4 M.2 slot and leaving the Gen 5 slot just for my 4090. My PC is used for gaming and I don't see any reason to move the SSD unless it's for better airflow, which I may do anyway just for grins.
Your SSD doesn't need a heatsink. Don't waste your time.

Am I understanding you correctly? I'm on z790 with a 13900k and a 4090. I currently have a Gen 4 m.2 in the slot closest to my CPU. My GPU is still running at x16. Are you saying it has to be a gen 5 m.2?
Yes.
 
Joined
Mar 5, 2023
Messages
49 (0.08/day)
The RTX 4090 is PCI-E Gen 4, that's why. There is no Gen 5 GPU in existence to test.

When a Gen 5 GPU exists then there will be Gen 5 scaling tests done.
The test is nonsense.
If the purpose is to test the performance loss from combining an SSD with a GPU, they should have tested using PCIe 5.0.

Current CPUs are PCIe 5.0, so they should have included a PCIe 5.0 x8 test. As it stands, it's simply useless and a waste of time for both the author and the people who read the article.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,845 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The test is nonsense.
If the purpose is to test the performance loss from combining an SSD with a GPU, they should have tested using PCIe 5.0.

Current CPUs are PCIe 5.0, so they should have included a PCIe 5.0 x8 test. As it stands, it's simply useless and a waste of time for both the author and the people who read the article.
No PCIe 5 x8 GPU exists. Can I borrow your time machine? I'd also need an x16 one, so we can quantify the delta between x8 and x16
 
Joined
Mar 5, 2023
Messages
49 (0.08/day)
No PCIe 5 x8 GPU exists. Can I borrow your time machine? I'd also need an x16 one, so we can quantify the delta between x8 and x16
The point of the article is to show how a GPU is limited on current-gen CPUs when paired with an SSD or with a reduced-bandwidth slot at x8.

Current CPUs use PCIe 5.0 slots, so the article makes no sense.

You don't need a PCIe 5.0 GPU for this test; you can simply show that a PCIe 4.0 GPU loses no performance when using a PCIe 5.0 slot at x8.

I repeat, this article is a waste of time and completely useless.
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
The point of the article is to show how a GPU is limited on current-gen CPUs when paired with an SSD or with a reduced-bandwidth slot at x8.

Current CPUs use PCIe 5.0 slots, so the article makes no sense.

You don't need a PCIe 5.0 GPU for this test; you can simply show that a PCIe 4.0 GPU loses no performance when using a PCIe 5.0 slot at x8.

I repeat, this article is a waste of time and completely useless.

Did you register here just to throw a tantrum? Your parents must be proud of you.

If you connect a Gen5 SSD, your Gen4 GPU will be running at 4.0 x8. Your Gen3 GPU will be running at 3.0 x8. The lanes are cut in half; the generation doesn't matter.

This test shows that you basically don't lose any GPU performance when using a Gen5 SSD. It's simulated, but you don't need a Gen5 SSD for this test, because you can set the PCI-E speed in the BIOS. So 3.0 x16 will give you the same result as 4.0 x8.
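
If you want the arithmetic behind that equivalence, here's a rough sketch in Python. These are theoretical per-direction figures derived from the PCIe per-lane rates, ignoring protocol overhead, so treat the exact GB/s numbers as approximations:

```python
# Rough PCIe bandwidth sketch: per-lane throughput roughly doubles each
# generation, so halving the lane count at one generation gives about the
# same raw bandwidth as full lanes one generation lower. That is why a
# BIOS-forced 3.0 x16 link can stand in for 4.0 x8 in this kind of test.
# Approximate one-way GB/s per lane, after encoding overhead.
PER_LANE_GBPS = {
    "1.1": 0.25,    # 2.5 GT/s, 8b/10b encoding
    "2.0": 0.5,     # 5.0 GT/s, 8b/10b encoding
    "3.0": 0.985,   # 8.0 GT/s, 128b/130b encoding
    "4.0": 1.969,   # 16 GT/s, 128b/130b encoding
    "5.0": 3.938,   # 32 GT/s, 128b/130b encoding
}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-way bandwidth of a PCIe link, in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("4.0", 16), ("4.0", 8), ("3.0", 16), ("2.0", 8)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth_gbps(gen, lanes):.1f} GB/s")
# PCIe 4.0 x16: ~31.5 GB/s
# PCIe 4.0 x8:  ~15.8 GB/s
# PCIe 3.0 x16: ~15.8 GB/s  (same ballpark as 4.0 x8, hence the BIOS trick)
# PCIe 2.0 x8:  ~4.0 GB/s
```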
 
Joined
Mar 5, 2023
Messages
49 (0.08/day)
Did you register here just to throw a tantrum? Your parents must be proud of you.

If you connect a Gen5 SSD, your Gen4 GPU will be running at 4.0 x8. Your Gen3 GPU will be running at 3.0 x8. The lanes are cut in half; the generation doesn't matter.

This test shows that you basically don't lose any GPU performance when using a Gen5 SSD. It's simulated, but you don't need a Gen5 SSD for this test, because you can set the PCI-E speed in the BIOS. So 3.0 x16 will give you the same result as 4.0 x8.
No, that's not how it works.

If I connect a PCIe 5.0 SSD, the GPU will work at PCIe 5.0 x8 and the SSD will work at PCIe 5.0 x8, even if it only uses x4.
 
Joined
Mar 31, 2014
Messages
1,533 (0.39/day)
Location
Grunn
System Name Indis the Fair (cursed edition)
Processor 11900k 5.1/4.9 undervolted.
Motherboard MSI Z590 Unify-X
Cooling Heatkiller VI Pro, VPP755 V.3, XSPC TX360 slim radiator, 3xA12x25, 4x Arctic P14 case fans
Memory G.Skill Ripjaws V 2x16GB 4000 16-19-19 (b-die@3600 14-14-14 1.45v)
Video Card(s) EVGA 2080 Super Hybrid (T30-120 fan)
Storage 970EVO 1TB, 660p 1TB, WD Blue 3D 1TB, Sandisk Ultra 3D 2TB
Display(s) BenQ XL2546K, Dell P2417H
Case FD Define 7
Audio Device(s) DT770 Pro, Topping A50, Focusrite Scarlett 2i2, Røde VXLR+, Modmic 5
Power Supply Seasonic 860w Platinum
Mouse Razer Viper Mini, Odin Infinity mousepad
Keyboard GMMK Fullsize v2 (Boba U4Ts)
Software Win10 x64/Win7 x64/Ubuntu
I see the children are here to whine as usual about "WhY DidN'T yOU RuN thIS on amD CPu" SHUT UP AND SIT DOWN. You don't bother to understand or care how much time and effort Wizz puts into running these benchmarks and providing the results FOR FREE. If you want AMD CPU benchmarks, then run them yourself.


AMD didn't gimp it, they took a GPU that was designed to be used as a dGPU in laptops - connected to the CPU over 4 dedicated lanes of PCIe - and put it on a PCIe card, so they had something below the 6600 to compete with Arc and the 1650/3050. But it turns out that a low- to mid-range GPU with a lower amount of VRAM needs to transfer a lot more data over the PCIe bus, and a PCIe x4 link absolutely doesn't cut it in that scenario. On top of that, the 6500 XT GPU is also missing many features (because it was expected that the CPU it was paired with would have them), which makes it even more of a disappointment.

The 6500 XT's "predecessor", the 5500 XT, was designed for desktop with a PCIe x8 link, and worked pretty well as a result. I still don't know why AMD didn't do a rebrand of the 5500 XT for the 6500 XT, instead of trying to fit a square peg into a round hole - it's not like AMD or NVIDIA are strangers to rebranding old GPUs as new when necessary.
I think it's probably also worth noting that AMD's memory management on dGPUs appears to be less refined than Nvidia's; even with an equal PCIe bus, performance on Nvidia cards tends to degrade much more gracefully as they run into VRAM issues. The mechanism isn't really clear to me (NV could be automatically culling texture detail), but the FPS numbers rarely become as erratic as quickly as on AMD parts.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,845 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
If I connect a PCIe 5.0 SSD, the GPU will work at PCIe 5.0 x8 and the SSD will work at PCIe 5.0 x8, even if it only uses x4.
No, the GPU will work at x8 with whatever PCIe capability it supports. You can't magically add new capabilities like that

Yes, I have a PCIe 5.0 SSD, engineering sample from Phison, the one with the small fan
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
No, that's not how it works.

If I connect a PCIe 5.0 SSD, the GPU will work at PCIe 5.0 x8 and the SSD will work at PCIe 5.0 x8, even if it only uses x4.

Your Gen4 x16 GPU will work as Gen5 x8?

Well, I guess there's no point in continuing this conversation. You should google some stuff up.
 
Joined
Sep 15, 2015
Messages
1,076 (0.32/day)
Location
Latvija
System Name Fujitsu Siemens, HP Workstation
Processor Athlon x2 5000+ 3.1GHz, i5 2400
Motherboard Asus
Memory 4GB Samsung
Video Card(s) rx 460 4gb
Storage 750 Evo 250 +2tb
Display(s) Asus 1680x1050 4K HDR
Audio Device(s) Pioneer
Power Supply 430W
Mouse Acme
Keyboard Trust
I have motherboards with Gen 1 and Gen 2.
They're used for movies and music.
 
Joined
May 30, 2015
Messages
1,929 (0.56/day)
Location
Seattle, WA
If I connect a PCIe 5.0 SSD, the GPU will work at PCIe 5.0 x8 and the SSD will work at PCIe 5.0 x8, even if it only uses x4.

No, the GPU will be at Gen 4 because it only supports operating at Gen 4. Your graphics card does not magically become capable of operating at Gen 5 link speeds simply because you have Gen 5 on your CPU or SSD; that is not how PCI Express link training works. Remember: the generation number only tells you what the link bandwidth is PER LANE. A PCI-E Gen 4 device operates at 16GT/s per lane MAXIMUM. It cannot operate at 32GT/s because it is not designed to do so, and it will automatically only receive 16GT/s when plugged into a Gen 5 platform.
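
A toy model of that negotiation, in Python. The function name and structure are made up purely to illustrate the "both ends settle on the lowest common capability" behaviour described above:

```python
# Toy model of PCIe link training: the link comes up at the highest
# generation and widest lane count that BOTH ends support. Neither side
# can be pushed past its own capability. Hypothetical names, for
# illustration only.
def negotiate_link(slot_gen: int, slot_width: int,
                   device_gen: int, device_width: int) -> tuple[int, int]:
    """Return the (generation, lane width) the link actually trains to."""
    return min(slot_gen, device_gen), min(slot_width, device_width)

# An RTX 4090 (a Gen 4 x16 device) in a Gen 5 slot that has dropped to x8
# because a Gen 5 SSD took the other eight CPU lanes:
gen, width = negotiate_link(slot_gen=5, slot_width=8,
                            device_gen=4, device_width=16)
print(f"Link trains to Gen {gen} x{width}")  # -> Gen 4 x8, not Gen 5 x8
```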

 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
This comparison makes no sense.
Alder Lake and Raptor Lake use PCI Express Gen5, not Gen4.
What's the point of this comparison?
Alder Lake and Raptor Lake motherboards do have a PCIe Gen5 x16 slot but, unfortunately, their owners can do literally nothing with it until late 2024, GPU-wise. Some high-end boards offer bifurcation for NVMe Gen5 drives, or an entire AIC for NVMe Gen5 drives, to "make sense" of those Gen5 lanes.

You should rather be asking whether those Alder Lake motherboards with a Gen5 GPU slot made any sense in 2021, 2022 and 2023. By the time Gen5 peripherals become more mainstream, in 2024 and onwards, many high-end Alder Lake motherboard owners will surely want to buy a new motherboard, as Intel will have moved to the LGA 1851 socket.
 
Joined
Aug 4, 2020
Messages
1,614 (1.02/day)
Location
::1
Alright, can we all, like, relax?

There are no tangible performance gains from a PCIe 4.0 x4 SSD, let alone 5.0 - the way NAND works (it's not bit-addressable), the bottleneck simply isn't at the bus.

Just keep running your GPU at x16 and enjoy your (free) 2% (up to 7%) performance.
And save on your SSD as well; something like an SN570 is all you'll ever need.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
Yes, I have a PCIe 5.0 SSD, engineering sample from Phison, the one with the small fan
It must be a tremendous privilege these days to have a Gen5 NVMe drive with a small fan.

If I connect a PCIe 5.0 SSD, the GPU will work at PCIe 5.0 x8 and the SSD will work at PCIe 5.0 x8, even if it only uses x4.
Just think about how bifurcation works, dude. Google it, educate yourself, and listen to what the members above wrote to you.
1. If you connect a Gen5 SSD, the first GPU slot will be capable of working at Gen5 x8 speed only if you have a GPU that supports PCIe 5.0. There is none until 2024.
2. An NVMe SSD uses an x4 connection. If you have a Gen5 x8 NVMe AIC for two Gen5 SSDs connected to the second x16 slot, it will only work as a Gen5 x8 device if you insert two Gen5 SSDs. One SSD will work at Gen5 x4.
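
For what it's worth, here is a minimal sketch of that bifurcation behaviour in Python. It's a hypothetical model of a typical consumer board's x16 / x8+x8 split, not any vendor's actual firmware logic:

```python
# Minimal sketch of x16 CPU-lane bifurcation as described above: the root
# port splits into fixed groups (x16 alone, or x8 + x8 when the second
# slot / M.2 riser is populated). A device that only links at x4 inside
# an x8 group does NOT hand the spare lanes back to the GPU.
def bifurcate(populated_slots: int) -> list[int]:
    """Lane groups handed out by one x16 root port (typical consumer board)."""
    return [16] if populated_slots == 1 else [8, 8]

gpu_group, ssd_group = bifurcate(populated_slots=2)  # GPU + Gen5 SSD riser
ssd_link = min(ssd_group, 4)                         # an NVMe drive links at x4
print(f"GPU slot: x{gpu_group}, SSD: x{ssd_link}")   # -> GPU at x8, SSD at x4
```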
 
Joined
Feb 1, 2019
Messages
3,602 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Seems like all we got from PCI-E 5.0 is more expensive motherboards.

GPUs don't even need 4.0 for gaming. Pointless marketing scheme for a feature that should be reserved for professional applications.
6.0 is on the way, which will require an ECC chip as well. Looks like there's no sign of the PCI-E train slowing down.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
GPUs don't even need 4.0 for gaming. Pointless marketing scheme for a feature that should be reserved for professional applications.
They do need Gen4. PCIe 3.0 x16 (4.0 x8) has been saturated, by a whisker. That's the finding of this test and previous scaling tests from last year. Average loss in performance is ~2%, with some variation across games.
The saturation point is between Gen4 x8 and Gen4 x16, much closer to x8.

I just want to know why AMD was excluded from what many might refer to as one of the most important series of benchmarks to be featured on the Internet.
There are several scaling test reviews with AMD CPUs and GPUs, both on TPU and at other tech outlets, such as Hardware Unboxed.
Find them, dude. Don't shout around before you look around. Simple. I am telling you this as the owner of several Intel and AMD systems at home and at work.
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
They do need Gen4. PCIe 3.0 x16 (4.0 x8) has been saturated, by a whisker. That's the finding of this test and previous scaling tests from last year. Average loss in performance is ~2%, with some variation across games.

I doubt this is a question of bandwidth. More likely latency or overhead. But it's still a minimal difference. If you look at the previous PCI-E scaling tests, the results are always similar, with each older generation being slightly slower.

The 3080 test from 2020 shows the card being perfectly usable on PCI-E 2.0. Even 1.1 was just 13% behind 4.0. The 4090 is twice as fast, but the difference is only slightly bigger.

I expect the 5090 to be within a 5% margin when using 3.0 x16.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Good to know 8x 2.0 still manages to hold up somehow, through the power of magic

Is it assumed that 4x 3.0 and 8x 2.0 would perform the same, or has that been verified in the past?

The 5800X rig was booted from MBR
That disables REBAR?
Seems like REBAR would be what benefits from extra bandwidth the most, other than DirectStorage
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
Is it assumed that 4x 3.0 and 8x 2.0 would perform the same, or has that been verified in the past?
In Hardware Unboxed's tests, Gen3 x4 loses a significant amount of performance. This is the same bandwidth as Thunderbolt 4 for external GPUs.
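
The Thunderbolt comparison is easy to sanity-check with some quick arithmetic (theoretical figures, ignoring protocol overhead on both sides):

```python
# Quick arithmetic behind the Thunderbolt 4 comparison: PCIe 3.0 x4 works
# out to roughly the 32 Gb/s of PCIe tunnelling that TB4 provides for an
# external GPU. Theoretical figures, ignoring protocol overhead.
pcie3_per_lane_gbps = 8 * (128 / 130)    # 8 GT/s with 128b/130b encoding -> ~7.88 Gb/s
pcie3_x4_gbps = pcie3_per_lane_gbps * 4  # -> ~31.5 Gb/s
tb4_pcie_tunnel_gbps = 32                # TB4's PCIe data budget

print(f"PCIe 3.0 x4: ~{pcie3_x4_gbps:.1f} Gb/s vs TB4 PCIe tunnel: {tb4_pcie_tunnel_gbps} Gb/s")
```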
 
Joined
Mar 7, 2023
Messages
2 (0.00/day)
Processor 12700k
Motherboard z690
Memory 32gb
Video Card(s) rtx 3070
Storage 1tb
Display(s) oled
Power Supply 750w
Mouse viper 8k
Does 4.0 x12 perform the same as x16?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,845 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Does 4.0 x12 perform the same as x16?
While the PCIe spec allows x12 in theory, I'm not aware of any device that supports x12, only x16 or x8
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
In Hardware Unboxed's tests, Gen3 x4 loses a significant amount of performance. This is the same bandwidth as Thunderbolt 4 for external GPUs.

From what I remember, PCI-E bandwidth makes a huge difference when the card runs out of VRAM.

This was shown on the 6500 XT 4 GB which was limited to 4 lanes. On PCI-E 3.0, you could lose as much as 50% performance in certain games.
The 4090 is an infinitely faster card, yet it does fine even on PCI-E 1.1, as it never runs out of memory.

The 3050 is perfectly fine on PCI-E 3.0, with 8 lanes and 8 GB of VRAM.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,845 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Video summary of the article

 