
NVIDIA GeForce RTX 4090 PCI-Express Scaling

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,815 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.
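For context, the theoretical per-direction bandwidth of each tested link can be worked out from the signaling rate and line encoding. A quick sketch in Python (theoretical maxima; real-world throughput is lower):

```python
# Theoretical per-direction PCIe bandwidth for the configs tested in the review.
# Gen 1/2 use 8b/10b encoding; Gen 3 and newer use 128b/130b.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}  # GT/s per lane
ENCODING    = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130, 5: 128 / 130}

def bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Usable bandwidth in GB/s: transfer rate x encoding efficiency / 8 bits."""
    return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8

for label, gen, lanes in [("4.0 x16", 4, 16), ("4.0 x8", 4, 8),
                          ("3.0 x16", 3, 16), ("2.0 x16", 2, 16),
                          ("1.1 x16", 1, 16)]:
    print(f"PCIe {label}: {bandwidth_gb_s(gen, lanes):5.1f} GB/s")
```

Note that PCIe 4.0 x8 and PCIe 3.0 x16 both come out to the same ~15.8 GB/s, which is why those two configs should perform near-identically.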

 
Joined
Mar 14, 2014
Messages
1,387 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
Why not use Vulkan on RDR2?
 
Joined
Sep 10, 2019
Messages
22 (0.01/day)
System Name Zen-TR16x
Processor AMD Threadripper 1950x
Motherboard Gigabyte Aorus X399 Gaming
Cooling Arctic Freezer 33 TR
Memory 32GB 3200MHz (4x8GB)
Video Card(s) Asus RTX 3070 FE
Storage Samsung 860 Evo SSD 2TB
Display(s) LG 34"
Case Phantec 500s
Power Supply Corsair 650W
Benchmark Scores Gears 5 : 87fps at 1080p
So I'm only checking the 4K results, as that's where you see what a GPU is really made of.

This article will be important for those deciding between AMD X670, X670E, B650, and B650E. Going by your findings, you only need to make sure the M.2 slot gets the PCIe Gen 5 lanes, since the GPU won't be able to use them anyway. The ~50 bucks saved there could be invested in a slightly faster GPU.

Funny question: is NVIDIA SLI completely dead, and if not, would it work with two 4090s?
 
Joined
Jan 5, 2006
Messages
18,584 (2.69/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5GHz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
So basically, if you've still got PCIe 3.0, you're OK with a 4090...

But as @ir_cow mentioned, you'd probably have a huge CPU bottleneck if you were still on a PCIe 3.0-era platform with a 4090...
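If you want to verify which link your card is actually negotiating, nvidia-smi can report it. A minimal sketch wrapping it in Python (note that the link may drop to Gen 1 while idle, so check under load):

```python
# Query the GPU's current PCIe link generation and width via nvidia-smi.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    name, gen, width = (field.strip() for field in line.split(","))
    print(f"{name}: running at PCIe Gen {gen} x{width}")
```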
 
Last edited:
Joined
Aug 25, 2021
Messages
1,170 (0.99/day)
The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.
Would the performance difference stay the same as shown in games (2-3%) if graphics productivity workloads were tested? We know that most games are not able to saturate an x8 Gen 4 link, but what happens if graphic designers, architects, and engineers use their powerful software on an x8 Gen 4 link with a 4090?
 
Joined
Oct 27, 2020
Messages
791 (0.53/day)
That settles the PCI-Express concerns if you use a Gen 5 SSD.
It should be the same for the other upcoming 4080 models too:


NVIDIA cancels GeForce RTX 4080 12GB

Hi @W1zzard in your upcoming article featuring RTX 4090 performance with 13900K, are you going to use the newer 522.25 drivers (11/10) that improve DX12 performance?

NVIDIA’s new driver enables “substantial” DirectX12 performance improvements for GeForce RTX GPUs

"Our DirectX 12 optimizations apply to GeForce RTX graphics cards and laptops, though improvements will vary based on your specific system setup, and the game settings used. In our testing, performance increases were found in a wide variety of DirectX 12 games, across all resolutions:

  • Assassin’s Creed Valhalla: up to 24% (1080p)
  • Battlefield 2042: up to 7% (1080p)
  • Borderlands 3: up to 8% (1080p)
  • Call of Duty: Vanguard: up to 12% (4K)
  • Control: up to 6% (4K)
  • Cyberpunk 2077: up to 20% (1080p)
  • F1® 22: up to 17% (4K)
  • Far Cry 6: up to 5% (1440p)
  • Forza Horizon 5: up to 8% (1080p)
  • Horizon Zero Dawn: Complete Edition: up to 8% (4K)
  • Red Dead Redemption 2: up to 7% (1080p)
  • Shadow of the Tomb Raider: up to 5% (1080p)
  • Tom Clancy’s The Division 2: up to 5% (1080p)
  • Watch Dogs: Legion: up to 9% (1440p)"
 
Last edited:
Joined
May 21, 2020
Messages
36 (0.02/day)
Overall, it's about the difference I expected.
What surprised me, though, is the difference between individual games and at different resolutions, both within the same game and compared to other games. I didn't expect that big of a variance.
 

Bcannon2000

New Member
Joined
Oct 14, 2022
Messages
8 (0.01/day)
Why are you benchmarking GPUs with a 5800X? Shouldn't you be using the best CPUs for benchmarking GPUs? You have three other options for the best gaming CPUs: Alder Lake, Zen 4, and the 5800X3D.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,815 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Why are you benchmarking GPUs with a 5800X? Shouldn't you be using the best CPUs for benchmarking GPUs? You have three other options for the best gaming CPUs: Alder Lake, Zen 4, and the 5800X3D.
I'm using 13900K

Edit: soon
 
Last edited:
Joined
Jan 20, 2019
Messages
1,550 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I'm using 13900K

You might need to amend the test setup page... it's showing 5800X.

Actually, I would have been happier if it was a 5800X, hehe... looks like I might end up with an AM4 platform if Zen 4/RPL can't be had at a preferred budget (no thanks to NVIDIA for contaminating the budget allocation).
 
Joined
Feb 20, 2019
Messages
8,265 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Hah! PCIe 2.0 still being fine almost 16 years later :)

Sure, if you want that last 5% then you'll need PCIe 4.0 but these articles always prove that the PCIe race really isn't that necessary unless you're already chasing diminishing returns.
 
Joined
Aug 6, 2020
Messages
729 (0.46/day)
Hah! PCIe 2.0 still being fine almost 16 years later :)

Sure, if you want that last 5% then you'll need PCIe 4.0 but these articles always prove that the PCIe race really isn't that necessary unless you're already chasing diminishing returns.


It's hard to make more of an impact when you're already massively CPU-bound at 1080p. That's why, for the first time ever, the gap increases with the resolution bump to 1440p.
 
Last edited:
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Artem S. Tashkinov
And I guess mostly AMD fans have trashed NVIDIA for not supporting PCIe 5.0.

Considering there's on average a 2% difference between PCIe 3.0 and 4.0, there would be a ~0% performance improvement from using PCIe 5.0.

DP 2.0 for RTX 4090 - that's relevant for some. PCIe 5.0? The time has yet to come.
 
Last edited:
Joined
Feb 20, 2019
Messages
8,265 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
It's hard to make more of an impact when you're already massively CPU-bound at 1080p.

Looks like many titles are massively CPU-bound even at 4K too.

I think it's fair to say that the 4090 is so much more GPU than almost any CPU, display, or game can fully use right now that the entire raison d'être for the 4090 isn't here yet.

Maybe 240Hz 4K displays will give it a purpose beyond bragging rights.
 
Joined
Aug 6, 2020
Messages
729 (0.46/day)
Looks like many titles are massively CPU-bound even at 4K too.

I think it's fair to say that the 4090 is so much more GPU than almost any CPU, display, or game can fully use right now that the entire raison d'être for the 4090 isn't here yet.

Maybe 240Hz 4K displays will give it a purpose beyond bragging rights.


Disagree, man. Look at the gap at 1440p versus the 3080:

[relative performance chart]

There's a new, much more GPU-limited gap once you double the res (now 80% versus 88% at PCIe 1.1).

[relative performance chart]

It would be nice if we could fix the CPU limits at 1080p, but at least we can switch to analysis at 1440p.
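For anyone wondering how the relative-performance percentages in those charts are derived: each config's average FPS is simply normalized to the fastest config's. A minimal sketch with made-up numbers (illustrative placeholders, not the review's data):

```python
# Relative performance: each config's FPS normalized to the fastest config.
# The FPS values below are illustrative placeholders only.
fps = {"PCIe 4.0 x16": 150.0, "PCIe 3.0 x16": 147.0, "PCIe 1.1 x16": 120.0}

baseline = max(fps.values())
for config, value in fps.items():
    print(f"{config}: {value / baseline:.0%}")  # e.g. "PCIe 1.1 x16: 80%"
```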
 

Style68

New Member
Joined
Oct 14, 2022
Messages
1 (0.00/day)
If I were to use two Gen 4 NVMe SSDs in RAID 0 in the M.2 slots that are wired to the chipset of a high-end Z790 board like the ASUS ROG Maximus Z790 Extreme, would I get the full PCIe x16 for the graphics card and get speeds matching a single Gen 5 NVMe SSD installed in the Gen 5 slot?
 
Joined
Apr 16, 2013
Messages
549 (0.13/day)
Location
Bulgaria
System Name Black Knight | White Queen
Processor Intel Core i9-10940X (28 cores) | Intel Core i7-5775C (8 cores)
Motherboard ASUS ROG Rampage VI Extreme Encore X299G | ASUS Sabertooth Z97 Mark S (White)
Cooling Noctua NH-D15 chromax.black | Xigmatek Dark Knight SD-1283 Night Hawk (White)
Memory G.SKILL Trident Z RGB 4x8GB DDR4 3600MHz CL16 | Corsair Vengeance LP 4x4GB DDR3L 1600MHz CL9 (White)
Video Card(s) ASUS ROG Strix GeForce RTX 4090 OC | KFA2/Galax GeForce GTX 1080 Ti Hall of Fame Edition
Storage Samsung 990 Pro 2TB, 980 Pro 1TB, 850 Pro 256GB, 840 Pro 256GB, WD 10TB+ (incl. VelociRaptors)
Display(s) Dell Alienware AW2721D 240Hz| LG OLED evo C4 48" 144Hz
Case Corsair 7000D AIRFLOW (Black) | NZXT ??? w/ ASUS DRW-24B1ST
Audio Device(s) ASUS Xonar Essence STX | Realtek ALC1150
Power Supply Enermax Revolution 1250W 85+ | Super Flower Leadex Gold 650W (White)
Mouse Razer Basilisk Ultimate, Razer Naga Trinity | Razer Mamba 16000
Keyboard Razer Blackwidow Chroma V2 (Orange switch) | Razer Ornata Chroma
Software Windows 10 Pro 64bit
So anything above PCIe 3.0 is still pointless, I see.
 
Joined
Oct 9, 2009
Messages
716 (0.13/day)
Location
Finland
System Name RGB-PC v2.0
Processor AMD Ryzen 7950X
Motherboard Asus Crosshair X670E Extreme
Cooling Corsair iCUE H150i RGB PRO XT
Memory 4x16GB DDR5-5200 CL36 G.SKILL Trident Z5 NEO RGB
Video Card(s) Asus Strix RTX 2080 Ti
Storage 2x2TB Samsung 980 PRO
Display(s) Acer Nitro XV273K 27" 4K 120Hz (G-SYNC compatible)
Case Lian Li O11 Dynamic EVO
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Asus Thor II 1000W + Cablemod ModMesh Pro sleeved cables
Mouse Logitech G500s
Keyboard Corsair K70 RGB with low profile red cherrys
Software Windows 11 Pro 64-bit
The relative performance, noise, perf/W, and perf/money graphs are always really awesome at TPU, and this is no different. Thank you @W1zzard

I would really like to see PCIe 5.0 graphics cards, so I'd know for sure that whatever plans I have for the second PCIe x8 slot will definitely not affect graphics card performance. PCIe 5.0 x8 would be plenty for graphics, leaving x8 for other stuff on the platform.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,815 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Hi @W1zzard in your upcoming article featuring RTX 4090 performance with 13900K, are you going to use the newer 522.25 drivers (11/10) that improve DX12 performance?
Of course... the press drivers that I used already include these improvements.

Why not use Vulkan on RDR2?
On cards with small VRAM it will just crash; with DirectX 12 it will run slower but stay stable.

if graphic designers, architects, and engineers use their powerful software
What software is that? The people you listed don't need such a fast GPU as far as I understand, they only need some basic viewport acceleration

If I were to use two Gen 4 NVMe SSDs in RAID 0 in the M.2 slots that are wired to the chipset of a high-end Z790 board like the ASUS ROG Maximus Z790 Extreme, would I get the full PCIe x16 for the graphics card and get speeds matching a single Gen 5 NVMe SSD installed in the Gen 5 slot?
Good question; hard to say without data for Gen 5 drives. "10 GB/s" is just sequential, which is of almost no consequence in real life.
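To put rough numbers on the question, here's a back-of-envelope sketch of the theoretical sequential ceilings involved, assuming Z790's DMI 4.0 x8 uplink between chipset and CPU (link maxima only; real drives and chipset RAID overhead land lower):

```python
# Theoretical sequential ceilings for the Z790 RAID 0 question above.
LANE_GB_S = {4: 16.0 * 128 / 130 / 8, 5: 32.0 * 128 / 130 / 8}  # GB/s per lane

gen5_x4_ssd = LANE_GB_S[5] * 4       # one Gen 5 drive in the CPU-attached slot
gen4_raid0  = LANE_GB_S[4] * 4 * 2   # two Gen 4 x4 drives striped
dmi_uplink  = LANE_GB_S[4] * 8       # Z790 chipset-to-CPU link (DMI 4.0 x8)

print(f"Gen 5 x4 SSD:        {gen5_x4_ssd:.1f} GB/s")
print(f"2x Gen 4 x4 RAID 0:  {gen4_raid0:.1f} GB/s (before RAID overhead)")
print(f"DMI 4.0 x8 ceiling:  {dmi_uplink:.1f} GB/s")
```

All three land at roughly the same ~15.8 GB/s, so even in theory the striped Gen 4 pair would saturate the DMI link, which is shared with everything else behind the chipset.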
 

Bcannon2000

New Member
Joined
Oct 14, 2022
Messages
8 (0.01/day)
I'm using 13900K

Edit: soon
So you didn't want to change the benchmark system right before having to change it again for Raptor Lake? That makes sense.

The issue is that testing with the 5800X is disingenuous, though I wouldn't blame you for not wanting to change setups twice in such quick succession.

I do hope you revisit the PCIe scaling and GPU performance when you get the 13900K. It seems the average 4K numbers for this 4090 review are 20-30% below what others got.
 

Bcannon2000

New Member
Joined
Oct 14, 2022
Messages
8 (0.01/day)
And I guess mostly AMD fans have trashed NVIDIA for not supporting PCIe 5.0.

Considering there's on average a 2% difference between PCIe 3.0 and 4.0 there would be a 0% performance inrovement from using PCIe 5.0.

DP 2.0 for RTX 4090 - that's relevant for some. PCIe 5.0? The time has yet to come.
To me, if I'm paying $1,600 or more for a GPU, it had better have all of the newest bells and whistles. It doesn't have DP 2.0 or PCIe 5.0; even if it (or I) can't use them, I still want them because of how much it costs.
 
Joined
Oct 6, 2021
Messages
1,605 (1.41/day)
@W1zzard Wouldn't the most correct test methodology be pairing the 4090 with a 7700X?
The 5800X cannot extract the full potential of this GPU.
 