
NVIDIA GeForce RTX 4090 PCI-Express Scaling with Core i9-13900K

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,845 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Feb 29, 2016
Messages
641 (0.20/day)
Location
Chile
System Name Shinano
Processor AMD Ryzen 9 5900X
Motherboard ROG Strix B550-F GAMING WIFI II
Cooling Thermalright Peerless Assassin 120SE
Memory 32GB DDR4 3200MHz
Video Card(s) XFX RX 6700 10GB
Storage 970 EVO Plus 1TB | A1000 480GB
Display(s) Lenovo G27q-20 (1440p, 165Hz)
Case NZXT H510
Audio Device(s) Sony WH-1000XM4 | Edifier R1000T4
Power Supply SuperFlower Leadex Gold III 850W
Mouse Logitech G305
Keyboard IK75 v3 (QMK) | HyperX Alloy Origins
Certainly


The 4090 supports running in legacy mode and UEFI. The 5800X rig was booted from MBR, no UEFI
It doesn't need a UEFI-compatible board? Color me impressed. Just saw that someone ran a 2080 Ti in a Core 2 Duo system. Someone should do that with the 4090!
 

W1zzard

It doesn't need a UEFI-compatible board? Color me impressed. Just saw that someone ran a 2080 Ti in a Core 2 Duo system. Someone should do that with the 4090!
That would be a fun test actually, wish my review queue was shorter, maybe for the summer
 
Joined
Feb 1, 2019
Messages
3,602 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
The push for faster PCI-Express was never made for graphics cards, but for the ever-growing bandwidth requirements of enterprise platforms: NICs, storage and such. By making it universal, rather than developing a separate bus for the graphics card like AGP was, they simply unify the whole thing. I think by now such tests can be buried, as there's hardly any difference on such a high-end card with the games that are tested.
Indeed, the only GPUs that have significant issues with older PCIe are the ones manufacturers deliberately gimp down to fewer lanes.

This chart shows that, in terms of throughput, they end up the same.
View attachment 286308
6.0 might be even worse on costs if a FEC chip is needed.
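The equivalences in that chart follow directly from the line rates: each generation doubles the per-lane transfer rate, so the same throughput is reached with half the lanes. A quick sketch using the standard published per-lane rates (the function name is mine; Gen 6 is left out because it moves to PAM4/FLIT encoding with different overhead):

```python
# Approximate usable per-direction PCIe bandwidth by generation.
# Gen 1/2 use 8b/10b encoding (20% overhead); Gen 3-5 use 128b/130b (~1.5%).
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}  # gigatransfers/s

def effective_gbps(gen: int, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s for a given PCIe generation and lane count."""
    raw = GT_PER_LANE[gen] * lanes           # GT/s across all lanes
    eff = 8 / 10 if gen <= 2 else 128 / 130  # line-encoding efficiency
    return raw * eff / 8                     # bits -> bytes

for gen in GT_PER_LANE:
    print(f"PCIe {gen}.0 x16: {effective_gbps(gen):5.1f} GB/s")
```

This is why Gen 3 x16, Gen 4 x8 and Gen 5 x4 land on the same number, and why the scaling results for those configurations line up.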
 

izy

Joined
Jun 30, 2022
Messages
1,015 (1.15/day)
Hah, no.
Those cost savings for the GPU manufacturer typically don't get passed on to us but the price hikes of Gen5 and Gen6 motherboards absolutely do.
What about the RX 6500 XT? I thought they gimped the PCIE lanes for cost savings.
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Certainly


The 4090, and pretty much every other card out there, supports running in legacy mode and UEFI. The 5800X rig was booted from MBR, no UEFI
Ah okay,

I just remember having to go through a spate of BIOS updates to get some older Core2/Sandy/Ivy boards to recognise GPUs back in the early UEFI vbios days. Polaris/10-series IIRC. Probably just early pre-ratified UEFI support from those earlier boards.

What about the RX 6500 XT? I thought they gimped the PCIE lanes for cost savings.
For their cost savings. The 6500XT was a rip-off for consumers however you try to look at it. Hell, it's still a rip-off today - just buy a used GTX 1080 from ebay at half the price, or pick up a new 1660S on clearance for the same money and get vastly more performance and VRAM.
 
Joined
Dec 28, 2012
Messages
3,887 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Ah okay,

I just remember having to go through a spate of BIOS updates to get some older Core2/Sandy/Ivy boards to recognise GPUs back in the early UEFI vbios days. Polaris/10-series IIRC. Probably just early pre-ratified UEFI support from those earlier boards.


For their cost savings. The 6500XT was a rip-off for consumers however you try to look at it. Hell, it's still a rip-off today - just buy a used GTX 1080 from ebay at half the price, or pick up a new 1660S on clearance for the same money and get vastly more performance and VRAM.
The problem there is some of us were and still are waiting for a proper low profile replacement for the rx 560, and a 1080 just won't fit. 3050 would have been great had they released a 75w version.
 

W1zzard

I just remember having to go through a spate of BIOS updates to get some older Core2/Sandy/Ivy boards to recognise GPUs back in the early UEFI vbios days. Polaris/10-series IIRC. Probably just early pre-ratified UEFI support from those earlier boards.
AMD UEFI support has been quite flakey in the past indeed
 
The problem there is some of us were and still are waiting for a proper low profile replacement for the rx 560, and a 1080 just won't fit. 3050 would have been great had they released a 75w version.
The 6500XT still isn't your answer as it's worse than a 560 for video output and has utterly crippled encode/decode hardware.
 
The 6500XT still isn't your answer as it's worse than a 560 for video output and has utterly crippled encode/decode hardware.
It can decode VP9, which the 560 cannot do. It's technically better for desktop usage on that alone; CPU usage when decoding VP9 1440p60 YouTube videos is usually stupidly high.

The 6500 XT crippled (more like removed) encode support, but decode is fine, just missing AV1.
 
Joined
Jan 11, 2009
Messages
9,250 (1.59/day)
Location
Montreal, Canada
System Name Homelabs
Processor Ryzen 5900x | Ryzen 1920X
Motherboard Asus ProArt x570 Creator | AsRock X399 fatal1ty gaming
Cooling Silent Loop 2 280mm | Dark Rock Pro TR4
Memory 128GB (4x32gb) DDR4 3600Mhz | 128GB (8x16GB) DDR4 2933Mhz
Video Card(s) EVGA RTX 3080 | ASUS Strix GTX 970
Storage Optane 900p + NVMe | Optane 900p + 8TB SATA SSDs + 48TB HDDs
Display(s) Alienware AW3423dw QD-OLED | HP Omen 32 1440p
Case be quiet! Dark Base Pro 900 rev 2 | be quiet! Silent Base 800
Power Supply Corsair RM750x + sleeved cables| EVGA P2 750W
Mouse Razer Viper Ultimate (still has buttons on the right side, crucial as I'm a southpaw)
Keyboard Razer Huntsman Elite, Pro Type | Logitech G915 TKL
Oh man, now I'm really curious how my overclocked i7 920 D0 on a Gigabyte X58 UD3R with 24 GB (6x4GB) of Samsung DDR3 UDIMMs would do compared to this 13900K system. That system was my pride and joy, and it was all thanks to TechPowerUp.
 
Joined
Jun 30, 2022
Messages
121 (0.14/day)
It can decode VP9, which the 560 cannot do. It's technically better for desktop usage on that alone; CPU usage when decoding VP9 1440p60 YouTube videos is usually stupidly high.

The 6500 XT crippled (more like removed) encode support, but decode is fine, just missing AV1.
Nvidia should have run a milk carton ad for AMD's 'Missing' (nerfed) encoders and lanes. ;-)

(And as for future cards possibly having a 5.0 x 8 -- or worse -- electrical connection layout... please, no. Not everyone is an "enthusiast" rocking the latest mobos every year. Until AMD and Nvidia start to make their own, they don't have a direct need to push sales/people into 5.0 faster than it's happening organically.)
 
Joined
Mar 14, 2014
Messages
1,398 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
It is consistent, seems some overhead or frame pacing issue. I posted a frametime chart for this recently, check my post history
Hey I'm sorry I can't seem to find it or I'm not looking in the right places. Could you kindly point me to it?
My quick thought about that though was the PSU. Maybe the spikes from the 4090 are asking too much from the 850w unit and it's not getting what it wants but the 4080 is. Have you ever tried using a 2nd PSU just for the GPU? It's worth a shot just to rule it out if anything.
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Seems like all we got from PCI-E 5.0 is more expensive motherboards.

GPUs don't even need 4.0 for gaming. Pointless marketing scheme for a feature that should be reserved for professional applications.
 

W1zzard

Hey I'm sorry I can't seem to find it or I'm not looking in the right places. Could you kindly point me to it?

My quick thought about that though was the PSU. Maybe the spikes from the 4090 are asking too much from the 850w unit and it's not getting what it wants but the 4080 is. Have you ever tried using a 2nd PSU just for the GPU? It's worth a shot just to rule it out if anything.
That's not how it works with PSUs in the first place, and not how it works with physics. If you overload the PSU, the voltage will drop and the card/system will crash, or the PSU will shut off because one of its protections gets triggered... GPU performance does not go down in either scenario; the card doesn't even have a mechanism for "not enough power", which really means "too much voltage drop"... and the Seasonic 850 W ATX 3.0 can take spikes well over 1000 W
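The "too much voltage drop" point is just Ohm's law at work. A toy sketch, where the cable resistance and loads are made-up numbers purely for illustration, not measurements of any real PSU:

```python
# Toy Ohm's-law illustration of voltage droop at the GPU power connector.
# All values are illustrative assumptions, not measured data.
RAIL_V = 12.0    # nominal 12 V rail
CABLE_R = 0.010  # ohms, assumed round-trip cable + connector resistance

def voltage_at_card(load_w: float) -> float:
    """Approximate voltage at the card after IR drop, ignoring PSU regulation."""
    amps = load_w / RAIL_V        # current drawn at nominal voltage
    return RAIL_V - amps * CABLE_R  # rail voltage minus drop across the cable

for load in (450, 600, 1000):
    print(f"{load:4d} W load: {voltage_at_card(load):.2f} V at the card")
```

The heavier the transient, the further the voltage sags; the card either rides it out or an undervoltage/overcurrent protection trips. There is no "run slower on less power" fallback, which matches the description above.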
 
Joined
Mar 31, 2014
Messages
1,533 (0.39/day)
Location
Grunn
System Name Indis the Fair (cursed edition)
Processor 11900k 5.1/4.9 undervolted.
Motherboard MSI Z590 Unify-X
Cooling Heatkiller VI Pro, VPP755 V.3, XSPC TX360 slim radiator, 3xA12x25, 4x Arctic P14 case fans
Memory G.Skill Ripjaws V 2x16GB 4000 16-19-19 (b-die@3600 14-14-14 1.45v)
Video Card(s) EVGA 2080 Super Hybrid (T30-120 fan)
Storage 970EVO 1TB, 660p 1TB, WD Blue 3D 1TB, Sandisk Ultra 3D 2TB
Display(s) BenQ XL2546K, Dell P2417H
Case FD Define 7
Audio Device(s) DT770 Pro, Topping A50, Focusrite Scarlett 2i2, Røde VXLR+, Modmic 5
Power Supply Seasonic 860w Platinum
Mouse Razer Viper Mini, Odin Infinity mousepad
Keyboard GMMK Fullsize v2 (Boba U4Ts)
Software Win10 x64/Win7 x64/Ubuntu
Seems like all we got from PCI-E 5.0 is more expensive motherboards.

GPUs don't even need 4.0 for gaming. Pointless marketing scheme for a feature that should be reserved for professional applications.
Yup, I've been saying this for a while w.r.t. Z690 boards, and now again with AM5: I don't think anyone should give half a damn about whether their board supports PCIe 5.0... It's a complete gimmick for SSDs and useless for GPUs. Even with an expected long upgrade cycle on an AM5 board, I don't think it's worth it unless it's the same price.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I see the children are here to whine as usual about "WhY DidN'T yOU RuN thIS on amD CPu" SHUT UP AND SIT DOWN. You don't bother to understand or care how much time and effort Wizz puts into running these benchmarks and providing the results FOR FREE, if you want AMD CPU benchmarks then run them yourself.

What about the RX 6500 XT? I thought they gimped the PCIE lanes for cost savings.
AMD didn't gimp it; they took a GPU that was designed to be used as a dGPU in laptops, connected to the CPU over 4 dedicated lanes of PCIe, and put it on a PCIe card, so they had something below the 6600 to compete with Arc and the 1650/3050. But it turns out that a low- to mid-range GPU with a lower amount of VRAM needs to transfer a lot more data over the PCIe bus, and a PCIe x4 link absolutely doesn't cut it in that scenario. On top of that, the 6500 XT GPU is also missing many features (because it was expected that the CPU it was paired with would have them), which makes it even more of a disappointment.

The 6500 XT's "predecessor", the 5500 XT, was designed for desktop with a PCIe x8 link, and worked pretty well as a result. I still don't know why AMD didn't do a rebrand of the 5500 XT for the 6500 XT instead of trying to fit a square peg into a round hole; it's not like AMD or NVIDIA are strangers to rebranding old GPUs as new when necessary.
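To put rough numbers on why an x4 link hurts a card that has to spill data over the bus, here's a sketch using the theoretical link rates with 128b/130b encoding; the 1 GB transfer size is an arbitrary example, not a measured workload:

```python
# Rough time to move 1 GB of spilled texture data over different PCIe links.
PER_LANE_GT = {3: 8.0, 4: 16.0}  # GT/s per lane for PCIe Gen 3 / Gen 4

def ms_per_gb(gen: int, lanes: int) -> float:
    """Milliseconds to transfer 1 GB at the theoretical usable link rate."""
    usable_gbps = PER_LANE_GT[gen] * lanes * (128 / 130) / 8  # GB/s after encoding
    return 1000.0 / usable_gbps

for gen, lanes in ((4, 4), (3, 4), (3, 16)):
    print(f"Gen{gen} x{lanes}: {ms_per_gb(gen, lanes):5.1f} ms per GB")
```

On a Gen 4 board the 6500 XT's x4 link is merely narrow; drop it into the Gen 3 slot of the budget systems it was aimed at and every spill over the bus takes twice as long again.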
 
Joined
Aug 4, 2020
Messages
1,614 (1.02/day)
Location
::1
I mean, this review had been well in the works before the 7950X3D even released, and prior to that, the definitive no-punches-pulled gaming CPU was the 13900K. So yeah.
 
Joined
Mar 6, 2017
Messages
3,332 (1.18/day)
Location
North East Ohio, USA
System Name My Ryzen 7 7700X Super Computer
Processor AMD Ryzen 7 7700X
Motherboard Gigabyte B650 Aorus Elite AX
Cooling DeepCool AK620 with Arctic Silver 5
Memory 2x16GB G.Skill Trident Z5 NEO DDR5 EXPO (CL30)
Video Card(s) XFX AMD Radeon RX 7900 GRE
Storage Samsung 980 EVO 1 TB NVMe SSD (System Drive), Samsung 970 EVO 500 GB NVMe SSD (Game Drive)
Display(s) Acer Nitro XV272U (DisplayPort) and Acer Nitro XV270U (DisplayPort)
Case Lian Li LANCOOL II MESH C
Audio Device(s) On-Board Sound / Sony WH-XB910N Bluetooth Headphones
Power Supply MSI A850GF
Mouse Logitech M705
Keyboard Steelseries
Software Windows 11 Pro 64-bit
Benchmark Scores https://valid.x86.fr/liwjs3
I see the children are here to whine as usual about "WhY DidN'T yOU RuN thIS on amD CPu" SHUT UP AND SIT DOWN. You don't bother to understand or care how much time and effort Wizz puts into running these benchmarks and providing the results FOR FREE, if you want AMD CPU benchmarks then run them yourself.
Excuse me? It was a simple question. I'm not, as you say, whining about it. I just want to know why AMD was excluded from what many might refer to as one of the most important series of benchmarks to be featured on the Internet.

While it might've been true in the past, Intel is no longer the top dog in the industry. They have competition yet it seems nearly every publication and YouTube influencer uses Intel chips as their base in many of their benchmark rigs. Why? I'm not just calling out Wizzard here, I'm calling out... everyone in the benchmark space. Why always Intel?
 

Neodoris

New Member
Joined
Mar 5, 2023
Messages
1 (0.00/day)
Thanks for this, always useful.
Amusing to see that PCIe 2.0 x16 is still just about fine. You probably cannot pair a 4090 with anything that old - would the board even recognise the card?
I have an RTX 3090 with an i7 2600k and 16 GB of RAM. It works like a charm for video editing (my wife has a YouTube channel) and 4K gaming (well, that's for me).

If you ask me why this choice: I wanted a new system with an RTX 3080, but they were stuck at 1200 euros for months in Europe, so I decided to buy a 3090 at MSRP and give up the purchase of a new CPU.

The i7 2600k can play gen7 games at 120 fps, and most last-gen games at 60 fps, so it's okay for me at the moment.

My last purchase was Kena: Bridge of Spirits, not a very demanding game CPU-wise, but it was 60-fps locked. With ultra graphics it looks insane, so no regrets at the moment.
 
Joined
Jul 16, 2013
Messages
205 (0.05/day)
System Name latest-greatest
Processor i7 12700K
Motherboard Z690 Rog Strix-E
Cooling Lian Li Galahad 360
Memory corsair vengeance Ddr5 4800
Video Card(s) 2080ti
Storage 980 pro gen4
Display(s) LG C1 4K 120Mhz
Case fractal meshify2
Audio Device(s) Realtec 4080
Power Supply Corsair rm1000x
Thank you for the article. I have a Z690 board and am installing a new heatsink on my Gen 4 Samsung SSD, and at the same time was considering moving it to the Gen 4 M.2 slot and leaving the Gen 5 slot just for my 4090. My PC is used for gaming and I don't see any reason to move the SSD except for better airflow, which I may do anyway just for grins.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Occulus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317


That's not how it works with PSUs in the first place, and not how it works with physics. If you overload the PSU the voltage will drop and the card/system will crash, or the PSU will shut off, because one of its protections gets triggered... GPU performance does not go down in either scenario, the card doesn't even have a mechanism for "not enough power", which really means "too much voltage drop" .. and the Seasonic 850 W ATX 3.0 can take spikes well over 1000 W
This is true. This Corsair RM850x (White, 2021) that I'm running is rated for 850 W, but it can obviously take spikes well above 1000 W. I have run an RTX 3090 (for work) AND an RX 6900 XT (for gaming) in two separate PCI-E slots at the same time (x8/x8) for semi-daily use. The only time it triggered the OCP (over-current protection) was if I [accidentally] put near-full load on both cards at the same time, which means the system hard shuts down.
 

DegeneRagingX

New Member
Joined
Mar 5, 2023
Messages
1 (0.00/day)
Am I understanding you correctly? I'm on z790 with a 13900k and a 4090. I currently have a Gen 4 m.2 in the slot closest to my CPU. My GPU is still running at x16. Are you saying it has to be a gen 5 m.2?
 