
NVIDIA GeForce RTX 4090 PCI-Express Scaling with Core i9-13900K

Certainly


The 4090 supports running in legacy mode and UEFI. The 5800X rig was booted from MBR, no UEFI
It doesn't need a UEFI-compatible board? Color me impressed. Just saw someone run a 2080 Ti in a Core 2 Duo system. Someone should do that with the 4090!
 
It doesn't need a UEFI-compatible board? Color me impressed. Just saw someone run a 2080 Ti in a Core 2 Duo system. Someone should do that with the 4090!
That would be a fun test, actually. Wish my review queue was shorter, maybe something for the summer
 
The push for faster PCI-Express was never driven by graphics cards, but by the ever-growing bandwidth requirements of enterprise platforms for NICs, storage and such. By keeping the interface universal (unlike AGP, which was graphics-only), they don't have to develop a separate interconnect for the graphics card but can simply unify the whole thing. I think by now such tests can be buried, as there's hardly any difference on such a high-end card in the games that are tested.
Indeed, the only GPUs that have significant issues with older PCIe are the ones manufacturers deliberately gimp with fewer lanes.

This chart shows how the generations compare in terms of throughput.
View attachment 286308
6.0 might be even worse on cost if a FEC chip is needed.
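As a rough cross-check of the throughput side of this discussion, here is a small Python sketch (my own simplification, not taken from the attached chart) that estimates per-direction bandwidth for an x16 link across PCIe generations; the encoding overheads are approximated and real-world figures will differ a bit:

```python
# Rough, simplified PCIe per-direction bandwidth estimates for an x16 slot.
# Transfer rates are per lane in GT/s; encoding overhead differs by generation
# (8b/10b for 1.x/2.x, 128b/130b for 3.0-5.0; 6.0 uses PAM4 + FLIT/FEC,
# approximated here as ~242/256 efficiency). Ballpark figures only.

GENERATIONS = {
    # gen: (transfer rate per lane in GT/s, encoding efficiency)
    "1.0": (2.5,  8 / 10),
    "2.0": (5.0,  8 / 10),
    "3.0": (8.0,  128 / 130),
    "4.0": (16.0, 128 / 130),
    "5.0": (32.0, 128 / 130),
    "6.0": (64.0, 242 / 256),  # PAM4 signalling; FLIT mode with FEC overhead
}

def bandwidth_gbps(gen: str, lanes: int = 16) -> float:
    """Approximate usable bandwidth in GB/s for one direction."""
    rate, efficiency = GENERATIONS[gen]
    return rate * efficiency * lanes / 8  # GT/s -> GB/s per lane, times lane count

for gen in GENERATIONS:
    print(f"PCIe {gen} x16: ~{bandwidth_gbps(gen):5.1f} GB/s per direction")
```

Run as-is it prints roughly 4 / 8 / 16 / 31.5 / 63 / 121 GB/s for Gen 1.0 through 6.0, which lines up with the doubling-per-generation pattern.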
 
Hah, no.
Those cost savings for the GPU manufacturer typically don't get passed on to us but the price hikes of Gen5 and Gen6 motherboards absolutely do.
What about the RX 6500 XT? I thought they gimped the PCIe lanes for cost savings.
 
Certainly


The 4090 and pretty much every other card out there support running in legacy mode and UEFI. The 5800X rig was booted from MBR, no UEFI
Ah okay,

I just remember having to go through a spate of BIOS updates to get some older Core2/Sandy/Ivy boards to recognise GPUs back in the early UEFI vbios days. Polaris/10-series IIRC. Probably just early pre-ratified UEFI support from those earlier boards.

What about the RX 6500 XT? I thought they gimped the PCIe lanes for cost savings.
For their cost savings. The 6500 XT was a rip-off for consumers however you try to look at it. Hell, it's still a rip-off today - just buy a used GTX 1080 from eBay at half the price, or pick up a new 1660S on clearance for the same money and get vastly more performance and VRAM.
 
Ah okay,

I just remember having to go through a spate of BIOS updates to get some older Core2/Sandy/Ivy boards to recognise GPUs back in the early UEFI vbios days. Polaris/10-series IIRC. Probably just early pre-ratified UEFI support from those earlier boards.


For their cost savings. The 6500 XT was a rip-off for consumers however you try to look at it. Hell, it's still a rip-off today - just buy a used GTX 1080 from eBay at half the price, or pick up a new 1660S on clearance for the same money and get vastly more performance and VRAM.
The problem there is that some of us were, and still are, waiting for a proper low-profile replacement for the RX 560, and a 1080 just won't fit. A 3050 would have been great had they released a 75 W version.
 
I just remember having to go through a spate of BIOS updates to get some older Core2/Sandy/Ivy boards to recognise GPUs back in the early UEFI vbios days. Polaris/10-series IIRC. Probably just early pre-ratified UEFI support from those earlier boards.
AMD UEFI support has been quite flaky in the past, indeed
 
The problem there is that some of us were, and still are, waiting for a proper low-profile replacement for the RX 560, and a 1080 just won't fit. A 3050 would have been great had they released a 75 W version.
The 6500XT still isn't your answer as it's worse than a 560 for video output and has utterly crippled encode/decode hardware.
 
The 6500XT still isn't your answer as it's worse than a 560 for video output and has utterly crippled encode/decode hardware.
It can decode VP9, which the 560 cannot do. It's technically better for desktop usage on that alone; CPU usage when software-decoding 1440p60 VP9 YouTube video is usually stupidly high.

The 6500 XT's encode support is crippled (more like removed), but decode is fine, just missing AV1.
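If anyone wants to check what their own card actually decodes in hardware, here is a minimal sketch; it assumes a Linux system with the vainfo utility (from libva-utils) and a working VA-API driver, and simply greps the advertised decode (VLD) profiles. On Windows, a tool like DXVA Checker shows the same information.

```python
# Minimal sketch: list VA-API decode profiles and flag VP9/AV1/HEVC/H264 support.
# Assumes Linux with the vainfo utility installed and a working VA-API driver.
import subprocess

def vaapi_profiles() -> str:
    """Return vainfo's output listing supported profiles/entrypoints."""
    return subprocess.run(["vainfo"], capture_output=True, text=True).stdout

def supports_decode(profiles: str, codec: str) -> bool:
    # Decode support shows up as a VLD entrypoint on the codec's profile lines.
    return any(codec.lower() in line.lower() and "VLD" in line
               for line in profiles.splitlines())

if __name__ == "__main__":
    out = vaapi_profiles()
    for codec in ("VP9", "AV1", "HEVC", "H264"):
        print(f"{codec} hardware decode: {'yes' if supports_decode(out, codec) else 'no'}")
```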
 
Oh man, now I'm really curious how my overclocked i7 920 D0 system, with a Gigabyte X58 UD3R and 6x4 GB (24 GB) of DDR3 Samsung UDIMMs, would do compared to this 13900K system. That rig was my pride and joy, and it was all thanks to TechPowerUp
 
It can decode VP9, which the 560 cannot do. It's technically better for desktop usage on that alone; CPU usage when software-decoding 1440p60 VP9 YouTube video is usually stupidly high.

The 6500 XT's encode support is crippled (more like removed), but decode is fine, just missing AV1.
Nvidia should have run a milk carton ad for AMD's 'Missing' (nerfed) encoders and lanes. ;-)

(And as for future cards possibly having a 5.0 x8, or worse, electrical connection layout... please, no. Not everyone is an "enthusiast" rocking the latest mobos every year. Until AMD and Nvidia start making motherboards of their own, they don't have a direct need to push sales/people into 5.0 faster than it's happening organically.)
 
It is consistent; seems like some overhead or frame pacing issue. I posted a frametime chart for this recently, check my post history
Hey, I'm sorry, I can't seem to find it, or I'm not looking in the right places. Could you kindly point me to it?
My quick thought about that, though, was the PSU. Maybe the spikes from the 4090 are asking too much from the 850 W unit and it's not getting what it wants, but the 4080 is. Have you ever tried using a second PSU just for the GPU? It's worth a shot just to rule it out, if anything.
 
Seems like all we got from PCI-E 5.0 is more expensive motherboards.

GPUs don't even need 4.0 for gaming. Pointless marketing scheme for a feature that should be reserved for professional applications.
 
Hey, I'm sorry, I can't seem to find it, or I'm not looking in the right places. Could you kindly point me to it?

My quick thought about that, though, was the PSU. Maybe the spikes from the 4090 are asking too much from the 850 W unit and it's not getting what it wants, but the 4080 is. Have you ever tried using a second PSU just for the GPU? It's worth a shot just to rule it out, if anything.
That's not how it works with PSUs in the first place, and not how physics works. If you overload the PSU, the voltage will drop and the card/system will crash, or the PSU will shut off because one of its protections gets triggered... GPU performance does not go down in either scenario; the card doesn't even have a mechanism for "not enough power", which really means "too much voltage drop". And the Seasonic 850 W ATX 3.0 can take spikes well over 1000 W
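To put some back-of-the-envelope numbers on the spike discussion (the transient headroom factor below is purely an illustrative assumption, not a published spec for any particular unit):

```python
# Back-of-the-envelope 12 V rail math for transient spikes.
# The headroom factor is an illustrative assumption, not a measured PSU spec.

PSU_RATED_W = 850          # continuous rating of the unit discussed above
RAIL_V = 12.0              # modern GPUs draw almost everything from the 12 V rail
SPIKE_W = 1000             # example transient mentioned in the thread
HEADROOM = 1.35            # assumed transient tolerance before OPP/OCP trips

steady_current = PSU_RATED_W / RAIL_V
spike_current = SPIKE_W / RAIL_V
trip_threshold = PSU_RATED_W * HEADROOM

print(f"Continuous 12 V current at {PSU_RATED_W} W: ~{steady_current:.0f} A")
print(f"Current during a {SPIKE_W} W spike:        ~{spike_current:.0f} A")
print(f"Assumed protection trip point:             ~{trip_threshold:.0f} W")
# If a spike exceeds the trip point, the PSU shuts off or the system crashes;
# the GPU does not throttle down to a lower performance level in response.
```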
 
Seems like all we got from PCI-E 5.0 is more expensive motherboards.

GPUs don't even need 4.0 for gaming. Pointless marketing scheme for a feature that should be reserved for professional applications.
Yup, I've been saying this for a while w.r.t. Z690 boards, and now again with AM5: I don't think anyone should give half a damn about whether their board supports PCIe 5.0... It's a complete gimmick for SSDs and useless for GPUs. Even with an expected long upgrade cycle on an AM5 board, I don't think it's worth it unless it's the same price.
 
I see the children are here to whine as usual about "WhY DidN'T yOU RuN thIS on amD CPu". SHUT UP AND SIT DOWN. You don't bother to understand or care how much time and effort Wizz puts into running these benchmarks and providing the results FOR FREE. If you want AMD CPU benchmarks, then run them yourself.

What about the RX 6500 XT? I thought they gimped the PCIe lanes for cost savings.
AMD didn't gimp it; they took a GPU that was designed to be used as a dGPU in laptops, connected to the CPU over 4 dedicated lanes of PCIe, and put it on a PCIe card, so they had something below the 6600 to compete with Arc and the 1650/3050. But it turns out that a low- to mid-range GPU with less VRAM needs to transfer a lot more data over the PCIe bus, and a PCIe x4 link absolutely doesn't cut it in that scenario (rough numbers below). On top of that, the 6500 XT GPU is also missing many features (because it was expected that the CPU it was paired with would provide them), which makes it even more of a disappointment.

The 6500 XT's "predecessor", the 5500 XT, was designed for desktop with a PCIe x8 link, and worked pretty well as a result. I still don't know why AMD didn't rebrand the 5500 XT as the 6500 XT instead of trying to fit a square peg into a round hole - it's not like AMD or NVIDIA are strangers to rebranding old GPUs as new ones when necessary.
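For the rough numbers referenced above, here is what the 6500 XT's four lanes work out to on a Gen4 versus a Gen3 slot, next to an x8 and an x16 card, using the same simplified 128b/130b math as the earlier sketch:

```python
# Rough per-direction bandwidth for the link widths discussed above.
# 128b/130b encoding assumed for Gen3/Gen4; ballpark figures only.
def pcie_gbps(gts_per_lane: float, lanes: int) -> float:
    return gts_per_lane * (128 / 130) * lanes / 8

configs = [
    ("6500 XT, x4 on a Gen4 slot", 16.0, 4),
    ("6500 XT, x4 on a Gen3 slot",  8.0, 4),
    ("5500 XT, x8 on a Gen3 slot",  8.0, 8),
    ("typical x16 card on Gen3",    8.0, 16),
]
for name, rate, lanes in configs:
    print(f"{name}: ~{pcie_gbps(rate, lanes):.1f} GB/s")
```

On a Gen3 board the x4 card ends up with roughly half the bandwidth of the old x8 5500 XT, which is why it suffers most on exactly the older systems it was pitched at.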
 
I mean, this review had been well in the works before the 7950X3D even released.
And prior to that, the definitive no-punches-pulled gaming CPU was the 13900K.
So yeah.
 
I see the children are here to whine as usual about "WhY DidN'T yOU RuN thIS on amD CPu". SHUT UP AND SIT DOWN. You don't bother to understand or care how much time and effort Wizz puts into running these benchmarks and providing the results FOR FREE. If you want AMD CPU benchmarks, then run them yourself.
Excuse me? It was a simple question. I'm not, as you say, whining about it. I just want to know why AMD was excluded from what many might refer to as one of the most important series of benchmarks to be featured on the Internet.

While it might've been true in the past, Intel is no longer the top dog in the industry. They have competition, yet it seems nearly every publication and YouTube influencer uses Intel chips as the base of their benchmark rigs. Why? I'm not just calling out Wizzard here, I'm calling out... everyone in the benchmark space. Why always Intel?
 
Thanks for this, always useful.
Amusing to see that PCIe 2.0 x16 is still just about fine. You probably cannot pair a 4090 with anything that old - would the board even recognise the card?
I have an RTX 3090 with an i7 2600k and 16 GB of RAM. It works like a charm for video editing (my wife has a YouTube channel) and 4K gaming (well, that's for me).

If you ask me why this choice: I wanted a new system with an RTX 3080, but they were stuck at 1,200 euros for months in Europe, so I decided to buy a 3090 at MSRP and give up on the purchase of a new CPU.

The i7 2600k can run gen-7 games at 120 fps and most last-gen games at 60 fps, so it's okay for me at the moment.

My last purchase was Kena: Bridge of Spirits, not a very demanding game CPU-wise, but it was locked to 60 fps. With ultra graphics it looks insane, so no regrets at the moment.
 
Thank you for the article. I have a Z690 board and am installing a new heatsink on my Gen 4 Samsung SSD; at the same time I was considering moving it to the Gen 4 M.2 slot and leaving the Gen 5 slot just for my 4090. My PC is used for gaming and I don't see any reason to move the SSD other than for better airflow, which I may do anyway just for grins.
 


That's not how it works with PSUs in the first place, and not how physics works. If you overload the PSU, the voltage will drop and the card/system will crash, or the PSU will shut off because one of its protections gets triggered... GPU performance does not go down in either scenario; the card doesn't even have a mechanism for "not enough power", which really means "too much voltage drop". And the Seasonic 850 W ATX 3.0 can take spikes well over 1000 W
This is true. The Corsair RM850x (White, 2021) that I'm running is rated for 850 W, but it can obviously take spikes well above 1000 W. I have run an RTX 3090 (for work) AND an RX 6900 XT (for gaming) in two separate PCI-E slots at the same time (x8/x8) for semi-daily use. The only time it triggered OCP (over-current protection) was if I [accidentally] put near-full load on both cards at the same time, which means the system hard shuts down.
 
Am I understanding you correctly? I'm on Z790 with a 13900K and a 4090. I currently have a Gen 4 M.2 drive in the slot closest to my CPU. My GPU is still running at x16. Are you saying it has to be a Gen 5 M.2?
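If you want to verify what your card is actually negotiating, here is a quick sketch; it assumes nvidia-smi (bundled with the NVIDIA driver) is on your PATH, and note the link can downshift at idle, so check it under load. GPU-Z's Bus Interface readout shows the same thing.

```python
# Query the GPU's current and maximum PCIe link generation and width.
# Assumes nvidia-smi (bundled with the NVIDIA driver) is available on PATH.
# Note: the link often downshifts at idle for power saving, so check under load.
import subprocess

FIELDS = "pcie.link.gen.current,pcie.link.width.current,pcie.link.gen.max,pcie.link.width.max"

result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
line = result.stdout.strip().splitlines()[0]  # first GPU only
gen_cur, width_cur, gen_max, width_max = [v.strip() for v in line.split(",")]
print(f"Running at PCIe Gen {gen_cur} x{width_cur} "
      f"(card supports up to Gen {gen_max} x{width_max})")
```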
 