
Question / concern: PCIe 4.0 vs PCIe 3.0

If PCIe 4.0 x16 is nowhere near being saturated - at least on single-monitor setups - and they have had PCIe 5.0 x16 since late last year (Alder Lake), at what resolution on a single monitor will that become a bottleneck?
 
If PCIe 4.0 x16 is nowhere near being saturated - at least on single-monitor setups - and they have had PCIe 5.0 x16 since late last year (Alder Lake), at what resolution on a single monitor will that become a bottleneck?

The slot throughput of 32 GB/s (via PCIe 4.0 x16) is the link between the CPU and the graphics card - it is the speed of communication between those parts of your PC. It does get saturated, but increasing that speed doesn't scale framerate performance linearly, hence the "myth" that it is not saturated.
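For anyone curious where that ~32 GB/s figure comes from, here is a minimal Python sketch of the usual back-of-the-envelope math - the per-lane transfer rates and encoding overheads are the commonly published PCIe numbers, and the function name is only for illustration:

```python
# Approximate one-direction PCIe link throughput:
# transfer rate (GT/s) x encoding efficiency x lane count, converted to bytes.
PCIE_GEN = {
    # generation: (GT/s per lane, encoding efficiency)
    2.0: (5.0, 8 / 10),     # 8b/10b encoding
    3.0: (8.0, 128 / 130),  # 128b/130b encoding
    4.0: (16.0, 128 / 130),
    5.0: (32.0, 128 / 130),
}

def link_bandwidth_gb_s(gen: float, lanes: int) -> float:
    """Rough usable bandwidth in GB/s for one direction of the link."""
    rate, efficiency = PCIE_GEN[gen]
    return rate * efficiency * lanes / 8  # 1 GT/s ~ 1 Gbit/s per lane; /8 -> GB/s

print(f"PCIe 3.0 x16: {link_bandwidth_gb_s(3.0, 16):.1f} GB/s")  # ~15.8
print(f"PCIe 4.0 x16: {link_bandwidth_gb_s(4.0, 16):.1f} GB/s")  # ~31.5, rounded to 32
print(f"PCIe 5.0 x16: {link_bandwidth_gb_s(5.0, 16):.1f} GB/s")  # ~63.0
```

Doubling either the generation or the lane count doubles the result, which is why 4.0 x16 lands at roughly twice 3.0 x16.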
 
If PCIe 4.0 x16 is nowhere near being saturated - at least on single-monitor setups - and they have had PCIe 5.0 x16 since late last year (Alder Lake), at what resolution on a single monitor will that become a bottleneck?
I heard Nvidia is staying on PCIe 4.0 with the upcoming generation, which tells you all about how unneeded PCIe 5.0 is for now. No, the 3000 series couldn't saturate it; it's only when you begin to downsize the link that it becomes a bottleneck, like x8 or x4, and then you even halve the speed again by just running 3.0. The 6500 XT's link is so narrow that it really needs PCIe 4.0, otherwise it's bottlenecked by the link. Had it been x8 instead, PCIe 3.0 would've sufficed as well.
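To put rough numbers on the x4 point above (same published per-lane rates as before; these are approximations, not benchmark results):

```python
# Approximate one-direction bandwidth of narrow links, in GB/s.
# 128b/130b encoding applies to both PCIe 3.0 (8 GT/s) and 4.0 (16 GT/s).
EFF = 128 / 130

pcie_3_x4 = 8.0 * EFF * 4 / 8    # ~3.9 GB/s - an x4 card on an older 3.0 board
pcie_4_x4 = 16.0 * EFF * 4 / 8   # ~7.9 GB/s - the same x4 card as intended on 4.0
pcie_3_x8 = 8.0 * EFF * 8 / 8    # ~7.9 GB/s - what an x8 card would get on 3.0

print(f"3.0 x4: {pcie_3_x4:.1f} GB/s")
print(f"4.0 x4: {pcie_4_x4:.1f} GB/s")
print(f"3.0 x8: {pcie_3_x8:.1f} GB/s")
```

So a 4.0 x4 link carries about the same data as 3.0 x8, which is why an x8 version of the card would have been fine on a 3.0 board.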
 
Workstation cards are for workstation purposes... yes. Water makes things wet. Having a team test a card that's used for compute and production on games and getting a "no" should've been obvious.

Don't use workstation for games, plain and simple.


Yes you can use workstation cards for games but you are on your own for support.

I heard Nvidia is staying on PCIe 4.0 with the upcoming generation, which tells you all about how unneeded PCIe 5.0 is for now. No, the 3000 series couldn't saturate it; it's only when you begin to downsize the link that it becomes a bottleneck, like x8 or x4, and then you even halve the speed again by just running 3.0. The 6500 XT's link is so narrow that it really needs PCIe 4.0, otherwise it's bottlenecked by the link. Had it been x8 instead, PCIe 3.0 would've sufficed as well.
PCIe 4 only became relevant in 2019; we aren't near the 3-year mark.
 
Yes you can use workstation cards for games but you are on your own for support.


PCIe 4 only became relevant in 2019; we aren't near the 3-year mark.
It's not even that relevant now. I'm still fully on PCIe 3.0, I don't need it. Maybe DirectStorage or faster GPUs will change this.
 
:wtf:
how long has it been since you looked at a calendar?
:roll:
I think he also meant "released" - it was far from relevant 3 years ago; it's slowly starting to become relevant now.
 
:wtf:
how long has it been since you looked at a calendar?
:roll:
What @eazen wrote.

I'm typically so busy with work, helping friends, working on my vehicle, or being on here that I don't look at a calendar often. I do know next Monday is Independence Day for the U.S.A.

And to clarify: the original PCIe 4.0 spec was released in 2017, but it was almost two years before we saw it implemented in the consumer market with the AMD X570 chipset and the RX 5700 GPU. As of sometime this July, PCIe 4.0 will have been on the consumer market for three years.
 
What @eazen wrote.

I'm typically so busy with work, helping friends, working on my vehicle, or being on here that I don't look at a calendar often. I do know next Monday is Independence Day for the U.S.A.

And to clarify: the original PCIe 4.0 spec was released in 2017, but it was almost two years before we saw it implemented in the consumer market with the AMD X570 chipset and the RX 5700 GPU. As of sometime this July, PCIe 4.0 will have been on the consumer market for three years.
dude, it was a comment ya know since 2019+3=2022. :D

but nevermind - didn't mean to rustle anyone's jimmies :shadedshu:
 
dude, it was a comment ya know since 2019+3=2022. :D

but nevermind - didn't mean to rustle anyone's jimmies :shadedshu:
Well that's the problem with written info, it can be misconstrued lol
 
I am finding it a little hard to believe that the Radeon Pro W5500 needs a greater system interface than PCIe 3.0 x8. Compared to the RX 6500 XT, it has lower compute, pixel fill speed, and texture mapping speed. The only advantage it has is memory bandwidth.

This should logically translate to a similar or lower bus interface bandwidth requirement for the PRO card. As PCIe 3.0 x8 has identical bandwidth to the PCIe 4.0 x4 of the 6500 XT, I find it hard to believe that you are bandwidth limited.
 
I am finding it a little hard to believe that the Radeon Pro W5500 needs a greater system interface than PCIe 3.0 x8. Compared to the RX 6500 XT, it has lower compute, pixel fill speed, and texture mapping speed. The only advantage it has is memory bandwidth.

This should logically translate to a similar or lower bus interface bandwidth requirement for the PRO card. As PCIe 3.0 x8 has identical bandwidth to the PCIe 4.0 x4 of the 6500 XT, I find it hard to believe that you are bandwidth limited.
Not "need" per se, it's only a bit faster with 4.0, about 5%, but in memory constrained situations much more, because then it uses the PCIE to reload textures fast and 3.0 bottlenecks it then.
 
Workstation cards are for workstation purposes... yes. Water makes things wet. Having a team test a card that's used for compute and production on games and getting a "no" should've been obvious.

Don't use workstation for games, plain and simple.
Bruh, it's usually a rebranded Radeon with more vRAM sold as Radeon Pro. It's basically the same thing.
 
"t’s not as simple as that. You’re missing a variable. At least one. The software.

The PCIe lanes are where data and instructions are sent from the CPU to the graphics card. And that is governed by the program running. What it needs to send to the GPU, how much and how often.

If the program is designed to send more than PCIe 2.0 x16’s 8 GB/s limit, it would flood those lanes - no matter what graphics card is at the other end.

Usually, though, programs are designed to not reach the limits of “current” technology. Thus, a game made in 2020 (say Cyberpunk 2077) wants to send (say) 100 MB each frame, while one from 2010 (say Skyrim) wants to send only 10 MB for each frame.

Then, what is the card meant to do for each of those? How long does it take to complete the instructions that program “told” it to perform? Which is what governs the maximum frame rate.

Add to this, some programs (games) throttle this frame rate. Which is why I used Skyrim as an example, since it limits the frame rate to 60 FPS, no matter if the graphics card is fast enough to calculate each frame quicker.

If those arbitrary data sizes per frame (chosen just as samples) above are correct, you can actually do some calculations. E.g. a PCIe 2.0 x16 connection can handle a maximum of 8 GB/s. Meaning it could run CyberPunk’s 100 MB/frame at a maximum of 8000/100 = 80 FPS, no matter if the card could calculate each frame quicker. Thus causing a PCIe bottleneck. But on that very same card, running Skyrim’s 10 MB/frame … 8000/10 = 800 FPS, and it gets limited to 60 FPS, meaning it would NEVER exceed 10x60 = 600 MB/s = 0.6 GB/s.

I.e. even a RTX 3090, running Skyrim, would still not even come close to flooding a PCIe 2.0 x16 connection. But it might flood that if running Cyberpunk. Then again, an old GTX 980, might not be able to reach the frame rate needed to flood the max 8 GB/s of PCIE 2.0 x16, even when running Cyberpunk."
What were the first GPUs to saturate a PCIe x16 v2.0 slot? - Quora
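Reproducing the arithmetic from that quoted post as a small Python sketch - the 100 MB and 10 MB per-frame figures are the quote's illustrative guesses, not measured values:

```python
# Back-of-the-envelope math from the quoted post:
# PCIe-limited FPS = link bandwidth / data sent per frame.
PCIE_2_X16_MBPS = 8000  # MB/s, PCIe 2.0 x16 (~8 GB/s)

def pcie_limited_fps(link_mb_per_s: float, mb_per_frame: float) -> float:
    """Upper bound on frame rate imposed by the PCIe link alone."""
    return link_mb_per_s / mb_per_frame

print(pcie_limited_fps(PCIE_2_X16_MBPS, 100))  # ~80 FPS for the "Cyberpunk" example
print(pcie_limited_fps(PCIE_2_X16_MBPS, 10))   # ~800 FPS for the "Skyrim" example
```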
 
Bruh, it's usually a rebranded Radeon with more vRAM sold as Radeon Pro. It's basically the same thing.
Also, I recently bought some ARCTIC MX-4 2 g (2019 Edition) from eBay. Temps at idle are now at 34 °C and gaming only hits up to around 60-80 °C depending on the game, compared to sitting at 99 °C all the time before. Had to clean my card. I don't have any problems with gaming in general; it's just a few games (only 3 of them) where I notice the GPU power level is always low, and not until I hit the "ReLive record" feature does the GPU power level increase 2x and the framerate double in that game. Idk why it does that. It's a mystery. Other than that, I have no problems with anything, honestly.

Anyways, I have a quick question: will a 685 W power supply be enough to run a GPU that reads TDP: 205 W / Suggested PSU: 550 W?
 
@eazen
Run something like a 2080 or higher and a couple of M.2 drives (I've got 5) and it makes a difference, as I'm not getting bottlenecked using things at the same time when the GPU slot drops to x8 and the M.2 to x4.
Sure, the 2080's performance will be the same or even go up a bit (no matter if 3.0 or 4.0) due to less overhead, but I don't want my drives to drop (all 3/5/7+ GB/s).
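If you want to check what width and speed each device actually negotiated (for example, whether the GPU slot really dropped to x8), here is a minimal sketch that assumes Linux and reads the kernel's sysfs attributes; on Windows, a tool like GPU-Z shows the same information for the graphics card:

```python
# List the negotiated PCIe link speed/width of every PCI device (Linux only).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        speed = (dev / "current_link_speed").read_text().strip()
        width = (dev / "current_link_width").read_text().strip()
    except OSError:
        continue  # device exposes no readable link attributes
    print(f"{dev.name}: {speed}, x{width}")
```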
 
@eazen
Run something like a 2080 or higher and a couple of M.2 drives (I've got 5) and it makes a difference, as I'm not getting bottlenecked using things at the same time when the GPU slot drops to x8 and the M.2 to x4.
Sure, the 2080's performance will be the same or even go up a bit (no matter if 3.0 or 4.0) due to less overhead, but I don't want my drives to drop (all 3/5/7+ GB/s).
Banned as a fake account
 
Anyways, I have a quick question: will a 685 W power supply be enough to run a GPU that reads TDP: 205 W / Suggested PSU: 550 W?
It should be, but if it's some really cheap and dodgy-looking unit, who knows. You can do a basic check: look at the 12-volt rail amperage, then multiply amps by voltage and you will see how many watts you have on the 12-volt rail.
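That check written out - the 50 A figure is just an example read off a PSU label, not the spec of any particular unit:

```python
# Watts available on the 12 V rail = amps x volts (per the basic check above).
rail_voltage = 12.0   # V
rail_amperage = 50.0  # A, example value - read this off your PSU's label

rail_watts = rail_voltage * rail_amperage
gpu_tdp = 205.0  # W, from the card's spec sheet

print(f"12 V rail capacity: {rail_watts:.0f} W")
print(f"Headroom over the GPU's {gpu_tdp:.0f} W TDP: {rail_watts - gpu_tdp:.0f} W")
```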
 
...

Anyways, I have a quick question: will a 685 W power supply be enough to run a GPU that reads TDP: 205 W / Suggested PSU: 550 W?
Depends on the rest of the system, is this the one in your system specs?
 
What is the point of even buying a video card that uses PCIe 4.0 if people have motherboards that only have PCIe 3.0 slots?
I have multiple GPUs, currently using a Radeon Pro W5500 (PCIe 4.0 x8), which for the most part does what's needed - no problems with videos or gaming (for the most part; it does have OpenGL issues, and some games don't seem to use its full power/GPU power, though the beta driver that fixes OpenGL does fix those OpenGL issues).

However, my WX 7100 and RX 570 8 GB are PCIe 3.0 x16, for example, so they use their full power/potential, and at times some games run faster than on the W5500. And from my understanding, Intel motherboards that have PCIe 4.0 slots - only what, 2 exist?

I was thinking about buying a Radeon Pro W6600 (PCIe 4.0 x16), but again, what's the point when I don't have slots that have PCIe 4.0? Guess I will just stick with cards that have PCIe 3.0 x16.
Here, this may help put things into perspective: basically minimal effect on cards wired x16/x8. More difference at 1080p and less at 4K. Lower-end cards wired x4 - well, it depends on the game. For non-gaming/non-workstation stuff (net, 2D desktop, vids, etc.) it's a non-issue all around.


 
mechtech Thanks for the info. I'm buying a new card soon, really soon. Will make sure it's x16 though. But my Radeon Pro W5500, as I mentioned, works perfectly fine, just some weird GPU power level stuff happening in 3 games. If someone else had a W5500 of any kind, I would be curious if the issue is happening with them. Besides that, no problems with anything with this card.
Depends on the rest of the system, is this the one in your system specs?

Yes. Soon I'm choosing between a Radeon Pro W5700 or W6600 (which shouldn't be an issue).
 
Yes. Soon I'm choosing between a Radeon Pro W5700 or W6600 (which shouldn't be an issue).
Workstation GPUs - if you're not gaming on this rig, then a 685 W PSU should be just OK; that would be minimal IMO.
 
Workstation GPUs - if you're not gaming on this rig, then a 685 W PSU should be just OK; that would be minimal IMO.
All I use are workstation GPUs, for gaming, editing, videos, everything, but mainly gaming. I recently had to clean my W5500 because it kept going up to 99 °C when gaming on demanding games, or any game for that matter; bought some paste and now it never goes above 75-80 °C (the card was dirty as hell, common sense to clean it). But thanks :) Guess I will find out soon enough.
 