
AMD Radeon RX 6600 XT PCI-Express Scaling

Now there's a strawman argument. Where did I say any of that? I didn't. The only thing I said was that AMD has a habit of kneecapping themselves when they start catching Nvidia. Rebranding cards, too little memory (4GB 580), or an x8 bus that impacts performance in some games (6600 XT; the 5500 XT was hit by BOTH of these issues).
Not a straw man, just highlighting the flawed logic behind your argument.
You know, outside of software that did show a performance difference. Of course:

So no real-world consequences. Outside of real-world consequences, but who counts those?
...sigh. I and several others have argued why considering those outliers precisely as outliers is reasonable. I have yet to see an actual argument for the opposite.
Right, so any time performance doesn't line up with expectations there are excuses.
No. Perspective and context are not the same as excuses.
Using an x16 bus like Nvidia would fix that problem, but the GPU isn't gimped. Everyone knows that buggy console-port games NEVER sell well or are popular, ever. Right?
Hey, look at that, a first attempt at an argument for why these games matter. However, it is once again deeply flawed. If a port is buggy and performs poorly, on whom is the onus to rectify that situation? The developer, engine developer/vendor, GPU (+driver) maker, OS vendor/platform owner, all of the above? Depending on the issue, I would say some balance of all of the above. You are arguing as if the only party responsible for improving things is AMD. Hitman's highly variable performance is widely documented, as are DS's and HZD's issues. AMD has done work towards improving things with driver updates, as have the developers, but at this point, outside of possible effects of unidentified early driver bugs that typically get fixed in one or two releases after launch of a new SKU, the remaining responsibility falls squarely on the developer and engine vendor.
If you have to come up with excuses for why examples of an x8 bus hurting performance don't actually matter, you've answered your own question.
But that's the thing: we have no proof of that. We know that a 3.0 x8 bus on a current-gen high-end CPU sees a performance drop. We have no way of knowing whether a 4.0 x16 bus would perform better than an x8 one - it might be identical. A 3.0 x16 bus would likely perform better than a 3.0 x8 one (due to matching 4.0 x8 in bandwidth), but that still leaves another issue with this 'fault': essentially nobody is going to use this GPU on PCIe 3.0 with a CPU as fast in gaming as the 5600X. Nobody who bought a 9900K or 10700K is going to buy a 6600 XT - most likely they already have something equivalent or better. And if your CPU is holding you back, well, then you won't have the performance overhead to actually see that bandwidth bottleneck at all.

So the scope of your issue is progressively shrinking:
It's not that PCIe 4.0 x8 is a bottleneck; it only shows up on older 3.0 systems.
It's not a bottleneck on all older systems, only those with CPUs close to the gaming performance of the 5600X.
And it's not even all fast-CPU systems, only a highly selective range of titles with known issues left unfixed by developers.
The Venn diagram of all of these caveats is tiny. Hence the 'issue' is overblown.
You've constructed your own argument here that you can never lose, because you immediately discredit anything that goes against your narrative. I don't know what it is about the modern internet where any criticism against a product has to be handwaved away. The 6600 XT is already a gargantuan waste of money; why defend AMD further screwing with it by doing this x8 bus thing that Nvidia would get raked over the coals for doing?
Because criticism should be considered, relevant, accurate, useful, applicable, and so on. This criticism is none of those. It is pointing to a spec-sheet deficiency that has negative consequences in an extremely restrictive and unlikely set of circumstances. Making that out to be a wholesale criticism of the product is irrational and deeply flawed logic, which either speaks to bias (implicit or explicit) or just poor reasoning.

You say the 6600 XT is a 'gargantuan waste of money' - in your opinion, is it more so than the 3060? Or the 3060 Ti? At MSRP, or at street prices? I could understand that argument in the current market if leveled against literally every GPU sold, as they all are. But you're singling out one specific card. That requires more reason than "it has the potential to slightly underperform if you play a highly specific selection of titles on a highly specific hardware configuration".

And that's the point here. People aren't defending the 6600 XT specifically as much as we are arguing against any singling out of it. The arguments for doing so do not stand up to scrutiny. Sure, there are (very) niche cases where it will be a poor choice, and if you're in that niche, then it really isn't the GPU you should be looking at buying. But extrapolating that into supposedly generally applicable advice? That's some really, really poor reasoning.
 
"where in this review outside of cases where it matters can you find examples of it mattering"

Well, if you're going to immediately throw out evidence you don't like, this conversation will go nowhere.
You're the one who's trying to ignore the evidence to the contrary and claiming it's "kneecapped" over the 2%? It still beats its competition with your "disability" regardless.
 
I'm curious how the card compares against the 5600 XT, as this is supposedly a direct replacement/upgrade to that.
 
I'm curious how the card compares against the 5600 XT, as this is supposedly a direct replacement/upgrade to that.
Check one of the many reviews? They all include comparisons to a heap of other GPUs.
ASRock Phantom Gaming D
Asus ROG Strix OC
MSI Gaming X
XFX Speedster Merc 308
Sapphire Pulse XT

TL;DR: going by the Pulse XT (which is the closest to stock clocks of the cards above), it's about 33% faster than the 5600 XT at 1080p and 1440p and 22% faster at 2160p (not that you'd use this GPU for 2160p gaming), with ~10 W higher power draw.
 
Some time ago I read an article (maybe even here) about whether any card can saturate PCI-E 3.0 x16. If I remember right, the answer was that it can't. And since that wasn't very long ago, I'm fairly positive this kind of mainstream GPU is perfectly fine with x8.
 
Some time ago I read an article (maybe even here) about whether any card can saturate PCI-E 3.0 x16. If I remember right, the answer was that it can't. And since that wasn't very long ago, I'm fairly positive this kind of mainstream GPU is perfectly fine with x8.
IIRC the 2080 Ti was the first tested GPU to show significant (i.e. more than 1-2%) performance limitations when going from PCIe 3.0 x16 to 2.0 x16. And the 3080 Ti still doesn't meaningfully differ between 4.0 x16 and 3.0 x16 (and only drops ~4% at 2.0 x16). If anything, the different PCIe scaling between these two GPUs in outlier games like Death Stranding points to something else besides PCIe bandwidth being the cause of the drop, as there's no bandwidth-related reason why the 6600 XT should lose more performance moving from 4.0 x8 to 3.0 x8 than the 3080 Ti does from 3.0 x16 to 2.0 x16 - those are the same bandwidth, after all. This indicates that the cause for the drop in performance in those titles isn't the bandwidth limitation itself, but some other bottleneck (driver issue? Some convoluted game bug? The engine for some reason transferring far more data to VRAM on RDNA2 than on Ampere?) that can't be identified through this testing. It's too bad the 3080 Ti wasn't tested with Hitman 3, as that would have been another interesting comparison.
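For reference, the bandwidth equivalences in play here are easy to sanity-check. A minimal sketch using the standard published per-lane figures (after 8b/10b or 128b/130b encoding overhead):

```python
# Approximate usable PCIe bandwidth per lane (GB/s), one direction,
# after encoding overhead: 2.0 is 5 GT/s with 8b/10b, 3.0/4.0 are
# 8/16 GT/s with 128b/130b.
PER_LANE_GBPS = {"2.0": 0.500, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("4.0", 8), ("3.0", 16), ("3.0", 8), ("2.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")

# 4.0 x8 ~= 3.0 x16 (~15.8 GB/s) and 3.0 x8 ~= 2.0 x16 (~8 GB/s),
# which is why the 6600 XT on 3.0 x8 and the 3080 Ti on 2.0 x16 have
# the same raw bandwidth despite scaling differently in the outliers.
```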
 
Is there any explanation why is the card limited to x8?
 
Is there any explanation why is the card limited to x8?
Cost savings.
x8 means fewer traces on the PCB, a simpler PCB layout, less gold and copper used, etc.

I know these cards are being scalped at $600 and have a high MSRP of $380, but realistically this is a budget/entry-level design where cost-effectiveness is more important than outright max performance. If they can shave 3% off the price and it only has a 2% effect on performance, then that's a worthwhile tradeoff.
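As a rough illustration of why that kind of tradeoff works out in performance-per-dollar terms (the numbers below are placeholders taken from the percentages above, not real BOM or benchmark figures):

```python
# Hypothetical baseline: relative performance 1.00 at a $380 board cost.
# Dropping to x8 shaves ~3% off cost for ~2% less performance
# (illustrative figures only).
base_perf, base_price = 1.00, 380.0
x8_perf, x8_price = base_perf * 0.98, base_price * 0.97

print(f"x16 value: {base_perf / base_price * 1000:.2f} perf per $1000")
print(f"x8  value: {x8_perf / x8_price * 1000:.2f} perf per $1000")
# Performance per dollar ends up ~1% higher with the x8 layout; the
# tradeoff favors the cheaper design whenever the price saving exceeds
# the performance loss.
```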
 
Cost savings.
x8 means fewer traces on the PCB, a simpler PCB layout, less gold and copper used, etc.

I know these cards are being scalped at $600 and have a high MSRP of $380, but realistically this is a budget/entry-level design where cost-effectiveness is more important than outright max performance. If they can shave 3% off the price and it only has a 2% effect on performance, then that's a worthwhile tradeoff.
Yeah, there's also the (small, but existent) die area saving from a smaller PCIe PHY and a minute power saving from the same. Not an unreasonable thing for what this GPU is clearly designed to be - an upper midrange budget+ GPU (likely in the ~$300 range like the 5600 XT). Of course the market has made pricing like that unrealistic from many directions (increasing material prices, fab capacity shortages, silicon wafer shortages, SMD component shortages, etc., plus crypto, plus demand from gamers sitting on two-generations-old hardware, +++), but this design was likely started long before this situation started making itself known in a major way. IIRC the 5500 XT was x8 as well, no? Alongside the RX 460/560 too. It's probably designed to scale downwards with 1-2 cut-down SKUs using the same reference board design (just not fully populated with VRMs etc.), which would make cost savings important, as margins on those cheaper products would inevitably be lower.
 
Thank you @W1zzard for a very informative review. I was very curious about how much of an impact there would be between PCIe Gen 3 and Gen 4.
 
There is probably nothing tragically wrong with this card. It all comes down to price and availability. MSRP is irrelevant in most cases. Of course this card comes with compromises, like ray tracing capabilities that are essentially non-existent, but used 1070s and 980 Tis sell for $350+ on eBay and 6700 XTs go for $850, so I guess if one could easily buy a 6600 XT for $400, it wouldn't be such a bad deal. Again, in current market conditions. But if all cards were readily available at MSRP, it would be a whole different ball game. AMD knew they'd sell these like hotcakes regardless of reviews, so they figured they wouldn't have to design a top value product. If only availability improved, but we all know it'll take many more months for this nonsense market to stabilize.

I personally will just keep torturing my GTX 980 until I can buy something decent for $400-$450.
 
There is probably nothing tragically wrong with this card. It all comes down to price and availability. MSRP is irrelevant in most cases. Of course this card comes with compromises, like ray tracing capabilities that are essentially non-existent, but used 1070s and 980 Tis sell for $350+ on eBay and 6700 XTs go for $850, so I guess if one could easily buy a 6600 XT for $400, it wouldn't be such a bad deal. Again, in current market conditions. But if all cards were readily available at MSRP, it would be a whole different ball game. AMD knew they'd sell these like hotcakes regardless of reviews, so they figured they wouldn't have to design a top value product. If only availability improved, but we all know it'll take many more months for this nonsense market to stabilize.

I personally will just keep torturing my GTX 980 until I can buy something decent for $400-$450.

This - and let's be honest, it's not like Nvidia is producing top-level products across the entire lineup right now, or at any sort of reasonable price.
 
Yeah, I haven't checked here, but in AU, HUB found that the cheapest Nvidia card at the same price as the 6600 XT is the 1660 Super.
 
Only <1% loss @ PCIe 3.0 x8; these cards seem perfect to run off PCIe 4.0 x4 if your workstation or multifunction appliance needs the lanes for other devices. Something like a 100 Gbps NIC, HBA, or RAID card might need the bandwidth more.
If it weren't for the signal integrity issues, you could run your GPU off the CPU M.2 slot, riser'd to wherever you wanted, and retain your full PCIe x16 slot. Say, for a bifurcated quad M.2 card?
 
So, strangely enough - I am measuring a significant performance loss in none other than CS:GO, on the map Ancient. Very niche, I know, but I am still taking it up with AMD right now...
 
Interesting thought there, LabRat, but I'd go and get a Zotac GT 730 PCIe x1-based card for a lot cheaper. Yes, they're out there - originally OEM-only, but Newegg has a listing. Very useful for a server/HTPC and other media playback with very light (FreeCell) gaming.
 
The underlying reason we're seeing these effects in some games is that nearly all titles are developed for consoles first, which have just one kind of memory that's shared between CPU and GPU. This also means that moving data between the CPU and GPU is incredibly fast and doesn't incur the latency or bandwidth penalties we're seeing on the PC platform. Remember, consoles are basically using something similar to IGP graphics, which has CPU and GPU integrated as a single unit, unlike the PC where you have discrete GPUs sitting on the PCI-Express bus. The onus is now on the game developers to make sure that their games not only run the best on consoles, their cash cow, but also on the PC platform.
Thank you for this. Great reasoning. Those PCIe 1.1 x8 numbers are good for naming and shaming. Comparing the AMD numbers to the NVIDIA numbers in terms of the scaling and relative performance is quite revelatory. My guess is it's the AMD hardware GPU scheduler showing off.
 
So, strangely enough - I am measuring a significant performance loss in none other than CS:GO, on the map Ancient. Very niche, I know, but I am still taking it up with AMD right now...
Any updates? How much was the difference in performance? Did they fix it?
 
Any updates? How much was the difference in performance? Did they fix it?
They didn't fix jack. They didn't answer - as always. But I found a fix. ReBAR.
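For anyone wanting to verify whether Resizable BAR is actually active: GPU-Z or AMD Software will show it on Windows, and on Linux something like the sketch below works, assuming the usual sysfs layout and that the GPU's VRAM aperture is BAR 0 (the PCI address is only an example - substitute your own from lspci):

```python
# Rough Resizable BAR check on Linux: read the first BAR from sysfs and
# compare its size to the card's VRAM. With ReBAR off the aperture is
# typically only 256 MiB; with it on, it covers all of VRAM.
GPU = "/sys/bus/pci/devices/0000:03:00.0/resource"  # example address

with open(GPU) as f:
    # Each line holds start, end and flags of one BAR region, in hex.
    start, end, _flags = (int(x, 16) for x in f.readline().split())

size_mib = (end - start + 1) / (1 << 20)
print(f"BAR 0 aperture: {size_mib:.0f} MiB")
print("ReBAR likely", "enabled" if size_mib > 256 else "disabled")
```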

 
Don't buy...

Lowest prices by type of card:

RX 6600 XT:


AMD Radeon RX 6600 XT graphics card (2022) price comparison | Buy cheaply at idealo

RX 6600:


AMD Radeon RX 6600 graphics card (2022) price comparison | Buy cheaply at idealo

RX 6650 XT:


AMD Radeon RX 6650 XT graphics card (2022) price comparison | Buy cheaply at idealo


Whoever decides these prices is either mad or super stupid, having lost all connection with the physical world :D
 
@MagnuTron: he pretty much doubles his framerates... really? That's something Trump would say. No benchmark you can find out there gives ReBAR even 10% more performance.

I can even tell you how he faked that performance jump in the video: you enable multicore rendering in the CS:GO settings - on my 5800X that gets you from around 180 FPS to 400 FPS (the default CS:GO cap), and even more with fps_max 0, but it's not a stable frametime.
You should enable that setting anyway, limit to the FPS of your monitor, and let it run in fullscreen windowed; otherwise you get some kind of weird lag, especially when you turn in game.

Tested on a 6900 XT.
 
@MagnuTron: he pretty much doubles his framerates... really? That's something Trump would say. No benchmark you can find out there gives ReBAR even 10% more performance.

I can even tell you how he faked that performance jump in the video: you enable multicore rendering in the CS:GO settings - on my 5800X that gets you from around 180 FPS to 400 FPS (the default CS:GO cap), and even more with fps_max 0, but it's not a stable frametime.
You should enable that setting anyway, limit to the FPS of your monitor, and let it run in fullscreen windowed; otherwise you get some kind of weird lag, especially when you turn in game.

Tested on a 6900 XT.
Bugs exist, dude.


Go look at how Intel's GPUs perform without ReBAR.
 
This thread is about the 6000 series.
And? Unless you own the same hardware as the person who posted about his problem, you can't know whether his report was accurate or not.

A user had a problem and posted a solution and you've decided to attack them and pretend they made the whole thing up.
 