System Name | Hotbox |
---|---|
Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6) |
Motherboard | ASRock Phantom Gaming B550 ITX/ax |
Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
Storage | 2TB Adata SX8200 Pro |
Display(s) | Dell U2711 main, AOC 24P2C secondary |
Case | SSUPD Meshlicious |
Audio Device(s) | Optoma Nuforce μDAC 3 |
Power Supply | Corsair SF750 Platinum |
Mouse | Logitech G603 |
Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
Software | Windows 10 Pro |
> Not a straw man, just highlighting the flawed logic behind your argument.

Now there's a strawman argument. Where did I say any of that? I didn't. The only thing I said was that AMD has a habit of kneecapping themselves when they start catching Nvidia: rebranding cards, too little memory (4GB 580), or an x8 bus that impacts performance in some games (the 6600 XT; the 5500 XT was hit by BOTH of these issues).
> ...sigh. I and several others have argued why considering those outliers precisely as outliers is reasonable. I have yet to see an actual argument for the opposite.

You know, outside of software that did show a performance difference. Of course:
> So no real world consequences.

Outside of real-world consequences, but who counts those?
> No. Perspective and context are not the same as an excuse.

Right, so any time performance doesn't line up with expectations, there are excuses.
> Hey, look at that, a first attempt at an argument for why these games matter. However, it is once again deeply flawed. If a port is buggy and performs poorly, on whom is the onus to rectify that situation? The developer, engine developer/vendor, GPU (+driver) maker, OS vendor/platform owner, all of the above? Depending on the issue, I would say some balance of all of the above. You are arguing as if the only responsible party for improving things is AMD. Hitman's highly variable performance is widely documented, as are DS and HZD's issues. AMD has done work towards improving things with driver updates, as have developers, but at this point, outside of possible effects of unidentified early driver bugs that typically get fixed in 1-2 releases after the launch of a new SKU, the remaining responsibility falls squarely on the developer and engine vendor.

Using an x16 bus like Nvidia would fix that problem, but sure, the GPU isn't gimped. Everyone knows that buggy console-port games NEVER sell well or are popular, ever. Right?
> But that's the thing: we have no proof of that. We know that a 3.0 x8 bus on a current-gen high-end CPU sees a performance drop. We have no way of knowing whether a 4.0 x16 bus would perform better than an x8 one - it might be identical. A 3.0 x16 bus will likely perform better than a 3.0 x8 one (due to matching 4.0 x8 in bandwidth), but that still leaves another issue with this 'fault': essentially nobody is going to use this GPU on PCIe 3.0 with a CPU as fast in gaming as the 5600X. Nobody who bought a 9900K or 10700K is going to buy a 6600 XT - most likely they already have something equivalent or better. And if your CPU is holding you back, well, then you won't have the performance overhead to actually see that bandwidth bottleneck at all.

If you have to come up with excuses for why examples of an x8 bus hurting performance don't actually matter, you've answered your own question.
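For anyone wanting to sanity-check the bandwidth claim: PCIe 3.0 runs at 8 GT/s per lane and 4.0 at 16 GT/s, both with 128b/130b encoding, so a 3.0 x16 link and a 4.0 x8 link have the same theoretical throughput. A minimal Python sketch of that arithmetic (theoretical figures only; real-world throughput is lower due to protocol overhead):

```python
# Per-lane transfer rates in GT/s for each PCIe generation.
# 128b/130b encoding means 128 of every 130 bits carry payload.
GT_PER_S = {"3.0": 8, "4.0": 16}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return GT_PER_S[gen] * (128 / 130) / 8 * lanes  # bits -> bytes

for gen, lanes in [("3.0", 8), ("3.0", 16), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth_gbps(gen, lanes):.2f} GB/s")
# 3.0 x16 and 4.0 x8 both come out to ~15.75 GB/s, which is why
# halving the lane count only bites on a 3.0 host (~7.88 GB/s).
```

So a 6600 XT only loses link bandwidth relative to a hypothetical x16 card when it is dropped into a 3.0 slot.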
So the scope of your issue is progressively shrinking:
- It's not that PCIe 4.0 x8 is a bottleneck, it's only on older 3.0 systems.
- It's not a bottleneck in all older systems, only those with CPUs close to the gaming performance of the 5600X.
- It's not only on fast CPUs, but only in a highly selective range of titles with known issues unfixed by developers.
The Venn diagram of all of these caveats is tiny. Hence the 'issue' is overblown.
> Because criticism should be considered, relevant, accurate, useful, applicable, and so on. This criticism is none of those. It is pointing to a spec-sheet deficiency that has negative consequences in an extremely restrictive and unlikely set of circumstances. Making that out to be a wholesale criticism of the product is irrational and deeply flawed logic, which either speaks to bias (implicit or explicit) or just poor reasoning.

You've constructed your own argument here that you can never lose, because you immediately discredit anything that goes against your narrative. I don't know what it is about the modern internet where any criticism of a product has to be handwaved away. The 6600 XT is already a gargantuan waste of money; why defend AMD further screwing with it by doing this x8 bus thing that Nvidia would get raked over the coals for doing?
You say the 6600 XT is a 'gargantuan waste of money' - in your opinion, is it more so than the 3060? Or the 3060 Ti? At MSRP, or at street prices? I could understand that argument in the current market if leveled against literally every GPU sold, as they all are. But you're singling out one specific card. That requires more reason than "it has the potential to slightly underperform if you play a highly specific selection of titles on a highly specific hardware configuration".
And that's the point here. People aren't defending the 6600 XT specifically as much as we are arguing against any singling out of it. The arguments for doing so do not stand up to scrutiny. Sure, there are (very) niche cases where it will be a poor choice, and if you're in that niche, then it really isn't the GPU you should be looking at buying. But extrapolating that into supposedly generally applicable advice? That's some really, really poor reasoning.