
AMD Radeon RX 6600 XT PCI-Express Scaling

Joined
Sep 17, 2014
Messages
22,452 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
disable? you mean enable?
This is a pretty interesting thing to know. Resolution scaling can easily mitigate the bandwidth issue in that game... And if it does, the conclusion of the article might not be as accurate as it looks.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
This is a pretty interesting thing to know. Resolution scaling can easily mitigate the bandwidth issue in that game... And if it does, the conclusion of the article might not be as accurate as it looks.
I just checked in-game settings and the option is called Resolution Scaling Mode, which should be set to OFF instead of Dynamic to have a fair comparison between GPUs
I used "Off", of course; otherwise the results are useless
 
Joined
Oct 11, 2017
Messages
22 (0.01/day)
Location
UK
System Name Rig Ryzen
Processor Ryzen 2600X 3.9Ghz
Motherboard Asus X470 Prime Pro
Cooling Be Quiet BK010 Shadow Rock Slim
Memory 16GB Corsair Vengeance 3000Mhz
Video Card(s) Asus GTX 1080 Ti Strix
Storage 250GB 970evo, 500Gb 860 Evo, 2TB S. Barracuda
Display(s) Samsung T24D390
Case Cooler Master 500P Mesh White
Power Supply Enermax 750W Revolution XT II
Mouse Razer Deathadder Elite
Keyboard Fnatic Gear Red Silent
Software Windows 10
Can you test Horizon Zero Dawn as well? That game seems to like bandwidth.
 
Joined
Feb 20, 2019
Messages
8,283 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I had to go look at Hitman's results to see the worst case scenario. That is pretty severe in my opinion. @W1zzard how much more FPS would you estimate Hitman 3 would have if it was PCIE 4 x16? I am wondering if the worst case scenario would still be bottlenecked by full PCIE 4.
The rebooted Hitman franchise has been a stuttery, poorly optimised anomaly for lots of reviewers over the years, both in terms of messy frametime plots (useless 99th-percentile scores) and odd engine limits that get in the way of both CPU and GPU scaling.

Whilst the PCIe scaling does clearly show that it needs a lot of bandwidth, I wouldn't treat this as representative of other games on the market. It's just an edge-case curiosity that shows there are more than zero situations where running at PCIe 3.0 x8 might be sub-optimal.
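For context on what "PCIe 3.0 x8" actually means in raw numbers, here's a quick sketch of theoretical one-direction link bandwidth per generation and lane count (based on the standard per-lane transfer rates and encoding overheads, not on figures from the review itself):

```python
# Theoretical PCIe link bandwidth: transfer rate (GT/s) * encoding efficiency,
# divided by 8 bits per byte, times the lane count.
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}
GT_PER_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a given gen and width."""
    return GT_PER_S[gen] * ENCODING[gen] / 8 * lanes

for gen, lanes in [(1, 8), (2, 8), (3, 8), (4, 8), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_bandwidth_gbps(gen, lanes):.2f} GB/s")
```

So PCIe 3.0 x8 (~7.9 GB/s) is half of 4.0 x8, and the 1.1 x8 config tested in the review has only about a quarter of the 3.0 x8 bandwidth, which lines up with where the big drops show up.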
 
Joined
Apr 28, 2021
Messages
24 (0.02/day)
IF you like your GPUS with SERIOUS native bottlenecks, LESS -everything- LESS lanes, memory, bandwidth. CERO RT performance, etc... by all means BUY IT!! Me? HARD PASS AMD ! WAY TO MANY CUTS! This thing has short legs! (wait for the real next gen Hitman... this will DIE!) CERO VALUE AND NO REAL MARKET.

I think the RADEON division is FLOPPING HARD, THEY ARE 100% CLUELESS. AMD took a nice little GPU perfect for HTPC and casual gaming at 75w and $$$$ to hell INTO THIS MEGA JOKE.

WHY??? simple, at this level of price and power consumption? there are MUCH BETTER OPTIONS. Like a 3060TI that CRUSHES THIS ( btw for all the complainers. SORRY this is not my native language, dont expect fluent or persuasive speaking from me )
 
Joined
Mar 18, 2015
Messages
180 (0.05/day)
IF you like your GPUS with SERIOUS native bottlenecks, LESS -everything- LESS lanes, memory, bandwidth. CERO RT performance, etc... by all means BUY IT!! Me? HARD PASS AMD ! WAY TO MANY CUTS! This thing has short legs! (wait for the real next gen Hitman... this will DIE!) CERO VALUE AND NO REAL MARKET.

I think the RADEON division is FLOPPING HARD, THEY ARE 100% CLUELESS. AMD took a nice little GPU perfect for HTPC and casual gaming at 75w and $$$$ to hell INTO THIS MEGA JOKE.

WHY??? simple, at this level of price and power consumption? there are MUCH BETTER OPTIONS. Like a 3060TI that CRUSHES THIS ( btw for all the complainers. SORRY this is not my native language, dont expect fluent or persuasive speaking from me )
Not TYPING in random caps LOCK might make you come across as slightly less DeRaNgEd no matter what YOUR NAtive language is.

Although I'm not sure about that when you're suggesting it's possible to buy a 3060 Ti for anything like the same price as one of these, unless you happen to be A) in a country that has FE drops and B) online in the 30 seconds per month that they're available.
 
Joined
Apr 28, 2021
Messages
24 (0.02/day)
btw... why bother testing in a PERFECT VACUUM ??? The Ryzen 7 5800X @ 4.8 GHz IS DOING ALL THE HEAVY LIFTING HERE. THATS CHEATING... Try using a Ryzen 5 1600 and see how that goes .... this are the CPUS most folks that game a 1080p use IN THE REAL WORLD . AT WHAT MARKET IS THIS AIMED AT AMD? IT DOES NOT EXIST. ( btw i have a 5600x and 3600, STILL I will not touch this, don't expect years of badass PERFORMANCE from this crap)
 
Joined
Apr 21, 2010
Messages
578 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
another SAD attempt to push this utter crap GPU
You know, this GPU, with a lower spec, is faster than the 5700 XT and equal to the RTX 2080! That's crazy.
 

95Viper

Super Moderator
Staff member
Joined
Oct 12, 2008
Messages
12,996 (2.21/day)
Stay on topic.
Read the guidelines/rules before posting.

Here is a sampling:
All posts and private messages have a "report post" button on the bottom of the post, click it when you feel something is inappropriate. Do not use your report as a "wild card invitation" to go back and add to the drama and therefore become part of the problem.
If you disagree with moderator actions contact them via PM, if you can't solve the issue with the moderator in question contact a super moderator.
Under no circumstances should you start public drama.

Thank You and Have a Good (On-Topic) Discussion
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Can you test Horizon Zero Dawn as well? That game seems to like bandwidth.
As mentioned in the conclusion, Death Stranding uses the same engine as HZD and is affected by PCIe bandwidth limitations, too. Given the limited popularity of those two games, I have no plans to bench two games using the Decima Engine, which would be almost 10% of the games test group.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
As mentioned in the conclusion, Death Stranding uses the same engine as HZD and is affected by PCIe bandwidth limitations, too. Given the limited popularity of those two games, I have no plans to bench two games using the Decima Engine, which would be almost 10% of the games test group.
And Death Stranding is a WAY better implementation of the same engine. HZD still has overall performance issues
 
Joined
Feb 20, 2019
Messages
8,283 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
And Death Stranding is a WAY better implementation of the same engine. HZD still has overall performance issues
This. Death Stranding was designed by Kojima Studios from the ground up for an eventual cross-platform release, and Kojima Studios also handled the PC version.

HZD was designed as a PS4 exclusive, with zero consideration given to PC compatibility, and the PC port was outsourced to a third party (Virtuous Studios) with no affiliation to the original developer; even now they are still patching bugs in the PC port that have nothing to do with the original PS4 version and exist solely as a result of the third party learning from their mistakes as they go along. Rather than thinking of HZD PC as the PC version of a cross-platform game, imagine that a newbie developer was given HZD's PS4 assets and told to create a new game from scratch that looks like a copy of the PS4 version.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
This. Death Stranding was designed by Kojima Studios from the ground up for an eventual cross-platform release, and Kojima Studios also handled the PC version.

HZD was designed as a PS4 exclusive, with zero consideration given to PC compatibility, and the PC port was outsourced to a third party (Virtuous Studios) with no affiliation to the original developer; even now they are still patching bugs in the PC port that have nothing to do with the original PS4 version and exist solely as a result of the third party learning from their mistakes as they go along.
Yeah, it's really night and day between the two uses of the same engine, though DS uses a later version as I understand it. Both games are pretty equal as far as visuals, open world, etc. go, but you would never guess they share the same engine.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Price aside, I think the card is decent in performance. However, what I don’t like is that for a card that is meant for “budget” gamers who are mostly on PCI-E 3.0, the fact that you may not be able to get the most out of the GPU is quite annoying, even it’s not a common issue. I wonder if the main driver for AMD to cut down on number of PCI-E lane support is due to cost and power savings.
This is such a weird take, and it makes me wonder if you read the article at all. "The fact that you may not be able to get the most out of the GPU" - how does that align with a 1-2% average performance drop on PCIe 3.0? Yes, there are outliers that are worse than that, as there always will be. But they are highly specific outliers. The overall result of this testing is that you will get a level of performance not perceptibly different from full 4.0 speed. That is what the conclusion says. Besides, if you're on a PCIe 3.0 platform, chances are you'll be more held back by whatever CPU you are using on that platform than by the PCIe bandwidth. (Unless, that is, you're using a 9900K, 10700K or similar with a new midrange GPU for some reason.)
 
Joined
Aug 4, 2020
Messages
1,614 (1.02/day)
Location
::1
Tbf, I feel like AMD cheaping out on the lanes (while understandable) is really cheap for a card of this class (midrange / entry-level midrange). Now, if this were something around a 1650 (i.e., a 6500 or something), or even more budget, I could totally understand that; but given how Nvidia quite consistently gives all their cards down to the x50 series an x16 link (not that the bottommost would benefit, but that's quite beside the point here), I cannot completely shake off the feeling that AMD is cheaping out on us here. Given their track record of being the budget vendor, that's not the smartest move they could've pulled, imho.
 
Joined
Feb 20, 2019
Messages
8,283 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
This is such a weird take, and it makes me wonder if you read the article at all. "The fact that you may not be able to get the most out of the GPU" - how does that align with a 1-2% average performance drop on PCIe 3.0? Yes, there are outliers that are worse than that, as there always will be. But they are highly specific outliers. The overall result of this testing is that you will get a level of performance not perceptibly different from full 4.0 speed. That is what the conclusion says. Besides, if you're on a PCIe 3.0 platform, chances are you'll be more held back by whatever CPU you are using on that platform than by the PCIe bandwidth. (Unless, that is, you're using a 9900K, 10700K or similar with a new midrange GPU for some reason.)
The thing is, most people dropping $600+ on a scalped/marked-up GPU will not be using an ancient motherboard. B550/X570/Z490/Z590 all have PCIe 4.0 anyway.

The 1-2% performance loss (for the most part) on PCIe 3.0 x8 is negligible if it's going to be held back even more than that by an old AMD 2600X or Skylake quad-core, for example. Like you say, who would have spent big bucks on a 9900K only to then pair it up with a crap GPU that's already in need of an upgrade?
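As a rough sanity check on how a couple of big per-game drops can coexist with a 1-2% overall deficit, here's a toy calculation with made-up numbers: only the ~7% and ~15% outliers come from the thread; the 18 other per-game figures and the 20-game group size are purely illustrative.

```python
import math

# Hypothetical test group: 18 games with a negligible ~0.5% FPS loss on
# PCIe 3.0 x8, plus the two outliers discussed above (~7% and ~15%).
losses_pct = [0.5] * 18 + [7.0, 15.0]

# Average the relative performance with a geometric mean, as GPU review
# summary scores commonly do.
relative = [1 - loss / 100 for loss in losses_pct]
geomean = math.prod(relative) ** (1 / len(relative))

print(f"overall deficit ≈ {(1 - geomean) * 100:.1f}%")
```

Even with those two outliers included, the averaged deficit lands in the 1-2% range, which is why a small overall number and a couple of nasty individual results aren't contradictory.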
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Tbf, I feel like AMD cheaping out on the lanes (while understandable) is really cheap for a card of this class (midrange / entry-level midrange). Now, if this were something around a 1650 (i.e., a 6500 or something), or even more budget, I could totally understand that; but given how Nvidia quite consistently gives all their cards down to the x50 series an x16 link (not that the bottommost would benefit, but that's quite beside the point here), I cannot completely shake off the feeling that AMD is cheaping out on us here. Given their track record of being the budget vendor, that's not the smartest move they could've pulled, imho.
Frankly, this is what AMD does when they catch up: they immediately kneecap themselves. It's not the first (or third) time in recent history they've done this.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
Frankly, this is what AMD does when they catch up: they immediately kneecap themselves. It's not the first (or third) time in recent history they've done this.
Where anywhere in the review, outside of obvious outliers, was it kneecapped? It still beats the 3060 and its x16 "advantage".
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Where anywhere in the review, outside of obvious outliers, was it kneecapped? It still beats the 3060 and its x16 "advantage".
"where in this review outside of cases where it matters can you find examples of it mattering"

Well if you're going to immediately throw out evidence you dont like this conversation will go nowhere.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Frankly, this is what AMD does when they catch up: they immediately kneecap themselves. It's not the first (or third) time in recent history they've done this.
... again, how are they kneecapping themselves? There is no notable performance limitation here. There is a spec deficit with no real-world consequences worthy of note. If that amounts to "kneecapping themselves", then you have some rather absurd standards. Or are all GPUs without HBM or a 512-bit memory bus also kneecapped? Yes, there are a couple of outliers. One has a ~7% deficit, the other has a ~15% one. The former is a console port running in an engine primarily developed for consoles and well known for porting issues. The other is a game notorious for buggy performance. If your favourite game genre is "buggy ports", then yes, these are highly relevant. If not, then no, they aren't. They are outliers, and while absolutely true and likely representative of their respective games, they aren't representative of modern games overall - the rest of the tested field demonstrates that. Remember, that 1-2% overall deficit includes those outliers.
The thing is, most people dropping $600+ on a scalped/marked-up GPU will not be using an ancient motherboard. B550/X570/Z490/Z590 all have PCIe 4.0 anyway.

The 1-2% performance loss (for the most part) on PCIe 3.0 x8 is negligible if it's going to be held back even more than that by an old AMD 2600X or Skylake quad-core, for example. Like you say, who would have spent big bucks on a 9900K only to then pair it up with a crap GPU that's already in need of an upgrade?
Exactly. If I were to buy one of these and stick it into my travel PC (an old and heavily modified Optiplex 990 SFF) with its i5-2400 and PCIe 2.0, the PCIe 2.0 really isn't what would be holding me back. That would be the CPU.
 
Joined
Jul 10, 2011
Messages
797 (0.16/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Western Digital/Kingston
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse Glorious
Keyboard UniKey
Software Windows 10 x64
First thought before reading anything:

STICK IT IN A 1x SLOT

Edit: wow, the loss is actually quite small. <20% on the really outdated 1.1 x8 is impressive, and the 2.0 results are almost unnoticeable in general use.


How is it crap? It's a great 1080p/1440p budget card, and the prices slaughter Nvidia in many regions.


And a $700 budget card slaughters customers.


In that store, the 3060 and 3060 Ti have the same prices.
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
... again, how are they kneecapping themselves? There is no notable performance limitation here.
If that amounts to "kneecapping themselves", then you have some rather absurd standards. Or are all GPUs without HBM or a 512-bit memory bus also kneecapped?
Now there's a strawman argument. Where did I say any of that? I didn't. The only thing I said was that AMD has a habit of kneecapping themselves when they start catching Nvidia: rebranding cards, too little memory (the 4GB 580), or an x8 bus that impacts performance in some games (the 6600 XT; the 5500 XT was hit by BOTH of these issues).

There is a spec deficit with no real-world consequences worthy of note.
You know, outside of software that did show a performance difference. Of course:

Yes, there are a couple of outliers. One has a ~7% deficit, the other has a ~15% one.
So no real-world consequences. Outside of real-world consequences, but who counts those?
The former is a console port running in an engine primarily developed for consoles and well known for porting issues. The other is a game notorious for buggy performance. If your favourite game genre is "buggy ports", then yes, these are highly relevant. If not, then no, they aren't. They are outliers, and while absolutely true and likely representative of their respective games, they aren't representative of modern games overall - the rest of the tested field demonstrates that. Remember, that 1-2% overall deficit includes those outliers.
Right, so any time performance doesn't line up with expectations, there are excuses. Using an x16 bus like Nvidia would fix that problem, but the GPU isn't gimped. Everyone knows that buggy console ports NEVER sell well or become popular, ever. Right?

If you have to come up with excuses for why examples of an x8 bus hurting performance don't actually matter, you've answered your own question. You've constructed an argument here that you can never lose, because you immediately discredit anything that goes against your narrative. I don't know what it is about the modern internet where any criticism of a product has to be handwaved away. The 6600 XT is already a gargantuan waste of money; why defend AMD further screwing with it by doing this x8-bus thing that Nvidia would get raked over the coals for doing?
 
Joined
Dec 30, 2010
Messages
2,198 (0.43/day)
Lol, all these threads on the net about how AMD kneecapped its users by providing a PCI-E x8 type of card.

Just depends on the use case, but overall it's still fine, and twice as fast as a Polaris. @W1zzard > how does PCI-E overclocking affect performance with such cards? You could use an older board without an NVMe setup and be able to push it to a 112 MHz PCI-E bus or so. Should be perfectly possible.
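PCIe link bandwidth scales linearly with the reference clock, so the 112 MHz bus overclock floated above would, in theory, buy back a proportional amount of headroom. A back-of-the-envelope sketch (whether a given board and card actually stay stable at 112 MHz is a separate question):

```python
# Theoretical PCIe 3.0 x8 bandwidth at stock vs. an overclocked reference clock.
gen3_x8_gbps = 7.88       # GB/s, theoretical PCIe 3.0 x8 at the stock clock
base_mhz, oc_mhz = 100.0, 112.0  # stock vs. suggested PCIe reference clock

scaled = gen3_x8_gbps * (oc_mhz / base_mhz)
gain_pct = (oc_mhz / base_mhz - 1) * 100

print(f"{scaled:.2f} GB/s (+{gain_pct:.0f}% over stock)")
```

A 12% bandwidth bump wouldn't close the gap to a full x16 link, but it could shave some of the deficit in the bandwidth-sensitive outlier games.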
 
Joined
Jan 24, 2011
Messages
287 (0.06/day)
Processor AMD Ryzen 5900X
Motherboard MSI MAG X570 Tomahawk
Cooling Dual custom loops
Memory 4x8GB G.SKILL Trident Z Neo 3200C14 B-Die
Video Card(s) AMD Radeon RX 6800XT Reference
Storage ADATA SX8200 480GB, Inland Premium 2TB, various HDDs
Display(s) MSI MAG341CQ
Case Meshify 2 XL
Audio Device(s) Schiit Fulla 3
Power Supply Super Flower Leadex Titanium SE 1000W
Mouse Glorious Model D
Keyboard Drop CTRL, lubed and filmed Halo Trues
Now there's a strawman argument. Where did I say any of that? I didn't. The only thing I said was that AMD has a habit of kneecapping themselves when they start catching Nvidia: rebranding cards, too little memory (the 4GB 580), or an x8 bus that impacts performance in some games (the 6600 XT; the 5500 XT was hit by BOTH of these issues).


You know, outside of software that did show a performance difference. Of course:


So no real-world consequences. Outside of real-world consequences, but who counts those?

Right, so any time performance doesn't line up with expectations, there are excuses. Using an x16 bus like Nvidia would fix that problem, but the GPU isn't gimped. Everyone knows that buggy console ports NEVER sell well or become popular, ever. Right?

If you have to come up with excuses for why examples of an x8 bus hurting performance don't actually matter, you've answered your own question. You've constructed an argument here that you can never lose, because you immediately discredit anything that goes against your narrative. I don't know what it is about the modern internet where any criticism of a product has to be handwaved away. The 6600 XT is already a gargantuan waste of money; why defend AMD further screwing with it by doing this x8-bus thing that Nvidia would get raked over the coals for doing?

Since we don't have 4.0 x16 numbers, you can't say that AMD is "kneecapping" themselves with this choice. There is a grand total of ONE performance scenario in this review where the difference between 4.0 and 3.0 x8 matters (9 FPS vs. 7 is irrelevant). As for previous generations, both the 5500 XT and the 480/580 had 8 GB versions available to those with a tiny bit more money. There's just really no basis for your argument.
 